CN111221661A - Intelligent editing method, mobile terminal and computer readable storage medium - Google Patents

Intelligent editing method, mobile terminal and computer readable storage medium

Info

Publication number
CN111221661A
Authority
CN
China
Prior art keywords
intelligent
editing
edit box
common
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201811428098.5A
Other languages
Chinese (zh)
Inventor
阙新华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qiku Internet Technology Shenzhen Co Ltd
Original Assignee
Qiku Internet Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qiku Internet Technology Shenzhen Co Ltd filed Critical Qiku Internet Technology Shenzhen Co Ltd
Priority to CN201811428098.5A
Publication of CN111221661A
Legal status (current): Withdrawn

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 - Multiprogramming arrangements
    • G06F 9/54 - Interprogram communication
    • G06F 9/543 - User-generated data transfer, e.g. clipboards, dynamic data exchange [DDE], object linking and embedding [OLE]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)

Abstract

The invention discloses an intelligent editing method, a mobile terminal and a computer-readable storage medium. The method comprises the following steps: receiving an editing instruction from a user and entering an ordinary edit box; when a condition for triggering entry into an intelligent edit box is met, jumping from the ordinary edit box to the intelligent edit box; receiving information edited by the user in the intelligent edit box; and receiving a completion instruction, returning to the ordinary edit box, and saving the content edited in the intelligent edit box. With the intelligent editing method, the mobile terminal and the computer-readable storage medium, when a long passage of text needs to be edited, the ordinary edit box jumps to the intelligent edit box, where the text can be copied, pasted, selectively deleted, personalized and otherwise edited; on receiving the corresponding instruction, the ordinary edit box is returned to and the edited content is sent, which greatly facilitates the editing of long passages of text and improves the user experience.

Description

Intelligent editing method, mobile terminal and computer readable storage medium
Technical Field
The present invention relates to the field of computers, and in particular, to an intelligent editing method, a mobile terminal, and a computer-readable storage medium.
Background
When a mobile terminal is used to send a short message, a WeChat message or a QQ chat message and a long passage of text needs to be edited, the small size of the ordinary edit box makes editing the text extremely inconvenient, which degrades the user's experience with the mobile terminal. How to edit long passages of text conveniently on a mobile terminal is therefore an urgent problem.
Disclosure of Invention
The main object of the invention is to provide an intelligent editing method, a mobile terminal and a computer-readable storage medium, so as to solve the problem that the conventional ordinary edit box is small and limits the editing of long passages of text.
To this end, the invention provides an intelligent editing method, which comprises the following steps:
receiving an editing instruction from a user, and entering an ordinary edit box;
when a condition for triggering entry into an intelligent edit box is met, jumping from the ordinary edit box to the intelligent edit box;
receiving information edited by the user in the intelligent edit box;
and receiving a completion instruction, returning to the ordinary edit box, and saving the content edited in the intelligent edit box.
Further, the step of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met comprises the following steps:
judging whether the ordinary edit box contains edited content;
if the ordinary edit box contains edited content, saving the content in the ordinary edit box into the intelligent edit box, and entering the intelligent edit box;
and if the ordinary edit box does not contain edited content, directly entering the intelligent edit box.
Further, the step of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met further comprises:
judging whether the current ordinary edit box is the ordinary edit box returned to after the completion instruction was received;
if so, not automatically entering the intelligent edit box;
if not, further judging whether the number of words entered in the ordinary edit box reaches a preset word count;
if the number of words entered in the ordinary edit box reaches the preset word count, automatically entering the intelligent edit box;
and if the number of words entered in the ordinary edit box does not reach the preset word count, not automatically entering the intelligent edit box.
Further, the step of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met further comprises:
receiving an instruction, input by the user, for entering the intelligent edit box; and jumping from the ordinary edit box to the intelligent edit box.
Further, the step of receiving the completion instruction, returning to the ordinary edit box, and saving the content edited in the intelligent edit box comprises:
if the completion instruction is received, saving the content edited in the intelligent edit box into the ordinary edit box, and returning to the ordinary edit box.
Further, after the steps of receiving the completion instruction, returning to the ordinary edit box, and saving the content edited in the intelligent edit box, the method further comprises:
after the ordinary edit box is returned to, automatically sending out the content in the ordinary edit box.
Further, the step of receiving the information edited by the user in the intelligent edit box comprises:
timing the editing time of the information edited by the user;
and when the editing time reaches a preset time, automatically caching the information edited by the user, clearing the editing time and restarting the timing.
Further, the step of receiving the information edited by the user in the intelligent edit box further comprises:
performing personalized processing on the information edited by the user, the personalized processing comprising importing the information edited by the user into a preset template, or processing the information according to a received personalization operation input by the user.
The application also provides a mobile terminal, which comprises a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of any one of the above methods when executing the computer program.
An embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of any one of the methods described above.
With the intelligent editing method, the mobile terminal and the computer-readable storage medium, when a long passage of text needs to be edited, the ordinary edit box jumps to the intelligent edit box, where the text can be copied, pasted, selectively deleted, personalized and otherwise edited; on receiving the corresponding instruction, the ordinary edit box is returned to and the edited content is sent, which greatly facilitates the editing of long passages of text and improves the user experience.
Drawings
Fig. 1 is a schematic flowchart of a method for intelligent editing according to an embodiment of the present application;
fig. 2 is a schematic flowchart illustrating step S2 in the method for intelligent editing according to an embodiment of the present application;
fig. 3 is a block diagram illustrating a structure of a mobile terminal according to an embodiment of the present application.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, an embodiment of the present application provides an intelligent editing method, which is applied to text input on a mobile terminal and includes the steps of:
S1, receiving an editing instruction from a user, and entering an ordinary edit box;
S2, jumping from the ordinary edit box to the intelligent edit box when a condition for triggering entry into the intelligent edit box is met;
S3, receiving information edited by the user in the intelligent edit box;
and S4, receiving a completion instruction, returning to the ordinary edit box, and saving the content edited in the intelligent edit box.
As described in step S1, when the user needs to edit information, the user edits in the ordinary edit box of the mobile terminal. Because the ordinary edit boxes of applications such as SMS, WeChat and QQ are small, editing, selectively deleting or selectively copying a long passage of text in them is very inconvenient, so a larger edit box is needed for these operations. The intelligent edit box is designed for this purpose: when a long passage of text needs to be edited, an instruction for entering the intelligent edit box is received and the editing is carried out there. Because the intelligent edit box is large, operations such as editing, selective deletion and selective copying of long passages of text can be performed in it conveniently. The mobile terminal is an electronic device such as a smartphone or a tablet computer.
As described in step S2, when the condition for triggering entry into the intelligent edit box is met, the ordinary edit box jumps to the intelligent edit box for editing; that is, the interface of the ordinary edit box jumps to the interface of the intelligent edit box, where the editing operation is completed.
As described in step S3, the information edited by the user is received in the intelligent edit box and displayed there.
As described in step S4, when editing is finished, a completion instruction input by the user is received and the ordinary edit box is returned to. After the completion instruction is received, the content edited in the intelligent edit box is saved. It may be saved into the ordinary edit box, which makes it convenient for the user to send, or it may be saved to the clipboard, in which case the user only needs to paste it after returning to the ordinary edit box. Saving the content into the ordinary edit box is preferred: a user who edits a message naturally wants to send it, and skipping the copy-and-paste step saves operation time and improves the user experience compared with saving the message to the clipboard.
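Steps S1 to S4 can be pictured as a small state machine that tracks which box currently holds the text. The minimal Kotlin sketch below is only an illustration of this flow; the class and member names (EditingSession, ordinaryBoxText and so on) are assumptions made for the example and do not come from the disclosure.

```kotlin
class EditingSession {
    var ordinaryBoxText: String = ""
    var intelligentBoxText: String = ""
    var inIntelligentBox: Boolean = false
        private set

    // S1: the user's editing instruction opens the ordinary edit box.
    fun startEditing() { inIntelligentBox = false }

    // S2: a trigger condition moves editing into the large intelligent edit box.
    fun jumpToIntelligentBox() { inIntelligentBox = true }

    // S3: text is edited in whichever box is currently active.
    fun edit(text: String) {
        if (inIntelligentBox) intelligentBoxText = text else ordinaryBoxText = text
    }

    // S4: the completion instruction saves the edited content back into the ordinary box.
    fun complete(): String {
        ordinaryBoxText = intelligentBoxText
        inIntelligentBox = false
        return ordinaryBoxText
    }
}

fun main() {
    val session = EditingSession()
    session.startEditing()
    session.jumpToIntelligentBox()
    session.edit("A long passage that is awkward to handle in the small ordinary edit box …")
    println(session.complete())   // back in the ordinary edit box, ready to send
}
```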
Referring to fig. 2, in an embodiment, step S2 of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met includes:
S21, judging whether the ordinary edit box contains edited content;
S22, if the ordinary edit box contains edited content, saving the content in the ordinary edit box into the intelligent edit box, and entering the intelligent edit box;
and S23, if the ordinary edit box does not contain edited content, directly entering the intelligent edit box.
As described in step S21, the system checks whether the ordinary edit box already contains input content, so that the next operation can be chosen accordingly.
As described in step S22, when the ordinary edit box contains content, that content can be transferred to the intelligent edit box and displayed there, which saves the user from re-entering in the intelligent edit box what was already typed in the ordinary edit box. In addition, the content in the ordinary edit box is deleted after the transfer; otherwise the content later returned from the intelligent edit box would be combined with the earlier fragment, which can garble the message or even change its meaning. For example, if the user types 'not' in the ordinary edit box, then realizes that a long passage is needed and enters the intelligent edit box, the text edited there would, on returning, be joined to the leftover 'not' and could distort the intended meaning, and the small ordinary edit box makes such a problem easy to overlook. The content in the ordinary edit box is therefore saved into the intelligent edit box and then deleted. Alternatively, the content in the ordinary edit box may be left in place, editing may continue directly in the intelligent edit box, and on completion the content edited there may be appended after the earlier input. It should be understood that the present application also covers this mode of operation, although it is not the one generally adopted.
As described in step S23, if there is no edited content in the ordinary edit box, the intelligent edit box can be entered directly; that is, there is no content in the ordinary edit box that needs to be saved into the intelligent edit box.
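As a concrete illustration of steps S21 to S23, the hedged sketch below checks for existing content, carries it into the intelligent edit box and clears the ordinary edit box; the Boxes data class and the function name are assumptions used only for this example.

```kotlin
// Names (Boxes, enterIntelligentBox) are illustrative assumptions, not part of the disclosure.
data class Boxes(var ordinaryText: String = "", var intelligentText: String = "")

fun enterIntelligentBox(boxes: Boxes): Boxes {
    if (boxes.ordinaryText.isNotEmpty()) {
        boxes.intelligentText = boxes.ordinaryText   // S22: carry the partial input over
        boxes.ordinaryText = ""                      //      and clear the small box
    }
    return boxes                                     // S23: otherwise enter directly; nothing to move
}

fun main() {
    // The user typed "not" in the small box, then switched to the large box to finish the sentence.
    val boxes = enterIntelligentBox(Boxes(ordinaryText = "not"))
    boxes.intelligentText += " available this afternoon, let's meet tomorrow instead."
    // Because the small box was cleared on entry, saving back cannot duplicate the leftover "not".
    boxes.ordinaryText = boxes.intelligentText
    println(boxes.ordinaryText)   // "not available this afternoon, let's meet tomorrow instead."
}
```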
In one embodiment, step S2 of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met further includes:
S221, judging whether the current ordinary edit box is the ordinary edit box returned to after the completion instruction was received;
S222, if so, not automatically entering the intelligent edit box;
S223, if not, further judging whether the number of words entered in the ordinary edit box reaches a preset word count;
S224, if the number of words entered in the ordinary edit box reaches the preset word count, automatically entering the intelligent edit box;
and S225, if the number of words entered in the ordinary edit box does not reach the preset word count, not automatically entering the intelligent edit box.
As described in steps S221 to S222, when editing is taking place in the ordinary edit box, it must first be determined whether this is the ordinary edit box returned to after editing in the intelligent edit box was completed. If so, the system does not automatically enter the intelligent edit box. Otherwise, because the text saved back usually already reaches the preset word count, step S224 would immediately send the user into the intelligent edit box again the moment the ordinary edit box is returned to; the flow would bounce between the two boxes indefinitely and the next operation could never be performed. Judging whether the current box is the one returned to, and in that case not entering the intelligent edit box automatically, avoids this problem.
As described in steps S223 to S225, if the ordinary edit box was not returned to from the intelligent edit box, the number of words entered in the ordinary edit box is checked. If it reaches the preset word count, the system automatically enters the intelligent edit box; if not, the intelligent edit box is not entered automatically. The preset word count may be a system default or may be set by the user. When the user does not need to edit a long passage, the text can stay in the ordinary edit box for continued editing, which also makes it convenient to check incoming messages in time. However, a user may unconsciously type a large amount of text while editing; the ordinary edit box is then too small for convenient review, supplementation or deletion, and the user may simply forget that the intelligent edit box is available. Automatically entering the intelligent edit box in this situation avoids that problem, and a user who still prefers the ordinary edit box can always return to it and continue editing there.
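The automatic trigger of steps S221 to S225 reduces to a small decision function. In the sketch below, the 50-word default and the returnedFromIntelligentBox flag are assumptions for illustration; the description only says that the preset word count may be a system default or set by the user.

```kotlin
// Assumed default; the description says the preset word count may be system-provided or user-set.
const val PRESET_WORD_COUNT = 50

fun shouldAutoEnterIntelligentBox(
    returnedFromIntelligentBox: Boolean,   // S221: is this the ordinary box we just returned to?
    currentLength: Int,                    // number of words currently entered in the ordinary box
    presetWordCount: Int = PRESET_WORD_COUNT
): Boolean = when {
    // S222: never auto-jump right after returning, otherwise the long text just saved back
    // would trip the threshold again and the two boxes would bounce back and forth forever.
    returnedFromIntelligentBox -> false
    // S224: long enough to be awkward in the small box, so jump automatically.
    currentLength >= presetWordCount -> true
    // S225: short input stays in the ordinary edit box.
    else -> false
}
```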
In an embodiment, step S2 of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met further includes:
receiving an instruction, input by the user, for entering the intelligent edit box; and jumping from the ordinary edit box to the intelligent edit box.
As described above, the user can also issue the instruction for entering the intelligent edit box manually. When the user feels that a long passage of text needs to be entered, or simply finds editing in the intelligent edit box more convenient and wants to enter it actively, the instruction can be input and the intelligent edit box entered directly for editing. The form of the instruction is not limited, provided that it does not conflict with other instructions and is convenient to input: for example, a long press on the ordinary edit box, two or more consecutive taps on the ordinary edit box, or a tap on a virtual key for entering the intelligent edit box that pops up on the editing page together with the ordinary edit box, as in the sketch below.
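A hedged, Android-flavoured sketch of these manual triggers follows; attachIntelligentBoxTriggers, enterIntelligentBoxKey and openIntelligentEditBox are assumed names rather than real APIs of any framework, and double-tap detection (which would typically use a GestureDetector) is omitted for brevity.

```kotlin
import android.view.View
import android.widget.EditText

// Hook up two of the example triggers: a long press on the ordinary edit box and a tap on a
// virtual key shown alongside it. openIntelligentEditBox stands in for the actual jump logic.
fun attachIntelligentBoxTriggers(
    ordinaryEditBox: EditText,
    enterIntelligentBoxKey: View,
    openIntelligentEditBox: () -> Unit
) {
    // Trigger 1: long-press the ordinary edit box.
    ordinaryEditBox.setOnLongClickListener {
        openIntelligentEditBox()
        true   // consume the long press so the default text-selection handles do not pop up
    }

    // Trigger 2: tap the virtual key that pops up together with the ordinary edit box.
    enterIntelligentBoxKey.setOnClickListener { openIntelligentEditBox() }
}
```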
In an embodiment, step S4 of receiving the completion instruction and returning to the ordinary edit box includes: if the completion instruction is received, saving the content edited in the intelligent edit box into the ordinary edit box, and returning to the ordinary edit box.
As described in step S4, after editing in the intelligent edit box is finished, the content edited there is saved into the ordinary edit box so that the user can send it. The completion instruction is the instruction that ends editing in the intelligent edit box; once it is received, the ordinary edit box is returned to, so the chat interface can be viewed and the next operation carried out conveniently. In this embodiment, when the content edited in the intelligent edit box is saved into the ordinary edit box, the ordinary edit box sends the saved content out directly: the content was composed in detail in the intelligent edit box and cannot all be viewed comfortably in the ordinary edit box anyway, so sending it directly saves the user another tap on 'send' and considerably improves the experience. In another embodiment, the content in the intelligent edit box may instead be sent straight to the chat interface before returning to the ordinary edit box, which likewise saves the user's operation time.
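The completion path can be sketched as follows; the CompletionHandler name, the sendMessage callback and the autoSendOnReturn flag are assumptions chosen to mirror the preferred behaviour described above, not an actual API.

```kotlin
// Sketch only: names and the auto-send flag are assumptions mirroring the preferred embodiment.
class CompletionHandler(
    private val sendMessage: (String) -> Unit,      // assumed hook into the messaging app
    private val autoSendOnReturn: Boolean = true    // preferred behaviour in this embodiment
) {
    var ordinaryBoxText: String = ""
        private set

    // Called when the completion instruction is received in the intelligent edit box.
    fun onCompleteInstruction(intelligentBoxText: String) {
        ordinaryBoxText = intelligentBoxText        // save the edited content into the ordinary box
        if (autoSendOnReturn) {
            sendMessage(ordinaryBoxText)            // spare the user an extra tap on "send"
            ordinaryBoxText = ""                    // the ordinary box is empty again after sending
        }
    }
}

fun main() {
    val handler = CompletionHandler(sendMessage = { println("sent: $it") })
    handler.onCompleteInstruction("A long, carefully edited reply …")
}
```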
In an embodiment, as described in step S3, the step of receiving the information edited by the user in the intelligent edit box further includes:
performing personalized processing on the information edited by the user, the personalized processing including importing the information edited by the user into a preset template, or processing the information according to a received personalization operation input by the user.
Depending on the recipient of the edited information, personalized processing can be applied through preset templates; for example, a template intended for couples might use a sweet background and lay the whole message out in a heart shape. The personalization operation may also process parts of the edited content, rendering important information in distinctive, prominent text or marks, for example marking it in another color, underlining it, changing its font size or adding emphasis marks; word spacing, fonts and paragraph formats can also be adjusted. For the user's convenience, templates can of course be preset, either supplied with the system or designed in advance.
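The personalization step might look like the sketch below, in which the heart-frame template and the bracket-style emphasis markers are purely illustrative stand-ins for whatever decorations a real preset template would apply.

```kotlin
// Illustrative-only personalization helpers; template contents and markers are assumptions.
data class Template(val name: String, val wrap: (String) -> String)

val presetTemplates = listOf(
    Template("plain") { body -> body },
    // e.g. a couples' template that frames the whole message decoratively
    Template("sweet") { body -> "\u2764 $body \u2764" }
)

// Emphasize important fragments, e.g. by surrounding them with prominent markers.
fun emphasize(text: String, important: List<String>): String =
    important.fold(text) { acc, keyword -> acc.replace(keyword, "\u300A$keyword\u300B") }

fun personalize(text: String, templateName: String, important: List<String> = emptyList()): String {
    val template = presetTemplates.firstOrNull { it.name == templateName } ?: presetTemplates.first()
    return template.wrap(emphasize(text, important))
}

fun main() {
    println(personalize("Meet at the east gate at 7 pm", "sweet", important = listOf("7 pm")))
}
```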
With the intelligent editing method described above, when a long passage of text needs to be edited, the ordinary edit box jumps to the intelligent edit box, where the text can be copied, pasted, selectively deleted, personalized and otherwise edited; on receiving the corresponding instruction, the ordinary edit box is returned to and the edited content is sent, which greatly facilitates the editing of long passages of text and improves the user experience.
Referring to fig. 3, an embodiment of the present invention further provides a mobile terminal, which includes a processor 1080 and a memory 1020, the memory 1020 storing a computer program; the processor 1080 executes the computer program to implement the steps of any one of the above methods. The mobile terminal of this embodiment is the carrier of the intelligent edit box and the ordinary edit box in the above embodiments.
For convenience of explanation, only the parts related to the embodiments of the present invention are shown, and specific technical details are not disclosed. The mobile terminal may be any terminal device, including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer and the like; the following description takes a mobile phone as an example:
Fig. 3 is a block diagram illustrating a partial structure of a mobile phone related to the mobile terminal according to an embodiment of the present invention. Referring to fig. 3, the mobile phone includes: a Radio Frequency (RF) circuit 1010, a memory 1020, an input unit 1030, a display unit 1040, a sensor 1050, an audio circuit 1060, a wireless fidelity (WiFi) module 1070, a processor 1080, and a power supply 1090. Those skilled in the art will appreciate that the handset structure shown in fig. 3 is not limiting and may include more or fewer components than those shown, combine some components, or arrange the components differently.
The following describes each component of the mobile phone in detail with reference to fig. 3:
The RF circuit 1010 may be used for receiving and transmitting signals during information transmission and reception or during a call. In particular, it receives downlink information from a base station and passes it to the processor 1080 for processing, and it transmits uplink data to the base station. In general, the RF circuit 1010 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer and the like. In addition, the RF circuit 1010 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS) and the like.
The memory 1020 can be used for storing software programs and modules, and the processor 1080 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1020. The memory 1020 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data (such as audio data or a phone book) created according to the use of the mobile phone, and the like. Further, the memory 1020 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 1030 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 1030 may include a touch panel 1031 and other input devices 1032. The touch panel 1031, also referred to as a touch screen, may collect touch operations by a user (e.g., operations by a user on or near the touch panel 1031 using any suitable object or accessory such as a finger, a stylus, etc.) and drive corresponding connection devices according to a preset program. Alternatively, the touch panel 1031 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1080, and can receive and execute commands sent by the processor 1080. In addition, the touch panel 1031 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 1030 may include other input devices 1032 in addition to the touch panel 1031. In particular, other input devices 1032 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, a joystick, or the like.
The display unit 1040 may be used to display information input by a user or information provided to the user, as well as various menus of the cellular phone, an application interface of an application program, and the like. The Display unit 1040 may include a Display panel 1041, and optionally, the Display panel 1041 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-emitting diode (OLED), or the like. Further, the touch panel 1031 can cover the display panel 1041, and when the touch panel 1031 detects a touch operation on or near the touch panel 1031, the touch operation is transmitted to the processor 1080 to determine the type of the touch event, and then the processor 1080 provides a corresponding visual output on the display panel 1041 according to the type of the touch event. Although in fig. 3, the touch panel 1031 and the display panel 1041 are two separate components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 1031 and the display panel 1041 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 1050, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1041 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1041 and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 1060, a speaker 1061 and a microphone 1062 may provide an audio interface between the user and the mobile phone. The audio circuit 1060 can transmit the electrical signal converted from received audio data to the speaker 1061, where it is converted into a sound signal and output; conversely, the microphone 1062 converts collected sound signals into electrical signals, which are received by the audio circuit 1060 and converted into audio data; the audio data are then output to the processor 1080 for processing and sent via the RF circuit 1010 to, for example, another mobile phone, or output to the memory 1020 for further processing. In this embodiment of the application, voice input can be performed through the microphone while in the intelligent edit box, and the input speech is converted into text for editing.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help the user to send and receive e-mail, browse web pages, access streaming media, etc. through the WiFi module 1070, which provides wireless broadband internet access for the user. Although fig. 3 shows the WiFi module 1070, it is understood that it does not belong to the essential constitution of the handset, and can be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 1080 is a control center of the mobile phone, connects various parts of the whole mobile phone by using various interfaces and lines, and executes various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1020 and calling data stored in the memory 1020, thereby integrally monitoring the mobile phone. Optionally, processor 1080 may include one or more processing units; preferably, the processor 1080 may integrate an application processor, which handles primarily the operating system, user interfaces, applications, etc., and a modem processor, which handles primarily the wireless communications. It is to be appreciated that the modem processor described above may not be integrated into processor 1080.
The handset also includes a power source 1090 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 1080 via a power management system to manage charging, discharging, and power consumption via the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored; when the computer program is executed by a processor, an intelligent editing method is implemented, comprising: receiving an editing instruction from a user and entering an ordinary edit box; when a condition for triggering entry into an intelligent edit box is met, jumping from the ordinary edit box to the intelligent edit box; receiving information edited by the user in the intelligent edit box; and receiving a completion instruction, returning to the ordinary edit box, and saving the content edited in the intelligent edit box.
In one embodiment, the step of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met includes: judging whether the ordinary edit box contains edited content; if the ordinary edit box contains edited content, saving the content in the ordinary edit box into the intelligent edit box and entering the intelligent edit box; and if the ordinary edit box does not contain edited content, directly entering the intelligent edit box.
In one embodiment, the step of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met further includes: judging whether the current ordinary edit box is the ordinary edit box returned to after the completion instruction was received; if so, not automatically entering the intelligent edit box; if not, further judging whether the number of words entered in the ordinary edit box reaches a preset word count; if the number of words entered in the ordinary edit box reaches the preset word count, automatically entering the intelligent edit box; and if the number of words entered in the ordinary edit box does not reach the preset word count, not automatically entering the intelligent edit box.
In one embodiment, the step of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met further includes: receiving an instruction, input by the user, for entering the intelligent edit box; and jumping from the ordinary edit box to the intelligent edit box.
In one embodiment, the step of receiving the completion instruction, returning to the ordinary edit box and saving the content edited in the intelligent edit box includes: if the completion instruction is received, saving the content edited in the intelligent edit box into the ordinary edit box, and returning to the ordinary edit box.
In one embodiment, after the steps of receiving the completion instruction, returning to the ordinary edit box and saving the content edited in the intelligent edit box, the method further includes: after the ordinary edit box is returned to, automatically sending out the content in the ordinary edit box.
In one embodiment, the step of receiving the information edited by the user in the intelligent edit box includes: timing the editing time of the information edited by the user; and when the editing time reaches a preset time, automatically caching the information edited by the user, clearing the editing time and restarting the timing.
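A minimal sketch of this timed auto-cache, assuming a 30-second preset interval and a cacheDraft callback that persists the draft (both are assumptions, not part of the disclosure):

```kotlin
import kotlin.concurrent.fixedRateTimer

class DraftAutoSaver(
    presetIntervalMs: Long = 30_000L,            // assumed preset editing time (30 s)
    private val cacheDraft: (String) -> Unit     // assumed hook, e.g. write the draft to local storage
) {
    @Volatile
    var currentDraft: String = ""                // updated by the intelligent edit box as the user types

    private val timer = fixedRateTimer(
        name = "intelligent-edit-box-autosave",
        daemon = true,
        initialDelay = presetIntervalMs,
        period = presetIntervalMs
    ) {
        // The editing time has reached the preset value: cache the draft; the fixed-rate timer
        // then effectively clears the count and starts timing the next interval.
        cacheDraft(currentDraft)
    }

    fun stop() = timer.cancel()
}
```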
In one embodiment, the step of receiving the information edited by the user in the intelligent edit box further includes: performing personalized processing on the information edited by the user, the personalized processing including importing the information edited by the user into a preset template, or processing the information according to a received personalization operation input by the user.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database or another medium provided herein and used in the embodiments may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, apparatus, article, or method that includes the element.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (10)

1. An intelligent editing method, comprising:
receiving an editing instruction from a user, and entering an ordinary edit box;
when a condition for triggering entry into an intelligent edit box is met, jumping from the ordinary edit box to the intelligent edit box;
receiving information edited by the user in the intelligent edit box;
and receiving a completion instruction, returning to the ordinary edit box, and saving the content edited in the intelligent edit box.
2. The intelligent editing method according to claim 1, wherein the step of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met comprises:
judging whether the ordinary edit box contains edited content;
if the ordinary edit box contains edited content, saving the content in the ordinary edit box into the intelligent edit box, and entering the intelligent edit box;
and if the ordinary edit box does not contain edited content, directly entering the intelligent edit box.
3. The intelligent editing method according to claim 1 or 2, wherein the step of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met further comprises:
judging whether the current ordinary edit box is the ordinary edit box returned to after the completion instruction was received;
if so, not automatically entering the intelligent edit box;
if not, further judging whether the number of words entered in the ordinary edit box reaches a preset word count;
if the number of words entered in the ordinary edit box reaches the preset word count, automatically entering the intelligent edit box;
and if the number of words entered in the ordinary edit box does not reach the preset word count, not automatically entering the intelligent edit box.
4. The intelligent editing method according to claim 1, wherein the step of jumping from the ordinary edit box to the intelligent edit box when the condition for triggering entry into the intelligent edit box is met further comprises:
receiving an instruction, input by the user, for entering the intelligent edit box; and jumping from the ordinary edit box to the intelligent edit box.
5. The intelligent editing method according to claim 1, wherein the step of receiving the completion instruction, returning to the ordinary edit box, and saving the content edited in the intelligent edit box comprises:
if the completion instruction is received, saving the content edited in the intelligent edit box into the ordinary edit box, and returning to the ordinary edit box.
6. The intelligent editing method according to claim 5, further comprising, after the steps of receiving the completion instruction, returning to the ordinary edit box, and saving the content edited in the intelligent edit box:
automatically sending out the content in the ordinary edit box.
7. The intelligent editing method according to claim 1, wherein the step of receiving the information edited by the user in the intelligent edit box comprises:
timing the editing time of the information edited by the user;
and when the editing time reaches a preset time, automatically caching the information edited by the user, clearing the editing time and restarting the timing.
8. The intelligent editing method according to claim 1, wherein the step of receiving the information edited by the user in the intelligent edit box further comprises:
performing personalized processing on the information edited by the user, the personalized processing comprising importing the information edited by the user into a preset template, or processing the information according to a received personalization operation input by the user.
9. A mobile terminal, comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 8.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 8.
CN201811428098.5A 2018-11-27 2018-11-27 Intelligent editing method, mobile terminal and computer readable storage medium Withdrawn CN111221661A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811428098.5A CN111221661A (en) 2018-11-27 2018-11-27 Intelligent editing method, mobile terminal and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN111221661A true CN111221661A (en) 2020-06-02

Family

ID=70827428

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811428098.5A Withdrawn CN111221661A (en) 2018-11-27 2018-11-27 Intelligent editing method, mobile terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111221661A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102314300A (en) * 2010-07-08 2012-01-11 广东国笔科技股份有限公司 A kind of intelligent switching system and method
CN103809948A (en) * 2012-11-12 2014-05-21 三亚中兴软件有限责任公司 Mobile application edit-box application method and device based on event monitoring
CN103064758A (en) * 2012-12-14 2013-04-24 广东欧珀移动通信有限公司 Edited content timing saving method for mobile terminal device and mobile terminal device
CN104142911A (en) * 2013-05-08 2014-11-12 腾讯科技(深圳)有限公司 Text information inputting method and device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113791771A (en) * 2021-08-16 2021-12-14 成都摹客科技有限公司 Editing method and system of composite component, storage medium and electronic equipment
CN113867581A (en) * 2021-09-13 2021-12-31 维沃移动通信有限公司 Content editing method and device and electronic equipment
WO2023036237A1 (en) * 2021-09-13 2023-03-16 维沃移动通信有限公司 Content editing method and apparatus and electronic device
CN114063855A (en) * 2021-11-12 2022-02-18 北京字跳网络技术有限公司 Content input method, device, equipment and medium of instant communication software

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200602