CN112732134A - Information identification method, mobile terminal and storage medium

Info

Publication number
CN112732134A
Authority
CN
China
Prior art keywords
area
information
screen
mobile terminal
instruction
Prior art date
Legal status
Pending
Application number
CN202011531213.9A
Other languages
Chinese (zh)
Inventor
杨涵
Current Assignee
Shenzhen Microphone Holdings Co Ltd
Shenzhen Transsion Holdings Co Ltd
Original Assignee
Shenzhen Microphone Holdings Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Microphone Holdings Co Ltd
Priority to CN202011531213.9A
Publication of CN112732134A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Abstract

The application provides an information identification method, a mobile terminal and a storage medium. The method comprises the following steps: acquiring a trigger instruction, wherein the trigger instruction is used for selecting an area to be identified; executing an information identification instruction based on the area to be identified, and identifying the information content in the area to be identified; and determining and/or displaying an information processing shortcut key according to the information content. The information content in the area to be identified comprises picture content and/or text content. After determining the information content, the mobile terminal determines the information processing shortcut key according to the text content and/or the picture content in the information content. The method improves the intelligence of the mobile terminal and the user experience.

Description

Information identification method, mobile terminal and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to an information identification method, a mobile terminal, and a storage medium.
Background
In the process of human-computer interaction, information is usually presented to a user through the screen of the mobile terminal. The information presented on the screen typically includes information that is useful to the user and information that is not. The user often needs to perform a corresponding operation using the useful information. For example, when the useful information is a link, the user may need to acquire the link and open it in the corresponding application.
When the user determines that the screen contains useful information, that information is typically selected with a cursor. When the useful information needs to be applied in another application scenario, the user can transfer it there by copying and pasting.
However, this way of using the useful information makes the information difficult to acquire and inconvenient to use.
The foregoing description is provided for general background information and is not admitted to be prior art.
Disclosure of Invention
In view of the above technical problems, the present application provides an information identification method, a mobile terminal, and a storage medium, which enable a user to use information displayed on the screen more conveniently.
In order to solve the above technical problem, the present application provides an information identification method, applied to a mobile terminal, including:
s101, acquiring a trigger instruction, wherein the trigger instruction is used for selecting an area to be identified;
s102, executing an information identification instruction based on the area to be identified, and identifying information content in the area to be identified;
s103, determining and/or displaying the information processing shortcut key according to the information content.
Optionally, the method further comprises at least one of:
when the trigger instruction is triggered on different display interfaces, the information processing shortcut key jumps to different links and/or executes different preset instructions;
the triggering instruction is triggered on at least one of the following display interfaces: a screen locking interface, a negative one screen, an application interface, a main interface and an expansion interface.
Optionally, the step S101 includes at least one of:
one finger presses the screen, the other finger moves on the screen, and the trigger instruction determines the area to be identified according to the moving track of the other finger on the screen;
pressing the screen with one finger, and determining the full-screen area of the screen as the area to be identified;
long-pressing the screen with two fingers, and determining the area between the two fingers as the area to be identified;
pressing the screen with more than two fingers, and determining the minimum area containing the finger positions as the area to be identified;
determining whether the trigger instruction is triggered according to the number of consecutive clicks, and when the trigger instruction is triggered, determining the area selected by an identification selection area selection box as the area to be identified;
when the triggering instruction is to draw a circle on the screen (optionally, the circle may be a closed ring or a curve that is not closed but is close to a ring), determining the minimum rectangle containing the circle as the area to be identified.
Optionally, the step S101 includes displaying at least one identification selection area selection box, and after S101, the method further includes: and acquiring an adjusting instruction of the selection frame of the identification selection area, wherein the adjusting instruction is used for adjusting the size and the position of the selection frame of the identification selection area.
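As a hedged illustration of how such an adjustment instruction could be applied, the following Kotlin sketch moves and resizes a selection rectangle; the parameter names and the clamping behaviour are assumptions, not taken from this application:

```kotlin
import android.graphics.Rect

// Hypothetical sketch: apply an adjustment instruction to the identification selection area selection box.
// dx/dy move the box; dw/dh grow or shrink it.
fun adjustSelectionBox(box: Rect, dx: Int, dy: Int, dw: Int, dh: Int, screen: Rect): Rect {
    val adjusted = Rect(box.left + dx, box.top + dy, box.right + dx + dw, box.bottom + dy + dh)
    adjusted.intersect(screen)   // keep the box inside the screen; no-op if it no longer overlaps the screen
    return adjusted
}
```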
Optionally, the identifying information content in the area to be identified includes:
acquiring the information content of the area to be identified in the selection frame of the identification selection area;
and identifying the information content in the area to be identified according to a preset identification algorithm.
Optionally, determining the information processing shortcut key comprises at least one of:
determining key information in the information content according to the information content and/or preset key information, and determining an information processing shortcut key according to the key information;
and acquiring foreground application, and determining the information processing shortcut key according to the foreground application.
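A hedged Kotlin sketch of the key-information matching is given below; the regular expressions and shortcut labels are illustrative assumptions, since the application does not prescribe concrete patterns:

```kotlin
// Illustrative only: map recognized text content to candidate information processing shortcut keys.
data class Shortcut(val label: String, val payload: String)

private val keyInfoPatterns = mapOf(
    "Open link" to Regex("""https?://\S+"""),
    "Dial number" to Regex("""\b\d{3,4}[- ]?\d{7,8}\b"""),   // assumed placeholder phone-number pattern
    "Send email" to Regex("""[\w.+-]+@[\w.-]+\.\w+""")
)

fun determineShortcuts(recognizedText: String): List<Shortcut> =
    keyInfoPatterns.flatMap { (label, pattern) ->
        pattern.findAll(recognizedText).map { match -> Shortcut(label, match.value) }.toList()
    }
```

In this sketch each piece of matched key information yields one candidate shortcut key; a foreground-application check, as described above, could further filter or reorder the candidates.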
Optionally, displaying the information processing shortcut key comprises at least one of:
when the information processing shortcut key is displayed in the form of a button, at least one of the following is included: the button is displayed below the selection area frame, the button is displayed inside the selection area frame, the button is displayed on the intelligent panel, and the intelligent panel is an area on the screen;
when the information processing shortcut key is displayed in the form of a card, at least one of the following is included: displaying the card in all or part of the area of the screen, wherein the card comprises at least one of the following components:
when the card is displayed on the screen in an expanded mode, clicking the card jumps to another link and/or executes a preset instruction; when the card is displayed on the screen in an overlapped mode, clicking the card expands it or jumps to the display page of the card; when the card is displayed on the screen in a hidden mode, the card is attached to the edge of the screen showing brief information, and clicking the brief information of the card pops up the complete information of the card.
Optionally, at least one of the following is included:
determining the classification of the information processing shortcut keys according to the information processing shortcut keys, displaying classified classification icons on a screen according to the classification of the information processing shortcut keys, and displaying the classified information processing shortcut keys after clicking the classification icons;
before S101, the method further includes: acquiring key information, acquiring operations of jumping other links and/or executing preset instructions, and matching the key information with the operations;
after S103, the method further includes: and acquiring a shortcut key selection instruction, wherein the shortcut key selection instruction is used for indicating the selected information processing shortcut key, and jumping to a preset interface and/or executing a preset instruction according to the information processing shortcut key indicated by the shortcut key selection instruction.
Optionally, the preset interface includes a full screen display interface of the card, an application interface associated with the information processing shortcut key, and an application interface before the selection frame of the identification selection area is triggered.
The application also provides an information identification method, which is applied to the mobile terminal and comprises the following steps:
s101, acquiring a trigger instruction, wherein the trigger instruction is used for selecting an area to be identified on a screen;
s102, executing an information identification instruction based on the area to be identified, and identifying information content in the area to be identified;
s103, determining the information processing shortcut key according to the information content, and displaying the information processing shortcut key on a screen.
Optionally, when the trigger instruction is triggered on different display interfaces, the information processing shortcut key jumps to different links and/or executes different preset instructions, and the trigger instruction is triggered on at least one of the following display interfaces of the mobile terminal: a screen locking interface, a negative one screen, an application interface, a main interface and an expansion interface.
Optionally, the triggering instruction includes at least one of:
one finger presses the screen, the other finger moves on the screen, and the trigger instruction determines the area selected by the selection box of the identification selection area according to the moving track of the other finger on the screen;
pressing the screen by one finger, wherein the area selected by the selection box of the identification selection area is a default area, and the default area is a full screen area of the screen;
long-pressing the screen with two fingers, wherein the area selected by the identification selection area selection box is the area between the two fingers;
pressing the screen with more than two fingers, wherein the area selected by the identification selection area selection box is the minimum box containing the finger positions;
acquiring the number of continuous clicks on a screen, wherein the time interval between two adjacent clicks in the continuous clicking process is less than a preset time length, judging whether to trigger the trigger instruction according to the number of continuous clicks and the preset number of clicks, and when the trigger instruction is triggered, determining that the area selected by the selection frame of the identification selection area is a default area.
Optionally, when the trigger instruction is to draw a circle on the screen, the area selected by the identification selection area box is the smallest rectangle containing the circle, and the circle comprises a closed circle and a curve which is not closed but is close to the closed circle.
Optionally, after S101, the method further includes:
and acquiring an adjusting instruction of the selection frame of the identification selection area, wherein the adjusting instruction is used for adjusting the size and the position of the selection frame of the identification selection area.
Optionally, the S102 includes:
acquiring the information content of the area to be identified in the identification selection area selection frame according to an information identification instruction;
and identifying the information content in the area to be identified according to a preset identification algorithm, wherein the information content comprises the character content in the area to be identified and/or the image content in the area to be identified.
Optionally, the S103 includes:
determining key information in the information content according to the information content and preset key information;
and determining an information processing shortcut key according to the key information.
Optionally, the method further comprises:
and acquiring foreground application of the terminal, and determining the information processing shortcut key according to the foreground application.
Optionally, when the information processing shortcut key is presented in the form of a button, the method includes at least one of:
the button is shown below the identification selection area box;
the button is shown inside the identification selection area box;
the button is displayed on the intelligent panel, and the intelligent panel is an area on the screen.
Optionally, when the information processing shortcut key is displayed in the form of a card, the method includes:
and displaying the card in all or part of the area of the screen, wherein the display modes of the card comprise default overlapping, default expansion and default hiding.
Optionally, the card is displayed in all or part of the area of the screen, and the card includes at least one of the following:
when the card is displayed on a screen in a default expansion mode, clicking the card and then skipping other links and/or executing a preset instruction;
when the card is displayed on a screen in a default overlapping mode, the card is clicked and then expanded or a display page of the card is skipped;
when the card is displayed on the screen in a default hidden mode, the card is attached to the edge of the screen in a mode of displaying brief information, and the complete information of the card is displayed in a popping mode after the brief information of the card is clicked.
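Purely as an illustration, the three card display modes and their click behaviour could be modelled as follows; the enum and interface names are assumptions and do not appear in this application:

```kotlin
// Illustrative sketch of the three card display modes and their click behaviour.
enum class CardMode { EXPANDED, OVERLAPPED, HIDDEN }

interface Card {                                   // hypothetical card abstraction
    fun jumpToLinkOrRunPresetInstruction()         // expanded: a click acts on the card content directly
    fun expandOrOpenDisplayPage()                  // overlapped: a click unfolds the card or opens its page
    fun popUpFullInformation()                     // hidden: brief info at the screen edge; a click pops up the full card
}

fun onCardClicked(mode: CardMode, card: Card) = when (mode) {
    CardMode.EXPANDED   -> card.jumpToLinkOrRunPresetInstruction()
    CardMode.OVERLAPPED -> card.expandOrOpenDisplayPage()
    CardMode.HIDDEN     -> card.popUpFullInformation()
}
```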
Optionally, the method comprises:
determining the classification of the information processing shortcut keys according to the information processing shortcut keys;
and displaying the classified icons on a screen according to the classification of the information processing shortcut keys, and displaying the classified information processing shortcut keys after clicking the classified icons.
Optionally, after S103, the method further includes:
acquiring a shortcut key selection instruction, wherein the shortcut key selection instruction is used for indicating a selected information processing shortcut key, and the information processing shortcut key can be a button or a card;
and jumping to a preset interface and/or executing a preset instruction according to the information processing shortcut key indicated by the shortcut key selection instruction.
Optionally, the preset interface includes a full screen display interface of the card, an application interface associated with the information processing shortcut key, and an application interface before the selection frame of the identification selection area is triggered.
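As a hedged example of executing a selected information processing shortcut key on Android, the following sketch dispatches standard Intents; the mapping from shortcut kinds to actions is an illustrative assumption, not something this application prescribes:

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Illustrative dispatch of a selected information processing shortcut key.
// The shortcut "kind" strings are hypothetical placeholders.
fun executeShortcut(context: Context, kind: String, payload: String) {
    val intent = when (kind) {
        "Open link"   -> Intent(Intent.ACTION_VIEW, Uri.parse(payload))
        "Dial number" -> Intent(Intent.ACTION_DIAL, Uri.parse("tel:$payload"))
        "Send email"  -> Intent(Intent.ACTION_SENDTO, Uri.parse("mailto:$payload"))
        else          -> return   // unknown shortcut: stay on the current interface
    }
    context.startActivity(intent)
}
```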
Optionally, before S101, the method further includes:
acquiring key information;
obtaining the operation of jumping other links and/or executing a preset instruction;
matching the key information and the operation.
The present application further provides a mobile terminal, including: the device comprises a memory and a processor, wherein the memory is stored with an information identification program, and the information identification program realizes the steps of the method when being executed by the processor.
The present application also provides a computer storage medium having a computer program stored thereon, which, when being executed by a processor, carries out the steps of the method as described above.
As described above, the information identification method of the present application is applied to a mobile terminal, and after acquiring a trigger instruction of a user, generates an identification selection area selection box according to the trigger instruction; acquiring the content in a selection frame of the identification selection area on the screen according to the information identification instruction, and determining the area selected by the selection frame of the identification selection area as the area to be identified; the information content in the area to be identified comprises picture content and/or character content; after the information content is determined, determining an information processing shortcut key according to the text content and/or the picture content in the information content; alternatively, the information processing shortcut key may be displayed in the form of a button or in the form of a card. Through the mode, the user can realize the quick execution of the function by selecting the information processing shortcut key, the intelligence of the mobile terminal is improved, and the user experience is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the description of the embodiments will be briefly described below, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic hardware structure diagram of a mobile terminal implementing various embodiments of the present application;
fig. 2 is a communication network system architecture diagram according to an embodiment of the present application;
fig. 3 is a flowchart illustrating an information identification method according to a first embodiment;
FIG. 4 is a diagram illustrating a recognition box display according to a first embodiment;
FIG. 5 is a schematic diagram of a smart panel display according to a first embodiment;
FIG. 6 is a schematic view of a card display according to a first embodiment;
FIG. 7 is a schematic view showing a classified picture display of an information processing shortcut key according to the first embodiment;
fig. 8 is a flowchart showing an information identification method according to a second embodiment;
fig. 9 is a schematic configuration diagram showing an information identifying apparatus according to a first embodiment;
fig. 10 is a schematic configuration diagram showing an information identifying apparatus according to a second embodiment.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings. With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the recitation of an element by the phrase "comprising a ..." does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element. Further, where similarly named elements, features, or components in different embodiments of the disclosure may have the same or different meanings, their particular meaning should be determined by their interpretation in the specific embodiment or by further context within that embodiment.
It will be understood that, depending on the context, the word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining". Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, items, species, and/or groups thereof. The terms "or," "and/or," and "including at least one of the following," as used herein, are to be construed as inclusive, meaning any one or any combination. For example, "includes at least one of: A, B, C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C"; likewise, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C". An exception to this definition will occur only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not bound to the exact order shown and may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily performed at the same moment, but may be performed at different times and in different orders, and may be performed alternately or in turn with other steps or with sub-steps or stages of other steps.
It should be noted that step numbers such as S101 and S102 are used herein for the purpose of more clearly and briefly describing the corresponding contents, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform S102 first and then S101 in specific implementations, but these steps should be within the scope of the present application.
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for the convenience of description of the present application, and have no specific meaning in themselves. Thus, "module", "component" or "unit" may be used mixedly.
The mobile terminal may be implemented in various forms. For example, the mobile terminal described in the present application may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
The following description will be given taking a mobile terminal as an example, and it will be understood by those skilled in the art that the configuration according to the embodiment of the present application can be applied to a fixed type terminal in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present application, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and optionally, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access 2000(Code Division Multiple Access 2000, CDMA2000), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Frequency Division duplex Long Term Evolution (FDD-LTE), and Time Division duplex Long Term Evolution (TDD-LTE), etc.
WiFi belongs to short-distance wireless transmission technology, and the mobile terminal can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 102, and provides wireless broadband internet access for the user. Although fig. 1 shows the WiFi module 102, it is understood that it does not belong to the essential constitution of the mobile terminal, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and may process such sounds into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Optionally, the light sensor includes an ambient light sensor that may adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Alternatively, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Optionally, the touch detection device detects a touch orientation of a user, detects a signal caused by a touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Optionally, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited thereto.
Alternatively, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, and optionally, the program storage area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, optionally, the application processor mainly handles operating systems, user interfaces, application programs, etc., and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the mobile terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication Network system according to an embodiment of the present disclosure, where the communication Network system is an LTE system of a universal mobile telecommunications technology, and the LTE system includes User Equipment (UE) 201, Evolved UMTS Terrestrial Radio Access Network (E-UTRAN) 202, Evolved Packet Core Network (EPC) 203, and an IP service 204 of an operator, which are in communication connection in sequence.
Optionally, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Alternatively, the eNodeB2021 may be connected with other enodebs 2022 through a backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include a Mobility Management Entity (MME) 2031, a Home Subscriber Server (HSS) 2032, other MMEs 2033, a Serving Gateway (SGW) 2034, a packet data network gateway (PDN Gate Way, PGW)2035, and a Policy and Charging Rules Function (PCRF) 2036, and the like. Optionally, the MME2031 is a control node that handles signaling between the UE201 and the EPC203, providing bearer and connection management. HSS2032 is used to provide registers to manage functions such as home location register (not shown) and holds subscriber specific information about service characteristics, data rates, etc. All user data may be sent through SGW2034, PGW2035 may provide IP address assignment for UE201 and other functions, and PCRF2036 is a policy and charging control policy decision point for traffic data flow and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IP Multimedia Subsystem (IMS) or other IP services, and the like.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above mobile terminal hardware structure and communication network system, various embodiments of the present application are provided.
First embodiment
Referring to fig. 3, fig. 3 is a flowchart of an information identification method according to an embodiment of the present application. On the basis of the embodiments shown in fig. 1 and fig. 2, and as shown in fig. 3, with a mobile terminal as the execution subject, the method of this embodiment may include the following steps:
s101, acquiring a trigger instruction, wherein the trigger instruction is used for selecting an area to be identified on a screen.
In this embodiment, after acquiring a trigger instruction from the user, the mobile terminal generates an identification selection area selection box according to the trigger instruction. On the screen, the content in the identification selection area selection box is the content identified in S102, that is, the content that the user needs the mobile terminal to process. The content may be any content displayed on the screen while the user uses the mobile phone. In other words, the identification selection area selection box can be triggered by a trigger instruction while the mobile terminal is in any state.
The identification selection area selection box may be as shown in fig. 4. Optionally, the area marked by the dashed box on the screen of the mobile terminal is the area selected by the identification selection area selection box. As shown in fig. 4, the screen further includes an identification button, and after completing the setting of the selection box, the user can proceed to the step executed in S102 by clicking the identification button.
In one example, the triggering instruction is triggered on at least one of the following display interfaces of the terminal: a screen locking interface, a negative one-screen, an application interface, a main interface, an expansion interface, and the like. Optionally, when the trigger instruction is triggered on different display interfaces, the information processing shortcut key jumps to different links and/or executes different preset instructions.
Optionally, the screen locking interface comprises a screen resting interface and a screen locking wallpaper interface.
Optionally, the application interface includes a display interface when the mobile terminal starts any application. The display interface comprises an application interface when the application occupies the whole screen, or an application interface when the application occupies a part of the screen, such as an application interface in a split screen state or an application interface when the application is displayed in a popup mode.
Optionally, the expansion interface is another display interface that may appear on the mobile terminal besides the display interfaces described above, such as the right screen of the iPhone 12.
In the present application, the display of the mobile terminal when the trigger instruction is triggered is not specifically limited. Namely, in the using process of the mobile terminal, the user can trigger the trigger instruction under any display interface. However, when the trigger instruction is triggered in a different display interface, the trigger instruction may determine the information processing shortcut key according to the display interface, in addition to determining the information processing shortcut key according to the content in the identification selection area selection box.
In one example, the trigger instruction includes at least one of a long press, a multiple click, a hard press, and a circle on the screen.
Optionally, the long press may mean that the user presses the screen for a preset duration. The multiple clicks may mean that the number of times the user clicks the screen reaches a preset number. The hard press may mean that the pressure with which the user presses the screen reaches a preset pressure value. Circling on the screen may include the user drawing any one of a closed figure, a semi-closed figure, a line, and the like on the screen.
In another example, the triggering instruction further includes one finger holding the screen and the other finger moving on the screen. At the moment, the trigger instruction determines the area selected by the selection area selection box according to the movement track of another finger on the screen, wherein the area is the minimum rectangle containing the movement track.
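A minimal sketch (not taken from this application) of computing that smallest rectangle from the recorded positions of the moving finger might look as follows:

```kotlin
import android.graphics.PointF
import android.graphics.Rect
import kotlin.math.roundToInt

// Smallest rectangle containing the movement track of the second finger (illustrative).
fun boundingRectOfTrack(track: List<PointF>): Rect {
    require(track.isNotEmpty()) { "movement track must contain at least one point" }
    val left = track.minOf { it.x }.roundToInt()
    val top = track.minOf { it.y }.roundToInt()
    val right = track.maxOf { it.x }.roundToInt()
    val bottom = track.maxOf { it.y }.roundToInt()
    return Rect(left, top, right, bottom)
}
```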
In this example, the trigger instruction of the mobile terminal may be set by the user according to a preset option. For example, the preset options may include the triggering manners mentioned in the above two examples. The user can select one or more trigger modes from the trigger modes as the trigger mode of the trigger instruction in the mobile terminal. Alternatively, the trigger instruction of the mobile terminal may be set by the user. For example, the user may customize the trigger instruction when drawing a "C" on the screen, when tapping the screen with a finger joint, etc.
For example, when the mobile terminal is in the screen locking interface, the user can trigger the trigger instruction by clicking the screen locking interface for multiple times to generate the selection frame of the identification selection area. Or when the mobile terminal is in negative one screen, the user can trigger a trigger instruction by long-pressing the negative one screen to generate a selection frame of the identification selection area. Or when the mobile terminal is in an application interface, a user can press the screen through one finger, and the other finger moves on the screen to trigger a trigger instruction, so as to generate a selection box of the identification selection area.
In the following, the present embodiment details a specific implementation manner of triggering a trigger instruction and generating a selection area box by a mobile terminal through a plurality of examples.
In one example, when the trigger instruction is a long press or a hard press on the screen, the manner in which the mobile terminal triggers the selection area selection box according to the trigger instruction may include at least one of the following manners:
when the screen is pressed with one finger for a long time or strongly, the area selected by the selection area selection box is identified as a default area. Optionally, the default area is a full screen area of the screen.
In this example, the user may trigger the trigger instruction by long-pressing with one finger or by pressing hard with one finger. When the user long-presses or presses hard, the screen of the mobile terminal can be showing any display interface. When the trigger instruction is triggered, the mobile terminal generates an identification selection area selection box according to the trigger instruction, and the area in the selection box can be determined according to the default area. The default area can be set by the user according to actual needs. For example, the default area may be the full-screen area of the screen. Alternatively, the default area may be a rectangle of a preset shape, such as a square whose side length equals the short side of the screen, with the user's long-press or hard-press position lying on a center line of the square. Alternatively, the default area may be a square or rectangle centered on the user's long-press or hard-press position.
When the screen is pressed by two fingers for a long time or strongly, the area selected by the selection area selection box is identified as the area between the two fingers.
In this example, the user may trigger the trigger instruction by pressing with two fingers for a long time or pressing with two fingers for a long time. When the user presses for a long time or strongly, the screen of the mobile terminal can be in any display interface. When the trigger instruction is triggered, the mobile terminal generates an identification selection area selection box according to the trigger instruction, and the area in the identification selection area selection box can be determined according to the positions of the two fingers.
For example, the upper left corner of the screen is used as the origin of coordinates. When both the horizontal and vertical coordinates of the two fingers differ, the area in the identification selection area selection box can be the area between the two fingers. At this time, the coordinate of the finger closer to the origin is the upper left corner of the area, and the coordinate of the finger farther from the origin is the lower right corner of the area. The mobile terminal can determine the rectangular area according to the upper left corner and the lower right corner, and that rectangular area is the area selected by the identification selection area selection box.
Alternatively, when both the horizontal and vertical coordinates of the two fingers differ, the area selected by the identification selection area selection box can be obtained by expanding the area between the two fingers according to the shape or aspect ratio of the preset identification selection area selection box.
Or, when the abscissa or the ordinate of the two fingers is the same, the connecting line of the two finger coordinates may be the central line of the selection area in the selection box of the identification selection area. For example, when the vertical coordinates of the two fingers coincide, the line connecting the two fingers is parallel to the upper boundary of the screen. At this time, the mobile terminal may determine a rectangular region between two fingers as the selected region by using the connection line as a central line of the selected region and using the length of the connection line as the side length of the selected region.
When more than two fingers are used for long pressing or the screen is pressed hard, the area selected by the selection area selection box is identified as the minimum rectangular box containing the positions of the fingers.
In this example, the user may trigger the trigger instruction by long-pressing with more than two fingers or by pressing hard with more than two fingers. When the user long-presses or presses hard, the screen of the mobile terminal can be showing any display interface. When the trigger instruction is triggered, the mobile terminal generates an identification selection area selection box according to the trigger instruction, and the area in the selection box can be determined according to the finger positions. For example, the identification selection area selection box may be the smallest rectangular box containing the positions of all fingers.
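For the two-finger and more-than-two-finger cases just described, a hedged Kotlin sketch of taking the smallest rectangle containing all pointer positions could look like this (the use of MotionEvent is an assumption about the touch pipeline):

```kotlin
import android.graphics.Rect
import android.view.MotionEvent

// Illustrative: when two or more fingers long-press, take the smallest rectangle
// containing every pointer position as the identification selection area.
fun selectionRectFromPointers(event: MotionEvent): Rect {
    var left = Int.MAX_VALUE; var top = Int.MAX_VALUE
    var right = Int.MIN_VALUE; var bottom = Int.MIN_VALUE
    for (i in 0 until event.pointerCount) {
        left = minOf(left, event.getX(i).toInt())
        top = minOf(top, event.getY(i).toInt())
        right = maxOf(right, event.getX(i).toInt())
        bottom = maxOf(bottom, event.getY(i).toInt())
    }
    return Rect(left, top, right, bottom)
}
```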
In another example, when the trigger instruction is to click the screen multiple times, the terminal may trigger to identify the selection area box according to the trigger instruction.
In this example, the user may trigger the trigger instruction by clicking the screen multiple times while the mobile terminal is in any display interface. The mobile terminal may preset a fixed number of clicks; for example, when the number of consecutive screen clicks by the user reaches 3, the mobile terminal triggers the trigger instruction. The user can also set this fixed number of clicks on the mobile terminal. Alternatively, the mobile terminal may preset a click threshold; for example, when the number of consecutive screen clicks by the user exceeds 3, the mobile terminal triggers the trigger instruction. The user can likewise set this click threshold on the mobile terminal.
Taking a preset fixed number of clicks as an example, the step of triggering a trigger instruction by the mobile terminal according to the number of clicks of the user and generating a selection frame for identifying the selection area may include:
step 1, obtaining the continuous clicking times on a screen, wherein the time interval between two adjacent clicks in the continuous clicking process is less than the preset time length.
In this step, when the mobile terminal detects that the user has clicked the screen, it starts to accumulate the number of consecutive clicks. The mobile terminal determines the longest allowed interval between two clicks, namely the preset duration, according to the preset consecutive-click rate. When the mobile terminal detects the next click within the preset duration, it counts that click and starts timing the following one. Otherwise, when the mobile terminal does not detect the next click within the preset duration, it resets the consecutive-click count to zero; when it detects a later click, it starts accumulating the count again.
When the mobile terminal does not detect a further click within the preset duration, it outputs the current number of consecutive clicks to Step 2, where the number of consecutive clicks is compared with the preset number of clicks. Alternatively, the mobile terminal may compare the accumulated number of consecutive clicks with the preset number of clicks after each click.
Step 2: determining whether the trigger instruction is triggered according to the number of consecutive clicks and the preset number of clicks.
In this step, the mobile terminal determines, from the comparison result, whether the number of consecutive clicks is greater than or equal to the preset number of clicks. If it is, the trigger instruction is triggered. Otherwise, the mobile terminal continues to accumulate clicks, or waits for the next click and starts counting again according to Step 1.
Step 3: when the trigger instruction is triggered, determining the area selected by the identification selection area selection box as a default area. Optionally, the default area is the full-screen area of the screen.
In this step, when the trigger instruction is triggered, the mobile terminal may select the area on the screen according to a preset identification selection area selection box. Optionally, the preset identification selection area selection box may be determined from a default area, which may be, for example, the full-screen area of the screen. Alternatively, the preset identification selection area selection box may be set by the user.
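A hedged Kotlin sketch of the click counting in Steps 1 and 2 above is shown below; the preset number of clicks and the preset interval are placeholder values, not values fixed by this application:

```kotlin
import android.os.SystemClock

// Illustrative consecutive-click counter for triggering the instruction (Steps 1-2 above).
class ConsecutiveClickDetector(
    private val presetClickCount: Int = 3,          // assumed preset number of clicks
    private val presetIntervalMs: Long = 400L       // assumed longest interval between two clicks
) {
    private var count = 0
    private var lastClickAt = 0L

    /** Call on every screen click; returns true when the trigger instruction fires. */
    fun onClick(): Boolean {
        val now = SystemClock.uptimeMillis()
        count = if (now - lastClickAt <= presetIntervalMs) count + 1 else 1  // reset if the gap was too long
        lastClickAt = now
        if (count >= presetClickCount) {
            count = 0                                // consume the trigger
            return true
        }
        return false
    }
}
```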
In yet another example, when the triggering instruction is to draw a circle on the screen, the area selected by the identification selection area selection box is the smallest rectangle containing the circle, and the circle includes a closed circle and a curve that is not closed but is close to being closed.
In this example, the user may trigger the trigger instruction by drawing a circle on the screen. Alternatively, the circle drawn on the screen by the user may be a closed ring shape, a closed irregular figure, an unclosed but nearly closed curve, an unclosed but nearly closed irregular figure, or the like. After the mobile terminal triggers the trigger instruction, the mobile terminal can determine the selection area selection box according to the circle drawn on the screen by the user. Alternatively, the identification selection area box may be the smallest rectangle containing the circle drawn by the user.
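As a rough illustration of how such a selection box could be derived, the Kotlin sketch below computes the smallest rectangle containing the sampled touch points of the drawn stroke. The Point and Rect types are simple stand-ins introduced for the example and are not part of this application.

```kotlin
// Illustrative sketch: derive the identification selection area as the smallest
// rectangle containing the (possibly not fully closed) circle drawn by the user.
// The Point and Rect types here are simple stand-ins, not a specific platform API.
data class Point(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun boundingRectOf(stroke: List<Point>): Rect? {
    if (stroke.isEmpty()) return null
    return Rect(
        left = stroke.minOf { it.x },
        top = stroke.minOf { it.y },
        right = stroke.maxOf { it.x },
        bottom = stroke.maxOf { it.y }
    )
}

fun main() {
    // A rough circular gesture sampled at a few touch points.
    val stroke = listOf(Point(100f, 50f), Point(150f, 100f), Point(100f, 150f), Point(50f, 100f))
    println(boundingRectOf(stroke)) // Rect(left=50.0, top=50.0, right=150.0, bottom=150.0)
}
```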
S102, based on the area to be identified, executing an information identification instruction, and identifying information content in the area to be identified.
In this embodiment, the information identification instruction is an instruction generated when the user clicks the identification button in fig. 4. The mobile terminal executes the information identification instruction based on the area to be identified, acquires the content in the identification selection area selection box on the screen, and determines the area selected by the identification selection area selection box as the area to be identified. The information content in the area to be identified includes picture content and/or text content.
In one example, after the mobile terminal acquires the information content in the area to be identified, the step of identifying the information content may include:
step 1, acquiring information content of an area to be identified in an identification selection area selection frame according to an information identification instruction.
In this step, after receiving the information identification instruction, the mobile terminal obtains the start coordinates, width, and height of the identification selection area selection box. Alternatively, the mobile terminal may obtain the upper-left corner coordinates and the lower-right corner coordinates of the identification selection area selection box. Optionally, the mobile terminal may use the upper-left corner of the screen as the origin of coordinates. The mobile terminal also obtains a screenshot of the current display interface. Then, the mobile terminal crops the information content of the area to be identified from the screenshot of the current display interface according to the start coordinates, width, and height of the identification selection area selection box, where this information content is part of the picture content of the screenshot of the current display interface (a sketch of this cropping and recognition flow is given after step 2 below).
Step 2: identify the information content in the area to be identified according to a preset identification algorithm, where the information content includes the text content in the area to be identified and/or the image content in the area to be identified.
In this step, after determining the information content of the area to be identified, the mobile terminal inputs the information content into a preset recognition model. The recognition model may be a text recognition model or a target object recognition model. The recognition model is obtained in advance by an administrator through pre-training according to user requirements.
The recognition model may be stored in the mobile terminal. After the mobile terminal determines the information content of the area to be identified, the mobile terminal may directly input the information content of the area to be identified into the corresponding model. And the mobile terminal identifies the text content and/or the picture content in the information content through the identification model.
Alternatively, the recognition model may be stored in the cloud or on a server. After the mobile terminal determines the information content of the area to be identified, the mobile terminal can upload the information content of the area to be identified to the cloud or the server, and the cloud or the server then inputs the information content of the area to be identified into the corresponding model. The mobile terminal receives the information content identification result fed back by the cloud or the server.
Alternatively, the training algorithms of the character recognition model and the target object recognition model may be existing algorithms or improved algorithms.
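A standalone Kotlin sketch of this flow follows: the screenshot is cropped to the identification selection area selection box and the cropped region is handed to text and object recognizers. The ScreenImage class and the recognizer interfaces are hypothetical placeholders for whatever on-device or cloud recognition service is used; none of these names come from this application.

```kotlin
// Standalone sketch of step 1 and step 2 above: crop the current-screen screenshot to
// the identification selection area, then hand the cropped content to a recognizer.
// ScreenImage and the recognizer interfaces are hypothetical placeholders, not a real API.
data class SelectionBox(val startX: Int, val startY: Int, val width: Int, val height: Int)
class ScreenImage(val width: Int, val height: Int, val pixels: IntArray)

interface TextRecognizer { fun recognize(image: ScreenImage): List<String> }
interface ObjectRecognizer { fun recognize(image: ScreenImage): List<String> }

/** Crops the screenshot to the selection box, clamped to the screen bounds. */
fun crop(screenshot: ScreenImage, box: SelectionBox): ScreenImage {
    val left = box.startX.coerceIn(0, screenshot.width)
    val top = box.startY.coerceIn(0, screenshot.height)
    val right = (box.startX + box.width).coerceIn(left, screenshot.width)
    val bottom = (box.startY + box.height).coerceIn(top, screenshot.height)
    val w = right - left
    val h = bottom - top
    val out = IntArray(w * h)
    for (row in 0 until h) {
        System.arraycopy(screenshot.pixels, (top + row) * screenshot.width + left, out, row * w, w)
    }
    return ScreenImage(w, h, out)
}

/** Runs both recognizers (locally or via a cloud call behind the same interface). */
fun identify(screenshot: ScreenImage, box: SelectionBox,
             text: TextRecognizer, objects: ObjectRecognizer): Pair<List<String>, List<String>> {
    val region = crop(screenshot, box)
    return text.recognize(region) to objects.recognize(region)
}
```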
In another example, after the mobile terminal acquires the information content in the area to be identified, the step of identifying the information content may further include:
step 1, the mobile terminal acquires the information content of the area to be identified in the selection frame of the identification selection area according to the information identification instruction.
Step 2: when the trigger instruction is performed with a single finger, the mobile terminal may acquire the contact coordinates where the finger touches the screen at the moment the user triggers the trigger instruction.
Step 3: the mobile terminal identifies, through the target object recognition model, the picture content near the contact coordinates within the information content of the area to be identified.
Step 4: the mobile terminal identifies, through the text recognition model, the text content near the contact coordinates within the information content of the area to be identified.
Step 5: the mobile terminal takes the identified picture content and text content as the information content of the area to be identified, where the information content includes the text content in the area to be identified and/or the image content in the area to be identified. A sketch of this nearest-to-contact selection is given below.
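A minimal Kotlin sketch of keeping only the recognized regions closest to the touch point; the RecognizedRegion type and the 200-pixel radius are assumptions made for illustration.

```kotlin
import kotlin.math.hypot

// Illustrative sketch of steps 2-5 above: keep the recognized text or picture regions
// that lie nearest to the coordinate where the single finger touched the screen.
// RecognizedRegion and the 200-pixel radius are assumptions made for the example.
data class RecognizedRegion(val label: String, val centerX: Float, val centerY: Float)

fun regionsNearContact(
    regions: List<RecognizedRegion>,
    contactX: Float,
    contactY: Float,
    maxDistance: Float = 200f
): List<RecognizedRegion> =
    regions
        .map { it to hypot(it.centerX - contactX, it.centerY - contactY) }
        .filter { (_, distance) -> distance <= maxDistance }
        .sortedBy { (_, distance) -> distance }
        .map { (region, _) -> region }
```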
S103, determining and/or displaying the information processing shortcut key according to the information content.
In this embodiment, after determining the information content, the mobile terminal determines and/or displays the information processing shortcut key according to the text content and/or the picture content in the information content. Alternatively, the information processing shortcut key may be displayed in the form of a button or in the form of a card. The information processing shortcut key is used for helping a user to quickly jump to the link corresponding to the information content or is used for helping the user to quickly execute a preset instruction corresponding to the information content. The user can realize intelligent processing of the information in the screen by clicking the information processing shortcut key.
For example, when the information content is English and the default language of the user's mobile terminal is French, the user can directly obtain the French translation of the English information content in a small card through the translation key among the information processing shortcut keys. Alternatively, the information processing shortcut key may also help the user jump to a web-page translation page or to the translation page of a translation application. Meanwhile, when jumping to the corresponding translation page, the mobile terminal may also input the information content into the area to be translated and submit the translation request.
In one example, the step of determining, by the mobile terminal, the information processing shortcut key according to the information content may specifically include:
step 1, determining key information in the information content according to the information content and preset key information.
In this step, after the mobile terminal acquires the text information and/or picture information in the information content, the mobile terminal may detect whether the information content includes the key information according to preset key information. The key information may be a keyword, a language category, and the like; for example, the key information may be that the language category is English. The key information may also include a date and time, a telephone number, and the like.
And 2, determining the information processing shortcut key according to the key information.
In this step, after the mobile terminal determines the key information, the mobile terminal may determine the corresponding information processing shortcut key according to the key information. For example, when the key information is that the language category is English, the information processing shortcut key may include translation. When the key information is a date and time, the information processing shortcut key may include adding an event to the calendar, adding an alarm clock, or adding a reminder. When the key information is a telephone number, the information processing shortcut key may include dialing the number, saving it to the address book, and the like. A sketch of this mapping is given below.
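As an illustration of steps 1 and 2, the Kotlin sketch below uses simple regular expressions as the preset key information and maps each match to shortcut names; the patterns and labels are assumptions for the example, not definitions from this application.

```kotlin
// Minimal sketch of steps 1 and 2 above, assuming simple regular expressions as the
// preset key information; the patterns and shortcut names are illustrative only.
data class Shortcut(val name: String)

private val phonePattern = Regex("""\+?\d[\d\s-]{6,}\d""")
private val datePattern = Regex("""\d{4}-\d{1,2}-\d{1,2}""")
private val englishPattern = Regex("""[A-Za-z]{3,}(\s+[A-Za-z]+)+""")

fun shortcutsFor(textContent: String): List<Shortcut> {
    val shortcuts = mutableListOf<Shortcut>()
    if (englishPattern.containsMatchIn(textContent)) shortcuts += Shortcut("Translate")
    if (datePattern.containsMatchIn(textContent)) {
        shortcuts += Shortcut("Add calendar event")
        shortcuts += Shortcut("Add reminder")
    }
    if (phonePattern.containsMatchIn(textContent)) {
        shortcuts += Shortcut("Dial")
        shortcuts += Shortcut("Save to contacts")
    }
    return shortcuts
}

fun main() {
    println(shortcutsFor("Call me on 2021-01-05 at 138-0013-8000, meeting agenda attached"))
}
```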
In another example, the mobile terminal is in an application interface when the trigger instruction is triggered. In this case, on the basis of the above example, the information processing shortcut key may also be determined according to the foreground application. The step of the mobile terminal determining the information processing shortcut key according to the information content may then further include:
Step 3: acquire the foreground application of the terminal and determine the information processing shortcut key according to the foreground application.
In this step, the mobile terminal acquires the foreground application and determines the information processing shortcut key corresponding to that foreground application. On the basis of step 2, the mobile terminal displays the information processing shortcut key from step 3 together with the information processing shortcut key from step 2 on the screen.
For example, when the foreground application is a game application, the information processing shortcut keys corresponding to the game application may include screen capture, screen recording, system acceleration, and the like.
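A small Kotlin sketch of step 3, assuming a lookup table from the foreground application's package name to app-specific shortcuts that are merged with the content-based shortcuts from step 2; the package names and labels are illustrative only.

```kotlin
// Illustrative sketch of step 3: app-specific shortcuts looked up from the current
// foreground application and merged with the content-based shortcuts from step 2.
// The package names and shortcut labels are examples, not values from the patent.
data class Shortcut(val name: String)

private val foregroundShortcuts = mapOf(
    "com.example.game" to listOf(Shortcut("Screen capture"), Shortcut("Screen recording"), Shortcut("System acceleration")),
    "com.example.browser" to listOf(Shortcut("Global favourite"), Shortcut("Translate page"))
)

fun mergedShortcuts(foregroundPackage: String, contentShortcuts: List<Shortcut>): List<Shortcut> =
    (contentShortcuts + foregroundShortcuts[foregroundPackage].orEmpty()).distinct()
```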
After the information processing shortcut key is determined through the above steps, the mobile terminal displays the information processing shortcut key on the screen. Optionally, the information processing shortcut keys may be presented in the form of buttons, or in the form of cards. The following examples describe various ways of presenting the information processing shortcut keys:
in one example, when the information processing shortcut key is presented in the form of a button, the method for presenting the information processing shortcut key may include at least one of:
the buttons are shown below the identification selection area box.
In this example, when the identification selection area selection box is not full screen, the buttons may be arranged in a row and displayed below the identification selection area selection box. When the mobile terminal includes many buttons, the mobile terminal may present the buttons in two or more rows.
The buttons are shown inside the identification selection area box.
In this example, the buttons may also be arranged in a row, presented within the recognition selection area box. When the mobile terminal includes a plurality of buttons, the mobile terminal may present the buttons in two or more rows.
The buttons are shown in an intelligent panel, where the intelligent panel is an area on the screen.
In this example, the mobile terminal may also generate an intelligent panel. For example, as shown in fig. 5, the intelligent panel may be placed below the identification selection area selection box, or anywhere on the screen. The intelligent panel contains the buttons determined in the above steps. Because the area of the intelligent panel is limited, when not all buttons can be shown in the intelligent panel at once, the intelligent panel can be slid left and right to display all the buttons. Optionally, the arrow in fig. 5 shows the sliding direction of the intelligent panel.
In another example, when the information processing shortcut key is displayed in the form of a card, the card corresponding to the information processing shortcut key may be displayed in all or a part of the area of the screen. Optionally, the card presentation mode comprises default superposition, default expansion and default hiding. Optionally, in different display modes, the display condition of the card may include:
and when the card is displayed on the screen in a default expansion mode, clicking the card and then jumping to other links and/or executing a preset instruction.
In this example, the mobile terminal may present the cards in an expanded manner on the screen after generating the respective cards. For example, as shown in fig. 6, a word-selection editing card, a global favorites card, and an add-schedule card are shown on the screen. When the mobile terminal also includes other cards, the user can view the other cards by sliding the screen up and down. Optionally, the sliding direction may be as indicated by the arrow in fig. 6.
When a user clicks the card, the mobile terminal executes a jump instruction corresponding to the card or executes a preset instruction of the card according to a shortcut key selection instruction generated when the user clicks the card.
When the cards are displayed on the screen in the default-superimposed mode, clicking a card expands it or jumps to the card's display page.
In this example, the mobile terminal may display the cards in a superimposed manner on the screen after generating the respective cards. When the cards are superimposed, the cards on the upper layers show only their card headers. The card header is the area of the card, shown in fig. 6, that carries information such as "word-selection editing", "global favorites", and "add schedule". The last card is displayed in an expanded manner. During display, when the mobile terminal includes a large number of cards, the card headers displayed in the form of tags can show all the cards by reducing the font size, increasing the overlapping areas, and the like.
When the user clicks on the card head, the card is expanded. When the user clicks on a place other than the card, the card returns to the superimposed state. When the user clicks the expanded card, the mobile terminal executes a jump instruction corresponding to the card or executes a preset instruction of the card according to a shortcut key selection instruction generated when the user clicks the card.
When the cards are displayed on the screen in the default-hidden mode, each card is attached to the edge of the screen showing only its brief information, and clicking the brief information of a card pops up the complete information of that card.
In this example, after the cards are generated, the mobile terminal may display the brief information of the cards in a hidden manner at one boundary of the screen. For example, the mobile terminal may present the card headers of the cards shown in fig. 6 in the form of tags at the right boundary of the mobile terminal. The cards may also be presented at the left, upper, or lower boundary of the mobile terminal. During display, when the mobile terminal includes a large number of cards, the card headers displayed in the form of tags can show the brief information of all the cards by reducing the font size, overlapping the tags, and the like.
When the user clicks the card head, the mobile terminal pops up the card in an expanded state on the screen. When the user clicks the expanded card, the mobile terminal executes a jump instruction corresponding to the card or executes a preset instruction of the card according to a shortcut key selection instruction generated when the user clicks the card.
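The three display modes and the click transitions described above can be modelled roughly as follows in Kotlin; the sealed class, the state names, and the simplified outside-click behaviour are assumptions made for illustration.

```kotlin
// Sketch of the three card display modes described above (expanded, stacked, hidden)
// and the transitions driven by user clicks; the model is a simplification for illustration.
sealed class CardState {
    object Expanded : CardState()   // default-expanded: clicking jumps or executes the preset instruction
    object Stacked : CardState()    // default-superimposed: only the card header is visible
    object Hidden : CardState()     // default-hidden: brief information attached to the screen edge
}

data class Card(val title: String, val state: CardState)

/** Clicking a stacked or hidden card first expands it; clicking an expanded card acts on it. */
fun onCardClicked(card: Card, executeOrJump: (Card) -> Unit): Card =
    when (card.state) {
        CardState.Expanded -> { executeOrJump(card); card }
        CardState.Stacked, CardState.Hidden -> card.copy(state = CardState.Expanded)
    }

/** Clicking outside an expanded card returns it to the superimposed state. */
fun onClickOutside(card: Card): Card =
    if (card.state == CardState.Expanded) card.copy(state = CardState.Stacked) else card
```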
In another example, the mobile terminal may also classify the information processing shortcut keys. The mobile terminal displays classification icons in the display interface, and when the user clicks a classification icon, the mobile terminal displays the information processing shortcut keys corresponding to that classification. The information processing shortcut keys may then be displayed in the form of buttons or in the form of cards. Optionally, the process of displaying the information processing shortcut keys by classification may include:
step 1, determining the classification of the information processing shortcut keys according to the information processing shortcut keys.
In this step, the mobile terminal may classify the information processing shortcut keys determined according to the information content into at least one classification according to the preset classification of the information processing shortcut keys.
And 2, displaying classified icons on a screen according to the classification of the information processing shortcut keys, and displaying the classified information processing shortcut keys after clicking the classified icons.
In this step, the mobile terminal may determine the classification icon of each classification according to the classification of the information processing shortcut key. For example, as shown in fig. 7, the mobile terminal classifies the information processing shortcut keys into 5 categories. Each of the 5 classes has its corresponding class icon.
The mobile terminal displays the classification icons on the screen. The classification icons may be arranged in a circular manner. For example, as shown in fig. 7, the 5 classification icons are arranged in a circular manner.
After the user clicks the classification icon, the mobile terminal can display the information processing shortcut key corresponding to the classification icon. The mobile terminal can directly display the information processing shortcut key corresponding to the classification icon on the screen. Or, the mobile terminal may jump to a display interface and display the information processing shortcut key corresponding to the classification icon on the display interface. The information processing shortcut key can be displayed in the form of a key or a card.
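As a rough illustration of the circular arrangement, the Kotlin sketch below places N classification icons at equal angles on a circle; the centre coordinates and radius are arbitrary example values, not values from this application.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Sketch of the circular arrangement of classification icons: N icons are placed at
// equal angles on a circle around a centre point. Radius and centre are illustrative.
data class IconPosition(val x: Float, val y: Float)

fun circularLayout(iconCount: Int, centerX: Float, centerY: Float, radius: Float): List<IconPosition> =
    (0 until iconCount).map { i ->
        val angle = 2.0 * Math.PI * i / iconCount
        IconPosition(
            x = centerX + (radius * cos(angle)).toFloat(),
            y = centerY + (radius * sin(angle)).toFloat()
        )
    }

fun main() {
    // Five classification icons, as in the example of fig. 7.
    circularLayout(5, centerX = 540f, centerY = 960f, radius = 300f).forEach(::println)
}
```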
According to the information identification method, after the mobile terminal obtains the trigger instruction of the user, the mobile terminal generates the selection frame of the identification selection area according to the trigger instruction. And the mobile terminal acquires the content in the selection box of the identification selection area on the screen according to the information identification instruction, and determines the area selected by the selection box of the identification selection area as the area to be identified. The information content in the area to be identified comprises picture content and/or text content. After the mobile terminal determines the information content, the mobile terminal determines an information processing shortcut key according to the text content and/or the picture content in the information content. Alternatively, the information processing shortcut key may be displayed in the form of a button or in the form of a card. The information processing shortcut key is used for helping a user to quickly jump to the link corresponding to the information content or is used for helping the user to quickly execute a preset instruction corresponding to the information content. In the application, the information processing shortcut key is determined by identifying the information content in the selection area selection frame, so that a user can realize quick execution of functions by selecting the information processing shortcut key, the intelligence of the mobile terminal is improved, and the user experience is improved.
Second embodiment
Fig. 8 shows a flowchart of an information identification method according to an embodiment of the present application. On the basis of the embodiments shown in fig. 1 to fig. 7, as shown in fig. 8, with a mobile terminal as an execution subject, the method of the embodiment may include the following steps:
S201, key information is obtained.
In this embodiment, the mobile terminal may obtain the key information. The key information may be obtained by user input, or the preset key information may be obtained by user selection.
S202, obtaining an operation of jumping to other links and/or executing a preset instruction.
In this embodiment, the mobile terminal may obtain an operation of jumping to another link, an operation of executing a preset instruction, or an operation of jumping to another link and executing a preset instruction. Optionally, in the process of the mobile terminal acquiring the operation, the user performs the corresponding operation according to an operation prompt of the mobile terminal, and the mobile terminal then records the operation performed by the user.
And S203, matching key information and operation.
In this embodiment, after the mobile terminal obtains the key information and the operation according to S201 and S202, the user may match the preset key information with the operation. And when the matching is completed, the key information and the operation form an information processing shortcut key.
In one example, after the key information is acquired, the user may record the operation corresponding to that key information, and the mobile terminal then completes the matching of the key information and the operation.
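A minimal Kotlin sketch of S201-S203, assuming a simple registry that pairs the user-supplied key information with the recorded operation to form a user-defined information processing shortcut; all class and field names are illustrative.

```kotlin
// Minimal sketch of S201-S203: the user-supplied key information is paired with the
// recorded operation (jump to a link and/or execute a preset instruction) to form a
// user-defined information processing shortcut. All names are illustrative.
data class Operation(val linkToOpen: String? = null, val presetInstruction: String? = null)
data class UserShortcut(val keyInformation: String, val operation: Operation)

class ShortcutRegistry {
    private val shortcuts = mutableListOf<UserShortcut>()

    /** S203: match the key information obtained in S201 with the operation obtained in S202. */
    fun match(keyInformation: String, operation: Operation) {
        shortcuts += UserShortcut(keyInformation, operation)
    }

    /** Later, content containing the key information offers the matched shortcut. */
    fun shortcutsFor(informationContent: String): List<UserShortcut> =
        shortcuts.filter { informationContent.contains(it.keyInformation, ignoreCase = true) }
}

fun main() {
    val registry = ShortcutRegistry()
    registry.match("tracking number", Operation(linkToOpen = "logistics://query"))
    println(registry.shortcutsFor("Your tracking number is YT123456"))
}
```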
S204, a trigger instruction is obtained, the trigger instruction is used for triggering a selection frame of the identification selection area, and the selection frame of the identification selection area is used for selecting the area to be identified on the screen.
Optionally, step S204 is similar to the step S101 in the embodiment of fig. 3, and this embodiment is not described herein again.
S205, acquiring an adjusting instruction for identifying the selection area selection frame, wherein the adjusting instruction is used for adjusting the size and the position of the selection area selection frame.
In this embodiment, after the mobile terminal completes generation of the identification selection area selection box according to S204, the user may adjust the identification selection area selection box according to actual needs. For example, the user may adjust the starting position of the recognition selection area box, adjust the width and/or height of the recognition selection area box.
The mobile terminal generates an adjustment instruction for the identification selection area selection box according to the user's adjustment, and then adjusts the size and position of the identification selection area selection box according to the adjustment instruction.
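A small Kotlin sketch of applying such an adjustment instruction to the selection box; the data classes and the clamping to the screen bounds are assumptions for illustration.

```kotlin
// Sketch of S205: applying an adjustment instruction that changes the starting position
// and/or the width and height of the identification selection area selection box.
// The data classes and clamping behaviour are assumptions for illustration.
data class SelectionBox(val startX: Int, val startY: Int, val width: Int, val height: Int)
data class AdjustInstruction(val dx: Int = 0, val dy: Int = 0, val dWidth: Int = 0, val dHeight: Int = 0)

fun adjust(box: SelectionBox, instruction: AdjustInstruction, screenWidth: Int, screenHeight: Int): SelectionBox {
    val width = (box.width + instruction.dWidth).coerceIn(1, screenWidth)
    val height = (box.height + instruction.dHeight).coerceIn(1, screenHeight)
    val startX = (box.startX + instruction.dx).coerceIn(0, screenWidth - width)
    val startY = (box.startY + instruction.dy).coerceIn(0, screenHeight - height)
    return SelectionBox(startX, startY, width, height)
}

fun main() {
    val box = SelectionBox(startX = 100, startY = 200, width = 400, height = 300)
    // Drag the box 50 px to the right and make it 100 px taller.
    println(adjust(box, AdjustInstruction(dx = 50, dHeight = 100), screenWidth = 1080, screenHeight = 2340))
}
```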
And S206, identifying the information content in the area to be identified according to the information identification instruction, wherein the area to be identified is the area selected by the identification selection area selection box.
And S207, determining an information processing shortcut key according to the information content, and displaying the information processing shortcut key on a screen, wherein the information processing shortcut key is used for jumping to other links and/or executing a preset instruction.
Optionally, steps S206 and S207 are similar to steps S102 and S103 in the embodiment of fig. 3, and this embodiment is not described here again.
And S208, acquiring a shortcut key selection instruction, wherein the shortcut key selection instruction is used for indicating the selected information processing shortcut key, and the information processing shortcut key can be a button or a card.
In this embodiment, the mobile terminal obtains a shortcut key selection instruction, where the shortcut key selection instruction is an information processing shortcut key selected after a user views a button or a card of the information processing shortcut key.
S209, jumping to a preset interface and/or executing a preset instruction according to the information processing shortcut key indicated by the shortcut key selection instruction.
In this embodiment, the mobile terminal determines the information processing shortcut key selected by the user according to the shortcut key selection instruction, determines from that shortcut key the operation of jumping to a preset interface and/or executing a preset instruction, and then performs the operation of jumping to the preset interface and/or executing the preset instruction.
In one example, the preset interface includes a full screen display interface of the card, an application interface associated with the information processing shortcut key, and an application interface before the selection frame of the identification selection area is triggered.
In this example, when the operation of jumping to the preset interface is included in the information processing shortcut key, the mobile terminal may jump to a display interface such as a full-screen display interface and an associated application interface of the card according to the instruction. Optionally, when the mobile terminal jumps to the associated application interface, the associated application may be presented in a full screen, half screen, or pop-up form in the application interface.
When the information processing shortcut key includes an operation of executing a preset instruction, the mobile terminal may, after the execution of the preset instruction is completed, jump back to the application interface that was displayed before the identification selection area selection box was triggered.
When the information processing shortcut key includes both a jump to a preset interface and the execution of a preset instruction, the mobile terminal may jump to the application interface of the associated application after the execution of the preset instruction is completed.
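A rough Kotlin sketch of S208-S209 and the example above; the SelectedShortcut type and the navigation callback are assumptions introduced for the example.

```kotlin
// Sketch of S208-S209 and the example above: the selected shortcut may carry a preset
// instruction, a target interface, or both; after an instruction-only shortcut runs,
// the terminal returns to the application interface it was on before the selection box
// was triggered. The data class and navigation callback are illustrative assumptions.
data class SelectedShortcut(val presetInstruction: (() -> Unit)? = null, val targetInterface: String? = null)

fun handleShortcutSelection(
    shortcut: SelectedShortcut,
    previousInterface: String,
    navigateTo: (String) -> Unit
) {
    shortcut.presetInstruction?.invoke()                  // execute the preset instruction, if any
    when {
        shortcut.targetInterface != null -> navigateTo(shortcut.targetInterface)  // jump to preset interface
        shortcut.presetInstruction != null -> navigateTo(previousInterface)       // instruction only: return
    }
}
```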
According to the information identification method of this embodiment, the mobile terminal can acquire key information and an operation of jumping to other links and/or executing a preset instruction. After the mobile terminal acquires the key information and the operation, the preset key information is matched with the operation. After the mobile terminal acquires a trigger instruction of the user, it generates the identification selection area selection box according to the trigger instruction. The mobile terminal obtains an adjustment instruction for the identification selection area selection box, where the adjustment instruction is used to adjust the size and position of the selection box. The mobile terminal acquires the content in the identification selection area selection box on the screen according to the information identification instruction, and determines the area selected by the identification selection area selection box as the area to be identified. The information content in the area to be identified includes picture content and/or text content. After the mobile terminal determines the information content, it determines the information processing shortcut key according to the text content and/or picture content in the information content. Optionally, the information processing shortcut key may be displayed in the form of a button or in the form of a card. The mobile terminal obtains the information processing shortcut key selected by the user after viewing the buttons or cards, and generates a shortcut key selection instruction. According to the shortcut key selection instruction, the mobile terminal determines the selected information processing shortcut key, determines the operation of jumping to a preset interface and/or executing a preset instruction in that shortcut key, and performs the operation. In this application, the information processing shortcut key is determined by identifying the information content in the identification selection area selection box, so that the user can quickly execute functions by selecting the information processing shortcut key, which improves the intelligence of the mobile terminal and the user experience. In addition, by obtaining the key information and the operation, the user can generate information processing shortcut keys according to actual needs, which improves the user experience and the flexibility of the information processing shortcut keys. Meanwhile, the identification selection area selection box can be adjusted by obtaining the adjustment instruction, so that the setting of the selection box better meets user needs, which further improves the user experience and the flexibility of adjusting the identification selection area selection box.
Fig. 9 is a schematic structural diagram of an information recognition apparatus according to an embodiment of the present application, and as shown in fig. 9, an information recognition apparatus 10 according to the present embodiment is used for implementing an operation corresponding to a mobile terminal in any one of the method embodiments described above, where the information recognition apparatus 10 according to the present embodiment includes:
the acquiring module 11 is configured to acquire a trigger instruction, where the trigger instruction is used to trigger an identification selection area box, and the identification selection area box is used to select an area to be identified on a screen.
And the identification module 12 is configured to identify information content in the area to be identified according to the information identification instruction, where the area to be identified is an area selected by the identification selection area selection box.
And the determining module 13 is configured to determine an information processing shortcut key according to the information content, and display the information processing shortcut key on a screen, where the information processing shortcut key is used to jump to another link and/or execute a preset instruction.
In one example, the triggering instruction is triggered on at least one of the following display interfaces of the mobile terminal: the method comprises a screen locking interface, a negative screen, an application interface, a main interface and an expansion interface, wherein optionally, when a trigger instruction is triggered on different display interfaces, an information processing shortcut key jumps to different links and/or executes different preset instructions.
In one example, the trigger instruction includes at least one of a long press, a multiple click, a hard press, and a circle on the screen.
In another example, the trigger instruction further includes pressing the screen with one finger and moving another finger on the screen. The trigger instruction determines the area selected by the identification selection area selection box according to the movement track of the other finger on the screen, where the area is the minimum rectangle containing the movement track.
In one example, when the trigger instruction is a long press or a hard press on the screen, at least one of the following is included: when one finger presses the screen long or hard, the area selected by the identification selection area selection box is a default area; when two fingers press the screen long or hard, the area selected by the identification selection area selection box is the area between the two fingers; when more than two fingers press the screen long or hard, the area selected by the identification selection area selection box is the minimum rectangular box containing the positions of the fingers.
In another example, when the trigger instruction is clicking the screen multiple times, the method includes: obtaining the number of consecutive clicks on the screen, where the time interval between two adjacent clicks in the consecutive clicking process is less than a preset duration; determining whether the trigger instruction is triggered according to the number of consecutive clicks and the preset number of clicks; and when the trigger instruction is triggered, determining the area selected by the identification selection area selection box as a default area.
Optionally, the default area is a full screen area of the screen.
In yet another example, when the trigger instruction is drawing a circle on the screen, the area selected by the identification selection area selection box is the smallest rectangle containing the circle, where the circle includes a closed circle and a curve that is not closed but nearly closed.
In one example, the identification module is specifically configured to obtain, according to the information identification instruction, the information content of the area to be identified in the identification selection area selection box, and to identify the information content in the area to be identified according to a preset identification algorithm, where the information content includes the text content in the area to be identified and/or the image content in the area to be identified.
In one example, the determining module is specifically configured to determine the key information in the information content according to the information content and preset key information. And determining the information processing shortcut key according to the key information.
In another example, the determining module is further configured to obtain a foreground application of the terminal, and determine the information processing shortcut key according to the foreground application.
In one example, when the information processing shortcut key is presented in the form of a button, at least one of the following is included: the buttons are shown below the identification selection area box. The buttons are shown inside the identification selection area box. The button is shown in the intelligent panel, and the intelligent panel is the region on the screen.
In one example, when the information processing shortcut key is presented in the form of a card, the method includes: and displaying the cards in all or partial areas of the screen, wherein the display modes of the cards comprise default superposition, default expansion and default hiding.
In one example, the card is displayed in all or part of the area of the screen, and the card comprises at least one of the following: and when the card is displayed on the screen in a default expansion mode, clicking the card and then jumping to other links and/or executing a preset instruction. And when the card is displayed on the screen in a default overlapping mode, the card is expanded or the display page of the card is jumped after the card is clicked. When the card is displayed on the screen in a default hidden mode, the card is attached to the edge of the screen in a mode of displaying the brief information, and the complete information of the displayed card pops up after clicking the brief information of the card.
In one example, the manner of displaying the information processing shortcut key further includes: and determining the classification of the information processing shortcut keys according to the information processing shortcut keys. And displaying the classified icons on the screen according to the classification of the information processing shortcut keys, and displaying the classified information processing shortcut keys after clicking the classified icons.
Optionally, the sort icons are arranged in a circular manner.
The information identification apparatus 10 provided in the embodiment of the present application may implement the method embodiment, and for details of the implementation principle and the technical effect, reference may be made to the method embodiment, which is not described herein again.
Fig. 10 is a schematic structural diagram of an information recognition apparatus according to an embodiment of the present application, and based on the embodiment shown in fig. 9, as shown in fig. 10, the information recognition apparatus 10 of the present embodiment is used to implement an operation corresponding to a mobile terminal in any one of the method embodiments described above, and the information recognition apparatus 10 of the present embodiment further includes:
and the adjusting module 14 is configured to obtain an adjusting instruction for identifying the selection area frame, where the adjusting instruction is used to adjust the size and the position of the identification selection area frame.
And the execution module 15 is configured to obtain a shortcut key selection instruction, where the shortcut key selection instruction is used to indicate the selected information processing shortcut key, and the information processing shortcut key may be a button or a card; and to jump to a preset interface and/or execute a preset instruction according to the information processing shortcut key indicated by the shortcut key selection instruction.
And the preprocessing module 16 is used for acquiring preset key information. And acquiring the operation of jumping other links and/or executing a preset instruction. Matching key information and operations.
In one example, the preset interface includes a full screen display interface of the card, an application interface associated with the information processing shortcut key, and an application interface before the selection frame of the identification selection area is triggered.
The information identification apparatus 10 provided in the embodiment of the present application may implement the method embodiment, and for details of the implementation principle and the technical effect, reference may be made to the method embodiment, which is not described herein again.
The application also provides a mobile terminal device, the terminal device includes a memory and a processor, the memory stores an information identification program, and the information identification program is executed by the processor to implement the steps of the information identification method in any of the above embodiments.
The present application further provides a computer storage medium, in which an information identification program is stored, and the information identification program, when executed by a processor, implements the steps of the information identification method in any of the above embodiments.
In the embodiments of the mobile terminal and the computer storage medium provided in the present application, all technical features of the embodiments of the information identification method are included, and the expanding and explaining contents of the specification are basically the same as those of the embodiments of the method, and are not described herein again.
Embodiments of the present application also provide a computer program product, which includes computer program code, when the computer program code runs on a computer, the computer is caused to execute the method in the above various possible embodiments.
Embodiments of the present application further provide a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method in the above various possible embodiments.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the present application, the same or similar term concepts, technical solutions and/or application scenario descriptions will be generally described only in detail at the first occurrence, and when the description is repeated later, the detailed description will not be repeated in general for brevity, and when understanding the technical solutions and the like of the present application, reference may be made to the related detailed description before the description for the same or similar term concepts, technical solutions and/or application scenario descriptions and the like which are not described in detail later.
In the present application, each embodiment is described with emphasis, and reference may be made to the description of other embodiments for parts that are not described or illustrated in any embodiment.
The technical features of the technical solution of the present application may be arbitrarily combined, and for brevity of description, all possible combinations of the technical features in the embodiments are not described, however, as long as there is no contradiction between the combinations of the technical features, the scope of the present application should be considered as being described in the present application.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, a controlled terminal, or a network device) to execute the method of each embodiment of the present application.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (10)

1. An information identification method, characterized in that the method comprises:
s101, acquiring a trigger instruction, wherein the trigger instruction is used for selecting an area to be identified;
s102, executing an information identification instruction based on the area to be identified, and identifying information content in the area to be identified;
s103, determining and/or displaying the information processing shortcut key according to the information content.
2. The method of claim 1, comprising at least one of:
when the trigger instruction is triggered on different display interfaces, the information processing shortcut key jumps to different links and/or executes different preset instructions;
the triggering instruction is triggered on at least one of the following display interfaces: the system comprises a screen locking interface, a negative screen, an application interface, a main interface and an expansion interface.
3. The method according to claim 1, wherein the step S101 comprises at least one of:
one finger presses the screen, the other finger moves on the screen, and the trigger instruction determines the area to be identified according to the moving track of the other finger on the screen;
pressing the screen by one finger, and determining the full screen area of the screen as the area to be identified;
pressing the screen by two fingers for a long time, and determining an area between the two fingers as the area to be identified;
pressing the screen with more than two fingers, and determining a minimum area containing the positions of the fingers as the area to be identified;
judging whether the trigger instruction is triggered or not according to the continuous clicking times, and determining the area selected by the identification selection area selection frame as the area to be identified when the trigger instruction is triggered;
when the triggering instruction is to draw a circle on a screen, determining that the minimum rectangle containing the circle is the area to be identified.
4. The method according to any one of claims 1 to 3, wherein the step S101 includes displaying at least one identification selection area box, and the step S101 further includes: and acquiring an adjusting instruction of the selection frame of the identification selection area, wherein the adjusting instruction is used for adjusting the size and the position of the selection frame of the identification selection area.
5. The method according to any one of claims 1 to 3, wherein the identifying information content in the area to be identified comprises:
acquiring the information content of the area to be identified in the selection frame of the identification selection area;
and identifying the information content in the area to be identified according to a preset identification algorithm.
6. The method of claim 5, wherein determining information handling shortcuts comprises at least one of:
determining key information in the information content according to the information content and/or preset key information, and determining an information processing shortcut key according to the key information;
and acquiring foreground application, and determining the information processing shortcut key according to the foreground application.
7. A method according to any of claims 1 to 3, wherein displaying information handling shortcut keys comprises at least one of:
when the information processing shortcut key is displayed in the form of a button, at least one of the following is included: the button is displayed below the selection area frame, the button is displayed inside the selection area frame, the button is displayed on the intelligent panel, and the intelligent panel is an area on the screen;
when the information processing shortcut key is displayed in the form of a card, at least one of the following is included: displaying the card in all or part of the area of the screen, wherein the card comprises at least one of the following components:
when the cards are displayed on the screen in an expanded mode, clicking a card jumps to another link and/or executes a preset instruction; when the cards are displayed on the screen in a superimposed mode, clicking a card expands the card or jumps to the display page of the card; when the cards are displayed on the screen in a hidden mode, the cards are attached to the edge of the screen in a manner of showing brief information, and clicking the brief information of a card pops up and displays the complete information of the card.
8. The method of any one of claims 1 to 3, comprising at least one of:
determining the classification of the information processing shortcut keys according to the information processing shortcut keys, displaying classified classification icons on a screen according to the classification of the information processing shortcut keys, and displaying the classified information processing shortcut keys after clicking the classification icons;
before S101, the method further includes: acquiring key information, acquiring operations of jumping other links and/or executing preset instructions, and matching the key information with the operations;
after S103, the method further includes: and acquiring a shortcut key selection instruction, wherein the shortcut key selection instruction is used for indicating the selected information processing shortcut key, and jumping to a preset interface and/or executing a preset instruction according to the information processing shortcut key indicated by the shortcut key selection instruction.
9. A mobile terminal, characterized in that the mobile terminal comprises: memory, processor, wherein the memory has stored thereon an information recognition program which when executed by the processor implements the steps of the information recognition method according to any one of claims 1 to 8.
10. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, realizes the steps of the information identification method according to any one of claims 1 to 8.
CN202011531213.9A 2020-12-22 2020-12-22 Information identification method, mobile terminal and storage medium Pending CN112732134A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011531213.9A CN112732134A (en) 2020-12-22 2020-12-22 Information identification method, mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011531213.9A CN112732134A (en) 2020-12-22 2020-12-22 Information identification method, mobile terminal and storage medium

Publications (1)

Publication Number Publication Date
CN112732134A true CN112732134A (en) 2021-04-30

Family

ID=75604080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011531213.9A Pending CN112732134A (en) 2020-12-22 2020-12-22 Information identification method, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN112732134A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113849118A (en) * 2021-10-20 2021-12-28 锐捷网络股份有限公司 Image identification method applied to electronic whiteboard and related device
CN114995933A (en) * 2022-05-18 2022-09-02 深圳传音控股股份有限公司 Information display method, intelligent terminal and storage medium
WO2022241694A1 (en) * 2021-05-19 2022-11-24 深圳传音控股股份有限公司 Processing method, mobile terminal, and storage medium


Similar Documents

Publication Publication Date Title
CN108037893B (en) Display control method and device of flexible screen and computer readable storage medium
CN112732134A (en) Information identification method, mobile terminal and storage medium
CN107885448B (en) Control method for application touch operation, mobile terminal and readable storage medium
CN112068744A (en) Interaction method, mobile terminal and storage medium
CN113487705A (en) Image annotation method, terminal and storage medium
CN112363648A (en) Shortcut display method, terminal and computer storage medium
CN112181233A (en) Message processing method, intelligent terminal and computer readable storage medium
CN108628534B (en) Character display method and mobile terminal
CN111813493B (en) Method of operating component, terminal, and storage medium
CN113126844A (en) Display method, terminal and storage medium
CN112558826A (en) Shortcut operation method, mobile terminal and storage medium
CN112199141A (en) Message processing method, terminal and computer readable storage medium
CN109683796B (en) Interaction control method, equipment and computer readable storage medium
CN109710168B (en) Screen touch method and device and computer readable storage medium
CN112068743A (en) Interaction method, terminal and storage medium
CN114647623A (en) Folder processing method, intelligent terminal and storage medium
CN114138144A (en) Control method, intelligent terminal and storage medium
CN114398113A (en) Interface display method, intelligent terminal and storage medium
CN109683793B (en) Content turning display method and device and computer readable storage medium
CN113342246A (en) Operation method, mobile terminal and storage medium
CN114442886A (en) Data processing method, intelligent terminal and storage medium
CN109522064B (en) Interaction method and interaction device of portable electronic equipment with double screens
CN112561798A (en) Picture processing method, mobile terminal and storage medium
CN107526520B (en) Search interaction method and device and computer readable storage medium
CN113253896A (en) Interface interaction method, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination