CN112083872A - Picture processing method, mobile terminal and computer storage medium - Google Patents


Info

Publication number
CN112083872A
Authority
CN
China
Prior art keywords: information, selected area, application program, picture, click
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010972102.5A
Other languages
Chinese (zh)
Inventor
陈鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd
Priority to CN202010972102.5A
Publication of CN112083872A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/445 Program loading or initiating
    • G06F 9/44505 Configuring for program initiating, e.g. using registry, configuration files

Abstract

An embodiment of the invention provides a picture processing method, a mobile terminal, and a computer storage medium. The picture processing method includes: receiving a touch operation on a target picture and determining a selected area according to the touch operation; performing image recognition on the selected area; if a preset target object exists in the selected area, displaying at least one piece of associated label information according to the target object; and receiving operation information on the at least one piece of label information and starting a corresponding application program according to the operation information. The embodiment lets the user view picture-related information directly on the picture and conveniently open the corresponding application program, improving operation efficiency and user experience.

Description

Picture processing method, mobile terminal and computer storage medium
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to a picture processing method, a mobile terminal, and a computer storage medium.
Background
Mobile terminals are now widely used for photographing, and their portability makes it convenient to view pictures on them. When viewing a picture, however, the user can only zoom in or out, or edit the picture for special-effect processing; the available ways of processing a picture are monotonous.
Disclosure of Invention
An embodiment of the invention provides a picture processing method, a mobile terminal, and a computer storage medium that can quickly start a corresponding application program from a picture, improving operation efficiency and user experience.
A first aspect of an embodiment of the present invention provides a picture processing method, including:
receiving a touch operation on a target picture, and determining a selected area according to the touch operation;
performing image recognition on the selected area;
if a preset target object exists in the selected area, displaying at least one piece of associated label information according to the target object;
and receiving operation information on the at least one piece of label information, and starting a corresponding application program according to the operation information.
Optionally, receiving the touch operation on the target picture and determining the selected area according to the touch operation includes:
receiving a click operation on the target picture;
acquiring a click point of the click operation, and determining a circular area with the click point as the center and a preset distance as the radius;
and determining the circular area as the selected area.
Optionally, receiving the touch operation on the target picture and determining the selected area according to the touch operation includes:
receiving a sliding gesture operation on the target picture;
determining a movement track corresponding to the sliding gesture operation;
and determining the area inside the movement track as the selected area.
Optionally, performing image recognition on the selected area includes:
calling an image recognition component to recognize the selected area;
and if recognition fails, expanding the range of the selected area and recognizing the expanded selected area.
Optionally, the preset target object is a person, and if the preset target object exists in the selected area, displaying the at least one piece of associated label information according to the target object includes:
acquiring feature information of the person, wherein the feature information includes name and birthday information;
generating corresponding label information according to the feature information;
and displaying a label layer floating over the target picture, and displaying the label information on the label layer.
Optionally, the preset target object is a building, and if the preset target object exists in the selected area, displaying the at least one piece of associated label information according to the target object includes:
acquiring feature information of the building, wherein the feature information includes name and geographic location;
generating corresponding label information according to the feature information;
and displaying a label layer floating over the target picture, and displaying the label information on the label layer.
Optionally, receiving the operation information on the at least one piece of label information and starting the corresponding application program according to the operation information includes:
identifying the operation information, where the operation information includes a single-click operation and a double-click operation;
if the label information is a name, the single-click operation opens a social application program and enters the chat window corresponding to the name, and the double-click operation starts a phone application program and dials the contact corresponding to the name;
if the label information is birthday information, the single-click operation opens a schedule application program, and the double-click operation starts a reminder application program.
Optionally, receiving the operation information on the at least one piece of label information and starting the corresponding application program according to the operation information includes:
identifying the operation information, where the operation information includes a single-click operation and a double-click operation;
if the label information is a name, the single-click operation starts a search application program, and the double-click operation starts a comment application program;
if the label information is a geographic location, the single-click operation starts a map application program, and the double-click operation starts a transportation application program.
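One plausible way to realize the mapping in these claims is a dispatch table keyed by (label type, operation type). This is a sketch only; the key names and application names below are placeholders invented for illustration, not part of the patent:

```python
# Hypothetical dispatch table: (label type, click type) -> application to start.
DISPATCH = {
    ("name", "single_click"): "social_app",        # open chat window
    ("name", "double_click"): "phone_app",         # dial the contact
    ("birthday", "single_click"): "schedule_app",
    ("birthday", "double_click"): "reminder_app",
    ("geo_location", "single_click"): "map_app",
    ("geo_location", "double_click"): "transit_app",
}

def launch_for(label_type: str, click_type: str) -> str:
    """Return the application to start for a given label/operation pair."""
    try:
        return DISPATCH[(label_type, click_type)]
    except KeyError:
        raise ValueError(f"unsupported operation: {label_type}/{click_type}")
```

A table like this keeps the claim's pairing of label type and click type in one place, so adding a new label type does not touch the event-handling code.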
A second aspect of an embodiment of the present invention provides a mobile terminal, where the mobile terminal includes a processor and a memory;
the memory is used for storing an executable program;
the processor is used for executing the executable program to realize the picture processing method.
A third aspect of the embodiments of the present invention provides a computer storage medium storing an executable program which, when executed, implements the above picture processing method.
The embodiment of the invention has the following beneficial effects:
according to the picture processing method, the mobile terminal and the computer storage medium in the embodiment of the invention, the selected area is determined according to the touch operation of the user, the corresponding label information is displayed by identifying whether the image in the selected area is a preset object, and the corresponding application program is quickly started according to the label information selected by the user, so that the user can conveniently and visually check the information related to the picture, the user can conveniently start the corresponding application program, the operation efficiency is improved, and the user experience is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a wireless communication system of a mobile terminal according to an embodiment of the present invention;
fig. 3 is a flowchart of a method according to an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning in themselves. Thus, "module", "component", and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and a fixed terminal such as a Digital TV, a desktop computer, and the like. The following description will be given by way of example of a mobile terminal, and it will be understood by those skilled in the art that the construction according to the embodiment of the present invention can be applied to a fixed type terminal, in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000(Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex Long Term Evolution), and TDD-LTE (Time Division duplex Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mails, browse web pages, access streaming media, and the like, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is not an essential part of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode, and the processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sound (audio data) in a phone call mode, recording mode, or voice recognition mode, and process it into audio data. In the case of a phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. In particular, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited to these specific examples.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present invention, a communication network system on which the mobile terminal of the present invention is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication network system according to an embodiment of the present invention. The communication network system is an LTE (Long Term Evolution) system, which includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP services 204, which are communicatively connected in sequence.
Specifically, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Among them, the eNodeB2021 may be connected with other eNodeB2022 through backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving gateway) 2034, a PGW (PDN gateway) 2035, and a PCRF (Policy and Charging Rules Function) 2036, and the like. The MME2031 is a control node that handles signaling between the UE201 and the EPC203, and provides bearer and connection management. HSS2032 is used to provide registers to manage functions such as home location register (not shown) and holds subscriber specific information about service characteristics, data rates, etc. All user data may be sent through SGW2034, PGW2035 may provide IP address assignment for UE201 and other functions, and PCRF2036 is a policy and charging control policy decision point for traffic data flow and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present invention is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above mobile terminal hardware structure and communication network system, the present invention provides various embodiments of the method.
Fig. 3 is a flowchart of a method according to an embodiment of the present invention. In this embodiment, the picture processing method may be applied to the mobile terminal shown in fig. 1 or fig. 2; it is understood that the method may also be applied to electronic devices such as tablets and e-book readers. The picture processing method includes steps S301-S304.
In step S301, a touch operation on a target picture is received, and a selected area is determined according to the touch operation;
in step S302, image recognition is performed on the selected area;
in step S303, if a preset target object exists in the selected area, at least one piece of associated label information is displayed according to the target object;
in step S304, operation information on the at least one piece of label information is received, and the corresponding application program is started according to the operation information.
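Steps S302-S304 can be sketched as a minimal pipeline, assuming the selected area from S301 has already been computed. All callables below are hypothetical stand-ins for the recognition component, the label layer, and the application launcher, none of which the patent specifies:

```python
def process_picture(picture, region, recognize, make_labels, launch):
    """Sketch of steps S302-S304; the callables are illustrative stand-ins.

    recognize(picture, region) -> target object or None   (S302)
    make_labels(target)        -> label info entries       (S303)
    launch(labels)             -> started application      (S304)
    """
    target = recognize(picture, region)
    if target is None:
        # No preset target object in the selected area: nothing to display.
        return None
    labels = make_labels(target)
    return launch(labels)
```

The early return mirrors the flow: label display and application launch only happen when a preset target object is found in the selected area.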
The present embodiment is described in further detail below:
specifically, in step S301, the mobile terminal first displays the target picture. In general, a user may view a picture in an application such as a gallery, or view a photographed photo in a camera application, or view a picture in other software, which is not limited to this embodiment of the present invention. When a target picture is displayed on a display screen of the mobile terminal, receiving touch operation of a user on the target picture, wherein the touch operation can be in various forms, and can be click operation or sliding gesture operation. The clicking operation may also be a single click or a double click. And determining a final selected area according to the touch operation.
In one embodiment, the step S301 may further include steps S3011-S3013.
In step S3011, a click operation on the target picture is received;
in step S3012, a click point of the click operation is obtained, and a circular region is determined with the click point as a center and a preset distance as a radius;
in step S3013, the circular region is determined to be a selected region.
Specifically, in the above steps, the touch operation is a click operation. According to the click point of the user's click operation, the range within a preset radius around the click point is determined as the selected area. The preset radius may be set in advance, or may be associated with the actual picture size or the screen display size; this embodiment is not limited thereto. Since the contact area between a finger and the screen is relatively large when the user clicks with a finger, the center coordinate of the contact area can be used directly as the click point, from which the circular area is determined.
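As a rough illustration of steps S3011-S3013, the sketch below derives the click point from the centroid of the finger contact area and builds a membership test for the circular selected area. The centroid choice and the closure-based representation are implementation assumptions, not mandated by the patent:

```python
import math

def contact_center(contact_points):
    """Use the centroid of the finger contact area as the click point."""
    xs = [p[0] for p in contact_points]
    ys = [p[1] for p in contact_points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def circular_region(click_point, radius):
    """Return a membership test for the circular selected area:
    click point as center, preset distance as radius."""
    cx, cy = click_point
    def contains(point):
        px, py = point
        return math.hypot(px - cx, py - cy) <= radius
    return contains
```

The membership test can then be handed to the recognition step to mask pixels outside the selected area.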
In another embodiment, the step S301 may further include steps S3014-S3016.
In step S3014, receiving a slide gesture operation for the target picture;
in step S3015, determining a movement trajectory corresponding to the slide gesture operation;
in step S3016, a region inside the movement trajectory is determined to be a selected region.
Specifically, in the above steps, the touch operation is a sliding gesture operation, and the corresponding region is determined according to the movement track of the user's sliding gesture. When the movement track is a closed curve, the area inside the closed curve is taken directly as the selected area. When the movement track is not closed, the start and end positions of the sliding gesture are connected by a straight line, and the area enclosed by this segment together with the movement track is taken as the selected area.
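The trajectory-closing rule and the enclosed-area test described above can be sketched as follows. The ray-casting point-in-polygon test is one possible implementation choice, not specified by the patent:

```python
def close_trajectory(points):
    """If the slide trajectory is not closed, connect its end back to its
    start with a straight segment, as described for unclosed curves."""
    points = list(points)
    if points[0] != points[-1]:
        points.append(points[0])
    return points

def point_in_region(point, trajectory):
    """Ray-casting test: is `point` inside the area enclosed by the
    (possibly unclosed) slide trajectory?"""
    poly = close_trajectory(trajectory)
    x, y = point
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:]):
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

In practice the trajectory would be the sampled touch coordinates reported by the touch panel, simplified before testing each pixel.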
In step S302, after the selected region is determined, image recognition is performed on it. In this embodiment, the selected area is recognized by calling an image recognition component; the component may employ corresponding algorithms known in the art. The purpose of image recognition is to determine whether a preset target object, such as a person or a building, exists in the selected area. If recognition fails, this indicates that the selected area may be too small; in that case, the range of the selected area is expanded, and recognition is performed on the expanded area. The expansion may be performed proportionally by a preset distance, or the user may be prompted to reselect an area; this embodiment is not limited in this regard.
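The expand-and-retry flow above can be sketched as follows. The `recognize` callable stands in for the image recognition component, which the embodiment leaves to known algorithms; the step size, retry limit, and rectangular region representation are all illustrative assumptions.

```python
def recognize_with_expansion(picture, region, recognize,
                             expand_step=20, max_tries=3):
    """Call the recognition component on the selected region; on
    failure, enlarge the region by a preset distance and retry.

    `region` is (x, y, width, height); `recognize` returns the detected
    target object, or None when recognition fails.
    """
    x, y, w, h = region
    for _ in range(max_tries):
        result = recognize(picture, (x, y, w, h))
        if result is not None:
            return result
        # expand the selected area outward by the preset step on every side
        x, y = x - expand_step, y - expand_step
        w, h = w + 2 * expand_step, h + 2 * expand_step
    return None  # recognition still failed after expansion
```

A fake component that only succeeds once the region is wide enough shows the loop expanding exactly once before succeeding.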
In this embodiment, the target object may be a person or a building, and the following description is made in detail with reference to different target objects.
In one embodiment, the target object is a human figure, and step S303 may further include steps S3031-S3033.
In step S3031, feature information of the person is acquired, where the feature information includes: name and birthday information;
in step S3032, generating corresponding label information according to the feature information;
in step S3033, a label layer is displayed on the target image in a floating manner, and the label information is displayed on the label layer.
Specifically, when the selected area is identified as containing a target object and the target object is a person, the feature information of the person is acquired. In this embodiment, photos and contact information of a plurality of contacts may be preset in the terminal; when the target object is identified as a person, the terminal automatically matches it against this preset information and acquires the person's feature information, which includes name and birthday information. After the feature information is acquired, annotation information is generated from it for overlay display on the target picture. For the overlay display, a label layer may be floated over the target picture, and the label information is then displayed on that layer.
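The matching step above can be sketched as follows. This is a minimal illustration, assuming the recognition component has already resolved the person to a name; the contact record layout is an assumption made for illustration, not a structure defined by the embodiment.

```python
def lookup_person(recognized_name, preset_contacts):
    """Match a recognized person against contacts preset in the
    terminal and return their feature information (name and birthday),
    which is used to generate the label layer text."""
    for contact in preset_contacts:
        if contact["name"] == recognized_name:
            return {"name": contact["name"],
                    "birthday": contact["birthday"]}
    return None  # no preset contact matched the recognized person
```

If no preset contact matches, no feature information is returned and no label is generated.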
At this time, step S304 may further include steps S3041-S3043.
In step S3041, the operation information is identified, which includes: single-click operation and double-click operation;
in step S3042, if the label information is a name, the single-click operation correspondingly opens a social application program and enters the chat window corresponding to the name; the double-click operation correspondingly opens a telephone application program and dials the contact corresponding to the name;
in step S3043, if the label information is birthday information, the single-click operation correspondingly opens a schedule application program; the double-click operation correspondingly opens a reminder application program.
Specifically, different application programs can be opened according to different operation information. When the label information corresponding to the user's operation is a name, single-click and double-click correspond to the social application program and the telephone application program respectively, allowing the user to send a message or place a call directly. When the label information is birthday information, a single-click opens the schedule application program so the user can check the corresponding schedule, and a double-click opens the reminder application program so the user can set a corresponding reminder.
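The dispatch described above amounts to a lookup from (label type, operation) to an application action, and can be sketched as follows. The pairs come from the embodiment; the application identifiers on the right are placeholders, not real package names.

```python
# (label type, operation) -> application action; identifiers are illustrative
PERSON_LABEL_DISPATCH = {
    ("name", "single_click"): "social_app:chat_window",
    ("name", "double_click"): "phone_app:dial_contact",
    ("birthday", "single_click"): "schedule_app:view",
    ("birthday", "double_click"): "reminder_app:set",
}

def resolve_person_action(label_type, operation):
    """Return the application action for a tap on a person label,
    or None when the operation is not recognized."""
    return PERSON_LABEL_DISPATCH.get((label_type, operation))
```

A table-driven mapping keeps the embodiment's four cases in one place and makes adding further label types or gestures a one-line change.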
In another embodiment, the target object is a building, and step S303 may further include steps S3034-S3036.
In step S3034, characteristic information of the building is acquired, where the characteristic information includes: name and geographic location;
in step S3035, generating corresponding label information according to the feature information;
in step S3036, a label layer is displayed on the target image in a floating manner, and the label information is displayed on the label layer.
Specifically, when the selected area is identified as containing a building, the terminal searches the cloud server based on the picture of the building and returns the corresponding feature information according to the search result; the feature information may include the building's name and geographic location. By contrast, in the prior art, when a user chats with a friend in social software and wants to know the friend's location, the friend generally has to send their positioning manually.
Further, at this time, step S304 may further include steps S3044-S3046.
In step S3044, the operation information is identified, which includes: single-click operation and double-click operation;
in step S3045, if the label information is a name, the single-click operation correspondingly starts a search application; the double-click operation corresponds to starting a comment application program;
in step S3046, if the label information is a geographic location, the single-click operation correspondingly opens a map application program; the double-click operation correspondingly opens a transportation application program.
Specifically, when the user's operation corresponds to the name of the building, different operations open a search application program or a review application program, meeting different user needs. For example, while traveling, a user who sees a building and wants to learn about it can take a picture and look it up directly with a single click; if the building is a restaurant, the user can view the corresponding reviews with a double-click after taking the picture, improving the user experience.
When the user's operation corresponds to the geographic location of the building, different operations open a map application program or a transportation application program. The map application program displays the direction and distance of the building relative to the user's current position; the transportation application programs provide travel options for the user, including taxis, ride-hailing, trains, airplanes, and the like. For example, when a friend sends the user a picture of a building, the user obtains the corresponding geographic location through recognition; a single click then queries the building's position on a map, while a double click enters the transportation application program directly, which automatically selects the appropriate mode according to the distance, for example opening a ride-hailing application within the same city, or a train ticket or airline application for longer distances.
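The distance-based choice above can be sketched as a simple threshold function. The thresholds and application names are illustrative assumptions; the embodiment does not specify values.

```python
def choose_transport_app(distance_km, same_city_km=50, train_limit_km=1000):
    """Pick a transportation application by distance to the building:
    ride-hailing within the same city, train tickets for medium
    distances, and flights beyond that. All cutoffs are hypothetical."""
    if distance_km <= same_city_km:
        return "ride_hailing_app"
    if distance_km <= train_limit_km:
        return "train_ticket_app"
    return "airline_app"
```

So a building 15 km away opens the ride-hailing application, one 300 km away opens the train ticket application, and one 2000 km away opens the airline application.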
According to the picture processing method of this embodiment of the present invention, the selected area is determined according to the user's touch operation; whether the image in the selected area contains a preset object is identified, and the corresponding label information is displayed; and the corresponding application program is quickly opened according to the label information selected by the user. This allows the user to view information related to the picture intuitively and to open the corresponding application program conveniently, improving operation efficiency and user experience.
Fig. 4 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention. As shown in Fig. 4, the mobile terminal includes a processor 401 (the terminal may include one or more processors 401; Fig. 4 takes one processor as an example) and a memory 402. In this embodiment of the present invention, the processor 401 and the memory 402 may be connected by a bus or by other means; Fig. 4 takes a bus connection as an example. It can be understood that the mobile terminal in this embodiment may also be applied to the embodiments shown in Fig. 1 or Fig. 2.
The memory 402 stores therein an executable program, and the processor 401 executes the executable program to implement the following steps:
receiving touch operation aiming at a target picture, and determining a selected area according to the touch operation;
performing image recognition on the selected area;
if a preset target object exists in the selected area, displaying at least one piece of associated labeling information according to the target object;
and receiving operation information of the at least one piece of labeling information, and starting a corresponding application program according to the operation information.
Optionally, the processor 401 receives a touch operation for the target picture, and determines the selected area according to the touch operation, including:
receiving a click operation aiming at a target picture;
acquiring a click point of the click operation, and determining a circular area by taking the click point as a center and a preset distance as a radius;
determining the circular area as a selected area.
Optionally, the processor 401 receives a touch operation for the target picture, and determines the selected area according to the touch operation, including:
receiving a sliding gesture operation aiming at a target picture;
determining a moving track corresponding to the sliding gesture operation;
and determining the area inside the moving track as a selected area.
Optionally, the processor 401 performs image recognition on the selected area, including:
calling an image recognition component to recognize the selected area;
and if the identification fails, expanding the range of the selected area and identifying the expanded selected area.
Optionally, the preset target object is a person, and if the preset target object exists in the selected area, displaying at least one piece of associated labeling information according to the target object includes:
acquiring feature information of the person, wherein the feature information comprises: name and birthday information;
generating corresponding marking information according to the characteristic information;
and displaying a labeling layer in a suspending manner on the target picture, and displaying the labeling information on the labeling layer.
Optionally, the preset target object is a building, and if a preset target object exists in the selected area, displaying at least one piece of associated labeling information according to the target object includes:
acquiring characteristic information of the building, wherein the characteristic information comprises: name and geographic location;
generating corresponding marking information according to the characteristic information;
and displaying a labeling layer in a suspending manner on the target picture, and displaying the labeling information on the labeling layer.
Optionally, the receiving, by the processor 401, operation information of the at least one piece of label information, and starting a corresponding application program according to the operation information includes:
identifying the operation information, the operation information including: single-click operation and double-click operation;
if the label information is a name, the single-click operation correspondingly opens a social application program and enters the chat window corresponding to the name; the double-click operation correspondingly opens a telephone application program and dials the contact corresponding to the name;
if the label information is birthday information, the single-click operation correspondingly opens a schedule application program; the double-click operation correspondingly opens a reminder application program.
Optionally, the receiving, by the processor 401, operation information of the at least one piece of label information, and starting a corresponding application program according to the operation information includes:
identifying the operation information, the operation information including: single-click operation and double-click operation;
if the marked information is a name, the single-click operation correspondingly starts a search application program; the double-click operation corresponds to starting a comment application program;
if the label information is a geographic location, the single-click operation correspondingly opens a map application program; the double-click operation correspondingly opens a transportation application program.
The mobile terminal in this embodiment of the present invention determines the selected area according to the user's touch operation, identifies whether the image in the selected area contains a preset object and displays the corresponding label information, and quickly opens the corresponding application program according to the label information selected by the user. This allows the user to view picture-related information intuitively on the picture and to open the corresponding application program conveniently, improving operation efficiency and user experience.
An embodiment of the present invention further provides a computer storage medium, where an executable program is stored in the computer storage medium, and when the executable program is executed, the following steps are implemented:
receiving touch operation aiming at a target picture, and determining a selected area according to the touch operation;
performing image recognition on the selected area;
if a preset target object exists in the selected area, displaying at least one piece of associated labeling information according to the target object;
and receiving operation information of the at least one piece of labeling information, and starting a corresponding application program according to the operation information.
Optionally, the executable program may also be used for executing to implement the picture processing method shown in fig. 3, which is not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. An image processing method, comprising:
receiving touch operation aiming at a target picture, and determining a selected area according to the touch operation;
performing image recognition on the selected area;
if a preset target object exists in the selected area, displaying at least one piece of associated labeling information according to the target object;
and receiving operation information of the at least one piece of labeling information, and starting a corresponding application program according to the operation information.
2. The picture processing method according to claim 1, wherein the receiving a touch operation for the target picture, and determining the selected area according to the touch operation comprises:
receiving a click operation aiming at a target picture;
acquiring a click point of the click operation, and determining a circular area by taking the click point as a center and a preset distance as a radius;
determining the circular area as a selected area.
3. The picture processing method according to claim 1, wherein the receiving a touch operation for the target picture, and determining the selected area according to the touch operation comprises:
receiving a sliding gesture operation aiming at a target picture;
determining a moving track corresponding to the sliding gesture operation;
and determining the area inside the moving track as a selected area.
4. The picture processing method according to claim 2 or 3, wherein the performing image recognition on the selected area comprises:
calling an image recognition component to recognize the selected area;
and if the identification fails, expanding the range of the selected area and identifying the expanded selected area.
5. The method of claim 4, wherein the preset target object is a person, and the displaying at least one associated label information according to the target object if the preset target object exists in the selected area comprises:
acquiring feature information of the person, wherein the feature information comprises: name and birthday information;
generating corresponding marking information according to the characteristic information;
and displaying a labeling layer in a suspending manner on the target picture, and displaying the labeling information on the labeling layer.
6. The method of claim 4, wherein the preset target object is a building, and the displaying at least one associated label information according to the target object if the preset target object exists in the selected area comprises:
acquiring characteristic information of the building, wherein the characteristic information comprises: name and geographic location;
generating corresponding marking information according to the characteristic information;
and displaying a labeling layer in a suspending manner on the target picture, and displaying the labeling information on the labeling layer.
7. The method of claim 5, wherein the receiving the operation information of the at least one label information and starting a corresponding application according to the operation information comprises:
identifying the operation information, the operation information including: single-click operation and double-click operation;
if the label information is a name, the single-click operation correspondingly opens a social application program and enters the chat window corresponding to the name; the double-click operation correspondingly opens a telephone application program and dials the contact corresponding to the name;
if the label information is birthday information, the single-click operation correspondingly opens a schedule application program; the double-click operation correspondingly opens a reminder application program.
8. The method of claim 6, wherein the receiving the operation information of the at least one label information and starting a corresponding application according to the operation information comprises:
identifying the operation information, the operation information including: single-click operation and double-click operation;
if the marked information is a name, the single-click operation correspondingly starts a search application program; the double-click operation corresponds to starting a comment application program;
if the label information is a geographic location, the single-click operation correspondingly opens a map application program; the double-click operation correspondingly opens a transportation application program.
9. A mobile terminal, characterized in that the mobile terminal comprises a processor and a memory;
the memory is used for storing an executable program;
the processor is configured to execute the executable program to implement the picture processing method according to any one of claims 1 to 8.
10. A computer storage medium having stored thereon an executable program that when executed performs the picture processing method of any one of claims 1-8.
CN202010972102.5A 2020-09-16 2020-09-16 Picture processing method, mobile terminal and computer storage medium Pending CN112083872A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010972102.5A CN112083872A (en) 2020-09-16 2020-09-16 Picture processing method, mobile terminal and computer storage medium


Publications (1)

Publication Number Publication Date
CN112083872A true CN112083872A (en) 2020-12-15

Family

ID=73737273

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010972102.5A Pending CN112083872A (en) 2020-09-16 2020-09-16 Picture processing method, mobile terminal and computer storage medium

Country Status (1)

Country Link
CN (1) CN112083872A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113110782A (en) * 2021-03-22 2021-07-13 百度在线网络技术(北京)有限公司 Image recognition method and device, computer equipment and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004118601A (en) * 2002-09-27 2004-04-15 Victor Co Of Japan Ltd Display unit for image contents selection screen
US20080082426A1 (en) * 2005-05-09 2008-04-03 Gokturk Salih B System and method for enabling image recognition and searching of remote content on display
CN102368269A (en) * 2011-10-25 2012-03-07 华为终端有限公司 Association relationship establishment method and device
CN103369108A (en) * 2012-03-29 2013-10-23 宇龙计算机通信科技(深圳)有限公司 Mobile terminal operating method and mobile terminal
CN104731840A (en) * 2013-12-23 2015-06-24 岳造宇 Electronic device and object information query method of dynamic image displayed by same
CN104731512A (en) * 2015-03-31 2015-06-24 努比亚技术有限公司 Method, device and terminal for sharing pictures
CN106020642A (en) * 2016-05-11 2016-10-12 深圳市金立通信设备有限公司 Method for starting association application and terminal
CN106126023A (en) * 2016-06-20 2016-11-16 珠海市魅族科技有限公司 Quick sharing method, quick sharing apparatus and terminal
CN107679156A (en) * 2017-09-27 2018-02-09 努比亚技术有限公司 A kind of video image identification method and terminal, readable storage medium storing program for executing


Similar Documents

Publication Publication Date Title
CN108037893B (en) Display control method and device of flexible screen and computer readable storage medium
CN107807767B (en) Communication service processing method, terminal and computer readable storage medium
CN108241752B (en) Photo display method, mobile terminal and computer readable storage medium
CN108196922B (en) Method for opening application, terminal and computer readable storage medium
CN110180181B (en) Method and device for capturing wonderful moment video and computer readable storage medium
CN109697008B (en) Content sharing method, terminal and computer readable storage medium
CN109584897B (en) Video noise reduction method, mobile terminal and computer readable storage medium
CN107422956B (en) Mobile terminal operation response method, mobile terminal and readable storage medium
CN114371803A (en) Operation method, intelligent terminal and storage medium
CN109683797B (en) Display area control method and device and computer readable storage medium
CN110020386B (en) Application page sharing method, mobile terminal and computer readable storage medium
CN108322611B (en) Screen locking information pushing method and device and computer readable storage medium
CN108762709B (en) Terminal control method, terminal and computer readable storage medium
CN108037901B (en) Display content switching control method, terminal and computer readable storage medium
CN112423211A (en) Multi-audio transmission control method, equipment and computer readable storage medium
CN108566476B (en) Information processing method, terminal and computer readable storage medium
CN109067976B (en) Desktop automatic switching method, mobile terminal and computer readable storage medium
CN112083872A (en) Picture processing method, mobile terminal and computer storage medium
CN107562304B (en) Control method, mobile terminal and computer readable storage medium
CN115277922A (en) Processing method, intelligent terminal and storage medium
CN110275667B (en) Content display method, mobile terminal, and computer-readable storage medium
CN113835586A (en) Icon processing method, intelligent terminal and storage medium
CN110087013B (en) Video chat method, mobile terminal and computer readable storage medium
CN109683793B (en) Content turning display method and device and computer readable storage medium
CN109495683B (en) Interval shooting method and device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201215