CN114371803B - Operation method, intelligent terminal and storage medium - Google Patents

Operation method, intelligent terminal and storage medium

Info

Publication number
CN114371803B
CN114371803B (application CN202210285014.7A)
Authority
CN
China
Prior art keywords
outputting
application
service
associated object
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210285014.7A
Other languages
Chinese (zh)
Other versions
CN114371803A (en)
Inventor
沈剑锋
汪智勇
李晨雄
黑啸吉
陈蓉
刘丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Transsion Holdings Co Ltd
Original Assignee
Shenzhen Transsion Holdings Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Transsion Holdings Co Ltd
Priority to CN202210285014.7A
Publication of CN114371803A
Application granted
Publication of CN114371803B
Priority to PCT/CN2023/078276 (WO2023169236A1)
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Abstract

The present application relates to an operation method, an intelligent terminal, and a storage medium, wherein the method comprises the following steps: in response to a first operation on at least one first object, outputting at least one associated object; and in response to a second operation on the first object and/or the associated object, performing preset processing through at least one second application or service. With this technical solution, an associated object is automatically recommended for data processing according to the operation on the first object, which is more convenient, faster, and more intelligent.

Description

Operation method, intelligent terminal and storage medium
Technical Field
The application relates to the technical field of user interaction, in particular to an operation method, an intelligent terminal and a storage medium.
Background
With the rapid development of terminal technology, the functions of mobile terminals such as mobile phones and tablet computers are also improved, and the mobile terminals become one of the common tools in daily life and work.
In the course of conceiving and implementing the present application, the inventors found at least the following problem: in some implementations, when a user needs to share data, sharing is limited to the objects displayed in the current interface, and associated objects cannot be recommended automatically, which is neither convenient, fast, nor intelligent.
The foregoing description is provided for general background information and is not admitted to be prior art.
Disclosure of Invention
In order to solve the above technical problem, the present application provides an operation method, an intelligent terminal, and a storage medium that automatically recommend an associated object for data processing according to an operation on a first object, making the process more convenient, faster, and more intelligent.
In order to solve the technical problem, the application provides an operation method, which can be applied to an intelligent terminal and comprises the following steps:
s1: responding to a first operation on at least one first object, and outputting at least one associated object;
s2: and responding to a second operation on the first object and/or the associated object, and performing preset processing through at least one second application or service.
Optionally, the step S1 includes:
s11: responding to a first operation on at least one first object, and acquiring information and/or use data of the first object;
s12: and outputting at least one associated object according to the information and/or the use data of the first object.
Optionally, the information of the first object includes at least one of: function, source, operational scenario.
Optionally, the source of the first object may be at least one of an interface, an application, a contact, local or network data, and an associated device.
Optionally, the associated device may be a device directly or indirectly connected to the smart terminal applying the operation method (for example, a smart wearable device, a car networking device, a smart home device, or a network device such as a server), or a device having some association with the smart terminal (for example, belonging to the same user or logged in with the same account).
Optionally, the usage data comprises at least one of: frequency of using the associated object, context of using the associated object, function of using the associated object, source of using the associated object.
Optionally, the source of the used associated object includes at least one of an application, a device, a contact, local or network data, and an associated device.
Optionally, the step S12 includes at least one of:
outputting at least one associated object according to the information of the first object;
outputting at least one associated object according to the use data;
if the information of the first object is matched with the use data, outputting at least one associated object according to the use data;
and if the information of the first object is not matched with the use data, outputting at least one associated object according to the information of the first object.
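The branching above can be sketched as follows (a minimal illustration; the matching predicate and the data shapes are assumptions, not defined by the patent): usage data takes precedence when it matches the first object's information, with the object's information as fallback.

```python
# Illustrative sketch of step S12's branching; structure and names are assumptions.

def recommend(first_object_info, usage_data, by_info, by_usage, matches):
    """Output associated objects from usage data when it matches the first
    object's information; otherwise fall back to the object's information."""
    if usage_data and matches(first_object_info, usage_data):
        return by_usage(usage_data)
    return by_info(first_object_info)

# Usage with toy matching on a 'function' field.
matches = lambda info, usage: info.get("function") == usage.get("function")
by_usage = lambda usage: usage["objects"]
by_info = lambda info: [f"same-function:{info['function']}"]

hit = recommend({"function": "share"},
                {"function": "share", "objects": ["contact_A"]},
                by_info, by_usage, matches)
miss = recommend({"function": "edit"},
                 {"function": "share", "objects": ["contact_A"]},
                 by_info, by_usage, matches)
```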
Optionally, the outputting at least one associated object according to the information of the first object includes at least one of:
outputting at least one associated object whose function and the function of the first object support each other and/or are at least partially identical;
outputting at least one associated object of which the type of the source is the same as that of the first object;
and outputting at least one associated object of which the associated time and/or place is matched with the operation scene of the first object.
Optionally, the outputting at least one associated object according to the usage data includes at least one of:
if the frequency of using the associated object is greater than or equal to a preset frequency threshold value, outputting at least one associated object;
outputting at least one associated object whose function is at least partially the same as the function of the used associated object in the usage data;
outputting at least one associated object whose source is of the same type as the source of the used associated object in the usage data;
and outputting at least one associated object whose associated time and/or place matches the scene of the used associated object in the usage data.
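The frequency rule above can be illustrated with a short sketch (the threshold value and the record format are assumptions for illustration): only objects used at least as often as a preset frequency threshold are output.

```python
from collections import Counter

FREQUENCY_THRESHOLD = 3  # hypothetical preset frequency threshold

def frequent_associated_objects(usage_records, threshold=FREQUENCY_THRESHOLD):
    """Output associated objects whose use frequency meets the preset threshold."""
    counts = Counter(r["object"] for r in usage_records)
    return [obj for obj, n in counts.items() if n >= threshold]

# Usage: contact_A was used three times, contact_B once.
records = [{"object": "contact_A"}] * 3 + [{"object": "contact_B"}]
```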
Optionally, when at least one associated object is output, the associated object is displayed at the original position of the source interface or is displayed in a floating manner above the original position.
Optionally, the step S1 includes:
responding to a first operation on at least one first object, and outputting at least one first preset interface;
and displaying the first object and/or the associated object in the first preset interface.
Optionally, the method further comprises:
updating the first object and/or the associated object in response to editing the first object and/or the associated object in the first preset interface.
Optionally, the editing comprises at least one of:
content interception, content splitting, content combining, object combining, deletion of at least part of content of an object, and copying of at least part of content of an object.
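Minimal sketches of the simpler editing operations above, treating an object's content as a string (the function names and the string representation are illustrative assumptions):

```python
# Hypothetical sketches of editing operations; names are illustrative.

def intercept(content, start, end):
    """Content interception: keep only the selected range of the content."""
    return content[start:end]

def split(content, sep):
    """Content splitting: break one object's content into several parts."""
    return content.split(sep)

def combine(parts, sep=" "):
    """Content/object combining: merge several contents into one object."""
    return sep.join(parts)
```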
Optionally, the performing, by the second application or service, preset processing includes:
s21: outputting a second preset interface, wherein optionally, the second preset interface comprises an identifier of at least one second application or service;
s22: and in response to that the third operation on the identifier of the second application or service meets a preset condition, performing preset processing through the second application or service.
Optionally, the first operation, the second operation, or the third operation includes at least one of:
long press, hard press, move, long press and move, hard press and move, long press or hard press for a preset duration, a movement track, and biometric information.
Optionally, the meeting of the preset condition includes at least one of:
the identifier corresponding to the first object and/or the associated object is in contact with or intersects with the identifier of the second application or service;
dragging the identifier corresponding to the first object and/or the associated object to the identifier of the second application or service;
dragging the identifier of the second application or service to the identifier corresponding to the first object and/or the associated object;
the identifier corresponding to the first object and/or the associated object and the identifier of the second application or service are dragged to at least one preset area;
selecting operation of a second application or service on the second preset interface;
and verifying a corresponding movement track and/or biometric information.
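The "in contact with or intersects" condition can be sketched with axis-aligned bounding boxes (the rectangle representation is an assumption; real UI frameworks expose their own hit-testing):

```python
# Sketch of the "identifiers touch or intersect" condition; representation assumed.

def rects_touch_or_intersect(a, b):
    """a and b are (left, top, right, bottom) screen rectangles; returns True
    when the dragged object identifier touches or overlaps the app identifier."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return not (ax2 < bx1 or bx2 < ax1 or ay2 < by1 or by2 < ay1)

# Usage: a dragged thumbnail overlapping an app icon satisfies the condition.
dragged = (100, 100, 160, 160)
app_icon = (150, 150, 210, 210)
```

Edges that merely touch (shared boundary) count as "in contact" under this check, matching the condition's wording.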
Optionally, the performing of the preset processing includes at least one of:
outputting the content and/or attributes of the first object and/or the associated object;
opening, by the second application or service, content and/or attributes of the first object and/or the associated object;
applying, copying, transferring or publishing the first object and/or the associated object to the second application or service.
The application also provides a second operation method, which can be applied to the intelligent terminal and comprises the following steps:
s10: responding to a first operation on at least one first object and a second operation on at least one second object, and outputting at least one associated object;
s20: and responding to a third operation of at least one of the first object, the second object and the associated object, and performing preset processing through at least one second application or service.
Optionally, the step S10 includes:
s101: responding to a first operation on at least one first object and a second operation on at least one second object, and acquiring information and/or use data of the first object and the second object;
s102: and outputting at least one associated object according to the information of the first object, the information of the second object and/or the use data.
Optionally, the information of the first object and/or the second object includes at least one of the following: function, source, operational scenario.
Optionally, the source of the first object and/or the second object may be at least one of an interface, an application, a contact, local or network data, and an associated device.
Optionally, in the second operation method, the associated device may be a device directly or indirectly connected to the smart terminal applying the second operation method (for example, a smart wearable device, a car networking device, a smart home device, or a network device such as a server), or a device having some association with the smart terminal (for example, belonging to the same user or logged in with the same account).
Optionally, in the second method of operation, the usage data includes at least one of: frequency of using the associated object, scene of using the associated object, function of using the associated object, source of using the associated object.
Optionally, in the second operation method, the source of the used associated object includes at least one of an application, a device, a contact, local or network data, and an associated device.
Optionally, the step S102 includes at least one of:
outputting at least one associated object according to the information of the first object and the second object;
outputting at least one associated object according to the use data;
if the information of the first object is matched with the use data, outputting at least one associated object of the first object according to the use data;
if the information of the second object is matched with the usage data, outputting at least one associated object of the second object according to the usage data;
if the information of the first object is not matched with the use data, outputting at least one associated object according to the information of the first object;
and if the information of the second object is not matched with the use data, outputting at least one associated object according to the information of the second object.
Optionally, the outputting of the at least one associated object according to the information of the first object and/or the second object includes at least one of:
outputting at least one associated object whose function and the function of the first object and/or the second object support each other and/or are at least partially identical;
outputting at least one associated object of which the type of the source is the same as the source of the first object and/or the second object;
and outputting at least one associated object of which the associated time and/or place is matched with the operation scene of the first object and/or the second object.
Optionally, in the second operation method, the outputting at least one associated object according to the usage data includes at least one of:
if the frequency of using the associated object is greater than or equal to a preset frequency threshold value, outputting at least one associated object;
outputting at least one associated object whose function is at least partially the same as the function of the used associated object in the usage data;
outputting at least one associated object whose source is of the same type as the source of the used associated object in the usage data;
and outputting at least one associated object whose associated time and/or place matches the scene of the used associated object in the usage data.
Optionally, the method further comprises:
displaying at least one of the first object, the second object and the associated object in a first preset interface; and/or,
updating the first object, the second object and/or the associated object in response to editing the first object, the second object and/or the associated object in the first preset interface.
Optionally, the method further comprises:
entering a multi-object sharing mode in response to a fourth operation, wherein optionally the multi-object sharing mode allows at least two objects to be operated on.
Optionally, the step S10 includes:
responding to the first operation on the at least one first object, and prompting to perform a second operation on the at least one second object;
and outputting the at least one associated object in response to a second operation on the at least one second object.
The application also provides a third operation method, which can be applied to the intelligent terminal and comprises the following steps:
s100: responding to at least one first object meeting a first preset condition, and outputting at least one associated object;
s200: responding to a second operation on the first object and/or the associated object, and outputting at least one first preset interface, wherein optionally, the first preset interface comprises an identifier of at least one second application or service;
s300: and executing preset processing in response to a third operation on the identification of the second application or service meeting a second preset condition.
Optionally, the first preset condition is met, and the first preset condition includes at least one of the following:
receiving a preset operation;
matching the operation time and/or the operation place of the first object with a preset scene;
the first object has a preset function;
the first object is from a preset application.
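A sketch of checking some of these conditions (the scene representation, field names, and example preset functions are assumptions for illustration; "receiving a preset operation" would be handled by the event system and is omitted here):

```python
# Hypothetical check of the "first preset condition"; names are illustrative.

def meets_first_preset_condition(obj, now_hour=None, place=None,
                                 preset_scene=None, preset_apps=()):
    """True if any listed condition holds: the operation time/place matches a
    preset scene, the object has a preset function, or the object comes from
    a preset application."""
    if preset_scene and now_hour is not None:
        start, end = preset_scene.get("hours", (0, 24))
        if start <= now_hour < end and place == preset_scene.get("place"):
            return True
    if obj.get("function") in ("share", "send"):  # hypothetical preset functions
        return True
    return obj.get("app") in preset_apps

# Usage: an evening-at-home scene matches an operation at 20:00 at home.
obj = {"function": "view", "app": "gallery"}
scene = {"hours": (18, 23), "place": "home"}
```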
Optionally, the step S100 includes:
s1001: responding to at least one first object meeting a first preset condition, and acquiring information and/or use data of the first object;
s1002: and outputting at least one associated object according to the information and/or the use data of the first object.
Optionally, in the third operation method, the information of the first object includes at least one of: function, source, operational scenario.
Optionally, in the third operation method, the source of the first object may be at least one of an interface, an application, a contact, local or network data, and an associated device.
Optionally, in the third operation method, the associated device may be a device directly or indirectly connected to the smart terminal applying the operation method (for example, a smart wearable device, a car networking device, a smart home device, or a network device such as a server), or a device having some association with the smart terminal (for example, belonging to the same user or logged in with the same account).
Optionally, in the third method of operation, the usage data includes at least one of: frequency of using the associated object, scene of using the associated object, function of using the associated object, source of using the associated object.
Optionally, in the third operation method, the source of the used associated object includes at least one of an application, a device, a contact, local or network data, and an associated device.
Optionally, the step S1002 includes at least one of:
outputting at least one associated object according to the information of the first object;
outputting at least one associated object according to the use data;
if the information of the first object is matched with the use data, outputting at least one associated object according to the use data;
and if the information of the first object is not matched with the use data, outputting at least one associated object according to the information of the first object.
Optionally, in the third operation method, the outputting at least one associated object according to the information of the first object includes at least one of:
outputting at least one associated object whose function and the function of the first object support each other and/or are at least partially identical;
outputting at least one associated object of which the type of source is the same as the source of the first object;
and outputting at least one associated object of which the associated time and/or place is matched with the operation scene of the first object.
Optionally, in the third operation method, the outputting at least one associated object according to the usage data includes at least one of:
if the frequency of using the associated object is greater than or equal to a preset frequency threshold value, outputting at least one associated object;
outputting at least one associated object whose function is at least partially the same as the function of the used associated object in the usage data;
outputting at least one associated object whose source is of the same type as the source of the used associated object in the usage data;
and outputting at least one associated object whose associated time and/or place matches the scene of the used associated object in the usage data.
Optionally, in the third operation method, the method further includes:
displaying the first object and/or the associated object in a first preset interface; and/or,
updating the first object and/or the associated object in response to editing the first object and/or the associated object in the first preset interface.
Optionally, the method further comprises: a second application or service is determined.
Optionally, the determining the second application or service includes at least one of:
determining at least one second application or service according to the sharing record of the first object;
determining at least one second application or service according to the current background application;
determining at least one second application or service according to the usage data;
determining at least one second application or service according to the content keywords of the first object;
determining at least one second application or service according to the content type of the first object;
and determining at least one second application or service according to the operation of adding the application or service.
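One way to combine several of the signals above is a simple additive score over candidate applications, sketched below (the weights, signal formats, and app names are assumptions; the patent does not prescribe a scoring scheme):

```python
# Illustrative ranking of candidate second applications/services; weights assumed.

def rank_second_apps(candidates, sharing_record, background_apps, keywords):
    """Return candidate apps ordered by a simple additive score built from the
    sharing record, current background apps, and content keywords."""
    def score(app):
        s = 0
        s += 2 * sharing_record.count(app)         # past shares of the first object
        s += 1 if app in background_apps else 0    # currently running in background
        s += sum(1 for k in keywords if k in app)  # content-keyword match in name
        return s
    return sorted(candidates, key=score, reverse=True)

ranked = rank_second_apps(
    candidates=["chat_app", "mail_app", "notes_app"],
    sharing_record=["chat_app", "chat_app"],
    background_apps=["mail_app"],
    keywords=["mail"],
)
```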
Optionally, in the third operation method, the performing preset processing includes at least one of:
outputting the content and/or attributes of the first object and/or the associated object;
opening, by the second application or service, content and/or attributes of the first object and/or the associated object;
applying, copying, transferring or publishing the first object and/or the associated object to the second application or service.
The present application further provides an intelligent terminal, comprising: a memory and a processor, wherein the memory stores an operating program which, when executed by the processor, implements the steps of any one of the operating methods above.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of any one of the operating methods above.
As described above, the present application provides an operating method, an intelligent terminal, and a storage medium, where the operating method includes the steps of: in response to a first operation on at least one first object, outputting at least one associated object; and in response to a second operation on the first object and/or the associated object, performing preset processing through at least one second application or service. With this technical solution, an associated object can be automatically recommended for data processing according to the operation on the first object, which is more convenient, faster, and more intelligent.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and, together with the description, serve to explain the principles of the application. In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below; those skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a schematic diagram of a hardware structure of an intelligent terminal for implementing various embodiments of the present application.
Fig. 2 is a diagram illustrating a communication network system architecture according to an embodiment of the present application.
Fig. 3 is a flow chart diagram illustrating a method of operation according to the first embodiment.
Fig. 4 is an interface diagram illustrating an operation method according to the first embodiment.
Fig. 5 is another interface diagram illustrating the operation method according to the first embodiment.
Fig. 6 is a flow chart diagram illustrating a method of operation according to a second embodiment.
Fig. 7 is an interface diagram illustrating an operation method according to the second embodiment.
Fig. 8 is a flow chart diagram illustrating a method of operation according to the third embodiment.
Fig. 9 is an interface diagram showing an operation method according to the third embodiment.
The implementation, functional features and advantages of the object of the present application will be further explained with reference to the embodiments, and with reference to the accompanying drawings. Specific embodiments of the present application have been shown by way of example in the drawings and will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the recitation of an element by the phrase "comprising a …" does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element. Further, similarly-named elements, features, or components in different embodiments of the disclosure may have the same meaning or different meanings; the particular meaning is determined by its interpretation in the embodiment or by further context of the embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if" as used herein may be interpreted as "upon" or "when" or "in response to a determination", depending on the context. Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. The terms "or," "and/or," and "including at least one of the following," as used herein, are to be construed as inclusive, meaning any one or any combination. For example, "includes at least one of A, B, and C" means any of the following: A; B; C; A and B; A and C; B and C; A and B and C. Likewise, "A, B or C" or "A, B and/or C" means any of those same combinations. An exception to this definition occurs only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in sequence as indicated by the arrows, they are not necessarily performed in that sequence. Unless otherwise indicated herein, the steps are not bound to a strict order and may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily performed at the same time; they may be performed at different times and in different orders, alternately or in turn with other steps or with sub-steps or stages of other steps.
The word "if", as used herein, may be interpreted as "upon" or "when" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if (a stated condition or event) is detected" may be interpreted as "when determined" or "in response to a determination" or "when (a stated condition or event) is detected" or "in response to detection of (a stated condition or event)", depending on the context.
It should be noted that step numbers such as S1 and S2 are used herein for the purpose of more clearly and briefly describing the corresponding contents, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform S2 first and then perform S1 in the specific implementation, which should be within the scope of the present application.
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for the convenience of description of the present application, and have no specific meaning in themselves. Thus, "module", "component" and "unit" may be used interchangeably.
The smart terminal may be implemented in various forms. For example, the smart terminal described in the present application may include smart terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
While the following description takes a smart terminal as an example, those skilled in the art will appreciate that, except for elements used specifically for mobile purposes, the configuration according to the embodiments of the present application can also be applied to fixed-type terminals.
Referring to fig. 1, which is a schematic diagram of a hardware structure of an intelligent terminal for implementing various embodiments of the present application, the intelligent terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the intelligent terminal architecture shown in fig. 1 does not constitute a limitation of the intelligent terminal, and that the intelligent terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
The following specifically describes each component of the intelligent terminal with reference to fig. 1:
The radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call; specifically, it receives downlink information from a base station and forwards it to the processor 110 for processing, and transmits uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplex-Long Term Evolution), TDD-LTE (Time Division Duplex-Long Term Evolution), 5G, and so on.
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the smart terminal can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is understood that it is not an essential component of the smart terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the smart terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the smart terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sound into audio data. In the case of a phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 for output. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The smart terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Optionally, the light sensor includes an ambient light sensor and a proximity sensor, the ambient light sensor may adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1061 and/or the backlight when the smart terminal 100 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the intelligent terminal. Alternatively, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Optionally, the touch detection device detects a touch orientation of a user, detects a signal caused by a touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Optionally, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited thereto.
Alternatively, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the smart terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the smart terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the smart terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the smart terminal 100 or may be used to transmit data between the smart terminal 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area; optionally, the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, and the like), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phonebook, etc.), and the like. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 110 is a control center of the intelligent terminal, connects various parts of the entire intelligent terminal using various interfaces and lines, and performs various functions of the intelligent terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the intelligent terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, optionally, the application processor mainly handles operating systems, user interfaces, application programs, etc., and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The intelligent terminal 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
Although not shown in fig. 1, the smart terminal 100 may further include a bluetooth module or the like, which is not described herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the intelligent terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication Network system according to an embodiment of the present disclosure, where the communication Network system is an LTE system of a universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an IP service 204 of an operator, which are in communication connection in sequence.
Optionally, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Alternatively, the eNodeB2021 may be connected with other enodebs 2022 through a backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. Optionally, the MME2031 is a control node that handles signaling between the UE201 and the EPC203, providing bearer and connection management. The HSS2032 is used to provide registers, such as a home location register (not shown), to manage functions, and holds user-specific information about service characteristics, data rates, and the like. All user data may be sent through the SGW2034; the PGW2035 may provide IP address assignment and other functions for the UE201; and the PCRF2036 is the policy and charging control decision point for service data flows and IP bearer resources, selecting and providing available policy and charging control decisions for a policy and charging enforcement function (not shown).
IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems (e.g. 5G), and the like.
Based on the intelligent terminal hardware structure and the communication network system, the embodiments of the application are provided.
First embodiment
Fig. 3 is a flow chart diagram illustrating a method of operation according to the first embodiment. As shown in fig. 3, an operating method of the present application includes the steps of:
s1: responding to a first operation on at least one first object, and outputting at least one associated object;
s2: and responding to a second operation on the first object and/or the associated object, and performing preset processing through at least one second application or service.
By the method, when the user needs to share data, the related object can be automatically recommended to perform data processing according to the operation on the first object, and the method is more convenient, quicker and more intelligent.
Optionally, the first object comprises at least one of: pictures, text, video, music, hardware functions, software functions, screenshot interfaces of the current application or service, system screenshot interfaces, system modes, configuration information. Optionally, the pictures, texts, videos and music can be files themselves or links for opening corresponding files; the hardware function refers to a function of hardware that can be used by an application or service, and includes but is not limited to at least one of functions of hardware configured by an intelligent terminal such as a camera, a microphone, a flash lamp, a gyroscope and the like; the software function refers to a function of software which can be used by the application or the service, and includes but is not limited to at least one of a control function of the intelligent terminal on the intelligent household equipment and an operation interface of the application or the service; the screenshot interface of the current application or service refers to an interface with a screenshot function of the current application or service; the system screenshot interface refers to an interface of a screenshot function at a system level; the system mode includes but is not limited to at least one of a screen projection mode, a do-not-disturb mode and a landscape mode; the configuration information includes, but is not limited to, at least one of an information alert tone, font size, background configuration, notification configuration, preference data.
Optionally, the first object and/or the associated object is from at least one application or service different from the second application or service. Optionally, the first object is from a first application or service and the associated object is from the first application or service and/or at least one application or service different from the second application or service and the first application or service. Thus, data processing across applications between applications or services can be achieved.
In one scenario, a user selects to send a file during working hours, and other files which are the same as the type of the file to be sent and are all in the company at the receiving place can be recommended, so that the user can select whether to send the files together, and the efficiency is improved. In another scenario, when the user selects to send the pictures in the photo album, the habit of sending the pictures together with the head portrait or the signature of the user can be recommended, so that the user can send the pictures together, the operation is reduced, and the efficiency is improved. In another scenario, if the user selects the zoom-out function in the picture editing application, the zoom-in function and the cropping function in the picture editing application may be determined to be associated objects. Therefore, by recommending the associated object, the user can not be limited to only sharing the object selected by the current operation, the efficiency is improved, and the method is more flexible and intelligent.
Optionally, the step of S1, including:
s11: responding to a first operation on at least one first object, and acquiring information and/or use data of the first object;
s12: and outputting at least one associated object according to the information and/or the use data of the first object.
Optionally, the information of the first object includes at least one of: function, source, operational scenario. Alternatively, the functions include software functions such as a portable picture, a screen shot, a mute function, and the like, and hardware functions such as a photographing function, a flash function, and the like. Optionally, the source of the first object includes at least one of an interface, an application, a contact, local or network data, an associated device. Optionally, the operation scenario includes an operation location and/or time.
Optionally, the associated device may be a device (for example, a smart wearable device, a car networking device, a smart home device, or a network device such as a server) directly or indirectly connected to the smart terminal applying the operation method, or may be a device having some association with the smart terminal applying the operation method (for example, the user is the same user, the login account is the same user, and the like). In one scenario, the associated device is a smart wearable device bound to a smart terminal using the operation method, and the first object may be a software function, a hardware function, or configuration information on the smart wearable device.
Optionally, the usage data comprises at least one of: frequency of using the associated object, scene of using the associated object, function of using the associated object, source of using the associated object. Using the associated object means processing the associated object through the second application or service; for example, sending the associated object through the second application or service counts as using the associated object. Optionally, the frequency of using the associated object is the number of times the associated object is used in a preset period, for example, a day or a week. Optionally, the scene of using the associated object includes the place and/or time of using the associated object. Optionally, the function of using the associated object includes a software function and a hardware function. Optionally, the source of the used associated object includes at least one of an application, a device, a contact, local or network data, and an associated device. In one scenario, the first object may be a software function, a hardware function, or configuration information on a smart wearable device, and the associated object may be a software function, a hardware function, or configuration information of another smart wearable device bound to the smart terminal applying the operation method.
And outputting at least one associated object according to the information and/or the use data of the first object, so that the output associated object can be more accurate and more in line with the requirements of users.
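The information and usage-data fields described above can be modeled as simple records. The following sketch is illustrative only; the field names and example values are assumptions, not part of the patent:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectInfo:
    """Information of a first object: function, source, and operation scenario."""
    function: str    # e.g. "zoom-out" (software) or "flash" (hardware)
    source: str      # e.g. an interface, application, contact, or associated device
    place: str = ""  # operation scenario: place
    time: str = ""   # operation scenario: time

@dataclass
class UsageData:
    """Usage data about previously used associated objects."""
    frequency: int = 0                             # uses per preset period (day, week, ...)
    functions: list = field(default_factory=list)  # functions of used associated objects
    sources: list = field(default_factory=list)    # sources of used associated objects
    scenes: list = field(default_factory=list)     # places/times of use

info = ObjectInfo(function="zoom-out", source="picture-editing app", place="office")
usage = UsageData(frequency=5, functions=["zoom-in", "crop"])
```

Step S12 can then consume these two records to decide which associated objects to output.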
Optionally, step S12, including at least one of:
outputting at least one associated object according to the information of the first object;
outputting at least one associated object according to the usage data;
if the information of the first object is matched with the use data, outputting at least one associated object according to the use data;
and if the information of the first object does not match the use data, outputting at least one associated object according to the information of the first object.
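The four branches above amount to a small dispatch rule: prefer the usage data when it matches the first object's information, and fall back to the information otherwise. A minimal sketch, in which the `matches` predicate and the two recommender callbacks are assumed placeholders:

```python
def output_associated(info, usage, matches, by_info, by_usage):
    """Select associated objects per the S12 branches.

    matches(info, usage) -> bool : does the first object's information match the usage data?
    by_info(info)   -> list      : associated objects derived from the object's information
    by_usage(usage) -> list      : associated objects derived from the usage data
    """
    if usage is not None and (info is None or matches(info, usage)):
        return by_usage(usage)  # information matches usage data (or only usage data is known)
    return by_info(info)        # information does not match usage data

# toy callbacks for illustration
recs = output_associated(
    info={"function": "zoom"},
    usage={"functions": ["crop"]},
    matches=lambda i, u: i["function"] in u["functions"],
    by_info=lambda i: ["zoom-in", "zoom-out"],
    by_usage=lambda u: ["crop"],
)
print(recs)  # "zoom" is not among the used functions, so the info-based list is returned
```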
Optionally, the outputting of the at least one associated object according to the information of the first object includes at least one of:
outputting at least one associated object whose function mutually supports and/or is at least partially identical to the function of the first object;
outputting at least one associated object of which the type of the source is the same as that of the first object;
and outputting at least one associated object of which the associated time and/or the associated place are matched with the operation scene of the first object.
Optionally, at least one associated object is output whose function supports and/or is at least partially the same as the function of the first object, the function including a software function and/or a hardware function. For example, if the user selects an image enlargement function, an image reduction function and a picture cropping function may be output as associated objects; if the user selects a photographing function, a flash function may be recommended as an associated object.
Optionally, at least one associated object of the same type as the source of the first object is output. Optionally, the source of the first object includes at least one of an interface, an application, and a contact. For example, the first object is a file from WeChat, a file from QQ or nailing can be output as an associated object, the first object is a background from a system setting interface, a background from a WeChat setting interface can be output as an associated object, the first object is a file link sent by a colleague A, and a file link sent by a colleague B can be output as an associated object.
Optionally, at least one associated object whose associated time and/or place matches the operation scene of the first object is output. Optionally, the operation scene includes an operation time and/or place. An object associated with the corresponding time may be selected according to the operation time of the first object; for example, if the user chooses to send a file during meeting time, other files received during meeting time may be output as associated objects. Alternatively, an object associated with the corresponding place may be selected according to the operation place of the first object; for example, if the user selects an image he or she took at a park, images on the network related to the park's scenery may be output as associated objects.
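Taken together, the three information-based criteria above can be sketched as a filter over candidate objects. The candidate fields and example data below are illustrative assumptions:

```python
def match_by_info(first, candidates):
    """Keep candidates whose function, source type, or scene matches the first object."""
    out = []
    for c in candidates:
        same_function = bool(set(c["functions"]) & set(first["functions"]))
        same_source_type = c["source_type"] == first["source_type"]
        scene_match = (c.get("place") == first.get("place")
                       or c.get("time") == first.get("time"))
        if same_function or same_source_type or scene_match:
            out.append(c["name"])
    return out

first = {"functions": ["zoom-out"], "source_type": "app", "place": "park", "time": "meeting"}
candidates = [
    {"name": "zoom-in", "functions": ["zoom-in", "zoom-out"], "source_type": "app"},
    {"name": "park-photo", "functions": [], "source_type": "album", "place": "park"},
    {"name": "ringtone", "functions": ["audio"], "source_type": "settings", "place": "home"},
]
print(match_by_info(first, candidates))  # ['zoom-in', 'park-photo']
```

"zoom-in" matches on function and source type, "park-photo" on place, while "ringtone" matches none of the criteria and is dropped.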
Optionally, the at least one associated object is output according to the usage data, including at least one of:
if the frequency of using the associated object is greater than or equal to a preset frequency threshold value, outputting at least one associated object;
outputting at least one associated object of which the function is at least partially the same as the function of the used associated objects in the usage data;
outputting at least one associated object with the same type of source and the same source of the used associated object in the use data;
and outputting at least one associated object which is matched with the scene of the associated object in the associated time and/or place and the use data.
Optionally, if the frequency of using the associated object is greater than or equal to the preset frequency threshold, outputting at least one associated object. When the frequency of using the associated object in the usage data is greater than or equal to the preset frequency threshold, indicating that the user prefers to use the associated object, the associated object may be output according to the information and/or usage data of the first object.
Optionally, at least one associated object is output whose function is at least partially the same as that of the used associated objects in the usage data. For example, if the function of the associated objects the user habitually uses is a map function, indicating that the user is ordinarily willing to share information about his or her location or surroundings, the current location, nearby food, nearby shopping malls, and the like can be output as associated objects.
Optionally, at least one associated object of which the type of the source is the same as the source of the associated object used in the usage data is output. For example, a user often shares a hot article from a certain technical website, and therefore when the user operates a first object, there is a high probability that the user selects to share the hot article, and the hot article from the technical website or a related type of technical website can be output as an associated object.
Optionally, at least one associated object is output whose associated time and/or place matches the scenes of using associated objects in the usage data. Optionally, the scene of using an associated object includes the time and/or place of use. For example, if a user frequently shares movies or music with friends at home, then when the user operates the first object the user will very likely choose to share movies or music, and high-rated movies, popular movies, and music playlists may be output as associated objects.
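A usage-data-based recommender under these rules might first gate on the frequency threshold and then keep candidates that overlap the historical usage in function, source type, or scene. The threshold value, field names, and example data are illustrative assumptions:

```python
FREQ_THRESHOLD = 3  # preset frequency threshold (assumed value)

def match_by_usage(usage, candidates):
    """Recommend candidates resembling the associated objects used historically."""
    if usage["frequency"] < FREQ_THRESHOLD:
        return []  # associated objects are rarely used; recommend nothing
    out = []
    for c in candidates:
        if (set(c["functions"]) & set(usage["functions"])
                or c["source_type"] in usage["source_types"]
                or c.get("scene") in usage["scenes"]):
            out.append(c["name"])
    return out

usage = {"frequency": 5, "functions": ["map"],
         "source_types": ["tech-site"], "scenes": ["home"]}
candidates = [
    {"name": "current-location", "functions": ["map"], "source_type": "system"},
    {"name": "hot-article", "functions": [], "source_type": "tech-site"},
    {"name": "ringtone", "functions": ["audio"], "source_type": "settings", "scene": "office"},
]
print(match_by_usage(usage, candidates))  # ['current-location', 'hot-article']
```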
Alternatively, when the associated object is output according to the information and the usage data of the first object, the object determined according to the information of the first object may be filtered according to the usage data, and the associated object may be output. For example, an a article from the a site, a B article from the B site, and a C article from the C site are determined from the information of the first object, but if the usage data indicates that the user does not like to share the articles from the C site, the a article from the a site and the B article from the B site are output as the associated objects.
Optionally, when the associated object is output according to the information of the first object and the usage data, if the information of the first object is matched with the usage data, at least one associated object is output according to the usage data, so that the output associated object can better meet the user requirement; and/or if the information of the first object is not matched with the use data, outputting at least one associated object according to the information of the first object, so that the output associated object can be more accurate.
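The article example above can be sketched directly: candidates derived from the first object's information are filtered against sources the usage data shows the user avoids. The site and article names are hypothetical:

```python
def filter_by_usage(info_candidates, disliked_sources):
    """Drop info-derived candidates whose source the usage data marks as disliked."""
    return [c for c in info_candidates if c["source"] not in disliked_sources]

info_candidates = [
    {"name": "article A", "source": "site A"},
    {"name": "article B", "source": "site B"},
    {"name": "article C", "source": "site C"},
]
kept = filter_by_usage(info_candidates, disliked_sources={"site C"})
print([c["name"] for c in kept])  # ['article A', 'article B']
```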
As shown in fig. 4, as an operation interface of this embodiment, a picture 11 is displayed in a webpage 10, when a user needs to share the picture 11 with another application, the user may long-press the picture 11, and select the picture 11 as a first object, at this time, a recommended associated object 21 is displayed in a first preset interface 20, for example, another picture from an album or a picture frequently shared by the user may be displayed. Then, the user can select the picture 11 and/or the associated object 21 to perform subsequent operations, so that the object which can be shared is not limited to the picture selected by the user in the webpage 10, different interfaces or applications do not need to be opened for multiple times to share for multiple times, and the whole process is more convenient, quicker and more intelligent.
Optionally, when the associated object is output, if the source interface of the associated object is different from the first object, the source interface where the associated object is located may also be displayed, and since the source interface where the associated object is located usually includes content related to the associated object, the user may further adjust or add an associated object as needed. Optionally, when at least one associated object is output, the associated object is displayed at the original position of the source interface or is displayed in a floating manner above the original position, and through the floating display, a user can quickly distinguish the associated object in the interface for operation.
Optionally, in order to facilitate the user operation and comparison, the step S1 includes:
responding to a first operation on at least one first object, and outputting at least one first preset interface;
and displaying the first object and/or the associated object in the first preset interface.
Optionally, the original interface where the first object is located has a different display effect from the first preset interface. Optionally, the display effect includes at least one of a font, a size, a color, and an animation, so as to increase the recognition degree between different interfaces and facilitate better configuration of the effect respectively applied to the original interface where the first object is located and the first preset interface.
As shown in fig. 5, as another operation interface of this embodiment, a picture 11 is displayed in a web page 10, when a user needs to share the picture 11 with another application, the user may long-press the picture 11, and select the picture 11 as a first object, at this time, the picture 11 is displayed in a first preset interface 20, and at the same time, a recommended associated object 21 is also displayed, for example, another picture from an album or a picture frequently shared by the user is displayed. Then, the user can select the picture 11 and/or the associated object 21 to perform subsequent operations, so that the objects that can be shared are not limited to the pictures selected by the user in the webpage 10, different interfaces or applications do not need to be opened for multiple times to share for multiple times, all the objects that can be shared can be checked in one interface, and the method is more convenient and clear, so that the display area of the first preset interface 20 can be enlarged, further more associated objects can be displayed, and the whole process is more convenient, quicker and more intelligent.
Optionally, the method further comprises:
and updating the first object and/or the associated object in response to editing the first object and/or the associated object in the first preset interface.
Optionally, the editing includes at least one of: content interception, content splitting, content combining, object combining, deleting at least part of the content of an object, and copying at least part of the content of an object. Since the first object and the associated object are displayed in the first preset interface, a function for editing them there can also be provided. This makes the sharing of the first object and the associated object more flexible, reduces sharing misoperations, makes sharing more convenient, faster, and more intelligent, and improves sharing accuracy and the sharing success rate.
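The editing operations listed above can be sketched on a simple text-based object; the function names and the text representation are assumptions for illustration, not the patent's implementation:

```python
def intercept(content: str, start: int, end: int) -> str:
    """Content interception: keep only the selected span of an object's content."""
    return content[start:end]

def split(content: str, at: int) -> tuple:
    """Content splitting: break one object's content into two parts."""
    return content[:at], content[at:]

def combine(*parts: str) -> str:
    """Content/object combining: merge the content of several objects into one."""
    return "".join(parts)
```

Deletion and copying of partial content follow the same pattern, operating on the span selected in the first preset interface.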
Optionally, in step S2, performing preset processing by the second application or service includes:
S21: outputting a second preset interface, wherein optionally, the second preset interface includes an identifier of at least one second application or service;
S22: and in response to a third operation on the identifier of the second application or service meeting a preset condition, performing preset processing through the second application or service.
Optionally, at least one of the first operation, the second operation, and the third operation includes at least one of: a long press, a heavy press, a move, a long press and move, a heavy press and move, a long press or heavy press for a preset duration, a movement track, and biometric information. Optionally, the movement track may be a preset track such as a circle; when the track of the operation matches the preset track, security verification passes and the first object and/or the associated object are allowed to be processed. Optionally, the biometric information may be fingerprint information; when the user long-presses or heavy-presses, the fingerprint is collected by a fingerprint collection module arranged on the screen for security verification, and after verification passes, the first object and/or the associated object are allowed to be processed.
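The movement-track verification described above can be sketched as a point-by-point tolerance comparison between the sampled track and the preset track; the point format and the tolerance value are illustrative assumptions:

```python
def track_matches(sampled, preset, tolerance=10.0):
    """Return True when every sampled (x, y) point lies within `tolerance`
    of the corresponding preset-track point (security verification passes)."""
    if len(sampled) != len(preset):
        return False
    return all(
        ((sx - px) ** 2 + (sy - py) ** 2) ** 0.5 <= tolerance
        for (sx, sy), (px, py) in zip(sampled, preset)
    )
```

A real implementation would also normalize for scale and position; this sketch only shows the pass/fail decision.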
Optionally, the identifier of the second application or service is at least one of a name, an icon, and an interface thumbnail. Optionally, the second applications or services in the second preset interface may be displayed individually or in combination. For example, an application or service the user frequently shares to may be displayed as an interface thumbnail, including thumbnails of at least one commonly used interface of that application or service, such as the chat interfaces of different contacts in the WeChat application, while an application or service the user seldom uses may be displayed as an icon or a name.
Optionally, the second application or service is from the current device and/or an associated device of the current device. Optionally, the associated device is a device that has an association or binding relationship with the current device, for example another device used by the user, or a device of the user's family or friends, so that data and functions can be shared among different devices, which is convenient.
Optionally, satisfying the preset condition includes at least one of the following:
the identifier corresponding to the first object and/or the associated object is in contact with or intersects with the identifier of the second application or service;
dragging the identifier corresponding to the first object and/or the associated object to the identifier of the second application or service;
dragging the identifier of the second application or service to the identifier corresponding to the first object and/or the associated object;
the identifier corresponding to the first object and/or the associated object and the identifier of the second application or service are dragged to at least one preset area;
a selection operation on a second application or service in the second preset interface;
and passing verification of the corresponding movement track and/or biometric information.
Optionally, the preset area comprises at least one of: the interface of the first object, the interface of the associated object, the first preset interface, the second preset interface, and at least one third preset interface different from the interface of the first object, the interface of the associated object, the first preset interface, and the second preset interface.
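The first condition above (identifiers in contact or intersecting) can be sketched as a rectangle-overlap test; a minimal illustration assuming each on-screen identifier is an axis-aligned rectangle `(left, top, right, bottom)`, not the patent's actual hit-testing:

```python
def identifiers_touch(a, b):
    """Return True when two identifier rectangles overlap or touch edges,
    i.e. the contact/intersection preset condition is met."""
    al, at, ar, ab_ = a
    bl, bt, br, bb = b
    # They are apart only if one lies strictly to a side of the other.
    return not (ar < bl or br < al or ab_ < bt or bb < at)
```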
Optionally, performing preset processing includes at least one of:
outputting the content and/or attributes of the first object and/or the associated object;
opening the content and/or attributes of the first object and/or the associated object by the second application or service;
the first object and/or associated object is applied, copied, transferred or published to a second application or service.
Optionally, when a third operation on the identifier of the second application or service meets the preset condition, only the content and/or attributes of the first object and/or the associated object may be output. The user can then check this content and/or these attributes to decide whether to continue sharing; if so, sharing is completed with a confirmation operation.
Optionally, when a third operation on the identifier of the second application or service meets the preset condition, the content and/or attributes of the first object and/or the associated object may instead be opened by the second application or service. When opened, the interface of the second application or service may be entered for display, or a floating window may be used for display without directly entering that interface.
Optionally, when the third operation on the identifier of the second application or service meets the preset condition, the first object and/or the associated object may also be applied, copied, transferred, or published to the second application or service. For example: applying a WeChat ringtone and a popular ringtone to QQ; copying a picture on a web page and an associated picture in the album to a memo; transferring the files in the current folder and an associated folder to a network drive for saving; transferring the photographing function and the flash function from application A to application B; or publishing a picture, music, or video selected by the user together with a recommended picture, music, or video to a social application such as Weibo or WeChat. As another example, the first object may be a software function, a hardware function, or configuration information on a smart wearable device, the associated object may be a software function, a hardware function, or configuration information of another smart wearable device bound to the smart terminal using this operation method, and the second application or service may be a desktop, so that the associated functions of multiple smart wearable devices can be sent to the desktop at once to generate shortcuts. In this way, the user can quickly and comprehensively complete a sharing operation anytime and anywhere; for example, while browsing a web page, the user can quickly add selected information and its associated information to a memo, without saving first and inserting afterwards, and without inserting multiple times, which is very convenient.
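The three preset-processing branches elaborated above can be sketched as a small dispatcher; the action names, the object's dictionary shape, and the return values are illustrative assumptions, not the patent's implementation:

```python
def preset_process(action, obj, target):
    """Dispatch one of the three preset-processing branches for a
    first/associated object `obj` and a second application/service `target`."""
    if action == "output":
        # Branch 1: output the object's content and/or attributes for review.
        return {"content": obj["content"], "attributes": obj.get("attributes", {})}
    if action == "open":
        # Branch 2: open the object via the second application or service.
        return f"{target} opened {obj['name']}"
    if action in ("apply", "copy", "transfer", "publish"):
        # Branch 3: apply/copy/transfer/publish the object to the target.
        return f"{obj['name']} {action} -> {target}"
    raise ValueError(f"unknown action: {action}")
```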
The operation method of the application responds to a first operation on at least one first object and outputs at least one associated object; and responds to a second operation on the first object and/or the associated object by performing preset processing through at least one second application or service. With this technical solution, associated objects are automatically recommended for data processing according to the operation on the first object, which is more convenient, faster, and more intelligent.
Second embodiment
Fig. 6 is a flowchart illustrating an operation method according to the second embodiment. As shown in fig. 6, the operation method of the present application includes the following steps:
S10: responding to a first operation on at least one first object and a second operation on at least one second object, and outputting at least one associated object;
S20: and responding to a third operation on at least one of the first object, the second object, and the associated object, and performing preset processing through at least one second application or service.
In this way, the method can support sharing of multiple objects and can automatically recommend associated objects, and is more convenient, faster, and more intelligent.
Optionally, the first object and/or the second object comprises at least one of: pictures, text, video, music, hardware functions, software functions, screenshot interfaces of the current application or service, system screenshot interfaces, system modes, configuration information. Optionally, the first object and/or the second object may come from different interfaces or the same interface, thereby increasing the source of the sharable object.
Optionally, the step S10 includes:
S101: responding to a first operation on at least one first object and a second operation on at least one second object, and acquiring information and/or usage data of the first object and the second object;
S102: and outputting at least one associated object according to the information and/or the usage data of the first object and the second object.
Optionally, the information of the first object and/or the second object includes at least one of: function, source, operational scenario. Optionally, the source of the first object and/or the second object comprises at least one of an interface, an application, a contact, local or network data, an associated device.
Optionally, the associated device may be a device (for example, a smart wearable device, a car networking device, a smart home device, or a network device such as a server) directly or indirectly connected to the smart terminal applying the operation method, or may be a device having some association with the smart terminal applying the operation method (for example, the user is the same user, the login account is the same user, and the like).
Optionally, the usage data comprises at least one of: frequency of using the associated object, scene of using the associated object, function of using the associated object, source of using the associated object. Optionally, the source of the associated object used includes at least one of an application, a device, a contact, local or network data, an associated device.
Optionally, step S102 includes at least one of:
outputting at least one associated object according to the information of the first object and the second object;
outputting at least one associated object according to the usage data;
if the information of the first object matches the usage data, outputting at least one associated object of the first object according to the usage data;
if the information of the second object matches the usage data, outputting at least one associated object of the second object according to the usage data;
if the information of the first object does not match the usage data, outputting at least one associated object according to the information of the first object;
and if the information of the second object does not match the usage data, outputting at least one associated object according to the information of the second object.
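The matching branches above can be sketched for a single selected object as follows; the match test (a shared source) and the field names are assumed simplifications of the patent's matching logic:

```python
def recommend(obj_info, usage_data):
    """Return associated-object names for one selected object: prefer the
    usage data when the object's information matches it, otherwise fall
    back to the object's own information."""
    matches = [u for u in usage_data if u["source"] == obj_info["source"]]
    if matches:
        # Information matches usage data: recommend from usage data.
        return [u["object"] for u in matches]
    # No match: recommend according to the object's information alone.
    return [obj_info["source"] + ":related"]
```

In the two-object case, this would be run per object, and results could be intersected to get objects associated with both.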
Optionally, the outputting of the at least one associated object according to the information of the first object and/or the second object includes at least one of:
outputting at least one associated object whose function supports and/or is at least partially the same as the function of the first object and/or the second object;
outputting at least one associated object of which the type of the source is the same as the source of the first object and/or the second object;
and outputting at least one associated object of which the associated time and/or the associated place are matched with the operation scene of the first object and/or the second object.
Optionally, outputting at least one associated object according to the usage data includes at least one of:
if the frequency of using the associated object is greater than or equal to a preset frequency threshold, outputting at least one associated object;
outputting at least one associated object whose function is at least partially the same as the function of the used associated object in the usage data;
outputting at least one associated object whose source type is the same as the source of the used associated object in the usage data;
and outputting at least one associated object whose associated time and/or place matches the scene of the used associated object in the usage data.
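The frequency rule above can be sketched as a threshold filter; the count format and the default threshold value are illustrative assumptions:

```python
def frequent_objects(use_counts, threshold=3):
    """Return used associated objects whose use count meets the preset
    frequency threshold, sorted for stable output."""
    return sorted(name for name, count in use_counts.items() if count >= threshold)
```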
Optionally, when at least one associated object is output, only objects associated with the first object may be output, only objects associated with the second object may be output, or objects associated with both the first object and the second object may be output. Optionally, when objects associated with both are to be output, it may first be determined whether the first object and the second object are associated with each other; if so, objects associated with both are output, and if not, objects associated with the first object and objects associated with the second object are output at the same time.
Optionally, the first object, the second object and the associated object are displayed in a first preset interface, and the method further includes:
and updating the first object, the second object, and/or the associated object in response to editing the first object, the second object, and/or the associated object in the first preset interface.
Optionally, the method further comprises:
and responding to a fourth operation, entering a multi-object sharing mode, wherein optionally, at least two objects are allowed to be operated in the multi-object sharing mode.
Optionally, a multi-object sharing mode is set, in which the user can operate at least two objects, thereby reducing or avoiding misoperations.
Optionally, the step S10 includes:
responding to the first operation on the at least one first object, and prompting to perform a second operation on the at least one second object;
and outputting the at least one associated object in response to a second operation on the at least one second object.
Optionally, after entering the multi-object sharing mode, if a first operation on the first object is detected, the user is actively prompted to perform a second operation on the second object, and at least one associated object is output after the operation on each object is completed. This guided operation improves the user's understanding of the sharing mode and reduces or avoids misoperations.
Optionally, in step S20, performing preset processing by the second application or service includes:
S201: outputting a second preset interface, wherein optionally, the second preset interface includes an identifier of at least one second application or service;
S202: and in response to a fifth operation on the identifier of the second application or service meeting the preset condition, performing preset processing through the second application or service.
Optionally, performing preset processing includes at least one of:
outputting the content and/or attributes of the first object, the second object, and/or the associated object;
opening the content and/or attributes of the first object, the second object, and/or the associated object by the second application or service;
the first object, the second object, and/or the associated object are applied, copied, transferred, or published to a second application or service.
Among the above method steps, the implementation process of the steps related to the first embodiment is the same as that of the first embodiment, and is not described herein again.
As shown in fig. 7, in an operation interface of this embodiment, setting buttons for a chat background 11 and a ringtone 12 are displayed in a settings interface 10 of WeChat. When a user wants to use the same chat background and ringtone as in WeChat in other social applications, the user may simultaneously or sequentially long-press the buttons for the chat background 11 and the ringtone 12, selecting the chat background 11 as a first object and the ringtone 12 as a second object. A first preset interface 20 is then displayed, showing an associated object 21 of the first object and/or the second object, which may be a popular chat background, a popular ringtone, font setting information, or the like. The user can then select the chat background 11, the ringtone 12, and/or the associated object 21 for subsequent operations. Multiple objects in the settings interface 10 can thus be shared, the shareable objects are not limited to those the user selected in the settings interface 10, and multiple sharing operations are not required, making the whole process more convenient, faster, and more intelligent.
The operation method of the application responds to a first operation on at least one first object and a second operation on at least one second object, and outputs at least one associated object; and responds to a third operation on at least one of the first object, the second object, and the associated object by performing preset processing through at least one second application or service. In this way, the method can support sharing of multiple objects and can automatically recommend associated objects, and is more convenient, faster, and more intelligent.
Third embodiment
Fig. 8 is a flowchart illustrating an operation method according to the third embodiment. As shown in fig. 8, the operation method of the present application includes the following steps:
S100: responding to at least one first object meeting a first preset condition, and outputting at least one associated object;
S200: responding to a second operation on the first object and/or the associated object, and outputting at least one first preset interface, wherein optionally, the first preset interface includes an identifier of at least one second application or service;
S300: and executing preset processing in response to a third operation on the identifier of the second application or service meeting a second preset condition.
In this way, the associated object and the second application or service can be automatically recommended, and data processing is performed based on the operation on the identifier of the second application or service, which is more intuitive, convenient, fast, and intelligent.
Optionally, satisfying the first preset condition includes at least one of:
receiving a preset operation;
matching the operation time and/or the operation place of the first object with a preset scene;
the first object has a preset function;
the first object is from a preset application.
Optionally, the operation time and/or place of the first object matches a preset scene, where the preset scene includes a preset time and/or place; the preset scene may be, for example, at the office, at home, on a workday, or on a weekend. The preset function includes a preset software function and/or a preset hardware function, such as an air-conditioner control function or a photographing function. Optionally, the preset application is an application set by the user or a default application. For example, if the user often sends files in DingTalk, DingTalk may be set as the preset application to improve working efficiency, and when the first object is selected in DingTalk, the associated object is automatically output.
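The scene match described above can be sketched as a simple time/weekday check; the scene names and hour ranges are illustrative assumptions, not values from the patent:

```python
def in_scene(hour, weekday, scene):
    """Return True when an operation's time matches the preset scene.
    `weekday`: 0 = Monday ... 6 = Sunday; `hour`: 0-23."""
    if scene == "workday":
        return weekday < 5 and 9 <= hour < 18  # assumed office hours
    if scene == "weekend":
        return weekday >= 5
    return False
```

A place-based scene (at the office, at home) would add a location check alongside the time check.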
Optionally, the step S100 includes:
S1001: responding to at least one first object meeting a first preset condition, and acquiring information and/or usage data of the first object;
S1002: and outputting at least one associated object according to the information and/or the usage data of the first object.
Optionally, the information of the first object includes at least one of: function, source, operational scenario. Optionally, the source of the first object includes at least one of an interface, an application, a contact, local or network data, an associated device.
Optionally, the associated device may be a device (for example, a smart wearable device, a car networking device, a smart home device, or a network device such as a server) directly or indirectly connected to the smart terminal applying the operation method, or may be a device having some association with the smart terminal applying the operation method (for example, the user is the same user, the login account is the same user, and the like).
Optionally, the usage data comprises at least one of: frequency of using the associated object, context of using the associated object, function of using the associated object, source of using the associated object. Optionally, the source of the associated object used includes at least one of an application, a device, a contact, local or network data, an associated device.
Optionally, step S1002 includes at least one of:
outputting at least one associated object according to the information of the first object;
outputting at least one associated object according to the usage data;
if the information of the first object is matched with the use data, outputting at least one associated object according to the use data;
and if the information of the first object does not match the use data, outputting at least one associated object according to the information of the first object.
Optionally, the output of the at least one associated object according to the information of the first object includes at least one of:
outputting at least one associated object whose function supports and/or is at least partially the same as the function of the first object;
outputting at least one associated object of which the type of the source is the same as that of the first object;
and outputting at least one associated object of which the associated time and/or the associated place are matched with the operation scene of the first object.
Optionally, outputting at least one associated object according to the usage data includes at least one of:
if the frequency of using the associated object is greater than or equal to a preset frequency threshold value, outputting at least one associated object;
outputting at least one associated object whose function is at least partially the same as the function of the used associated object in the usage data;
outputting at least one associated object whose source type is the same as the source of the used associated object in the usage data;
and outputting at least one associated object whose associated time and/or place matches the scene of the used associated object in the usage data.
Optionally, the first object and the associated object are displayed in a first preset interface, and the method further includes:
and updating the first object and/or the associated object in response to editing the first object and/or the associated object in the first preset interface.
Optionally, before outputting the first preset interface, the method further includes: determining a second application or service. Determining a second application or service includes at least one of:
determining at least one second application or service according to the sharing record of the first object;
determining at least one second application or service according to the current background application;
determining at least one second application or service according to the usage data;
determining at least one second application or service according to the content keywords of the first object;
determining at least one second application or service according to the content type of the first object;
and determining at least one second application or service according to the operation of adding the application or service.
Optionally, at least one second application or service is determined according to the sharing record of the first object, for example, if the sharing record indicates that the user frequently shares the first object to a specific type of application or service or a fixed application or service, the corresponding second application or service may be determined according to the sharing record.
Optionally, at least one second application or service is determined according to the current background application; the current background application is the application most recently switched to run in the background, which is usually an application frequently used by the user, so this determination has good accuracy.
Optionally, at least one second application or service is determined according to the usage data, for example, if the application or service that the user is accustomed to sharing is a social application, a memo or a picture editing application, and the like, the social application, the memo or the picture editing application may be recommended as the second application or service.
Optionally, at least one second application or service is determined according to the content keyword of the first object, for example, if the first object is a piece of text containing "work plan", a related application or service such as memo, social application or mailbox application may be recommended as the second application or service.
Optionally, at least one second application or service is determined according to the content type of the first object. For example, when the content type of the first object is a picture, text, video, or music, a social application or an editor that can process the corresponding content may be recommended; when the content type of the first object is a hardware function, a software function, a screenshot interface of the current application or service, or a system screenshot interface, an application or service that lacks the corresponding function and/or interface may be recommended; and when the content type of the first object is a system mode or configuration information, an application or service that can apply the corresponding mode and/or configuration may be recommended.
Optionally, at least one second application or service is determined according to the operation of adding the application or service, for example, when the at least one second application or service displayed on the first preset interface does not have an application or service that needs to be shared, the user may select the application or service to add to the first preset interface for display.
Optionally, the user may set the frequently used application as a fixed application or service, and each time the first preset interface is displayed, the set application or service is displayed.
Optionally, when the current first object has no corresponding sharing record and/or there is currently no usage data, at least one second application or service is determined according to at least one of the current background application, a content keyword of the first object, and the content type of the first object.
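One possible priority order over the determination rules and fallback above can be sketched as follows; all field names and the specific ordering are assumptions for illustration, not the patent's prescribed order:

```python
def pick_second_apps(share_record, usage_data, background_app, keywords, keyword_apps):
    """Pick candidate second applications/services for a first object.
    share_record: apps the object was shared to before;
    usage_data: apps the user habitually shares to;
    keywords: content text of the first object;
    keyword_apps: assumed keyword -> app mapping."""
    if share_record:
        return share_record
    if usage_data:
        return usage_data
    # No sharing record and no usage data: fall back to content keywords,
    # then to the current background application.
    hits = [app for kw, app in keyword_apps.items() if kw in keywords]
    return hits or [background_app]
```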
In this way, the second application or service can be recommended more intelligently and accurately, reducing user operations and making sharing more convenient and quick.
Optionally, the identifier of the second application or service is at least one of a name, an icon, and an interface thumbnail. Optionally, the second applications or services in the first preset interface may be displayed individually or in combination. For example, an application or service the user frequently shares to may be displayed as an interface thumbnail, including thumbnails of at least one commonly used interface of that application or service, such as the chat interfaces of different contacts in the WeChat application, while an application or service the user seldom uses may be displayed as an icon or a name.
Optionally, the second application or service is from the current device and/or an associated device of the current device. Optionally, the associated device is a device that has an association or binding relationship with the current device, for example another device used by the user, or a device of the user's family or friends, so that data and functions can be shared among different devices, which is convenient. In one scenario, a user can share the screen-casting function of a game application on the user's own device with the game application on a friend's device, so that the game pictures of the two devices can be cast onto the same screen for interaction, increasing the fun. In another scenario, the user can share the configuration information of the incoming-call do-not-disturb mode on a mobile phone with the notification application of devices such as a smart watch, setting the do-not-disturb mode for multiple of the user's devices synchronously, which is very convenient.
Optionally, performing preset processing includes at least one of:
outputting the content and/or attributes of the first object and/or the associated object;
opening the content and/or attributes of the first object and/or the associated object by the second application or service;
the first object and/or associated object is applied, copied, transferred or published to a second application or service.
Among the above method steps, the implementation process of the steps related to the first embodiment is the same as that of the first embodiment, and is not described herein again.
As shown in fig. 9, in an operation interface of this embodiment, a received file 1 is displayed in a DingTalk chat interface 10. When the user wants to share the file 1, the file 1 is selected as the first object 11. Since DingTalk is a preset application, or since the time at which the user selects the object is working time, associated objects 21 are displayed in an interface 20, such as a recently received file 2 and file 3, and a file 4 and a picture 1 that are often sent together with the file. The user can then select file 1, file 2, file 3, file 4, and picture 1 for subsequent operations. Associated objects are thus automatically recommended when the first object meets the condition, the shareable objects are not limited to the object the user selected in the chat interface 10, and multiple sharing operations are not required, making the whole process more convenient, faster, and more intelligent.
The operation method of the application, in response to at least one first object meeting a first preset condition, outputs at least one associated object; in response to a second operation on the first object and/or the associated object, outputs at least one first preset interface, wherein optionally, the first preset interface includes an identifier of at least one second application or service; and executes preset processing in response to a third operation on the identifier of the second application or service meeting a second preset condition. In this way, the associated object and the second application or service can be automatically recommended, and data processing is performed based on the operation on the identifier of the second application or service, which is more intuitive, convenient, fast, and intelligent.
An embodiment of the present application further provides an intelligent terminal, including a memory and a processor, where the memory stores an operating program, and the operating program, when executed by the processor, implements the steps of any one of the methods above.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program, and the computer program, when executed by a processor, implements the steps of any one of the methods above.
The embodiments of the intelligent terminal and of the computer-readable storage medium provided in the present application may include all technical features of any of the operation-method embodiments described above; their expanded description is essentially the same as that of the method embodiments and is not repeated here.
Embodiments of the present application also provide a computer program product comprising computer program code which, when run on a computer, causes the computer to execute the method in the various possible embodiments described above.
Embodiments of the present application further provide a chip comprising a memory and a processor, where the memory is configured to store a computer program and the processor is configured to call and run the computer program from the memory, so that a device on which the chip is installed executes the method in the various possible embodiments described above.
It is to be understood that the foregoing scenarios are only examples and do not limit the application scenarios of the technical solutions provided in the embodiments of the present application; the technical solutions of the present application may also be applied to other scenarios. For example, as those skilled in the art will appreciate, with the evolution of system architectures and the emergence of new service scenarios, the technical solution provided in the embodiments of the present application is equally applicable to similar technical problems.
The serial numbers of the embodiments of the present application above are merely for description and do not imply the relative merits of the embodiments.
The steps of the method in the embodiments of the present application may be reordered, combined, or deleted according to actual needs.
The units of the device in the embodiments of the present application may be merged, divided, or deleted according to actual needs.
In the present application, the same or similar terms, technical solutions and/or application-scenario descriptions are generally described in detail only at their first occurrence; for brevity, later occurrences are not described in detail again. For any term, technical solution or application scenario not described in detail later, reference may be made to the earlier related description.
In the present application, each embodiment is described with its own emphasis; for parts not described or detailed in one embodiment, reference may be made to the descriptions of the other embodiments.
For brevity, not all possible combinations of the technical features of the embodiments are described in the present application; however, as long as a combination of these technical features involves no contradiction, it should be considered within the scope of the present application.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the methods of the embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware; in many cases the former is the better implementation. Based on this understanding, the technical solution of the present application may be embodied as a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disc) and including instructions for causing a terminal device (such as a mobile phone, computer, server, controlled terminal, or network device) to execute the method of each embodiment of the present application.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid-state drive (SSD)), among others.
The above description is only a preferred embodiment of the present application and is not intended to limit its scope; any equivalent structural or process modification made using the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, likewise falls within the scope of the present application.

Claims (21)

1. A method of operation, comprising the steps of:
S1: in response to a first operation on at least one first object, outputting at least one associated object, wherein the first object is data to be shared;
S2: in response to a second operation on the first object and/or the associated object, performing preset processing through at least one second application or service;
wherein the outputting of the at least one associated object comprises at least one of:
outputting at least one associated object whose function mutually supports and/or is at least partially the same as the function of the first object;
outputting at least one associated object whose type of source is the same as the source of the first object;
outputting at least one associated object whose associated time and/or place matches the operation scene of the first object;
outputting at least one associated object if the frequency of using the associated object is greater than or equal to a preset frequency threshold;
outputting at least one associated object whose function is at least partially the same as the function of an associated object in the use data;
outputting at least one associated object whose type of source is the same as the source of a used associated object in the use data;
outputting at least one associated object whose associated time and/or place matches the scene of an associated object in the use data;
and wherein the performing of the preset processing comprises at least one of:
outputting the content and/or attributes of the first object, and, if sharing is to be continued, performing a confirmation operation to share the first object;
outputting the content and/or attributes of the first object and the associated object, and, if sharing is to be continued, performing a confirmation operation to share the first object and the associated object;
opening the content and/or attributes of the first object through the second application or service, and displaying them in a floating window without directly entering an interface of the second application or service;
opening the content and/or attributes of the first object and the associated object through the second application or service, and displaying them in a floating window without directly entering an interface of the second application or service;
when the first object is at least one of a hardware function, a software function, a screenshot interface or a system screenshot interface of a current application or service, and configuration information, applying the first object to the second application or service so that the second application or service can use the function or configuration corresponding to the first object;
and applying, copying, transferring or publishing the first object and the associated object to the second application or service.
2. The method according to claim 1, wherein step S1 comprises:
S11: in response to a first operation on at least one first object, acquiring information and/or use data of the first object;
S12: outputting at least one associated object according to the information and/or the use data of the first object.
3. The method of claim 2, wherein step S12 comprises at least one of:
outputting at least one associated object according to the information of the first object;
outputting at least one associated object according to the use data;
if the information of the first object matches the use data, outputting at least one associated object according to the use data;
and if the information of the first object does not match the use data, outputting at least one associated object according to the information of the first object.
4. The method according to any one of claims 1 to 3, wherein step S1 comprises:
in response to a first operation on at least one first object, outputting at least one first preset interface;
and displaying the first object and/or the associated object in the first preset interface.
5. The method of claim 4, further comprising:
updating the first object and/or the associated object in response to editing the first object and/or the associated object in the first preset interface.
6. The method of claim 5, wherein editing comprises at least one of: content interception, content splitting, content combining, object combining, deletion of at least part of content of an object, and copying of at least part of content of an object.
7. The method according to any one of claims 1 to 3, wherein performing the preset processing through the second application or service comprises:
S21: outputting a second preset interface, wherein the second preset interface includes an identifier of at least one second application or service;
S22: in response to a third operation on the identifier of the second application or service meeting a preset condition, performing the preset processing through the second application or service.
8. The method according to claim 7, wherein the meeting of the preset condition comprises at least one of the following:
the identifier corresponding to the first object and/or the associated object touching or intersecting the identifier of the second application or service;
the identifier corresponding to the first object and/or the associated object being dragged to the identifier of the second application or service;
the identifier of the second application or service being dragged to the identifier corresponding to the first object and/or the associated object;
the identifier corresponding to the first object and/or the associated object and the identifier of the second application or service being dragged to at least one preset area;
a selection operation on a second application or service on the second preset interface;
and verification of a corresponding movement track and/or biometric information.
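The "touch or intersect" condition in claim 8 amounts to a rectangle hit test between the dragged object identifier and the application identifier. A minimal sketch, assuming axis-aligned identifier bounds; the `Rect` type and function names are illustrative assumptions, not part of the claims:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

def intersects(a: Rect, b: Rect) -> bool:
    # Two axis-aligned identifier bounds touch or intersect when their
    # intervals overlap (or abut) on both axes.
    return (a.x <= b.x + b.w and b.x <= a.x + a.w and
            a.y <= b.y + b.h and b.y <= a.y + a.h)

def drop_target(dragged: Rect, identifiers: Dict[str, Rect]) -> Optional[str]:
    # Return the second application/service whose identifier the dragged
    # object identifier touches or intersects, if any.
    for name, bounds in identifiers.items():
        if intersects(dragged, bounds):
            return name
    return None
```

The same predicate covers both drag directions, since intersection is symmetric in the two rectangles.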
9. A method of operation, comprising the steps of:
S10: in response to a first operation on at least one first object and a second operation on at least one second object, outputting at least one associated object, wherein the first object and the second object are data to be shared;
S20: in response to a third operation on at least one of the first object, the second object and the associated object, performing preset processing through at least one second application or service;
wherein the outputting of the at least one associated object comprises at least one of:
outputting at least one associated object whose function mutually supports and/or is at least partially the same as the function of the first object and/or the second object;
outputting at least one associated object whose type of source is the same as that of the first object and/or the second object;
outputting at least one associated object whose associated time and/or place matches the operation scene of the first object and/or the second object;
outputting at least one associated object if the frequency of using the associated object is greater than or equal to a preset frequency threshold;
outputting at least one associated object whose function is at least partially the same as the function of an associated object in the use data;
outputting at least one associated object whose type of source is the same as the source of a used associated object in the use data;
outputting at least one associated object whose associated time and/or place matches the scene of an associated object in the use data;
and wherein the performing of the preset processing comprises at least one of:
outputting the content and/or attributes of the first object and the second object, and, if sharing is to be continued, performing a confirmation operation to share the first object and the second object;
outputting the content and/or attributes of the first object, the second object and the associated object, and, if sharing is to be continued, performing a confirmation operation to share the first object, the second object and the associated object;
opening the content and/or attributes of the first object and the second object through the second application or service, and displaying them in a floating window without directly entering an interface of the second application or service;
opening the content and/or attributes of the first object, the second object and the associated object through the second application or service, and displaying them in a floating window without directly entering an interface of the second application or service;
when the first object and the second object are at least one of hardware functions, software functions, screenshot interfaces or system screenshot interfaces of a current application or service, and configuration information, applying the first object and the second object to the second application or service so that the second application or service can use the functions or configurations corresponding to the first object and the second object;
and applying, copying, transferring or publishing the first object, the second object and the associated object to the second application or service.
10. The method of claim 9, wherein step S10 comprises:
S101: in response to a first operation on at least one first object and a second operation on at least one second object, acquiring information and/or use data of the first object and the second object;
S102: outputting at least one associated object according to the information of the first object, the information of the second object and/or the use data.
11. The method according to claim 10, wherein step S102 comprises at least one of:
outputting at least one associated object according to the information of the first object and/or the second object;
outputting at least one associated object according to the use data;
if the information of the first object matches the use data, outputting at least one associated object of the first object according to the use data;
if the information of the second object matches the use data, outputting at least one associated object of the second object according to the use data;
if the information of the first object does not match the use data, outputting at least one associated object according to the information of the first object;
and if the information of the second object does not match the use data, outputting at least one associated object according to the information of the second object.
12. The method according to any one of claims 9 to 11, further comprising at least one of:
displaying at least one of the first object, the second object and the associated object in a first preset interface;
updating the first object, the second object and/or the associated object in response to the first object, the second object and/or the associated object being edited in the first preset interface;
and in response to a fourth operation, entering a sharing mode for multiple objects.
13. The method according to any one of claims 9 to 11, wherein step S10 comprises:
in response to the first operation on the at least one first object, prompting for a second operation on at least one second object;
and outputting the at least one associated object in response to the second operation on the at least one second object.
14. A method of operation, comprising the steps of:
S100: in response to at least one first object meeting a first preset condition, outputting at least one associated object, wherein the first object is data to be shared;
S200: in response to a second operation on the first object and/or the associated object, outputting at least one first preset interface, wherein the first preset interface includes an identifier of at least one second application or service;
S300: executing preset processing in response to a third operation on the identifier of the second application or service meeting a second preset condition;
wherein the outputting of the at least one associated object comprises at least one of:
outputting at least one associated object whose function mutually supports and/or is at least partially the same as the function of the first object;
outputting at least one associated object whose type of source is the same as the source of the first object;
outputting at least one associated object whose associated time and/or place matches the operation scene of the first object;
outputting at least one associated object if the frequency of using the associated object is greater than or equal to a preset frequency threshold;
outputting at least one associated object whose function is at least partially the same as the function of an associated object in the use data;
outputting at least one associated object whose type of source is the same as the source of a used associated object in the use data;
outputting at least one associated object whose associated time and/or place matches the scene of an associated object in the use data;
and wherein the executing of the preset processing comprises at least one of:
outputting the content and/or attributes of the first object, and, if sharing is to be continued, performing a confirmation operation to share the first object;
outputting the content and/or attributes of the first object and the associated object, and, if sharing is to be continued, performing a confirmation operation to share the first object and the associated object;
opening the content and/or attributes of the first object through the second application or service, and displaying them in a floating window without directly entering an interface of the second application or service;
opening the content and/or attributes of the first object and the associated object through the second application or service, and displaying them in a floating window without directly entering an interface of the second application or service;
when the first object is at least one of a hardware function, a software function, a screenshot interface or a system screenshot interface of a current application or service, and configuration information, applying the first object to the second application or service so that the second application or service can use the function or configuration corresponding to the first object;
and applying, copying, transferring or publishing the first object and the associated object to the second application or service.
15. The method according to claim 14, wherein the meeting of the first preset condition comprises at least one of:
receiving a preset operation;
the operation time and/or operation place of the first object matching a preset scene;
the first object having a preset function;
and the first object coming from a preset application.
16. The method according to claim 14, wherein step S100 comprises:
S1001: in response to at least one first object meeting a first preset condition, acquiring information and/or use data of the first object;
S1002: outputting at least one associated object according to the information and/or the use data of the first object.
17. The method according to claim 16, wherein step S1002 comprises at least one of:
outputting at least one associated object according to the information of the first object;
outputting at least one associated object according to the use data;
if the information of the first object matches the use data, outputting at least one associated object according to the use data;
and if the information of the first object does not match the use data, outputting at least one associated object according to the information of the first object.
18. The method of any one of claims 14 to 17, further comprising at least one of:
displaying the first object and/or the associated object in a first preset interface;
updating the first object and/or the associated object in response to the first object and/or the associated object being edited in the first preset interface;
and determining the second application or service.
19. The method according to any one of claims 14 to 17, wherein the performing of the preset processing includes at least one of:
outputting the content and/or attributes of the first object and/or the associated object;
opening, by the second application or service, content and/or attributes of the first object and/or the associated object;
applying, copying, transferring or publishing the first object and/or the associated object to the second application or service.
20. An intelligent terminal, characterized in that the intelligent terminal comprises a memory and a processor, wherein the memory stores an operating program which, when executed by the processor, implements the steps of the operation method according to any one of claims 1 to 19.
21. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the steps of the operation method according to any one of claims 1 to 19.
CN202210285014.7A 2022-03-07 2022-03-23 Operation method, intelligent terminal and storage medium Active CN114371803B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210285014.7A CN114371803B (en) 2022-03-23 2022-03-23 Operation method, intelligent terminal and storage medium
PCT/CN2023/078276 WO2023169236A1 (en) 2022-03-07 2023-02-24 Operation method, intelligent terminal, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210285014.7A CN114371803B (en) 2022-03-23 2022-03-23 Operation method, intelligent terminal and storage medium

Publications (2)

Publication Number Publication Date
CN114371803A CN114371803A (en) 2022-04-19
CN114371803B true CN114371803B (en) 2022-07-29

Family

ID=81146949

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210285014.7A Active CN114371803B (en) 2022-03-07 2022-03-23 Operation method, intelligent terminal and storage medium

Country Status (1)

Country Link
CN (1) CN114371803B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023169236A1 (en) * 2022-03-07 2023-09-14 深圳传音控股股份有限公司 Operation method, intelligent terminal, and storage medium
CN114595007A (en) * 2022-05-10 2022-06-07 深圳传音控股股份有限公司 Operation method, intelligent terminal and storage medium
CN114594886B (en) * 2022-04-24 2023-03-21 深圳传音控股股份有限公司 Operation method, intelligent terminal and storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN109725790A (en) * 2018-12-28 2019-05-07 Oppo广东移动通信有限公司 A kind of application recommended method, terminal and storage medium
CN111880713A (en) * 2020-07-25 2020-11-03 Oppo广东移动通信有限公司 Object processing method, related device, terminal and computer storage medium
CN112486385A (en) * 2020-11-30 2021-03-12 维沃移动通信有限公司 File sharing method and device, electronic equipment and readable storage medium

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US9552138B2 (en) * 2013-05-10 2017-01-24 Lg Electronics Inc. Mobile terminal and method for controlling the same
KR20150007723A (en) * 2013-07-12 2015-01-21 삼성전자주식회사 Mobile apparutus and control method thereof
US10642455B2 (en) * 2015-12-28 2020-05-05 Ssh Communications Security Oyj User interfaces in a computer system
CN109670817B (en) * 2017-10-17 2023-09-12 阿里巴巴集团控股有限公司 Data processing method and device
WO2019213547A1 (en) * 2018-05-03 2019-11-07 Blustream Corporation Product care lifecycle management



Similar Documents

Publication Publication Date Title
CN114371803B (en) Operation method, intelligent terminal and storage medium
CN114327189B (en) Operation method, intelligent terminal and storage medium
CN113487705A (en) Image annotation method, terminal and storage medium
CN113194197A (en) Interaction method, terminal and storage medium
CN113342234A (en) Display control method, mobile terminal and storage medium
CN114510166B (en) Operation method, intelligent terminal and storage medium
CN114594886B (en) Operation method, intelligent terminal and storage medium
CN114595007A (en) Operation method, intelligent terminal and storage medium
CN115277922A (en) Processing method, intelligent terminal and storage medium
CN113900559A (en) Information processing method, mobile terminal and storage medium
CN113835586A (en) Icon processing method, intelligent terminal and storage medium
CN114138144A (en) Control method, intelligent terminal and storage medium
CN113901245A (en) Picture searching method, intelligent terminal and storage medium
CN112230825A (en) Sharing method, mobile terminal and storage medium
CN113852717B (en) Wallpaper module position migration method, equipment and computer readable storage medium
CN114327184A (en) Data management method, intelligent terminal and storage medium
CN116225279A (en) Adjustment method, intelligent terminal and storage medium
CN114003751A (en) Information processing method, intelligent terminal and storage medium
CN113326537A (en) Data processing method, terminal device and storage medium
CN114756320A (en) Display method, terminal and storage medium
CN113805767A (en) Information processing method, mobile terminal and storage medium
CN117572998A (en) Display method, intelligent terminal and storage medium
CN114995730A (en) Information display method, mobile terminal and storage medium
CN113568532A (en) Display control method, mobile terminal and storage medium
CN114003159A (en) Processing method, intelligent terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant