CN106789589B - Sharing processing method, sharing processing device and terminal

Sharing processing method, sharing processing device and terminal

Info

Publication number
CN106789589B
CN106789589B
Authority
CN
China
Prior art keywords
target users
information resource
information
sharing
module
Prior art date
Legal status
Active
Application number
CN201710000655.2A
Other languages
Chinese (zh)
Other versions
CN106789589A (en)
Inventor
吴玲玲
Current Assignee
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201710000655.2A priority Critical patent/CN106789589B/en
Publication of CN106789589A publication Critical patent/CN106789589A/en
Application granted granted Critical
Publication of CN106789589B publication Critical patent/CN106789589B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H04L51/046Interoperability with other network applications or services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/06Message adaptation to terminal or network requirements
    • H04L51/063Content adaptation, e.g. replacement of unsuitable content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Information Transfer Between Computers (AREA)
  • Telephone Function (AREA)

Abstract

The invention discloses a sharing processing method comprising the following steps: in a running first application, capturing content communicated with one or more target users through one or more second applications, wherein the communicated content is the content exchanged with the one or more target users through the one or more second applications within a predetermined time; determining, in the first application and according to the captured content, the information resource to be shared with the one or more target users; and sharing the determined information resource to the one or more target users. The invention also discloses a sharing processing device and a terminal. They solve the problem in the related art that, when a user wants to share content that was previously discussed with a friend, the user has forgotten which friend the discussion was with and therefore cannot share the content accurately.

Description

Sharing processing method, sharing processing device and terminal
Technical Field
The present invention relates to the field of terminal technologies, and in particular, to a sharing processing method, an apparatus, and a terminal.
Background
With the development of the internet and the popularization of terminals, the user base of terminals keeps growing, and users place increasingly intelligent and humanized requirements on software.
Nowadays a terminal may serve its user as a game console or a television, as a learning machine, or even as a child's playground, bringing more fun to people's lives.
For example, while listening to music or reading news, a user may hear a certain song, suddenly recall chatting about it with a friend, and want to share it, but cannot remember for a long time which friend it was. The user therefore cannot accurately share the content with that friend, and the user experience is poor.
The related art thus has the problem that, when a user wants to share content that was previously discussed with a friend, the user has forgotten which friend it was, so the content cannot be shared accurately.
Disclosure of Invention
The main purpose of the present invention is to provide a sharing processing method, a sharing processing device and a terminal, so as to solve the problem in the related art that, when a user wants to share content previously discussed with a friend, the user has forgotten which friend it was, so the content cannot be shared accurately.
In order to achieve the above object, the present invention provides a sharing processing method, including:
capturing, in a running first application, content communicated with one or more target users through one or more second applications, wherein the communicated content is the content exchanged with the one or more target users through the one or more second applications within a predetermined time;
determining information resources shared for the one or more target users in the first application according to the captured content communicated with the one or more target users;
sharing the determined information resource to the one or more target users.
Optionally, capturing content communicated with the one or more target users through the one or more second applications includes:
capturing at least one of the following in communication with the one or more target users through the one or more second applications: usage habits, hobbies, chat content, news concerns.
Optionally, determining, in the first application, information resources shared for the one or more target users according to the captured content communicated with the one or more target users includes:
detecting an information resource which is playing or running in the first application;
determining whether the information resource being played or operated matches the captured content communicated with the one or more target users;
and if so, determining the information resource which is being played or operated as the information resource shared by the one or more target users.
Optionally, the sharing of the determined information resource to the one or more target users includes:
receiving a long-press instruction for long-pressing the information resource when the information resource appears in the status bar;
displaying one or more hover icons over the information resource, wherein the hover icons include an avatar or user nickname of the one or more target users;
receiving a dragging instruction for dragging the information resource to the head portrait of the one or more target users;
and sharing the information resources to the one or more target users according to the dragging instruction.
Optionally, after sharing the determined information resource to the one or more target users, the method further includes:
detecting an information resource similar to the information resource in the first application;
automatically triggering sharing of the information resource similar to the information resource to the one or more target users.
According to another aspect of the present invention, there is also provided a sharing processing apparatus, including:
a capturing module, configured to capture, in a running first application, content communicated with one or more target users through one or more second applications, wherein the communicated content is the content exchanged with the one or more target users through the one or more second applications within a predetermined time;
the determining module is used for determining information resources shared by the one or more target users in the first application according to the captured content communicated with the one or more target users;
and the sharing module is used for sharing the determined information resources to the one or more target users.
Optionally, the capturing module includes:
a capturing unit configured to capture at least one of the following in communication with the one or more target users through the one or more second applications: usage habits, hobbies, chat content, news concerns.
Optionally, the determining module includes:
the detection unit is used for detecting the information resource which is played or operated in the first application;
the judging unit is used for judging whether the information resource which is playing or running is matched with the captured content communicated with the one or more target users;
and the determining unit is used for determining the information resource which is being played or operated as the information resource shared by the one or more target users under the condition that the judgment result is yes.
Optionally, the sharing module includes:
the first receiving unit is used for receiving a long-press instruction of long-press of the information resource when the information resource appears in the status bar;
a display unit, configured to display one or more floating icons above the information resource, where the floating icons include head portraits or user nicknames of the one or more target users;
a second receiving unit, configured to receive a dragging instruction for dragging the information resource to the avatar of the one or more target users;
and the sharing unit is used for sharing the information resources to the one or more target users according to the dragging instruction.
Optionally, the apparatus further comprises:
a detection module for detecting an information resource similar to the information resource in the first application;
and the automatic triggering module is used for automatically triggering the sharing of the information resources similar to the information resources to the one or more target users.
According to another aspect of the present invention, there is also provided a terminal including one of the above-described apparatuses.
According to the method and the device, content communicated with one or more target users through one or more second applications is captured in the running first application, the information resource to be shared with the one or more target users is determined in the first application according to the captured content, and the determined information resource is shared with the one or more target users. This solves the problem in the related art that, when a user wants to share content previously discussed with a friend, the user has forgotten which friend it was and so cannot share the content accurately. By capturing the chat content with friends, an information resource matching that chat content is shared automatically, and the user experience is improved.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an optional mobile terminal for implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
FIG. 3 is a flow chart of a sharing processing method according to an embodiment of the invention;
FIG. 4 is a first diagram illustrating a sharing process according to an embodiment of the invention;
FIG. 5 is a second diagram illustrating a sharing process according to an embodiment of the invention;
FIG. 6 is a third diagram illustrating a sharing process according to an embodiment of the invention;
fig. 7 is a block diagram of a sharing processing apparatus according to an embodiment of the present invention;
fig. 8 is a first block diagram of a sharing processing apparatus according to a preferred embodiment of the present invention;
fig. 9 is a second block diagram of a sharing processing apparatus according to a preferred embodiment of the present invention;
fig. 10 is a third block diagram of the sharing processing apparatus according to the preferred embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are adopted only to facilitate the explanation of the present invention and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a navigation device, and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that the configuration according to the embodiment of the present invention can be applied to a fixed type terminal in addition to elements particularly used for moving purposes.
Fig. 1 is a schematic diagram of a hardware structure of an optional mobile terminal for implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an a/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, etc.
Fig. 1 illustrates the mobile terminal 100 having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. The elements of the mobile terminal 100 will be described in detail below.
The wireless communication unit 110 may generally include one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like. Also, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112. The broadcast signal may exist in various forms, for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of digital video broadcasting-handheld (DVB-H), and the like. The broadcast receiving module 111 may receive a signal broadcast by using various types of broadcasting systems. In particular, the broadcast receiving module 111 may receive digital broadcasting by using a digital broadcasting system such as a data broadcasting system of multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), forward link media (MediaFLO @), terrestrial digital broadcasting integrated service (ISDB-T), and the like. The broadcast receiving module 111 may be constructed to be suitable for various broadcasting systems that provide broadcast signals as well as the above-mentioned digital broadcasting systems. The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
The mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access of the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless internet access technology to which the module relates may include WLAN (wireless LAN) (Wi-Fi), Wibro (wireless broadband), Wimax (worldwide interoperability for microwave access), HSDPA (high speed downlink packet access), and the like.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include bluetooth (TM), Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), zigbee (TM), and the like.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal. A typical example of the location information module 115 is a GPS (global positioning system). According to the current technology, the GPS calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, a method for calculating position and time information uses three satellites and corrects an error of the calculated position and time information by using another satellite. In addition, the GPS can calculate speed information by continuously calculating current position information in real time.
The a/V input unit 120 is used to receive an audio or video signal. The a/V input unit 120 may include a camera 121 and a microphone 122, and the camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 151. The image frames processed by the cameras 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the construction of the mobile terminal 100. The microphone 122 may receive sounds (audio data) via the microphone 122 in a phone call mode, a recording mode, a voice recognition mode, or the like, and is capable of processing such sounds into audio data. The processed audio (voice) data may be converted into a format output transmittable to a mobile communication base station via the mobile communication module 112 in case of a phone call mode. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data to control various operations of the mobile terminal 100 according to a command input by a user. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal 100. Various command signals or power input from the cradle may be used as a signal for identifying whether the mobile terminal 100 is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. Depending on the particular desired implementation, mobile terminal 100 may include two or more display units (or other display devices), for example, mobile terminal 100 may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm unit 153 may provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify the occurrence of an event. For example, the alarm unit 153 may provide an output in the form of vibration, and when a call, a message, or some other incoming communication (communicating communication) is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to inform the user thereof. By providing such a tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide an output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs and the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, and the like) that has been or will be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, and the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Up to this point, the mobile terminal 100 has been described in terms of its functionality. In addition, the mobile terminal 100 in the embodiment of the present invention may be a mobile terminal such as a folder type, a bar type, a swing type, a slide type, and other various types, and is not limited herein.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), global system for mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, a CDMA wireless communication system may include a plurality of intelligent terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with a BSC275, which may be coupled to the base station 270 via a backhaul. The backhaul line may be constructed according to any of several known interfaces, which may include, for example, european/american standard high capacity digital lines (E1/T1), Asynchronous Transfer Mode (ATM), network protocol (IP), point-to-point protocol (PPP), frame relay, high-rate digital subscriber line (HDSL), Asymmetric Digital Subscriber Line (ADSL), or various types of digital subscriber lines (xDSL). It will be understood that a system as shown in fig. 2 may include multiple BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by a multi-directional antenna or an antenna pointing in a particular direction being radially distant from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support multiple frequency allocations, with each frequency allocation having a particular frequency spectrum (e.g., 1.25MHz, 5MHz, etc.).
The intersection of partitions with frequency allocations may be referred to as a CDMA channel. The BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology. In such a case, the term "base station" may be used to generically refer to a single BSC275 and at least one BS 270. The base stations may also be referred to as "cells". Alternatively, each partition of a particular BS 270 may be referred to as a plurality of cell sites.
As shown in fig. 2, a Broadcast Transmitter (BT)295 transmits a broadcast signal to the mobile terminal 100 operating within the system. A broadcast receiving module 111 as shown in fig. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295. In fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellite 300 assists in locating at least one of the plurality of mobile terminals 100.
In fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The location information module 115 (e.g., GPS) as shown in fig. 1 is generally configured to cooperate with the satellites 300 to obtain desired positioning information. Other techniques that can track the location of the mobile terminal may be used instead of or in addition to GPS tracking techniques. In addition, at least one GPS satellite 300 may selectively or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100. The mobile terminal 100 is generally engaged in conversations, messaging, and other types of communications. Each reverse link signal received by a particular base station is processed within a particular BS 270. The obtained data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including coordination of soft handoff procedures between BSs 270. The BSCs 275 also route the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN290 interfaces with the MSC 280, the MSC interfaces with the BSCs 275, and the BSCs 275 accordingly control the BS 270 to transmit forward link signals to the mobile terminal 100.
Based on the above mobile terminal, an embodiment of the present invention provides a sharing processing method, and fig. 3 is a flowchart of the sharing processing method according to the embodiment of the present invention, as shown in fig. 3, the method includes the following steps:
step S302, capturing content communicated with one or more target users through one or more second applications in a running first application, wherein the communicated content is the content communicated with the one or more target users through the one or more second applications within a preset time;
step S304, determining information resources shared by the one or more target users in the first application according to the captured content communicated with the one or more target users;
step S306, sharing the determined information resource to the one or more target users.
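For concreteness, the following Kotlin sketch shows one possible way to arrange steps S302 to S306. It is only an illustration: the type and function names (CapturedContent, InformationResource, CommunicationCapturer) and the keyword-based matching are assumptions of this sketch, not part of the disclosure.

```kotlin
// Minimal sketch of the S302–S306 flow. All type and function names here are
// illustrative assumptions; the patent does not prescribe a language or implementation.

data class CapturedContent(val targetUser: String, val keywords: Set<String>)
data class InformationResource(val title: String, val keywords: Set<String>)

// S302: content exchanged with the target users through second applications within a
// predetermined time; where it comes from (WeChat, QQ, SMS, ...) is abstracted away.
fun interface CommunicationCapturer {
    fun capture(): List<CapturedContent>
}

// S304: in the first application, pick the target users whose captured content
// matches the resource that is currently playing or running.
fun determineTargets(playing: InformationResource, captured: List<CapturedContent>): List<String> =
    captured.filter { c -> c.keywords.any { it in playing.keywords } }
        .map { it.targetUser }
        .distinct()

// S306: share the determined resource with the matched target users.
fun share(resource: InformationResource, targets: List<String>) {
    targets.forEach { println("share '${resource.title}' -> $it") }
}

fun main() {
    val capturer = CommunicationCapturer {
        listOf(CapturedContent("friend B", setOf("zhang jie", "ni zhan")))
    }
    val playing = InformationResource("Ni Zhan", setOf("zhang jie", "ni zhan"))
    share(playing, determineTargets(playing, capturer.capture()))
}
```

In the example main function, the captured chat with friend B mentions the song that is playing, so the song is shared with friend B.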
Through the above steps, content communicated with one or more target users through one or more second applications is captured in the running first application, the information resource to be shared with the one or more target users is determined in the first application according to the captured content, and the determined information resource is shared with the one or more target users. This solves the problem in the related art that, when a user wants to share content previously discussed with a friend, the user has forgotten which friend it was and so cannot share the content accurately. By capturing the chat content with friends, an information resource matching that chat content is shared automatically, and the user experience is improved.
By capturing big-data information about the user, such as usage habits, interests and hobbies, chat content and recently followed news, information sources such as music and news services can predict which specific items will interest the user's friends. While the user is listening to, reading or otherwise consuming the information, it can be quickly shared with or recommended to those friends, and the related interactive functions are loaded automatically at that moment. When the user uses a related information resource, the system intelligently identifies the resource and the friends or groups who may be interested in it, and sharing can then be triggered quickly. The system intelligently recommends the friends or groups that may want the information, allowing the user and those friends or groups to interact and share directly and instantly.
The system captures information such as the user's chat records and frequently shared content, and can automatically trigger the sharing function when the user uses an application or information similar to what was captured before. One possible sharing interaction is to long-press the item in the status bar, whereupon one or more floating icons appear above the long-pressed entry; these icons may be the avatars of the intended sharing targets, and dragging the item onto an avatar shares it immediately.
For example, user A is listening to online music and happens to hear the song "Ni Zhan" by Zhang Jie. A few days earlier, A had chatted with friend B on WeChat about Zhang Jie, and the singer Zhang Jie had also been discussed in A's middle-school friends group. When the status bar shows that Zhang Jie's "Ni Zhan" is playing, A long-presses the song entry or the music widget in the status bar; two avatar icons, one for friend B and one for the middle-school friends group, then float above the status bar. Dragging the entry onto each of the two avatars quickly shares the song with friend B and with the middle-school friends group.
Capturing the content communicated with the one or more target users through the one or more second applications may include: capturing, through the one or more second applications and within a predetermined time, at least one of the following for the one or more target users: usage habits, hobbies, chat content and points of interest. The capture may be triggered only for communication with frequently used contacts; the list of frequent contacts may be set by the user, or may be determined automatically by jointly considering the contact frequency and contact duration with each contact within the predetermined time. The communication content with a frequent contact may be determined from WeChat chats, QQ chats and short-message exchanges.
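The following sketch illustrates one way the frequent-contact list and the captured chat content described above could be represented. The event type CommEvent, the helper names and the thresholds (five events, ten minutes of total contact time) are illustrative assumptions; the embodiment only says that frequency and duration within the predetermined time are considered jointly.

```kotlin
import java.time.Duration
import java.time.Instant

// One communication event captured from a second application (WeChat, QQ, SMS, ...).
data class CommEvent(val contact: String, val at: Instant, val duration: Duration, val text: String)

// Derive the frequent-contact list from contact frequency and total contact time within
// the predetermined window; the thresholds are illustrative, not taken from the patent.
fun frequentContacts(
    events: List<CommEvent>,
    window: Duration,
    now: Instant = Instant.now(),
    minEvents: Int = 5,
    minTotalTime: Duration = Duration.ofMinutes(10)
): Set<String> =
    events.filter { it.at.isAfter(now.minus(window)) }
        .groupBy { it.contact }
        .filterValues { evs ->
            evs.size >= minEvents &&
                evs.fold(Duration.ZERO) { acc, e -> acc + e.duration } >= minTotalTime
        }
        .keys

// Collect the chat text exchanged with each frequent contact inside the window.
fun captureChatContent(
    events: List<CommEvent>,
    window: Duration,
    now: Instant = Instant.now()
): Map<String, List<String>> {
    val frequent = frequentContacts(events, window, now)
    return events
        .filter { it.contact in frequent && it.at.isAfter(now.minus(window)) }
        .groupBy({ it.contact }, { it.text })
}
```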
Determining, in the first application, the information resource to be shared with the one or more target users according to the captured content may include: detecting the information resource that is being played or running in the first application, determining whether it matches the captured content communicated with the one or more target users, and, if so, determining that it is the information resource to be shared with the one or more target users. Fig. 4 is a first schematic diagram of the sharing process according to an embodiment of the present invention. As shown in Fig. 4, the user is listening to music through the Kugou Music application and a song is playing. The terminal examines the communication content with frequent contacts within a predetermined period, detects that a song is mentioned in that content, determines that the currently playing song corresponds to it, and determines the playing song to be the information resource to share with friend 1 and friend 2. Fig. 5 is a second schematic diagram of the sharing process according to an embodiment of the present invention; as shown in Fig. 5, the avatars or nicknames of friend 1 and friend 2 are displayed below the sharing icon in the upper right corner.
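A simple way to realise the matching step described above is keyword overlap between the playing resource and the captured chat content, as sketched below. The tokenisation and the overlap rule are assumptions of this sketch; the embodiment only requires that the playing or running resource matches the captured communication content.

```kotlin
// The playing or running resource in the first application, reduced to matchable text.
data class PlayingResource(val title: String, val artist: String? = null)

// Illustrative tokeniser: lower-case words made of letters and digits.
private fun tokens(s: String): Set<String> =
    s.lowercase()
        .split(Regex("[^\\p{L}\\p{N}]+"))
        .filter { it.isNotBlank() }
        .toSet()

// Return the target users whose captured chat content mentions the playing resource.
// "Matching" is approximated by keyword overlap; the patent does not fix the rule.
fun matchTargets(
    playing: PlayingResource,
    chatByContact: Map<String, List<String>>
): List<String> {
    val resourceTokens = tokens(playing.title) + tokens(playing.artist ?: "")
    return chatByContact
        .filterValues { messages ->
            messages.any { msg -> tokens(msg).intersect(resourceTokens).isNotEmpty() }
        }
        .keys
        .toList()
}
```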
Sharing the determined information resource to the one or more target users may include: receiving a long-press instruction when the information resource appears in the status bar and is long-pressed; displaying one or more floating icons above the information resource, the floating icons including the avatars or nicknames of the one or more target users; receiving a drag instruction that drags the information resource onto the avatar of the one or more target users; and sharing the information resource to the one or more target users according to the drag instruction. Fig. 6 is a third schematic diagram of the sharing process according to an embodiment of the present invention. As shown in Fig. 6, the user long-presses the floating song title, the terminal receives the long-press instruction and obtains the song content accordingly, then receives a drag instruction as the user drags the song onto friend 1 or friend 2 below the sharing icon, and shares the obtained content with friend 1 or friend 2 according to the drag instruction.
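The long-press, floating-avatar and drag interaction can be sketched in a framework-agnostic way as follows. The hooks onLongPress and onDropOnAvatar and the injected callbacks are hypothetical stand-ins for the events a real status bar would deliver; they are not part of any platform API.

```kotlin
// Avatar (or nickname) icon floated above the status-bar entry.
data class Avatar(val userId: String, val nickname: String)

// Controller for the long-press / floating-avatar / drag interaction. The callbacks
// wired in here are hypothetical; a real terminal would connect them to its own
// status-bar events rather than to these placeholder lambdas.
class StatusBarShareController(
    private val showFloatingAvatars: (List<Avatar>) -> Unit,
    private val share: (resourceId: String, userId: String) -> Unit
) {
    private var pendingResource: String? = null

    // Long-press instruction received while the resource is shown in the status bar.
    fun onLongPress(resourceId: String, targets: List<Avatar>) {
        pendingResource = resourceId
        showFloatingAvatars(targets) // avatars or nicknames appear above the entry
    }

    // Drag instruction: the entry was dropped onto one of the floating avatars.
    fun onDropOnAvatar(avatar: Avatar) {
        pendingResource?.let { share(it, avatar.userId) }
    }
}
```

Calling onLongPress with the avatars of friend 1 and friend 2 and then onDropOnAvatar for the chosen avatar reproduces the drag-to-share behaviour of Fig. 6.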
News feeds and similar information are handled in the same way as music. Suppose user A is reading news in a running news application, and content communicated with one or more target users is captured through the WeChat and QQ applications. The target users here are, for example, three contacts B, C and D; the number of target users can be set as the user requires, e.g. one, two or more. Among the chat content with B, C and D, the topic of the news item currently being read (a report concerning a city leader and housing prices) is found in the chats with B, so the information resource to be shared with target user B is determined in the first application according to the captured content communicated with B. Specifically, the information resource running in the news application is detected, it is determined whether that resource matches the captured content communicated with B, and if so the running resource is determined to be the information resource to share with B. The determined resource is then shared with B, i.e. the news item A is reading is shared with B: A long-presses the title of the news item, the mobile terminal receives the long-press instruction while the resource appears in the status bar and displays one or more floating icons above the resource, including the avatar or nickname of target user B; A drags the selected title onto B's avatar icon, the terminal receives the drag instruction, and shares the information resource with B according to it. By long-pressing the news title in the status bar, having B's avatar icon float above it, and dragging the item onto that icon, the news item is quickly shared with B.
To share information resources with friends even more intelligently, after a certain information resource has been shared with the one or more target users, an information resource similar to it may be detected in the first application, and sharing of that similar resource to the one or more target users may be triggered automatically. The user then does not need to drag the shared content; the sharing completes directly, which simplifies the sharing procedure.
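One possible shape for this automatic trigger is sketched below: once a resource has been shared manually, later resources whose keyword sets are sufficiently similar are shared to the same targets without any further interaction. The similarity measure and the threshold are assumptions; the embodiment only states that a similar information resource triggers sharing automatically.

```kotlin
// Record of a resource that the user already shared manually, and to whom.
data class SharedRecord(val keywords: Set<String>, val targets: List<String>)

// After one manual share, later resources that are sufficiently similar are shared to
// the same targets automatically, without any long-press or drag. The Jaccard-style
// similarity and the 0.5 threshold are assumptions, not taken from the patent.
class AutoShareTrigger(
    private val share: (title: String, targets: List<String>) -> Unit,
    private val threshold: Double = 0.5
) {
    private val history = mutableListOf<SharedRecord>()

    fun recordManualShare(keywords: Set<String>, targets: List<String>) {
        history.add(SharedRecord(keywords, targets))
    }

    // Called whenever a new resource starts playing or running in the first application.
    fun onResourceDetected(title: String, keywords: Set<String>) {
        for (record in history) {
            val overlap = keywords.intersect(record.keywords).size.toDouble()
            val union = (keywords + record.keywords).size.toDouble()
            if (union > 0 && overlap / union >= threshold) {
                share(title, record.targets) // sharing is triggered directly
                return
            }
        }
    }
}
```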
According to another aspect of the embodiments of the present invention, there is also provided a sharing processing apparatus, and fig. 7 is a block diagram of the sharing processing apparatus according to the embodiments of the present invention, as shown in fig. 7, including:
a capturing module 72, configured to capture, in a running first application, content communicated with one or more target users through one or more second applications, where the content of communication is content communicated with the one or more target users through the one or more second applications within a predetermined time;
a determining module 74, configured to determine, in the first application, information resources shared for the one or more target users according to the captured content communicated with the one or more target users;
a sharing module 76, configured to share the determined information resource with the one or more target users.
Optionally, the capturing module 72 includes:
a capturing unit configured to capture at least one of the following in communication with the one or more target users through the one or more second applications: usage habits, hobbies, chat content, news concerns.
Fig. 8 is a first block diagram of a sharing processing apparatus according to a preferred embodiment of the present invention, and as shown in fig. 8, the determining module 74 includes:
a detecting unit 82, configured to detect an information resource that is being played or running in the first application;
a determining unit 84, configured to determine whether the information resource being played or executed matches the captured content communicated with the one or more target users;
the determining unit 86 is configured to determine, if the determination result is yes, that the information resource being played or running is an information resource shared by the one or more target users.
Fig. 9 is a second block diagram of the sharing processing apparatus according to the preferred embodiment of the present invention, and as shown in fig. 9, the sharing module 76 includes:
a first receiving unit 92, configured to receive a long-press instruction for long-pressing the information resource when the information resource appears in the status bar;
a display unit 94, configured to display one or more floating icons above the information resource, where the floating icons include head portraits or user nicknames of the one or more target users;
a second receiving unit 96, configured to receive a dragging instruction for dragging the information resource to the avatar of the one or more target users;
the sharing unit 98 is configured to share the information resource to the one or more target users according to the dragging instruction.
Fig. 10 is a third block diagram of the sharing processing apparatus according to the preferred embodiment of the present invention, and as shown in fig. 10, the apparatus further includes:
a detecting module 102, configured to detect an information resource similar to the information resource in the first application;
an automatic triggering module 104, configured to automatically trigger sharing of the information resource similar to the information resource to the one or more target users.
According to another aspect of the present invention, there is also provided a terminal including one of the above-described apparatuses.
According to the embodiments of the present invention, content communicated with one or more target users through one or more second applications is captured in the running first application, the information resource to be shared with the one or more target users is determined in the first application according to the captured content, and the determined information resource is shared with the one or more target users. This solves the problem in the related art that, when a user wants to share content previously discussed with a friend, the user has forgotten which friend it was and so cannot share the content accurately. By capturing the chat content with friends, an information resource matching that chat content is shared automatically, and the user experience is improved.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of multiple computing devices, and alternatively, they may be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device, and in some cases, the steps shown or described may be performed in an order different than that described herein, or they may be separately fabricated into individual integrated circuit modules, or multiple ones of them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (8)

1. A method for shared processing, comprising:
capturing, in a running first application, content communicated with one or more target users through one or more second applications, wherein the communicated content is the content exchanged with the one or more target users through the one or more second applications within a predetermined time;
determining, in the first application, information resources shared for the one or more target users according to the captured content communicated with the one or more target users, including: detecting an information resource which is playing or running in the first application; determining whether the information resource being played or operated matches the captured content communicated with the one or more target users; if so, determining the information resources which are being played or operated as the information resources shared by the one or more target users;
sharing the determined information resource to the one or more target users.
2. The method of claim 1, wherein capturing content communicated with the one or more target users through the one or more second applications comprises:
capturing at least one of the following in communication with the one or more target users through the one or more second applications: usage habits, hobbies, chat content, news concerns.
3. The method of claim 1, wherein sharing the determined information resource to the one or more target users comprises:
receiving a long-press instruction for long-pressing the information resource when the information resource appears in the status bar;
displaying one or more hover icons over the information resource, wherein the hover icons include an avatar or user nickname of the one or more target users;
receiving a dragging instruction for dragging the information resource to the head portrait of the one or more target users;
and sharing the information resources to the one or more target users according to the dragging instruction.
4. The method according to any one of claims 1 to 3, wherein after sharing the determined information resource to the one or more target users, the method further comprises:
detecting, in the first application, an information resource similar to the shared information resource;
automatically triggering sharing of the similar information resource to the one or more target users.
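The automatic follow-up sharing of claim 4 could, for example, be driven by a similarity check between the already-shared resource and each newly detected resource. The sketch below assumes a Jaccard-style tag overlap and an arbitrary 0.5 threshold; both are illustrative assumptions, not details taken from the patent.

```kotlin
data class Resource(val id: String, val tags: Set<String>)

class AutoShareWatcher(
    private val sharedResource: Resource,
    private val similarityThreshold: Double = 0.5,
    private val shareAction: (Resource) -> Unit
) {
    // Jaccard-style overlap between tag sets (illustrative similarity measure).
    private fun similarity(a: Resource, b: Resource): Double {
        val union = (a.tags + b.tags).size
        return if (union == 0) 0.0 else (a.tags intersect b.tags).size.toDouble() / union
    }

    // Called whenever a new resource starts playing or running in the first
    // application after the initial share.
    fun onResourceDetected(candidate: Resource) {
        if (similarity(sharedResource, candidate) >= similarityThreshold) {
            shareAction(candidate)  // automatically triggered share
        }
    }
}

fun main() {
    val shared = Resource("song-1", setOf("pop", "2017", "artist-x"))
    val watcher = AutoShareWatcher(shared) { println("auto-sharing ${it.id} to the target users") }
    watcher.onResourceDetected(Resource("song-2", setOf("pop", "artist-x", "live")))  // similarity 0.5 -> shared
}
```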
5. A shared processing apparatus, comprising:
a capturing module, configured to capture, in a running first application, content communicated with one or more target users through one or more second applications, wherein the communicated content is content communicated with the one or more target users through the one or more second applications within a preset time;
a determining module, configured to determine, in the first application, an information resource to be shared with the one or more target users according to the captured content communicated with the one or more target users, wherein the determining module includes: a detection unit, configured to detect an information resource that is being played or running in the first application;
a judging unit, configured to judge whether the information resource that is being played or running matches the captured content communicated with the one or more target users;
a determining unit, configured to determine, if the judging result is yes, the information resource that is being played or running as the information resource shared with the one or more target users;
and a sharing module, configured to share the determined information resource to the one or more target users.
6. The apparatus of claim 5, wherein the capture module comprises:
a capturing unit configured to capture at least one of the following in communication with the one or more target users through the one or more second applications: usage habits, hobbies, chat content, news concerns.
7. The apparatus of claim 5, wherein the sharing module comprises:
a first receiving unit, configured to receive a long-press instruction applied to the information resource when the information resource appears in the status bar;
a display unit, configured to display one or more hover icons above the information resource, wherein the hover icons include avatars or user nicknames of the one or more target users;
a second receiving unit, configured to receive a dragging instruction for dragging the information resource to the avatar of the one or more target users;
and a sharing unit, configured to share the information resource to the one or more target users according to the dragging instruction.
8. A terminal, characterized in that it comprises the apparatus of any of claims 5 to 7.
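Read together, claims 5 to 8 describe an apparatus of three cooperating modules hosted by a terminal. The following sketch is one hypothetical way to express that decomposition as Kotlin interfaces wired end to end; the type names and the toy implementations in main are assumptions for illustration only.

```kotlin
data class Captured(val keywords: Set<String>)
data class Resource(val id: String, val tags: Set<String>)

interface CaptureModule { fun capture(): Captured }                      // capturing module
interface DetermineModule { fun determine(c: Captured): Resource? }      // determining module
interface ShareModule { fun share(r: Resource, users: List<String>) }    // sharing module

// Claim 8: a terminal comprising the apparatus, wiring the modules end to end.
class SharingTerminal(
    private val captureModule: CaptureModule,
    private val determineModule: DetermineModule,
    private val shareModule: ShareModule
) {
    fun run(targetUsers: List<String>) {
        val captured = captureModule.capture()
        val resource = determineModule.determine(captured) ?: return
        shareModule.share(resource, targetUsers)
    }
}

fun main() {
    val terminal = SharingTerminal(
        captureModule = object : CaptureModule {
            override fun capture() = Captured(setOf("pop"))
        },
        determineModule = object : DetermineModule {
            override fun determine(c: Captured) =
                Resource("song-1", setOf("pop")).takeIf { it.tags.intersect(c.keywords).isNotEmpty() }
        },
        shareModule = object : ShareModule {
            override fun share(r: Resource, users: List<String>) =
                println("sharing ${r.id} to $users")
        }
    )
    terminal.run(listOf("alice"))
}
```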
CN201710000655.2A 2017-01-03 2017-01-03 Sharing processing method, sharing processing device and terminal Active CN106789589B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710000655.2A CN106789589B (en) 2017-01-03 2017-01-03 Sharing processing method, sharing processing device and terminal

Publications (2)

Publication Number Publication Date
CN106789589A CN106789589A (en) 2017-05-31
CN106789589B (en) 2020-02-14

Family

ID=58952818

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710000655.2A Active CN106789589B (en) 2017-01-03 2017-01-03 Sharing processing method, sharing processing device and terminal

Country Status (1)

Country Link
CN (1) CN106789589B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107770369A (en) * 2017-09-28 2018-03-06 努比亚技术有限公司 Control method, device and the computer-readable recording medium of mobile terminal
CN108616443A (en) * 2018-03-30 2018-10-02 北京三快在线科技有限公司 Associated person information methods of exhibiting and device
CN109951382B (en) * 2019-04-30 2021-08-17 赵剑锋 Open type instant social contact system and method for automatically managing interpersonal relationship
CN110460578B (en) * 2019-07-09 2022-02-22 北京达佳互联信息技术有限公司 Method and device for establishing association relationship and computer readable storage medium
KR20210130583A (en) * 2020-04-22 2021-11-01 라인플러스 주식회사 Method and system for sharing content on instant messaging application
CN112702258B (en) * 2020-12-21 2022-08-30 维沃移动通信(杭州)有限公司 Chat message sharing method and device and electronic equipment
CN113162845B (en) * 2021-04-26 2022-12-27 维沃移动通信有限公司 Image sharing method and device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102630049B (en) * 2011-12-31 2014-12-10 上海聚力传媒技术有限公司 Method for determining interest degree of user about playing video and equipment thereof
CN104216905A (en) * 2013-06-03 2014-12-17 华为终端有限公司 Application sharing method and device
CN104123360A (en) * 2014-07-18 2014-10-29 腾讯科技(深圳)有限公司 Application recommendation data acquisition method, device and system and electronic device
CN104580452A (en) * 2014-12-30 2015-04-29 北京奇虎科技有限公司 Content information sharing method and system and electronic device
CN104735524A (en) * 2015-03-31 2015-06-24 上海华勤通讯技术有限公司 Data sharing method and data sharing system of mobile terminal
CN105574182A (en) * 2015-12-22 2016-05-11 北京搜狗科技发展有限公司 News recommendation method and device as well as device for news recommendation

Also Published As

Publication number Publication date
CN106789589A (en) 2017-05-31

Similar Documents

Publication Publication Date Title
CN106789589B (en) Sharing processing method, sharing processing device and terminal
CN106888158B (en) Instant messaging method and device
WO2017071424A1 (en) Mobile terminal and method for sharing file
CN106990889B (en) Shortcut operation implementation method and device
CN105303398B (en) Information display method and system
CN106911850B (en) Mobile terminal and screen capturing method thereof
CN105487802B (en) Screen projection management method, device and system
CN105391562B (en) Group chat device, method and mobile terminal
CN106533928B (en) Method and device for updating unread message reminding identification
CN106527933B (en) Control method and device for edge gesture of mobile terminal
CN106791238B (en) Call control method and device of multi-party call conference system
CN106648324B (en) Hidden icon control method and device and terminal
CN106708321B (en) Touch screen touch method and device and terminal
CN106249989B (en) Method for arranging social application program icons during content sharing and mobile terminal
CN106547439B (en) Method and device for processing message
CN106657579B (en) Content sharing method and device and terminal
CN106598538B (en) Instruction set updating method and system
CN106227415A (en) icon display method, device and terminal
CN106528298A (en) Resource distribution method and device
CN106453883B (en) Intelligent terminal and message notification processing method thereof
CN106371704B (en) Application shortcut layout method of screen locking interface and terminal
CN109542317B (en) Display control method, device and storage medium of double-sided screen mobile terminal
CN106658116B (en) Play control method, device and terminal
CN105227771B (en) Picture transmission method and device
CN104639428B (en) Self-adaptive method for session scene in instant messaging and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant