CN115599565A - Method and device for sending clipboard data - Google Patents


Info

Publication number
CN115599565A
Authority
CN
China
Prior art keywords
data
picture
app
storage space
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110780907.4A
Other languages
Chinese (zh)
Inventor
林娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110780907.4A
Publication of CN115599565A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/54 Interprogram communication
    • G06F9/543 User-generated data transfer, e.g. clipboards, dynamic data exchange [DDE], object linking and embedding [OLE]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/955 Retrieval from the web using information identifiers, e.g. uniform resource locators [URL]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Telephone Function (AREA)

Abstract

The application relates to the field of terminals and provides a method and a device for sending clipboard data, which can solve the problem of cross-device copy failure for complex data. The method includes the following steps: a first device receives a copy operation, the copy operation being used for copying target data including a target picture and/or a target text, the target text being formatted text; in response to the copy operation, the first device acquires first location information of the target data, the first location information indicating the storage location of the target data in the storage space of a copy APP; the first device writes the target data into the storage space of a collaborative APP according to the first location information; and the first device sends clipboard data including second location information to a second device, the second location information indicating the storage location of the target data in the storage space of the collaborative APP.

Description

Method and device for sending clipboard data
Technical Field
The present application relates to the field of terminals, and in particular, to a method and an apparatus for sending clipboard data.
Background
A clipboard (clipboard) is an area in the memory of a computer that is used to temporarily store exchanged information. Information can be shared between different Applications (APPs) through a clipboard, for example, a user temporarily stores data of APP1 in the clipboard through a copy operation, and then sends the data of APP1 in the clipboard to APP2 through a paste operation, thereby completing information sharing.
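The exchange described above can be modeled minimally as a shared temporary store between two APPs (a plain-Java sketch for illustration only, not the actual Android clipboard API; all names are hypothetical):

```java
import java.util.Optional;

// Minimal model of a clipboard: an area that temporarily stores
// exchanged information so that different APPs can share it.
class Clipboard {
    private String content;

    // APP1 temporarily stores its data in the clipboard (copy operation).
    public void copy(String data) {
        this.content = data;
    }

    // APP2 later retrieves the stored data (paste operation).
    public Optional<String> paste() {
        return Optional.ofNullable(content);
    }
}
```

In the single-device case this hand-off is all that is needed; the rest of the application addresses what goes wrong when the pasting APP lives on a different device.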
In a cross-device copy-and-paste scenario, part of the content in the clipboard data at the sender may be unavailable to the receiver. For example, when a mobile phone sends clipboard data to a computer, the pictures in the clipboard data cannot be acquired by the computer, which degrades the user experience.
Disclosure of Invention
The application provides a method and a device for sending clipboard data, which can solve the problem that a picture cannot be copied successfully during cross-device copy and paste, and improve the user experience.
In a first aspect, a method for sending clipboard data is provided, including: a first device receives a copy operation, the copy operation being used for copying target data including a target picture and/or a target text, the target text being formatted text; in response to the copy operation, the first device acquires first location information of the target data, the first location information indicating the storage location of the target data in the storage space of a copy APP; the first device writes the target data into the storage space of a collaborative APP according to the first location information; and the first device sends clipboard data including second location information to a second device, the second location information indicating the storage location of the target data in the storage space of the collaborative APP.
Pictures and formatted text are complex data: when target data containing them is copied, the target data itself cannot be written directly into the clipboard; what is actually written into the clipboard of the first device is the storage address (i.e., the first location information) of the target data in the storage space of the copy APP. After the second device receives the first location information, it requests the target data from the first device based on that information; at this point the collaborative APP cannot transmit the target data to the second device, because the target data is not in the collaborative APP's storage space, and the cross-device copy fails. In this application, the target data is first transferred from the storage space of the copy APP to the storage space of the collaborative APP, generating the second location information; the second location information is then sent to the second device, so that when the second device requests the target data from the collaborative APP of the first device according to the second location information, the collaborative APP can read the target data from its own storage space and send it to the second device, thereby completing cross-device copy and paste of complex data such as pictures and formatted text.
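The transfer-then-send flow can be sketched as follows. This is an illustrative plain-Java model, not the patented Android implementation: the two APPs' storage spaces are modeled as in-memory maps, and the content:// locations are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical model of the first device's send-side flow:
// copy -> first location info -> transfer to collaborative APP -> second location info.
class ClipboardSender {
    private final Map<String, String> copyAppStorage = new HashMap<>();
    private final Map<String, String> collabAppStorage = new HashMap<>();

    // Simulate the copy APP storing the target data and returning
    // its storage address (the first location information).
    public String copy(String targetData) {
        String firstLocation = "content://copy.app/item/1";
        copyAppStorage.put(firstLocation, targetData);
        return firstLocation;
    }

    // Transfer the target data into the collaborative APP's storage space
    // and return the second location information to place in the clipboard data.
    public String prepareClipboardData(String firstLocation) {
        String targetData = copyAppStorage.get(firstLocation);
        String secondLocation = "content://collab.app/item/1";
        collabAppStorage.put(secondLocation, targetData);
        return secondLocation;
    }

    // Serve the second device's request: read the target data from the
    // collaborative APP's own storage space by the second location information.
    public String serveRequest(String secondLocation) {
        return collabAppStorage.get(secondLocation);
    }
}
```

The key point the model captures is that the data served to the second device comes from the collaborative APP's storage space, so the request can always be satisfied.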
Optionally, the first location information includes a uniform resource identifier (URI) of the target data, and the writing, by the first device, of the target data into the storage space of the collaborative APP according to the first location information includes: the first device reads hypertext markup language (html) content from the storage space of the copy APP according to the URI, the html content including the target data; and the first device writes the html content into the storage space of the collaborative APP.
Optionally, before the first device writes the html content into the storage space of the collaborative APP, the method further includes: the first device converts a first picture tag in the html content into a second picture tag, where the first picture tag is a picture tag that the second device cannot identify, the second picture tag is a picture tag that the second device can identify, and the second picture tag is the second location information.
Some paste APPs cannot identify the first picture tag, which may cause the paste to fail. In this embodiment, the first picture tag that the second device cannot identify is converted into a second picture tag that the second device can identify, which solves the problem that a picture cannot be copied successfully during cross-device copy and paste and improves the user experience.
Optionally, the first picture tag is a v:imagedata tag, and the second picture tag is an img tag.
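A minimal sketch of such a tag conversion, assuming the v:imagedata tag carries its picture reference in an src attribute (the sample markup and the regular expression are illustrative, not taken from the patent):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

class PictureTagConverter {
    // Convert Office-style <v:imagedata .../> tags, which some paste
    // targets cannot parse, into standard <img ...> tags, keeping the
    // tag's attributes (including the src attribute) unchanged.
    public static String convert(String html) {
        Pattern p = Pattern.compile("<v:imagedata\\s+([^>]*?)/?>");
        Matcher m = p.matcher(html);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String attrs = m.group(1).trim();
            m.appendReplacement(out, Matcher.quoteReplacement("<img " + attrs + ">"));
        }
        m.appendTail(out);
        return out.toString();
    }
}
```

A production version would use a real html parser rather than a regular expression, but the sketch shows the shape of the rewrite: only the tag name changes, so the picture reference survives.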
Optionally, the target data is the target picture, the first location information includes a first source (src) path of the target picture, and the writing, by the first device, of the target data into the storage space of the collaborative APP according to the first location information includes: the first device generates a URI according to the first src path, the URI indicating the storage location of the target picture in the storage space of the copy APP; the first device reads the target picture from the storage space of the copy APP according to the URI; and the first device writes the target picture into the storage space of the collaborative APP.
What some copy APPs write into the clipboard is not a URI but html content containing an src path, and the content resolver cannot parse an src path. By converting the src path into a URI, the storage address of the target picture can be parsed by the content resolver and the target picture can be obtained through a content provider, which solves the problem that the picture cannot be copied successfully during cross-device copy and paste.
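One way to sketch the src-path-to-URI conversion (the provider authority com.example.copyapp.provider is a hypothetical name, not from the patent; real code would build an android.net.Uri against the copy APP's actual content provider):

```java
class SrcPathToUri {
    // Build a content URI string from a file-style src path, so that a
    // content resolver can address the picture in the copy APP's storage space.
    public static String toUri(String srcPath) {
        // Strip an optional file:// scheme prefix from the src path.
        String path = srcPath.startsWith("file://") ? srcPath.substring(7) : srcPath;
        // Prefix the (hypothetical) provider authority of the copy APP.
        return "content://com.example.copyapp.provider"
                + (path.startsWith("/") ? path : "/" + path);
    }
}
```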
Optionally, the first src path is located in html content, and the method further includes: the first device replaces the first src path in the html content with a second src path, where the second src path indicates the storage location of the target picture in the storage space of the collaborative APP, and the second src path is the second location information.
After the second src path is generated, the first src path in the html content needs to be replaced with the second src path, so that when the second device requests the target picture, the target picture can be read from the storage location indicated by the second src path and sent to the second device.
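The replacement step can be sketched as a plain string substitution over the html content (assuming the src value appears in a double-quoted src attribute; illustrative only):

```java
class SrcPathRewriter {
    // Replace the first (copy-APP) src path in the html content with the
    // second (collaborative-APP) src path before the clipboard data is sent.
    public static String rewrite(String html, String firstSrc, String secondSrc) {
        return html.replace("src=\"" + firstSrc + "\"", "src=\"" + secondSrc + "\"");
    }
}
```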
Optionally, the method further includes: the first device receives a request message including the second location information from the second device, the request message being used for requesting acquisition of the target data; the first device reads the target data from the storage space of the collaborative APP according to the second location information; and the first device sends the target data to the second device.
In a second aspect, an apparatus for sending clipboard data is provided that includes means for performing any of the methods of the first aspect. The device can be a terminal device and also can be a chip in the terminal device. The apparatus may include an input unit and a processing unit.
When the apparatus is a terminal device, the processing unit may be a processor, and the input unit may be a communication interface; the terminal device may further comprise a memory for storing computer program code which, when executed by the processor, causes the terminal device to perform any of the methods of the first aspect.
When the apparatus is a chip in a terminal device, the processing unit may be a processing unit inside the chip, and the input unit may be an output interface, a pin, a circuit, or the like; the chip may also include a memory, which may be a memory within the chip (e.g., registers, cache, etc.) or a memory external to the chip (e.g., read only memory, random access memory, etc.); the memory is adapted to store computer program code which, when executed by the processor, causes the chip to perform any of the methods of the first aspect.
In a third aspect, there is provided a computer readable storage medium having stored thereon computer program code which, when executed by an apparatus for transmitting clipboard data, causes the apparatus to perform any one of the methods of the first aspect.
In a fourth aspect, there is provided a computer program product comprising: computer program code which, when run by an apparatus for sending clipboard data, causes the apparatus to perform any of the methods of the first aspect.
Drawings
FIG. 1 is a schematic diagram of a hardware system suitable for use in the apparatus of the present application;
FIG. 2 is a schematic diagram of a software system suitable for use in the apparatus of the present application;
FIG. 3 is a schematic diagram of a cross-device copy-and-paste scenario suitable for use in the present application;
FIG. 4 is a schematic diagram of a cross-device copy-and-paste flow provided by the present application;
FIG. 5 is a schematic diagram of a cross-device copy-and-paste embodiment provided by the present application;
FIG. 6 is a schematic diagram of another cross-device copy-and-paste embodiment provided herein;
FIG. 7 is a schematic diagram of a single-device copy-and-paste process provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 shows a hardware structure of an apparatus suitable for the present application.
The apparatus 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, an in-vehicle electronic device, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), a projector, and the like, and the embodiment of the present application does not limit the specific type of the apparatus 100.
The apparatus 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The configuration shown in fig. 1 is not intended to specifically limit the apparatus 100. In other embodiments of the present application, the apparatus 100 may include more or fewer components than those shown in FIG. 1, or the apparatus 100 may include a combination of some of the components shown in FIG. 1, or the apparatus 100 may include sub-components of some of the components shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include at least one of the following processing units: an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and a neural Network Processor (NPU). The different processing units may be independent devices or integrated devices.
The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. For example, the processor 110 may include at least one of the following interfaces: an inter-integrated circuit (I2C) interface, an inter-integrated circuit audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and a USB interface.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the apparatus 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of receiving a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 170 and wireless communication module 160 may be coupled through a PCM interface. In some embodiments, the audio module 170 may also transmit the audio signal to the wireless communication module 160 through the PCM interface, so as to implement the function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 and the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194 and camera 193. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of apparatus 100. Processor 110 and display screen 194 communicate via a DSI interface to implement the display functionality of device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal interface and may also be configured as a data signal interface. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, and the sensor module 180. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, or a MIPI interface.
The USB interface 130 is an interface conforming to a USB standard specification, and may be a Mini (Mini) USB interface, a Micro (Micro) USB interface, or a USB Type C (USB Type C) interface, for example. The USB interface 130 may be used to connect a charger to charge the apparatus 100, to transmit data between the apparatus 100 and a peripheral device, and to connect an earphone to play audio through the earphone. The USB interface 130 may also be used to connect other apparatuses 100, such as AR devices.
The connection relationship between the modules shown in fig. 1 is only illustrative and does not limit the connection relationship between the modules of the apparatus 100. Alternatively, the modules of the apparatus 100 may also adopt a combination of the connection manners in the above embodiments.
The charge management module 140 is used to receive power from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the current of the wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive electromagnetic waves through a wireless charging coil of the device 100 (current path shown as dashed line). The charging management module 140 may also supply power to the device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle number, and battery state of health (e.g., leakage, impedance). Alternatively, the power management module 141 may be disposed in the processor 110, or the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the apparatus 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide solutions for wireless communication applied to the apparatus 100, such as at least one of the following: a second generation (2G) mobile communication solution, a third generation (3G) mobile communication solution, a fourth generation (4G) mobile communication solution, and a fifth generation (5G) mobile communication solution. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves through the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and then transmit them to the modem processor for demodulation. The mobile communication module 150 may further amplify the signal modulated by the modem processor, and the amplified signal is converted into electromagnetic waves by the antenna 1 and radiated. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium- or high-frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low-frequency baseband signal to the baseband processor for processing. The low-frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (e.g., the speaker 170A or the receiver 170B) or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and disposed in the same device as the mobile communication module 150 or other functional modules.
Similar to the mobile communication module 150, the wireless communication module 160 may also provide wireless communication solutions applied to the apparatus 100, such as at least one of the following: wireless local area network (WLAN), Bluetooth (BT), Bluetooth low energy (BLE), ultra wide band (UWB), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technologies. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves via the antenna 2 for radiation.
In some embodiments, the antenna 1 of the apparatus 100 is coupled to the mobile communication module 150, and the antenna 2 of the apparatus 100 is coupled to the wireless communication module 160, so that the apparatus 100 can communicate with networks and other electronic devices through wireless communication technologies. The wireless communication technologies may include at least one of the following: global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and IR technologies. The GNSS may include at least one of the following positioning techniques: global positioning system (GPS), global navigation satellite system (GLONASS), BeiDou navigation satellite system (BDS), quasi-zenith satellite system (QZSS), and satellite based augmentation system (SBAS).
The device 100 may implement display functionality through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 may be used to display images or video. The display screen 194 includes a display panel. The display panel may be a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini light-emitting diode (Mini LED), a Micro light-emitting diode (Micro LED), a Micro OLED (Micro OLED), or a quantum dot light-emitting diode (QLED). In some embodiments, the device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The device 100 may implement a camera function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can perform algorithm optimization on the noise, brightness and color of the image, and can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into a standard Red Green Blue (RGB), YUV, or the like format image signal. In some embodiments, device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the apparatus 100 is in frequency bin selection, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The apparatus 100 may support one or more video codecs. In this way, the apparatus 100 can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a processor that draws on the structure of biological neural networks, for example, by processing input information rapidly in a manner modeled on signal transfer between human brain neurons, and it can also learn continuously. The NPU can implement intelligent cognition functions of the apparatus 100, such as image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a Secure Digital (SD) card, to implement the storage capability of the expansion device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. Wherein the storage program area may store an operating system, an application program required for at least one function (e.g., a sound playing function and an image playing function). The storage data area may store data (e.g., audio data and a phonebook) created during use of the device 100. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a nonvolatile memory such as: at least one magnetic disk storage device, a flash memory device, and a universal flash memory (UFS), and the like. The processor 110 performs various processing methods of the apparatus 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The apparatus 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and may also be used to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a horn, is used to convert the audio electrical signal into a sound signal. The device 100 may listen to music or hands-free talk through the speaker 170A.
The receiver 170B, also called an earpiece, is used to convert the electrical audio signal into a sound signal. When the user uses the device 100 to receive a call or voice information, the voice can be received by placing the receiver 170B close to the ear.
The microphone 170C, also referred to as a mouthpiece, is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user may speak close to the microphone 170C to input a voice signal. The apparatus 100 may be provided with at least one microphone 170C. In other embodiments, the apparatus 100 may be provided with two microphones 170C to implement a noise reduction function. In still other embodiments, the apparatus 100 may be provided with three, four, or more microphones 170C to implement functions such as identifying the source of a sound and directional recording. The processor 110 may process the electrical signal output by the microphone 170C. For example, the audio module 170 and the wireless communication module 160 may be coupled via a PCM interface; the microphone 170C converts the ambient sound into an electrical signal (e.g., a PCM signal) and transmits the electrical signal to the processor 110 via the PCM interface, and the processor 110 performs volume analysis and frequency analysis on the electrical signal to determine the volume and frequency of the ambient sound.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, for example, resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material; when a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the apparatus 100 determines the strength of the pressure based on the change in capacitance. When a touch operation is applied to the display screen 194, the device 100 detects the touch operation through the pressure sensor 180A. The apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that are applied to the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation with a touch operation intensity smaller than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
The gyro sensor 180B may be used to determine the motion attitude of the apparatus 100. In some embodiments, the angular velocity of device 100 about three axes (i.e., the x-axis, y-axis, and z-axis) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the device 100, calculates the distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the device 100 by a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used in scenes such as navigation and motion sensing games.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the device 100 calculates altitude from barometric pressure values measured by the barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the apparatus 100 is a flip phone, the apparatus 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Based on the detected open or closed state of the holster or of the flip cover, the device 100 can set features such as automatically unlocking when the flip cover is opened.
Acceleration sensor 180E may detect the magnitude of acceleration of device 100 in various directions, typically the x-axis, y-axis, and z-axis. The magnitude and direction of gravity can be detected when the device 100 is stationary. The acceleration sensor 180E may also be used to identify the attitude of the device 100 as an input parameter for applications such as horizontal and vertical screen switching and pedometers.
The distance sensor 180F is used to measure a distance. The device 100 may measure distance by infrared or laser. In some embodiments, for example in a shooting scene, the device 100 may utilize the distance sensor 180F to range to achieve fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a photodetector, for example, a photodiode. The LED may be an infrared LED. The device 100 emits infrared light outward through the LED. The apparatus 100 uses a photodiode to detect infrared reflected light from nearby objects. When reflected light is detected, the apparatus 100 may determine that an object is present nearby. When no reflected light is detected, the apparatus 100 can determine that there is no object nearby. The device 100 can detect whether the user holds the device 100 close to the ear by using the proximity light sensor 180G, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used for automatic unlocking and automatic screen locking in a holster mode or a pocket mode.
The ambient light sensor 180L is used to sense the ambient light level. Device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L can also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the device 100 is in a pocket to prevent inadvertent contact.
The fingerprint sensor 180H is used to collect a fingerprint. The device 100 can utilize the collected fingerprint characteristics to achieve the functions of unlocking, accessing an application lock, taking a picture, answering an incoming call, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the apparatus 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the apparatus 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the device 100 heats the battery 142 when the temperature is below another threshold to avoid a low temperature causing the device 100 to shut down abnormally. In other embodiments, when the temperature is below a further threshold, the apparatus 100 performs a boost on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also referred to as a touch device. The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a touch-control screen. The touch sensor 180K is used to detect a touch operation applied on or near it. The touch sensor 180K may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the device 100 at a location different from that of the display screen 194.
The bone conduction sensor 180M can acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human body pulse to receive the blood pressure pulsation signal. In some embodiments, bone conduction sensor 180M may also be provided in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone block vibrated by the sound part obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so that the heart rate detection function is realized.
The keys 190 include a power-on key and a volume key. The keys 190 may be mechanical keys or touch keys. The device 100 can receive a key input signal and implement the function related to the key input signal.
The motor 191 may generate vibrations. The motor 191 may be used for incoming call alerting as well as for touch feedback. The motor 191 may generate different vibration feedback effects for touch operations applied to different applications. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (e.g., time reminders, received messages, alarms, and games) may correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light that may be used to indicate a charge state and charge change, or may be used to indicate a message, missed call, and notification.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into the SIM card interface 195 to bring it into contact with the device 100, or removed from the SIM card interface 195 to separate it from the device 100. The apparatus 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the cards may be of the same type or of different types. The SIM card interface 195 is also compatible with external memory cards. The device 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the device 100 employs an embedded SIM (eSIM) card, which can be embedded in the device 100 and cannot be separated from the device 100.
The hardware system of the apparatus 100 is described in detail above, and the software system of the apparatus 100 is described below. The software system may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture, and the software system of the apparatus 100 is exemplarily described in the embodiment of the present application by taking the layered architecture as an example.
As shown in fig. 2, the software system adopting the layered architecture is divided into a plurality of layers, and each layer has a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the software system may be divided into four layers, an application layer, an application framework layer, an Android Runtime (Android Runtime) and system library, and a kernel layer from top to bottom, respectively.
The application layer may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer may include some predefined functions.
For example, the application framework layer includes a window manager, a content provider, a content parser, a view system, a phone manager, a resource manager, and a notification manager.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and so on.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, and phone books.
The content parser is used for acquiring the data provided by the content provider and modifying, adding, deleting or updating the data.
The view system includes visual controls such as controls to display text and controls to display pictures. The view system may be used to build applications. The display interface may be composed of one or more views, for example, a display interface including a short message notification icon, and may include a view displaying text and a view displaying pictures.
The phone manager is used to provide communication functions of the device 100, such as management of call status (on or off).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, and video files.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a brief stay without user interaction. For example, the notification manager is used for download-completion notifications and message reminders. The notification manager may also manage notifications that appear in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications of applications running in the background, as well as notifications that appear on the screen in the form of dialog windows. For example, text information may be prompted in the status bar, a prompt tone may be sounded, the electronic device may vibrate, or an indicator light may flash.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the function interfaces that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., the open graphics library for embedded systems, OpenGL ES), and a 2D graphics engine (e.g., the Skia Graphics Library (SGL)).
The surface manager is used for managing the display subsystem and providing fusion of the 2D layer and the 3D layer for a plurality of application programs.
The media library supports playback and recording of multiple audio formats, playback and recording of multiple video formats, and still image files. The media library may support a variety of audio and video coding formats, such as MPEG4, H.264, MPEG audio layer III (MP3), advanced audio coding (AAC), adaptive multi-rate (AMR), joint photographic experts group (JPG), and portable network graphics (PNG).
The three-dimensional graphics processing library may be used to implement three-dimensional graphics drawing, image rendering, compositing, and layer processing.
The two-dimensional graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer can comprise driving modules such as a display driver, a camera driver, an audio driver and a sensor driver.
The workflow of the software system and the hardware system of the apparatus 100 is exemplarily described below in conjunction with the display interface scenario.
When a user performs a touch operation on the touch sensor 180K, a corresponding hardware interrupt is sent to the kernel layer, and the kernel layer processes the touch operation into an original input event, where the original input event includes information such as touch coordinates and a timestamp of the touch operation. The original input event is stored in the kernel layer, and the application framework layer acquires the original input event from the kernel layer, identifies a control corresponding to the original input event, and notifies an Application (APP) corresponding to the control. For example, the touch operation is a click operation, the APP corresponding to the control is a gallery APP, and after the gallery APP is awakened by the click operation, the display driver of the kernel layer may be called through the API, and the display driver controls the display screen 194 to display the interface of the gallery APP.
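The event flow above can be caricatured in plain Java as a toy model (not Android's real input pipeline): the kernel layer packages touch coordinates and a timestamp into a raw input event, and the framework layer resolves which control, and hence which APP, the event belongs to. The screen region assigned to the gallery icon below is a made-up assumption.

```java
public class InputDispatch {
    // Toy raw input event: the kernel layer packages touch coordinates and a timestamp.
    static class RawInputEvent {
        final int x;
        final int y;
        final long timestampMs;
        RawInputEvent(int x, int y, long timestampMs) {
            this.x = x;
            this.y = y;
            this.timestampMs = timestampMs;
        }
    }

    // Toy framework-layer lookup: map the touch position to the control's owning APP.
    static String resolveApp(RawInputEvent e) {
        // Hypothetical layout: the gallery APP's icon occupies x, y in [0, 100).
        if (e.x >= 0 && e.x < 100 && e.y >= 0 && e.y < 100) {
            return "gallery";
        }
        return "none"; // no control at this position
    }
}
```

In the real system the resolved APP is then woken and calls the display driver through the API, as described above; the toy model only illustrates the event-to-control mapping step.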
The method for sending clipboard data provided by the present application is described below.
Fig. 3 shows a scenario applicable to the present application. The mobile phone 310 has the software and hardware systems shown in fig. 1 and fig. 2, and the mobile phone 310 and a Personal Computer (PC) 320 are in a cooperative working mode, where the mobile phone 310 and the PC 320 may cooperate with each other through a wired connection or a wireless connection; the specific way in which the mobile phone 310 and the PC 320 cooperate is not limited in the embodiments of the present application.
For example, the mobile phone 310 and the PC 320 may establish a communication connection through a USB data line, or may establish a communication connection through a WLAN or bluetooth, and when the mobile phone 310 and the PC 320 perform cooperative work, the cooperative work data may be transmitted through the communication connection.
The user selects a picture and a line of text on the mobile phone 310 and clicks "copy" in the pop-up dialog box; the user can then operate a paste shortcut key (such as the combination of "Ctrl" and "V") on the PC 320 to paste the picture and text on the PC 320. In some cases, the paste result shown in fig. 3 occurs: the picture cannot be pasted to the PC 320, and the format (font, font size, etc.) of the text is not retained.
The reason why the situation shown in fig. 3 occurs is that: the clipboard data includes resource location information of the picture data and the text format data, and after the PC 320 acquires the resource location information, the picture data and the text format data cannot be acquired according to the resource location information.
The present application provides a method for sending clipboard data, which can solve the above problems, and the main flow of the method is shown in fig. 4.
When the user clicks "copy" on the interface of the mobile phone 310, clip data containing text content and a Uniform Resource Identifier (URI) indicating a storage location of the picture data and the text format data is transmitted from the copy APP to the clipboard.
The replication process mainly involves the following four objects: clipboard manager (handleclipboardmanager), clipData (ClipData), clipData items (ClipData. Item), and ClipData description (ClipDescription).
The clipboard manager is a sub-module of the cooperative APP (e.g., the Honor Share APP) for managing the clipboard, e.g., listening to the clipboard for data changes.
There is only one copy of the clipboard data in the clipboard, and when new clipboard data enters the clipboard, the clipboard data previously stored in the clipboard will disappear.
A clip data item is an object related to the copied content; a clip data item may be text, a URI, or intent data. A piece of clip data may contain one or more clip data items. When the copied content is simple data (such as text content or intent data), the simple data may be put directly into the clip data as a clip data item; when the copied content is complex data such as pictures and text formats, the copied content cannot be put directly into the clip data; instead, the resource location information (i.e., a URI) of the complex data is put into the clip data as a clip data item.
The clip data description is the metadata of the clip data and is used to describe the type of each clip data item.
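As a toy illustration of the four objects above, the sketch below models clip data that holds text items and URI items in plain Java. It is not the actual Android ClipData/ClipDescription API, only an analogy that makes the item/description relationship concrete.

```java
import java.util.ArrayList;
import java.util.List;

public class ClipModel {
    // Toy clip data item: holds either plain text or a URI string pointing at complex data.
    static class Item {
        final String text; // non-null for simple text items
        final String uri;  // non-null for complex data (pictures, text formats) referenced by location
        Item(String text, String uri) {
            this.text = text;
            this.uri = uri;
        }
    }

    // Toy clip data: one or more items, plus a description of the item types (the "metadata").
    static class Clip {
        final List<Item> items = new ArrayList<>();
        String description = "";

        void addText(String text) {
            items.add(new Item(text, null));
            description += "text;";
        }

        void addUri(String uri) {
            items.add(new Item(null, uri));
            description += "uri;";
        }
    }
}
```

In the patent's flow, a copy of a picture plus text would populate one text item and one URI item, and the description would record both types.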
After monitoring, through the clipboard manager, that the data of the clipboard has changed, the cooperative APP installed on the mobile phone 310 acquires the clipboard data from the clipboard. Then the cooperative APP sends the URI in the clipboard data to a content parser (ContentResolver), and the content parser parses the storage locations of the picture data and the text format data from the URI. The content parser sends the storage location (the parsing result of the URI) to a content provider (ContentProvider), and the content provider reads the picture data and the text format data (the resources indicated by the URI) from the database of the copy APP based on the storage location and stores them in the database of the cooperative APP (for simplicity, fig. 4 represents the database of the cooperative APP by the clipboard manager).
When the user operates a paste shortcut key on the PC 320, optionally, the cooperative APP of the mobile phone 310 sends text content and address information (indicating storage addresses of the picture data and the text format data in the database of the cooperative APP) to the cooperative APP (PC steward) of the PC 320; subsequently, the cooperative APP of the PC 320 writes the text content and the address information to the clipboard so that the paste APP of the PC 320 obtains these data from the clipboard. After obtaining the address information, the paste APP of the PC 320 sends a request message containing the address information to the cooperative APP of the mobile phone 310 to request for obtaining the picture data and the text format data; the cooperative APP of the mobile phone 310 sends the picture data and the text format data to the paste APP of the PC 320 according to the address information, thereby completing the pasting of the picture and the text format.
As another optional implementation, when the user operates a paste shortcut key on the PC 320, the cooperative APP of the mobile phone 310 may directly send text content, picture data, and text format data to the cooperative APP of the PC 320; subsequently, the cooperative APP of the PC 320 writes the text content, the picture data, and the text format data to the clipboard of the PC 320 to facilitate reading of the paste APP of the PC 320.
Embodiments of the method in different application scenarios are described below.
The first embodiment.

Office software (one example of a copy APP) is installed on the mobile phone 310. The user copies the picture and the characters in the copy APP following the operations shown in fig. 3, and after receiving the user's copy operation, the mobile phone 310 executes the subsequent steps according to the flow shown in fig. 5.

S501, the copy APP writes the text content and a URI (indicating the storage location of the picture data and the text format data) to the clipboard, where the text content does not contain the format information of the characters (such as font information, font size information, and color information).

The data of the copy APP can be shared out through its ContentProvider interface, which is responsible for organizing the data of the copy APP (for example, storing data, reading data, and exposing data in a database) and providing the data to other APPs.

The clipboard obtains the text and the URI from the copy APP by querying the copy APP's ContentProvider interface.
S502, the clipboard manager monitors the clipboard for changes in the data of the clipboard, and obtains the clipboard data (i.e., text and URI) when the clipboard data changes.
For example, the clipboard manager may listen to the clipboard by registering a clip-changed listener with the clipboard service.
After obtaining the clipboard data, the clipboard manager may perform the following steps to obtain the data indicated by the URI.
S503, the clipboard manager calls the ContentResolver interface to obtain the AssetFileDescriptor object corresponding to the URI.
For example, contentResolver first obtains a URI, and then obtains an AssetFileDescriptor object according to the URI and the following method.
ContentResolver resolver = mContext.getContentResolver();
AssetFileDescriptor assetFileDescriptor = resolver.openAssetFileDescriptor(uri, "r");
S504, the AssetFileDescriptor object calls a getFileDescriptor interface to obtain the FileDescriptor object.
S505, the clipboard manager creates FileReader or FileInputStream according to the FileDescriptor object, and reads the content of hypertext markup language (html) through FileReader or FileInputStream.
The html content contains the format information of the text and a picture address, where an example of the picture address is as follows.

<v:imagedata src="/storage/emulated/0/Android/data/cn.wps.moffice_eng/.cache/KingsoftOffice/.temp/28029/6a1e8d7c-473d-488a-b6f2-06fef0dc3b77.png" embosscolor="white" o:title=""></v:imagedata>
The picture address belongs to a v:imagedata tag and cannot be recognized by the paste APP on the PC 320; therefore, the picture address needs to undergo format conversion. In one optional implementation, the PC 320 performs the format conversion on the picture address; in another optional implementation, the mobile phone 310 performs the format conversion on the picture address. The conversion method is the same in both implementations, and the step in which the mobile phone 310 converts the picture address is shown as S506.
S506, the clipboard manager replaces all the v: imagedata tags in the html content with img tags.
An example of the img tag is as follows.

<img src="/storage/emulated/0/Android/data/cn.wps.moffice_eng/.cache/KingsoftOffice/.temp/28029/6a1e8d7c-473d-488a-b6f2-06fef0dc3b77.png" embosscolor="white" o:title=""/>
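The replacement in S506 can be sketched in plain Java with a regular expression. The helper below is an illustrative sketch rather than the clipboard manager's actual implementation, and it assumes the tag layout shown in the examples above.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TagConverter {
    // Replace each <v:imagedata ...></v:imagedata> pair with a self-closing <img .../> tag,
    // keeping the original attributes (src, embosscolor, o:title, ...) unchanged.
    static String convert(String html) {
        Pattern p = Pattern.compile("<v:imagedata(.*?)>\\s*</v:imagedata>", Pattern.DOTALL);
        Matcher m = p.matcher(html);
        return m.replaceAll("<img$1/>");
    }
}
```

An HTML parser would be more robust against nested or malformed markup; the regex form mirrors the simple one-for-one tag replacement that S506 describes.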
After generating the html content containing the img tag, the clipboard manager executes S507 and S508 to transmit the html content containing the img tag.
S507, the clipboard manager sends the html content containing the img tag to the communication module of the mobile phone 310.
S508, the mobile phone 310 sends the html content containing the img tag to the PC 320 through the communication module.
For example, the communication module of the mobile phone 310 packages html content containing img tags to generate a data packet conforming to the WLAN standard; subsequently, the data packet is processed by the physical layer of the mobile phone 310 and transmitted through the antenna in the form of radio waves, where the physical layer may perform processing such as channel coding, interleaving, scrambling, modulation, layer mapping, and precoding on the data packet, and the embodiment of the present application does not limit the specific manner in which the mobile phone 310 transmits the data packet.
S509, the PC 320 receives html contents including the img tag from the handset 310 through the communication module.
For example, the physical layer of the PC 320 performs processing such as channel estimation, demodulation, and decoding on a received wireless signal to generate a data packet; subsequently, the communication module of the PC 320 parses the data packet to obtain html content containing img tags.
S510, a PC steward (in cooperation with an APP) of the PC 320 obtains html content containing img tags from the communication module, and sends the html content to the clipboard.
Since the html content contains img tags, it can be recognized by the paste APP (e.g., office software) on the PC 320. When the user performs a paste operation on the PC 320, the PC 320 can obtain the resource (src) path of the picture data from the img tag; in addition, the PC 320 obtains the src path of the text format data from the html content. Subsequently, the PC 320 sends a request message containing the src paths of the picture data and the text format data to the mobile phone 310, requesting to acquire the picture data and the text format data.
After receiving the request message, the mobile phone 310 finds the positions of the picture data and the text format data in the memory (i.e., the storage positions of the picture data and the text format data in the database of the cooperative APP) according to the src path in the request message, and sends the picture data and the text format data in the memory to the PC 320, thereby completing the pasting of the picture and the text format.
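On the PC 320 side, obtaining the src path from an img tag is a string-matching step. The following plain-Java helper is an illustrative sketch (not the paste APP's actual code) that collects every img src attribute in a piece of html content.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SrcExtractor {
    // Collect the src attribute of every <img ...> tag in the html content.
    static List<String> extractSrc(String html) {
        List<String> paths = new ArrayList<>();
        Matcher m = Pattern.compile("<img[^>]*?src=\"([^\"]*)\"").matcher(html);
        while (m.find()) {
            paths.add(m.group(1));
        }
        return paths;
    }
}
```

The extracted paths would then be placed in the request message sent back to the mobile phone 310, as described above.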
Example two.
Office software is installed on the mobile phone 310
Figure BDA0003156899680000133
(one example of copying APP), user follows the operation pair shown in FIG. 3
Figure BDA0003156899680000134
The picture in (1) is copied, and after receiving the copy operation of the user, the mobile phone 310 executes the subsequent steps according to the flow shown in fig. 5.
S601, the copy APP writes html content into the clipboard, where the html content contains the src path of the picture shown in FIG. 3, and the storage location indicated by the src path is the location of the picture data in the database of the copy APP.
S602, the clipboard manager listens for clipboard data changes, and obtains clipboard data (i.e. html content) when the clipboard data changes.
For example, the clipboard manager may listen to the clipboard by registering a listener that is notified whenever the clipboard data changes.

In the html content written into the clipboard, the storage location of the picture data indicated by the src path is in the database of the copy APP and cannot be directly read by the clipboard manager. Therefore, the clipboard manager can execute the following steps to store the picture data in the database of the cooperative APP, so that the clipboard manager can read the picture data conveniently.
S603, the clipboard manager traverses the src path of the picture in the html content and generates a URI corresponding to the src path.
For example, the picture tag containing the src path in the html content is <img type="image/png" src="content://com.microsoft.office.officemobile.word/data/user/0/com.microsoft.office.officehub/files/temp/clip.png">; the URI generated based on the src path in the picture tag may be content://com.microsoft.office.officemobile.word/data/user/0/com.microsoft.office.officehub/files/temp/clip.png.
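The src path can be pulled out of the img tag with ordinary string processing. Below is a minimal plain-Java sketch of this step; the helper name extractSrc and the regular expression are ours, not part of the patent, and on Android the html content would come from the clipboard rather than a hardcoded string.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SrcExtractor {
    // Matches the src attribute of an <img> tag and captures its value.
    private static final Pattern SRC = Pattern.compile("<img[^>]*\\bsrc=\"([^\"]*)\"");

    // Returns the src path of the first img tag in the html content, or null.
    static String extractSrc(String html) {
        Matcher m = SRC.matcher(html);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String html = "<img type=\"image/png\" src=\"content://com.microsoft.office."
                + "officemobile.word/data/user/0/com.microsoft.office.officehub"
                + "/files/temp/clip.png\"></img>";
        System.out.println(extractSrc(html));
    }
}
```

The extracted string is then used as the input of S603's URI generation.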
S604, the clipboard manager calls the ContentResolver interface to obtain the AssetFileDescriptor object.
For example, the clipboard manager first obtains a ContentResolver, and then obtains the AssetFileDescriptor object according to the URI generated in S603, in the following manner:
ContentResolver resolver = mContext.getContentResolver();
AssetFileDescriptor assetFileDescriptor = resolver.openAssetFileDescriptor(uri, "r");
S605, the clipboard manager calls the getFileDescriptor interface of the AssetFileDescriptor object to obtain the FileDescriptor object.
S606, the clipboard manager creates a FileReader or FileInputStream from the FileDescriptor object, reads the picture content through the FileReader or FileInputStream, and generates a local picture.
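The read-out in S606 can be sketched in plain Java as a byte copy from a FileDescriptor to a local file. The method name saveLocalPicture is ours; on Android the FileDescriptor would come from AssetFileDescriptor.getFileDescriptor() (S605), while here it is taken from an ordinary temporary file so the sketch is self-contained. A FileInputStream (rather than a FileReader) is used because picture data is binary.

```java
import java.io.File;
import java.io.FileDescriptor;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class PictureSaver {
    // Reads all bytes from a stream opened on an existing FileDescriptor and
    // writes them to a local file, producing the "local picture" of S606.
    static void saveLocalPicture(FileDescriptor fd, File dest) throws IOException {
        try (FileInputStream in = new FileInputStream(fd);
             FileOutputStream out = new FileOutputStream(dest)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the picture in the copy APP's storage.
        File src = File.createTempFile("clip", ".png");
        try (FileOutputStream out = new FileOutputStream(src)) {
            out.write(new byte[] {(byte) 0x89, 'P', 'N', 'G'}); // fake picture bytes
        }
        File dest = File.createTempFile("local", ".png");
        try (FileInputStream in = new FileInputStream(src)) {
            saveLocalPicture(in.getFD(), dest);
        }
        System.out.println(dest.length()); // prints 4
    }
}
```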
S607, the clipboard manager replaces all the old src paths in the html content with new src paths (i.e., the storage locations of the picture data in the database of the cooperative APP), and generates updated html content.
For example, the picture label containing the old src path is:
<img type="image/png"
src="content://com.microsoft.office.officemobile.word/data/user/0/com.microsoft.office.officehub/files/temp/clip.png"></img>
the new src path is:
src="/storage/emulated/0/HwClipboard/clip.png";
the picture label containing the new src path is:
<img type="image/png"
src="/storage/emulated/0/HwClipboard/clip.png"></img>
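Step S607 amounts to a textual substitution over the html content. The sketch below (plain Java; the helper name replaceSrc is ours) replaces every src attribute value with the new local path from the example above; a real implementation would map each old path to its own new path.

```java
import java.util.regex.Matcher;

public class SrcRewriter {
    // Replaces the value of every src attribute in the html content with newSrc,
    // yielding the updated html content of S607.
    static String replaceSrc(String html, String newSrc) {
        return html.replaceAll("src=\"[^\"]*\"",
                Matcher.quoteReplacement("src=\"" + newSrc + "\""));
    }

    public static void main(String[] args) {
        String old = "<img type=\"image/png\" src=\"content://com.microsoft.office."
                + "officemobile.word/data/user/0/com.microsoft.office.officehub"
                + "/files/temp/clip.png\"></img>";
        String updated = replaceSrc(old, "/storage/emulated/0/HwClipboard/clip.png");
        System.out.println(updated);
        // <img type="image/png" src="/storage/emulated/0/HwClipboard/clip.png"></img>
    }
}
```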
after generating the updated html content, the clipboard manager executes S608 and S609 to send the updated html content.
S608, the clipboard manager sends the updated html content to the communication module of the mobile phone 310.
S609, the mobile phone 310 sends the updated html content to the PC 320 through the communication module.
For example, the communication module of the mobile phone 310 packages the updated html content to generate a data packet conforming to the WLAN standard; subsequently, the data packet is processed by the physical layer of the mobile phone 310 and transmitted through the antenna in the form of radio waves, where the physical layer may perform processing such as channel coding, interleaving, scrambling, modulation, layer mapping, and precoding on the data packet, and the embodiment of the present application does not limit the specific manner in which the mobile phone 310 transmits the data packet.
S610, the PC 320 receives the updated html content from the handset 310 through the communication module.
For example, the physical layer of the PC 320 performs processing such as channel estimation, demodulation, and decoding on the received wireless signal to generate a data packet; subsequently, the communication module of the PC 320 parses the data packet to obtain updated html content.
S611, the PC manager (a cooperative APP) of the PC 320 obtains the updated html content from the communication module, and sends the html content to the clipboard.
When the user performs a paste operation on the PC 320, the PC 320 acquires a new src path from the updated html content, and sends a request message including the new src path to the mobile phone 310 to request to acquire picture data.
After receiving the request message, the mobile phone 310 finds the location of the picture data in the memory (i.e., the storage location of the picture data in the database of the cooperative APP) according to the new src path in the request message, and sends the picture data in the memory to the PC 320, thereby completing the pasting of the picture.
The first embodiment and the second embodiment are both scenarios of cross-device copy and paste. The present application further provides a method for copying and pasting between different APPs on one device.
As shown in FIG. 7, when the user clicks "copy" on the interface of a copy APP (e.g., a browser) on the mobile phone 310, the clipboard data is transmitted from the copy APP to the clipboard; the clipboard data contains text content and a URI, where the URI indicates the storage locations of the picture data and the text format data.
The copy process mainly involves the following four objects: the clipboard manager (ClipboardManager), the clip data (ClipData), the clip data item (ClipData.Item), and the clip data description (ClipDescription). These four objects are the same as the corresponding objects in FIG. 4 and are not described again here.
After the cooperative APP installed on the mobile phone 310 monitors, through the clipboard manager, that the clipboard data has changed, it acquires the clipboard data from the clipboard. Then, the cooperative APP sends the URI in the clip data to a content resolver (ContentResolver), and the content resolver parses the storage locations of the picture data and the text format data from the URI. The content resolver sends the storage location (the parsing result of the URI) to a content provider (ContentProvider), and the content provider reads the picture data and the text format data (the resources indicated by the URI) from the database of the copy APP based on the storage location and stores them in the database of the cooperative APP (for simplicity, FIG. 7 represents the database of the cooperative APP by the clipboard manager).
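A content:// URI of the kind carried in the clip data identifies both the APP that provides the resource (the authority) and the resource's location inside that APP (the path). The sketch below uses the standard java.net.URI class for illustration; on Android the same components would be obtained from android.net.Uri, and the concrete path is the example path from earlier in the text.

```java
import java.net.URI;

public class ContentUriParts {
    public static void main(String[] args) {
        URI uri = URI.create("content://com.microsoft.office.officemobile.word"
                + "/data/user/0/com.microsoft.office.officehub/files/temp/clip.png");
        // The authority names the APP whose storage space holds the resource.
        System.out.println(uri.getAuthority());
        // The path locates the resource within that APP's storage space.
        System.out.println(uri.getPath());
    }
}
```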
When the user operates the paste shortcut key on the mobile phone 310, optionally, the cooperative APP of the mobile phone 310 sends the text content and address information (indicating the storage addresses of the picture data and the text format data in the database of the cooperative APP) to the paste APP; after obtaining the address information, the paste APP sends a request message containing the address information to the cooperative APP to request the picture data and the text format data; the cooperative APP then sends the picture data and the text format data to the paste APP according to the address information, thereby completing the pasting of the picture and the text format.
As another alternative, when the user operates the paste shortcut key on the mobile phone 310, the cooperative APP of the mobile phone 310 may directly send the text content, the picture data, and the text format data to the paste APP.
The present application also provides a computer program product which, when executed by a processor, implements the method of any of the method embodiments of the present application.
The computer program product may be stored in a memory and eventually transformed into an executable object file that can be executed by a processor via preprocessing, compiling, assembling and linking.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a computer, implements the method of any of the method embodiments of the present application. The computer program may be a high-level language program or an executable object program.
The computer-readable storage medium may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example, but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
It can be clearly understood by those skilled in the art that, for convenience and simplicity of description, the specific working process and the generated technical effect of the apparatus and the device described above may refer to the corresponding process and technical effect in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the disclosed system, apparatus and method may be implemented in other ways. For example, some features of the method embodiments described above may be omitted, or not performed. The above-described device embodiments are merely illustrative, and the division of the unit is only one type of logical function division, and there may be another division manner in actual implementation, and a plurality of units or components may be combined or may be integrated into another system. In addition, the coupling between the units or the coupling between the components may be direct coupling or indirect coupling, and the coupling includes electrical, mechanical, or other forms of connection.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the processes do not mean the execution sequence, and the execution sequence of the processes should be determined by the functions and the inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Additionally, the terms "system" and "network" are often used interchangeably herein. The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter associated objects are in an "or" relationship.
In short, the above description is only a preferred embodiment of the present disclosure, and is not intended to limit the scope of the present disclosure. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A method of transmitting clipboard data, comprising:
the method comprises the steps that a first device receives a copying operation, the copying operation is used for copying target data including a target picture and/or a target text, and the target text is a text with a format;
the first device responds to the copying operation, and acquires first position information of the target data, wherein the first position information is used for indicating the storage position of the target data in a storage space of a copying application program (APP);
the first device writes the target data into a storage space of a cooperative APP according to the first position information;
the first device sends clipboard data including second location information to a second device, the second location information being used for indicating a storage location of the target data in the storage space of the collaborative APP.
2. The method of claim 1, wherein the first location information comprises a Uniform Resource Identifier (URI) of the target data,
the writing, by the first device, the target data into a storage space of the cooperative APP according to the first location information includes:
the first device reads hypertext markup language (html) content from a storage space of the copy APP according to the URI, wherein the html content comprises the target data;
and the first equipment writes the html content into a storage space of the cooperative APP.
3. The method of claim 2, wherein before the first device writes the html content into the storage space of the cooperative APP, the method further comprises:
the first device converts a first picture tag in the html content into a second picture tag, wherein the first picture tag is a picture tag which cannot be identified by the second device, the second picture tag is a picture tag which can be identified by the second device, and the second picture tag is the second position information.
4. The method of claim 3, wherein the first picture tag is a v:imagedata tag and the second picture tag is an img tag.
5. The method of claim 1, wherein the target data is the target picture, wherein the first location information comprises a first resource src path of the target picture,
the writing, by the first device, the target data into a storage space of the cooperative APP according to the first location information includes:
the first device generates a URI according to the first src path, wherein the URI is used for indicating the storage position of the target picture in the storage space of the copied APP;
the first device reads the target picture from the storage space of the copied APP according to the URI;
and the first equipment writes the target picture into a storage space of the cooperative APP.
6. The method of claim 5, wherein the first src path is located in html content, the method further comprising:
the first device replaces the first src path in the html content with a second src path, where the second src path indicates a storage location of the target picture in the storage space of the collaborative APP, and the second src path is the second location information.
7. The method according to any one of claims 1 to 6, further comprising:
the first device receives a request message including the second position information from the second device, wherein the request message is used for requesting to acquire the target data;
the first device reads the target data from the storage space of the cooperative APP according to the second position information;
the first device sends the target data to the second device.
8. An apparatus for transmitting clipboard data, comprising a processor and a memory, the processor and the memory being coupled, the memory for storing a computer program that, when executed by the processor, causes the apparatus to perform the method of any of claims 1 to 7.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to carry out the method of any one of claims 1 to 7.
10. A chip comprising a processor that, when executing instructions, performs the method of any of claims 1 to 7.
CN202110780907.4A 2021-07-09 2021-07-09 Method and device for sending clipboard data Pending CN115599565A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110780907.4A CN115599565A (en) 2021-07-09 2021-07-09 Method and device for sending clipboard data


Publications (1)

Publication Number Publication Date
CN115599565A true CN115599565A (en) 2023-01-13

Family

ID=84840254

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110780907.4A Pending CN115599565A (en) 2021-07-09 2021-07-09 Method and device for sending clipboard data

Country Status (1)

Country Link
CN (1) CN115599565A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116860483A (en) * 2023-07-20 2023-10-10 合芯科技有限公司 Data pruning method and device, computer equipment and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination