CN118057858A - Data processing method, device and system


Info

Publication number: CN118057858A
Application number: CN202211473400.5A
Authority: CN (China)
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 姚思, 陈春林, 李世明
Current Assignee: Huawei Device Co Ltd
Original Assignee: Huawei Device Co Ltd
Prior art keywords: target, interface, equipment, application, data

Landscapes

  • Telephone Function (AREA)

Abstract

Embodiments of the present application provide a data processing method, device and system. The method includes: displaying a first interface; detecting a first selection operation of a user in the first interface and determining target data; detecting a first operation directed at the target data; when the devices corresponding to the first operation include a device other than the source device, acquiring device information of second devices; displaying a third interface according to the device information of the second devices, where the third interface includes first controls corresponding to the second devices; detecting a second selection operation of the user on a target first control among the first controls, obtaining the second device corresponding to the target first control, and using that second device as the target device of the first operation; and transmitting the target data to the target device in a broadcast manner. According to the embodiments of the present application, an electronic device can transmit data to the target device without establishing a connection between the electronic devices, which reduces user operations and improves user experience.

Description

Data processing method, device and system
Technical Field
The present application relates to the field of intelligent terminals, and in particular, to a data processing method, device, and system.
Background
Currently, if a user wants to use an electronic device as a source device and transmit data of the source device to a target electronic device (hereinafter referred to as a target device) located near the source device, a Wi-Fi or Bluetooth connection must first be established between the source device and the target device, and the source device can then send the data to the target device over that connection. However, the process of establishing the connection between the source device and the target device requires complicated user operations, which affects user experience.
Summary of the application
The present application provides a data processing method, device and system, which enable the source device to transmit data to a nearby target device without establishing a connection between the source device and the target device, thereby reducing user operations and improving user experience.
In a first aspect, an embodiment of the present application provides a data processing method applied to a source device. The method includes: displaying a first interface; detecting a first selection operation of a user in the first interface and determining target data; detecting a first operation directed at the target data; when the devices corresponding to the first operation include a device other than the source device, acquiring device information of second devices, where the device information of a second device includes device description information of the second device and distance information between the second device and the source device, and the second devices are peripheral devices whose distance from the source device does not exceed a preset distance threshold; displaying a third interface according to the device information of the second devices, where the third interface includes first controls corresponding to the second devices; detecting a second selection operation of the user on a target first control among the first controls, obtaining the second device corresponding to the target first control, and using that second device as the target device of the first operation; and transmitting the target data to the target device in a broadcast manner. The third interface may be, for example, the device selection interface in the subsequent embodiments; the user can select the target device in this interface, so the user can designate the device that receives the target data, and the transmission of the target data better meets the user's needs. The device description information provides the user with more intuitive characteristic information about a device, which makes it easier for the user to tell apart the devices indicated by the device description information and to select the target device in the device selection interface. Because the source device sends the target data to the target device in a broadcast manner, no connection needs to be established between the source device and the target device before the data transmission, and the user does not need to perform the complicated operations otherwise required to establish such a connection, so user operations are reduced and user experience is improved.
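To make the overall flow of the first aspect easier to follow, the following is a minimal sketch (in Kotlin) of the source-device side; the type names and the ranging, device-selection and broadcast helpers are illustrative placeholders rather than interfaces defined by this application.

```kotlin
// Illustrative sketch only: all types and helpers below are assumptions for readability.
data class SecondDevice(val address: String, val description: String, val distanceMeters: Double)

interface SourceDevicePorts {
    fun rangePeripherals(): List<SecondDevice>                             // result of BLE ranging
    fun showDeviceSelection(candidates: List<SecondDevice>): SecondDevice  // the third interface
    fun broadcast(targetAddress: String, targetData: ByteArray)            // BLE extension message
}

fun handleFirstOperation(
    ports: SourceDevicePorts,
    targetData: ByteArray,
    involvesOtherDevice: Boolean,
    distanceThresholdMeters: Double
) {
    if (!involvesOtherDevice) return                                        // handled locally
    // Second devices: peripherals whose distance does not exceed the preset threshold.
    val secondDevices = ports.rangePeripherals()
        .filter { it.distanceMeters <= distanceThresholdMeters }
    if (secondDevices.isEmpty()) return
    val target = ports.showDeviceSelection(secondDevices)                   // user picks the target device
    ports.broadcast(target.address, targetData)                             // no connection established
}
```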
In one possible implementation, transmitting the target data to the target device in a broadcast manner includes: sending the target data to the target device using a BLE extension message, where the payload of the BLE extension message includes the target data and the destination device of the BLE extension message is the target device. The BLE extension message may be BLE signaling implemented using the BLE 5.0 extended mode, which can carry more data than ordinary BLE signaling. Because the destination device of the BLE extension message is the target device, the target data can be sent to the target device by broadcast without causing devices other than the target device to process the target data by mistake.
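As a non-limiting illustration, the sketch below shows one way the payload of such a BLE extension message could carry the address of the target device together with the target data; the exact frame layout is an assumption, while the 254-byte payload limit is the figure mentioned later in this description.

```kotlin
// Illustrative frame layout only: 6-byte destination address, 1-byte length, then the target data.
const val MAX_EXTENDED_PAYLOAD = 254   // maximum payload stated for BLE 5.0 extended-mode broadcasting

fun buildExtensionPayload(targetAddress: ByteArray, targetData: ByteArray): ByteArray {
    require(targetAddress.size == 6) { "a BLE device address is 6 bytes" }
    val total = targetAddress.size + 1 + targetData.size
    require(total <= MAX_EXTENDED_PAYLOAD) { "target data too large for a single extension message" }
    val payload = ByteArray(total)
    targetAddress.copyInto(payload, 0)        // destination: address of the target device
    payload[6] = targetData.size.toByte()     // length of the carried target data
    targetData.copyInto(payload, 7)
    return payload
}

fun main() {
    val addr = byteArrayOf(0x11, 0x22, 0x33, 0x44, 0x55, 0x66)
    val payload = buildExtensionPayload(addr, "Hello".toByteArray())
    println("payload size = ${payload.size}")  // 12 bytes in this example
}
```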
In one possible implementation, obtaining the device information of the second devices includes: measuring the distance between each peripheral device and the source device using a BLE ranging method to obtain device information of the peripheral devices, where the device information of a peripheral device includes device description information of the peripheral device and distance information between the peripheral device and the source device; and screening the device information of the second devices out of the device information of the peripheral devices using the preset distance threshold.
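The screening step can be pictured as a simple filter over the ranging results, as in the following sketch; the data type used for a peripheral device is assumed for illustration.

```kotlin
// Illustrative only: the data returned by the BLE ranging procedure is an assumption.
data class PeripheralDevice(
    val address: String,          // BLE address of the peripheral device
    val description: String,      // device description information shown to the user
    val distanceMeters: Double    // measured distance to the source device
)

// Screen the second devices out of all peripheral devices using the preset distance threshold.
fun screenSecondDevices(peripherals: List<PeripheralDevice>, thresholdMeters: Double): List<PeripheralDevice> =
    peripherals.filter { it.distanceMeters <= thresholdMeters }

fun main() {
    val peripherals = listOf(
        PeripheralDevice("AA:BB:CC:DD:EE:01", "Tablet in the living room", 3.2),
        PeripheralDevice("AA:BB:CC:DD:EE:02", "PC in the study", 15.8)
    )
    // With a 10 m threshold, only the first device is offered in the device selection interface.
    println(screenSecondDevices(peripherals, 10.0))
}
```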
In one possible implementation, detecting the first operation directed at the target data includes: receiving information about a first gesture sent by a third device, where the first gesture is detected by the third device and the third device is a wearable device; and determining the operation corresponding to the first gesture as the first operation. The third device may be, for example, the wearable device in the subsequent embodiments. In this way, with the cooperation of the third device, the electronic device can detect the user's gesture and then determine the first operation according to the gesture.
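A hypothetical mapping from gestures reported by the wearable (third) device to first operations could look as follows; the specific gestures and operations listed are examples only and are not defined by this application.

```kotlin
// Hypothetical gesture-to-operation mapping; the actual gestures and operations are not limited here.
enum class FirstOperation { COPY, PASTE, CUT, DELETE, COPY_TO_OTHER_DEVICE }

fun operationForGesture(gestureName: String): FirstOperation? = when (gestureName) {
    "pinch"    -> FirstOperation.COPY
    "release"  -> FirstOperation.PASTE
    "swipe_up" -> FirstOperation.COPY_TO_OTHER_DEVICE
    else       -> null   // unknown gesture: ignore
}
```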
In one possible implementation, transmitting the target data to the target device in a broadcast manner includes: the source device sends the target data to the third device; and the third device transmits the target data to the target device in a broadcast manner.
In one possible implementation, the target data is text data, image data, or message data.
In a second aspect, an embodiment of the present application provides a data processing method applied to a target device. The method includes: receiving target data sent in a broadcast manner by a source device, where the source device is configured to perform the data processing method of any one of the first aspect. According to this method, the target device receives the target data sent by the source device in a broadcast manner, so no connection needs to be established between the source device and the target device before the data transmission, and the user does not need to perform the complicated operations otherwise required to establish such a connection; user operations are therefore reduced and user experience is improved.
In one possible implementation, the target data is text data or image data, and the method further includes: additionally displaying the target data in a second interface displayed by the target device.
In one possible implementation, additionally displaying the target data in the second interface includes: detecting a position indication operation of the user in the second interface; determining a target position of the target data in the second interface according to the position indication operation; and additionally displaying the target data at the target position.
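For illustration, a text-only sketch of additionally displaying the target data at the indicated position is given below, where a character offset stands in for the on-screen position indicated by the user; this representation is an assumption.

```kotlin
// Sketch only: a character offset is assumed to represent the position indicated by the user.
fun insertAtIndicatedPosition(documentText: String, offset: Int, targetData: String): String {
    val pos = offset.coerceIn(0, documentText.length)
    return documentText.substring(0, pos) + targetData + documentText.substring(pos)
}

fun main() {
    println(insertAtIndicatedPosition("Hello world", 5, ", dear"))  // prints "Hello, dear world"
}
```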
In one possible implementation, the target data is message data, and displaying the target data includes: when the interface displayed by the target device is the message main interface, additionally displaying the message data on the message main interface; and when the interface displayed by the target device is not the message main interface, displaying prompt information for the message.
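A sketch of this behaviour on the target device is shown below; how the target device determines whether the message main interface is currently displayed is assumed to be available as a simple flag.

```kotlin
// Sketch of the target-device behaviour for received message data.
fun handleReceivedMessage(messageData: String, isMessageMainInterfaceShown: Boolean) {
    if (isMessageMainInterfaceShown) {
        // The message main interface is displayed: additionally display the message on it.
        println("Appending to message main interface: $messageData")
    } else {
        // Another interface is displayed: only show prompt information for the message.
        println("Showing prompt: new message received")
    }
}
```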
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions, which when executed by the processor, cause the electronic device to perform the method of any of the first or second aspects.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored therein, which when run on a computer causes the computer to perform the method of any one of the first aspect or any one of the second aspect.
In a fifth aspect, embodiments of the present application provide a chip system comprising a processor and a data interface, the processor reading instructions stored on a memory via the data interface to perform the method of any one of the first aspect or the second aspect.
In a sixth aspect, an embodiment of the present application provides a data processing system, the data processing system comprising a source device according to any one of the first aspects and a target device according to any one of the second aspects.
In one possible implementation, the system may further include a third device of any of the above. The third device may be a wearable device.
In a seventh aspect, the present application provides a computer program for performing the method of any one of the first aspect or any one of the second aspect when the computer program is executed by a computer.
In one possible design, the program in the seventh aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the following briefly describes the drawings needed for the embodiments. Obviously, the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may derive other drawings from them without creative effort.
FIG. 1A is a schematic diagram of an electronic device according to an embodiment of the present application;
FIG. 1B is a schematic diagram of a software architecture of an electronic device according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of a data processing method according to an embodiment of the present application;
FIG. 3 is an interface diagram of a data processing method according to an embodiment of the present application;
FIG. 4 is a flowchart of a data processing method according to an embodiment of the present application, based on the interface diagram shown in FIG. 3;
FIG. 5 is another schematic diagram of a data processing method according to an embodiment of the present application;
FIG. 6 is another interface diagram of a data processing method according to an embodiment of the present application;
FIG. 7 is a flowchart of a data processing method according to an embodiment of the present application, based on the interface diagram shown in FIG. 6;
FIG. 8 is a further interface diagram of a data processing method according to an embodiment of the present application;
FIG. 9 is a flowchart of a ranging method between a source device and a peripheral device according to an embodiment of the present application.
Detailed Description
The terminology used in the description of the embodiments of the application herein is for the purpose of describing particular embodiments of the application only and is not intended to be limiting of the application.
In one embodiment, to establish a connection between a source device and a target device, a user may enable the network on the source device and on the target device, and then log in to the same user account on each of them. After the user account is logged in successfully, the source device and the target device connect to a server that supports the user-account login through the network. Because the two devices connect to the server with the same user account, the server can forward data between the source device and the target device based on that account, so a connection between the source device and the target device can be considered to be established through the server. After the user has successfully logged in to the same user account on the source device and the target device, cross-device copy and paste can be performed between them. Specifically, the user first performs a copy operation on content on the source device, and the copy operation triggers the source device to send the content to the server; the server then pushes the content indiscriminately to all other devices logged in with the same user account as the source device, including the target device; when the user performs a paste operation on the target device, the paste operation triggers the target device to paste the content previously pushed by the server to the specified location.
In this data processing method, both devices need to communicate with the server through the network, and the user needs to log in to the same user account on both devices, so the user operations are complicated and user experience is affected.
Therefore, the present application provides a data processing method, device and system, which enable the source device to transmit data to the target device without establishing a connection between them, for example without the source device and the target device communicating with a server through a network or logging in to the same user account, thereby reducing user operations and improving user experience.
The electronic device to which the data processing method of the embodiment of the present application is applicable may be, for example: a mobile phone, a tablet personal computer (PAD), a Personal Computer (PC), a large screen, etc.
Fig. 1A shows a schematic configuration of an electronic device 100. As shown in fig. 1A, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus, including a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may include multiple sets of I2C buses. The processor 110 may be separately coupled to the touch sensor 180K, a charger, a flash, the camera 193, etc. through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as the display 194 and the camera 193. The MIPI interfaces include a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing function of the electronic device 100. The processor 110 and the display 194 communicate through a DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (wireless local area networks, WLAN) (such as a wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (Bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (infrared, IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with a network and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a BeiDou navigation satellite system (BeiDou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diode, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, and the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that the electronic device 100 can play or record videos in multiple encoding formats, for example: moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 100 is used to answer a call or listen to a voice message, the receiver 170B can be placed close to the human ear to receive the voice.
The microphone 170C, also referred to as a "mike" or a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the touch position based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing a short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used for navigating, somatosensory game scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flip opening according to the detected opening or closing state of the leather case or of the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the attitude of the electronic device, and is applied to applications such as switching between landscape and portrait modes and pedometers.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by temperature sensor 180J exceeds a threshold, electronic device 100 performs a reduction in the performance of a processor located in the vicinity of temperature sensor 180J in order to reduce power consumption to implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid the low temperature causing the electronic device 100 to be abnormally shut down. In other embodiments, when the temperature is below a further threshold, the electronic device 100 performs boosting of the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperatures.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The bone conduction sensor 180M may acquire vibration signals. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also be in contact with the human pulse to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, combined into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part obtained by the bone conduction sensor 180M, to implement a voice function. The application processor may parse heart rate information based on the blood pressure beat signal acquired by the bone conduction sensor 180M, to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to make contact with or be separated from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time; the types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
The embodiment shown in FIG. 1B takes a layered architecture as an example to illustrate the software structure of the electronic device 100.
In the layered architecture, the software is divided into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces. The software structure provided by the embodiment of the present application includes an application layer and a kernel layer.
The application layer may include several applications. As shown in FIG. 1B, the application layer may include applications such as a first application, a device management application, a Bluetooth low energy (Bluetooth Low Energy, BLE) ranging application, and the like. Optionally, applications such as calls, maps, navigation, WLAN, Bluetooth, music, video, short messages, etc. may also be included, which is not limited by the embodiments of the present application. The first application in the embodiments of the present application may be an application having a text editing function, an application having a messaging function (e.g., a short message application, an instant messaging application, etc.), an application having an image management function (e.g., an album application, etc.), or the like.
The kernel layer is a layer between hardware and software. The kernel layer may include a display driver and a Bluetooth driver.
It should be noted that the software layers shown in FIG. 1B are only examples and mainly illustrate the two software layers involved in the embodiments of the present application. In other embodiments provided by the present application, the electronic device may include more or fewer layers than the software structure shown in FIG. 1B, which is not limited by the embodiments of the present application. For example, taking a five-layer Android system as an example, an application framework layer, a system library and Android runtime layer, and a hardware abstraction layer (hardware abstraction layer, HAL) may also be included between the application layer and the kernel layer.
Hereinafter, the data processing method according to the embodiment of the present application will be exemplarily described with reference to the software and hardware structures of the electronic device.
The data processing method in the embodiments of the present application involves a source device and a target device, and each of the source device and the target device may be the electronic device described above. The data processing method provided by the embodiments of the present application can transmit target data in the source device to the target device without a connection being established between the source device and the target device.
The target data refers to data to be transmitted, the target data may be specified by a user, and the target data may include, but is not limited to, text, images (including pictures and videos), messages (including short messages and instant messaging messages), and the like.
FIG. 2 is a schematic flowchart of a data processing method according to an embodiment of the present application. As shown in FIG. 2, the method may include:
Step 201: a distance threshold is preset in the source device.
This step may be performed by the first application, or the device management application, or the BLE ranging application, and embodiments of the present application are not limited.
Step 202: the source device determines target data.
The target data may include, but is not limited to: text data, image data, message data, or the like.
Text data may include, but is not limited to: selected text in text editing, text files, etc.
The image data may include: pictures, videos, etc.
The message data may include: short messages, instant messaging messages, etc.
Step 203: the source device determines a first operation for the target data.
The first operation may include, but is not limited to: copy, paste, cut, delete, copy to other devices, etc.
Step 204: the source device determines, according to the first operation for the target data, whether the target data needs to be transmitted to a device other than the source device, and if so, performs step 205.
In this step, if it is determined that the target device of the first operation is the source device, the first operation may be performed according to the prior art, which is not limited by the embodiment of the present application.
The source device may record in advance, for each operation, whether the target devices of that operation include a device other than the source device. In this step, the source device may look up in the record whether the target devices of the first operation include a device other than the source device: if they do, the source device determines that the target data needs to be sent to a device other than the source device; if they do not, the source device determines that the target data does not need to be sent to a device other than the source device.
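Such a record could be as simple as a lookup table from operations to a flag indicating whether other devices are involved, as in the following illustrative sketch; the operation names are examples only.

```kotlin
// Illustrative record, maintained in advance by the source device, of whether the target
// devices of each operation include a device other than the source device.
val operationInvolvesOtherDevice = mapOf(
    "copy" to false,
    "paste" to false,
    "cut" to false,
    "delete" to false,
    "copy to other devices" to true
)

fun needsTransmission(firstOperation: String): Boolean =
    operationInvolvesOtherDevice[firstOperation] ?: false
```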
Step 205: the source device determines a target device for the first operation from among the peripheral devices.
The source device may acquire device information of the peripheral device in a BLE ranging manner, where the device information of the peripheral device may include an address of the peripheral device, device description information of the peripheral device, distance information between the peripheral device and the source device, and the like, and determine the target device for the first operation according to the device information of the peripheral device.
The device description information of the peripheral device is used to describe the peripheral device.
In this step, for the implementation of obtaining, by the source device, the device information of the peripheral devices through BLE ranging, reference may be made to the embodiment shown in FIG. 9, and details are not described here.
Step 206: the source device sends the target data to the target device in a broadcast manner.
Optionally, the source device may transmit the target data through BLE extension signaling, where the destination address of the BLE extension signaling is the address of the target device, and the source device may obtain the address of the target device from the device information of the target device. In this way, when the target device receives the BLE extension signaling, it can determine, according to the destination address, that the signaling is sent to itself, and obtain the target data from the signaling. A device other than the target device that receives the BLE extension signaling can determine, according to the destination address, that the signaling is not sent to it, and can discard the signaling, which prevents devices other than the target device from processing the signaling by mistake.
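On the receiving side, the destination-address check described above could look like the following sketch, which assumes the same illustrative payload layout as the sender-side sketch given earlier (6-byte destination address, 1-byte length, then the target data).

```kotlin
// Sketch of how a device receiving the BLE extension signaling could decide whether to
// process or discard it; the payload layout is an assumption.
fun handleExtensionSignaling(payload: ByteArray, ownAddress: ByteArray): ByteArray? {
    if (payload.size < 7) return null                  // malformed: ignore
    val destination = payload.copyOfRange(0, 6)
    if (!destination.contentEquals(ownAddress)) {
        return null                                    // not addressed to this device: discard
    }
    val length = payload[6].toInt() and 0xFF
    if (payload.size < 7 + length) return null         // truncated: ignore
    return payload.copyOfRange(7, 7 + length)          // extract the target data
}
```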
BLE 5.0 adds an extended mode that transmits additional data on data channels, so that the maximum payload of BLE signaling supported for broadcasting is 254 bytes. In the embodiments of the present application, BLE signaling in the extended mode may be used as the BLE extension signaling to transmit the target data.
In this step, the source device sends the target data to the target device in a broadcast manner, so that data transmission can be achieved without establishing a connection between the source device and the target device in advance.
Step 207: the target device receives the target data.
Optionally, after receiving the target data, the target device may display the target data based on a trigger of the user, or may directly display the target data without triggering by the user.
According to the method, device information of the peripheral devices is acquired in a BLE ranging manner, the target device of the first operation for the target data is determined according to the device information, and the target data is sent to the target device in a broadcast manner. Data transmission from the source device to the target device can therefore be achieved without networking or establishing a connection between the source device and the target device, the manual steps the user would otherwise perform to establish that connection are omitted, the user's operations become simpler and fewer, and user experience is improved.
The implementation of the data processing method according to the embodiment of the present application is exemplarily described below.
FIG. 3 is a schematic diagram of an interface implementation of one embodiment of a data processing method according to an embodiment of the present application. In fig. 3, taking as an example that the first application is an application supporting text editing of a file and the target data is a target text, copying text of a first file in the source device to a second file in the target device can be implemented. As shown in fig. 3, the process includes:
The source device is provided with a first application, and the first application provides a distance setting interface for the user, for example, as shown by interface 310. The user can set a distance threshold between the target device and the source device in the distance setting interface; for example, the distance threshold set by the user in interface 310 is 10 meters. The user clicks the "confirm" control provided in interface 310 to complete the setting of the distance threshold. Accordingly, the first application detects the distance threshold setting operation of the user and can obtain the distance threshold set by the user according to this operation.
BLE ranging has a maximum measurement distance; devices beyond this distance cannot be detected by BLE ranging. Therefore, if the distance threshold set by the user exceeds the maximum measurement distance of BLE ranging, the first application may prompt the user to reduce the set value of the distance threshold. For example, assuming that the maximum measurement distance set in the source device is 100 m, if the distance threshold set by the user is 101 m, the first application may prompt the user, for example by a popup window, to reduce the set value of the distance threshold to within 100 m.
It should be noted that, after the first application obtains the distance threshold set by the user, the distance threshold may be stored, and the embodiment of the present application is not limited by the specific storage manner.
It should be noted that, when the first application is installed on the source device, an initial distance threshold may be set in the first application. The user may open the distance setting interface at any time while using the first application to modify the distance threshold, and the distance threshold stored in the first application is updated based on the user's modification.
In other embodiments provided by the present application, the distance setting interface may be provided by a device management application or a BLE ranging application, and the distance threshold may be stored by the device management application or the BLE ranging application, respectively.
The setting and storing of the distance threshold described above is an optional step and may be regarded as a preparation step of the data processing method of the embodiment of the present application.
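As an illustrative sketch of this optional preparation step, the threshold could be stored and validated against the maximum measurement distance as follows; the 10-meter default and 100-meter maximum follow the examples above, and the class and method names are hypothetical.

```python
# Illustrative sketch of the optional preparation step: storing a distance
# threshold and rejecting values beyond the maximum BLE ranging distance.

class DistanceThresholdSetting:
    def __init__(self, default_m: float = 10.0, max_measurable_m: float = 100.0):
        self.max_measurable_m = max_measurable_m
        self.threshold_m = default_m          # initial threshold set at install time

    def set_threshold(self, value_m: float) -> str:
        # If the requested threshold exceeds the maximum measurable distance,
        # prompt the user to reduce it instead of storing the value.
        if value_m > self.max_measurable_m:
            return (f"Threshold {value_m} m exceeds the maximum ranging distance; "
                    f"please set a value within {self.max_measurable_m} m.")
        self.threshold_m = value_m            # persist the user's setting
        return f"Threshold updated to {value_m} m."

if __name__ == "__main__":
    setting = DistanceThresholdSetting()
    print(setting.set_threshold(101))   # prompts the user to narrow the value
    print(setting.set_threshold(10))    # accepted and stored
```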
The user opens a text editing interface of the first file in the first application of the source device, for example, as shown by interface 320, and selects a target text to be edited in the text editing interface. The specific way in which the user selects the target text is not limited; for example, the user may long-press a word with a finger, or long-press a word and then drag the front and rear boundaries of the selection box to adjust the selected text range. Accordingly, the first application of the source device detects the text selection operation of the user and determines the target text selected by the user based on the text selection operation; for example, the target text selected by the user in interface 320 is "this is a test text segment". The target text selected by the user may be part or all of the text of the first file, which is not limited by the embodiment of the present application.
After the first application of the source device obtains the target text selected by the user, the user selects an operation selection control provided in the text editing interface, for example, the "operation" control shown in interface 320. Accordingly, the first application of the source device detects the request operation of the user and provides an operation selection interface for the user, for example, as shown by interface 330. The operation selection interface may list a plurality of operation controls for the user, where each operation control corresponds to an operation executable on the target text; for example, a control corresponding to the operation "copy to other device" is provided in interface 330. The user selects an operation control, for example, the "copy to other device" control; accordingly, the first application of the source device detects the control selection operation of the user and obtains the first operation for the target text according to the control selection operation.
It should be noted that the above implementation takes as an example that the user selects the operation selection control provided in the text editing interface to trigger the first application to display the operation selection interface. In another embodiment provided by the present application, the first application of the source device may be triggered to display the operation selection interface by the user performing a specified gesture operation in the text editing interface, for example, long-pressing the selected target text. In yet another embodiment provided by the present application, the first application of the source device may also automatically display the operation selection interface in response to the text selection operation; for example, after the user performs the text selection operation in a text editing interface such as interface 320, the first application of the source device displays an operation selection interface such as interface 330.
After the first application of the source device obtains the first operation for the target text, it determines that the first operation involves other devices, that is, the target device of the first operation includes a device other than the source device. The first application of the source device then obtains the device information of the peripheral devices whose distance from the source device does not exceed the distance threshold, and accordingly provides a device selection interface, for example interface 340, for the user. The device information of a peripheral device may include, but is not limited to: device description information of the peripheral device, and distance information between the peripheral device and the source device. The device selection interface includes a plurality of device controls, each corresponding to one peripheral device; the device description information of a peripheral device and the distance information between the peripheral device and the source device can serve as the description information of its device control, so that the user can match the device controls to the physical devices and correctly select the target device.
The device description information is used for describing characteristics of the peripheral device, and may include, for example, a MAC address of the device, and/or a device model number, and/or a device manufacturer, and/or a device name set by a device user for the device. The device description information is used for providing more visual characteristic information of the device for the user, so that the user can conveniently distinguish the device indicated by the device description information, and further, the user can conveniently select the target device from the device selection interface.
The user selects the device control corresponding to the target device in the device selection interface. Accordingly, the first application of the source device detects the target device selection operation of the user and obtains the target device according to this operation; for example, in interface 340 the user selects device 3 as the target device.
The first application of the source device sends the target text to the first application of the target device by broadcasting. In one embodiment, the first application of the source device may send the target text to the first application of the target device through a BLE extension message, and then the first application of the source device may send the target text to the bluetooth driver, and the bluetooth driver controls the bluetooth module to broadcast the BLE extension message.
The first application of the target device displays a text editing interface for the second file, for example, as shown by interface 350. The user selects a target position for the target text in the text editing interface, for example by tapping a position in the text editing interface with a finger. Accordingly, the first application of the target device detects the position indication operation of the user, obtains the target position according to the position indication operation, and additionally displays the target text at the target position of the text editing interface, for example, as shown by interface 360.
Optionally, the user may open the text editing interface for the second file in the first application of the target device before performing the operation on the source device, so that after the user performs the operation on the source device to send the target text to the first application of the target device, the user may directly specify the setting position of the target text in the text editing interface provided by the first application of the target device, thereby enabling the user to have better user operation experience.
In the foregoing description, the first application of the source device obtains the device information of the peripheral device whose distance from the source device does not exceed the distance threshold, thereby providing the user with a device selection interface, such as that shown by interface 340, and below, a possible implementation of the first application of the source device obtaining the device information of the peripheral device whose distance from the source device does not exceed the distance threshold is illustrated.
In one embodiment, the first application of the source device may request the BLE ranging application to measure the distance between the peripheral devices of the source device and the source device; the BLE ranging application obtains the device information of the peripheral devices in a BLE ranging manner and feeds it back to the first application, and the first application screens, from the device information of the peripheral devices, the device information of the peripheral devices whose distance from the source device does not exceed the distance threshold. For the implementation of the BLE ranging application measuring the distance between a peripheral device of the source device and the source device, refer to the embodiment shown in fig. 8, which is not described here.
In another embodiment, if the first application of the source device does not have the right to access the BLE ranging application, the BLE ranging application may be accessed by other applications instead, for example, the first application of the source device may request device information of the peripheral device from the device management application, and then the device management application requests the BLE ranging application to measure a distance between the peripheral device of the source device and the source device, and then the device management application forwards the device information of the peripheral device detected by the BLE ranging application to the first application of the source device.
In yet another embodiment, the first application of the source device may carry the distance threshold when requesting the device information of the peripheral device from the device management application or the BLE ranging application, so that the device management application or the BLE ranging application may filter the device information of the peripheral device whose distance from the source device does not exceed the distance threshold, and finally feed back the device information to the first application of the source device.
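The three variants above (direct access, access via the device management application, and forwarding the distance threshold so the queried application filters) can be summarized in a schematic sketch; the service objects and their get_devices() method are stand-ins rather than real platform interfaces.

```python
# Schematic sketch of the access paths described above; everything here is a
# stand-in for illustration, not a real device management or BLE ranging API.

class RangingService:
    """Stand-in for the BLE ranging application / device management application."""
    def __init__(self, devices):
        self._devices = devices
    def get_devices(self, max_distance_m=None):
        if max_distance_m is None:
            return list(self._devices)
        return [d for d in self._devices if d["distance_m"] <= max_distance_m]

def query_nearby_devices(ble_ranging, device_mgmt, has_ble_permission,
                         threshold_m, forward_threshold):
    # choose direct access or access via the device management application
    service = ble_ranging if has_ble_permission else device_mgmt
    if forward_threshold:
        # the queried application filters by the carried threshold
        return service.get_devices(max_distance_m=threshold_m)
    # otherwise the first application filters the full list itself
    return [d for d in service.get_devices() if d["distance_m"] <= threshold_m]

if __name__ == "__main__":
    peers = [{"name": "device 3", "distance_m": 4.0},
             {"name": "device 7", "distance_m": 25.0}]
    svc = RangingService(peers)
    print(query_nearby_devices(svc, svc, has_ble_permission=False,
                               threshold_m=10.0, forward_threshold=False))
```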
The process realizes that the target text of the first file of the source equipment is copied to the second file of the target equipment, the operation of a user is simple and convenient, and networking or connection establishment between the source equipment and the target equipment is not required in the process.
FIG. 4 is a flowchart of a data processing method according to an embodiment of the present application, which is provided based on the interface schematic shown in FIG. 3, and as shown in FIG. 4, the method may include:
step 401: a distance threshold is preset.
This step may be performed by the first application of the source device, or the device management application, or the BLE ranging application, and embodiments of the present application are not limited.
In the embodiment of the present application, a preset distance threshold value is taken as an example in a first application of a source device. A default distance threshold may be preset in the first application of the source device, and then, during the use of the user, a specific value of the distance threshold may be adjusted at any time through a distance setting interface, such as the distance setting interface 310 described above.
Step 402: the first application of the source device displays a text editing interface of the first file, and determines a target text according to text selection operation of a user in the text editing interface.
A text editing interface, such as interface 320, is shown.
The target text may be some or all of the text displayed in the text editing interface.
Step 403: the first application of the source device displays an operation selection interface, and determines a first operation for the target text according to the selection operation of the operation control in the operation selection interface.
In one embodiment, the first application display operation selection interface of the source device may be triggered by a user through a preset operation, where the preset operation may be that the user selects a specific control in the text editing interface, such as the "operation" control described above, or may be that the user performs a specific gesture operation in the text editing interface, such as long pressing a target text selected by the user, and so on.
In another embodiment, the first application display operation selection interface of the source device may be triggered by the text selection operation in step 402, i.e. the first application of the source device determines the target text both according to the text selection operation of the user in the text editing interface and displays the operation selection interface according to the text selection operation.
An operation selection interface is shown, for example, as interface 330. The operation selection interface may display operation controls for operations that the user may perform on the target text, and the specific implementation may refer to the corresponding description in fig. 3.
Step 404: the first application of the source device determines that the target device of the first operation includes a device other than the source device, and obtains device information of a peripheral device whose distance from the source device does not exceed a distance threshold.
Alternatively, in the first application of the source device, whether the target device of each operation includes a device other than the source device may be preset; for example, the target device of an operation such as deletion is set to include no device other than the source device, and the target device of an operation such as "copy to other device" or "copy and simultaneously copy to other device" is set to include a device other than the source device. The manner of recording, in the first application of the source device, whether the target device of each operation includes a device other than the source device is not limited in the embodiment of the present application; for example, a parameter may be set for each operation, where a value of "yes" indicates that the target device includes a device other than the source device, and a value of "no" indicates that it does not. In this step, the first application of the source device may determine whether the target device of the first operation includes a device other than the source device by reading the parameter value of the first operation.
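A possible way to record this per-operation parameter is sketched below; the operation names follow the examples above, and the mapping itself is purely illustrative.

```python
# Illustrative per-operation record of whether the target device includes a
# device other than the source device.

INVOLVES_OTHER_DEVICE = {
    "delete": False,                                       # handled entirely on the source device
    "copy": False,
    "copy to other device": True,                          # requires sending data out
    "copy and simultaneously copy to other device": True,
}

def needs_remote_target(operation: str) -> bool:
    # default to False for operations that stay on the source device
    return INVOLVES_OTHER_DEVICE.get(operation, False)

if __name__ == "__main__":
    print(needs_remote_target("copy to other device"))  # True -> continue to device selection
    print(needs_remote_target("delete"))                 # False -> handle locally
```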
It should be noted that, if the first application of the source device determines that the target device of the first operation does not include a device other than the source device, the first operation may be implemented by using an existing related method, which is not described in detail in the embodiments of the present application.
In one possible implementation manner, a distance threshold is preset in the first application of the source device, and the first application of the source device screens device information of the peripheral device according to the distance threshold, where this step may include:
the first application of the source device requests device information of the peripheral device from the BLE ranging application;
The BLE ranging application measures the distances between the devices located around the source device and the source device in a BLE ranging manner to obtain the device information of the peripheral devices, and sends the device information of the peripheral devices to the first application of the source device;
The first application of the source device screens, from the device information of the peripheral devices according to the distance threshold, the device information of the peripheral devices whose distance from the source device does not exceed the distance threshold.
Alternatively, the device information of the peripheral devices may be recorded in the form of a list, for example referred to as a first device list, where the first device list may include fields such as the device address, device description information, and distance. In this step, the records whose distance field value is not greater than the distance threshold can be screened out by comparing the distance field value in the first device list with the distance threshold, so as to obtain a second device list.
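A minimal sketch of screening the first device list into the second device list by the distance threshold is shown below; the record layout (address, description, distance fields) is an assumption for illustration.

```python
# Sketch of screening the first device list into the second device list by
# the distance threshold, using the fields named above.

first_device_list = [
    {"address": "AA:BB:CC:DD:EE:01", "description": "device 1", "distance_m": 3.2},
    {"address": "AA:BB:CC:DD:EE:02", "description": "device 2", "distance_m": 18.5},
    {"address": "AA:BB:CC:DD:EE:03", "description": "device 3", "distance_m": 7.9},
]

def screen_by_threshold(device_list, threshold_m):
    # keep only records whose distance field does not exceed the threshold
    return [rec for rec in device_list if rec["distance_m"] <= threshold_m]

second_device_list = screen_by_threshold(first_device_list, threshold_m=10.0)
print(second_device_list)   # device 1 and device 3 remain
```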
In another implementation, the first application of the source device may also send the distance threshold to a BLE ranging application, which screens out device information of peripheral devices that are not more than the distance threshold from the source device according to the distance threshold.
In yet another implementation, the first application of the source device may request device information of the peripheral device from the BLE ranging application through the device management application if it does not have permission to access the BLE ranging application. In this implementation manner, the first application of the source device may screen the device information of the peripheral device according to the distance threshold, or may send the distance threshold to the device management application or the BLE ranging application, and the device management application or the BLE ranging application screens the device information of the peripheral device according to the distance threshold. For example, if the device information of the peripheral device is filtered by the device management application according to the distance threshold, this step may include:
The first application of the source device sends a first request message to the device management application of the source device, where the first request message includes the distance threshold and is used to request the device information of the peripheral devices whose distance from the source device does not exceed the distance threshold;
The device management application sends a second request message to the BLE ranging application, where the second request message is used to request the device information of the peripheral devices of the source device;
The BLE ranging application measures the distances between the devices located around the source device and the source device in a BLE ranging manner to obtain the device information of the peripheral devices, and sends the device information of the peripheral devices to the device management application;
The device management application screens, from the device information of the peripheral devices according to the distance threshold, the device information of the peripheral devices whose distance from the source device does not exceed the distance threshold, and sends it to the first application of the source device.
It should be noted that, if the distance threshold is preset in the device management application or the BLE ranging application, the implementation manner may be adaptively changed, so that the first application of the source device obtains the device information of the peripheral device having a distance from the source device not exceeding the distance threshold.
Step 405: the first application of the source device selects an interface according to the device information display device.
A device selection interface is shown, for example, as interface 340. The device selection interface includes a device control corresponding to a peripheral device having a distance from the source device that does not exceed a distance threshold.
Step 406: the first application of the source device detects a selection operation of a user for a device control in a device selection interface, and determines the target device according to the selection operation.
Step 407: the first application of the source device sends the target text to the first application of the target device by broadcasting.
Optionally, the source device may transmit the target text through BLE extension signaling, where the payload of the BLE extension signaling includes the target text and the destination address is the address of the target device. The target device receives the BLE extension signaling, determines according to the destination address that the signaling is addressed to it, and obtains the target text from the BLE extension signaling.
Specifically, the first application of the source device may send the target text and the information of the target device to the bluetooth driver of the source device; the bluetooth driver generates the BLE extension signaling according to the target text and the information of the target device and drives the bluetooth module to broadcast the BLE extension signaling. Correspondingly, the bluetooth driver of the target device receives the BLE extension signaling through its bluetooth module, parses the destination address of the BLE extension signaling and determines that it is the address of the device itself, obtains the target text from the payload of the BLE extension signaling, and sends the target text to the first application of the target device.
Step 408: the first application of the target device displays a text editing interface for the second file, detects a position indication operation of the user in the text editing interface, and determines the target position according to the position indication operation.
A text editing interface, such as interface 350, for the second file is shown.
Step 409: the first application of the target device additionally displays the target text at the target position in the text editing interface.
The text editing interface after the target text is additionally displayed is shown, for example, as interface 360.
In the method shown in fig. 4, an interface is displayed for the user to select the target device autonomously, and the first application of the source device sends the target text to the first application of the target device selected by the user, so that the transmission of the target text meets the user's transmission requirement. The source device sends the target text to the target device in a broadcast manner, so that the target text is transmitted from the source device to the target device without establishing a connection between them, and the manual steps the user would otherwise perform to establish such a connection are omitted. For example, compared with an implementation in which the source device and the target device establish a connection by logging in to the same user account, the operations of enabling the network on the source device and the target device and logging in to both devices with the same user account are omitted. The user's operations thus become simpler and fewer, and user experience is improved.
In another embodiment of the present application, the first application may be a message transmission application, for example, a short message application or an instant messaging application, and the target data may be a plurality of short messages or instant messaging messages. At this time, the first application that transmits the message in the first application of the source device to the target device may be implemented. The interface diagram is shown in fig. 5, for example.
In this embodiment, the setting operation of the distance threshold may refer to fig. 3, which is not described here.
The first application provides a message display interface for a user, wherein the message display interface can be a message display main interface or a message detail interface; in fig. 5, the message display interface is exemplified by a short message main interface 510.
The user selects the target message on the message display interface, and accordingly, the first application detects a message selection operation of the user, and obtains the target message selected by the user according to the message selection operation, for example, the target message selected in the interface 510 is message 1.
After the first application obtains the target message selected by the user, the user triggers the first application to display an operation selection interface, for example as shown by interface 520, by selecting the operation control shown in interface 510 or by performing a specified gesture. The operation selection interface lists a plurality of operation controls for the user, where each operation control may correspond to an operation executable on the target message; for example, a control corresponding to the operation "copy to other device" is provided in interface 520. The user selects an operation control, for example the "copy to other device" control; accordingly, the first application detects the first operation of the user.
The first application determines that the first operation of the user includes a device other than the source device, obtains device information of a peripheral device having a distance from the source device that does not exceed a distance threshold, and provides a device selection interface, such as shown by interface 530, for the user.
The user selects the device control corresponding to the target device in the device selection interface. Accordingly, the first application detects the target device selection operation of the user and obtains the target device according to this operation; for example, in interface 530 the user selects device 3 as the target device.
The first application sends the target message to the first application of the target device in a broadcast manner.
A first application in the target device displays the target message to the user. Optionally, if the target device is currently displaying an interface other than the message display interface of the short message application, the first application may prompt the received target message by means of a popup window, for example, as shown in interface 540, or by means of a top banner, for example, as shown in interface 550. If the target device is displaying the message display interface of the short message application, the newly received target message may be directly added to the message list of the message display interface; the manner of the added display is not limited in the embodiment of the present application, for example, the newly received target message may be additionally displayed at the top of the message list.
The implementation of the data processing method according to the embodiment of the present application based on fig. 5 may refer to fig. 4. The main difference is that, after the target device receives the target message, the manner of displaying the target message differs from that of steps 408 and 409: in this embodiment, the position at which the target device displays the target message does not need to be manually specified by the user. For the specific implementation, refer to the corresponding description of fig. 5, which is not repeated here.
In another embodiment provided by the present application, the first application may be an image management application, such as an album application, and the target data may be image data. At this time, the first application that transmits the image (picture or video) in the first application of the source device to the target device may be implemented. Specific interface implementations and method flow implementations may refer to fig. 3 to fig. 5, which are not described herein.
In order to further facilitate the operation of the user, in the following embodiments, the wearable device may detect a gesture of the user, and send the detected gesture to the source device, so that the user may indicate the operation that the source device needs to perform through the gesture. FIG. 6 is an interface diagram of another embodiment of the data processing method of the present application.
For example, as shown in part 610 of fig. 6, the source device and the wearable device may have a connection established in advance, and the connection may be a bluetooth connection or another connection, which is not limited by the embodiment of the present application, so long as the source device and the wearable device can communicate based on the connection. In fig. 6, the wearable device is exemplified as a smart watch.
In this embodiment, a default distance threshold needs to be preset in the first application of the source device, and then, the user may modify the distance threshold at any time during the use process, and specific interface implementation may refer to the interface 310, which is not described herein.
The user opens a text editing interface for the first file in the first application of the source device, for example, as shown by interface 620, and selects a target text to be edited in the text editing interface. The specific way in which the user selects the target text is not limited; for example, the user may long-press a word with a finger, or long-press a word and then drag the front and rear boundaries of the selection box to adjust the selected text range. Accordingly, the first application of the source device detects the text selection operation of the user and determines the target text selected by the user based on the text selection operation; for example, the target text selected by the user in interface 620 is "this is a test text segment".
After that, the user may perform a first specified gesture; for example, in fig. 6 the user performs a long-press gesture 621. Accordingly, the wearable device detects the first specified gesture and transmits information of the first specified gesture to the first application of the source device. The first application of the source device receives the information of the first specified gesture and obtains the operation corresponding to the first specified gesture, which is an operation that the first application of the source device needs to perform on the target text. For example, the operation corresponding to the first specified gesture may be a pending operation, in which case the first application of the source device waits for the user's next gesture. The pending operation here may be regarded as an indication that the user starts issuing operations by gesture.
It should be noted that, the first specified gesture may be the same as the gesture of selecting the target text in the text editing interface by the user, and the wearable device may detect the first specified gesture while the first application of the source device detects the text selection operation. For example, assuming that the gesture of the user to select the target text in the text editing interface and the first specified gesture are both long press gestures, then: if the user presses the target text long in a text editing interface, such as interface 620, the first application of the source device may detect a text selection operation of the user in the text editing interface and the wearable device may detect a first specified gesture of the user.
The user performs a second specified gesture, for example the grab gesture 622 shown in fig. 6. Accordingly, the wearable device detects the second specified gesture and transmits information of the second specified gesture to the first application of the source device. The first application of the source device receives the information of the second specified gesture and obtains the operation corresponding to the second specified gesture, which is an operation that the first application of the source device needs to perform on the target text. For example, continuing the example in which the operation corresponding to the first specified gesture is a pending operation, the operation corresponding to the second specified gesture may be "copy to other device", and the first application of the source device needs to perform the "copy to other device" operation on the target text.
In the above procedure, since 2 operations (for example, the pending operation and the "copy to other device" operation described above) need to be indicated to the first application, the user first performs the first specified gesture to indicate to the first application of the source device that one operation (for example, the pending operation) needs to be performed on the target text, and then performs the second specified gesture to indicate that another operation (for example, the "copy to other device" operation) needs to be performed on the target text. In other embodiments provided by the present application, if the user only needs to indicate 1 operation to the first application of the source device, the user may perform only 1 specified gesture; if the user needs to indicate more than 2 operations, the user may perform more than 2 gestures in sequence, which is not limited by the embodiment of the present application.
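For illustration, the gesture-to-operation dispatch described above might be sketched as follows, assuming the long-press gesture maps to the pending operation and the grab gesture maps to "copy to other device"; the mapping and state handling are purely illustrative.

```python
# Illustrative mapping from specified gestures reported by the wearable device
# to operations on the target text; not the claimed implementation.

GESTURE_TO_OPERATION = {
    "long_press": "pending",               # first specified gesture: start gesture control
    "grab": "copy to other device",        # second specified gesture
}

def handle_gesture(gesture: str, pending_state: dict) -> str | None:
    op = GESTURE_TO_OPERATION.get(gesture)
    if op == "pending":
        pending_state["active"] = True     # wait for the user's next gesture
        return None
    if op and pending_state.get("active"):
        pending_state["active"] = False
        return op                          # operation to perform on the target text
    return None

if __name__ == "__main__":
    state = {}
    print(handle_gesture("long_press", state))  # None, now waiting for the next gesture
    print(handle_gesture("grab", state))        # 'copy to other device'
```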
If the first application of the source device determines that the operation corresponding to the second specified gesture involves other devices, that is, the target device of the operation includes a device other than the source device, the first application acquires the device information of the peripheral devices whose distance from the source device does not exceed the distance threshold, and displays a device selection interface for the user, for example interface 630.
The user selects the device control corresponding to the target device in the device selection interface; this may be done by a specified gesture, or by the user manually tapping the device control of the target device in the device selection interface. Accordingly, the first application of the source device detects the target device selection operation of the user and obtains the target device according to this operation; for example, in interface 630 the user selects device 3 as the target device.
The first application of the source device sends the target text and the information of the target device to the wearable device through the connection pre-established between the source device and the wearable device, and the wearable device receives the target text and the information of the target device accordingly.
The user performs a third specified gesture, such as a tap gesture, and accordingly, the wearable device detects the third specified gesture, sends the target text to the target device using the BLE extension message, and accordingly, the target device receives the target text.
The source device sends the target text and the information of the target device to the wearable device, and the wearable device sends the target text to the target device using the BLE extension message. The target text is thus transmitted from the source device to the target device without establishing a connection between them, the manual steps the user would otherwise perform to establish such a connection are omitted, and the user's operations become simpler and more convenient.
Alternatively, the first application of the target device may provide a text editing interface for the second file, for example, as shown in interface 640, where the user selects a target location of the target text in the text editing interface, for example, the user clicks a certain location in the text editing interface with a finger, and accordingly, the first application of the target device detects a location indication operation of the user, obtains the target location according to the location indication operation, and additionally displays the target text at the target location of the text editing interface, for example, as shown in interface 650.
The above-mentioned process realizes that the text of the first file in the source device is copied to the appointed position of the second file in the target device. In the process, the user only needs to indicate the target text, the target equipment, the operation on the target text, the target position and the like in the modes of operation, gesture and the like, and networking or connection establishment between the source equipment and the target equipment is not needed manually, so that the operation of the user is simpler and more convenient.
It should be noted that, the third specified gesture of the user may be the same as the gesture of the user for specifying the target position in the text editing interface of the target device, and the wearable device may detect the third specified gesture while the first application of the target device detects the position indication operation of the user. For example, assuming that the gesture of the third specified gesture and the gesture of the specified target location are both click gestures, the user performs a finger click operation at the target location in the text editing interface of the target device, and accordingly, the wearable device detects the third specified gesture and the first application of the target device detects the location indication operation of the user.
In another embodiment of the present application, the third specified gesture is optional, that is, after the wearable device receives the target text and the information of the target device, the BLE extension message may be directly used to send the target text to the target device, without triggering the third specified gesture.
It should be noted that the user may frequently make various actions in daily life, and these actions may be the same as the specified gestures in the embodiment of the present application and thus be recognized as specified gestures by the wearable device worn by the user, for example a smart watch. In order to enable the wearable device to better cooperate with the electronic device in implementing the data processing method of the embodiment of the present application, reduce unnecessary gesture recognition by the wearable device, and reduce the data processing amount and power consumption of the wearable device, a setting parameter for whether to enable the gesture recognition function may be provided in the wearable device, so that the user can enable the gesture recognition function when needed and disable it when not needed.
An exemplary description of a method of the wearable device to recognize a user gesture follows.
When a user performs a gesture, different gestures cause the user's arm to generate different electromyographic signals. The wearable device, such as a smart watch, is provided with a sensor for detecting electromyographic signals; when the sensor is in contact with the user's arm, it can detect the electromyographic signal generated by the arm as a result of the user's gesture. The wearable device can determine the gesture performed by the user according to the detected electromyographic signal of the user's arm, which can be implemented using existing methods for recognizing gestures from electromyographic signals and is not described here.
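Purely as a toy illustration (real recognition would use a trained classifier on richer electromyographic features), a window of EMG samples could be mapped to one of the specified gestures as follows; the feature and the thresholds are made up.

```python
# Toy sketch of recognizing a gesture from an arm electromyographic (EMG)
# signal window; thresholds are illustrative only.

import math

def rms(samples):
    # root-mean-square amplitude of the EMG window, a common EMG feature
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def classify_gesture(emg_window):
    """Map the EMG amplitude of a short window to one of the specified gestures."""
    level = rms(emg_window)
    if level > 0.8:
        return "grab"          # strong, sustained activation (illustrative threshold)
    if level > 0.3:
        return "long_press"    # moderate activation
    return None                # no specified gesture recognized

if __name__ == "__main__":
    quiet = [0.02, -0.03, 0.01, 0.02]
    strong = [0.9, -1.1, 1.0, -0.95]
    print(classify_gesture(quiet), classify_gesture(strong))  # None grab
```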
FIG. 7 is a flowchart of a data processing method according to an embodiment of the present application, which is provided based on the interface schematic shown in FIG. 6, and as shown in FIG. 7, the method may include:
step 701: a distance threshold is preset.
The implementation of this step may refer to step 401, which is not described here in detail.
Step 702: the first application of the source device displays a text editing interface of the first file, and determines a target text according to text selection operation of a user in the text editing interface.
A text editing interface is shown, for example, as interface 320, in which the text of the first file is displayed. The text editing interface may also include an editing control, for example control 321.
The target text may be some or all of the text displayed in the text editing interface.
Step 703: the wearable device detects a first specified gesture and sends information of the first specified gesture to a first application of the source device.
Step 704: the first application of the source device obtains an operation corresponding to the first specified gesture as a first operation for the target text.
Steps 703 and 704 are optional steps.
Step 705: the wearable device detects the second specified gesture and sends information of the second specified gesture to the first application of the source device.
Step 706: the first application of the source device obtains an operation corresponding to the second specified gesture as a second operation for the target text.
Step 707: the first application of the source device determines that the target device of the second operation includes a device other than the source device, and obtains device information of a peripheral device whose distance from the source device does not exceed a distance threshold.
Step 708: the first application of the source device displays a device selection interface.
Step 709: the first application of the source device detects a selection operation of a user for a device control in a device selection interface, and determines the target device according to the selection operation.
Step 710: the first application of the source device sends the target text and the information of the target device to the wearable device.
Step 711: the wearable device detects a third specified gesture and sends the target text to a first application of the target device in a broadcast mode.
In another embodiment, the third specified gesture is optional, i.e., the wearable device may send the target text directly to the first application of the target device after receiving the target text and the information of the target device, without the third specified gesture triggering.
Alternatively, the wearable device may transmit the target text to the target device through BLE extension signaling.
Step 712: the first application of the target device displays a text editing interface for the second file, detects a position indication operation of the user in the text editing interface, and determines the target position according to the position indication operation.
Step 713: the first application of the target device additionally displays the target text at the target position in the text editing interface.
In another embodiment provided by the application, the first application of the source device may send the target text directly to the first application of the target device without forwarding by the wearable device, and at this time, the source device may also send the target text to the target device using the BLE extension message.
In another embodiment of the present application, the first application may be a message transmission application, for example, a short message application or an instant messaging application, where the first application may be implemented to transmit a message in the first application of the source device to the first application of the target device, where the interface implementation of this embodiment is shown in fig. 9, for example.
The implementation of this embodiment may refer to the related descriptions in fig. 5 and fig. 6, and only a brief description is provided here, and no redundant description is provided.
In this embodiment, the source device and the wearable device establish a connection in advance, and communication can be performed between the source device and the wearable device based on the connection. Still taking the wearable device as an example of a smart watch in fig. 8. In this embodiment, a default distance threshold needs to be preset in the first application of the source device.
The first application provides a message display interface for the user, which is exemplified in fig. 8 as a short message main interface 810.
The user selects a target message on the message display interface, and accordingly, the first application detects a message selection operation of the user, and determines the target message according to the message selection operation, and the target message in fig. 8 is message 1.
The user performs a first specified gesture, for example, in fig. 8, the user performs a long-press gesture 821, and accordingly, the wearable device detects the first specified gesture and transmits information of the first specified gesture to a first application of the source device, and accordingly, the first application receives the information of the first specified gesture and obtains an operation corresponding to the first specified gesture, for example, a pending operation;
The user performs a second specified gesture, such as the grab gesture 822 shown in fig. 8, and accordingly, the wearable device detects the second specified gesture, transmits information of the second specified gesture to the first application of the source device, and accordingly, the first application receives the information of the second specified gesture, and obtains an operation corresponding to the second specified gesture, such as a copy to other device operation;
the first application determines that the target device corresponding to the second specified gesture includes a device other than the source device, and obtains device information of the peripheral device having a distance from the source device that does not exceed the distance threshold, thereby selecting an interface, such as interface 820, for the user display device.
The user selects the device control corresponding to the target device in the device selection interface. Accordingly, the first application detects the target device selection operation of the user and obtains the target device according to this operation; for example, in interface 820 the user selects device 3 as the target device.
The first application of the source device sends the target message and the information of the target device to the wearable device through the connection pre-established between the source device and the wearable device;
the user performs a third specified gesture, such as a tap gesture, and, accordingly, the wearable device detects the third specified gesture and sends the target message to the target device using the BLE extension message.
A first application in the target device displays the target message to the user.
The implementation of the data processing method according to the embodiment of the present application provided based on the interface schematic diagram shown in fig. 8 may refer to the method shown in fig. 7, and the difference is mainly that the method for displaying the target message after the target device receives the target message is different, which is not described herein.
In still another embodiment provided by the present application, the first application may be an image management application, for example, an album application, and at this time, the first application may be implemented to transmit an image (a picture or a video) in the first application of the source device to the target device. The interface implementation and the method flow implementation of the embodiments of the present application may refer to the embodiments shown in fig. 6 to 8, and are not described herein again. Taking the example that the first application is an album application, the user may perform operations and gestures similar to those of fig. 6 and 8, for example, so as to transmit a target image of the album application in the source device to the album application in the target device, and the album application in the target device may add the received target image to an existing image list of the album application, so that when the user accesses the album application, the album application may perform related interface display according to the image list with the added target image.
In the following, taking ranging between a source device and a peripheral device as an example, an implementation of the BLE ranging method according to the embodiment of the present application is described. In the embodiment of the present application, the source device includes a first BLE ranging application and a first bluetooth driver, and the peripheral device includes a second BLE ranging application and a second bluetooth driver, where "first" and "second" are only used to distinguish the BLE ranging applications and the bluetooth drivers in the source device and the peripheral device.
As shown in fig. 9, the BLE ranging method may include:
Step 801: the first BLE ranging application of the source device sends broadcast ranging signaling to the first bluetooth driver.
Step 802: the first bluetooth driver of the source device sends the broadcast ranging signaling through the bluetooth module, and the second bluetooth driver of the peripheral device receives the broadcast ranging signaling through its bluetooth module.
The broadcast ranging signaling may include: a received signal strength indicator (Received Signal Strength Indicator, RSSI).
Optionally, the broadcast ranging signaling may further include: a broadcast transmit power (advertiser power).
Step 803: the second bluetooth driver of the peripheral device sends the broadcast ranging signaling to the second BLE ranging application of the peripheral device.
Step 804: the second BLE ranging application of the peripheral device calculates the distance between the peripheral device and the source device according to the broadcast ranging signaling.
In a possible implementation manner, the distance between the peripheral device and the source device can be calculated according to the RSSI value in the broadcast ranging signaling. Based on the log-distance path-loss model P = P0 − 10 × n × log10(d / d0) + ξ, the distance is calculated as shown in the following formula (with ξ neglected):
d = d0 × 10^((P0 − P) / (10 × n))
where d represents the distance between the peripheral device and the source device; d0 represents a reference distance; P represents the RSSI carried in the broadcast ranging signaling; P0 represents the received signal strength at the distance d0 between the source device and the peripheral device; n represents a loss index, which mainly depends on the surroundings of the source device and the peripheral device, and the more environmental obstacles there are, the larger the value of n; ξ represents an error term, which can be neglected in practical applications.
d0 and P0 may be standard reference values obtained from a distance test in advance and may be preset in the peripheral device; the specific values are not limited in the embodiment of the present application.
The value of n may be preset in the peripheral device; the specific value is not limited in the embodiment of the present application.
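A numeric illustration of this calculation is given below, using placeholder reference values (d0 = 1 m, P0 = −45 dBm, n = 2.5) and neglecting ξ as noted; the values are assumptions, not standardized parameters.

```python
# Numeric illustration of the distance calculation above with placeholder
# reference values; d = d0 * 10^((P0 - P) / (10 * n)) from the log-distance
# path-loss model, with the error term neglected.

def distance_from_rssi(rssi_dbm, d0_m=1.0, p0_dbm=-45.0, n=2.5):
    return d0_m * 10 ** ((p0_dbm - rssi_dbm) / (10.0 * n))

if __name__ == "__main__":
    for rssi in (-45, -60, -70):
        print(rssi, "dBm ->", round(distance_from_rssi(rssi), 2), "m")
```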
Step 805: the second BLE ranging application of the peripheral device acquires the device description information of the peripheral device, and generates ranging response signaling according to the device description information and the distance information between the peripheral device and the source device.
In this step, the second BLE ranging application may acquire the device description information from other applications of the peripheral device, or the second BLE ranging application may be preset with the device description information of the peripheral device, which is not limited in the embodiment of the present application.
Step 806: the second BLE ranging application of the peripheral device sends ranging response signaling to the second bluetooth driver.
Step 807: the second bluetooth driver of the peripheral device sends the ranging response signaling through the bluetooth module, and the first bluetooth driver of the source device receives the ranging response signaling through its bluetooth module.
The ranging response signaling may include: device address of the peripheral device, device description information, and distance information between the peripheral device and the source device.
Step 808: the first bluetooth driver of the source device sends ranging response signaling to the first BLE ranging application of the source device.
Step 809: the first BLE ranging application obtains device information of the peripheral device from the ranging response signaling.
The above procedure is executed between the source device and each peripheral device that receives the broadcast ranging signaling, so that the source device can obtain the device information of all the peripheral devices.
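Once the source device has collected the ranging responses of all peripheral devices, it can screen out the peripheral devices whose distance does not exceed the preset distance threshold as candidate second devices. The sketch below reuses the illustrative RangingResponse type from above; the threshold value and function name are assumptions, not part of the present application.

```kotlin
// Screening on the source device: keep peripheral devices within the preset distance
// threshold and order them by distance (closest first), for example to populate the
// first controls of the third interface.
fun screenSecondDevices(
    responses: List<RangingResponse>,
    distanceThresholdMeters: Double = 3.0  // illustrative preset distance threshold
): List<RangingResponse> =
    responses
        .filter { it.distanceMeters <= distanceThresholdMeters }
        .sortedBy { it.distanceMeters }
```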
The embodiment of the present application provides a data processing system, which includes the source device and the target device provided by the embodiments of the present application. Optionally, the system may further include the wearable device provided by the embodiments of the present application.
The embodiment of the application provides a chip system which comprises a processor and a data interface, wherein the processor reads instructions stored on a memory through the data interface so as to execute the data processing method provided by the embodiment of the application.
The embodiment of the application also provides a computer readable storage medium, in which a computer program is stored, which when run on a computer, causes the computer to execute the method provided by the embodiment of the application.
The present application also provides a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method provided by the embodiments of the present application.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may indicate that only A exists, both A and B exist, or only B exists, where A and B may each be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following items" and similar expressions refer to any combination of these items, including any combination of a single item or a plurality of items. For example, at least one of a, b and c may represent: a; b; c; a and b; a and c; b and c; or a, b and c, where a, b and c may each be single or plural.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementations should not be considered as going beyond the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not repeated herein.
In the several embodiments provided by the present application, any of the functions, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing descriptions are merely exemplary embodiments of the present application; any change or substitution that a person skilled in the art can readily conceive of within the technical scope disclosed in the present application shall be covered by the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A data processing method, applied to a source device, the method comprising:
Displaying a first interface;
Detecting a first selection operation of a user in the first interface, and determining target data;
Detecting a first operation for the target data;
When the device corresponding to the first operation includes a device other than the source device, obtaining device information of a second device, where the device information of the second device includes: device description information of the second device and distance information between the second device and the source device; the second device comprises a peripheral device whose distance from the source device does not exceed a preset distance threshold;
displaying a third interface according to the device information of the second device, wherein the third interface comprises a first control corresponding to the second device;
Detecting a second selection operation of a user on a target first control among the first controls, obtaining the second device corresponding to the target first control, and taking the second device corresponding to the target first control as a target device of the first operation;
and transmitting the target data to the target device by broadcasting.
2. The method of claim 1, wherein said transmitting said target data to said target device by broadcasting comprises:
transmitting the target data to the target device by using a BLE extension message, wherein a payload of the BLE extension message comprises the target data, and a destination device of the BLE extension message is the target device.
3. The method of claim 1, wherein the obtaining device information of the second device comprises:
Measuring the distance between a peripheral device and the source device by using a BLE ranging method to obtain device information of the peripheral device; the device information of the peripheral device includes: the device description information of the peripheral device and the distance information between the peripheral device and the source device;
And screening the device information of the second device from the device information of the peripheral device by using the preset distance threshold.
4. A method according to any one of claims 1 to 3, wherein said detecting a first operation directed to said target data comprises:
receiving information of a first gesture sent by a third device, wherein the first gesture is detected by the third device; the third device is a wearable device;
and determining the operation corresponding to the first gesture as the first operation.
5. The method of claim 4, wherein said transmitting said target data to said target device by broadcasting comprises:
the source device sends the target data to the third device;
and the third device sends the target data to the target device by broadcasting.
6. A method according to any one of claims 1 to 3, wherein the target data is text data, image data, or message data.
7. A data processing method, applied to a target device, the method comprising:
Receiving target data transmitted by a source device by broadcasting, the source device being configured to perform the data processing method of any one of claims 1 to 6.
8. The method of claim 7, wherein the target data is text data or image data, the method further comprising:
additionally displaying the target data in a second interface displayed by the target device.
9. The method of claim 8, wherein the additionally displaying the target data in the second interface comprises:
Detecting a position indication operation of a user in the second interface;
determining a target position of the target data in the second interface according to the position indication operation;
and additionally displaying the target data at the target position.
10. The method of claim 7, wherein the target data is message data, and wherein displaying the target data comprises:
when the interface displayed by the target device is a message main interface, additionally displaying the message data on the message main interface;
and when the interface displayed by the target device is not the message main interface, displaying prompt information of the message.
11. An electronic device, comprising:
One or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory and comprise instructions which, when executed by the one or more processors, cause the electronic device to perform the method of any one of claims 1 to 10.
12. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when run on a computer, causes the computer to perform the method according to any of claims 1 to 10.
13. A chip system comprising a processor and a data interface, the processor reading instructions stored on a memory via the data interface to perform the data processing method of any of claims 1 to 10.
14. A data processing system comprising a source device as claimed in any one of claims 1 to 6 and a target device as claimed in any one of claims 7 to 10.
CN202211473400.5A 2022-11-21 2022-11-21 Data processing method, device and system Pending CN118057858A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211473400.5A CN118057858A (en) 2022-11-21 2022-11-21 Data processing method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211473400.5A CN118057858A (en) 2022-11-21 2022-11-21 Data processing method, device and system

Publications (1)

Publication Number Publication Date
CN118057858A true CN118057858A (en) 2024-05-21

Family

ID=91069367

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211473400.5A Pending CN118057858A (en) 2022-11-21 2022-11-21 Data processing method, device and system

Country Status (1)

Country Link
CN (1) CN118057858A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination