WO2022063159A1 - A file transmission method and related device - Google Patents
A file transmission method and related device
- Publication number
- WO2022063159A1 (PCT/CN2021/119830)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- file
- devices
- target file
- information
- interface
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1698—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a sending/receiving arrangement to establish a cordless communication link, e.g. radio or infrared link, integrated cellular phone
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/10—File systems; File servers
- G06F16/16—File or folder operations, e.g. details of user interfaces specifically adapted to file systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1827—Network arrangements for conference optimisation or adaptation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/21—Monitoring or handling of messages
- H04L51/224—Monitoring or handling of messages providing notification on incoming messages, e.g. pushed notifications of received messages
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/06—Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/1446—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/08—Cursor circuits
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1822—Conducting the conference, e.g. admission, detection, selection or grouping of participants, correlating users to one or more conference sessions, prioritising transmission
Definitions
- the present invention relates to the field of terminals, and in particular, to a file transmission method and related equipment.
- the device collaboration technology and the WiFi-P2P direct connection technology are used to establish a connection between devices to realize file sharing.
- with device collaboration technology, once two devices establish collaboration, files can be dragged and transferred across devices. For example, when a file is dragged on the first device and the dragged file leaves the first device, relevant information is carried to the second device, and the second device handles it accordingly. With WiFi-P2P technology, a point-to-point connection is established between intelligent terminal devices: files with a large amount of data are transmitted over the WiFi network, and files with a smaller amount of data are transmitted over the Bluetooth channel.
- the technical problem to be solved by the embodiments of the present invention is to provide a file transmission method and related equipment to improve file sharing efficiency and user experience among multiple devices.
- a first aspect provides a method for file transfer, characterized in that it is applied to a first device in a multi-device collaborative system, wherein the multi-device collaborative system includes N devices, and any one of the N devices establishes collaboration with at least one other device among the N devices; the first device is any one of the N devices; N is an integer greater than 2; the method includes: the first device displays a first interface; the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device; M is an integer greater than or equal to 0; receiving a first drag operation acting on a target file on the first interface; notifying other devices among the N devices to monitor the release position of the first drag operation, the release position including the interface or collaboration window of any one of the N devices; detecting the release position of the first drag operation; and controlling the sending of the target file to the device matching the release position among the N devices.
- the target file is selected on the first interface of the first device, and the target file is processed.
- the drag operation includes, but is not limited to, dragging and dropping the target file by touching the screen, and dragging and dropping the target file by using a peripheral device such as a mouse.
- the first device will send a broadcast message to notify all other devices in the system that there is currently a target file to be shared, so that each device is ready to receive the target file.
- the first device may control to send the target file to the device matching the release position.
- each device can send or receive a target file without disconnecting the collaboration, which avoids having to break the collaboration established with other devices and then re-establish a new collaboration for file transfer, thereby realizing file transfer across multiple devices, improving the efficiency of file transfer under multi-device collaboration, and simplifying user operations.
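The notify-then-route flow described above can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: the names `Device`, `CollabSystem`, and the method names are all invented. The initiating device broadcasts a "monitor release" notification to every other device in the system, then delivers the target file to whichever device matches the detected release position.

```python
# Hypothetical sketch of the drag-release flow: broadcast a share intent,
# then route the file to the device matching the release position.

class Device:
    def __init__(self, name):
        self.name = name
        self.ready = False          # set when notified of a pending share
        self.received = []          # files this device has received

    def prepare_to_receive(self):
        self.ready = True

    def receive(self, file_name):
        self.received.append(file_name)


class CollabSystem:
    def __init__(self, devices):
        self.devices = {d.name: d for d in devices}

    def broadcast_share_intent(self, sender_name):
        # Notify all other devices to monitor the release position.
        for name, dev in self.devices.items():
            if name != sender_name:
                dev.prepare_to_receive()

    def drop(self, sender_name, file_name, release_target):
        # release_target: name of the device whose display screen or
        # collaboration window the drag operation was released on.
        self.broadcast_share_intent(sender_name)
        target = self.devices[release_target]
        target.receive(file_name)
        return target


system = CollabSystem([Device("tablet"), Device("phone"), Device("pc")])
target = system.drop("tablet", "report.pdf", release_target="pc")
print(target.name, target.received)   # pc ['report.pdf']
```

Note that the release target does not need an established collaboration with the sender, matching the passage above; the broadcast alone makes every device in the system a candidate receiver.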
- the target file is a file stored on the first device, and the starting position of the first drag operation is on the display screen interface; or, the target file is a file stored on an initiating second device among the M second devices, and the starting position of the first drag operation is in the collaboration window corresponding to the initiating second device.
- the first interface includes the display screen interface of the first device and the collaboration windows corresponding to the M second devices, and the target file is dragged and dropped on the first interface. If the drag starts on the display screen interface of the first device rather than in any of the collaboration windows corresponding to the M second devices, it means that the target file is stored on the first device.
- the first device can send the target file.
- if the drag starts in the collaboration window of one of the devices rather than on the display screen interface of the first device, it means that the target file is stored on the device corresponding to that collaboration window, and it is determined that the first device can control the device corresponding to the collaboration window to send the target file.
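The source-inference rule in the two bullets above can be sketched as a small function. This is an illustrative assumption, not the patent's code: the region names (`"display"`, window ids like `"win:phone"`) are invented. A drag starting on the first device's own display means the file is stored locally; a drag starting inside a collaboration window means the file is stored on the second device that window mirrors.

```python
# Hypothetical sketch: infer which device stores the dragged file
# from the region where the first drag operation starts.

def locate_source(start_region, local_device, collab_windows):
    """Return the name of the device that stores the dragged file.

    start_region:   region the drag started in, e.g. "display" or a
                    collaboration-window id such as "win:phone".
    collab_windows: maps window id -> name of the mirrored second device.
    """
    if start_region == "display":
        return local_device                    # file stored on first device
    if start_region in collab_windows:
        return collab_windows[start_region]    # file stored on that second device
    raise ValueError(f"unknown drag start region: {start_region}")


windows = {"win:phone": "phone", "win:pc": "pc"}
print(locate_source("display", "tablet", windows))    # tablet
print(locate_source("win:phone", "tablet", windows))  # phone
```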
- the target file is a file stored on the first device; the controlling of the sending of the target file to the device matching the release position among the N devices includes: when it is detected that the release position of the first drag operation is on the display screen interface of the third device, or that the release position of the first drag operation is in the collaboration window of the third device on another device among the N devices, controlling the first device to send the target file to the third device; the third device includes a device among the N devices that has not established collaboration with the first device.
- after the target file is dragged on the first interface of the first device, if the target file is released on the display screen interface of any device in the multi-device collaboration system, or in the collaboration window corresponding to that device, it means that the target file needs to be sent to the device matching the release position, and that device and the first device do not need to have established collaboration.
- for example, the tablet and the mobile phone establish collaboration and there is a collaboration window of the mobile phone on the tablet, and the tablet and the computer also establish collaboration.
- the user can drag the file to be shared from the collaboration window of the mobile phone on the tablet screen and drop it to send it to the computer. Therefore, without disconnecting the collaboration with other devices, the first device can perform file transfer with a device with which no collaboration has been established, simply by dragging and dropping the target file.
- the method further includes: acquiring file information of the target file; the file information includes the file name, file content, and file size information of the target file; judging whether the device matching the release position satisfies the condition for receiving the target file; and if so, determining the storage path at which the device matching the release position receives the target file.
- the sending device is the device storing the target file, such as the first device or one of the M second devices.
- the sending device needs to determine whether the receiving device has sufficient storage space to store the target file.
- the sending device can first send the size of the target file to the receiving device to determine whether that device has space to store the target file; if there is sufficient storage space, the storage path of the target file on the receiving device is determined, and the sending device can then successfully send the target file to this storage path.
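The pre-transfer negotiation described above can be sketched as a single check. This is an assumption-laden illustration: the download directory and the return convention (`None` on rejection) are invented, not from the patent. The sender announces the file size; the receiver accepts only if it has enough free space and replies with the storage path for the incoming file.

```python
# Hypothetical sketch of the size check and storage-path negotiation
# performed before the target file is actually transmitted.

def negotiate_transfer(file_size, free_space, download_dir="/storage/downloads"):
    """Return the receiving storage path if the receiver can hold the
    file, or None if the receiver lacks sufficient storage space."""
    if file_size > free_space:
        return None                       # receiver rejects: not enough space
    return f"{download_dir}/incoming"     # receiver accepts at this path


print(negotiate_transfer(file_size=10_000, free_space=50_000))
# /storage/downloads/incoming
print(negotiate_transfer(file_size=90_000, free_space=50_000))
# None
```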
- the controlling the first device to send the target file to the third device includes: establishing a data transmission channel, where the data transmission channel is used to transmit the file information of the target file; if the first device establishes a direct connection with the third device, sending the file information to the storage path of the third device through the data transmission channel; if the first device establishes an indirect connection with the third device, sending the file information to a relay device through the data transmission channel, and forwarding the file information to the storage path of the third device through the relay device, wherein the relay device is a device that establishes a direct connection with the first device and at the same time establishes a direct connection with the third device.
- the connection of more devices can be realized without disconnecting the collaboration among the devices.
- different networking technologies may have different networking modes, which may lead to changes in the connection relationship between devices.
- the first device is a sending device and the third device is a receiving device
- the first device and the third device can establish a direct connection or an indirect connection under different networking modes. If the first device establishes a direct connection with the third device (for example, using the ad hoc network technology to connect the devices), the first device can directly send the file information of the target file to the third device.
- if no direct connection is established between the first device and the third device, an indirect connection (for example, when using WiFi-P2P) is established through a relay device, that is, a device among the N devices in the multi-device collaborative system that can establish a direct connection with the first device and also establish a direct connection with the third device.
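The direct-versus-relay routing decision above can be sketched as a lookup over the system's direct links. This is an illustrative model under invented assumptions (the adjacency representation and device names are not from the patent): if the sender has a direct link to the receiver it transmits the file information directly; otherwise it picks a relay directly connected to both endpoints and forwards through it.

```python
# Hypothetical sketch: choose a direct path or a one-hop relay path
# between the sending and receiving devices.

def route(links, sender, receiver):
    """Return the transmission path as a list of device names.

    links: set of frozenset({a, b}) pairs, one per direct connection.
    """
    if frozenset({sender, receiver}) in links:
        return [sender, receiver]                    # direct connection
    # Otherwise look for a relay directly connected to both endpoints.
    devices = {d for pair in links for d in pair}
    for relay in sorted(devices - {sender, receiver}):
        if (frozenset({sender, relay}) in links
                and frozenset({relay, receiver}) in links):
            return [sender, relay, receiver]         # indirect via relay
    raise RuntimeError("no route between devices")


# WiFi-P2P style one-to-many networking: the tablet holds direct links
# to both the phone and the pc, so it can relay between them.
links = {frozenset({"tablet", "phone"}), frozenset({"tablet", "pc"})}
print(route(links, "tablet", "pc"))    # ['tablet', 'pc']
print(route(links, "phone", "pc"))     # ['phone', 'tablet', 'pc']
```

Under ad hoc networking, where every pair holds a direct link, the first branch always fires and the relay search is never reached.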
- the target file is a file stored on an initiating second device among the M second devices; the controlling of the sending of the target file to the device matching the release position among the N devices includes: when it is detected that the release position of the first drag operation is on the display screen interface of the third device, or that the release position of the first drag operation is in the collaboration window of the third device on another device among the N devices, controlling the initiating second device to send the target file to the third device.
- the third device may be one device among the M second devices.
- when the computer establishes collaboration with the mobile phone and the tablet at the same time, there are collaboration windows for the mobile phone and the tablet on the computer, and the mobile phone and the tablet do not need to establish collaboration; the user can select a file in the collaboration window of the tablet and drag the file to the collaboration window of the mobile phone.
- the files on the tablet can be directly shared to the mobile phone by dragging and dropping. Therefore, without disconnecting the collaboration with other devices, the target file can be dragged and dropped to perform file transfer with the device for which no collaboration has been established, thereby simplifying the file transfer operation.
- the controlling the initiating second device to send the target file to the third device includes: establishing a data transmission channel, where the data transmission channel is used to transmit the file information of the target file; if the initiating second device establishes a direct connection with the third device, sending the file information through the data transmission channel to the storage path of the device matching the release position; if the initiating second device establishes an indirect connection with the third device, sending the file information to a relay device through the data transmission channel, and forwarding the file information through the relay device to the storage path of the device matching the release position, wherein the relay device is a device that establishes a direct connection with the initiating second device and at the same time establishes a direct connection with the third device.
- the connection of more devices can be realized without disconnecting the collaboration among the devices.
- different networking technologies may have different networking modes, which may lead to changes in the connection relationship between devices.
- when the initiating second device is used as the sending device, the initiating second device and the third device may establish a direct connection or an indirect connection under different networking modes. If the initiating second device can establish a direct connection with the third device (for example, using ad hoc networking technology for networking), the initiating second device can directly send the file information of the target file to the third device.
- if the initiating second device does not establish a direct connection with the third device, but establishes an indirect connection through a relay device (such as the first device, for example when using WiFi-P2P networking to achieve a one-to-many connection between devices), then the initiating second device first sends the file information of the target file to the relay device, and the relay device forwards the file information of the target file to the third device, thereby realizing file transmission among multiple devices.
- the method further includes: acquiring first information of the target file; the first information includes one or more of file type information, file quantity information, and file arrangement order information of the target file; generating a drag effect set of the target file according to the first information; and displaying, according to the drag effect set of the target file, the drag effect matching a fourth device; the fourth device is a device passed by the drag track of the first drag operation, or the device corresponding to a collaboration window on a passed device.
- in the process of dragging and dropping the target file, the target file can be dragged across devices, and the drag track can pass through multiple devices; the devices passed through are the fourth devices, and different devices may have different operating systems.
- a file dragging effect suitable for the operating system is displayed during the dragging process of the target file.
- a drag effect set can be generated according to this information; the drag effect set includes one or more file drag effects, and the appropriate file drag effect is then displayed according to the operating system of the device that the drag track passes through.
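The drag-effect logic above can be sketched in two steps. This is a hypothetical illustration: the effect names, the supported operating systems, and the string encoding are all invented, not from the patent. From the first information of the target file (type and count here), a set of candidate effects is generated, one per operating system; the effect matching the OS of the device currently under the drag track is then selected for display.

```python
# Hypothetical sketch: build a drag effect set from the target file's
# first information, then pick the effect matching the passing device's OS.

def build_effect_set(file_type, file_count):
    """One candidate rendering per supported operating system."""
    stack = "stacked-thumbnails" if file_count > 1 else "single-thumbnail"
    return {
        "android": f"android/{file_type}/{stack}",
        "windows": f"windows/{file_type}/{stack}",
    }

def effect_for(effect_set, passing_device_os):
    """Select the drag effect matching the fourth device's OS."""
    return effect_set[passing_device_os]


effects = build_effect_set("image", file_count=3)
print(effect_for(effects, "android"))   # android/image/stacked-thumbnails
print(effect_for(effects, "windows"))   # windows/image/stacked-thumbnails
```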
- a second aspect provides a method for file transfer, characterized in that it is applied to a third device in a multi-device collaborative system, wherein the multi-device collaborative system includes N devices, and any one of the N devices establishes collaboration with at least one other device among the N devices; the third device is any one of the N devices; N is an integer greater than 2; the method includes: receiving a notification, initiated by the first device, to monitor the release position of a first drag operation, where the first drag operation is a drag initiated on a target file on a first interface, and the first interface includes the display screen interface of the first device and the collaboration windows corresponding to M second devices that establish collaboration with the first device, M being an integer greater than or equal to 0; monitoring the release position of the first drag operation; receiving the target file; and sending a broadcast to notify other devices in the multi-device collaborative system that the target file has been successfully received.
- the first device initiates the sharing of the target file, and the third device receives the notification initiated by the first device to monitor the release position of the target file.
- the third device will be ready to receive the target file.
- the first device can control the sending of the target file to the third device; at this time, the third device will receive the target file, and after the third device successfully receives it, it will send a broadcast to notify all other devices in the multi-device collaborative system that the target file has been successfully received, so that the other devices do not need to keep waiting to receive it.
- each device can receive the target file without disconnecting the collaboration, which avoids having to break the collaboration established with other devices and then re-establish a new collaboration for file transfer, thereby realizing file transfer across multiple devices, improving the efficiency of file transfer under multi-device collaboration, and simplifying user operations.
- the receiving of the target file sent under the control of the first device includes: establishing a data transmission channel with the device storing the target file; and receiving the file information of the target file.
- the file information includes the file name, file content, and file size information of the target file.
- the third device will receive information about the size of the target file from the sending device, and after judging that the third device has enough space to receive the target file, the third device may receive the file information of the target file.
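The receiver-side steps above can be sketched as follows. This is an illustrative model under invented names (`Receiver`, `handle_transfer`, the broadcast string format): the third device checks the announced size against its free space, accepts the file information if it fits, and then broadcasts completion so the remaining devices stop waiting to receive.

```python
# Hypothetical sketch of the third device's receive flow: space check,
# reception, then a success broadcast to the collaborative system.

class Receiver:
    def __init__(self, free_space):
        self.free_space = free_space
        self.file = None
        self.broadcasts = []          # messages sent to the other devices

    def handle_transfer(self, name, size, content):
        if size > self.free_space:
            return False              # refuse: not enough storage space
        self.file = {"name": name, "size": size, "content": content}
        # Notify the whole collaborative system that reception succeeded,
        # so the remaining devices need not keep waiting to receive.
        self.broadcasts.append(f"received:{name}")
        return True


third_device = Receiver(free_space=1024)
ok = third_device.handle_transfer("notes.txt", 10, b"hello")
print(ok, third_device.broadcasts)   # True ['received:notes.txt']
```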
- a third aspect provides an apparatus for file transmission, characterized in that it is applied to a first device in a multi-device collaborative system, wherein the multi-device collaborative system includes N devices, and any one of the N devices is associated with At least one other device in the N devices establishes cooperation; the first device is any one of the N devices; N is an integer greater than 2; the apparatus includes:
- a first display unit, used for the first device to display a first interface;
- the first interface includes a display screen interface of the first device, and collaboration windows corresponding to M second devices that establish collaboration with the first device; M is an integer greater than or equal to 0;
- a first receiving unit configured to receive a first drag operation acting on the target file on the first interface
- a sending unit configured to notify other devices in the N devices to monitor the release position of the first drag operation; the release position includes the interface of any one of the N devices or the collaboration window;
- a first processing unit configured to detect the release position of the first drag operation
- the sending unit is further configured to control sending the target file to the device matching the release position among the N devices.
- the target file is a file stored on the first device; the first processing unit is specifically configured to, when it is detected that the release position of the first drag operation is on the display screen interface of the third device, or that the release position of the first drag operation is in the collaboration window of the third device on another device among the N devices, control the first device to send the target file to the third device; the third device includes a device among the N devices that has not established collaboration with the first device.
- the apparatus further includes: a second receiving unit, configured to acquire file information of the target file, where the file information includes the file name, file content, and file size information of the target file; and a second processing unit, configured to judge whether the device matching the release position satisfies the condition for receiving the target file, and if so, to determine the storage path at which the device matching the release position receives the target file.
- the first processing unit is further configured to establish a data transmission channel, where the data transmission channel is configured to transmit the file information of the target file; the sending unit is further configured to: if the first device establishes a direct connection with the third device, send the file information to the storage path of the third device through the data transmission channel; if the first device establishes an indirect connection with the third device, send the file information to a relay device through the data transmission channel and forward the file information to the storage path of the third device through the relay device, where the relay device is a device that establishes a direct connection with the first device and, at the same time, a direct connection with the third device.
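The direct-versus-relay forwarding rule can be sketched as a small path-selection helper. The `choose_path` function and the link representation (a set of unordered device pairs) are assumptions made for illustration, not an API defined by the patent.

```python
def choose_path(source, target, direct_links):
    """Return the hop sequence for sending file information.

    direct_links is a set of frozenset pairs of directly connected devices.
    If source and target are directly connected, send directly; otherwise
    forward through a relay that is directly connected to both."""
    if frozenset((source, target)) in direct_links:
        return [source, target]
    devices = {d for link in direct_links for d in link}
    for relay in sorted(devices - {source, target}):
        if (frozenset((source, relay)) in direct_links
                and frozenset((relay, target)) in direct_links):
            return [source, relay, target]
    raise ValueError("no direct or one-hop relay path")
```

With links A–B and B–C, sending from A to C yields the relay path A → B → C, matching the relay-device behavior described above.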
- the target file is a file stored on an initiating second device among the M second devices; the first processing unit is specifically configured to, when detecting that the release position of the first drag operation is on the display interface of the third device, or that the release position of the first drag operation is in the collaboration window of the third device on another device among the N devices, control the initiating second device to send the target file to the third device.
- the first processing unit is further configured to establish a data transmission channel, where the data transmission channel is configured to transmit the file information of the target file; the sending unit is further configured to: if the initiating second device establishes a direct connection with the third device, send the file information to the storage path of the third device through the data transmission channel; if the initiating second device establishes an indirect connection with the third device, send the file information to a relay device through the data transmission channel and forward the file information to the storage path of the third device through the relay device, where the relay device is a device that establishes a direct connection with the initiating second device and, at the same time, a direct connection with the third device.
- the apparatus further includes: a third receiving unit, configured to acquire first information of the target file, where the first information includes one or more of file type information, file quantity information, and file arrangement order information of the target file; a third processing unit, configured to generate a drag effect set of the target file according to the first information; and a second display unit, configured to display, according to the drag effect set of the target file, a drag effect matching a fourth device; the fourth device is a device passed by the drag track of the first drag operation, or a device corresponding to a collaboration window on a passed device.
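Generating a drag effect from the "first information" (file type, quantity, and arrangement order) could look like the minimal sketch below. The effect-descriptor fields and the `build_drag_effect` name are invented for illustration; the patent does not prescribe a concrete format.

```python
def build_drag_effect(files):
    """Derive a simple drag-effect descriptor from the dragged files'
    type, count, and arrangement order."""
    icons = {"jpg": "image", "png": "image", "mp4": "video", "txt": "document"}
    first = files[0]  # first file in drag order decides the thumbnail
    ext = first.rsplit(".", 1)[-1].lower()
    return {
        "thumbnail": icons.get(ext, "generic"),  # icon from file type
        "badge": len(files),                     # file-count badge on the effect
        "stacked": len(files) > 1,               # multiple files render as a stack
    }
```

A receiving device could then render a per-device variant of this descriptor as the drag track passes over its screen or collaboration window.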
- a fourth aspect provides an apparatus for file transmission, characterized in that it is applied to a third device in a multi-device collaborative system, wherein the multi-device collaborative system includes N devices, and any one of the N devices establishes collaboration with at least one other device among the N devices; the third device is any one of the N devices; N is an integer greater than 2; the apparatus includes: a receiving unit, configured to receive a notification initiated by the first device to monitor the release position of the first drag operation, where the first drag operation initiates a drag on the target file on the first interface, and the first interface includes the display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device; M is an integer greater than or equal to 0; a processing unit, configured to monitor the release position of the first drag operation; the receiving unit is further configured to receive the target file sent under the control of the first device; and a sending unit, configured to broadcast, to other devices in the multi-device collaborative system, a notification that the target file has been successfully received.
- the processing unit is further configured to establish a data transmission channel with the device storing the target file; the receiving unit is further configured to receive file information of the target file, where the file information includes the file name, file content, and file size of the target file.
- an embodiment of the present invention provides an electronic device, the electronic device includes a processor, and the processor is configured to support the electronic device to implement corresponding functions in the file transfer method provided in the first aspect.
- the electronic device may also include a memory coupled to the processor, which stores program instructions and data necessary for the electronic device.
- the electronic device may also include a communication interface for the electronic device to communicate with other devices or a communication network.
- an embodiment of the present invention provides an electronic device, the electronic device includes a processor, and the processor is configured to support the electronic device to implement corresponding functions in the file transfer method provided in the second aspect.
- the electronic device may also include a memory coupled to the processor, which stores program instructions and data necessary for the electronic device.
- the electronic device may also include a communication interface for the electronic device to communicate with other devices or a communication network.
- the present application provides a chip system
- the chip system includes a processor, configured to support a file transfer device to implement the functions involved in the first aspect, for example, generating or processing the information involved in the file transfer method.
- the chip system further includes a memory for storing necessary program instructions and data of the data sending device.
- the chip system may be composed of chips, or may include chips and other discrete devices.
- the present application provides a chip system
- the chip system includes a processor, configured to support a file transfer device to implement the functions involved in the second aspect, for example, generating or processing the information involved in the file transfer method.
- the chip system further includes a memory for storing necessary program instructions and data of the data sending device.
- the chip system may be composed of chips, or may include chips and other discrete devices.
- the present application provides a computer storage medium, where the computer storage medium stores a computer program, and when the computer program is executed by a processor, implements the flow of the file transmission method described in any one of the above-mentioned first aspect.
- the present application provides a computer storage medium, where the computer storage medium stores a computer program, and when the computer program is executed by a processor, implements the flow of the file transmission method described in any one of the second aspect above.
- an embodiment of the present invention provides a computer program, where the computer program includes instructions that, when executed by a computer, enable the computer to execute the process performed by the processor in the file transmission apparatus of the first aspect above.
- an embodiment of the present invention provides a computer program, where the computer program includes instructions that, when executed by a computer, enable the computer to execute the process performed by the processor in the file transmission apparatus of the second aspect above.
- FIG. 1A is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- FIG. 1B is a block diagram of a software structure of an electronic device provided by an embodiment of the present application.
- FIG. 2 is a schematic diagram of a system architecture for file transmission according to an embodiment of the present application.
- FIG. 3 is a schematic diagram of a file transmission system architecture of any one of the N devices according to an embodiment of the present invention.
- FIG. 4 is a schematic flowchart of a file transmission method provided by an embodiment of the present application.
- FIG. 5A is a schematic diagram of a multi-device collaboration system (taking three devices as an example) provided by an embodiment of the present invention.
- FIG. 5B is a schematic diagram of initiating a file drag and drop on a first device (taking three devices as an example) in a multi-device collaboration system provided by an embodiment of the present invention.
- FIG. 5C is a schematic diagram of a first drag operation release position (taking three devices as an example) in a multi-device collaboration system provided by an embodiment of the present invention.
- FIG. 5D is a schematic diagram of sending a target file (taking three devices as an example) in a multi-device collaboration system provided by an embodiment of the present invention.
- FIG. 6 is a detailed schematic flowchart of a file transmission method in an embodiment of the present application.
- FIG. 7A is a schematic diagram of a multi-device collaboration system (taking 5 devices as an example) provided by an embodiment of the present invention.
- FIG. 7B is a schematic diagram of a target file (taking 5 devices as an example) in a multi-device collaboration system provided by an embodiment of the present invention.
- FIG. 7C is a schematic diagram of a drag effect (taking 5 devices as an example) in a multi-device collaboration system provided by an embodiment of the present invention.
- FIG. 7D is a schematic diagram of a first drag operation release position (taking five devices as an example) in a multi-device collaboration system provided by an embodiment of the present invention.
- FIG. 7E is a schematic diagram of dragging and dropping files (taking 5 devices as an example) in a multi-device collaboration system provided by an embodiment of the present invention.
- FIG. 8 is a schematic structural diagram of a file transmission apparatus according to an embodiment of the present invention.
- FIG. 9 is a schematic structural diagram of another file transmission apparatus provided by an embodiment of the present invention.
- the electronic device may be a portable electronic device that also includes other functions such as personal digital assistant and/or music player functions, such as a cell phone, a tablet computer, or a wearable electronic device with wireless communication capabilities (e.g., a smart watch), etc.
- portable electronic devices include, but are not limited to, portable electronic devices running various operating systems.
- the portable electronic device described above may also be other portable electronic devices, such as a laptop computer (Laptop) with a touch-sensitive surface or touch panel, or the like. It should also be understood that, in some other embodiments, the above-mentioned electronic device may not be a portable electronic device, but a desktop computer having a touch-sensitive surface or a touch panel.
- the term "user interface" (UI) in the description, claims, and drawings of this application is a medium interface for interaction and information exchange between an application program or an operating system and a user; it converts the internal form of information into a form acceptable to the user, and vice versa.
- the user interface of an application is source code written in a specific computer language, such as Java or extensible markup language (XML).
- the interface source code is parsed and rendered on the terminal device, and finally presented as content that the user can recognize.
- controls, also known as widgets, are the basic elements of the user interface. Typical controls include toolbars, menu bars, text boxes, buttons, scroll bars, pictures, and text.
- the attributes and content of controls in the interface are defined by tags or nodes.
- XML specifies the controls contained in the interface through nodes such as <Textview>, <ImgView>, and <VideoView>.
- a node corresponds to a control or property in the interface, and the node is rendered as user-visible content after parsing and rendering.
- applications, such as hybrid applications, often contain web pages in their interfaces.
- a web page, also known as a page, can be understood as a special control embedded in an application interface.
- a web page is source code written in a specific computer language, such as hypertext markup language (HTML), cascading style sheets (CSS), and JavaScript (JS).
- the specific content contained in a web page is also defined by tags or nodes in the source code of the web page.
- HTML defines the elements and attributes of web pages through tags such as <p>, <img>, <video>, and <canvas>.
- GUI refers to a user interface, related to computer operations, that is displayed graphically. It may consist of interface elements such as icons, windows, and controls displayed on the display screen of the electronic device, where the controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
- FIG. 1A shows a schematic structural diagram of an electronic device 100 .
- the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, 3D camera module 193, display screen 194, and a subscriber identification module (subscriber identification module, SIM) card interface 195 and so on.
- the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
- the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the electronic device 100 .
- the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
- the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
- the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a central processing unit (CPU), a graphics processing unit (GPU), a neural-network processing unit (NPU), a modem processor, an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and the like. Different processing units may be independent devices, or may be integrated in one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110.
- the controller may be the nerve center and command center of the electronic device 100 .
- the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
- a memory may also be provided in the processor 110 for storing instructions and data.
- the memory in processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or cycled. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the electronic device 100.
- the processor 110 may include one or more interfaces.
- the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
- the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
- the processor 110 may contain multiple sets of I2C buses.
- the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the 3D camera module 193 and the like through different I2C bus interfaces.
- the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
- the I2S interface can be used for audio communication.
- the processor 110 may contain multiple sets of I2S buses.
- the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
- the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
- the PCM interface can also be used for audio communication, to sample, quantize, and encode analog signals.
- the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
- the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
- the UART interface is a universal serial data bus used for asynchronous communication.
- the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
- a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
- the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
- the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
- the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the 3D camera module 193 .
- MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
- the processor 110 communicates with the 3D camera module 193 through a CSI interface to implement the camera function of the electronic device 100 .
- the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
- the GPIO interface can be configured by software.
- the GPIO interface can be configured as a control signal or as a data signal.
- the GPIO interface can be used to connect the processor 110 with the 3D camera module 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and the like.
- the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
- the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
- the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
- the interface can also be used to connect other electronic devices, such as AR devices.
- the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
- the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
- the charging management module 140 is used to receive charging input from the charger.
- the charger may be a wireless charger or a wired charger.
- the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
- the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
- the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
- the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the 3D camera module 193, and the wireless communication module 160.
- the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
- the power management module 141 may also be provided in the processor 110 .
- the power management module 141 and the charging management module 140 may also be provided in the same device.
- the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
- Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
- the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
- the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
- the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
- the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
- the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
- at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
- at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
- the modem processor may include a modulator and a demodulator.
- the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
- the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
- the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
- the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
- the modem processor may be a stand-alone device.
- the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
- the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
- the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
- the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
- the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
- the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, and the like.
- the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
- the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
- the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
- the electronic device 100 can implement a display function through a GPU, a display screen 194, an application processor, and the like.
- the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
- the GPU is used to perform mathematical and geometric calculations for graphics rendering.
- Processor 110 may include one or more GPUs that execute instructions to generate or change display information.
- Display screen 194 is used to display images, videos, and the like.
- Display screen 194 includes a display panel.
- the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
- the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
- the electronic device 100 can realize the camera function through the 3D camera module 193, the ISP, the video codec, the GPU, the display screen 194, the application processor AP, the neural network processor NPU, and the like.
- the 3D camera module 193 can be used to collect color image data and depth data of the photographed object.
- the ISP can be used to process the color image data collected by the 3D camera module 193 .
- when a photo is taken, the shutter is opened and light is transmitted to the camera photosensitive element through the lens; the photosensitive element converts the optical signal into an electrical signal and transmits it to the ISP for processing, which converts it into an image visible to the naked eye.
- the ISP can also perform algorithm optimization on the noise, brightness, and skin tone of the image, and can also optimize parameters such as the exposure and color temperature of the shooting scene.
- the ISP may be provided in the 3D camera module 193 .
- the 3D camera module 193 may be composed of a color camera module and a 3D sensing module.
- the photosensitive element of the camera of the color camera module may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
- the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
- the ISP outputs the digital image signal to the DSP for processing.
- DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
- the 3D sensing module may be a time of flight (TOF) 3D sensing module or a structured light (structured light) 3D sensing module.
- the structured light 3D sensing is an active depth sensing technology, and the basic components of the structured light 3D sensing module may include an infrared (Infrared) emitter, an IR camera module, and the like.
- the working principle of the structured light 3D sensing module is to first emit light spots of a specific pattern onto the object to be photographed, then receive the light coding of the spot pattern on the surface of the object, compare it with the originally projected spots, and use the principle of trigonometry to calculate the three-dimensional coordinates of the object.
- the three-dimensional coordinates include the distance between the electronic device 100 and the object to be photographed.
- TOF 3D sensing is also an active depth sensing technology, and the basic components of the TOF 3D sensing module may include an infrared (Infrared) transmitter, an IR camera module, and the like.
- the working principle of the TOF 3D sensing module is to calculate the distance (i.e., depth) between the TOF 3D sensing module and the object to be photographed from the round-trip time of the emitted infrared light, so as to obtain a 3D depth map.
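- The round-trip timing principle above can be sketched in a few lines; the function name and units are illustrative assumptions, not part of the embodiment:

```python
# Sketch of the TOF depth principle: depth is half the infrared
# round-trip time multiplied by the speed of light.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_s: float) -> float:
    """Distance (depth) in meters computed from the infrared round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0
```

A round trip of about 6.7 nanoseconds corresponds to roughly one meter of depth, which is why TOF modules need very fine timing resolution.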
- Structured light 3D sensing modules can also be used in face recognition, somatosensory game consoles, industrial machine vision detection and other fields.
- TOF 3D sensing modules can also be applied to game consoles, augmented reality (AR)/virtual reality (VR) and other fields.
- the 3D camera module 193 may also be composed of two or more cameras.
- the two or more cameras may include color cameras, and the color cameras may be used to collect color image data of the photographed object.
- the two or more cameras may use stereo vision technology to collect depth data of the photographed object.
- stereoscopic vision technology is based on the principle of human binocular parallax: under natural light, two or more cameras capture images of the same object from different angles, and operations such as triangulation are then performed to obtain the distance between the electronic device 100 and the photographed object, that is, the depth information.
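- The triangulation step above reduces, for a rectified two-camera setup, to the classic disparity relation Z = f · B / d; the parameter names below are illustrative assumptions:

```python
def stereo_depth_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from binocular disparity: Z = f * B / d.

    focal_length_px: camera focal length in pixels
    baseline_m:      distance between the two cameras in meters
    disparity_px:    horizontal shift of the same point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px
```

With a 1000-pixel focal length and a 5 cm baseline, a 50-pixel disparity corresponds to a depth of 1 meter; smaller disparities mean farther objects.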
- the electronic device 100 may include one or N 3D camera modules 193 , where N is a positive integer greater than one.
- the electronic device 100 may include a front 3D camera module 193 and a rear 3D camera module 193 .
- the front 3D camera module 193 can usually be used to collect the color image data and depth data of the photographer facing the display screen 194, and the rear 3D camera module can be used to collect the color image data and depth data of the objects the photographer faces (such as people, landscapes, etc.).
- the CPU or GPU or NPU in the processor 110 may process the color image data and depth data collected by the 3D camera module 193 .
- the NPU can, based on the skeletal point recognition technology, process the color image data collected by the 3D camera module 193 (specifically, the color camera module) through a neural network algorithm such as a convolutional neural network (CNN) algorithm, so as to determine the skeletal points of the person being photographed.
- the CPU or GPU can also run the neural network algorithm to realize the determination of the skeletal points of the photographed person according to the color image data.
- the CPU, GPU, or NPU can also be used to determine the figure of the person being photographed according to the skeletal points (e.g., the body proportions and the fatness or thinness of the body parts between the skeletal points), further determine body beautification parameters for that person, and finally process the captured image according to these parameters so that the body shape of the person in the image is beautified. Subsequent embodiments will introduce in detail how to perform body beautification processing on the image of the photographed person based on the color image data and depth data collected by the 3D camera module 193, which will not be described here.
- the digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and so on.
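- The frequency-point energy mentioned above can be sketched as a single-bin discrete Fourier transform; the function names and sample tone are illustrative assumptions:

```python
import math

def bin_energy(samples, k):
    """Energy of the k-th DFT bin of a real signal, computed directly."""
    n = len(samples)
    re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(samples))
    im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(samples))
    return re * re + im * im

# A pure tone at bin 5 concentrates its energy in that bin.
N = 64
tone = [math.cos(2 * math.pi * 5 * i / N) for i in range(N)]
```

For a pure cosine at bin k, the energy at that bin is (N/2)² while other bins are essentially zero, which is how a device can pick out a frequency point.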
- Video codecs are used to compress or decompress digital video.
- the electronic device 100 may support one or more video codecs.
- the electronic device 100 can play or record videos in various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4 and so on.
- the NPU is a neural-network (NN) computing processor.
- Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
- the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
- the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save data such as music, photos, videos, etc. in an external memory card.
- Internal memory 121 may be used to store one or more computer programs including instructions.
- the processor 110 can execute the above-mentioned instructions stored in the internal memory 121, thereby causing the electronic device 100 to execute the method for photographing and previewing the electronic device provided in some embodiments of the present application, as well as various functional applications and data processing.
- the internal memory 121 may include a storage program area and a storage data area. Wherein, the stored program area may store the operating system; the stored program area may also store one or more application programs (such as gallery, contacts, etc.) and the like.
- the storage data area may store data (such as photos, contacts, etc.) created during the use of the electronic device 100 .
- the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
- the electronic device 100 may implement audio functions, such as music playback and recording, through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like.
- the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
- the speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
- the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
- the receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
- the voice can be heard by placing the receiver 170B close to the human ear.
- the microphone 170C, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
- the user can speak with the mouth close to the microphone 170C, so that the sound signal is input into the microphone 170C.
- the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
- the earphone jack 170D is used to connect wired earphones.
- the earphone interface 170D can be the USB interface 130, or can be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
- the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
- the pressure sensor 180A may be provided on the display screen 194 .
- the capacitive pressure sensor may include at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
- the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
- the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
- touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
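- The short-message example above can be sketched as a simple threshold dispatch; the threshold value and action names are assumptions for illustration, not values from the embodiment:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure units (assumed)

def sms_icon_action(touch_intensity: float) -> str:
    """Map a touch on the short-message icon to an operation instruction
    according to its detected pressure intensity."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    return "create_new_short_message"
```

The same touch position thus yields different instructions purely from the pressure reading, which is the behavior the paragraph describes.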
- the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
- in some embodiments, the angular velocities of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
- the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates, according to the angle, the distance that the lens module needs to compensate, and allows the lens to counteract the shaking of the electronic device 100 through reverse motion, thereby achieving anti-shake.
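- The angle-to-distance compensation step above can be sketched as follows, assuming a small-angle pinhole model; the function name and sign convention are illustrative assumptions:

```python
import math

def lens_compensation_mm(focal_length_mm: float, shake_angle_rad: float) -> float:
    """Reverse lens displacement that cancels a detected shake angle:
    the image shifts by roughly f * tan(angle), so the lens moves the
    opposite way by the same amount."""
    return -focal_length_mm * math.tan(shake_angle_rad)
```

No shake means no compensation, and a positive shake angle produces a negative (opposite-direction) lens displacement.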
- the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
- the air pressure sensor 180C is used to measure air pressure.
- the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
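- The pressure-to-altitude calculation above is commonly done with the international barometric formula; this is a sketch of that standard approximation, not necessarily the exact formula used by the embodiment:

```python
def altitude_m(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    """Altitude in meters from air pressure, using the international
    barometric formula (troposphere approximation)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

Sea-level pressure yields zero altitude, lower pressure yields positive altitude, and higher-than-reference pressure yields a negative value.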
- the magnetic sensor 180D includes a Hall sensor.
- the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
- the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and further set features such as automatic unlocking upon flip opening according to the detected opening or closing state of the leather case or the flip cover.
- the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
- when the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. The acceleration sensor 180E can also be used to identify the posture of the electronic device, and can be applied to horizontal/vertical screen switching, pedometers, and other applications.
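- The horizontal/vertical screen-switching use above can be sketched by comparing the gravity components on two axes; axis convention and names are illustrative assumptions:

```python
def screen_orientation(accel_x: float, accel_y: float) -> str:
    """Portrait when gravity lies mostly along the device's y axis,
    landscape when it lies mostly along the x axis."""
    return "portrait" if abs(accel_y) >= abs(accel_x) else "landscape"
```

Holding the device upright puts ~9.8 m/s² on the y axis (portrait); rotating it 90 degrees moves that component to the x axis (landscape).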
- the distance sensor 180F is used to measure distance. The electronic device 100 can measure distance through infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
- Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
- the light emitting diodes may be infrared light emitting diodes.
- the electronic device 100 emits infrared light to the outside through the light emitting diode.
- Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
- the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
- the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
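- The sufficient/insufficient-reflected-light logic and the screen-off-during-call behavior described above can be sketched as follows; the threshold value and function names are assumptions for illustration:

```python
REFLECTED_LIGHT_THRESHOLD = 50  # assumed sensor counts

def object_nearby(reflected_light: int) -> bool:
    """Sufficient reflected infrared light means an object is near the device."""
    return reflected_light >= REFLECTED_LIGHT_THRESHOLD

def should_turn_off_screen(reflected_light: int, in_call: bool) -> bool:
    """Turn the screen off when the phone is held to the ear during a call,
    to save power and avoid accidental touches."""
    return in_call and object_nearby(reflected_light)
```

Only the combination of an ongoing call and a nearby object (the user's head) turns the screen off; either condition alone does not.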
- the ambient light sensor 180L is used to sense ambient light brightness.
- the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
- the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
- the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
- the fingerprint sensor 180H is used to collect fingerprints.
- the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
- the temperature sensor 180J is used to detect the temperature.
- the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
- the electronic device 100 when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
- in other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
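- The three-threshold temperature processing strategy described above can be sketched as a simple dispatch; all threshold values and action names are assumptions for illustration:

```python
HIGH_TEMP_C = 45.0       # assumed: reduce performance above this
LOW_TEMP_C = 0.0         # assumed: heat the battery below this
VERY_LOW_TEMP_C = -10.0  # assumed: boost battery output voltage below this

def thermal_action(temp_c: float) -> str:
    """Select a temperature processing action for a reported temperature."""
    if temp_c > HIGH_TEMP_C:
        return "reduce_processor_performance"
    if temp_c < VERY_LOW_TEMP_C:
        return "boost_battery_output_voltage"
    if temp_c < LOW_TEMP_C:
        return "heat_battery"
    return "normal"
```

The coldest branch is checked before the merely-cold one so that the strongest countermeasure wins at very low temperatures.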
- the touch sensor 180K may also be referred to as a touch panel or a touch sensitive surface.
- the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
- the touch sensor 180K is used to detect a touch operation on or near it.
- the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
- Visual output related to touch operations may be provided through display screen 194 .
- the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
- the bone conduction sensor 180M can acquire vibration signals.
- the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human vocal part.
- the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
- the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
- the audio module 170 can parse out a voice signal based on the vibration signal of the vocal-part vibrating bone mass obtained by the bone conduction sensor 180M, so as to realize the voice function.
- the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
- the keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
- the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
- Motor 191 can generate vibrating cues.
- the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
- touch operations acting on different applications can correspond to different vibration feedback effects.
- the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
- different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
- the touch vibration feedback effect can also support customization.
- the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
- the SIM card interface 195 is used to connect a SIM card.
- the SIM card can be brought into contact with or separated from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195.
- the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
- the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
- the SIM card interface 195 can also be compatible with different types of SIM cards.
- the SIM card interface 195 is also compatible with external memory cards.
- the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
- the electronic device 100 employs an eSIM, that is, an embedded SIM card.
- the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
- the electronic device 100 exemplarily shown in FIG. 1A may display various user interfaces described in various embodiments below through the display screen 194 .
- the electronic device 100 can detect, through the touch sensor 180K, touch operations in each user interface, for example, a tap operation (such as a touch operation or a double-tap operation on an icon), or, for example, an upward or downward swipe operation, a circle-drawing gesture, and so on, in each user interface.
- the electronic device 100 may detect a motion gesture performed by the user holding the electronic device 100, such as shaking the electronic device, through the gyro sensor 180B, the acceleration sensor 180E, and the like.
- the electronic device 100 can detect non-touch gesture operations through the 3D camera module 193 (eg, a 3D camera, a depth camera).
- the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
- the embodiment of the present invention takes an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
- FIG. 1B is a block diagram of a software structure of an electronic device 100 according to an embodiment of the present invention.
- the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
- the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
- the application layer can include a series of application packages.
- the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, etc.
- the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
- the application framework layer includes some predefined functions.
- the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
- a window manager is used to manage window programs.
- the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
- Content providers are used to store and retrieve data and make these data accessible to applications.
- the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
- the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
- a display interface can consist of one or more views.
- the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
- the phone manager is used to provide the communication function of the electronic device 100 .
- for example, the management of call statuses (including connected, hung up, etc.).
- the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
- the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
- the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
- Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
- the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
- the application layer and the application framework layer run in virtual machines.
- the virtual machine executes the java files of the application layer and the application framework layer as binary files.
- the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
- a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
- the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
- the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
- the media library can support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
- the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
- 2D graphics engine is a drawing engine for 2D drawing.
- the kernel layer is the layer between hardware and software.
- the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
- the software system shown in FIG. 1B involves the presentation of applications that use the sharing capability (such as the gallery and the file manager), an instant sharing module that provides the sharing capability, a print service and a print spooler that provide the printing capability, an application framework layer that provides the printing framework, WLAN services, and Bluetooth services, and a kernel and bottom layer that provide WLAN and Bluetooth capabilities and basic communication protocols.
- WiFi direct connection is also known as WiFi peer-to-peer (WiFi-p2p).
- WiFi-p2p use cases range from web browsing to file transfer, and to communicating with multiple devices simultaneously, taking advantage of the speed benefits of WiFi.
- WiFi-p2p and traditional WiFi technology are not mutually exclusive: a GO (Group Owner) can provide services for multiple GCs (Group Clients) as an AP does; like a traditional device, it can connect to an AP, and it can simultaneously act as an AP itself.
- the GO is a role in the protocol, which is equivalent to an AP, and there is only one GO in a group.
- the GC is another role in the protocol, and there can be multiple GCs in a group.
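- The one-GO, many-GC group structure described above can be sketched as a toy model; the class and device names are illustrative assumptions, not part of the protocol specification:

```python
class P2pGroup:
    """Toy model of a WiFi-p2p group: exactly one GO, any number of GCs."""

    def __init__(self, group_owner: str):
        self.group_owner = group_owner  # the single GO role in the group
        self.clients = []               # the GC roles; there may be many

    def add_client(self, name: str) -> None:
        if name == self.group_owner:
            raise ValueError("the GO cannot also join the group as a GC")
        self.clients.append(name)

group = P2pGroup("phone")   # one GO per group
group.add_client("tablet")  # multiple GCs may join
group.add_client("laptop")
```

The invariant enforced here mirrors the protocol: the group has exactly one owner, while clients can be added freely.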
- a wireless local area network refers to a network system formed by interconnecting computer equipment through wireless communication technology, in which the devices can communicate with each other and share resources.
- the essential feature of the wireless local area network is that it no longer uses a communication cable to connect the computer to the network, but connects wirelessly, making network construction and terminal movement more flexible.
- an ad hoc network is a network that combines mobile communication with computer networking.
- information exchange in the network adopts the packet switching mechanism of computer networks, and the user terminals are portable terminals that can move.
- Each user terminal has both router and host functions.
- scenarios of a file transfer system to which the file transfer method in this application is applied are exemplified below. It can be understood that when the file transfer method in this application is applied to different scenarios, the intelligent terminal devices may correspond to different types of devices, and the transferred files are also of different types. Two scenarios are exemplified below.
- Scenario 1: a file transfer scenario based on video production.
- with the rapid development of the Internet, more and more young people like to use Vlogs to record the little things in their lives and upload them to the Internet to share with friends and fans. This has driven the continuous improvement of mobile phone camera functions: people have slowly put down their heavy cameras and begun to take out their mobile phones to record video material anytime, anywhere. To complete a Vlog video, one needs to write the script in the early stage, shoot the video material, and edit it in the later stage into a continuous video with complete content. In the process of Vlog production, the following situation is inevitable: to make the picture clearer, the photographer usually uses the mobile phone as the camera; when recording some shots that require the photographer to appear in the frame, a tablet can be used together with the mobile phone.
- by using the embodiments of the present invention, collaboration can be established between the mobile phone and the tablet so that the display screen of the mobile phone is projected onto the tablet; the tablet and the computer can also establish collaboration, and the computer can send the script to the tablet at any time.
- video recording can then be done while looking at the tablet.
- the mobile phone can directly drag and drop the video material on the mobile phone to the computer without establishing collaboration with the computer, and the computer can then be used for post-editing, which greatly facilitates the photographer's Vlog production process.
- FIG. 2 is a schematic diagram of a system architecture for file transmission provided by an embodiment of the present invention, and the system is used to solve the problem of low file resource transmission efficiency among multiple intelligent terminal devices.
- the system architecture may include N devices, and any one device in the N devices establishes cooperation with at least one other device in the N devices; N is an integer greater than 2.
- any one of the N devices may be the electronic device 100 described in FIG. 1A above, wherein,
- the first device is any one of the N devices; it refers to the electronic device that initiates a file drag-and-drop operation, and it has an operating system and a data transmission interface.
- Common electronic devices include smartphones, personal computers, tablets, smart screens and other devices.
- a smart phone has an independent operating system and can achieve wireless network access through a mobile communication network.
- some file data can be stored on the smartphone, and the files can be edited. After the smartphone has established collaboration with other electronic devices, the files on the smartphone can be transferred to the electronic devices with which collaboration is established.
- the third device is any one of the N devices (e.g., one of the M second devices); it refers to the electronic device that receives the target file, and it has an operating system and a data transmission interface.
- Common electronic devices include smartphones, personal computers, tablets, smart screens and other devices.
- an example is a personal computer, which has an independent operating system and can access the Internet through a wired or wireless connection.
- the personal computer can establish cooperation with other electronic devices and can receive or forward documents from other electronic devices.
- Another example is a tablet.
- the tablet can communicate with other electronic devices, and can also establish collaboration with some electronic devices.
- the screen projection function of the smartphone can be realized on the tablet, so that the desktop of the smartphone is displayed on the display screen of the tablet; the tablet can also receive or forward files from other electronic devices.
- a file transfer system architecture in FIG. 2 is only an exemplary implementation in the embodiments of the present application, and the file transfer system architectures in the embodiments of the present application include but are not limited to the above system architectures.
- FIG. 3 is a schematic diagram of a file transfer system architecture of any one of the N devices provided by an embodiment of the present invention.
- the file transfer system architecture includes a connection management module 301, a collaboration management module 302, a file drag management module 303, and a file transfer management module 304.
- the connection management module 301 is responsible for establishing connections between multiple electronic devices. For example, in the connection management module 301, a connection entry can be provided to the user, so as to facilitate the access of multiple electronic devices in cooperation. At the same time, the connection management module 301 can provide an authentication function during the connection process, and after the authentication of accessing the electronic device is completed, the connection between the electronic devices can be established. When the connection between electronic devices needs to be disconnected, the connection management module 301 provides a disconnection entry, so that the electronic device that has established the connection can disconnect at any time.
- the collaboration management module 302 is responsible for realizing the collaboration function between electronic devices. For example, if collaboration is established between multiple electronic devices, the collaboration management module 302 can provide the capability of supporting audio and video transmission such as screen projection and audio switching between electronic devices. The collaborative management module 302 can also provide the capability of supporting data transmission such as sending and receiving of various operation signaling between devices. At the same time, the ability to support sharing of peripheral devices can also be provided, the peripheral devices are used on electronic devices that do not support touch, and the file dragging function can be conveniently used for the electronic devices.
- the file drag and drop management module 303 is responsible for the realization of file sharing between electronic devices.
- file type information can be obtained, for example, the file name suffix of different files can be obtained to determine the file type, and accordingly, corresponding drag effects closer to different operating systems can be generated in different systems.
- a decision can be made on the dragging effect according to the type, quantity, size, and arrangement order of the dragged files, and the corresponding dragging effect can be determined to be displayed on different devices.
- the dragging effect is: The display effect of the file during the drag and drop process.
- the file dragging management module 303 can implement the management of related functions of file drag and drop between electronic devices.
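The drag-effect decision described above (choosing a display effect from the dragged files' type, quantity, and arrangement order) might be sketched roughly as follows. The suffix-to-type table, effect names, and target-system labels are illustrative assumptions, not details taken from the embodiment.

```python
# Hypothetical sketch of the drag-effect decision in the drag management module.
import os

# Assumed mapping from file name suffix to a coarse file type.
SUFFIX_TO_TYPE = {
    ".txt": "document", ".doc": "document",
    ".jpg": "image", ".png": "image",
    ".mp4": "video",
}

def drag_effect(paths, target_os):
    """Decide a drag effect from file type, quantity, and drag order."""
    types = [SUFFIX_TO_TYPE.get(os.path.splitext(p)[1].lower(), "generic")
             for p in paths]
    if len(paths) > 1:
        # A multi-file drag is shown as a stacked thumbnail with a count badge;
        # the first file in the drag order decides the top icon.
        return {"style": f"{target_os}-stack", "count": len(paths),
                "top_type": types[0]}
    return {"style": f"{target_os}-single", "count": 1, "top_type": types[0]}

effect = drag_effect(["a.png", "b.txt"], "android")
```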
- the file transfer management module 304 is responsible for acquiring and generating file drag events. For example, in the file transfer management module 304, file information of the dragged file can be obtained, and file transfer and reception can be prepared accordingly. At the same time, in this module, the file storage can be managed, and the file storage can be managed accordingly by judging whether the current electronic device can receive the file and determining the storage path of the received file. In this module, file sending and receiving can also be managed, and socket connections can be created according to the IP between different electronic devices, so that channels can be established between different devices for file transfer between devices. The file transmission management module 304 can implement related operations such as file transmission, reception, and storage between devices.
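The socket channel the transfer management module creates between device IPs can be sketched as a minimal sender/receiver pair. The loopback address, port number, and 8-byte big-endian length prefix are assumptions for illustration; a real implementation would connect to the peer device's IP.

```python
# Minimal sketch of a socket-based file transfer channel between two devices.
import socket
import struct
import threading

ready = threading.Event()

def recv_exact(conn, n):
    """Read exactly n bytes from the connection."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed early")
        buf += chunk
    return buf

def serve_once(port, out):
    """Receiver side: accept one connection and read one length-prefixed file."""
    with socket.socket() as srv:
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        ready.set()  # signal that the channel is open
        conn, _ = srv.accept()
        with conn:
            size = struct.unpack(">Q", recv_exact(conn, 8))[0]
            out.append(recv_exact(conn, size))

def send_file(ip, port, payload):
    """Sender side: open a channel to the peer's IP and push the file bytes."""
    with socket.socket() as s:
        s.connect((ip, port))
        s.sendall(struct.pack(">Q", len(payload)) + payload)

received = []
t = threading.Thread(target=serve_once, args=(50007, received))
t.start()
ready.wait()
send_file("127.0.0.1", 50007, b"target file bytes")
t.join()
```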
- the file transmission system architecture in the electronic device in FIG. 3 is only an exemplary implementation in the embodiment of the present invention, and the file transmission system architecture in the electronic device in the embodiment of the present invention includes but is not limited to the above structure.
- FIG. 4 is a schematic flow chart of a file transfer method in an embodiment of the present application.
- the file transfer method in the embodiment of the present application is described from the interactive sides of the first device and the third device (eg, one of the second devices, etc.). It should be noted that, in order to describe the file transfer method in the embodiments of the present application in more detail, each process step describes the corresponding execution body as the first device or the third device, but this does not mean that the corresponding method flow can only be performed by the described execution body.
- Step S401 The first device displays a first interface.
- the first interface includes a display screen interface of the first device, and collaboration windows corresponding to M second devices that establish collaboration with the first device, and M is an integer greater than or equal to 0.
- the first device may display the first interface through the display screen 194 .
- FIG. 5A is a schematic diagram of a multi-device collaboration system (taking three devices as an example), the system includes three electronic devices, and the three electronic devices are a computer, a tablet and a mobile phone.
- the computer and the tablet establish collaboration
- the tablet and the mobile phone establish collaboration
- there is a collaboration window for the mobile phone on the tablet but the computer does not establish collaboration with the mobile phone
- the first device can be a tablet, and the value of M is 1 at this time.
- the M second devices include a mobile phone, so the first interface is the screen of the tablet; there is only one collaboration window (that of the mobile phone) on the screen of the tablet, and positions on the screen of the tablet other than the collaboration window are the display interface of the first device.
- Step S402 The first device receives a first drag operation on the target file acting on the first interface.
- the target file is a file to be shared
- the first drag operation may include, but is not limited to, using the touch screen (touch sensor 180K) of the first device to drag the target file, or using a peripheral device such as a mouse to drag the target file.
- FIG. 5B is a schematic diagram of initiating a file drag and drop on a first device (taking three devices as an example) in a multi-device collaboration system.
- the system includes three electronic devices, and the three electronic devices are a computer, a tablet, and a mobile phone.
- the tablet can be used as the first device to initiate a drag operation on a target file on the screen of the tablet.
- the target file can be any file that appears on the screen; as shown in the figure, the target file can be a C file or a b file.
- the C file can be dragged from track 1 or track 2, and the b file can be dragged according to track 3 or track 4. It should be noted that the track is not limited to the above-mentioned track and may include more possibilities.
- Step S403 The first device notifies other devices among the N devices to monitor the release position of the first drag operation.
- the release position includes the interface of any one of the N devices or the collaboration window
- the first drag operation release position is the position where the user releases the target file on the first interface.
- FIG. 5C is a schematic diagram of the release position of the first drag operation (taking 3 devices as an example) in the multi-device collaboration system.
- the user initiates a drag operation on the C file in the mobile phone collaboration window on the first interface.
- at this time, the tablet will send a broadcast message to notify the computer, tablet, and mobile phone in the multi-device collaboration system that a file drag operation has been initiated, so that every device in the system is ready to receive the C file.
- each device will monitor the release position of the file drag operation, and each device can know that the C file comes from the mobile phone.
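The broadcast step can be illustrated with a small in-process sketch: the initiating device notifies every peer that a drag has started, identifying the file's source so each device can start monitoring the release position. The class and message-field names are hypothetical, not a wire format from the embodiment.

```python
# Illustrative sketch of the drag-started broadcast in a collaboration system.

class Device:
    def __init__(self, name):
        self.name = name
        self.monitoring = None  # set once a drag-started notification arrives

    def on_drag_started(self, msg):
        # Each peer learns the file's source and begins monitoring the release.
        self.monitoring = msg

def broadcast_drag_started(initiator, peers, file_name, source_device):
    """Notify every peer device that a drag on file_name has started."""
    msg = {"event": "drag_started", "file": file_name,
           "source": source_device, "initiator": initiator}
    for p in peers:
        p.on_drag_started(msg)
    return msg

peers = [Device("computer"), Device("tablet"), Device("mobile phone")]
broadcast_drag_started("tablet", peers, "C", "mobile phone")
```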
- Step S404 The third device receives a notification initiated by the first device to monitor the release position of the first drag operation.
- FIG. 5C is a schematic diagram of the release position of the first drag operation (taking 3 devices as an example) in the multi-device collaboration system.
- the user initiates a drag operation on the C file in the mobile phone collaboration window on the first interface.
- the third device can be a computer or a tablet. At this time, both the computer and the tablet will receive a notification from the first device to monitor the release position of the first drag operation, so as to know that there is a C file to be shared on the mobile phone, and remain ready to receive the C file.
- Step S405 The third device monitors the release position of the first drag operation.
- FIG. 5C is a schematic diagram of the release position of the first drag operation (taking 3 devices as an example) in the multi-device collaboration system.
- the user initiates a drag operation on the C file in the mobile phone collaboration window on the first interface.
- the third device can be a computer or a tablet. When the computer is used as the third device, the computer will always monitor the position where the C file is released, and receive the C file when the C file is released on its own display interface.
- Step S406 The first device detects the release position of the first drag operation.
- the release position may include the display screen interface of any one of the N devices or the collaboration window corresponding to the device.
- the third device is the device corresponding to the release position of the first drag operation.
- FIG. 5C is a schematic diagram of the release position of the first drag operation in the three device systems.
- the user initiates a drag operation on the C file in the mobile phone collaboration window on the first interface, and can move the C file along the track 1. If the file is released on the display interface of the tablet, the tablet will detect that the release position is the display interface of the tablet.
- the third device is the tablet, which means that the C file needs to be sent to the tablet.
- the C file can also be released on the computer screen interface along the track 2, and the tablet will detect that the release position is the computer screen interface.
- the third device is a computer, which means that the C file needs to be sent to the computer.
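Detecting which device matches the release position can be sketched as a point-in-region test: a release inside a collaboration window maps to the device behind that window, and any other release maps to the local display interface. The window geometry below is an illustrative assumption.

```python
# Sketch of mapping a drag-release point to the matching device.

def device_at(release_point, local_device, windows):
    """windows: list of (device_name, (x, y, width, height)) rectangles."""
    x, y = release_point
    for name, (wx, wy, ww, wh) in windows:
        if wx <= x < wx + ww and wy <= y < wy + wh:
            return name          # released inside that device's collaboration window
    return local_device          # otherwise released on the local display interface

windows = [("mobile phone", (800, 100, 300, 600))]
inside = device_at((900, 300), "tablet", windows)    # in the phone's window
outside = device_at((100, 100), "tablet", windows)   # on the tablet's own screen
```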
- Step S407 The first device controls sending the target file to the device matching the release position among the N devices.
- the controlling of sending the target file to the device matching the release position among the N devices includes: the first device sending the target file to the device matching the release position, and the first device controlling another device to send the target file to the device matching the release position.
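The two control cases above (the first device sends the file itself, or instructs the device that stores the file to send it) can be sketched as a small dispatch rule. The device names and send-log bookkeeping are hypothetical.

```python
# Sketch of the control-send decision: who actually transmits the target file.

def control_send(first_device, file_owner, target_device, send_log):
    """Record which device transmits the file to the release-matched device."""
    # If the first device stores the file it sends directly; otherwise it
    # instructs the owning device to send.
    sender = first_device if file_owner == first_device else file_owner
    send_log.append((sender, target_device))
    return send_log

log = []
control_send("tablet", "mobile phone", "computer", log)  # C file: phone sends
control_send("tablet", "tablet", "mobile phone", log)    # b file: tablet sends
```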
- FIG. 5D is a schematic diagram of sending a target file (taking 3 devices as an example) in a multi-device collaboration system. If the user selects the C file on the first interface, drags the C file along the track 5, and releases the drag operation on the computer screen interface, the tablet will control the mobile phone to send the C file to the computer. If the user selects the b file on the first interface, drags the b file along the track 6, and releases the drag operation on the collaboration window of the mobile phone, the tablet will send the b file to the mobile phone.
- Step S408 The third device receives the target file controlled and sent by the first device.
- the third device may store the target file in the device.
- FIG. 5D is a schematic diagram of sending a target file (taking 3 devices as an example) in a multi-device collaboration system. If the user selects the C file on the first interface, drags the C file along the track 5, and releases the drag operation on the computer screen interface, the tablet will control the mobile phone to send the C file to the computer. At this time, the computer, as the third device, will receive the C file and store the C file locally.
- the tablet will send the b file to the mobile phone.
- the mobile phone as the third device, will receive the b file and store the b file locally.
- Step S409 The third device sends a broadcast notification to other devices in the multi-device collaboration system that the target file has been successfully received.
- FIG. 5D is a schematic diagram of sending a target file (taking 3 devices as an example) in a multi-device collaboration system. If the user selects the C file on the first interface, drags the C file along the track 5, and releases the drag operation on the computer screen interface, the tablet will control the mobile phone to send the C file to the computer. After the computer successfully receives the C file, it will notify the tablet and the mobile phone in the multi-device collaboration system that the C file has been successfully received, so that they no longer need to wait to receive it.
- each device can send or receive the target file without disconnecting the collaboration. This avoids the need, when transferring a file between two devices that have not established collaboration, to first disconnect the collaboration established with other devices and then re-establish a new collaboration for the transfer, thereby realizing convenient file transfer across multiple devices, improving the efficiency of file resource transfer under multi-device collaboration, simplifying user operations, and improving user experience.
- FIG. 6 is a detailed flowchart of a file transfer method in an embodiment of the present application. Below, in conjunction with FIG. 6 and based on the above-mentioned file transfer system architecture, the file transfer method in the embodiment of the present application is described from the interactive sides of the first device and the third device (eg, one of the M second devices, etc.). It should be noted that, in order to describe the file transfer method in the embodiments of the present application in more detail, each process step describes the corresponding execution body as the first device or the third device, but this does not mean that the corresponding method flow can only be performed by the described execution body.
- Step S601 The first device displays a first interface.
- the first interface includes a display screen interface of the first device, and collaboration windows corresponding to M second devices that establish collaboration with the first device, where M is an integer greater than or equal to 0.
- the first device may display the first interface through the display screen 194 .
- FIG. 7A is a schematic diagram of a multi-device collaboration system (taking 5 devices as an example), the system includes five electronic devices, and the five electronic devices are a computer, a tablet, a mobile phone 1, and a mobile phone 2. and mobile phone 3.
- a collaboration is established between the mobile phone 1 and the computer and the collaboration window of the mobile phone 1 is displayed on the computer, and a collaboration is established between the mobile phone 2 and the computer and the collaboration window of the mobile phone 2 is displayed on the computer.
- the computer can be used as the first device, and the first interface can be displayed on the computer; the value of M is 2 at this time, and the M second devices include the mobile phone 1 and the mobile phone 2.
- there are collaboration windows of the two devices on the screen of the computer, and other positions on the screen of the computer except the positions of the collaboration windows are the display interface of the first device.
- Step S602 The first device receives a first drag operation on the target file acting on the first interface.
- the target file is a file to be transferred
- the first drag operation may include, but is not limited to, dragging and dropping the target file by using the touch sensor 180K touch screen of the first device, and using peripheral devices such as a mouse to drag the target file.
- FIG. 7A is a schematic diagram of a multi-device collaboration system (taking 5 devices as an example); the system includes five electronic devices, which are respectively a computer, a tablet, a mobile phone 1, a mobile phone 2, and a mobile phone 3. It should be noted that a collaboration is established between the mobile phone 1 and the computer with the collaboration window of the mobile phone 1 on the computer, and a collaboration is established between the mobile phone 2 and the computer with the collaboration window of the mobile phone 2 on the computer.
- at the same time, the computer and the tablet establish collaboration, and the mobile phone 3 establishes collaboration with the tablet, with the collaboration window of the mobile phone 3 on the tablet.
- the computer can be used as the first device
- the target files can be A file, B file, D file, A4 file, or B4 file.
- when the target file is a file stored on the first device, the starting position of the first drag operation is on the display screen interface; when the target file is a file stored on an initiating second device among the M second devices, the starting position of the first drag operation is in the collaboration window corresponding to the initiating second device.
- the first interface includes the display screen interface of the first device and the collaboration windows corresponding to the M second devices, and the target file is dragged on the first interface. If the target file is displayed on the display screen interface of the first device and is dragged on the display screen interface of the first device, instead of being dragged in the collaboration windows corresponding to the M second devices, it means that the target file is stored on the first device, and the first device can send the target file.
- FIG. 7B is a schematic diagram of a target file (taking 5 devices as an example) in a multi-device collaboration system.
- the system includes five electronic devices, and the five electronic devices are a computer, a tablet, a mobile phone 1, a mobile phone 2, and a mobile phone 3.
- Collaboration is established between mobile phone 1 and the computer and there is a collaboration window for mobile phone 1 on the computer
- mobile phone 2 establishes collaboration with the computer and there is a collaboration window for mobile phone 2 on the computer
- the computer and tablet establish collaboration
- mobile phone 3 establishes collaboration with the tablet and there is a collaboration window for the mobile phone 3 on the tablet.
- the computer can be used as the first device.
- when the user needs to drag the A4 file, since the A4 file is on the computer screen interface, it can be considered that the A4 file is stored on the computer, and the computer can receive the drag operation on the A4 file acting on the computer screen interface.
- when the user needs to drag the A file, B file, or D file, the initiating second device is the mobile phone 1, because the A file, B file, and D file are stored on the mobile phone 1, and the computer can receive the drag operation acting on the A file, B file, and D file in the collaboration window on the computer screen.
- Step S603 The first device displays a drag effect matching the fourth device.
- the fourth device is a device through which the target file is dragged and moved.
- the fourth device may be any device among the M second devices.
- first information of the target file is acquired, where the first information includes one or more of file type information, file quantity information, and file arrangement order information of the target file; a drag effect set of the target file is generated according to the first information, and a drag effect matching the fourth device is displayed according to the drag effect set of the target file, where the fourth device is the device that the drag track of the first drag operation passes through, or the device corresponding to a collaboration window on the passing device. Specifically, when the user selects the target file and performs a dragging operation on the target file, the target file moves along the dragging track.
- FIG. 7C is a schematic diagram of a drag effect (taking 5 devices as an example) in a multi-device collaborative system.
- the system includes five electronic devices, and the five electronic devices are a computer, a tablet, a mobile phone 1, a mobile phone 2, and a mobile phone 3.
- Collaboration is established between mobile phone 1 and the computer and there is a collaboration window for mobile phone 1 on the computer
- mobile phone 2 establishes collaboration with the computer and there is a collaboration window for mobile phone 2 on the computer
- the computer and tablet establish collaboration
- mobile phone 3 establishes collaboration with the tablet and there is a collaboration window for the mobile phone 3 on the tablet.
- a computer can be used as the first device.
- the user selects the A4 file on the computer screen interface, and moves the A4 file along the dragging track. The track can pass through the collaboration window of the mobile phone 2.
- the mobile phone 2 can be used as the fourth device.
- during the moving process, a dragging effect suitable for the operating system of the device being passed through will be displayed.
- the dragging effect may be a file shadow effect generated according to the A4 file.
- Step S604 The first device notifies other devices among the N devices to monitor the release position of the first drag operation.
- when a target file is dragged on the first interface, the first device will send a broadcast message to notify other devices in the system that there is a target file to be shared and indicate the source of the target file. All devices in the system listen for the position where the drag operation is released, so that all devices in the system are ready to receive the target file.
- the system includes five electronic devices, and the five electronic devices are a computer, a tablet, a mobile phone 1 , a mobile phone 2 , and a mobile phone 3 .
- Collaboration is established between mobile phone 1 and the computer and there is a collaboration window for mobile phone 1 on the computer, mobile phone 2 establishes collaboration with the computer and there is a collaboration window for mobile phone 2 on the computer, at the same time, the computer and tablet establish collaboration, and mobile phone 3 establishes collaboration with the tablet and there is a collaboration window on the tablet.
- the computer can be used as the first device.
- the computer will send a broadcast message to notify the mobile phone 1, mobile phone 2, mobile phone 3, and tablet in the multi-device collaboration system to start monitoring the position where the drag operation is released, so that all devices in this multi-device collaboration system are ready to receive the A4 file.
- Step S605 The third device receives a notification initiated by the first device to monitor the release position of the first drag operation.
- the third device is a device corresponding to the release position of the first drag operation.
- the system includes five electronic devices, and the five electronic devices are a computer, a tablet, a mobile phone 1 , a mobile phone 2 , and a mobile phone 3 .
- Collaboration is established between mobile phone 1 and the computer and there is a collaboration window for mobile phone 1 on the computer
- mobile phone 2 establishes collaboration with the computer and there is a collaboration window for mobile phone 2 on the computer
- the computer and tablet establish collaboration
- mobile phone 3 establishes collaboration with the tablet and there is a collaboration window for the mobile phone 3 on the tablet.
- the computer can be used as the first device.
- the user initiates a drag-and-drop operation on the A4 file on the computer screen interface on the first interface
- the mobile phone 1, mobile phone 2, mobile phone 3, tablet and computer in the multi-device collaboration system can all be used as the third device.
- the mobile phone 3 will receive the notification initiated by the computer to monitor the release position of the first drag operation, so as to know that there is an A4 file to be shared on the computer, and remains ready to receive the A4 file.
- Step S606 The third device monitors the release position of the first drag operation.
- after learning that there is a target file to be shared on the first device, the third device always monitors whether the release position of the first drag operation corresponds to its own device.
- the system includes five electronic devices, and the five electronic devices are a computer, a tablet, a mobile phone 1 , a mobile phone 2 , and a mobile phone 3 .
- Collaboration is established between mobile phone 1 and the computer and there is a collaboration window for mobile phone 1 on the computer
- mobile phone 2 establishes collaboration with the computer and there is a collaboration window for mobile phone 2 on the computer
- the computer and tablet establish collaboration
- mobile phone 3 establishes collaboration with the tablet and there is a collaboration window for the mobile phone 3 on the tablet.
- the computer can be used as the first device.
- the user initiates a drag-and-drop operation on the A4 file on the computer screen interface on the first interface, and the mobile phone 1, mobile phone 2, mobile phone 3, tablet and computer in the multi-device collaboration system can all be used as the third device.
- the mobile phone 3 will always monitor the position where the A4 file is released, and receive the A4 file when the A4 file is released on its own collaboration window.
- Step S607 The first device detects the release position of the first drag operation.
- the release position may include the display screen interface of any one of the N devices or a collaboration window.
- the third device is the device corresponding to the release position of the first drag operation.
- FIG. 7D is a schematic diagram of the release position of the first drag operation (taking 5 devices as an example) in the multi-device collaboration system.
- the user initiates a drag operation on the D file in the collaboration window of the mobile phone 1 on the first interface; the D file can be released on the collaboration window of the mobile phone 2 along the track 8, and the computer will detect that the release position is the collaboration window of the mobile phone 2, which means that the D file needs to be sent to the mobile phone 2.
- the D file can also be released on the display screen interface of the tablet along the track 7, and the computer will detect that the release position is the tablet display interface.
- the third device is a tablet, which means that the D file needs to be sent to the tablet.
- Step S608 The first device determines the storage path of the device matching the release position to receive the target file.
- the target file to be shared will be sent to the device whose release position matches, and then stored in the determined storage path.
- the file information of the target file is obtained, where the file information includes the file name, file content, and file size information of the target file; it is judged whether the device matching the release position satisfies the condition for receiving the target file, and if the condition is satisfied, the storage path for the device matching the release position to receive the target file is determined.
- the sending device needs to determine whether the third device has sufficient storage space to store the target file.
- when the sending device obtains the file information of the target file, optionally, the sending device first sends the size of the target file to the third device to determine whether that device has space to store the target file; if there is sufficient storage space, the storage path of the target file on the third device is determined, and the sending device can send the target file to the storage path.
- after obtaining the file size of the A4 file, the computer sends the file size to the other devices in the multi-device collaboration system.
- each device will calculate its remaining storage space in advance to determine whether it can receive the target file; if it cannot receive the target file, it will give a notification that it cannot be received.
- after receiving the prompt from the device matching the release position, the computer can determine the storage path where the target file will be stored on that device, provided the device has storage space to store the A4 file.
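The receive-side check described above — compare the announced file size against the receiver's free space, and if it fits, report back a storage path — might look like the following sketch. The directory names are assumed examples, not paths from the embodiment.

```python
# Sketch of the storage-space check and storage-path decision on the receiver.
import os
import shutil

def can_receive(file_size, directory):
    """Return True if the directory's filesystem has room for the file."""
    return file_size <= shutil.disk_usage(directory).free

def storage_path(file_name, directory):
    """Storage path reported back to the sender for the incoming file."""
    return os.path.join(directory, file_name)

ok = can_receive(1024, ".")                     # a 1 KiB file almost always fits
path = storage_path("A4.doc", "downloads")      # assumed download directory
```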
- Step S609 The first device controls sending the target file to the device matching the release position among the N devices.
- the controlling of sending the target file to the device matching the release position among the N devices includes: the first device sending the target file to the device matching the release position, and the first device controlling another device to send the target file to the device matching the release position.
- the target file is a file stored on the first device; the controlling of sending the target file to the device matching the release position among the N devices includes: when it is detected that the release position of the first drag operation is on the display screen interface of the third device, or when it is detected that the release position of the first drag operation is in a collaboration window of the third device on another device among the N devices, controlling the first device to send the target file to the third device; the third device includes a device among the N devices that has not established collaboration with the first device.
- FIG. 7E is a schematic diagram of file dragging (taking 5 devices as an example) in a multi-device collaborative system.
- the system includes five electronic devices, and the five electronic devices are a computer, a tablet, a mobile phone 1, a mobile phone 2, and a mobile phone 3.
- Collaboration is established between mobile phone 1 and the computer and there is a collaboration window for mobile phone 1 on the computer
- mobile phone 2 establishes collaboration with the computer and there is a collaboration window for mobile phone 2 on the computer
- the computer and tablet establish collaboration
- mobile phone 3 establishes collaboration with the tablet and there is a collaboration window for the mobile phone 3 on the tablet.
- the computer can be used as the first device.
- the user selects the target file A4 file on the computer screen interface, and can move the A4 file along the track 9, and finally release it in the collaboration window of the mobile phone 3.
- the controlling the first device to send the target file to the third device includes establishing a data transmission channel, where the data transmission channel is used to transmit the file information of the target file. If the first device establishes a direct connection with the third device, the file information is sent to the storage path of the third device through the data transmission channel; if the first device establishes an indirect connection with the third device, the file information is sent to a relay device through the data transmission channel, and the relay device forwards the file information to the storage path of the third device, wherein the relay device is a device that establishes a direct connection with the first device and establishes a direct connection with the third device at the same time.
- the first device is a sending device and the third device is a receiving device
- The first device and the third device can establish either a direct connection or an indirect connection under different networking modes. If the first device establishes a direct connection with the third device (for example, when ad hoc networking technology is used to connect the devices), the first device can send the file information of the target file directly to the third device.
- If the first device does not establish a direct connection with the third device but instead establishes an indirect connection through a relay device (a device among the N devices in the multi-device collaborative system that can establish a direct connection with both the first device and the third device; for example, when WiFi-P2P networking is used), the first device first sends the file information of the target file to the relay device, and the relay device forwards the file information to the third device, thereby achieving file transfer among multiple devices.
- The user selects the target file A4 on the computer screen interface, can move the A4 file along track 9, and finally releases it in the collaboration window of mobile phone 3.
- The target file (the A4 file) is stored on the computer, and the device that finally receives the A4 file is mobile phone 3.
- The computer and the tablet can first establish a data transmission channel, over which the computer sends the file information of the A4 file to the tablet; the tablet then establishes a data transmission channel with mobile phone 3 and forwards the file information of the A4 file to mobile phone 3.
- all devices in the multi-device collaboration system can establish direct connections.
- If the computer can establish a direct connection with mobile phone 3, the computer establishes a data transmission channel directly with mobile phone 3 and sends the file information of the A4 file directly to mobile phone 3.
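The direct-versus-relay decision described above can be sketched in a few lines. The following Python snippet is purely illustrative (the device names and the link representation are assumptions, not part of the embodiment): it returns a direct route when sender and receiver share a connection, and otherwise searches for a relay device that is directly connected to both ends.

```python
def choose_route(sender, receiver, direct_links):
    """Return the device chain the file information travels through.

    direct_links is a set of frozensets, each naming a pair of devices
    that hold a direct connection in the multi-device collaborative system.
    """
    if frozenset((sender, receiver)) in direct_links:
        return [sender, receiver]                      # direct connection
    for relay in {d for pair in direct_links for d in pair}:
        if (frozenset((sender, relay)) in direct_links
                and frozenset((relay, receiver)) in direct_links):
            return [sender, relay, receiver]           # indirect, via a relay
    return None                                        # no one-hop route available

# Example matching FIG. 7E: the computer and mobile phone 3 are not directly
# connected, but the tablet is directly connected to both.
links = {frozenset(p) for p in [("computer", "tablet"),
                                ("tablet", "phone3"),
                                ("computer", "phone1"),
                                ("computer", "phone2")]}
print(choose_route("computer", "phone3", links))  # ['computer', 'tablet', 'phone3']
```

Only one-hop relays are considered here; a real system could of course search for longer chains, but the embodiment's examples all use a single relay.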
- The target file is a file stored on an initiating second device among the M second devices. Controlling the sending of the target file to the device among the N devices that matches the release position includes: when it is detected that the release position of the first drag operation is on the display screen interface of the third device, or within a collaboration window of the third device on another of the N devices, controlling the initiating second device to send the target file to the third device.
- the third device may be one device among the M second devices.
- The user in the figure selects the target file (the D file) in the collaboration window of mobile phone 1, can move the D file along track 8, and finally releases it in the collaboration window of mobile phone 2.
- Mobile phone 1 is the initiating second device, and mobile phone 2 is the third device.
- The user can select the target file (the D file) in the collaboration window of mobile phone 1, move the D file along track 10, and finally release it on the screen interface of the tablet.
- Mobile phone 1 is the initiating second device, and the tablet is the third device.
- Controlling the initiating second device to send the target file to the third device includes: establishing a data transmission channel, where the data transmission channel is used to transmit the file information of the target file. If the initiating second device establishes a direct connection with the third device, the file information is sent through the data transmission channel to the storage path on the device matching the release position. If the initiating second device establishes an indirect connection with the third device, the file information is sent through the data transmission channel to a relay device, and the relay device forwards the file information to the storage path on the device matching the release position, where the relay device is a device that establishes a direct connection with both the initiating second device and the third device.
- the connection of more devices can be realized without disconnecting the collaboration among the devices.
- different networking technologies may have different networking modes, which may lead to changes in the connection relationship between devices.
- The initiating second device serves as the sending device (the device that stores the target file, such as the first device or one of the M second devices).
- Under different networking modes, the initiating second device and the third device may establish either a direct connection or an indirect connection. If the initiating second device can establish a direct connection with the third device (for example, using ad hoc networking technology), the initiating second device can send the file information of the target file directly to the third device.
- If the initiating second device does not establish a direct connection with the third device but establishes an indirect connection through a relay device such as the first device (for example, using WiFi-P2P networking to achieve one-to-many connections between devices), the initiating second device first sends the file information of the target file to the relay device, and the relay device forwards the file information to the third device, thereby achieving file transfer among multiple devices.
- FIG. 7E is a schematic diagram of file dragging in a five-device system. In the figure, the user selects the target file (the D file) in the collaboration window of mobile phone 1, can move the D file along track 8, and finally releases it in the collaboration window of mobile phone 2.
- The initiating second device is mobile phone 1.
- the third device is the mobile phone 2
- it can be determined that the target file D file is stored on the mobile phone 1
- the device that finally receives the D file is the mobile phone 2.
- Mobile phone 1 and mobile phone 2 cannot establish a direct connection, but mobile phone 1 can establish a direct connection with the computer, and the computer can also establish a direct connection with mobile phone 2.
- Mobile phone 1 can first establish a data transmission channel with the computer and send the file information of the D file to the computer; the computer then establishes a data transmission channel with mobile phone 2 and forwards the file information of the D file to mobile phone 2.
- all devices in the multi-device collaborative system can establish direct connections.
- If mobile phone 1 can establish a direct connection with mobile phone 2, mobile phone 1 establishes a data transmission channel directly with mobile phone 2 and sends the file information of the D file directly to mobile phone 2.
- Step S6010: The third device receives the target file sent under the control of the first device.
- the third device may store the target file in the device.
- Receiving the target file sent under the control of the first device includes: establishing a data transmission channel with the device storing the target file and receiving the file information of the target file, where the file information includes the file name, file content, and file size of the target file.
- The third device receives the size information of the target file from the sending device and, after judging that it has enough space to receive the target file, may receive the file information of the target file. For example, as shown in FIG.
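The size check above can be pictured as a small negotiation on the receiving side: the sender transmits only the file size first, the receiver compares it against its free space, and only on success is a storage path returned. The function and directory names below are invented for illustration; the embodiment does not prescribe an API.

```python
import os
import shutil

def negotiate_receive(file_name, file_size, dest_dir):
    """Accept the transfer only if the destination has enough free space;
    on success, return the storage path the sender should write to."""
    free = shutil.disk_usage(dest_dir).free
    if file_size > free:
        return None  # reject: the target file would not fit
    return os.path.join(dest_dir, file_name)
```

Only after such a check succeeds would the sender stream the remaining file information (name and content) over the data transmission channel.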
- Step S6011: The third device broadcasts a notification to the other devices in the multi-device collaboration system that the target file has been successfully received.
- After the third device successfully receives the target file, it broadcasts a notification to the other devices in the multi-device collaboration system that the target file has been successfully received, so that the other devices no longer need to wait to receive it. For example, as shown in FIG. 7E, if the user selects the D file on the first interface, drags the D file along track 8, and releases the drag operation in the collaboration window of mobile phone 2, then mobile phone 2 acts as the third device, and the computer controls mobile phone 1 to send the D file to mobile phone 2. After mobile phone 2 successfully receives the D file, it notifies mobile phone 1, the computer, the tablet, and mobile phone 3 in the multi-device collaboration system that the D file has been successfully received and there is no need to wait any longer.
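The completion broadcast in step S6011 might look like the following sketch, where the receiver produces one notification per peer so that no other device keeps waiting. The message format and device names are illustrative assumptions, not part of the embodiment.

```python
def broadcast_received(receiver, devices, file_name):
    """Build the 'file received' notification for every device except the
    receiver itself, mirroring the broadcast in step S6011."""
    return {peer: f"{file_name} successfully received by {receiver}"
            for peer in devices if peer != receiver}

# Example matching FIG. 7E: phone2 received the D file and notifies the rest.
system = ["computer", "tablet", "phone1", "phone2", "phone3"]
notices = broadcast_received("phone2", system, "D")
```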
- FIG. 8 is a schematic structural diagram of a file transmission apparatus provided by an embodiment of the present invention.
- The file transmission apparatus 80 may include a first display unit 801, a first receiving unit 802, a sending unit 803, a first processing unit 804, a second processing unit 805, a second receiving unit 806, a third receiving unit 807, a third processing unit 808, and a second display unit 809; the detailed description of each module is as follows.
- the multi-device coordination system includes N devices, and any one of the N devices establishes cooperation with at least one other device in the N devices;
- the first device is any one of the N devices;
- N is an integer greater than 2; the device includes:
- The first display unit 801 is used for the first device to display a first interface; the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device; M is an integer greater than or equal to 0;
- a first receiving unit 802 configured to receive a first drag operation acting on the target file on the first interface
- a sending unit 803, configured to notify other devices in the N devices to monitor the release position of the first drag operation; the release position includes the interface of any one of the N devices or the collaboration window;
- a first processing unit 804 configured to detect the release position of the first drag operation
- the sending unit 803 is further configured to control sending the target file to the device matching the release position among the N devices.
- The target file is a file stored on the first device; the first processing unit 804 is specifically configured to: when it is detected that the release position of the first drag operation is on the display screen interface of a third device, or within a collaboration window of the third device on another of the N devices, control the first device to send the target file to the third device; the third device includes a device among the N devices that has not established collaboration with the first device.
- the apparatus further includes: a second receiving unit 806, configured to acquire file information of the target file; the file information includes the file name, file content, and file size of the target file information; a second processing unit for judging whether the device matching the release position satisfies the condition for receiving the target file; if so, determining the storage path for the device matching the release position to receive the target file.
- The first processing unit 804 is further configured to establish a data transmission channel, where the data transmission channel is configured to transmit the file information of the target file. The sending unit 803 is further configured to: if the first device establishes a direct connection with the third device, send the file information through the data transmission channel to the storage path on the third device; and if the first device establishes an indirect connection with the third device, send the file information through the data transmission channel to a relay device, which forwards the file information to the storage path on the third device, where the relay device is a device that establishes a direct connection with both the first device and the third device.
- The target file is a file stored on an initiating second device among the M second devices; the first processing unit 804 is specifically configured to: when it is detected that the release position of the first drag operation is on the display screen interface of the third device, or within a collaboration window of the third device on another of the N devices, control the initiating second device to send the target file to the third device.
- The first processing unit 804 is further configured to establish a data transmission channel, where the data transmission channel is configured to transmit the file information of the target file. The sending unit 803 is further configured to: if the initiating second device establishes a direct connection with the third device, send the file information through the data transmission channel to the storage path on the third device; and if the initiating second device establishes an indirect connection with the third device, send the file information through the data transmission channel to a relay device, which forwards the file information to the storage path on the third device, where the relay device is a device that establishes a direct connection with both the initiating second device and the third device.
- The apparatus further includes: a third receiving unit 807, configured to acquire first information of the target file, where the first information includes one or more of file type information, file quantity information, and file arrangement order information of the target file; a third processing unit 808, configured to generate a drag effect set for the target file according to the first information; and a second display unit 809, configured to display, from the drag effect set of the target file, the drag effect matching a fourth device, where the fourth device is a device that the drag track of the first drag operation passes through, or the device corresponding to a collaboration window on a device the track passes through.
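One way to picture the drag effect set and the per-device match is the sketch below. The effect names and OS labels are invented for illustration; the embodiment only requires that the displayed effect match the device the drag track passes over.

```python
def build_drag_effect_set(file_type, file_count):
    """Generate candidate drag effects from the first information
    (file type and file quantity), one per supported operating system."""
    label = f"{file_count} x {file_type}"
    return {"android": ("thumbnail-stack", label),
            "windows": ("ghost-icon", label)}

def effect_for_device(effect_set, device_os):
    """Pick the effect that matches the fourth device's operating system."""
    return effect_set.get(device_os)

# Dragging three images across a phone (Android) and a PC (Windows):
effects = build_drag_effect_set("image", 3)
```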
- For the functions of each functional unit in the file transmission apparatus 80 described in this embodiment of the present invention, reference may be made to the related descriptions of step S401, step S402, step S403, step S406, and step S407 in the method embodiment of FIG. 4 above; details are not repeated here.
- FIG. 9 is a schematic structural diagram of another file transmission apparatus provided by an embodiment of the present invention.
- the file transmission apparatus 90 may include a receiving unit 901, a processing unit 902, and a sending unit 903.
- The detailed description of each module is as follows. The apparatus is applied to a third device in a multi-device coordination system, wherein the multi-device coordination system includes N devices, and any one of the N devices establishes cooperation with at least one other device among the N devices;
- the third device is any one of the N devices; N is an integer greater than 2; the device includes:
- A receiving unit 901, configured to receive a notification initiated by a first device to monitor the release position of a first drag operation, where the first drag operation initiates a drag on a target file on the first interface, and the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device, M being an integer greater than or equal to 0; a processing unit 902, configured to monitor the release position of the first drag operation; the receiving unit 901 being further configured to receive the target file sent under the control of the first device; and a sending unit 903, configured to broadcast a notification to the other devices in the multi-device collaboration system that the target file has been successfully received.
- the processing unit 902 is further configured to establish a data transmission channel with the device storing the target file; the receiving unit 901 is further configured to receive file information of the target file;
- the file information includes the file name, file content, and file size information of the target file.
- Embodiments of the present invention further provide a computer storage medium, wherein the computer storage medium may store a program, and when the program is executed, the program includes part or all of the steps of any one of the file transmission methods described in the above method embodiments.
- the embodiment of the present invention also provides a computer program, the computer program includes instructions, when the computer program is executed by the computer, the computer can execute part or all of the steps of any file transfer method.
- the disclosed apparatus may be implemented in other manners.
- the device embodiments described above are only illustrative.
- The division of the above units is only a logical function division; in actual implementation there may be other divisions. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
- the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical or other forms.
- the above-mentioned units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
- each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
- the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
- the integrated units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium.
- The technical solutions of the present application, in essence, or the parts contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like, and specifically a processor in the computer device) to execute all or part of the steps of the foregoing methods in the various embodiments of the present application.
- The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).
Abstract
Embodiments of the present invention disclose a file transfer method and related device, applied to a first device in a multi-device collaborative system, where the system includes N devices, any one of the N devices establishes collaboration with at least one other of the N devices, the first device is any one of the N devices, and N is an integer greater than 2. The method includes: the first device displays a first interface; receives a first drag operation acting on a target file on the first interface; notifies the other devices among the N devices to monitor the release position of the first drag operation; detects the release position of the first drag operation; and controls sending of the target file to the device among the N devices that matches the release position. The embodiments of the present invention enable file transfer across multiple devices, improving file transfer efficiency under multi-device collaboration and simplifying user operations.
Description
This application claims priority to Chinese Patent Application No. 202011045443.4, filed with the China National Intellectual Property Administration on September 28, 2020 and entitled "一种文件传输的方法及相关设备" ("A File Transfer Method and Related Device"), which is incorporated herein by reference in its entirety.

The present invention relates to the field of terminals, and in particular, to a file transfer method and related device.

With advances in technology, smart terminal devices such as smartphones, personal computers, and tablets are used ever more widely. In real office scenarios, users want to share file resources across different smart terminal devices, achieve multi-screen interaction between devices, and effectively improve the efficiency of transferring file resources among multiple devices.

At present, the prior art offers two common file-sharing methods, which establish connections between devices using device collaboration technology or WiFi-P2P direct connection technology. With device collaboration technology, once two devices have established collaboration, a file can be dragged and transferred across the devices; for example, a file drag is initiated on a first device, the dragged file carries related information to a second device when it leaves the first device, and the second device processes it accordingly. With WiFi-P2P technology, a point-to-point connection is established between smart terminal devices; files with larger data volumes are transmitted over the WiFi network, and files with smaller data volumes are transmitted over the Bluetooth channel.

However, both of the above file-sharing techniques are usually established between only two devices, and file dragging and file transfer take place only between those two devices. For some diverse or complex application scenarios (such as file sharing among multiple devices), they cannot yet provide a better file-sharing experience.
SUMMARY

The technical problem to be solved by the embodiments of the present invention is to provide a file transfer method and related device that improve file-sharing efficiency among multiple devices and the user experience.
According to a first aspect, a file transfer method is provided, applied to a first device in a multi-device collaborative system, where the multi-device collaborative system includes N devices, any one of the N devices establishes collaboration with at least one other of the N devices, the first device is any one of the N devices, and N is an integer greater than 2. The method includes: the first device displays a first interface, where the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device, M being an integer greater than or equal to 0; receiving a first drag operation acting on a target file on the first interface; notifying the other devices among the N devices to monitor the release position of the first drag operation, where the release position includes the interface of any one of the N devices or a collaboration window; detecting the release position of the first drag operation; and controlling sending of the target file to the device among the N devices that matches the release position.
In this embodiment of the present invention, a multi-device collaborative system may contain three or more devices. When files need to be transferred between devices, the target file is selected on the first interface of the first device and a drag operation is performed on it; the drag operation includes, but is not limited to, dragging the target file on a touchscreen or dragging it with a peripheral such as a mouse. At the same time, the first device broadcasts a message notifying all other devices in the system that a target file is awaiting sharing, so that every device in the system monitors the position at which the drag operation is released and is ready to receive the target file. Further, after detecting the release position of the drag operation, the first device can control sending of the target file to the device matching that release position. By implementing the method of this embodiment, every device in the multi-device collaborative system can send or receive the target file without breaking collaboration; this avoids the situation where, to transfer a file between two devices that have not established collaboration, existing collaborations with other devices must be torn down and new ones established, thereby achieving file transfer across multiple devices, improving file transfer efficiency under multi-device collaboration, and simplifying user operations.
In a possible implementation, the target file is a file stored on the first device, and the starting position of the first drag operation is on the display screen interface; or the target file is a file stored on an initiating second device among the M second devices, and the starting position of the first drag operation is in the collaboration window corresponding to the initiating second device. In this embodiment of the present invention, the first interface includes the display screen interface of the first device and the collaboration windows corresponding to the M second devices. When a drag operation is performed on the target file on the first interface, if the target file is on the display screen interface of the first device and is dragged there rather than in one of the collaboration windows, this indicates that the target file is stored on the first device, and it is determined that the first device can send the target file. In the other case, if the target file is in one of the collaboration windows and is dragged there rather than on the display screen interface of the first device, this indicates that the target file is stored on the device corresponding to that collaboration window, and it is determined that the first device can control the device corresponding to that collaboration window to send the target file.
In a possible implementation, the target file is a file stored on the first device, and controlling the sending of the target file to the device among the N devices that matches the release position includes: when it is detected that the release position of the first drag operation is on the display screen interface of a third device, or within a collaboration window of the third device on another of the N devices, controlling the first device to send the target file to the third device, where the third device includes a device among the N devices that has not established collaboration with the first device. In this embodiment of the present invention, after the target file is dragged on the first interface of the first device, if it is released on the display screen interface of any device in the multi-device collaborative system or on the collaboration window corresponding to a device, the target file is to be sent to the device matching that release position, and that device need not have established collaboration with the first device. For example, when a tablet has established collaboration with a mobile phone, with the phone's collaboration window shown on the tablet, and the tablet has also established collaboration with a computer, the user can drag a file to be shared from the phone's collaboration window on the tablet screen and send it to the computer. Thus, without the first device breaking its collaboration with other devices, files can be transferred to devices with which no collaboration has been established, simply by dragging the target file.
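The mapping from a release position to the receiving device described above can be sketched as follows; the tuple encoding and window identifiers are assumptions made for illustration only. A release on a device's own screen targets that device, while a release inside a collaboration window targets the device the window mirrors.

```python
def resolve_release(release, collab_windows):
    """Map a release position to the device that should receive the file.

    release is ("screen", device) or ("window", window_id); collab_windows
    maps each collaboration window to the device it mirrors.
    """
    kind, where = release
    if kind == "screen":
        return where                 # released on a device's own display
    return collab_windows[where]     # released in a collaboration window

# A window on the tablet mirroring mobile phone 3:
windows = {"phone3-window-on-tablet": "phone3"}
```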
In a possible implementation, the method further includes: acquiring file information of the target file, where the file information includes the file name, file content, and file size of the target file; judging whether the device matching the release position satisfies the conditions for receiving the target file; and if so, determining the storage path under which the device matching the release position receives the target file. In this embodiment of the present invention, when two devices share a file through a drag operation on the target file, the sending device (the device storing the target file, such as the first device or one of the M second devices) needs to judge whether the receiving device has sufficient storage space for the target file. Therefore, after acquiring the file information of the target file, the sending device may first send the size of the target file to the receiving device to judge whether that device has space to store it; if there is sufficient storage space, the storage path of the target file on the receiving device is determined, and the sending device can then successfully send the target file to that storage path.
In a possible implementation, controlling the first device to send the target file to the third device includes: establishing a data transmission channel, where the data transmission channel is used to transmit the file information of the target file; if the first device establishes a direct connection with the third device, sending the file information through the data transmission channel to the storage path on the third device; and if the first device establishes an indirect connection with the third device, sending the file information through the data transmission channel to a relay device, which forwards the file information to the storage path on the third device, where the relay device is a device that establishes a direct connection with both the first device and the third device. In this embodiment of the present invention, because different networking technologies are used in the multi-device collaborative system, more devices can be connected without breaking collaboration between devices; however, different networking technologies may use different networking modes, which can change the connection relationships between devices. When the first device is the sending device and the third device is the receiving device, the first and third devices may establish either a direct or an indirect connection depending on the networking mode. If the first device establishes a direct connection with the third device (for example, using ad hoc networking technology to connect the devices), the first device can send the file information of the target file directly to the third device. If the first device does not establish a direct connection with the third device but establishes an indirect connection through a relay device (a device among the N devices that can establish direct connections with both the first and third devices; for example, using WiFi-P2P networking to achieve one-to-many connections between devices), the first device first sends the file information of the target file to the relay device, and the relay device forwards it to the third device, thereby achieving file transfer across multiple devices.
In a possible implementation, the target file is a file stored on an initiating second device among the M second devices, and controlling the sending of the target file to the device among the N devices that matches the release position includes: when it is detected that the release position of the first drag operation is on the display screen interface of a third device, or within a collaboration window of the third device on another of the N devices, controlling the initiating second device to send the target file to the third device. In this embodiment of the present invention, after the target file in a collaboration window is dragged on the first interface, if it is released on the display screen interface of any device in the multi-device collaborative system or on the collaboration window corresponding to that device, the target file is to be sent to the device matching that release position. Optionally, the third device may be one of the M second devices. For example, when a computer has established collaboration with both a phone and a tablet, with collaboration windows for the phone and the tablet on the computer, and the phone and the tablet need not establish collaboration with each other, the user can select a file in the tablet's collaboration window and drag it into the phone's collaboration window, so that a file on the tablet is shared directly to the phone through a drag operation. Thus, without breaking collaboration with other devices, files can be transferred to devices with which no collaboration has been established simply by dragging the target file, streamlining the file transfer operation.
In a possible implementation, controlling the initiating second device to send the target file to the third device includes: establishing a data transmission channel, where the data transmission channel is used to transmit the file information of the target file; if the initiating second device establishes a direct connection with the third device, sending the file information through the data transmission channel to the storage path on the device matching the release position; and if the initiating second device establishes an indirect connection with the third device, sending the file information through the data transmission channel to a relay device, which forwards the file information to the storage path on the device matching the release position, where the relay device is a device that establishes a direct connection with both the initiating second device and the third device. In this embodiment of the present invention, because different networking technologies are used in the multi-device collaborative system, more devices can be connected without breaking collaboration between devices; however, different networking technologies may use different networking modes, which can change the connection relationships between devices. When the initiating second device serves as the sending device, the initiating second device and the third device may establish either a direct or an indirect connection depending on the networking mode. If the initiating second device can establish a direct connection with the third device (for example, using ad hoc networking technology), it can send the file information of the target file directly to the third device; if it does not establish a direct connection with the third device but establishes an indirect connection through a relay device such as the first device (for example, using WiFi-P2P networking to achieve one-to-many connections between devices), the initiating second device first sends the file information of the target file to the relay device, and the relay device forwards it to the third device, thereby achieving file transfer across multiple devices.

In a possible implementation, the method further includes: acquiring first information of the target file, where the first information includes one or more of file type information, file quantity information, and file arrangement order information of the target file; generating a drag effect set for the target file according to the first information; and displaying, from the drag effect set of the target file, the drag effect matching a fourth device, where the fourth device is a device that the drag track of the first drag operation passes through, or the device corresponding to a collaboration window on a device the track passes through. In this embodiment of the present invention, during the drag operation the target file may be dragged across devices, and the drag track may pass through multiple devices, which are the fourth devices. Different devices may run different operating systems; for example, if the drag track passes through a phone and a computer, the phone may use the Android system while the computer uses the Windows system. For different operating systems, a file drag effect adapted to the operating system is displayed while the target file is being dragged. Using the method of this embodiment, after obtaining information related to the target file such as its file type and file quantity, a drag effect set containing one or more file drag effects can be generated from this information, and the matching file drag effect is then displayed according to the operating system of each device the drag track passes through.
According to a second aspect, a file transfer method is provided, applied to a third device in a multi-device collaborative system, where the multi-device collaborative system includes N devices, any one of the N devices establishes collaboration with at least one other of the N devices, the third device is any one of the N devices, and N is an integer greater than 2. The method includes: receiving a notification initiated by the first device to monitor the release position of a first drag operation, where the first drag operation initiates a drag on a target file on the first interface, and the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device, M being an integer greater than or equal to 0; monitoring the release position of the first drag operation; receiving the target file sent under the control of the first device; and broadcasting a notification to the other devices in the multi-device collaborative system that the target file has been successfully received. In this embodiment of the present invention, a multi-device collaborative system may contain three or more devices. When a file transfer is needed and sharing of a target file is initiated on the first device, the third device receives the notification initiated by the first device to monitor the release position of the target file and prepares to receive it. When it is detected that the target file is released on the third device, the first device can control sending of the target file to the third device, which receives it; after successfully receiving the target file, the third device broadcasts a notification to all other devices in the multi-device collaborative system that the target file has been successfully received, so they no longer need to wait to receive it. By implementing the method of this embodiment, every device in the multi-device collaborative system can receive the target file without breaking collaboration, avoiding the need to tear down existing collaborations and establish new ones when transferring files between two devices that have not established collaboration, thereby achieving file transfer across multiple devices, improving file transfer efficiency under multi-device collaboration, and simplifying user operations.

In a possible implementation, receiving the target file sent under the control of the first device includes: establishing a data transmission channel with the device storing the target file; and receiving file information of the target file, where the file information includes the file name, file content, and file size of the target file. In this embodiment of the present invention, the third device receives the size information of the target file from the sending device and, after judging that it has enough space to receive the target file, can receive the file information of the target file.
According to a third aspect, a file transfer apparatus is provided, applied to a first device in a multi-device collaborative system, where the multi-device collaborative system includes N devices, any one of the N devices establishes collaboration with at least one other of the N devices, the first device is any one of the N devices, and N is an integer greater than 2. The apparatus includes:

a first display unit, used for the first device to display a first interface, where the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device, M being an integer greater than or equal to 0;

a first receiving unit, configured to receive a first drag operation acting on a target file on the first interface;

a sending unit, configured to notify the other devices among the N devices to monitor the release position of the first drag operation, where the release position includes the interface of any one of the N devices or a collaboration window;

a first processing unit, configured to detect the release position of the first drag operation;

the sending unit being further configured to control sending of the target file to the device among the N devices that matches the release position.
In a possible implementation, the target file is a file stored on the first device; the first processing unit is specifically configured to: when it is detected that the release position of the first drag operation is on the display screen interface of a third device, or within a collaboration window of the third device on another of the N devices, control the first device to send the target file to the third device, where the third device includes a device among the N devices that has not established collaboration with the first device.

In a possible implementation, the apparatus further includes: a second receiving unit, configured to acquire file information of the target file, where the file information includes the file name, file content, and file size of the target file; and a second processing unit, configured to judge whether the device matching the release position satisfies the conditions for receiving the target file and, if so, determine the storage path under which the device matching the release position receives the target file.

In a possible implementation, the first processing unit is further configured to establish a data transmission channel, where the data transmission channel is used to transmit the file information of the target file; and the sending unit is further configured to: if the first device establishes a direct connection with the third device, send the file information through the data transmission channel to the storage path on the third device; and if the first device establishes an indirect connection with the third device, send the file information through the data transmission channel to a relay device, which forwards the file information to the storage path on the third device, where the relay device is a device that establishes a direct connection with both the first device and the third device.

In a possible implementation, the target file is a file stored on an initiating second device among the M second devices; the first processing unit is specifically configured to: when it is detected that the release position of the first drag operation is on the display screen interface of the third device, or within a collaboration window of the third device on another of the N devices, control the initiating second device to send the target file to the third device.

In a possible implementation, the first processing unit is further configured to establish a data transmission channel, where the data transmission channel is used to transmit the file information of the target file; and the sending unit is further configured to: if the initiating second device establishes a direct connection with the third device, send the file information through the data transmission channel to the storage path on the third device; and if the initiating second device establishes an indirect connection with the third device, send the file information through the data transmission channel to a relay device, which forwards the file information to the storage path on the third device, where the relay device is a device that establishes a direct connection with both the initiating second device and the third device.

In a possible implementation, the apparatus further includes: a third receiving unit, configured to acquire first information of the target file, where the first information includes one or more of file type information, file quantity information, and file arrangement order information of the target file; a third processing unit, configured to generate a drag effect set for the target file according to the first information; and a second display unit, configured to display, from the drag effect set of the target file, the drag effect matching a fourth device, where the fourth device is a device that the drag track of the first drag operation passes through, or the device corresponding to a collaboration window on a device the track passes through.
According to a fourth aspect, a file transfer apparatus is provided, applied to a third device in a multi-device collaborative system, where the multi-device collaborative system includes N devices, any one of the N devices establishes collaboration with at least one other of the N devices, the third device is any one of the N devices, and N is an integer greater than 2. The apparatus includes: a receiving unit, configured to receive a notification initiated by a first device to monitor the release position of a first drag operation, where the first drag operation initiates a drag on a target file on the first interface, and the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device, M being an integer greater than or equal to 0; a processing unit, configured to monitor the release position of the first drag operation; the receiving unit being further configured to receive the target file sent under the control of the first device; and a sending unit, configured to broadcast a notification to the other devices in the multi-device collaborative system that the target file has been successfully received.

In a possible implementation, the processing unit is further configured to establish a data transmission channel with the device storing the target file; and the receiving unit is further configured to receive file information of the target file, where the file information includes the file name, file content, and file size of the target file.
According to a fifth aspect, an embodiment of the present invention provides an electronic device, including a processor configured to support the electronic device in implementing the corresponding functions of the file transfer method provided in the first aspect. The electronic device may further include a memory coupled to the processor, which stores the program instructions and data necessary for the electronic device. The electronic device may further include a communication interface for the electronic device to communicate with other devices or a communication network.

According to a sixth aspect, an embodiment of the present invention provides an electronic device, including a processor configured to support the electronic device in implementing the corresponding functions of the file transfer method provided in the second aspect. The electronic device may further include a memory coupled to the processor, which stores the program instructions and data necessary for the electronic device. The electronic device may further include a communication interface for the electronic device to communicate with other devices or a communication network.

According to a seventh aspect, this application provides a chip system, including a processor configured to support a file transfer device in implementing the functions involved in the first aspect, for example, generating or processing the information involved in the above file transfer method. In a possible design, the chip system further includes a memory configured to store the program instructions and data necessary for the data sending device. The chip system may consist of a chip, or may include a chip and other discrete components.

According to an eighth aspect, this application provides a chip system, including a processor configured to support a file transfer device in implementing the functions involved in the second aspect, for example, generating or processing the information involved in the above file transfer method. In a possible design, the chip system further includes a memory configured to store the program instructions and data necessary for the data sending device. The chip system may consist of a chip, or may include a chip and other discrete components.
According to a ninth aspect, this application provides a computer storage medium storing a computer program that, when executed by a processor, implements the file transfer method flow of any one of the implementations of the first aspect. According to a tenth aspect, this application provides a computer storage medium storing a computer program that, when executed by a processor, implements the file transfer method flow of any one of the implementations of the second aspect.

According to an eleventh aspect, an embodiment of the present invention provides a computer program including instructions that, when the computer program is executed by a computer, enable the computer to execute the flow executed by the processor in the file transfer apparatus of the first aspect.

According to a twelfth aspect, an embodiment of the present invention provides a computer program including instructions that, when the computer program is executed by a computer, enable the computer to execute the flow executed by the processor in the file transfer apparatus of the second aspect.
FIG. 1A is a schematic structural diagram of an electronic device according to an embodiment of this application.

FIG. 1B is a block diagram of the software structure of an electronic device according to an embodiment of this application.

FIG. 2 is a schematic diagram of a system architecture for file transfer according to an embodiment of this application.

FIG. 3 is a schematic diagram of the file transfer system architecture of any one of the N devices according to an embodiment of the present invention.

FIG. 4 is a schematic flowchart of a file transfer method according to an embodiment of this application.

FIG. 5A is a schematic diagram of a multi-device collaborative system (taking 3 devices as an example) according to an embodiment of the present invention.

FIG. 5B is a schematic diagram of initiating a file drag on the first device in a multi-device collaborative system (taking 3 devices as an example) according to an embodiment of the present invention.

FIG. 5C is a schematic diagram of the release position of a first drag operation in a multi-device collaborative system (taking 3 devices as an example) according to an embodiment of the present invention.

FIG. 5D is a schematic diagram of sending a target file in a multi-device collaborative system (taking 3 devices as an example) according to an embodiment of the present invention.

FIG. 6 is a detailed schematic flowchart of a file transfer method according to an embodiment of this application.

FIG. 7A is a schematic diagram of a multi-device collaborative system (taking 5 devices as an example) according to an embodiment of the present invention.

FIG. 7B is a schematic diagram of a target file in a multi-device collaborative system (taking 5 devices as an example) according to an embodiment of the present invention.

FIG. 7C is a schematic diagram of a drag effect in a multi-device collaborative system (taking 5 devices as an example) according to an embodiment of the present invention.

FIG. 7D is a schematic diagram of the release position of a first drag operation in a multi-device collaborative system (taking 5 devices as an example) according to an embodiment of the present invention.

FIG. 7E is a schematic diagram of file dragging in a multi-device collaborative system (taking 5 devices as an example) according to an embodiment of the present invention.

FIG. 8 is a schematic structural diagram of a file transfer apparatus according to an embodiment of the present invention.

FIG. 9 is a schematic structural diagram of another file transfer apparatus according to an embodiment of the present invention.
The terms used in the following embodiments of this application are intended only to describe particular embodiments and are not intended to limit this application. As used in the specification and the appended claims of this application, the singular expressions "a", "an", "the", "the above", "said", and "this" are intended to also include plural expressions, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" used in this application refers to and includes any or all possible combinations of one or more of the listed items.
The following describes electronic devices, user interfaces for such electronic devices, and embodiments for using such electronic devices. In some embodiments, the electronic device may be a portable electronic device that also contains other functions such as personal digital assistant and/or music player functions, such as a mobile phone, a tablet, or a wearable electronic device with wireless communication capability (such as a smartwatch). Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices running

or other operating systems. The above portable electronic device may also be another portable electronic device, such as a laptop computer (Laptop) with a touch-sensitive surface or touch panel. It should also be understood that, in some other embodiments, the electronic device may not be a portable electronic device but a desktop computer with a touch-sensitive surface or touch panel.
The term "user interface (UI)" in the specification, claims, and accompanying drawings of this application is a medium interface for interaction and information exchange between an application or operating system and the user; it converts between an internal form of information and a form acceptable to the user. The user interface of an application is source code written in a specific computer language such as java or extensible markup language (XML); the interface source code is parsed and rendered on the terminal device and is finally presented as content the user can recognize, such as controls like pictures, text, and buttons. A control, also called a widget, is a basic element of the user interface; typical controls include a toolbar, menu bar, text box, button, scrollbar, pictures, and text. The attributes and content of the controls in an interface are defined by tags or nodes; for example, XML specifies the controls contained in an interface through nodes such as <Textview>, <ImgView>, and <VideoView>. A node corresponds to a control or attribute in the interface, and after parsing and rendering, the node is presented as user-visible content. In addition, the interfaces of many applications, such as hybrid applications, usually also contain web pages. A web page, also called a page, can be understood as a special control embedded in an application interface; a web page is source code written in a specific computer language, such as hypertext markup language (HTML), cascading style sheets (CSS), or JavaScript (JS); the web page source code can be loaded and displayed as user-recognizable content by a browser or a web page display component with similar functions. The specific content contained in a web page is also defined by tags or nodes in the web page source code; for example, HTML defines the elements and attributes of a web page through <p>, <img>, <video>, and <canvas>.

A commonly used presentation form of the user interface is the graphical user interface (GUI), which refers to a user interface related to computer operations that is displayed graphically. It may be an interface element such as an icon, window, or control displayed on the display screen of the electronic device, where the control may include visible interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
The following first introduces the exemplary electronic device 100 provided in the following embodiments of this application.

FIG. 1A shows a schematic structural diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a 3D camera module 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.

It can be understood that the structure illustrated in this embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In some other embodiments of this application, the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),中央处理器(central processing unit,CPU),图形处理器(graphics processing unit,GPU),神经网络处理器(neural-network processing unit,NPU),调制解调处理器,图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字 信号处理器(digital signal processor,DSP),基带处理器等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。在一些实施例中,电子设备100也可以包括一个或多个处理器110。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了电子设备100的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,3D摄像模组193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,3D摄像模组193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和3D摄像模组193通过CSI接口通信,实现电子设备100的摄像功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与3D摄像模组193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口, I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,3D摄像模组193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距 离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。示例性地,无线通信模块160可以包括蓝牙模块、Wi-Fi模块等。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等可以实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light-emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),MiniLED,MicroLED,Micro-OLED,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过3D摄像模组193,ISP,视频编解码器,GPU,显示屏194以及应用处理器AP、神经网络处理器NPU等实现摄像功能。
3D摄像模组193可用于采集拍摄对象的彩色图像数据以及深度数据。ISP可用于处理3D摄像模组193采集的彩色图像数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在3D摄像模组193中。
在一些实施例中,3D摄像模组193可以由彩色摄像模组和3D感测模组组成。
在一些实施例中,彩色摄像模组的摄像头的感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。
在一些实施例中,3D感测模组可以是飞行时间(time of flight,TOF)3D感测模组或结构光(structured light)3D感测模组。其中,结构光3D感测是一种主动式深度感测技术,结构光3D感测模组的基本零组件可包括红外线(Infrared)发射器、IR相机模组等。结构光3D感测模组的工作原理是先对被拍摄物体发射特定图案的光斑(pattern),再接收该物体表面上的光斑图案编码(light coding),进而比对与原始投射光斑的异同,并利用三角原理计算出物体的三维坐标。该三维坐标中就包括电子设备100距离被拍摄物体的距离。其中,TOF 3D感测也是主动式深度感测技术,TOF 3D感测模组的基本组件可包括红外线(Infrared)发射器、IR相机模组等。TOF 3D感测模组的工作原理是通过红外线折返的时间去计算TOF 3D感测模组跟被拍摄物体之间的距离(即深度),以得到3D景深图。
结构光3D感测模组还可应用于人脸识别、体感游戏机、工业用机器视觉检测等领域。TOF 3D感测模组还可应用于游戏机、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)等领域。
在另一些实施例中,3D摄像模组193还可以由两个或更多个摄像头构成。这两个或更多个摄像头可包括彩色摄像头,彩色摄像头可用于采集被拍摄物体的彩色图像数据。这两个或更多个摄像头可采用立体视觉(stereo vision)技术来采集被拍摄物体的深度数据。立体视觉技术是基于人眼视差的原理,在自然光源下,透过两个或两个以上的摄像头从不同的角度对同一物体拍摄影像,再进行三角测量法等运算来得到电子设备100与被拍摄物之间的距离信息,即深度信息。
在一些实施例中,电子设备100可以包括1个或N个3D摄像模组193,N为大于1的正整数。具体的,电子设备100可以包括1个前置3D摄像模组193以及1个后置3D摄像模组193。其中,前置3D摄像模组193通常可用于采集面对显示屏194的拍摄者自己的彩色图像数据以及深度数据,后置3D摄像模组可用于采集拍摄者所面对的拍摄对象(如人物、风景等)的彩色图像数据以及深度数据。
在一些实施例中,处理器110中的CPU或GPU或NPU可以对3D摄像模组193所采集的彩色图像数据和深度数据进行处理。在一些实施例中,NPU可以通过骨骼点识别技术所基于的神经网络算法,例如卷积神经网络算法(CNN),来识别3D摄像模组193(具体是彩色摄像模组)所采集的彩色图像数据,以确定被拍摄人物的骨骼点。CPU或GPU也可来运行神经网络算法以实现根据彩色图像数据确定被拍摄人物的骨骼点。在一些实施例中,CPU或GPU或NPU还可用于根据3D摄像模组193(具体是3D感测模组)所采集的深度数据和已识别出的骨骼点来确认被拍摄人物的身材(如身体比例、骨骼点之间的身体部位的胖瘦情况),并可以进一步确定针对该被拍摄人物的身体美化参数,最终根据该身体美化参数对被拍摄人物的拍摄图像进行处理,以使得该拍摄图像中该被拍摄人物的体型被美化。后续实施例中会详细介绍如何基于3D摄像模组193所采集的彩色图像数据和深度数据对被拍摄人物的图像进行美体处理,这里先不赘述。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)-1,MPEG-2,MPEG-3,MPEG-4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐、照片、视频等数据保存在外部存储卡中。
内部存储器121可以用于存储一个或多个计算机程序,该一个或多个计算机程序包括指令。处理器110可以通过运行存储在内部存储器121的上述指令,从而使得电子设备100执行本申请一些实施例中所提供的电子设备的拍照预览方法,以及各种功能应用以及数据处理等。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统;该存储程序区还可以存储一个或多个应用程序(比如图库、联系人等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如照片,联系人等)。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中,电子设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通 过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也可称触控面板或触敏表面。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音 频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
图1A示例性所示的电子设备100可以通过显示屏194显示以下各个实施例中所描述的各个用户界面。电子设备100可以通过触摸传感器180K在各个用户界面中检测触控操作,例如在各个用户界面中的点击操作(如在图标上的触摸操作、双击操作),又例如在各个用户界面中的向上或向下的滑动操作,或执行画圆圈手势的操作,等等。在一些实施例中,电子设备100可以通过陀螺仪传感器180B、加速度传感器180E等检测用户手持电子设备100执行的运动手势,例如晃动电子设备。在一些实施例中,电子设备100可以通过3D摄像模组193(如3D摄像头、深度摄像头)检测非触控的手势操作。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
图1B是本发明实施例的电子设备100的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,Android运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图1B所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图1B所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责Android系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是Android的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
图1B所示的软件系统涉及到使用分享能力的应用呈现(如图库,文件管理器),提供分享能力的即时分享模块,提供打印能力的打印服务(print service)和打印后台服务(print spooler),以及应用框架层提供打印框架、WLAN服务、蓝牙服务,以及内核和底层提供WLAN蓝牙能力和基本通信协议。
以下,对本申请中的部分用语进行解释说明,以便于本领域技术人员理解。
(1)WiFi直连(WiFi peer-to-peer,WiFi-p2p)使设备之间能够轻松连接彼此而不需要一个中介性质的无线接入点(Access Point,AP)。其使用范围从网页浏览到文件传输,以及 同时与多个设备进行通信,能够发挥WiFi的速度优势。WiFi-p2p和传统的WiFi技术并不是互斥的,GO(Group Owner)可以像AP一样可以为多台GC(Group Client)提供服务,它可以像传统的设备一样连接到某个AP,它同时自己也可以是一个AP。所述GO是协议中的一种角色,相当于AP,一个组里只有一个GO。所述GC是协议中另一种角色,一个组里可以有多个GC。
(2)无线局域网(Wireless Local Area Networks,WLAN)指应用无线通信技术将计算机设备互联起来,构成可以互相通信和实现资源共享的网络体系。无线局域网本质的特点是不再使用通信电缆将计算机与网络连接起来,而是通过无线的方式连接,从而使网络的构建和终端的移动更加灵活。
(3)近场通信(Near Field Communication,NFC)是一种新兴的技术,使用了NFC技术的设备可以在彼此靠近的情况下进行数据交换,是由非接触式射频识别(RFID)及互连互通技术整合演变而来的,通过在单一芯片上集成感应式读卡器、感应式卡片和点对点通信的功能,利用移动终端实现移动支付、电子票务、门禁、移动身份识别、防伪等应用。
(4)自组网,自组网是一种移动通信和计算机网络相结合的网络,网络的信息交换采用计算机网络中的分组交换机制,用户终端是可以移动的便携式终端,自组网中每个用户终端都兼有路由器和主机两种功能。
为了便于理解本发明实施例,以下示例性列举本申请中一种文件传输方法所应用的文件传输系统的场景,可以理解的是,当本申请中的一种文件传输方法应用到不同的场景中时,智能终端设备可以分别对应不同类型的设备,并且对应的文件传输的文件也是不同类型,以下示例性列举两种场景。
场景一,基于视频制作的文件传输场景:
随着互联网的高速发展,越来越多年轻人喜欢用拍Vlog的方式记录下自己生活中的点点滴滴并上传到网络,与自己的好友和粉丝进行分享。这促使着手机摄像功能的不断进步,人们慢慢放下沉重的相机,开始掏出手机随时随地录制视频素材。完成一个Vlog视频需要前期写稿子、拍摄视频素材,后期进行剪辑,使之成为一个连续的有完整内容的视频。在Vlog视频制作过程中难免会遇到这样的情况,为了使拍摄画面更加清晰,拍摄者一般会使用手机后摄进行摄像,在录制一些需要拍摄者入镜的画面时,可借助一台平板与手机建立协同作为视频画面监控屏,使得拍摄者可以随时看到拍摄效果。但当拍摄者想要把视频素材上传到电脑进行剪辑时,手机必须要断开与平板的协同,然后再与电脑建立协同上传文件资料,这样繁琐的操作会给拍摄者带来一些不便。所以在此情境下,使用本发明实施例可实现手机与平板建立协同,手机的显示屏可投到平板上,平板与电脑建立协同,电脑可随时把稿子发送给平板,此时拍摄者只用看着平板就能完成视频录制。并且手机可在不与电脑建立协同的情况下,把手机上的视频素材资料直接拖拽并传输给电脑,然后使用电脑进行后期剪辑,这样极大地方便了拍摄者制作Vlog视频的过程。
场景二,基于企业会议的文件传输场景:
随着科技的进步,越来越多的企业开始注重高效办公。在传统的企业会议中,需要主讲人提前准备好文件资料并进行打印,分发给每一个参会人员,并且在会议中如果有人对文件资料进行修改是没有办法快速同步给其他参会人员的。在此情况下,如果将参会人员的个人电子设备进行彼此连接,会议主讲人就可把自己准备的文件资料直接传给其他参会人员的个人电子设备,并且如果参会人员对文件资料进行修改后,修改人可把修改后的文件资料传给 其他参会人员的个人电子设备。通过使用本发明实施例可实现,在不用断开各个电子设备协同的情况下,可实现多个电子设备(三个及三个以上电子设备)进行文件传输,这样提高了企业会议中的办公效率。
可以理解的是,上述两种应用场景的只是本发明实施例中的几种示例性的实施方式,本发明实施例中的应用场景包括但不仅限于以上应用场景。
下面结合附图对本申请的实施例进行描述。
基于上述提出的技术问题以及本申请中对应的应用场景,也为了便于理解本发明实施例,下面先对本发明实施例所基于的系统架构进行描述。请参见图2,图2是本发明实施例提供的一种文件传输的系统架构示意图,该系统用于解决多台智能终端设备之间文件资源传输效率低的问题。该系统架构中可以包括N个设备,所述N个设备中的任意一个设备与所述N个设备中的至少一个其他设备建立协同;N为大于2的整数。其中,所述N个设备中的任意一个设备都可为上述图1A中所述的电子设备100,其中,
第一设备,第一设备为N个设备中任意一个设备。在本申请中指发起文件拖拽操作的电子设备,该电子设备拥有操作系统且具有数据传输接口。常见的电子设备包括,智能手机、个人电脑、平板、智慧屏等设备。例如智能手机,智能手机具有独立的操作系统,并可以通过移动通讯网络来实现无线网络接入。在智能手机上可存储一些文件资料并可以对文件进行编辑,当智能手机与其他电子设备建立协同后,可将智能手机上的文件传输给建立协同的电子设备。
第三设备,第三设备为N个设备中任意一个设备(如M个第二设备中的一个等)。在本申请中指接收目标文件的电子设备,该电子设备拥有操作系统且具有数据传输接口。常见的电子设备包括,智能手机、个人电脑、平板、智慧屏等设备。例如个人电脑,个人电脑具有独立的操作系统,并且可以通过有线或无线接入互联网。同时,个人电脑可与其他电子设备建立协同且可接收或转发来自其他电子设备的文件资料。又如平板,平板可以与其他电子设备进行通信,也可与一些电子设备建立协同,同时在平板上还可实现智能手机的投屏功能,可将智能手机的桌面显示在平板的显示屏上。并且平板可以接收或转发来自其他电子设备的文件资料。
可以理解的是,图2中的一种文件传输系统架构只是本申请实施例中的一种示例性的实施方式,本申请实施例中的文件传输系统架构包括但不仅限于以上系统架构。
下面对本发明实施例所述N个设备中任意一个设备的文件传输系统架构进行描述。请参见图3,图3是本发明实施例提供的所述N个设备中任意一个设备的文件传输系统架构示意图,所述文件传输系统架构包括连接管理模块301、协同管理模块302、文件拖拽管理模块303和文件传输管理模块304。
连接管理模块301,负责多个电子设备间连接的建立。例如,在连接管理模块301中,可提供连接入口给用户,便于多个电子设备在协同中的接入。同时在连接过程中连接管理模块301可提供认证功能,在完成接入电子设备的认证后,可建立电子设备间的连接。当电子设备间需要断开连接时,连接管理模块301提供了断开入口,使得建立连接的电子设备可随时断开连接。
协同管理模块302,负责电子设备间协同功能的实现。例如,若多个电子设备间建立协同,在协同管理模块302中,可提供支持电子设备间投屏、音频切换等音视频传输的能力。协同管理模块302也可提供支持设备间各种操作信令的发送和接收等数据传输的能力。同时,还可以提供支持外设共享的能力,所述外设用于不支持触摸的电子设备,使所述电子设备能够方便地使用文件拖拽功能。
文件拖拽管理模块303,负责电子设备间文件分享的实现。例如,在文件拖拽管理模块303中,可获取文件类型信息,如获取不同文件的文件名后缀确定文件类型,据此可在不同系统中产生相应的更贴近不同操作系统的拖拽效果。在触发文件拖拽的电子设备上,可根据被拖拽文件类型、数量、大小、排列顺序对拖拽效果生成决策,决定在不同设备上显示相对应的拖拽效果,所述拖拽效果是文件在拖拽过程中的显示效果。然后在文件拖拽管理模块303中,可对文件拖拽效果进行管理,可根据释放设备决定转发、释放等操作。文件拖拽管理模块303可实现电子设备间文件拖拽的相关功能管理。
文件传输管理模块304,负责文件拖拽事件的获取和产生。例如,在文件传输管理模块304中,可获取被拖拽文件的文件信息,据此可准备文件传输和接收。同时在该模块中,可对文件存储进行管理,通过判断当前电子设备是否可以接收文件且确定接收文件的存储路径来对文件存储进行相应的管理。在该模块中还可对文件收发进行管理,可根据不同电子设备之间的IP创建套接字(Socket)连接,从而可实现在不同设备间建立通道进行设备间的文件传输。文件传输管理模块304可实现设备间的文件传输、接收、存储等相关操作。
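文件传输管理模块304中“根据不同电子设备之间的IP创建套接字(Socket)连接”的思路,可用如下示意性的 Python 片段说明。需要说明的是,该片段仅为帮助理解的草图,协议头格式(“文件名|字节数”)为本文举例引入的假设,并非本申请实施例的实际实现:

```python
import socket
import threading

def send_file(addr, file_name, file_bytes):
    """向 addr 建立 Socket 连接并发送目标文件(示意实现)。"""
    with socket.create_connection(addr) as s:
        # 假设的简单协议:先发送 "文件名|字节数\n" 头部,再发送文件内容
        header = f"{file_name}|{len(file_bytes)}\n".encode("utf-8")
        s.sendall(header + file_bytes)

def recv_file(server_sock):
    """在 server_sock 上接收一个文件,返回 (文件名, 文件内容)。"""
    conn, _ = server_sock.accept()
    with conn:
        buf = b""
        while b"\n" not in buf:          # 先收齐头部
            buf += conn.recv(4096)
        header, _, data = buf.partition(b"\n")
        name, size = header.decode("utf-8").rsplit("|", 1)
        while len(data) < int(size):     # 再收齐文件内容
            data += conn.recv(4096)
    return name, data
```

实际实现中还需处理断连重传、完整性校验与多路并发等问题,这里从略。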
可以理解的是,图3中的电子设备中的文件传输系统架构只是本发明实施例中的一种示例性的实施方式,本发明实施例中的电子设备中的文件传输系统架构包括但不仅限于以上结构。
下面对本发明实施例所基于的具体方法架构进行描述。参见图4,图4是本申请实施例中的一种文件传输方法的流程示意图,下面将结合附图4并基于上述图2中的文件传输系统架构从第一设备和第三设备(如M个第二设备中的一个等)的交互侧对本申请实施例中的文件传输方法进行描述。需要说明的是,为了更详细的描述本申请实施例中的文件传输方法,本申请在各个流程步骤中描述了相应的执行主体为第一设备或第三设备,但不代表本申请实施例只能通过所描述的执行主体进行对应的方法流程。
步骤S401:第一设备显示第一界面。
具体地,所述第一界面包括所述第一设备的显示屏界面,以及与所述第一设备建立协同的M个第二设备对应的协同窗口,且M为大于或者等于0的整数。所述第一设备可通过显示屏194显示第一界面。例如,如图5A所示,图5A为多设备协同系统(以3个设备为例)示意图,图中该系统包括三个电子设备,三个电子设备分别为电脑、平板和手机。在电脑与平板建立协同,且平板与手机建立协同且平板上有手机的协同窗口,但电脑未与手机建立协同的情景下,所述第一设备可为平板,此时所述M的取值为1,所述M个第二设备包括一个手机,因此所述第一界面为平板的屏幕,且平板的屏幕上只有一个设备的协同窗口,平板的屏幕上除去协同窗口位置的其他位置为第一设备的显示屏界面。
步骤S402:第一设备接收作用于所述第一界面上对目标文件的第一拖拽操作。
具体地,所述目标文件为待分享的文件,所述第一拖拽操作可包括但不限于通过使用第一设备的触摸传感器180K触屏将目标文件拖拽、使用鼠标等外设将目标文件拖拽。例如,如图5B所示,图5B为多设备协同系统中在第一设备发起文件拖拽(以3个设备为例)示意图,图中该系统包括三个电子设备,三个电子设备分别为电脑、平板和手机。在电脑与平板建立协同,且平板与手机建立协同且平板上有手机的协同窗口,但电脑未与手机建立协同的 情景下,平板可作为所述第一设备,在平板的屏幕上发起对目标文件的拖拽操作,所述目标文件可为屏幕上出现的任意文件,如图目标文件可为C文件、b文件,用户可在屏幕上选中目标文件后对其进行拖拽操作,然后可根据轨迹1或轨迹2拖拽C文件,可根据轨迹3或轨迹4拖拽b文件,需要说明的是所述轨迹不局限于上述所提及的轨迹还可包括更多的可能。在用户对目标文件发起拖拽操作后,平板可接收该拖拽操作。
步骤S403:第一设备通知所述N个设备中的其他设备监听所述第一拖拽操作释放位置。
具体地,所述释放位置包括N个设备中任意一个设备的界面或所述协同窗口,所述第一拖拽操作释放位置为用户在第一界面上释放目标文件的位置。例如,如图5C所示,图5C为多设备协同系统中第一拖拽操作释放位置(以3个设备为例)示意图,图中用户在第一界面上的手机协同窗口中对C文件发起拖拽操作,此时平板会发送广播消息通知多设备协同系统中的电脑、平板、手机有文件拖拽操作发起,让系统中的每一个设备都做好接收C文件的准备,此时每一个设备会监听文件拖拽操作的释放位置且每一个设备都可得知C文件来源于手机。
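步骤S403中“发送广播消息通知系统中其他设备有文件拖拽操作发起并区分目标文件来源”的消息内容,可用如下示意性的 Python 片段表达。其中消息字段 type、source、file 均为本文为举例引入的假设,并非本申请实施例限定:

```python
import json

DRAG_START = "DRAG_START"

def make_drag_notice(source_device, file_name):
    """构造拖拽开始的广播消息:告知各设备有目标文件待分享及其来源。"""
    return json.dumps({"type": DRAG_START,
                       "source": source_device,
                       "file": file_name}).encode("utf-8")

def parse_drag_notice(raw):
    """解析广播消息,返回 (来源设备, 文件名);类型不符时返回 None。"""
    msg = json.loads(raw.decode("utf-8"))
    if msg.get("type") != DRAG_START:
        return None
    return msg["source"], msg["file"]
```

收到该消息的设备即可据此做好接收准备,并开始监听拖拽操作的释放位置。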
步骤S404:第三设备接收第一设备发起的监听第一拖拽操作释放位置的通知。
具体地,所述第三设备为所述第一拖拽操作释放位置所对应的设备。例如,如图5C所示,图5C为多设备协同系统中第一拖拽操作释放位置(以3个设备为例)示意图,图中用户在第一界面上的手机协同窗口中对C文件发起拖拽操作,第三设备可为电脑也可为平板,此时电脑和平板都会接收到第一设备发起的监听第一拖拽操作释放位置的通知,从而得知在手机上有C文件待分享,时刻准备接收C文件。
步骤S405:第三设备监听所述第一拖拽操作释放位置。
具体地,所述第三设备在得知在第一设备上有目标文件待分享后,就时刻监听所述第一拖拽操作释放位置是不是对应在自己的界面上。例如,如图5C所示,图5C为多设备协同系统中第一拖拽操作释放位置(以3个设备为例)示意图,图中用户在第一界面上的手机协同窗口中对C文件发起拖拽操作,第三设备可为电脑也可为平板。当电脑作为第三设备时,电脑会时刻监听C文件被释放的位置,当C文件释放在自己的显示屏界面上时就接收C文件。
步骤S406:第一设备检测所述第一拖拽操作的释放位置。
具体地,在所述第一界面上的释放位置包括所述N个设备中任意一个设备的显示屏界面或该设备对应的协同窗口。此时第三设备为第一拖拽操作释放位置所对应的设备。例如,如图5C所示,图5C为三个设备系统中第一拖拽操作释放位置示意图,用户在第一界面上的手机协同窗口中对C文件发起拖拽操作,可沿轨迹1将C文件释放在平板的显示屏界面,则平板会检测到所述释放位置为平板的显示屏界面,此时第三设备为平板,代表需将C文件发送给平板。也可沿轨迹2将C文件释放在电脑的显示屏界面,平板会检测到所述释放位置为电脑显示屏界面,此时第三设备为电脑,代表需将C文件发送给电脑。
步骤S407:第一设备控制将所述目标文件发送至所述N个设备中与所述释放位置匹配的设备。
具体地,所述控制将所述目标文件发送至所述N个设备中与所述释放位置匹配的设备包括第一设备发送目标文件至与所述释放位置匹配的设备和第一设备控制其他设备将目标文件发送至与所述释放位置匹配的设备。例如,如图5D所示,图5D为多设备协同系统中发送目标文件(以3个设备为例)示意图,若当用户在第一界面上选中C文件并沿着轨迹5拖拽C文件,在电脑的显示屏界面上释放拖拽操作,则平板会控制手机将C文件发送至电脑。若用户在第一界面上选中b文件并沿着轨迹6拖拽b文件,在手机的协同窗口上释放拖拽操作, 则平板会将b文件发送至手机。
步骤S408:第三设备接收第一设备所控制发送的所述目标文件。
具体地,当第一设备控制将所述目标文件发送至第三设备后,第三设备可将所述目标文件存储在设备中。例如,如图5D所示,图5D为多设备协同系统中发送目标文件(以3个设备为例)示意图,若当用户在第一界面上选中C文件并沿着轨迹5拖拽C文件,在电脑的显示屏界面上释放拖拽操作,则平板会控制手机将C文件发送至电脑。此时电脑作为第三设备,会接收C文件,并将C文件存储在本地。若用户在第一界面上选中b文件并沿着轨迹6拖拽b文件,在手机的协同窗口上释放拖拽操作,则平板会将b文件发送至手机。此时手机作为第三设备,会接收b文件,并将b文件存储在本地。
步骤S409:第三设备发送广播通知所述多设备协同系统中的其他设备所述目标文件已成功接收。
具体地,当第三设备在成功接收了目标文件后,会发送广播通知多设备协同系统中的其他设备所述目标文件已经被成功接收,其他设备无需再等待接收。例如,如图5D所示,图5D为多设备协同系统中发送目标文件(以3个设备为例)示意图,若当用户在第一界面上选中C文件并沿着轨迹5拖拽C文件,在电脑的显示屏界面上释放拖拽操作,则平板会控制手机将C文件发送至电脑。在电脑成功接收C文件后,会通知多设备协同系统中的平板和手机,C文件已经被成功接收,无需再等待接收。
通过使用本发明实施例的方法,在多设备协同系统中每一个设备都可在无需断开协同的情况下发送或接收目标文件,可避免在未建立协同的两个设备上进行文件传输时,断开与其他设备建立的协同,再重新建立新的协同进行文件传输,从而实现跨多台设备便捷地进行文件传输,提高了多设备协同下文件资源传输效率并精简了用户操作,提升了用户体验。
下面对本发明实施例所基于的具体方法架构进行描述。参见图6,图6是本申请实施例中的一种文件传输方法的详细流程示意图,下面将结合附图6并基于上述图2中的文件传输系统架构从第一设备和第三设备(如M个第二设备中的一个等)的交互侧对本申请实施例中的文件传输方法进行描述。需要说明的是,为了更详细的描述本申请实施例中的文件传输方法,本申请在各个流程步骤中描述了相应的执行主体为第一设备或第三设备,但不代表本申请实施例只能通过所描述的执行主体进行对应的方法流程。
步骤S601:第一设备显示第一界面。
具体地,所述第一界面包括所述第一设备的显示屏界面,以及与所述第一设备建立协同的M个第二设备对应的协同窗口,M为大于或者等于0的整数。所述第一设备可通过显示屏194显示第一界面。例如,如图7A所示,图7A为多设备协同系统(以5个设备为例)示意图,图中该系统包括五个电子设备,五个电子设备分别为电脑、平板、手机1、手机2和手机3。在手机1与电脑建立协同且电脑上有手机1的协同窗口,且手机2与电脑建立协且电脑上有手机2的协同窗口,同时电脑与平板建立协同,且手机3与平板建立协同且平板上有手机3的协同窗口的情景下,电脑可作为所述第一设备,在电脑上可显示第一界面,此时所述M的取值为2,所述M个第二设备包括手机1和手机2,电脑的屏幕上有两个设备的协同窗口,电脑的屏幕上除去协同窗口位置的其他位置为第一设备的显示屏界面。
步骤S602:第一设备接收作用于所述第一界面上对目标文件的第一拖拽操作。
具体地,所述目标文件为待传输的文件,所述第一拖拽操作可包括但不限于通过使用第一设备的触摸传感器180K触屏将目标文件拖拽、使用鼠标等外设将目标文件拖拽。图7A为多设备协同系统(以5个设备为例)示意图,图中该系统包括五个电子设备,五个电子设备分别为电脑、平板、手机1、手机2和手机3。需要说明的是,在手机1与电脑建立协同且电脑上有手机1的协同窗口,且手机2与电脑建立协同且电脑上有手机2的协同窗口,同时电脑与平板建立协同,且手机3与平板建立协同且平板上有手机3的协同窗口的情景下,电脑可作为所述第一设备,此时目标文件可为A文件、B文件、D文件、A4文件、B4文件。
在一种可能的实现方式中,所述目标文件为所述第一设备上存储的文件,所述第一拖拽操作的起始位置在所述显示屏界面上;所述目标文件为所述M个第二设备中起始第二设备上存储的文件;所述第一拖拽操作的起始位置在所述起始第二设备对应的协同窗口中。具体地,在第一界面上包括了第一设备的显示屏界面和M个第二设备对应的协同窗口,在第一界面上对目标文件进行拖拽操作,目标文件若在第一设备的显示屏界面上且目标文件在第一设备的显示屏界面上被拖拽,而不是在M个第二设备对应的协同窗口中被拖拽,则表示该目标文件存储在第一设备上,确定第一设备可发送目标文件。目标文件若在其中一个设备的协同窗口上且目标文件在该协同窗口上被拖拽,而不是在第一设备的显示屏界面被拖拽,则表示目标文件存储在该协同窗口所对应的设备上,确定第一设备可控制该协同窗口所对应的设备发送目标文件。例如,如图7B所示,图7B为多设备协同系统中目标文件(以5个设备为例)示意图,图中该系统包括五个电子设备,五个电子设备分别为电脑、平板、手机1、手机2和手机3。在手机1与电脑建立协同且电脑上有手机1的协同窗口,手机2与电脑建立协同且电脑上有手机2的协同窗口,同时电脑与平板建立协同,手机3与平板建立协同且平板上有手机3的协同窗口的情景下,电脑可作为所述第一设备。图中当用户需要对A4文件进行文件拖拽操作时,由于A4文件在电脑的显示屏界面上,因此可认为A4文件存储在电脑上,则电脑可接收到作用于电脑显示屏界面上对A4文件的拖拽操作。图中当用户需要对A文件、B文件、D文件进行文件拖拽操作时,需要说明的是,此时所述起始第二设备为手机1,由于A文件、B文件、D文件在电脑屏幕的协同窗口上,所以可认为A文件、B文件、D文件存储该协同窗口所对应的手机1上,则电脑可接收到作用于电脑屏幕的协同窗口上对A文件、B文件、D文件的拖拽操作。
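上述“根据第一拖拽操作的起始位置判断目标文件存储在哪个设备上”的判断逻辑,可用如下示意性的 Python 片段说明。其中协同窗口的矩形区域坐标为本文举例引入的假设:

```python
def locate_source_device(x, y, windows, first_device):
    """根据起始坐标 (x, y) 判断目标文件所在设备(示意)。

    windows: {设备名: (left, top, right, bottom)},即各协同窗口在第一界面上的区域;
    起始位置落在某协同窗口内,则文件存储在对应设备上,否则存储在第一设备自身。
    """
    for device, (left, top, right, bottom) in windows.items():
        if left <= x < right and top <= y < bottom:
            return device
    return first_device
```

例如按图7B的布局,落在手机1协同窗口内的起始位置会判定文件来源为手机1,落在协同窗口之外的显示屏界面上则判定来源为电脑。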
步骤S603:第一设备显示与第四设备匹配的拖拽效果。
具体地,所述第四设备为目标文件被拖拽移动过程中所经过的设备。例如,所述第四设备可为M个第二设备中的任意设备。在一种可能的实现方式中,获取所述目标文件的第一信息,所述第一信息包括所述目标文件的文件类型信息、文件数量信息、文件排列顺序信息中的一个或多个,根据所述第一信息,生成所述目标文件的拖拽效果集合,根据所述目标文件的拖拽效果集合显示与第四设备匹配的拖拽效果,所述第四设备为所述第一拖拽操作的拖拽轨迹经过的设备,或经过的设备上的协同窗口对应的设备。具体地,当用户选中目标文件并对目标文件进行拖拽操作时,目标文件沿着拖拽轨迹移动,在此过程中为了实时显示目标文件被移动的位置,则在对目标文件进行拖拽操作时会显示相应的拖拽效果。需要说明的是,当目标文件被拖拽时,第一设备可控制存储目标文件的设备获取目标文件的文件类型信息、文件数量信息、文件排列顺序信息中的一个或多个信息,根据这些信息生成目标文件的拖拽效果集合,然后根据拖拽轨迹所经过的设备的系统显示相应的拖拽效果。例如,如图7C所示,图7C为多设备协同系统中拖拽效果(以5个设备为例)示意图,图中该系统包括五个电子设备,五个电子设备分别为电脑、平板、手机1、手机2和手机3。在手机1与电脑建立协同且电脑上有手机1的协同窗口,手机2与电脑建立协同且电脑上有手机2的协同窗口,同时电脑与平板建立协同,手机3与平板建立协同且平板上有手机3的协同窗口的情景下,电脑可作为所述第一设备。图中用户在电脑显示屏界面上选中A4文件,并将A4文件沿着拖拽轨迹移动,该轨迹可经过手机2的协同窗口,此时手机2可作为所述第四设备。为实时显示A4文件移动的位置,在移动过程中会根据设备的系统显示与系统相适应的拖拽效果,如图7C所示,该拖拽效果可为根据A4文件生成的文件阴影效果。
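步骤S603中“根据文件类型、数量、排列顺序等第一信息生成拖拽效果集合”的过程,可用如下示意性的 Python 片段说明。其中文件扩展名与拖拽效果的对应关系均为本文举例引入的假设,实际效果应与各设备的操作系统相适应:

```python
def build_drag_effects(file_names):
    """根据被拖拽文件列表生成按排列顺序组织的拖拽效果集合(示意)。"""
    # 假设的类型到效果映射:未知类型一律显示为文件阴影
    effect_by_ext = {"mp4": "视频缩略图", "jpg": "图片缩略图", "png": "图片缩略图"}
    effects = []
    for order, name in enumerate(file_names):
        ext = name.rsplit(".", 1)[-1].lower() if "." in name else ""
        effects.append({"file": name,
                        "order": order,
                        "effect": effect_by_ext.get(ext, "文件阴影")})
    return effects
```

拖拽轨迹经过某设备(第四设备)时,即可从该集合中取出与该设备系统匹配的效果进行显示。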
步骤S604:第一设备通知所述N个设备中的其他设备监听所述第一拖拽操作释放位置。
具体地,当在所述第一界面上对目标文件进行拖拽操作时,所述第一设备会发送广播消息通知该系统中的其他设备有目标文件待分享且区分目标文件来源,这样可让系统中的所有设备都监听拖拽操作释放的位置,可让该系统中的设备都准备接收目标文件。例如,如图7C所示,图中该系统包括五个电子设备,五个电子设备分别为电脑、平板、手机1、手机2和手机3。在手机1与电脑建立协同且电脑上有手机1的协同窗口,手机2与电脑建立协同且电脑上有手机2的协同窗口,同时电脑与平板建立协同,手机3与平板建立协同且平板上有手机3的协同窗口的情景下,电脑可作为所述第一设备。当用户在电脑的显示屏界面上选中目标文件A4文件时,电脑会发送广播消息通知多设备协同系统中的手机1、手机2、手机3、平板开始监听拖拽操作释放的位置,此时在该多设备协同系统中的所有设备可接收A4文件。
步骤S605:第三设备接收第一设备发起的监听第一拖拽操作释放位置的通知。
具体地,所述第三设备为所述第一拖拽操作释放位置所对应的设备。例如,如图7C所示,图中该系统包括五个电子设备,五个电子设备分别为电脑、平板、手机1、手机2和手机3。在手机1与电脑建立协同且电脑上有手机1的协同窗口,手机2与电脑建立协同且电脑上有手机2的协同窗口,同时电脑与平板建立协同,手机3与平板建立协同且平板上有手机3的协同窗口的情景下,电脑可作为所述第一设备。图中用户在第一界面上的电脑显示屏界面上对A4文件发起拖拽操作,多设备协同系统中的手机1、手机2、手机3、平板和电脑都可作为第三设备。此时当手机3作为第三设备时,手机3会接收到电脑发起的监听第一拖拽操作释放位置的通知,从而得知在电脑上有A4文件待分享,时刻准备接收A4文件。
步骤S606:第三设备监听所述第一拖拽操作释放位置。
具体地,所述第三设备在得知在第一设备上有目标文件待分享后,就时刻监听所述第一拖拽操作释放位置是不是对应在自己的设备上。例如,如图7C所示,图中该系统包括五个电子设备,五个电子设备分别为电脑、平板、手机1、手机2和手机3。在手机1与电脑建立协同且电脑上有手机1的协同窗口,手机2与电脑建立协同且电脑上有手机2的协同窗口,同时电脑与平板建立协同,手机3与平板建立协同且平板上有手机3的协同窗口的情景下,电脑可作为所述第一设备。图中用户在第一界面上的电脑显示屏界面上对A4文件发起拖拽操作,多设备协同系统中的手机1、手机2、手机3、平板和电脑都可作为第三设备。当手机3作为第三设备时,手机3会时刻监听A4文件被释放的位置,当A4文件释放在自己的协同窗口上时就接收A4文件。
步骤S607:第一设备检测所述第一拖拽操作释放位置。
具体地,所述释放位置包括N个设备中任意一个设备的界面或协同窗口。此时第三设备为第一拖拽操作释放位置所对应的设备。如图7D所示,图7D为多设备协同系统中第一拖拽操作释放位置(以5个设备为例)示意图,用户在第一界面上的手机1协同窗口中对D文件发起拖拽操作,可沿轨迹8将D文件释放在手机2协同窗口上,则电脑会检测到所述释放位置为手机2的协同窗口,代表需将D文件发送给手机2。也可沿轨迹7将D文件释放在平板的显示屏界面,则电脑会检测到所述释放位置为平板显示屏界面,此时第三设备为平板,代表需将D文件发送给平板。
步骤S608:第一设备确定与所述释放位置匹配的设备接收所述目标文件的存储路径。
具体地,待分享的目标文件将被发送给所述释放位置匹配的设备后,存储在确定的存储路径下。在一种可能的实现方式中,获取所述目标文件的文件信息,所述文件信息包括所述目标文件的文件名、文件内容、文件大小信息,判断与所述释放位置匹配的设备是否满足接收所述目标文件的条件,若满足,则确定与所述释放位置匹配的设备接收所述目标文件的存储路径。具体地,若两个设备通过对目标文件进行拖拽操作实现文件传输,则发送设备需要判断第三设备是否有充足的存储空间来存储目标文件。当发送设备在获取目标文件的文件信息后,可选的,发送设备先将目标文件的大小发送给第三设备,判断该设备是否有空间来存储目标文件,若有充足的存储空间,则确定目标文件在第三设备上的存储路径,发送设备可将目标文件发送到该存储路径。例如,如图7C所示,当用户在电脑的显示屏界面上选中目标文件A4文件并对其进行文件拖拽操作,电脑在获取到A4文件的文件大小后,将文件大小发送给多设备协同系统中的其他设备,其他设备会提前计算存储的剩余空间判断是否可接收目标文件,若无法接收此目标文件则给予无法接收的提示,若可接收则会给予接收提示。电脑在接收到所述释放位置匹配的设备的提示后,若该设备有存储空间来存储A4文件,继而可确定目标文件存储在该设备上的存储路径。
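步骤S608中“判断是否满足接收条件并确定存储路径”的流程,可用如下示意性的 Python 片段说明。其中以磁盘剩余空间作为接收条件、以固定接收目录拼接文件名作为存储路径,均为本文举例引入的假设:

```python
import os
import shutil

def prepare_storage(file_name, file_size, base_dir):
    """若 base_dir 所在磁盘剩余空间足够,返回目标文件的存储路径,否则返回 None。"""
    free = shutil.disk_usage(base_dir).free
    if file_size > free:
        return None                      # 空间不足,给予无法接收的提示
    return os.path.join(base_dir, file_name)
```

接收设备据此返回“可接收”或“无法接收”的提示,发送设备再决定是否向该存储路径传输文件。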
步骤S609:第一设备控制将所述目标文件发送至所述N个设备中与所述释放位置匹配的设备。
具体地,所述控制将所述目标文件发送至所述N个设备中与所述释放位置匹配的设备包括第一设备发送目标文件至与所述释放位置匹配的设备和第一设备控制其他设备将目标文件发送至与所述释放位置匹配的设备。
在一种可能的实现方式中,所述目标文件为所述第一设备上存储的文件;所述控制将所述目标文件发送至所述N个设备中与所述释放位置匹配的设备,包括:当检测到所述第一拖拽操作的释放位置在第三设备的显示屏界面,或当检测到所述第一拖拽操作的释放位置在所述第三设备在所述N个设备中其他设备上的协同窗口中时,则控制所述第一设备将所述目标文件发送至所述第三设备;所述第三设备包括所述N个设备中与所述第一设备未建立协同的设备。需要说明的是,当目标文件在第一设备的第一界面被拖拽后,若目标文件被释放在多设备协同系统中任意一个设备的显示屏界面或该设备对应的协同窗口上时,则代表目标文件需发送给与该释放位置所匹配的设备,该设备与第一设备可无需建立协同。例如,如图7E所示,图7E为多设备协同系统中文件拖拽(以5个设备为例)示意图,图中该系统包括五个电子设备,五个电子设备分别为电脑、平板、手机1、手机2和手机3。在手机1与电脑建立协同且电脑上有手机1的协同窗口,手机2与电脑建立协同且电脑上有手机2的协同窗口,同时电脑与平板建立协同,手机3与平板建立协同且平板上有手机3的协同窗口的情景下,电脑可作为所述第一设备。图中用户在电脑显示屏界面上选中目标文件A4文件,可将A4文件沿轨迹9移动,最后将其释放在手机3的协同窗口中。
在一种可能实现的方式中,所述控制所述第一设备将所述目标文件发送至第三设备,包括:建立数据传输通道,所述数据传输通道用于传输所述目标文件的所述文件信息,若所述第一设备与所述第三设备建立直接连接,则通过所述数据传输通道将所述文件信息发送到所述第三设备的所述存储路径,若所述第一设备与所述第三设备建立间接连接,则通过所述数据传输通道将所述文件信息发送给中继设备,通过所述中继设备将所述文件信息转发到所述第三设备所述存储路径,其中所述中继设备为与第一设备建立直接连接同时与第三设备建立直接连接的设备。具体地,在多设备协同系统中,由于使用了不同的组网技术,从而可实现更多设备的连接同时不用断开设备间的协同。但不同的组网技术可存在不同的组网方式,可导致各设备间的连接关系发生变化。当第一设备为发送设备,第三设备作为接收设备时,第一设备和第三设备在使用不同的组网方式下可建立直接连接也可建立间接连接。若第一设备与第三设备建立直接连接(如使用自组网技术连接各设备),则第一设备可将目标文件的文件信息直接发送给第三设备,若第一设备与第三设备未建立直接连接而是通过中继设备(多设备协同系统里N个设备中可与第一设备建立直接连接同时也可与第三设备建立直接连接的设备)建立了间接连接(如使用WiFi-P2P技术组网实现设备间一对多的连接),则第一设备将目标文件的文件信息先发送给中继设备,中继设备再将目标文件的文件信息转发给第三设备,从而实现多台设备文件传输。例如,如图7E所示,图中用户在电脑显示屏界面上选中目标文件A4文件,并可将A4文件沿轨迹9移动,最后将其释放在手机3的协同窗口中。此时可确定目标文件A4文件存储在电脑上,且最终接收A4文件的设备为手机3。在多设备协同系统中,若多个设备之间采用WiFi-P2P技术进行组网,则电脑与手机3不能建立直接连接,但电脑与平板可建立直接连接,平板与手机3也可建立直接连接。此时电脑可与平板先建立数据传输通道,电脑将A4文件的文件信息发送给平板,然后平板再与手机3建立数据传输通道,平板将A4文件的文件信息转发给手机3。可选的,若多个设备之间采用自组网技术进行组网,则多设备协同系统中的所有设备可建立直接连接,此时电脑可与手机3建立直接连接,电脑可与手机3直接建立数据传输通道,电脑将A4文件的文件信息直接发送给手机3。
在一种可能实现的方式中,所述目标文件为所述M个第二设备中起始第二设备上存储的文件;所述控制将所述目标文件发送至所述N个设备中与所述释放位置匹配的设备,包括:当检测到所述第一拖拽操作的释放位置在第三设备的显示屏界面,或检测到所述第一拖拽操作的释放位置在所述第三设备在所述N个设备中其他设备上的协同窗口中,则控制所述起始第二设备将所述目标文件发送至所述第三设备。需要说明的是,当协同窗口中的目标文件在第一界面上被拖拽后,若目标文件被释放在多设备协同系统中任意一个设备的显示屏界面或该设备对应的协同窗口上时,则代表目标文件需发送给与该释放位置所匹配的设备。可选的,第三设备可为M个第二设备中的一个设备。例如,如图7E所示,图中用户在手机1的协同窗口中选中目标文件D文件,并可将D文件沿轨迹8移动,最后将其释放在手机2的协同窗口中,需要说明的是,此时手机1为所述起始第二设备,手机2为所述第三设备。同时,用户可在手机1的协同窗口中选中目标文件D文件,并可将D文件沿轨迹10移动,最后将其释放在平板的显示屏界面上,需要说明的是,此时手机1为所述起始第二设备,平板为所述第三设备。
在一种可能实现的方式中,所述控制所述起始第二设备将所述目标文件发送至所述第三设备,包括:建立数据传输通道;所述数据传输通道用于传输所述目标文件的所述文件信息;若所述起始第二设备与所述第三设备建立直接连接,则通过所述数据传输通道将所述文件信息发送到与所述释放位置匹配的设备的所述存储路径;若所述起始第二设备与所述第三设备建立间接连接,则通过所述数据传输通道将所述文件信息发送给中继设备,通过所述中继设备将所述文件信息转发给与所述释放位置匹配的设备的所述存储路径,其中所述中继设备为与所述起始第二设备建立直接连接同时与所述第三设备建立直接连接的设备。在本发明实施例中,在多设备协同系统中,由于使用了不同的组网技术,从而可实现更多设备的连接同时不用断开设备间的协同。但不同的组网技术可存在不同的组网方式,可导致各设备间的连接关系发生变化。当起始第二设备作为发送设备(存储目标文件的设备,如第一设备或M个第二设备中的一个)时,起始第二设备和第三设备在使用不同的组网方式下可建立直接连接也可建立间接连接。如起始第二设备与第三设备可建立直接连接(如使用自组网技术进行组网),则起始第二设备可将目标文件的文件信息直接发送给第三设备,如起始第二设备与第三设备未建立直接连接而是通过中继设备(如第一设备)建立了间接连接(如使用WiFi-P2P技术组网实现设备间一对多的连接),则起始第二设备将目标文件的文件信息先发送给中继设备,中继设备再将目标文件的文件信息转发给第三设备,从而实现多台设备文件传输。例如,如图7E所示,图7E为五个设备系统中文件拖拽示意图,图中用户在手机1的协同窗口中选中目标文件D文件,并可将D文件沿轨迹8移动,最后将其释放在手机2的协同窗口中。需要说明的是,在上述例子中所述起始第二设备为手机1,所述第三设备为手机2。此时可确定目标文件D文件存储在手机1上,最终接收D文件的设备为手机2。在多设备协同系统中,若多个设备之间采用WiFi-P2P技术进行组网,则手机1与手机2不能建立直接连接,但手机1与电脑可建立直接连接,电脑与手机2也可建立直接连接。此时手机1可与电脑先建立数据传输通道,手机1将D文件的文件信息发送给电脑,然后电脑再与手机2建立数据传输通道,电脑将D文件的文件信息转发给手机2。可选的,若多个设备之间采用自组网技术进行组网,则多设备协同系统中的所有设备可建立直接连接,此时手机1可与手机2建立直接连接,手机1可与手机2直接建立数据传输通道,手机1将D文件的文件信息直接发送给手机2。
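上述“直接连接则直发、间接连接则经中继设备转发”的选路逻辑,可用如下示意性的 Python 片段说明。连接拓扑以图7E中 WiFi-P2P 组网为例,该片段仅考虑至多一个中继设备的情况,为本文举例引入的简化:

```python
def plan_route(links, src, dst):
    """根据设备间的直接连接关系规划传输路径(示意,仅支持至多一个中继)。

    links: 直接连接的设备对集合;
    返回 [src, dst](直接连接)或 [src, 中继设备, dst](间接连接),无路径时返回 None。
    """
    def connected(a, b):
        return (a, b) in links or (b, a) in links

    if connected(src, dst):
        return [src, dst]
    devices = {d for pair in links for d in pair}
    for relay in devices:
        if connected(src, relay) and connected(relay, dst):
            return [src, relay, dst]
    return None
```

例如在 WiFi-P2P 组网下,手机1与手机2之间无直接连接,选路结果即为经电脑中继的路径。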
步骤S6010:第三设备接收第一设备所控制发送的所述目标文件。
具体地,当第一设备控制将所述目标文件发送至第三设备后,第三设备可将所述目标文件存储在设备中。在一种可能实现的方式中,所述接收第一设备所控制发送的所述目标文件,包括:与存储所述目标文件的设备建立数据传输通道,接收所述目标文件的文件信息,所述文件信息包括所述目标文件的文件名、文件内容、文件大小信息。需要说明的是,第三设备会接收到来自发送设备发送的目标文件的大小的信息,在判断第三设备有足够的空间接收该目标文件后,第三设备可接收到目标文件的文件信息。例如,如图7E所示,若当用户在第一界面上选中D文件并沿着轨迹8拖拽D文件,且在手机2的协同窗口中释放拖拽操作,此时手机2作为第三设备,则电脑会控制手机1将D文件发送给手机2。手机2会接收D文件并将D文件存储在本地。
步骤S6011:第三设备发送广播通知所述多设备协同系统中的其他设备所述目标文件已被成功接收。
具体地,当第三设备在成功接收了目标文件后,会发送广播通知多设备协同系统中的其他设备所述目标文件已经被成功接收,其他设备无需再等待接收。例如,如图7E所示,若当用户在第一界面上选中D文件并沿着轨迹8拖拽D文件,且在手机2的协同窗口中释放拖拽操作,此时手机2作为第三设备,则电脑会控制手机1将D文件发送给手机2。在手机2成功接收D文件后,会通知多设备协同系统中的手机1、电脑、平板和手机3,D文件已经被成功接收,无需再等待接收。
上述详细阐述了本发明实施例的方法,下面提供了本发明实施例的相关装置。
请参见图8,图8是本发明实施例提供的一种文件传输装置的结构示意图,该文件传输装置80可以包括第一显示单元801、第一接收单元802、发送单元803、第一处理单元804、第二处理单元805、第二接收单元806、第三接收单元807、第三处理单元808、第二显示单元809,其中各个模块的详细描述如下。
应用于多设备协同系统中的第一设备,所述多设备协同系统中包括N个设备,所述N个设备中的任意一个设备与所述N个设备中的至少一个其他设备建立协同;所述第一设备为所述N个设备中的任意一个;N为大于2的整数;所述装置包括:
第一显示单元801,用于所述第一设备显示第一界面;所述第一界面包括所述第一设备的显示屏界面,以及与所述第一设备建立协同的M个第二设备对应的协同窗口;M为大于或者等于0的整数;
第一接收单元802,用于接收作用于所述第一界面上对目标文件的第一拖拽操作;
发送单元803,用于通知所述N个设备中的其他设备监听所述第一拖拽操作的释放位置;所述释放位置包括所述N个设备中任意一个设备的界面或所述协同窗口;
第一处理单元804,用于检测所述第一拖拽操作的所述释放位置;
发送单元803,还用于控制将所述目标文件发送至所述N个设备中与所述释放位置匹配的设备。
在一种可能的实现方式中,所述目标文件为所述第一设备上存储的文件;所述第一处理单元804,具体用于当检测到所述第一拖拽操作的释放位置在第三设备的显示屏界面,或当检测到所述第一拖拽操作的释放位置在所述第三设备在所述N个设备中其他设备上的协同窗口中时,则控制所述第一设备将所述目标文件发送至所述第三设备;所述第三设备包括所述N个设备中与所述第一设备未建立协同的设备。在一种可能的实现方式中,所述装置还包括:第二接收单元806,用于获取所述目标文件的文件信息;所述文件信息包括所述目标文件的文件名、文件内容、文件大小信息;第二处理单元,用于判断与所述释放位置匹配的设备是否满足接收所述目标文件的条件;若满足,则确定与所述释放位置匹配的设备接收所述目标文件的存储路径。在一种可能的实现方式中,所述第一处理单元804,还用于建立数据传输通道;所述数据传输通道用于传输所述目标文件的所述文件信息;所述发送单元803,还用于若所述第一设备与所述第三设备建立直接连接,则通过所述数据传输通道将所述文件信息发送到所述第三设备的所述存储路径;若所述第一设备与所述第三设备建立间接连接,则通过所述数据传输通道将所述文件信息发送给中继设备,通过所述中继设备将所述文件信息转发到所述第三设备所述存储路径,其中所述中继设备为与所述第一设备建立直接连接同时与所述第三设备建立直接连接的设备。
在一种可能的实现方式中,所述目标文件为所述M个第二设备中起始第二设备上存储的文件;所述第一处理单元804,具体用于当检测到所述第一拖拽操作的释放位置在第三设备的显示屏界面,或检测到所述第一拖拽操作的释放位置在所述第三设备在所述N个设备中其他设备上的协同窗口中,则控制所述起始第二设备将所述目标文件发送至所述第三设备。在一种可能的实现方式中,所述第一处理单元804,还用于建立数据传输通道;所述数据传输通道用于传输所述目标文件的所述文件信息;所述发送单元803,还用于若所述起始第二设备与所述第三设备建立直接连接,则通过所述数据传输通道将所述文件信息发送到与所述第三设备的所述存储路径;若所述起始第二设备与所述第三设备建立间接连接,则通过所述数据传输通道将所述文件信息发送给中继设备,通过所述中继设备将所述文件信息转发给与所述第三设备所述存储路径,其中所述中继设备为与所述起始第二设备建立直接连接同时与所述第三设备建立直接连接的设备。在一种可能的实现方式中,所述装置还包括:第三接收单元807,用于获取所述目标文件的第一信息;所述第一信息包括所述目标文件的文件类型信息、文件数量信息、文件排列顺序信息中的一个或多个;第三处理单元808,用于根据所述第一信息,生成所述目标文件的拖拽效果集合;第二显示单元809,用于根据所述目标文件的拖拽效果集合显示与第四设备匹配的拖拽效果;所述第四设备为所述第一拖拽操作的拖拽轨迹经过的设备,或经过的设备上的协同窗口对应的设备。
需要说明的是,本发明实施例中所描述的文件传输装置80中各功能单元的功能可参见上述图4中所述的方法实施例中步骤S401、步骤S402、步骤S403、步骤S406、步骤S407的相关描述,此处不再赘述。
上述详细阐述了本发明实施例的方法,下面提供了本发明实施例的相关装置。
请参见图9,图9是本发明实施例提供的另一种文件传输装置的结构示意图,该文件传输装置90可以包括接收单元901、处理单元902、发送单元903,其中各个模块的详细描述如下。应用于多设备协同系统中的第三设备,所述多设备协同系统中包括N个设备,所述N个设备中的任意一个设备与所述N个设备中的至少一个其他设备建立协同;所述第三设备为所述N个设备中的任意一个设备;N为大于2的整数;所述装置包括:
接收单元901,用于接收第一设备发起的监听第一拖拽操作释放位置的通知;所述第一拖拽操作为在所述第一界面上对目标文件发起拖拽;所述第一界面包括所述第一设备的显示屏界面,以及与所述第一设备建立协同的M个第二设备对应的协同窗口;M为大于或者等于0的整数;处理单元902,用于监听所述第一拖拽操作释放位置;所述接收单元901,还用于接收所述第一设备所控制发送的所述目标文件;发送单元903,用于发送广播通知所述多设备协同系统中的其他设备所述目标文件已被成功接收。在一种可能的实现方式中,所述处理单元902,还用于与存储所述目标文件的设备建立数据传输通道;所述接收单元901,还用于接收所述目标文件的文件信息;所述文件信息包括所述目标文件的文件名、文件内容、文件大小信息。
需要说明的是,本发明实施例中所描述的文件传输装置90中各功能单元的功能可参见上述图4中所述的方法实施例中步骤S404、步骤S405、步骤S408、步骤S409的相关描述,此处不再赘述。
本发明实施例还提供一种计算机存储介质,其中,该计算机存储介质可存储有程序,该程序执行时包括上述方法实施例中记载的任意一种文件传输方法的部分或全部步骤。
本发明实施例还提供一种计算机程序,该计算机程序包括指令,当该计算机程序被计算机执行时,使得计算机可以执行任意一种文件传输方法的部分或全部步骤。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
需要说明的是,对于前述的各方法实施例,为了简单描述,故将其都表述为一系列的动作组合,但是本领域技术人员应该知悉,本申请并不受所描述的动作顺序的限制,因为依据本申请,某些步骤可能可以采用其他顺序或者同时进行。其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于优选实施例,所涉及的动作和模块并不一定是本申请所必须的。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置,可通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如上述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性或其它的形式。
上述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
上述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以为个人计算机、服务器或者网络设备等,具体可以是计算机设备中的处理器)执行本申请各个实施例上述方法的全部或部分步骤。其中,而前述的存储介质可包括:U盘、移动硬盘、磁碟、光盘、只读存储器(Read-Only Memory,缩写:ROM)或者随机存取存储器(Random Access Memory,缩写:RAM)等各种可以存储程序代码的介质。
以上所述,以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围。
Claims (23)
- A file transfer method, applied to a first device in a multi-device collaboration system, wherein the system includes N devices, any one of the N devices establishes collaboration with at least one other of the N devices, the first device is any one of the N devices, and N is an integer greater than 2; the method comprising: displaying, by the first device, a first interface, wherein the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device, and M is an integer greater than or equal to 0; receiving a first drag operation acting on a target file on the first interface; notifying the other devices among the N devices to listen for a release position of the first drag operation, wherein the release position includes the interface of any one of the N devices or a collaboration window; detecting the release position of the first drag operation; and controlling the target file to be sent to the device, among the N devices, that matches the release position.
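As an illustrative sketch (not part of the claims), the routing step of claim 1 — mapping a release position to the matching device, whether the drop lands on a device's own screen or on a collaboration window mirroring it — could look like the following; the function name and the window-to-device mapping are assumptions:

```python
# Hypothetical routing rule: a drag released on a device's screen targets that
# device; a drag released on a collaboration window targets the mirrored device.
def route_target_file(release_position, devices, collab_windows):
    """Return the device that should receive the target file.

    release_position: identifier of the surface the drag was released on
    devices: the N device identifiers in the collaboration system
    collab_windows: mapping window-id -> device the window mirrors (assumed)
    """
    if release_position in devices:          # released on a device's own screen
        return release_position
    if release_position in collab_windows:   # released on a collaboration window
        return collab_windows[release_position]
    return None                              # release position matched no device
```

A release on device "B" routes to "B" itself, while a release on a window mirroring "C" routes to "C".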
- The method of claim 1, wherein either the target file is a file stored on the first device and the start position of the first drag operation is on the display screen interface, or the target file is a file stored on a starting second device among the M second devices and the start position of the first drag operation is in the collaboration window corresponding to the starting second device.
- The method of claim 1 or 2, wherein the target file is a file stored on the first device, and controlling the target file to be sent to the device among the N devices that matches the release position comprises: when it is detected that the release position of the first drag operation is on the display screen interface of a third device, or in a collaboration window of the third device on another of the N devices, controlling the first device to send the target file to the third device, wherein the third device includes a device among the N devices that has not established collaboration with the first device.
- The method of any one of claims 1-3, further comprising: obtaining file information of the target file, wherein the file information includes the file name, file content, and file size of the target file; determining whether the device matching the release position satisfies a condition for receiving the target file; and if so, determining the storage path at which the device matching the release position receives the target file.
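The pre-transfer check of claim 4 can be sketched as follows, purely for illustration. The claim does not specify the receive condition or the path layout; the free-space test and the `/tmp/inbox` directory below are invented assumptions:

```python
import os

# Hypothetical receive-condition check: accept the file only if the target
# device has enough free space, then resolve a storage path for it.
def check_and_resolve_path(file_info, free_bytes, inbox="/tmp/inbox"):
    """Return the storage path if the device can receive the file, else None.

    file_info: dict with the file name, content, and size (per claim 4)
    free_bytes: free storage reported by the matched device (assumed metric)
    """
    if file_info["size"] > free_bytes:       # receive condition not satisfied
        return None
    return os.path.join(inbox, file_info["name"])
```

A 5-byte file is accepted when 10 bytes are free and rejected when only 3 are.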
- The method of claim 3 or 4, wherein controlling the first device to send the target file to the third device comprises: establishing a data transmission channel, wherein the data transmission channel is used to transmit the file information of the target file; if the first device has established a direct connection with the third device, sending the file information over the data transmission channel to the storage path of the third device; and if the first device has established an indirect connection with the third device, sending the file information over the data transmission channel to a relay device, which forwards the file information to the storage path of the third device, wherein the relay device is a device that has established direct connections with both the first device and the third device.
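The direct-versus-relay rule shared by claims 5 and 7 can be sketched as a one-hop lookup over a connection map. This is an illustrative assumption, not the claimed implementation; the adjacency structure and function name are invented:

```python
# Hypothetical next-hop selection: send directly when a direct connection
# exists; otherwise forward via a relay directly connected to both ends.
def next_hop(sender, target, direct_links):
    """direct_links: mapping device -> set of directly connected devices."""
    if target in direct_links.get(sender, set()):
        return target                        # direct connection: send directly
    for relay in direct_links.get(sender, set()):
        if target in direct_links.get(relay, set()):
            return relay                     # indirect: forward via the relay
    return None                              # no one-hop relay available
```

With links A-B and B-C, device A sends to B directly but reaches C through relay B.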
- The method of claim 1 or 2, wherein the target file is a file stored on a starting second device among the M second devices, and controlling the target file to be sent to the device among the N devices that matches the release position comprises: when it is detected that the release position of the first drag operation is on the display screen interface of a third device, or in a collaboration window of the third device on another of the N devices, controlling the starting second device to send the target file to the third device.
- The method of claim 4 or 6, wherein controlling the starting second device to send the target file to the third device comprises: establishing a data transmission channel, wherein the data transmission channel is used to transmit the file information of the target file; if the starting second device has established a direct connection with the third device, sending the file information over the data transmission channel to the storage path of the third device; and if the starting second device has established an indirect connection with the third device, sending the file information over the data transmission channel to a relay device, which forwards the file information to the storage path of the third device, wherein the relay device is a device that has established direct connections with both the starting second device and the third device.
- The method of any one of claims 1-7, further comprising: obtaining first information of the target file, wherein the first information includes one or more of file type information, file count information, and file ordering information of the target file; generating a set of drag effects for the target file according to the first information; and displaying, from the set of drag effects of the target file, the drag effect matching a fourth device, wherein the fourth device is a device over which the drag trajectory of the first drag operation passes, or the device corresponding to a collaboration window over which the trajectory passes.
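Claim 8's drag-effect generation can be sketched as below, for illustration only. The effect names, device classes, and matching rule are invented; the claim only requires that the first information (type, count, ordering) drives a set of effects from which a per-device effect is shown:

```python
# Hypothetical drag-effect generation: derive one preview effect per device
# class from the file type and count, then match the effect to the device the
# drag trajectory passes over.
def build_drag_effects(first_info):
    """Generate the set of drag effects from the first information."""
    label = f'{first_info["count"]} {first_info["type"]} file(s)'
    return {
        "phone": {"icon": "stack-small", "label": label},
        "pc": {"icon": "stack-large", "label": label},
    }

def effect_for_device(effects, device_class):
    """Pick the effect matching the fourth device the trajectory passes over."""
    return effects.get(device_class, effects["pc"])  # assumed default: pc style
```

Three image files dragged over a phone would show a small stack labeled "3 image file(s)"; an unknown device class falls back to the desktop style.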
- A file transfer method, applied to a third device in a multi-device collaboration system, wherein the system includes N devices, any one of the N devices establishes collaboration with at least one other of the N devices, the third device is any one of the N devices, and N is an integer greater than 2; the method comprising: receiving a notification, initiated by a first device, to listen for a release position of a first drag operation, wherein the first drag operation is a drag initiated on a target file on a first interface, the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device, and M is an integer greater than or equal to 0; listening for the release position of the first drag operation; receiving the target file sent under the control of the first device; and sending a broadcast notifying the other devices in the multi-device collaboration system that the target file has been successfully received.
- The method of claim 9, wherein receiving the target file sent under the control of the first device comprises: establishing a data transmission channel with the device storing the target file; and receiving file information of the target file, wherein the file information includes the file name, file content, and file size of the target file.
- A file transfer apparatus, applied to a first device in a multi-device collaboration system, wherein the system includes N devices, any one of the N devices establishes collaboration with at least one other of the N devices, the first device is any one of the N devices, and N is an integer greater than 2; the apparatus comprising: a first display unit, configured to display a first interface on the first device, wherein the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device, and M is an integer greater than or equal to 0; a first receiving unit, configured to receive a first drag operation acting on a target file on the first interface; a sending unit, configured to notify the other devices among the N devices to listen for the release position of the first drag operation, wherein the release position includes the interface of any one of the N devices or a collaboration window; and a first processing unit, configured to detect the release position of the first drag operation; the sending unit being further configured to control the target file to be sent to the device, among the N devices, that matches the release position.
- The apparatus of claim 11, wherein the target file is a file stored on the first device, and the first processing unit is specifically configured to: when it is detected that the release position of the first drag operation is on the display screen interface of a third device, or in a collaboration window of the third device on another of the N devices, control the first device to send the target file to the third device, wherein the third device includes a device among the N devices that has not established collaboration with the first device.
- The apparatus of claim 11 or 12, further comprising: a second receiving unit, configured to obtain file information of the target file, wherein the file information includes the file name, file content, and file size of the target file; and a second processing unit, configured to determine whether the device matching the release position satisfies a condition for receiving the target file and, if so, to determine the storage path at which the device matching the release position receives the target file.
- The apparatus of claim 13, wherein: the first processing unit is further configured to establish a data transmission channel, the data transmission channel being used to transmit the file information of the target file; and the sending unit is further configured to: if the first device has established a direct connection with the third device, send the file information over the data transmission channel to the storage path of the third device; and if the first device has established an indirect connection with the third device, send the file information over the data transmission channel to a relay device, which forwards the file information to the storage path of the third device, wherein the relay device is a device that has established direct connections with both the first device and the third device.
- The apparatus of claim 11, wherein the target file is a file stored on a starting second device among the M second devices, and the first processing unit is specifically configured to: when it is detected that the release position of the first drag operation is on the display screen interface of a third device, or in a collaboration window of the third device on another of the N devices, control the starting second device to send the target file to the third device.
- The apparatus of claim 15, wherein: the first processing unit is further configured to establish a data transmission channel, the data transmission channel being used to transmit the file information of the target file; and the sending unit is further configured to: if the starting second device has established a direct connection with the third device, send the file information over the data transmission channel to the storage path of the third device; and if the starting second device has established an indirect connection with the third device, send the file information over the data transmission channel to a relay device, which forwards the file information to the storage path of the third device, wherein the relay device is a device that has established direct connections with both the starting second device and the third device.
- The apparatus of any one of claims 11-16, further comprising: a third receiving unit, configured to obtain first information of the target file, wherein the first information includes one or more of file type information, file count information, and file ordering information of the target file; a third processing unit, configured to generate a set of drag effects for the target file according to the first information; and a second display unit, configured to display, from the set of drag effects of the target file, the drag effect matching a fourth device, wherein the fourth device is a device over which the drag trajectory of the first drag operation passes, or the device corresponding to a collaboration window over which the trajectory passes.
- A file transfer apparatus, applied to a third device in a multi-device collaboration system, wherein the system includes N devices, any one of the N devices establishes collaboration with at least one other of the N devices, the third device is any one of the N devices, and N is an integer greater than 2; the apparatus comprising: a receiving unit, configured to receive a notification, initiated by a first device, to listen for the release position of a first drag operation, wherein the first drag operation is a drag initiated on a target file on a first interface, the first interface includes a display screen interface of the first device and collaboration windows corresponding to M second devices that have established collaboration with the first device, and M is an integer greater than or equal to 0; a processing unit, configured to listen for the release position of the first drag operation; the receiving unit being further configured to receive the target file sent under the control of the first device; and a sending unit, configured to send a broadcast notifying the other devices in the multi-device collaboration system that the target file has been successfully received.
- The apparatus of claim 18, wherein: the processing unit is further configured to establish a data transmission channel with the device storing the target file; and the receiving unit is further configured to receive file information of the target file, wherein the file information includes the file name, file content, and file size of the target file.
- An electronic device, comprising a processor, a memory, and a communication interface, wherein the memory is configured to store file transfer program code, and the processor is configured to invoke the file transfer program code to perform the method of any one of claims 1-8 or 9-10.
- A chip system, comprising at least one processor, a memory, and an interface circuit, wherein the memory, the interface circuit, and the at least one processor are interconnected by lines, and the memory stores instructions that, when executed by the processor, implement the method of any one of claims 1-8 or 9-10.
- A computer storage medium, storing a computer program that, when executed by a processor, implements the method of any one of claims 1-8 or 9-10.
- A computer program, comprising instructions that, when executed by a computer, cause the computer to perform the method of any one of claims 1-8 or 9-10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21871528.2A EP4209890A4 (en) | 2020-09-28 | 2021-09-23 | FILE TRANSFER METHOD AND ASSOCIATED APPARATUS |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011045443.4 | 2020-09-28 | ||
CN202011045443.4A CN114356195B (zh) | 2020-09-28 | File transfer method and related device
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022063159A1 true WO2022063159A1 (zh) | 2022-03-31 |
Family
ID=80844953
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/119830 WO2022063159A1 (zh) | 2020-09-28 | 2021-09-23 | 一种文件传输的方法及相关设备 |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4209890A4 (zh) |
CN (1) | CN114356195B (zh) |
WO (1) | WO2022063159A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115314494A (zh) * | 2022-08-04 | 2022-11-08 | 三星电子(中国)研发中心 | Multi-device collaborative working method and apparatus |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114760291B (zh) * | 2022-06-14 | 2022-09-13 | 深圳乐播科技有限公司 | File processing method and apparatus |
CN117692551A (zh) * | 2022-09-02 | 2024-03-12 | 荣耀终端有限公司 | Data transmission method and terminal device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150356949A1 (en) * | 2014-06-10 | 2015-12-10 | Samsung Electronics Co., Ltd. | Method and apparatus for processing information of electronic device |
CN107425942A (zh) * | 2017-07-27 | 2017-12-01 | 广州视源电子科技股份有限公司 | Method and apparatus for sending, forwarding, and transmitting data |
CN110602805A (zh) * | 2019-09-30 | 2019-12-20 | 联想(北京)有限公司 | Information processing method, first electronic device, and computer system |
CN110618970A (zh) * | 2019-09-12 | 2019-12-27 | 联想(北京)有限公司 | File transmission method and electronic device |
CN110703966A (zh) * | 2019-10-17 | 2020-01-17 | 广州视源电子科技股份有限公司 | File sharing method, apparatus, system, corresponding device, and storage medium |
CN112527221A (zh) * | 2019-09-18 | 2021-03-19 | 华为技术有限公司 | Data transmission method and related device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2632188B1 (en) * | 2012-02-24 | 2018-04-11 | BlackBerry Limited | Method and apparatus for interconnected devices |
US9288299B2 (en) * | 2012-09-14 | 2016-03-15 | Dewmobile, Inc. | Method and apparatus for file sharing in a network |
KR102390082B1 (ko) * | 2015-07-14 | 2022-04-25 | 삼성전자주식회사 | Method of operating an electronic device, and electronic device |
CN105183343A (zh) * | 2015-08-25 | 2015-12-23 | 努比亚技术有限公司 | Apparatus and method for processing touch-point report information |
CN105892851A (zh) * | 2016-03-29 | 2016-08-24 | 北京金山安全软件有限公司 | Visual resource transmission method, apparatus, and electronic device |
- 2020-09-28 CN CN202011045443.4A patent/CN114356195B/zh active Active
- 2021-09-23 EP EP21871528.2A patent/EP4209890A4/en active Pending
- 2021-09-23 WO PCT/CN2021/119830 patent/WO2022063159A1/zh unknown
Non-Patent Citations (1)
Title |
---|
See also references of EP4209890A4 |
Also Published As
Publication number | Publication date |
---|---|
EP4209890A1 (en) | 2023-07-12 |
CN114356195B (zh) | 2024-03-26 |
CN114356195A (zh) | 2022-04-15 |
EP4209890A4 (en) | 2024-02-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021013158A1 (zh) | Display method and related apparatus | |
WO2021052147A1 (zh) | Data transmission method and related device | |
US11922005B2 (en) | Screen capture method and related device | |
WO2021139768A1 (zh) | Interaction method for cross-device task processing, electronic device, and storage medium | |
WO2021103981A1 (zh) | Split-screen display processing method and apparatus, and electronic device | |
WO2021036770A1 (zh) | Split-screen processing method and terminal device | |
WO2019072178A1 (zh) | Notification processing method and electronic device | |
WO2022063159A1 (zh) | File transfer method and related device | |
EP4199499A1 (en) | Image capture method, graphical user interface, and electronic device |
WO2022017393A1 (zh) | Display interaction system, display method, and device | |
WO2020238759A1 (zh) | Interface display method and electronic device | |
WO2022042326A1 (zh) | Display control method and related apparatus | |
WO2022001619A1 (zh) | Screenshot method and electronic device | |
WO2022012418A1 (zh) | Photographing method and electronic device | |
WO2023030099A1 (zh) | Cross-device interaction method and apparatus, screen projection system, and terminal | |
WO2021218544A1 (zh) | System and method for providing wireless internet access, and electronic device | |
JP7543442B2 (ja) | Content sharing method, apparatus, and system |
WO2021143391A1 (zh) | Screen sharing method based on video call, and mobile device | |
WO2021143650A1 (zh) | Data sharing method and electronic device | |
US20240143262A1 (en) | Splicing Display Method, Electronic Device, and System |
US20230236714A1 (en) | Cross-Device Desktop Management Method, First Electronic Device, and Second Electronic Device |
CN115016697A (zh) | Screen projection method, computer device, readable storage medium, and program product |
WO2022152174A1 (zh) | Screen projection method and electronic device |
WO2023020012A1 (zh) | Data communication method between devices, device, storage medium, and program product |
WO2023169237A1 (zh) | Screenshot method, electronic device, and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21871528 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 2021871528 Country of ref document: EP Effective date: 20230404 |