CN114338642B - File transmission method and electronic equipment

Info

Publication number: CN114338642B
Application number: CN202011018822.4A
Authority: CN (China)
Other languages: Chinese (zh)
Other versions: CN114338642A
Inventors: 谢雨, 路扬
Assignee (original and current): Huawei Technologies Co Ltd
Legal status: Active
Prior art keywords: file, equipment, azimuth, target, image
Application filed by Huawei Technologies Co Ltd; priority to CN202011018822.4A and PCT/CN2021/117200 (WO2022062902A1). The application was published as CN114338642A and granted as CN114338642B.

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/06Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication


Abstract

The application relates to the field of electronic technology and provides a file transmission method and an electronic device. The method comprises the following steps: a first device acquires a first image; if the first image contains a target device and a target file displayed on the screen of the target device, the first device searches, among second devices having a short-range communication connection with the first device, for third devices that meet a preset azimuth condition; the first device sends a screen information query request to each third device, the request instructing the third device to feed back a current screenshot; the first device receives the screenshots fed back by the third devices, matches the first image against them, and, if a matching screenshot exists, determines the third device corresponding to the matching screenshot as the target device; the first device then sends a file transfer request to the target device, the request instructing the target device to send the target file to the first device. The method effectively improves the efficiency of cross-device file transfer.

Description

File transmission method and electronic equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a file transmission method and an electronic device.
Background
With the development of communication technology and rising living standards, people typically own several terminal devices, such as smartphones and tablet computers. These devices provide services such as file storage and file preview, and cross-device file transfer has become a common everyday operation.
However, existing cross-device file transfer has drawbacks. For example, when a user owns several terminal devices, the user must select the target device from a list of those devices before the transfer can complete, which makes the operation inefficient.
Disclosure of Invention
The application discloses a file transmission method, an electronic device, and a computer-readable storage medium. The method effectively reduces user operations and improves the efficiency of cross-device file transfer.
In a first aspect, an embodiment of the present application provides a file transmission method applied to a first device, including: the first device acquires a first image; if the first image contains a target device and a target file displayed on the screen of the target device, the first device searches, among second devices having a short-range communication connection with the first device, for third devices that meet a preset azimuth condition; the first device sends a screen information query request to the third device, the request instructing the third device to feed back a current screenshot; the first device receives the screenshot fed back by the third device and matches the first image against it; if a matching screenshot exists, the first device determines the third device corresponding to the matching screenshot as the target device; and the first device sends a file transfer request to the target device, the request instructing the target device to send the target file to the first device.
In this embodiment, the first device first narrows the search for the target device by looking, among the second devices connected to it over short-range communication, for third devices that meet a preset azimuth condition; it then confirms the target device from the screenshots fed back by those third devices and completes the transfer of the target file. The user can therefore complete a cross-device file transfer without manually selecting the target device, which reduces user operations, improves the efficiency of cross-device file transfer, and makes the method highly practical and easy to use.
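As a minimal sketch of this first-device flow, the following Python illustrates the narrowing-then-matching logic; the Peer type, the helper callables, and the 15° threshold are assumptions for illustration, not interfaces defined by the embodiment.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Peer:
    device_id: str
    first_azimuth_deg: float        # azimuth fed back by the second device
    query_screenshot: Callable[[], bytes]
    request_file: Callable[[dict], bytes]

def find_target_and_fetch(
    peers: list[Peer],
    second_azimuth_deg: float,      # visual azimuth estimated from the first image
    first_image: bytes,
    file_info: dict,
    match_images: Callable[[bytes, bytes], bool],
    angle_threshold_deg: float = 15.0,
) -> Optional[bytes]:
    # Step 1: keep only the "third devices" meeting the preset azimuth condition.
    third_devices = [p for p in peers
                     if abs(p.first_azimuth_deg - second_azimuth_deg) < angle_threshold_deg]
    # Step 2: query each third device's screenshot and match it to the first image.
    for dev in third_devices:
        if match_images(first_image, dev.query_screenshot()):
            # Step 3: the matched device is the target; request the file from it.
            return dev.request_file(file_info)
    return None
```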
It should be noted that the target file includes, but is not limited to, a document, audio, video, audio-video, or a web page; the file type of the target file is not specifically limited in this embodiment.
It should further be noted that in this embodiment the target file is in an open or playing state. When two or more files are displayed on the screen of the target device, the target file may be all files displayed on the screen, the file running in the foreground of the target device, or the file located at the current cursor position; this is not specifically limited here.
In a first possible implementation of the first aspect, the screen information query request further instructs the third device to feed back currently displayed first file information, and the method further includes, after the target device is determined: the first device receives the currently displayed first file information fed back by the target device, matches the first file information against file information extracted from the first image, and, if the match succeeds, generates a file transfer request containing the successfully matched first file information.
Illustratively, the file information includes, but is not limited to, the file type, file name, and file path. After receiving the currently displayed first file information fed back by the third device, the first device matches the file information extracted from the first image against the first file information to determine whether matching first file information exists; for example, file information with the same file type and file name is treated as a match. If matching first file information exists, that is, the match succeeds, the first device generates a file transfer request containing the successfully matched first file information. The request instructs the target device to send the first device a target file whose name is the file name in the first file information.
In this embodiment, the first device receives the currently displayed first file information fed back by the third device and, after matching the file information extracted from the first image against it, generates a file transfer request containing the successfully matched first file information. Once the first device has determined the target device, it sends the file transfer request to the target device, so that the target device can quickly locate the file with the given name under the file path provided in the first file information and use it as the target file, improving file transfer efficiency.
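A hedged sketch of this file-information matching, assuming illustrative field names ("type", "name", "path") rather than a format defined by the embodiment:

```python
from typing import Optional

def match_file_info(extracted: dict, candidates: list[dict]) -> Optional[dict]:
    """Return the first candidate whose type and name match the info
    extracted (e.g. by OCR) from the first image, else None."""
    for info in candidates:
        if (info.get("type") == extracted.get("type")
                and info.get("name") == extracted.get("name")):
            return info
    return None

def build_transfer_request(matched: dict) -> dict:
    # The request carries the matched file info so the target device can
    # locate the file quickly from the provided path.
    return {"action": "file_transfer", "file": matched}
```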
In a second possible implementation of the first aspect, the searching, by the first device if the first image contains a target device and a target file displayed on the screen of the target device, for a third device meeting a preset azimuth condition among the second devices having a short-range communication connection with the first device includes: if the first image contains a target device and a target file displayed on the screen of the target device, the first device sends an azimuth query request to the second devices, the request instructing each second device to feed back its current first azimuth, where the first azimuth is the azimuth of the second device relative to the first device and is determined according to the signal strength of the short-range communication between the second device and the first device; and the first device receives the first azimuths fed back by the second devices and searches, according to the first azimuths, for a third device meeting the preset azimuth condition among the second devices.
In this embodiment, after obtaining the first image, the first device determines whether the first image contains a target device and a target file displayed on the screen of the target device. If it does, the first device sends an azimuth query request to the second devices; after receiving the request, each second device feeds back its current first azimuth to the first device. The first device can then search, according to the first azimuths, for the third devices meeting the preset azimuth condition among the second devices, which effectively narrows the range in which the target device is sought and improves the efficiency and accuracy of the search.
In a third possible implementation of the first aspect, the receiving, by the first device, of the first azimuths fed back by the second devices and the searching, according to the first azimuths, for a third device meeting the preset azimuth condition among the second devices includes: the first device obtains a second azimuth and matches the second azimuth against the first azimuths; if a matching first azimuth exists, the first device determines the second device corresponding to the matching first azimuth as a third device meeting the preset azimuth condition, where the second azimuth is the visual azimuth between the first device and the target device.
In an embodiment of the application, the second azimuth is a visual azimuth between the first device and the target device determined based on the first image.
Illustratively, the acquiring, by the first device, of the second azimuth comprises: the first device determines the position information of the target device in the first image based on a trained deep neural network and calculates the second azimuth from the position information.
It should be noted that the deep neural network model is obtained by training on images containing devices as training samples.
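For illustration, one standard way to turn the detected position into the second azimuth is the pinhole-camera model; the intrinsics cx and fx below are example values, not parameters prescribed by the embodiment.

```python
import math

def visual_azimuth_deg(bbox_center_x: float, cx: float, fx: float) -> float:
    """bbox_center_x: horizontal pixel of the detected device's center,
    cx: principal point x (pixels), fx: focal length (pixels)."""
    return math.degrees(math.atan2(bbox_center_x - cx, fx))

# Example: a device detected 400 px right of a 1920-px-wide image's center,
# with fx = 1500 px, lies about atan(400/1500) ≈ 14.9° to the right.
print(visual_azimuth_deg(1360.0, cx=960.0, fx=1500.0))
```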
Illustratively, the acquiring, by the first device, of the second azimuth may instead comprise: the first device acquires a third image, the third image being a binocular image captured by the first device when shooting the first image; the first device determines the depth information of the target device based on a binocular vision positioning method and calculates the second azimuth from the depth information.
In this embodiment, the third image is a binocular image obtained with any two cameras of the electronic device; that is, the third image comprises a left image and a right image, corresponding to what a person's left and right eyes would see. Information about the real-world environment, in particular the depth of the target device, can be extracted from the binocular images.
It should be noted that when the first device acquires the third image, it performs binocular calibration and rectification on the image and then performs stereo matching to obtain the depth information of the target device, so that the first device can obtain an accurate second azimuth.
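A sketch of this binocular route under a rectified-stereo assumption: depth from disparity (Z = f·B/d), then the azimuth from the lateral offset; the camera parameters below are illustrative.

```python
import math

def stereo_azimuth_deg(u_left: float, u_right: float,
                       cx: float, fx: float, baseline_m: float) -> float:
    disparity = u_left - u_right                  # pixels, after rectification
    depth_m = fx * baseline_m / disparity         # Z = f * B / d
    lateral_m = (u_left - cx) * depth_m / fx      # X from the pinhole model
    return math.degrees(math.atan2(lateral_m, depth_m))

# Example: disparity 30 px, fx = 1500 px, baseline 0.12 m -> depth 6 m,
# lateral offset 1.6 m, azimuth ≈ 14.9°.
print(round(stereo_azimuth_deg(1360.0, 1330.0, cx=960.0, fx=1500.0,
                               baseline_m=0.12), 1))
```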
Illustratively, in the third possible implementation of the first aspect, the first device may determine the third device meeting the preset azimuth condition as follows: the first device calculates the similarity between the second azimuth and each first azimuth; if a calculated similarity is smaller than a first threshold, the first device determines that the first azimuth corresponding to that similarity matches the second azimuth, and determines the second device corresponding to the matching first azimuth as a third device meeting the preset azimuth condition.
Alternatively, the first device calculates the angle difference between the second azimuth and each first azimuth; if a calculated angle difference is smaller than a second threshold, the first device determines that the first azimuth corresponding to that angle difference matches the second azimuth, and determines the second device corresponding to the matching first azimuth as a third device meeting the preset azimuth condition.
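A sketch of the angle-difference matching just described; the 10° second threshold is an assumed value.

```python
def matching_peers(first_azimuths_deg: dict[str, float],
                   second_azimuth_deg: float,
                   second_threshold_deg: float = 10.0) -> list[str]:
    """Return the device ids whose fed-back first azimuth differs from the
    visual second azimuth by less than the threshold."""
    def angle_diff(a: float, b: float) -> float:
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)            # wrap-around: 359° vs 1° -> 2°
    return [dev for dev, az in first_azimuths_deg.items()
            if angle_diff(az, second_azimuth_deg) < second_threshold_deg]
```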
In a second aspect, an embodiment of the present application provides another file transmission method, applied to a second device, including: after receiving a screen information query request sent by a first device having a short-range communication connection with the second device, the second device obtains a current screenshot and feeds it back to the first device, the screen information query request instructing the second device to feed back the current screenshot; and after receiving a file transfer request sent by the first device, the second device sends a target file to the first device, the file transfer request instructing the second device to send the target file to the first device.
In this embodiment, after receiving the screen information query request sent by the first device connected to it over short-range communication, the second device obtains a current screenshot and feeds it back to the first device, so that the first device can determine the target device among the second devices from the screenshots. After receiving the file transfer request sent by the first device, the target device locates the target file and sends it to the first device. The target file is thus transferred quickly between the first device and the second device, improving the efficiency of cross-device file transfer.
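A hedged sketch of the second-device (responder) side; the message shapes and the capture_screenshot/read_file helpers are assumptions, not APIs defined by the embodiment.

```python
def handle_request(device, message: dict) -> dict:
    if message["type"] == "screen_info_query":
        # Feed back the current screenshot (and, optionally, the displayed
        # file information, per the first implementation of the first aspect).
        return {"type": "screen_info", "screenshot": device.capture_screenshot()}
    if message["type"] == "file_transfer":
        name = message["file"]["name"]
        path = message["file"]["path"]
        return {"type": "file_data", "name": name, "data": device.read_file(path)}
    return {"type": "error", "reason": "unknown request"}
```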
In a first possible implementation of the second aspect, before the second device receives the screen information query request sent by the first device connected to it over short-range communication, the method further includes: after receiving an azimuth query request sent by the first device, the second device determines a first azimuth according to the signal strength of the short-range communication between the second device and the first device and feeds the first azimuth back to the first device, so that the first device can determine, according to the first azimuth, whether the second device meets the preset azimuth condition and, if it does, send the screen information query request to the second device. The first azimuth is the azimuth of the second device relative to the first device.
In this embodiment, after receiving the azimuth query request sent by the first device, the second device determines the first azimuth according to the signal strength of the short-range communication between the two devices and feeds the determined first azimuth back to the first device. The first device can thus determine the direction of the second device from the first azimuth and search for the second devices meeting the preset azimuth condition, which narrows the range in which the target device is sought among the second devices, improves the efficiency and accuracy of the search, and thereby improves the efficiency of file transfer.
Illustratively, after receiving the azimuth query request sent by the first device, the second device obtains the signal strength of its short-range communication connection with the first device, calculates the distance between the two devices from the obtained signal strength, and calculates the first azimuth from the calculated distance.
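One common way to realize the distance-from-signal-strength step is the log-distance path-loss model; the embodiment does not mandate a particular model, so the formula and constants below are assumptions.

```python
def distance_from_rssi(rssi_dbm: float,
                       tx_power_dbm: float = -59.0,  # calibrated RSSI at 1 m
                       path_loss_exponent: float = 2.0) -> float:
    """Estimated distance in meters: d = 10 ** ((P_1m - RSSI) / (10 * n))."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

# Example: an RSSI of -71 dBm with the defaults gives roughly 4 m.
print(round(distance_from_rssi(-71.0), 1))
```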
It should be noted that the signal strength of the short-range communication connection includes, but is not limited to, the strength of a Bluetooth signal, a WIFI signal, or a wireless access point signal.
In this embodiment, the second device is located based on the signal strength of the short-range connection, so that the first device can determine, from among the second devices, the third devices meeting the preset azimuth condition. This narrows the set of devices examined in the subsequent determination of the target device and improves the efficiency of that determination.
In a second possible implementation of the second aspect, the determining, by the second device after receiving the azimuth query request sent by the first device, of the first azimuth according to the signal strength of the short-range communication between the second device and the first device includes: the second device broadcasts its Bluetooth signal after receiving the azimuth query request sent by the first device; and the second device establishes a Bluetooth connection with the first device through the Bluetooth signal and calculates the first azimuth from the Bluetooth signal strength.
In a third possible implementation of the second aspect, the determining, by the second device after receiving the azimuth query request sent by the first device, of the first azimuth according to the signal strength of the short-range communication between the second device and the first device includes: the second device broadcasts its WIFI signal upon detecting the azimuth query request sent by the first device; and the second device establishes a WIFI connection with the first device through the WIFI signal and calculates the first azimuth from the WIFI signal strength.
In a fourth possible implementation of the second aspect, the receiving, by the second device, of the file transfer request sent by the first device and the sending of the target file to the first device includes: the second device determines whether its current account information is consistent with the account information of the first device; if so, the second device sends the target file to the first device; if not, the second device generates a file-sending confirmation request and feeds it back to the first device, the confirmation request instructing the first device to confirm whether the target file should be transferred.
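A brief sketch of this account check; the account_id field and message names are assumptions.

```python
def on_file_transfer_request(second_dev, request: dict) -> dict:
    if second_dev.account_id == request.get("account_id"):
        # Same account on both devices: send the target file directly.
        return {"type": "file_data",
                "data": second_dev.read_file(request["file"]["path"])}
    # Accounts differ: ask the first device to confirm before transferring.
    return {"type": "confirm_send", "file": request["file"]}
```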
Before the second device sends the target file to the first device, the method may further include: obtaining the current node information of the target file and generating file node information, where the node information includes, but is not limited to, the file page number, the cursor position, the file progress bar, and the like. The second device sends the file node information together with the target file to the first device, so that when the first device opens the target file it can resume display from the recorded node.
For example, before sending the target file to the first device, the second device determines whether the target file is of a preset file type, such as a video or audio file. If it is, the second device obtains the current node information of the target file, generates file node information including the file name and the playing position, and sends the file node information to the first device, so that the first device resumes playing the target file corresponding to the file name, for example a video named XXX, from the recorded playing position.
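A sketch of the file node information used for resuming playback; the JSON field names are illustrative assumptions.

```python
import json

def make_node_info(file_name: str, file_type: str, position_s: float) -> str:
    """For preset types (e.g. video/audio), record the name and playing
    position so the receiver can continue playback from the same point."""
    return json.dumps({"name": file_name, "type": file_type,
                       "play_position_s": position_s})

# Example: the receiver of this node info would resume "XXX.mp4" at 754 s.
print(make_node_info("XXX.mp4", "video", 754.0))
```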
In this embodiment, the second device sends the file node information to the first device, so that the first device can resume display of the file from the received node information, improving the user experience.
In a third aspect, the application provides a file transfer apparatus, applied to a first device, including: an image acquisition unit, configured to acquire a first image; a device searching unit, configured to search, among second devices having a short-range communication connection with the first device, for a third device meeting a preset azimuth condition if the first image contains a target device and a target file displayed on the screen of the target device; a screen information query request sending unit, configured to send a screen information query request to the third device, the request instructing the third device to feed back a current screenshot; a device confirming unit, configured to receive the screenshot fed back by the third device, match the first image against the screenshot, and, if a matching screenshot exists, determine the third device corresponding to the matching screenshot as the target device; and a file transfer request sending unit, configured to send a file transfer request to the target device, the request instructing the target device to send the target file to the first device.
In a fourth aspect, the present application provides a file transfer apparatus, applied to a second device, including: a screenshot feedback unit, configured to obtain a current screenshot after receiving a screen information query request sent by a first device having a short-range communication connection with the second device, and to feed the screenshot back to the first device, the request instructing the second device to feed back the current screenshot; and a target file sending unit, configured to send a target file to the first device after the second device receives a file transfer request sent by the first device, the request instructing the second device to send the target file to the first device.
In a fifth aspect, the present application provides an electronic device, including: a processor and a memory coupled to each other, the memory being configured to store a computer program (also referred to as instructions or code) which, when executed by the processor, causes the electronic device to perform the method provided by the first aspect or any possible implementation of the first aspect.
In a sixth aspect, the present application provides an electronic device, including: a processor and a memory coupled to each other, the memory being configured to store a computer program (also referred to as instructions or code) which, when executed by the processor, causes the electronic device to perform the method provided by the second aspect or any possible implementation of the second aspect.
In a seventh aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program runs on an electronic device, the computer program causes the electronic device to perform the method as provided in the first aspect, the second aspect, any one of the possible implementation manners of the first aspect, or any one of the possible implementation manners of the second aspect.
In an eighth aspect, an embodiment of the present application provides a chip, which includes a processor, and when the processor reads and executes a computer program stored in a memory, the method as provided in the first aspect, the second aspect, any one of the possible implementations of the first aspect, or any one of the possible implementations of the second aspect is implemented.
In a ninth aspect, an embodiment of the present application provides a chip system, including a memory and a processor; when the chip system runs, the electronic device is caused to perform the method provided by the first aspect, the second aspect, or any possible implementation of either aspect. The chip system may be a single chip or a chip module formed of a plurality of chips.
It is to be understood that the file transfer apparatus of the third aspect, the file transfer apparatus of the fourth aspect, the electronic device of the fifth aspect, the electronic device of the sixth aspect, the computer storage medium of the seventh aspect, and the chip of the eighth aspect are all configured to execute the method provided by the first aspect, the second aspect, or any possible implementation thereof. For the beneficial effects achieved, reference may therefore be made to the beneficial effects of the corresponding method, which are not repeated here.
Drawings
The drawings used in the embodiments of the present application are described below.
Fig. 1 is a schematic hardware structure diagram of an electronic device 100 provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a software structure of the electronic device 100 according to the embodiment of the present application;
fig. 3 is a schematic flowchart of a file transfer method according to an embodiment of the present application;
FIG. 4 is a schematic flow chart illustrating another file transfer method according to an embodiment of the present application;
fig. 5 is a schematic view of a file transmission scenario provided in an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a document transportation device according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of another file transfer device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application are described below with reference to the drawings. The terminology used in the description of the embodiments is for the purpose of describing particular embodiments only and is not intended to limit the application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The electronic device related to the embodiment of the present application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a Personal Digital Assistant (PDA), a wearable electronic device, a virtual reality device, or the like. The specific type of electronic device is not limited in this application.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in this embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be independent devices or may be integrated into one or more processors.
The controller may be the neural center and command center of the electronic device 100. The controller can generate operation control signals according to the instruction operation code and timing signals, completing the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, a camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of receiving a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 170 and wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to implement the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. Processor 110 and display screen 194 communicate via a DSI interface to implement display functions of electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the interface connection relationships between the modules illustrated in this embodiment of the present application are merely examples and do not constitute a limitation on the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt an interface connection manner different from that in the foregoing embodiment, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
In some embodiments, the display screen 194 of fig. 1 may be bent when the display panel is made of OLED, AMOLED, FLED, or the like. Here, the display screen 194 may be bent, which means that the display screen may be bent at any position to any angle and may be maintained at the angle. For example, the display screen 194 may be folded in half from the middle left and right, or may be folded in half from the middle up and down. In the embodiment of the present application, the display screen that can be folded is referred to as a foldable display screen. The foldable display screen may be a single screen, or a display screen formed by splicing a plurality of screens, which is not limited herein.
In some embodiments, the electronic device 100 may determine whether the foldable display screen is in the folded configuration or in the unfolded configuration through one or more of a gravity sensor, an acceleration sensor, and a gyroscope, and may also determine whether the foldable display screen is in the portrait screen display state or in the landscape screen display state. The electronic device 100 may further detect a bending angle of the foldable display screen through a gravity sensor, an acceleration sensor, and a gyroscope, and then the electronic device 100 may determine whether the foldable display screen is in the folded state or the unfolded state according to the bending angle. The electronic device 100 may further determine the orientation of the foldable display screen in the folded state through one or more of a gravity sensor, an acceleration sensor, and a gyroscope, so as to determine a display area of the interface content output by the display system. For example, when the first screen region of the foldable display screen faces upward with respect to the ground, the electronic device 100 may display the interface content output by the display system on the first screen region. When the second screen area of the foldable display screen faces upward relative to the ground, the electronic device 100 may display the interface content output by the display system on the second screen area.
In some embodiments, the electronic device 100 may further include an angle sensor (not shown in fig. 1) that may be disposed at a bend of the foldable display screen. The electronic device 100 may measure an included angle formed between two ends of the middle bending portion of the foldable display screen by an angle sensor (not shown in fig. 1) disposed at the bending portion of the foldable display screen, and when the included angle is greater than or equal to the first angle, the electronic device 100 may recognize that the foldable display screen enters the unfolded state by the angle sensor. When the included angle is smaller than or equal to the first angle, the electronic device 100 may recognize that the foldable display screen enters the folded state through the angle sensor.
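A minimal sketch of this angle-based fold-state decision; the 150° value of the "first angle" is an assumed threshold, not one fixed by the embodiment.

```python
FIRST_ANGLE_DEG = 150.0   # illustrative threshold for the "first angle"

def fold_state(bend_angle_deg: float) -> str:
    """Angle between the two ends of the bending portion: large -> unfolded."""
    return "unfolded" if bend_angle_deg >= FIRST_ANGLE_DEG else "folded"

print(fold_state(178.0))  # unfolded
print(fold_state(40.0))   # folded
```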
In some other embodiments, the electronic device 100 may also recognize whether the foldable display screen is in the folded state through a physical switch disposed at the bending portion of the screen. For example, when the electronic device receives a user's folding operation on the foldable display screen and the physical switch disposed on the device is triggered open, the electronic device 100 may determine that the foldable display screen is in the folded state; when the electronic device 100 receives a user's unfolding operation and the physical switch is triggered closed, the electronic device may determine that the foldable display screen is in the unfolded state. These examples are merely illustrative and should not be construed as limiting.
Taking a foldable display screen with two screen regions as an example: when the foldable display screen is in the unfolded state, it may display content full-screen, in a partial area (for example, the first screen region or the second screen region), or in two or more partial areas. In one possible implementation, when the foldable display screen displays interface content full-screen, the content may occupy only part of the display area; for example, when the display screen 194 is a notch screen, the middle portion of the screen displays the interface content while one or both edge portions remain blank, and the foldable display screen may still be regarded as displaying the interface content full-screen.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, it processes input information quickly and can also learn continuously by itself. Applications such as intelligent cognition of the electronic device 100 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert an audio electrical signal into a sound signal. The electronic device 100 can play music or answer a hands-free call through the speaker 170A.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also called a "mike" or "mic", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking close to the microphone 170C. At least one microphone 170C may be disposed in the electronic device 100. In other embodiments, two microphones 170C may be disposed in the electronic device 100, to implement a noise reduction function in addition to collecting sound signals. In other embodiments, three, four, or more microphones 170C may alternatively be disposed in the electronic device 100, to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but have different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
The gyro sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (namely, the x, y, and z axes) may be determined through the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates, based on the angle, the distance that the lens module needs to compensate for, and allows the lens to counteract the shake of the electronic device 100 through reverse movement, thereby implementing image stabilization. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates the altitude from the air pressure value measured by the air pressure sensor 180C, to assist in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip leather case using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon opening may then be set according to the detected opening or closing state of the leather case or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically on three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, in a photographing scenario, the electronic device 100 may use the distance sensor 180F to measure the distance for fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode, and detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light brightness. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application lock access, fingerprint photographing, fingerprint-based call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may alternatively be disposed on a surface of the electronic device 100, at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be disposed in an earphone, combined into a bone conduction earphone. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, to implement a voice function. The application processor may parse heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive a key input, and generate a key signal input related to the user settings and function control of the electronic device 100.
The motor 191 may generate a vibration prompt. The motor 191 may be used for an incoming-call vibration prompt, and may also be used for touch vibration feedback. For example, touch operations acting on different applications (such as photographing and audio playing) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations acting on different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, and gaming) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time; the types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and is also compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card, to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, that is, an embedded SIM card. The eSIM card may be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present invention uses an Android system with a layered architecture as an example to exemplarily illustrate a software structure of the electronic device 100.
Fig. 2 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and so on.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions for the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which may disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or present a notification on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is made, the electronic device vibrates, or an indicator light flashes.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library includes two parts: one part is the functions that need to be called by the Java language, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes exemplary workflow of the software and hardware of the electronic device 100 in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as touch coordinates and a timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking an example in which the touch operation is a touch click operation and the control corresponding to the click operation is the control of the camera application icon: the camera application calls an interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer, and a still image or video is captured through the camera 193.
Currently, when files are shared among multiple devices, after a user taps an AirDrop function icon on the desktop of a device A that sends the files, a name list of a series of devices in communication connection (for example, Bluetooth or WIFI) with the device A is displayed on the desktop of the device A, and the user can select, from the name list, a device B that receives the files, thereby completing one-to-one file transmission. However, when file transfer between devices is completed in this manner, the user needs to perform many operation steps; in particular, when many devices are in communication connection with the device A, the user needs to find the device B that receives the file from a lengthy name list.
For example, in a specific application scenario, when a user needs to share a document displayed on the current screen of the device A with the device B, the user first needs to tap the sharing icon displayed on the current screen of the device A and then select the air-drop function icon. At this time, a name list of a series of devices connected to the device A via Bluetooth or WIFI is displayed on the screen of the device A, and the user needs to search the name list for the device B to share with and select it. After the user selects the device B, the device B receives the document transmitted by the device A, and a mode of opening the document can be selected. In this process of sharing a file of the device A with the device B, the user needs to perform multiple operations to transmit the file from the device A to the device B, so the efficiency is low and the user experience is poor. In particular, when there are many devices around the device A that are in communication connection with the device A, the user needs to find the name of the device B from a long name list, which takes more time and greatly reduces the user experience.
In order to solve the above technical problem, the following technical solutions are provided in the embodiments of the present application, and specific contents thereof can be referred to below.
The technical solution provided in the embodiments of the present application may be applied to various communication systems, for example, a new radio (NR) communication system adopting the fifth generation (5th generation, 5G) communication technology, a future evolution system, or a multi-communication convergence system. The technical solution provided in the present application can be applied to various application scenarios, for example, machine-to-machine (M2M), macro-micro communication, enhanced mobile broadband (eMBB), ultra-reliable and low-latency communication (URLLC), and massive machine-type communication (mMTC). These scenarios may include, but are not limited to: communication scenarios between communication devices, between network devices, and between network devices and communication devices, and so on. The following description takes application to a communication scenario between a first device and a second device as an example.
First, in this embodiment of the application, the first device is by default a receiving device, that is, a device that receives a file sent by another device, and the second device is by default a sending device, that is, a device that transmits its file to another device. A distributed collaborative service runs on the first device and is used for command distribution, file reception, identity authentication, provision of communication services, and the like. For example, the distributed collaborative service is responsible for the image recognition and processing in the camera visual recognition mode, communicates with the client of the second device, queries the current active task of the second device, receives files transmitted by the second device, and distributes execution commands to application programs, where the application programs are used to open the received files and can also continue displaying the received files. A distributed collaborative client runs on the second device and is used for command processing, file transmission, identity authentication, provision of communication services, and the like. For example, the distributed collaborative client is responsible for monitoring the current active task and performing file transmission after a command is initiated, and is used to establish a trusted connection between the first device and the second device. The short-range communication between the first device and the second device may be a direct connection through Bluetooth BT/WIFI or an indirect connection through a wireless access point.
The technical solutions provided in the embodiments of the present application are specifically described below with reference to the drawings of the specification.
As shown in fig. 3, a file transmission method provided in the embodiment of the present application includes the following steps:
S101, the first device acquires a first image.
In the embodiment of the application, the first image is an image shot by a user using the first device, and the image includes a device and a file displayed on the screen of the device. Generally, when a user photographs a target device and a target file displayed on the screen of the target device using the first device, the first image is an image including the target device and the target file displayed on the screen of the target device.
It should be noted that the target file includes, but is not limited to, a document, an audio file, a video file, an audio-video file, a web page, and the like, and the file type of the target file is not specifically limited in this embodiment.
It should be further noted that, in the embodiment of the present application, the target file is in an open or playing state. When two or more files are displayed on the screen of the target device, the target file may be all the files displayed on the screen of the target device, or may be a file running in the foreground of the target device, for example, a file located according to the current cursor position, which is not specifically limited here.
In the embodiment of the application, when a user holding the first device wants to display, on the screen of the first device, the target file displayed on the current screen of the target device, that is, when the target file displayed on the current screen of the target device is to be shared with the first device for display, the user can photograph the target device by using the first device to obtain the first image.
Specifically, in an actual application scenario, a user may tap a file transfer icon in the first device, and after detecting the tap operation on the file transfer icon, the first device starts the camera visual recognition mode to prompt the user to use the camera of the first device to capture an image including the target device, that is, the first image.
S102, if the first image comprises a target device and a target file displayed on a screen of the target device, the first device searches for a third device meeting a preset azimuth condition from second devices in close-range communication connection with the first device.
In the embodiment of the application, in the camera visual recognition mode, after the first device takes the first image, it is recognized whether the first image includes the target device and the target file displayed on the screen of the target device, that is, whether the first image includes the device and the file displayed on the screen of the device.
For example, after the first device starts the camera visual recognition mode and the first device has captured the first image through the camera, the first device will recognize and extract image features of the first image, and determine whether the first image includes the target device and the target file displayed on the screen of the target device according to the extracted image features, that is, determine whether the first image includes the device and the file displayed on the screen of the device.
It should be noted that the second device is one or more devices connected to the first device in short-range communication; the third device is any one or more of the second devices meeting the preset azimuth condition; and the target device is any one of the third devices.
When it is determined that the first image contains the target device and the target file displayed on the screen of the target device, the first device sends an azimuth query request to the second device, where the azimuth query request is used to instruct the second device to feed back its current first azimuth angle. The first azimuth angle is specifically a distance azimuth between the second device and the first device. After receiving the first azimuth angle fed back by the second device, the first device searches, according to the first azimuth angle, for a third device meeting the preset azimuth condition from the second devices.
In some specific embodiments, when the first device sends the azimuth query request to the second device, the first device scans, through its own wireless module, such as a Bluetooth module, a WIFI module, or a wireless access point AP, the wireless signals broadcast by devices within a preset range, including the wireless module of the second device, so that the first device and the second device can establish a close-range communication connection through these wireless signals, such as Bluetooth signals or WIFI signals.
Illustratively, when the first device sends the azimuth query request, the first device scans the Bluetooth signal or WIFI signal broadcast by the second device within a preset range, so as to establish a close-range communication connection, through the Bluetooth signal or WIFI signal, with the devices around the first device, that is, the second device. The second device broadcasts its Bluetooth signal or WIFI signal when detecting the azimuth query request sent by the first device, so that the first device can scan the broadcast signal and the two devices establish a close-range communication connection. After the first device establishes the near-field communication connection with the second device, the first device searches for a third device meeting the preset azimuth condition according to the first azimuth angle fed back by the second device.
In a specific application scenario, when the first device is a mobile phone and the first device sends an azimuth query request, the antenna array of the mobile phone receives the Bluetooth signal or WIFI signal broadcast by the second device, and the first azimuth angle is calculated according to the strength of the received Bluetooth or WIFI signal.
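The embodiment does not specify how signal strength is converted into a distance azimuth. A common approach for the distance component is the log-distance path-loss model; the following is a minimal sketch of that conversion, in which the reference power and path-loss exponent are assumed, environment-dependent constants rather than values from this embodiment:

import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    # Log-distance path-loss model: distance in metres from received power.
    # tx_power_dbm is the expected RSSI at 1 m; both constants are assumed,
    # environment-dependent values, not values taken from this embodiment.
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# Example: a Bluetooth advertisement received at -70 dBm
print(rssi_to_distance(-70.0))  # approximately 3.5 m under the assumed constants

In practice, the first azimuth angle would combine such a distance estimate with per-antenna measurements from the antenna array; that combination is device-specific and is not prescribed here.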
It should be noted that the Bluetooth module or WIFI module of the first device may be in a working state all the time, or may be started when the first device sends the azimuth query request. Likewise, the Bluetooth module or WIFI module of the second device may be in a working state all the time, or may be started when the azimuth query request sent by the first device is detected.
In the embodiment of the application, when there are many other devices around the target device, in order to improve file transmission efficiency and reduce user operations, the second devices may be located, for example by using their first azimuth angles, so as to narrow the range of second devices that need to be searched. In this way, the first device can search, from the second devices in close-range communication connection with the first device, for a third device that meets the preset azimuth condition, and then search for the final target device within the smaller range of third devices.
In some embodiments of the present application, when receiving a first azimuth angle fed back by a second device, the first device obtains a second azimuth angle, matches the second azimuth angle against the first azimuth angle, and determines whether there is a first azimuth angle matching the second azimuth angle. If so, the first device determines the second device corresponding to the matched first azimuth angle as a third device meeting the preset azimuth condition.
It should be noted that the second azimuth is a visual azimuth between the first device and the target device, and the second azimuth may be determined through the first image.
For example, in a specific embodiment, based on a deep neural network, the first device obtains the position information of the target device in the first image, and calculates the second azimuth angle according to the obtained position information.
Specifically, when the first device searches for a third device meeting the preset azimuth condition from the second devices in close-range communication connection with the first device, the first device inputs the acquired first image into the deep neural network model for recognition and positioning, the deep neural network model outputs the position information of the target device in the first image, and the first device calculates the visual azimuth angle between the first device and the target device, namely the second azimuth angle, according to the position information output by the model.
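The embodiment does not give the geometry used to turn the model's position output into an angle. One plausible reading, sketched below under a pinhole-camera assumption, derives the horizontal visual azimuth from the offset of the detected device's bounding-box centre relative to the image centre; the field-of-view value is a hypothetical lens parameter:

import math

def visual_azimuth(bbox_center_x, image_width, horizontal_fov_deg=78.0):
    # Pinhole-camera approximation: convert the horizontal pixel offset of the
    # detected target device from the image centre into an angle in degrees.
    focal_px = (image_width / 2) / math.tan(math.radians(horizontal_fov_deg / 2))
    offset_px = bbox_center_x - image_width / 2
    return math.degrees(math.atan2(offset_px, focal_px))

# Device detected left of centre in a 4000-pixel-wide first image
print(visual_azimuth(bbox_center_x=1200.0, image_width=4000))  # about -17.9 degrees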
For another example, in another specific embodiment, the first device acquires a third image, where the third image is a binocular image captured by the first device when capturing the first image, and based on a binocular vision positioning method, the first device acquires depth information of the target device and calculates the second azimuth angle according to the acquired depth information of the target device.
It should be noted that most current electronic devices are provided with at least two cameras. When any two cameras of an electronic device are used to shoot images, a binocular image can be acquired, and information about the real-world environment, especially depth information of a target object, can be extracted from the binocular image. When any two cameras of the first device are used for shooting, the depth of an observation point can normally be restored even when the camera calibration is not perfectly accurate, reducing human error to a minimum. Accurate depth information of the target device can therefore be acquired based on the binocular vision positioning method, so that the second azimuth angle calculated from this depth information is more accurate.
In this embodiment of the application, the third image is a binocular image obtained by shooting with any two cameras in the electronic device, that is, the third image includes a left image and a right image, the left image is an image seen by a left eye of a person, and the right image is an image seen by a right eye of the person. Information about the real world environment, in particular depth information of the target device, can be extracted based on the binocular images.
It should be noted that when the first device acquires the third image, the first device performs binocular calibration and correction on the acquired third image, and performs binocular matching to acquire the depth information of the target device, so that the first device can acquire the accurate second azimuth.
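After calibration, rectification, and binocular matching, the depth of a matched point follows from the standard rectified-stereo relation Z = f * B / d. A minimal sketch of that relation, with all numeric values hypothetical:

def stereo_depth(disparity_px, focal_px, baseline_m):
    # Rectified stereo: depth Z = focal_length * baseline / disparity.
    # Assumes binocular calibration and correction have already been applied,
    # as described above.
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 2470 px focal length, 12 mm camera baseline, 25 px disparity
print(stereo_depth(disparity_px=25.0, focal_px=2470.0, baseline_m=0.012))  # ~1.19 m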
In the embodiment of the present application, the number of first azimuth angles fed back to the first device is related to the number of second devices: if there is only one second device, one first azimuth angle is fed back; if there are two or more second devices, two or more first azimuth angles are fed back. The target device is any one of the second devices. After receiving the first azimuth angles fed back by the second devices, the first device matches each first azimuth angle against the second azimuth angle to determine whether there is a first azimuth angle matching the second azimuth angle, and thereby determines whether the second device corresponding to the matched first azimuth angle is a third device meeting the preset azimuth condition.
For example, in a specific embodiment, when matching the second azimuth angle against a first azimuth angle, the first device calculates the similarity between the first azimuth angle and the second azimuth angle; if the calculated similarity is smaller than a first threshold, the first device determines that the first azimuth angle corresponding to that similarity matches the second azimuth angle, and determines the second device corresponding to the matched first azimuth angle as a third device meeting the preset azimuth condition.
It should be noted that the similarity between the first azimuth angle and the second azimuth angle may be calculated as a ratio based on the absolute value of the difference between the first azimuth angle and the second azimuth angle. If the calculated ratio is smaller than the first threshold, it is determined that the first azimuth angle corresponding to that ratio matches the second azimuth angle, and the second device corresponding to the matched first azimuth angle is determined as a third device meeting the preset azimuth condition.
For another example, in another specific embodiment, when matching the second azimuth angle against a first azimuth angle, the first device calculates the similarity between the first azimuth angle and the second azimuth angle; if the calculated similarity is within a preset threshold range, the first device determines that the first azimuth angle corresponding to that similarity matches the second azimuth angle, and determines the second device corresponding to the matched first azimuth angle as a third device meeting the preset azimuth condition.
It should be noted that the similarity between the first azimuth angle and the second azimuth angle may also be calculated as a ratio between the first azimuth angle and the second azimuth angle; if the calculated ratio is within the preset threshold range, it is determined that the first azimuth angle corresponding to that ratio matches the second azimuth angle.
For another example, in another specific embodiment, when matching the second azimuth angle against a first azimuth angle, the first device calculates the angle difference between the first azimuth angle and the second azimuth angle; if the calculated angle difference is smaller than a second threshold, the first device determines that the first azimuth angle corresponding to that angle difference matches the second azimuth angle. The angle difference here is the absolute value of the difference between the first azimuth angle and the second azimuth angle.
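The three matching criteria above can be condensed into a single selection step. The sketch below implements the angle-difference criterion of the last embodiment; the ratio-based criteria can be substituted in the same place, and the 10-degree threshold is a hypothetical value, not one given in the embodiments:

def match_azimuths(first_azimuths, second_azimuth, max_angle_diff_deg=10.0):
    # first_azimuths maps each second device's identifier to the first azimuth
    # angle (degrees) it fed back; a device matches when the absolute angle
    # difference to the visual second azimuth angle is below the threshold.
    return [dev_id for dev_id, first_azimuth in first_azimuths.items()
            if abs(first_azimuth - second_azimuth) < max_angle_diff_deg]

# Three second devices fed back their first azimuth angles (degrees)
reported = {"tablet": -16.0, "tv": 42.0, "laptop": -80.0}
print(match_azimuths(reported, second_azimuth=-17.9))  # ['tablet']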
In the embodiment of the application, in order to obtain a more accurate second azimuth angle, the first device determines the position information of the target device in the first image based on the trained deep neural network model, and then calculates a third azimuth angle according to the determined position information. Meanwhile, the first device acquires a third image, determines the depth information of the target device based on the binocular vision positioning method, calculates a fourth azimuth angle according to the determined depth information, and takes the average value or mean square value of the third azimuth angle and the fourth azimuth angle as the second azimuth angle.
It should be noted that the deep neural network model is obtained by training with images containing devices as training samples.
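A small sketch of the fusion described above follows; "mean square value" is read here as the root mean square, which is one possible interpretation and discards the sign of the fused angle:

import math

def fuse_azimuths(third_azimuth, fourth_azimuth, use_rms=False):
    # Combine the model-based and binocular-based estimates into the second
    # azimuth angle, by plain average or (one reading of) the mean square value.
    if use_rms:
        return math.sqrt((third_azimuth ** 2 + fourth_azimuth ** 2) / 2)
    return (third_azimuth + fourth_azimuth) / 2

print(fuse_azimuths(-17.9, -16.5))  # -17.2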
In the embodiment of the application, after obtaining the first image, the first device determines whether the first image contains a target device and a target file displayed on the screen of the target device. If so, the first device sends an azimuth query request to the second device, and after receiving the azimuth query request, the second device feeds back its current first azimuth angle to the first device. The first device can then search, according to the first azimuth angles, for a third device meeting the preset azimuth condition from the second devices, which effectively narrows the range in which the target device is searched for and improves the efficiency and accuracy of finding the target device.
S103, the first device sends a screen information query request to the third device, wherein the screen information query request is used for indicating the third device to feed back the current screenshot.
In the embodiment of the application, after the first device finds the third device meeting the preset azimuth condition, that is, after the search range of the target device has been narrowed, the first device sends a screen information query request to the third device, so that the third device feeds back its current screenshot to the first device. This makes it convenient for the first device to determine the final target device from the screenshots fed back by the third devices.
It should be noted that a third device does not necessarily have a screen; for example, when the third device is a Bluetooth speaker or a Bluetooth headset, it cannot feed back a current screenshot to the first device. In this case, the first device only needs to determine whether a third device that has fed back a screenshot is the target device.
S104, the first device receives the screenshots fed back by the third devices and matches the first image against the screenshots; if a matched screenshot exists, the first device determines the third device corresponding to the matched screenshot as the target device.
In the embodiment of the application, the first device matches the first image with the screenshot after receiving the screenshot fed back by the third device, and if the screenshot matched with the first image exists, the third device corresponding to the screenshot matched with the first image is determined to be the target device.
It should be noted that the first image acquired by the first device includes, in addition to the target device and the target file displayed on its screen, the surroundings of the device. To improve the accuracy of device matching, the first image needs to be segmented to obtain an image with less interference; for example, the segmented target image contains little or none of the device's surroundings. The target image is then matched against the screenshots fed back by the third devices to determine the final target device. This further improves the efficiency and accuracy of image matching, thereby improving file transmission efficiency.
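The embodiments do not name a specific image-matching algorithm. One workable choice, sketched below as an assumption rather than the method of this application, is ORB feature matching from OpenCV between the screen region segmented out of the first image and each screenshot fed back; the third device whose screenshot scores highest would be taken as the target device:

import cv2
import numpy as np

def screenshot_match_score(segmented_region, screenshot):
    # Count ORB feature correspondences between the segmented first image
    # and one screenshot fed back by a third device; a larger count means
    # a better match. The distance threshold of 40 is a hypothetical value.
    orb = cv2.ORB_create(nfeatures=500)
    _, desc_a = orb.detectAndCompute(segmented_region, None)
    _, desc_b = orb.detectAndCompute(screenshot, None)
    if desc_a is None or desc_b is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return sum(1 for m in matcher.match(desc_a, desc_b) if m.distance < 40)

# Synthetic demo images; in practice these are the segmented target image
# and a screenshot fed back by a third device.
rng = np.random.default_rng(0)
img = rng.integers(0, 255, (480, 640), dtype=np.uint8)
print(screenshot_match_score(img, img.copy()))  # identical images score highest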
It should be noted that, when determining the target device from the third devices, the first device also needs to determine the related information of the target file displayed on the screen of the target device. This requires the third device to feed back, to the first device, the related information of the file displayed on its screen, so that the first device can generate a corresponding file transmission request and the target device can find the target file according to that request.
Specifically, the screen information query request is further used to instruct the third device to feed back the currently displayed first file information. The first device receives the currently displayed first file information fed back by the target device, matches the first file information against the file information extracted from the first image, and, if the matching succeeds, generates a file transmission request, where the file transmission request includes the successfully matched first file information.
It should be noted that the file information referred to in the embodiments of the present application includes, but is not limited to, a file type, a file name, a file path, and a file state. The first file information is related to a file displayed on a screen of the third device, including but not limited to related information of a file running in the foreground.
It should be further noted that the file state mentioned above includes the operation state and the display state of the file. The operation state, that is, whether the current file is being operated on or not, may be determined according to the position of the cursor pointer. The display state may be determined according to the window state of the file: for example, when the window of the file is minimized, it may be determined that the file is currently in a non-display state, and when the window is not minimized, it may be determined that the file is currently in a display state.
Illustratively, after receiving the currently displayed first file information fed back by the third device, the first device matches the file information extracted from the first image against the first file information and determines whether there is first file information matching the extracted file information, for example first file information with the same file type and file name. If such first file information exists, the matching succeeds and a file transmission request is generated. The generated file transmission request includes the successfully matched first file information and is used to instruct the target device to send, to the first device, the target file whose name is the file name in the first file information.
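A sketch of this matching step follows, using the file information fields enumerated above (file type, file name, file path, file state); the structure and field names are hypothetical:

from dataclasses import dataclass

@dataclass
class FileInfo:
    # File information per the embodiment: type, name, path, and state.
    file_type: str
    file_name: str
    file_path: str
    is_foreground: bool

def build_transfer_request(extracted, reported):
    # Match the file information extracted from the first image against the
    # first file information fed back by the third device, keying on file
    # type and file name as in the example above; on success, return a file
    # transmission request carrying the matched first file information.
    for info in reported:
        if (info.file_type, info.file_name) == (extracted.file_type,
                                                extracted.file_name):
            return {"action": "send_file", "file_info": info}
    return None

print(build_transfer_request(
    FileInfo("document", "report.docx", "", True),
    [FileInfo("document", "report.docx", "/docs/report.docx", True)]))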
In the embodiment of the application, the first device receives the first file information fed back by the third device, matches the file information extracted from the first image against the first file information, and generates a file transmission request including the successfully matched first file information. After determining the target device, the first device sends the file transmission request to the target device, so that the target device can quickly find, as the target file, the file corresponding to the file name under the file path provided by the first file information. This improves the efficiency and accuracy of locating the target file, thereby improving the efficiency and accuracy of file transmission and providing a better user experience.
S105, the first device sends a file transmission request to the target device, wherein the file transmission request is used for indicating the target device to send the target file to the first device.
In the embodiment of the application, after determining the target device from the third devices, the first device displays file transmission confirmation information on its screen, and sends the file transmission request to the target device after receiving a file transmission confirmation input by the user. After receiving the file transmission request sent by the first device, the target device sends the target file to the first device.
It should be noted that, in order to further reduce the user operations, the first device may directly send the file transfer request to the target device after determining the target device from the third devices, without confirmation by the user.
In some embodiments, the target device sends the node information of the target file while sending the target file to the first device. After receiving the target file and its node information, the first device continues displaying the target file from the corresponding node on the screen of the first device according to the node information of the target file.
In the embodiment of the application, the first device searches, from the second devices in close-range communication connection with it, for a third device meeting the preset azimuth condition, which narrows the range of devices in which the target device is searched for; the target device is then further determined through the screenshots fed back by the third devices, and the transmission of the target file is completed. The user therefore does not need to manually select the target device to transfer files to or the target file to be transmitted, which reduces the user's operation steps, improves the efficiency of cross-device file transmission, and has high practicability and usability.
As shown in fig. 4, another file transmission method provided in this embodiment of the present application is applied to a second device, and the method includes the following steps:
S201, a second device receives a screen information query request sent by a first device connected with the second device in near-field communication, where the screen information query request is used to instruct the second device to feed back a current screenshot.
In this embodiment of the application, for the step in which the first device sends the screen information query request to the second device in close-range communication connection with it, reference may be made to the related description above; details are not described here again.
Before the second device receives the screen information query request sent by the first device connected with it in short-range communication, the second device receives an azimuth query request sent by the first device, determines a first azimuth angle according to the signal strength of the short-range communication between the second device and the first device, and feeds back the first azimuth angle to the first device. The first device then determines, according to the first azimuth angle, whether the second device is a device meeting the preset azimuth condition, and sends the screen information query request to the second device after determining that it is. The first azimuth angle is a distance azimuth between the second device and the first device.
In the embodiment of the application, when receiving the azimuth query request sent by the first device connected with it in close-range communication, the second device calculates the distance between the second device and the first device according to the signal strength of the close-range communication, and calculates the first azimuth angle according to the calculated distance; that is, the first azimuth angle is determined according to the signal strength of the close-range communication between the second device and the first device.
It should be noted that the signal strength of the short-range communication connection includes, but is not limited to, the strength of a Bluetooth signal, a WIFI signal, or a wireless access point signal.
In the embodiment of the application, after receiving the azimuth query request sent by the first device, the second device determines the first azimuth angle according to the signal strength of the close-range communication between the second device and the first device, and feeds back the determined first azimuth angle to the first device. The first device can then determine the azimuth of the second device according to the first azimuth angle and search for second devices meeting the preset azimuth condition, which narrows the range in which the target device is searched for among the second devices, improves the efficiency and accuracy of finding the target device, and thereby improves the efficiency of file transmission.
In some specific embodiments, when receiving the azimuth query request sent by the first device, the second device obtains the Bluetooth signal strength or WIFI signal strength of the close-range communication connection between the second device and the first device, calculates the distance between the second device and the first device according to the obtained signal strength, and calculates the first azimuth angle according to the calculated distance.
In other specific embodiments, the second device broadcasts its own Bluetooth signal when detecting the azimuth query request sent by the first device, and establishes a Bluetooth connection with the first device through the Bluetooth signal. After establishing the Bluetooth connection, the second device calculates the distance between the second device and the first device according to the Bluetooth signal strength, and calculates the first azimuth angle according to the calculated distance.
In still other specific embodiments, when detecting the azimuth query request sent by the first device, the second device broadcasts its own WIFI signal and establishes a WIFI connection with the first device through the WIFI signal. After establishing the WIFI connection, the second device calculates the distance between the second device and the first device according to the WIFI signal strength, and calculates the first azimuth angle according to the calculated distance.
S202, the second device obtains a current screen capture and feeds the current screen capture back to the first device.
In the embodiment of the application, after the second device establishes the close-range communication connection with the first device, the first device searches the second devices for a third device meeting the preset azimuth condition to narrow the search range of the target device. After finding such a third device, the first device sends a screen information query request to it. After receiving the screen information query request sent by the first device, the third device obtains its current screenshot and feeds the screenshot back to the first device, so that the first device determines the target device according to the screenshots and sends a file transmission request to the target device.
In some specific embodiments, after receiving the screen information query request sent by the first device, the third device meeting the preset azimuth condition simultaneously acquires the first file information and feeds it back to the first device. The first device then determines whether there is first file information matching the file information extracted from the first image, and generates a corresponding file transmission request, that is, a file transmission request that includes the matched first file information, so that the target device can transmit the target file after receiving the file transmission request.
S203, after receiving the file transmission request sent by the first device, the second device sends a target file to the first device, where the file transmission request is used to instruct the second device to send the target file to the first device.
In the embodiment of the application, after receiving the screenshots fed back by the second devices, the first device matches the first image against the screenshots and determines whether there is a screenshot matching the first image. If such a screenshot exists, the first device determines the second device corresponding to the matched screenshot as the target device and sends a file transmission request to it. After receiving the file transmission request sent by the first device, the second device serving as the target device sends the target file to the first device according to the file transmission request.
Illustratively, the file transfer request includes user information of the first device, where the user information includes, but is not limited to, account information of the user, such as the device account the user has logged in to, the file server account the user has logged in to, and the like.
In some embodiments, when receiving the file transmission request sent by the first device, the second device determines whether its current account information is consistent with the account information of the first device. If so, the second device sends the target file corresponding to the file transmission request to the first device; otherwise, the second device generates a file transmission confirmation request and feeds it back to the first device. The file transmission confirmation request is used to instruct the first device to determine whether to transmit the target file.
In other embodiments, before sending the target file to the first device, the second device obtains the current node information of the target file and generates file node information, where the node information includes, but is not limited to, the file page number, the cursor position, the file progress bar, and the like. The second device sends the file node information together with the target file to the first device, so that when opening the target file, the first device can continue displaying the file according to the file node information.
In other embodiments, before sending the target file to the first device, the second device determines whether the target file is of a preset file type, such as a video file or an audio file. If it is, the second device obtains the current node information of the target file, generates file node information including the file name and the playing position, and sends the file node information to the first device, so that the first device continues playing the target file corresponding to the file name from the recorded playing position. For example, a video named ABC in the target file resumes from the playing progress the second device had reached, rather than from the beginning.
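The file node information can be modelled as a small record carried alongside the target file; the following sketch (field names illustrative) shows how the first device could resume playback from the recorded position:

    from dataclasses import dataclass

    @dataclass
    class FileNodeInfo:
        # Recorded by the second device just before transfer.
        file_name: str
        playing_position_s: float = 0.0   # for audio/video files
        page_number: int = 0              # for documents
        cursor_offset: int = 0

    def resume_playback(player, target_path: str, node: FileNodeInfo):
        # On the first device: open the received file and seek to the
        # position the second device recorded, so playback continues
        # rather than restarting.
        player.open(target_path)
        player.seek(node.playing_position_s)
        player.play()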
It should be noted that, when the first device does not have an application program capable of playing the preset file type, the first device generates an application installation prompt message to prompt the user to install an application program corresponding to the preset file type.
It should be further noted that, when the first device continues playing the target file corresponding to the file name from the playing position according to the file node information, it may use any application program on the first device that can play audio/video files, or the same application program that played the target file on the second device; this is not specifically limited here.
In this embodiment of the application, after receiving a screen information query request sent by the first device connected to it in close-range communication, the second device obtains its current screen capture and feeds it back to the first device, so that the first device can determine the target device from among the second devices according to the screen captures. After receiving the file transmission request sent by the first device, the target device looks up the target file and sends it to the first device. The target file is thus transmitted quickly between the first device and the second device, improving the efficiency of file transmission.
This is exemplified below with reference to fig. 5. As shown in fig. 5, an open document file is displayed on the screen of device A. When the user photographs device A using device B, the photographed image of device A is displayed on the screen of device B; the image includes device A and the document file displayed on the screen of device A. On confirming that the captured image contains device A and the document file displayed on its screen, device B enters a file transfer state, establishes close-range communication connections with its N peripheral devices (including device A), and determines the azimuth angles of the peripheral devices based on close-range communication positioning. According to the determined azimuth angles, device B can narrow the candidate peripheral devices down to the two devices on its left side, and then match the screen captures fed back by those two devices against the image it captured to determine the target device, namely device A, that will finally transmit the file.
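Tying the steps of fig. 5 together, a simplified end-to-end sketch of device B's side might look as follows, reusing the screenshot-matching helper sketched earlier; the peer objects and the transport call are hypothetical:

    def find_and_fetch(first_image, peers, second_azimuth,
                       send_file_transfer_request,
                       angle_threshold_deg=15.0):
        # peers: objects exposing .device_id, .first_azimuth (fed back over
        # the close-range connection) and .query_screenshot(); all names
        # are illustrative. Wraparound at 0/360 degrees is ignored here.
        candidates = [p for p in peers
                      if abs(p.first_azimuth - second_azimuth) < angle_threshold_deg]
        replies = [(p.device_id, p.query_screenshot()) for p in candidates]
        # pick_target_device is the ORB-based matcher sketched above.
        target_id = pick_target_device(first_image, replies)
        if target_id is not None:
            send_file_transfer_request(target_id)
        return target_id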
As shown in fig. 6, which is a schematic structural diagram of a file transfer apparatus provided in an embodiment of the present application, the file transfer apparatus is applied to a first device and includes an image acquisition unit 201, a device searching unit 202, a screen information query request sending unit 203, a device confirming unit 204, and a file transfer request sending unit 205, which are described as follows:
an image acquisition unit 201 for acquiring a first image;
a device searching unit 202, configured to search, if the first image includes a target device and a target file displayed on a screen of the target device, for a third device that meets a preset azimuth condition from among the second devices that are in close-range communication connection with the first device;
a screen information query request sending unit 203, configured to send a screen information query request to the third device, where the screen information query request is used to instruct the third device to feed back a current screenshot;
a device confirming unit 204, configured to receive a screenshot fed back by the third device, match the first image with the screenshot, and determine, if a matched screenshot exists, the third device corresponding to the matched screenshot as the target device;
a file transfer request sending unit 205, configured to send a file transfer request to the target device, where the file transfer request is used to instruct the target device to send the target file to the first device.
Illustratively, the file transmission apparatus further includes a file transmission request generating unit, configured to receive the first file information fed back by the third device, match the file information extracted from the first image against the first file information, and, if matching first file information exists, generate a file transmission request that includes the matched first file information.
The device searching unit 202 includes:
an azimuth query request sending subunit, configured to send an azimuth query request to the second device if the first image includes a target device and a target file displayed on the screen of the target device, where the azimuth query request is used to instruct the second device to feed back a current first azimuth angle, the first azimuth angle is the distance azimuth angle between the second device and the first device, and the first azimuth angle is determined according to the signal strength of the close-range communication between the second device and the first device;
and a device searching subunit, configured to receive the first azimuth angle fed back by the second device and search, according to the first azimuth angle, the second devices for a third device satisfying the preset azimuth condition.
The device searching subunit is specifically configured to:
The first device obtains a second azimuth angle and matches the second azimuth angle against the first azimuth angle; if a matching first azimuth angle exists, it determines the second device corresponding to the matching first azimuth angle as a third device satisfying the preset azimuth condition, where the second azimuth angle is the visual azimuth angle between the first device and the target device.
The device searching subunit is further specifically configured to:
the first device calculates the similarity of the second azimuth angle and the first azimuth angle;
if the calculated similarity is smaller than a first threshold value, the first device determines that a first azimuth angle corresponding to the similarity smaller than the first threshold value is matched with the second azimuth angle, and determines a second device corresponding to the matched first azimuth angle as a third device meeting a preset azimuth condition.
The device searching subunit is further specifically configured to:
calculating an angle difference between the second azimuth and the first azimuth;
and if the calculated angle difference is smaller than a second threshold value, determining that the first azimuth angle corresponding to the angle difference smaller than the second threshold value is matched with the second azimuth angle, and determining the second equipment corresponding to the matched first azimuth angle as third equipment meeting a preset azimuth condition.
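Both matching variants above reduce to comparing two angles against a threshold; the following short sketch (threshold value illustrative) also handles the wraparound at 0/360 degrees:

    def angle_difference(a_deg: float, b_deg: float) -> float:
        # Smallest absolute difference between two bearings, in degrees.
        d = abs(a_deg - b_deg) % 360.0
        return min(d, 360.0 - d)

    def find_matching_devices(second_azimuth: float, first_azimuths: dict,
                              threshold_deg: float = 15.0) -> list:
        # first_azimuths: {device_id: first_azimuth_deg} fed back by the
        # second devices. A device whose angle difference to the visual
        # azimuth falls below the threshold satisfies the azimuth condition.
        return [dev for dev, a in first_azimuths.items()
                if angle_difference(second_azimuth, a) < threshold_deg]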
The device search unit 202 further includes:
and the second azimuth angle calculating subunit is used for determining the position information of the target device in the first image based on the trained deep neural network model and calculating the second azimuth angle according to the position information.
The second azimuth angle calculating subunit is further specifically configured to:
acquire a third image, where the third image is a binocular image captured by the first device when capturing the first image;
and determine the depth information of the target device based on a binocular vision positioning method, and calculate the second azimuth angle according to the depth information.
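Under a pinhole-camera assumption, the second azimuth angle follows from the detected device's horizontal position in the image or, in the binocular variant, from its lateral offset and depth. A sketch with illustrative calibration values:

    import math

    def second_azimuth_from_bbox(bbox_center_x: float, image_width: int,
                                 horizontal_fov_deg: float = 70.0) -> float:
        # Map the detected device's horizontal pixel offset from the image
        # centre to an angle, using the camera's horizontal field of view
        # (the FOV value here is an assumption; a real device would use
        # its calibration data).
        half_fov = math.radians(horizontal_fov_deg / 2.0)
        offset = (bbox_center_x - image_width / 2.0) / (image_width / 2.0)
        return math.degrees(math.atan(offset * math.tan(half_fov)))

    def second_azimuth_from_depth(x_m: float, depth_m: float) -> float:
        # Binocular variant: with the lateral offset x and depth Z of the
        # target recovered by stereo triangulation, the azimuth is atan(x/Z).
        return math.degrees(math.atan2(x_m, depth_m))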
As shown in fig. 7, which is a schematic structural diagram of a file transmission apparatus provided in an embodiment of the present application, the file transmission apparatus is applied to a second device and includes a screen information query request receiving unit 301, a screenshot feedback unit 302, and a target file sending unit 303, which are described as follows:
a screen information query request receiving unit 301, configured to receive a screen information query request sent by a first device connected to a second device in close-range communication, where the screen information query request is used to instruct the second device to feed back a current screenshot;
a screenshot feedback unit 302, configured to obtain, by the second device, a current screenshot, and feed back the current screenshot to the first device;
a target file sending unit 303, configured to send a target file to the first device after the second device receives a file transmission request sent by the first device, where the file transmission request is used to instruct the second device to send the target file to the first device.
The file transmission apparatus further includes a first azimuth angle feedback unit, configured to:
receive an azimuth query request sent by the first device, determine a first azimuth angle according to the signal strength of the close-range communication between the second device and the first device, and feed the first azimuth angle back to the first device, so that the first device can determine, according to the first azimuth angle, whether the second device satisfies the preset azimuth condition and, after determining that it does, send a screen information query request to the second device, where the first azimuth angle is the distance azimuth angle between the second device and the first device.
The first azimuth angle feedback unit includes:
a Bluetooth signal broadcasting subunit, configured to broadcast a Bluetooth signal of the second device when the azimuth query request sent by the first device is detected;
and a first calculating subunit, configured to establish a Bluetooth connection with the first device through the Bluetooth signal and calculate the first azimuth angle according to the Bluetooth signal strength.
The first azimuth angle feedback unit further includes:
a WIFI signal broadcasting subunit, configured to broadcast a WIFI signal of the second device when the azimuth query request sent by the first device is detected;
and a second calculating subunit, configured to establish a WIFI connection with the first device through the WIFI signal and calculate the first azimuth angle according to the WIFI signal strength.
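Signal strength maps more directly to distance than to direction; practical systems derive bearing from antenna arrays (for example, Bluetooth 5.1 angle-of-arrival). Purely as an illustration of calculating the first azimuth angle from signal strength, the sketch below combines the standard log-distance path-loss model with a toy two-antenna bearing estimate (all parameter values are assumptions):

    import math

    def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                         path_loss_exponent: float = 2.0) -> float:
        # Log-distance path-loss model: RSSI = TxPower - 10*n*log10(d).
        return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

    def first_azimuth_from_two_antennas(rssi_left: float, rssi_right: float,
                                        max_offset_deg: float = 60.0) -> float:
        # Toy bearing estimate: the side with the stronger signal (shorter
        # estimated distance) pulls the bearing its way, scaled into
        # [-max_offset_deg, +max_offset_deg]. Real systems use antenna
        # arrays and phase differences rather than raw RSSI.
        d_left = rssi_to_distance(rssi_left)
        d_right = rssi_to_distance(rssi_right)
        balance = (d_right - d_left) / max(d_right + d_left, 1e-9)
        return max(-max_offset_deg,
                   min(max_offset_deg, balance * max_offset_deg))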
The target file sending unit 303 includes:
a target file sending subunit, configured to determine whether the current account information is consistent with the account information of the first device and, if so, send the target file to the first device;
and a file sending confirmation request generating subunit, configured to, if the account information is not consistent, generate a file sending confirmation request and feed it back to the first device, where the file sending confirmation request is used to instruct the first device to determine whether to transmit the target file.
An embodiment of the present application further provides an electronic device, including: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the various method embodiments described above when executing the computer program.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the foregoing method embodiments.
An embodiment of the present application further provides a computer program product, which, when run on a terminal device, enables the terminal device to implement the steps in the above method embodiments.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described system embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above may be implemented by instructing relevant hardware through a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments described above may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable storage medium may include at least: any entity or apparatus capable of carrying the computer program code to a terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash disk, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, a computer-readable storage medium may not be an electrical carrier signal or a telecommunications signal.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present application, and they should be construed as being included in the present application.

Claims (12)

1. A file transmission method, applied to a first device, characterized by comprising the following steps:
the first device acquires a first image;
if the first image comprises a target device and a target file displayed on a screen of the target device, the first device searches for a third device meeting a preset azimuth condition from second devices in close-range communication connection with the first device;
the first device sends a screen information query request to the third device, wherein the screen information query request is used for indicating the third device to feed back a current screenshot and indicating the third device to feed back currently displayed first file information;
the first device receives the screenshot fed back by the third device, matches the first image with the screenshot, and if a matched screenshot exists, determines the third device corresponding to the matched screenshot as the target device;
the first device sends a file transmission request to the target device, wherein the file transmission request is used for indicating the target device to send the target file to the first device;
the first device receives first file information which is fed back by the target device and is displayed currently, matches the first file information with file information extracted from the first image, and if matching is successful, a file transmission request is generated and comprises the first file information which is successfully matched; the file information includes a file type, a file name, a file path, and a file state.
2. The file transfer method according to claim 1, wherein if the first image includes a target device and a target file displayed on a screen of the target device, the first device searches for a third device that satisfies a preset orientation condition from second devices that are in close-range communication connection with the first device, and the method includes:
if the first image comprises a target device and a target file displayed on a screen of the target device, the first device sends an azimuth query request to the second device, wherein the azimuth query request is used for indicating the second device to feed back a current first azimuth angle, the first azimuth angle is a distance azimuth angle between the second device and the first device, and the first azimuth angle is determined according to the signal strength of close-range communication between the second device and the first device;
and the first device receives the first azimuth angle fed back by the second device, and searches for a third device meeting the preset azimuth condition from the second devices according to the first azimuth angle.
3. The file transmission method according to claim 2, wherein the first device receives the first azimuth fed back by the second device, and searches for a third device satisfying a preset azimuth condition from the second device according to the first azimuth, including:
the first device acquires a second azimuth angle, matches the second azimuth angle with the first azimuth angle, and if a matched first azimuth angle exists, determines the second device corresponding to the matched first azimuth angle as a third device meeting a preset azimuth condition, wherein the second azimuth angle is a visual azimuth angle between the first device and the target device.
4. The file transmission method according to claim 3, wherein the first device obtains a second azimuth angle, matches the second azimuth angle with the first azimuth angle, and if there is a matched first azimuth angle, determines the second device corresponding to the matched first azimuth angle as a third device satisfying a preset azimuth condition, including:
the first device calculates the similarity of the second azimuth angle and the first azimuth angle;
if the calculated similarity is smaller than a first threshold, the first device determines that a first azimuth corresponding to the similarity smaller than the first threshold is matched with the second azimuth, and determines a second device corresponding to the matched first azimuth as a third device meeting a preset azimuth condition.
5. The file transmission method according to claim 3, wherein the first device obtains a second azimuth angle, matches the second azimuth angle with the first azimuth angle, and if there is a matched first azimuth angle, determines the second device corresponding to the matched first azimuth angle as a third device satisfying a preset azimuth condition, including:
the first device calculating an angle difference between the second azimuth and the first azimuth;
if the calculated angle difference is smaller than a second threshold value, the first device determines that a first azimuth angle corresponding to the angle difference smaller than the second threshold value is matched with the second azimuth angle, and determines a second device corresponding to the matched first azimuth angle as a third device meeting a preset azimuth condition.
6. The file transfer method of claim 3, wherein the first device obtaining a second azimuth comprises:
the first device determines position information of the target device in the first image based on the trained deep neural network model, and calculates the second azimuth angle according to the position information.
7. The file transfer method of claim 3, wherein the first device obtaining a second azimuth comprises:
the first device acquires a third image, wherein the third image is a binocular image captured by the first device when capturing the first image;
the first device determines the depth information of the target device based on a binocular vision positioning method, and calculates the second azimuth angle according to the depth information.
8. A file transmission method, applied to a second device, characterized by comprising the following steps:
the second device receives a screen information query request sent by a first device connected with the second device in close-range communication, wherein the screen information query request is used for indicating the second device to feed back a current screenshot and indicating the second device to feed back currently displayed first file information;
the second device acquires a current screen capture and feeds the current screen capture back to the first device;
after receiving the screen information query request sent by the first device, the second device simultaneously acquires first file information and feeds the first file information back to the first device, so that the first device can judge whether first file information matched with file information extracted from a first image exists, and if matching is successful, generates a corresponding file transmission request, wherein the file transmission request comprises the successfully matched first file information; the file information comprises a file type, a file name, a file path and a file state;
and after receiving a file transmission request sent by the first device, the second device sends a target file to the first device, wherein the file transmission request is used for indicating the second device to send the target file to the first device.
9. The file transfer method according to claim 8, wherein before the second device receives the screen information query request sent by the first device connected with it in close-range communication, the method further comprises:
the second device receives an azimuth query request sent by the first device, determines a first azimuth angle according to the signal strength of the close-range communication between the second device and the first device, and feeds the first azimuth angle back to the first device, so that the first device can determine, according to the first azimuth angle, whether the second device satisfies a preset azimuth condition and, after determining that the second device satisfies the preset azimuth condition, send a screen information query request to the second device, wherein the first azimuth angle is the distance azimuth angle between the second device and the first device.
10. An electronic device, comprising: a processor and a memory, the processor and the memory coupled, the memory for storing a computer program that, when executed by the processor, causes an electronic device to perform the file transfer method of any of claims 1-7.
11. An electronic device, comprising: a processor and a memory, the processor and the memory being coupled, the memory for storing a computer program that, when executed by the processor, causes an electronic device to perform the file transfer method of claim 8 or 9.
12. A computer-readable storage medium, in which a computer program is stored which, when run on an electronic device, causes the electronic device to carry out a file transfer method according to any one of claims 1 to 7 and/or 8 to 9.
CN202011018822.4A 2020-09-24 2020-09-24 File transmission method and electronic equipment Active CN114338642B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011018822.4A CN114338642B (en) 2020-09-24 2020-09-24 File transmission method and electronic equipment
PCT/CN2021/117200 WO2022062902A1 (en) 2020-09-24 2021-09-08 File transfer method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011018822.4A CN114338642B (en) 2020-09-24 2020-09-24 File transmission method and electronic equipment

Publications (2)

Publication Number Publication Date
CN114338642A CN114338642A (en) 2022-04-12
CN114338642B true CN114338642B (en) 2023-04-07

Family

ID=80844902

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011018822.4A Active CN114338642B (en) 2020-09-24 2020-09-24 File transmission method and electronic equipment

Country Status (2)

Country Link
CN (1) CN114338642B (en)
WO (1) WO2022062902A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115562570B (en) * 2022-04-27 2023-09-12 荣耀终端有限公司 Data migration method, system and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013123694A1 (en) * 2012-02-21 2013-08-29 海尔集团公司 Method and system applicable in multi-screen sharing for close range azimuth positioning and for file transmission
CN103685707A (en) * 2012-09-21 2014-03-26 中国移动通信集团公司 File transmission method, system and device
CN106817677A (en) * 2017-01-19 2017-06-09 北京邮电大学 A kind of indoor objects information identifying method, apparatus and system based on multisensor
US9916328B1 (en) * 2014-07-11 2018-03-13 Google Llc Providing user assistance from interaction understanding
CN110536479A (en) * 2019-08-28 2019-12-03 维沃移动通信有限公司 Object transmission method and electronic equipment
CN111432331A (en) * 2020-03-30 2020-07-17 华为技术有限公司 Wireless connection method, device and terminal equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6258077B2 (en) * 2014-03-04 2018-01-10 アルパイン株式会社 COMMUNICATION SYSTEM AND ELECTRONIC DEVICE, PAIRING METHOD, PAIRING PROGRAM
CN105491088A (en) * 2014-09-17 2016-04-13 中兴通讯股份有限公司 File transfer method and device
CN105578229A (en) * 2015-12-15 2016-05-11 小米科技有限责任公司 Electronic equipment control method and device

Also Published As

Publication number Publication date
WO2022062902A1 (en) 2022-03-31
CN114338642A (en) 2022-04-12

Similar Documents

Publication Publication Date Title
CN115866121B (en) Application interface interaction method, electronic device and computer readable storage medium
CN113794801B (en) Method and device for processing geo-fence
WO2020000448A1 (en) Flexible screen display method and terminal
CN111669459B (en) Keyboard display method, electronic device and computer readable storage medium
CN114546190A (en) Application display method and electronic equipment
CN116360725B (en) Display interaction system, display method and device
CN113254409A (en) File sharing method, system and related equipment
CN114650363A (en) Image display method and electronic equipment
CN111371849A (en) Data processing method and electronic equipment
CN111602108A (en) Application icon display method and terminal
CN114079893A (en) Bluetooth communication method, terminal device and computer readable storage medium
CN114089932A (en) Multi-screen display method and device, terminal equipment and storage medium
CN114995715B (en) Control method of floating ball and related device
CN111835904A (en) Method for starting application based on context awareness and user portrait and electronic equipment
CN115967851A (en) Quick photographing method, electronic device and computer readable storage medium
CN113970888A (en) Household equipment control method, terminal equipment and computer readable storage medium
CN110138999B (en) Certificate scanning method and device for mobile terminal
CN113973398A (en) Wireless network connection method, electronic equipment and chip system
CN113641271A (en) Application window management method, terminal device and computer readable storage medium
CN111492678B (en) File transmission method and electronic equipment
CN115914461B (en) Position relation identification method and electronic equipment
CN114201738A (en) Unlocking method and electronic equipment
CN114338642B (en) File transmission method and electronic equipment
CN114064160A (en) Application icon layout method and related device
CN113542574A (en) Shooting preview method under zooming, terminal, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant