CN117472603A - Data transmission method, electronic equipment and cooperative work system


Info

Publication number: CN117472603A
Application number: CN202210859322.6A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: shared memory, module, data, camera, image
Inventor: 高蕾 (Gao Lei)
Current assignee: Honor Device Co Ltd
Original assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 — Multiprogramming arrangements
    • G06F 9/54 — Interprogram communication
    • G06F 9/544 — Buffers; Shared memory; Pipes
    • G06F 2209/00 — Indexing scheme relating to G06F9/00
    • G06F 2209/54 — Indexing scheme relating to G06F9/54
    • G06F 2209/543 — Local

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

An embodiment of the present application provides a data transmission method, an electronic device, and a cooperative work system. The method is applied to an electronic device that includes a first process and a second process, where the first process is used for running a target application and the second process is used for transmitting data to the first process. The second process includes a first module and a second module: the first module adapts between the first process and the second module, and the second module receives the data. The first process and the second process communicate through AIDL, and the first module and the second module communicate through JNI. The first module creates a shared memory that is used to transfer data between the first process and the second process; the second module copies the received data into the shared memory, and the target application obtains the data from the shared memory and copies it into the user space of the first process. Cross-process data transmission is thereby achieved with few copy operations, which increases the speed of cross-process data transmission and reduces the time consumed by the transmission.

Description

Data transmission method, electronic equipment and cooperative work system
Technical Field
The application relates to the technical field of intelligent terminals, in particular to a data transmission method, electronic equipment and a cooperative work system.
Background
Because processes in the Android system cannot directly share memory, mechanisms are needed for data communication between different processes. In a conventional communication mechanism, transferring data across processes requires four copy operations. Taking process A transferring data to process B as an example: the data is first copied from the user space of process A to the kernel space of process A, then from the kernel space of process A to memory, then from memory to the kernel space of process B, and finally from the kernel space of process B to the user space of process B.
Such copy operations inevitably slow data transfer between processes and lengthen the transfer time. How to increase the speed of cross-process data transmission and reduce its time consumption is therefore a problem to be solved.
Disclosure of Invention
To solve the above technical problems, embodiments of the present application provide a data transmission method, an electronic device, and a cooperative work system. In the data transmission method, few data copies are needed during cross-process transmission, which increases the speed of cross-process data transmission and reduces the transmission time.
In a first aspect, an embodiment of the present application provides a data transmission method. The method is applied to an electronic device, where the electronic device includes a first process and a second process, the first process is used for running a target application, and the second process is used for transmitting target data to the first process. The second process includes a JAVA-based first module and a C++-based second module, and the second module is used for receiving the target data. The first process and the second process communicate through AIDL, and the first module and the second module communicate through JNI. The method includes the following steps:
the second module obtains the frame size of the target data and sends the frame size to the first module;
the first module creates a shared memory according to the frame size and sends the shared memory address to the second module;
the second module fills the target data into the shared memory according to the shared memory address and sends the filling result to the first module;
the first module sends the shared memory address to the target application when the filling result indicates that the filling is successful;
and the target application acquires target data according to the shared memory address.
The target application may be any application program.
In this way, the first module creates the shared memory used to transfer data between the first process and the second process, the second module copies the received target data into the shared memory, and the target application obtains the target data from the shared memory and copies it into the user space of the first process. Cross-process data transmission is thus achieved with fewer copy operations, which increases the speed of cross-process transmission and reduces its time consumption.
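By way of illustration only, this handshake could be sketched in Java roughly as follows. This is a minimal sketch, assuming Android's android.os.SharedMemory API (API level 27+); the class names, the native methods standing in for the C++-based second module, and the AIDL-style callback are all hypothetical, not taken from the patent:

```java
import android.os.SharedMemory;
import android.system.ErrnoException;
import java.nio.ByteBuffer;

// Hypothetical AIDL-style callback into the first process.
interface TargetAppCallback {
    void onFrameReady(SharedMemory shm);
}

// Hypothetical JAVA-based first module inside the second process.
public class FirstModule {
    // JNI bindings into the hypothetical C++-based second module.
    private native int nativeGetFrameSize();                         // step 1: frame size of the target data
    private native boolean nativeFillSharedMemory(SharedMemory shm); // step 3: copy the target data in

    // Steps 2-4: create the shared memory from the frame size, let the second
    // module fill it, then hand the region to the target application when the
    // filling result indicates success.
    public void transferFrame(TargetAppCallback callback) throws ErrnoException {
        int frameSize = nativeGetFrameSize();
        SharedMemory shm = SharedMemory.create("frame_buffer", frameSize);
        if (nativeFillSharedMemory(shm)) {
            // SharedMemory is Parcelable, so it can cross the AIDL boundary
            // to the first process without an extra kernel copy.
            callback.onFrameReady(shm);
        }
    }
}

// Step 5, in the first process: the target application maps the region
// read-only and performs the single copy into its own user space.
class TargetAppReceiver {
    byte[] receive(SharedMemory shm) throws ErrnoException {
        ByteBuffer buffer = shm.mapReadOnly();
        byte[] data = new byte[buffer.remaining()];
        buffer.get(data);
        SharedMemory.unmap(buffer);
        return data;
    }
}
```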
According to the first aspect, the electronic device is communicatively connected to an internet of things device, and the target application is bound to the internet of things device. Before the second module obtains the frame size of the target data, the method further includes: the target application sends a data request to the internet of things device through the second process; and the second module receives the target data fed back by the internet of things device according to the data request.
According to the first aspect, or any implementation manner of the first aspect, the data request includes a task identifier, and the target data carries the task identifier; wherein after the first module creates the shared memory according to the frame size, the method further comprises: the first module correspondingly stores the task identifier and the shared memory address in a shared memory pool;
the second module sending the filling result to the first module may include: the second module sends the filling result and the task identifier to the first module;
before the first module sends the shared memory address to the target application, the method further comprises: the first module queries a corresponding shared memory address in the shared memory pool according to the task identifier.
Because the transmission flow of the photographing request and the captured image data spans multiple modules, and each module processes the flow asynchronously, using the task identifier as the unique identifier of a data transmission ensures that the asynchronous flow executes accurately, that is, that data is transmitted safely and efficiently across multiple modules and threads, and avoids the situation where the target application cannot obtain the correct data.
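A rough Java sketch of this correlation, reusing the hypothetical names from the earlier example, might key every in-flight transfer on its task identifier:

```java
import android.os.SharedMemory;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch: the first module records each created shared memory
// under its task identifier, so that the asynchronous filling result coming
// back from the second module can be resolved to the right region.
public class TransferTracker {
    private final Map<String, SharedMemory> inFlight = new ConcurrentHashMap<>();

    // Called after the shared memory is created for a given data request.
    public void register(String taskId, SharedMemory shm) {
        inFlight.put(taskId, shm);
    }

    // Called when the second module reports the filling result together with
    // the task identifier; only on success is the address forwarded.
    public void onFillResult(String taskId, boolean success, TargetAppCallback callback) {
        SharedMemory shm = inFlight.remove(taskId);
        if (success && shm != null) {
            callback.onFrameReady(shm);
        }
    }
}
```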
According to the first aspect, or any implementation manner of the first aspect, the storing, by the first module, of the task identifier and the shared memory address in the shared memory pool may include:

when the number of shared memory addresses stored in the shared memory pool is smaller than a shared memory maintenance number threshold, the first module stores the task identifier and the shared memory address, in correspondence, in the shared memory pool; the shared memory maintenance number threshold indicates the total number of memory addresses that the shared memory pool can maintain;

when the number of shared memory addresses stored in the shared memory pool is greater than or equal to the shared memory maintenance number threshold, the first module moves a target shared memory address out of the shared memory pool and then stores the task identifier and the shared memory address, in correspondence, in the shared memory pool; among the shared memory addresses currently stored in the shared memory pool, the target shared memory address is the one that was stored earliest.
Thus, by setting a shared memory maintenance number threshold for the shared memory pool, unlimited expansion of the shared memory can be avoided.
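One bounded-pool sketch under these assumptions (hypothetical names, Java) uses a LinkedHashMap in insertion order, so that storing beyond the maintenance number threshold evicts the earliest-stored address first:

```java
import android.os.SharedMemory;
import java.util.LinkedHashMap;

// Hypothetical bounded shared memory pool: at most `maintenanceThreshold`
// addresses are kept; storing beyond that moves the earliest-stored entry out.
public class SharedMemoryPool {
    private final int maintenanceThreshold;
    private final LinkedHashMap<String, SharedMemory> entries = new LinkedHashMap<>();

    public SharedMemoryPool(int maintenanceThreshold) {
        this.maintenanceThreshold = maintenanceThreshold;
    }

    public synchronized void store(String taskId, SharedMemory shm) {
        if (entries.size() >= maintenanceThreshold) {
            // Evict the entry stored earliest (LinkedHashMap keeps insertion order).
            String eldest = entries.keySet().iterator().next();
            entries.remove(eldest).close(); // release this handle on the evicted region
        }
        entries.put(taskId, shm);
    }

    public synchronized SharedMemory lookup(String taskId) {
        return entries.get(taskId);
    }
}
```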
According to the first aspect, or any implementation manner of the first aspect, the data request further includes a data parameter, and the target data carries the data parameter. The storing, by the first module, of the task identifier and the shared memory address in a shared memory pool includes: the first module stores the task identifier and the shared memory address, in correspondence, in the shared memory pool matched with the data parameter value.
By way of example, the data parameter may be a parameter for distinguishing the target data, such as a mode or a format. Taking image data as the target data, the data parameter may be a photographing mode or the like.
In this way, when the data parameter values carried in different data requests differ, different shared memory pools can be used to manage the shared memory addresses corresponding to the respective data requests.
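Continuing the sketch (hypothetical names and threshold values), the first module could keep one pool per data parameter value, for example one per photographing mode, and route each task identifier into the matching pool:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical registry of per-parameter shared memory pools. The parameter
// values and thresholds below are illustrative, not taken from the patent.
public class PoolRegistry {
    private final Map<String, SharedMemoryPool> pools = new HashMap<>();

    public PoolRegistry() {
        pools.put("NORMAL_PHOTO", new SharedMemoryPool(2));  // few frames in flight
        pools.put("BURST_PHOTO", new SharedMemoryPool(16));  // many frames in a short time
    }

    // Returns the pool matched with the data parameter value carried in the request.
    public SharedMemoryPool poolFor(String dataParameterValue) {
        return pools.get(dataParameterValue);
    }
}
```

The two illustrative thresholds above anticipate the desk-lamp example later in this section, where a pool matched with the continuous (burst) photographing mode maintains more shared memories than the pool matched with the normal mode.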
According to the first aspect, or any implementation manner of the first aspect, shared memory pools matched with different data parameter values have different shared memory maintenance number thresholds.
In this way, different shared memory maintenance quantity thresholds are set for different shared memory pools, so that the method can be suitable for more application scenes.
According to a first aspect, or any implementation manner of the first aspect, the frame size of the target data is larger than a preset threshold.
Therefore, for the cross-process transmission of big data or big files, the data cross-process transmission speed is improved more obviously, and the transmission time consumption is reduced more obviously.
According to the first aspect, or any implementation manner of the first aspect, the internet of things device is a desk lamp; the desk lamp is provided with a camera; the second process is used for running hardware virtualization service, the first module is a device management module, and the second module is a transmission channel module; the data request comprises a photographing request, and the target data comprises photographed image data; the method further comprises the steps of: the hardware virtualization service registers a virtual camera corresponding to the camera in the system; and generating a photographing request when the target application calls the virtual camera.
Thus, the method can be applied to the scene that the electronic equipment and the Internet of things equipment work cooperatively. The electronic equipment and the Internet of things equipment work cooperatively, so that better use experience can be brought to the user. Compared with shooting by a user holding the electronic equipment, the image acquisition of the Internet of things equipment is more stable, and related operation of the user when using the target application in the electronic equipment is not influenced. Moreover, the Internet of things equipment does not need to have a display function, and the cost of the Internet of things equipment is reduced.
According to the first aspect, or any implementation manner of the first aspect, the data request further includes a data parameter, where the data parameter is a photographing mode, and the photographing mode includes a normal photographing mode and a continuous (burst) photographing mode. The shared memory maintenance number threshold of a first shared memory pool matched with the normal photographing mode is smaller than that of a second shared memory pool matched with the continuous photographing mode.
Therefore, the shared memory maintenance number threshold of the pool matched with the continuous photographing mode is larger than that of the pool matched with the normal photographing mode, which suits the large number of captured images the desk lamp returns in a short time in a continuous-shooting scenario. Conversely, for normal photographing requests, where the desk lamp returns few captured images in a short time, the pool need not maintain many shared memories, which avoids wasting resources.
According to the first aspect, or any implementation manner of the first aspect, the electronic device includes a mobile phone and a tablet computer.
In a second aspect, embodiments of the present application provide an electronic device. The electronic device includes: one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored on the memory, which when executed by the one or more processors, cause the electronic device to perform the data transmission method of the first aspect and any implementation of the first aspect.
The second aspect and any implementation manner of the second aspect correspond, respectively, to the first aspect and any implementation manner of the first aspect. For the technical effects corresponding to the second aspect and any implementation manner thereof, refer to the technical effects of the first aspect and its implementation manners; details are not repeated here.
In a third aspect, an embodiment of the present application provides a cooperative work system, including an electronic device that performs the data transmission method in the first aspect or any implementation manner of the first aspect, and an internet of things device. The internet of things device is provided with a camera for collecting image data, and a target application in the electronic device is bound with the internet of things device;
the electronic device is used for: registering a virtual camera corresponding to the camera in the system, and sending an image preview request to the Internet of things equipment by calling the virtual camera;
the internet of things device is used for: according to an image preview request of the electronic equipment, invoking a camera to collect preview image data, and sending the preview image data to a target application of the electronic equipment for preview display;
the electronic device is also for: generating a photographing request when the virtual camera is called, and sending the photographing request to the Internet of things equipment;
The internet of things device is further configured to: and calling a camera to shoot an image according to a shooting request of the electronic equipment, and sending shooting image data to a target application of the electronic equipment for display.
In this way, the cooperative work of the electronic device and the internet of things device brings a better use experience to the user, and the cross-process transmission of data returned by the internet of things device to the target application is faster and less time-consuming. Meanwhile, compared with the user holding the electronic device to shoot, the images collected by the internet of things device are more stable, and the user's operations in the target application on the electronic device are not affected. In addition, the internet of things device does not need a display function, which reduces its cost.
According to a third aspect, the internet of things device is a desk lamp, and the camera is used for downwardly collecting image data.
Therefore, compared with the user holding the electronic device to shoot, the images collected by the internet of things device are more stable, and the user's operations in the target application on the electronic device are not affected.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium. The computer-readable storage medium includes a computer program which, when run on an electronic device, causes the electronic device to perform the data transmission method in the first aspect or any implementation manner of the first aspect.
Any implementation manner of the fourth aspect corresponds to the first aspect or any implementation manner thereof. For the corresponding technical effects, refer to those of the first aspect and its implementation manners; details are not repeated here.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program, which when executed, causes a computer to perform the data transmission method of any one of the implementations of the first aspect and the first aspect described above.
Any implementation manner of the fifth aspect corresponds to the first aspect or any implementation manner thereof. For the corresponding technical effects, refer to those of the first aspect and its implementation manners; details are not repeated here.
In a sixth aspect, the present application provides a chip including a processing circuit and transceiver pins. The transceiver pins and the processing circuit communicate with each other through an internal connection path, and the processing circuit performs the data transmission method in the first aspect or any implementation manner of the first aspect to control the receiving pin to receive signals and the transmitting pin to transmit signals.
Any implementation manner of the sixth aspect corresponds to the first aspect or any implementation manner thereof. For the corresponding technical effects, refer to those of the first aspect and its implementation manners; details are not repeated here.
Drawings
Figs. 1a-1b are exemplary illustrations of an application scenario;
Fig. 2a is a schematic diagram of a hardware structure of an exemplary electronic device;
Fig. 2b is a schematic diagram of a software structure of an exemplary electronic device;
Fig. 3a is a schematic diagram of a hardware structure of an exemplary internet of things terminal;
Fig. 3b is a schematic diagram of a software structure of an exemplary internet of things terminal;
Fig. 4a is a schematic diagram of module interaction according to an embodiment of the present application;
Fig. 4b is a schematic diagram of module interaction according to an embodiment of the present application;
Figs. 5a-5b are exemplary illustrations of an application scenario;
Fig. 6a is a schematic diagram of module interaction according to an embodiment of the present application;
Fig. 6b is a schematic diagram of module interaction according to an embodiment of the present application;
Figs. 7a-7b are exemplary illustrations of an application scenario;
Fig. 8 is a schematic diagram of module interaction according to an embodiment of the present application;
Figs. 9a-9c are exemplary illustrations of an application scenario;
Fig. 10 is a schematic diagram of module interaction according to an embodiment of the present application;
Figs. 11a-11b are exemplary illustrations of an application scenario;
Fig. 12 is a schematic diagram of a communication interface during inter-process data transmission;
Fig. 13 is a schematic diagram of a data copy flow involved in inter-process data transmission;
Fig. 14 is a schematic diagram of a data copy flow involved in inter-process data transmission according to an embodiment of the present application;
Fig. 15 is a schematic diagram of module interaction during inter-process data transmission according to an embodiment of the present application;
Fig. 16 is a schematic diagram of shared memory pool management;
Fig. 17 is a schematic diagram of shared memory pool management.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone.
The terms first and second and the like in the description and in the claims of embodiments of the present application are used for distinguishing between different objects and not necessarily for describing a particular sequential order of objects. For example, the first target object and the second target object, etc., are used to distinguish between different target objects, and are not used to describe a particular order of target objects.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more. For example, the plurality of processing units refers to two or more processing units; the plurality of systems means two or more systems.
With the development of the internet, online education is popular with more and more people, and users (such as students) have an increasing demand for online education. In some application scenarios, when a new word is encountered, students can perform online word searching to obtain relevant interpretation; in some application scenarios, online reading of book content is more convenient for students to learn knowledge and pronunciation; in some application scenarios, students need to submit their jobs online. Therefore, how to meet the online education needs of users based on intelligent devices is a problem to be solved.
At present, aiming at an online education scene, a user usually uses an intelligent learning machine with a shooting function and a display function, and the intelligent learning machine needs a camera or a reflecting mirror at a special position to shoot books, so that the universality and the usability are not strong. In addition, the equipment with the shooting function and the display function needs stronger hardware and system support, and the equipment cost is higher. Furthermore, how to provide better online education experience for users based on intelligent equipment, improves universality and usability, reduces online education cost and is a problem to be solved.
The cooperative work system provided by the embodiments of the present application can be applied to an online education scenario. The cooperative work system includes an electronic device and a desk lamp that are communicatively connected, where a camera is arranged on the desk lamp and can photograph books downward. The electronic device invokes the desk lamp's camera to collect images and combines them with the platform's online education resources to meet users' online education needs. The electronic device may be a tablet computer or a mobile phone. Beyond the online education scenario, the electronic device and the desk lamp can still serve users through their respective basic functions (i.e., communication and lighting). The cooperative work system can therefore create a better online education experience based on two smart devices users already own, and has strong universality and usability. In addition, since a tablet computer or mobile phone is already a household necessity, and a desk lamp with a shooting function costs less than a device with both shooting and display functions, the user's online education cost is greatly reduced.
The technical solution provided by the present application is explained below by taking a tablet computer as the electronic device as an example.
Fig. 1a shows an exemplary application scenario. As shown in fig. 1a, the co-operating system comprises a tablet 100 and a desk lamp 200 which establish a communication connection. The desk lamp 200 includes a camera 201 for capturing images downwards, for example, for capturing text or picture contents in a book downwards. An education APP (Application) is installed in the tablet 100, and the education APP may call the camera 201 of the desk lamp 200 to collect images, and provide various online education functions, such as online word checking, online reading, online submission, etc., for the user according to the images collected by the camera 201 of the desk lamp 200.
Although the tablet is also provided with front and rear cameras, no matter which camera is used to photograph a book, the user must hold the tablet to aim the camera at the book, so the shooting is unstable and the finger-reading and point-reading operations performed by the user's hands are affected; a better online education experience cannot be provided this way. As shown in fig. 1a, the tablet 100 and the desk lamp 200 can be placed at fixed positions, and the tablet 100 uses the camera 201 of the desk lamp 200 to photograph a book. The captured images are stable, so the success rate of content identification is high, and the user can flexibly perform finger-reading and point-reading operations on the book. The linkage of the tablet computer and the desk lamp can therefore provide a better online education experience for the user.
As shown in fig. 1b, the tablet 100 and the desk lamp 200 may perform near field communication or far field communication. The near field communication can complete information interaction among devices through the router and other devices, and the far field communication can complete information interaction among devices through the cloud server. Illustratively, the tablet 100 and the desk lamp 200 may implement near field communication based on Wi-Fi (wireless fidelity ) network protocols or the like.
Fig. 2a is a schematic structural diagram of the electronic device 100. Alternatively, the electronic device 100 may be a terminal, which may also be referred to as a terminal device, and the terminal may be a cellular phone (cellular phone) or a tablet computer (pad), which is not limited in this application. It should be noted that the schematic structural diagram of the electronic device 100 may be applied to the flat panel in fig. 1a to 1 b. It should be understood that the electronic device 100 shown in fig. 2a is only one example of an electronic device, and that the electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 2a may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor, a gyroscope sensor, an acceleration sensor, a temperature sensor, a motion sensor, a barometric sensor, a magnetic sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. It can also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wi-Fi network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied on the electronic device 100.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
In the embodiment of the present application, the display screen 194 may display a photographing preview interface, a photographing image interface, and the like. It should be noted that, in the embodiment of the present application, the shooting preview interface refers to an interface where a user can view an image collected by the desk lamp camera in real time through the display screen 194.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121, for example, to cause the electronic device 100 to implement a cooperative method and/or a data transfer method in the embodiments of the present application. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A. In some embodiments, the electronic device 100 may be provided with a plurality of speakers 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "microphone" or "microphone", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can sound near the microphone 170C through the mouth, inputting a sound signal to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, and may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor is used for sensing a pressure signal and can convert the pressure signal into an electric signal. In some embodiments, the pressure sensor may be provided on the display screen 194. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor.
Touch sensors, also known as "touch panels". The touch sensor may be disposed on the display screen 194, and the touch sensor and the display screen 194 form a touch screen, which is also referred to as a "touch screen". The touch sensor is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type.
The keys 190 include a power-on key (or power key), a volume key, etc. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 2b is a software architecture block diagram of the electronic device 100 according to an embodiment of the present application.
The layered architecture of the electronic device 100 divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers: from top to bottom, an application layer, an application framework layer, a system layer, a HAL (Hardware Abstraction Layer), and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 2b, the application packages may include conversations, video, bluetooth, camera, WLAN, educational applications, device manager applications, and the like. The application packages may also include calendar, map, navigation, music, short messages, etc. applications.
Among other things, the educational application may be used to provide online educational functions for the user, such as online word recognition, online reading aloud, online submission jobs, and the like.
In some examples, a device manager application may be used to bind IOT (Internet of Things) devices such as desk lamps. In some examples, the educational application itself may bind such IOT devices.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 2b, the application framework layer may include a camera service, an authentication service, a hardware virtualization service, a device management service, a transmission management service, and the like.
Among other things, a camera service (camera service) may be used to invoke a camera (including a front-facing camera and/or a rear-facing camera) in response to a request of an application.
In the embodiment of the application, the camera service may be used for calling the virtual camera at the electronic device side, that is, calling the camera in the IOT device, in response to a request of the application.
Authentication services are used to provide secure rights management capabilities.
A hardware virtualization service may be used to establish a logical channel between the electronic device side (i.e., the center device side) and the IOT device side, providing the ability to virtualize a camera.
The device management service can be used for discovering and managing the IOT devices and providing far-field (i.e. cloud) IOT device information and near-field (i.e. near-field connectable) IOT device information for application programs such as education applications.
The transmission management service can be used for establishing a physical transmission channel and providing data transmission capability.
In addition, a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, etc. may be included.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make the data accessible to applications. The data may include videos, images, audio, calls made and received, browsing history and bookmarks, phonebooks, and the like.

The view system includes visual controls, such as controls for displaying text and controls for displaying pictures. The view system may be used to build applications. A display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.

The telephony manager is used to provide the communication functions of the electronic device 100, for example, management of call status (including connected, hung up, and the like).

The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files.

The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction.
The system library and Runtime layer (i.e., system layer) includes a system library and Android Runtime (Android run time).
Android Runtime includes a core library and virtual machines. Android Runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in virtual machines. A virtual machine executes the java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
In the embodiment of the present application, the Android Runtime further includes a virtual camera adaptation layer, which provides virtual camera registration capability.
The system library in the system layer may include a plurality of functional modules. For example: multimedia platform, graphic image processing library, codec, etc.
The multimedia platform can be used for managing multimedia and supporting various common audio, video format playback and recording, still image files and the like. The multimedia platform may support a variety of audio and video coding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
Graphics image processing libraries may be used to implement graphics drawing, image rendering, compositing, and layer processing, among others.
The codec may be used to implement codec operations on audio data, video data.
The HAL layer is an interface layer between the operating system kernel and the hardware circuitry. HAL layers include, but are not limited to: audio HAL, sensor HAL, modem HAL, camera HAL, virtual camera HAL.
Wherein the audio HAL is used for processing the audio stream, for example, noise reduction, directional enhancement, etc. of the audio stream. The camera HAL is used for processing the image stream corresponding to the camera at the electronic equipment side, and the virtual camera HAL is used for processing the image stream corresponding to the virtual camera registered at the electronic equipment side, namely, the image stream acquired by the camera at the IOT equipment side.
The kernel layer is the layer between hardware and software. The kernel layer at least includes a display driver, a camera driver, an audio driver, a network driver (such as a Wi-Fi driver), a CPU driver, a USB driver, a storage driver, a print driver, and the like. The hardware at least includes a processor, a display screen, a Wi-Fi module, and the like.
It will be appreciated that the layers and components contained in the layers in the software structure shown in fig. 2b do not constitute a specific limitation of the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer layers than shown, and more or fewer components may be included in each layer, as the present application is not limited.
Fig. 3a is a schematic diagram of a hardware structure of the internet of things device 200. It should be noted that the schematic structural diagram of the internet of things device 200 may be applicable to the desk lamp in fig. 1a to 1 b. It should be understood that the internet of things device 200 shown in fig. 3a is only one example of an electronic device, and that the internet of things device 200 may have more or fewer components than shown in the figures, may combine two or more components, or may have different configurations of components. The various components shown in fig. 3a may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The internet of things device 200 may include: a processor 210, a camera 201, a wireless communication module 202, a memory 203, an audio module 204, a USB interface 205, a charge management module 206, a power management module 207, a battery 208, a lighting device 209, keys 211, and the like.
Processor 210 may include one or more processing units such as, for example: the processor 210 may include a GPU, ISP, controller, memory, video codec, etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the internet of things device 200. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
The camera 201 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. Taking the desk lamp as an example, the camera 201 may be arranged on the desk lamp stand to capture images downward.
The internet of things device 200 may implement a photographing function through an ISP, a camera 201, a video codec, a GPU, etc.
The ISP is used to process the data fed back by the camera 201. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, an ISP may be provided in the camera 201.
The wireless communication module 202 may provide solutions for wireless communication including WLAN (e.g., wi-Fi network), bluetooth (BT), etc. for use on the internet of things device 200. In some embodiments, the antenna of the internet of things device 200 and the wireless communication module 202 are coupled such that the internet of things device 200 can communicate with a network and other devices through wireless communication techniques.
Memory 203 may be used to store computer executable program code that includes instructions. The processor 210 executes the instructions stored in the memory 203, thereby performing various functional applications and data processing of the internet of things device 200, for example, to enable the internet of things device 200 to implement the cooperative working method in the embodiments of the present application.
The internet of things device 200 may implement audio functions, such as music playing, etc., through the audio module 204, the speaker 212, etc.
The USB interface 205 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 205 may be used to connect a charger to charge the internet of things device 200, or may be used to transfer data between the internet of things device 200 and a peripheral device.
The charge management module 206 is configured to receive a charge input from a charger. The charging management module 206 may also supply power to the internet of things device 200 through the power management module 207 while charging the battery 208.
The power management module 207 is used to connect the battery 208, the charge management module 206 and the processor 210. The power management module 207 receives input from the battery 208 and/or the charge management module 206 and provides power to the processor 210, the memory 203, the camera 201, the wireless communication module 202, the lighting device 209, and the like.
The keys 211 include a power-on key (or power key), and the like.
The software system of the internet of things device 200 may employ a layered architecture or other architecture, etc. The embodiment of the application takes a layered architecture as an example, and illustrates a software structure of the internet of things device 200.
Fig. 3b is a software architecture block diagram of the internet of things device 200 according to an embodiment of the present application.
The layered architecture of the internet of things device 200 divides the software into several layers, each of which has a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the system of the internet of things device 200 is divided into four layers: from top to bottom, an application layer, an application framework layer, a system layer, and a kernel layer.
As shown in fig. 3b, the application layer may include a device application service, which may be understood as a system level application, and the device application service is started after the system of the internet of things device 200 is started.
As shown in fig. 3b, the application framework layer may include device interconnection services, hardware abstraction services, resource managers, and the like.
The device interconnection service can be used to establish a physical transmission channel, provide data transmission capability, and control whether the hardware abstraction service is started.
The hardware abstraction service may be used to establish a logical channel between the electronic device side (i.e., the hub device side) and the IOT device, provide the ability to virtualize the camera, and provide a camera open interface for the IOT device.
The resource manager may provide various resources to the application.
As shown in fig. 3b, the system layer may include a multimedia platform, a graphic image processing library, a codec, a device adaptation module, and the like.
The multimedia platform can be used for managing multimedia and supporting various common audio, video, still image files and the like. The multimedia platform may support a variety of audio and video coding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
Graphics image processing libraries may be used to implement graphics drawing, image rendering, compositing, and layer processing, among others.
The codec may be used to implement codec operations on audio data, video data.
The device adaptation module implements the interface of the hardware abstraction service; it provides device information and capability queries, and also provides functions for executing related operations on the IOT device side, such as opening the camera, photographing, and previewing.
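Purely as an illustration, the shape of such an interface might resemble the following Java sketch; all type and method names are hypothetical and are not taken from this embodiment:

public interface DeviceAdapter {

    /** Minimal device description holder (name, identifier, type). */
    final class DeviceInfo {
        public final String name;
        public final String id;
        public final String type;
        public DeviceInfo(String name, String id, String type) {
            this.name = name;
            this.id = id;
            this.type = type;
        }
    }

    DeviceInfo queryDeviceInfo();                       // device information query
    boolean supportsVirtualCamera();                    // capability query
    void openCamera(String cameraId);                   // open the IOT-side camera
    void startPreview(int width, int height, int fps);  // begin preview capture
    byte[] takePhoto(int width, int height);            // capture one encoded still image
}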
It may be appreciated that, in order to implement the cooperative method in the embodiments of the present application, the electronic device 100 and the internet of things device 200 include corresponding hardware and/or software modules that perform the respective functions. The steps of an algorithm for each example described in connection with the embodiments disclosed herein may be embodied in hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation is not to be considered as outside the scope of this application.
FIG. 4a is a schematic diagram showing the interaction of the modules. Referring to fig. 4a, the embodiment of the present application provides a method flow for cooperative work of a tablet and a desk lamp, specifically including:
0. Device service initialization phase
S0.1, in response to a user operation, the device application service of the desk lamp is started and loads the device interconnection service.
For example, the user operation may be an operation in which the user turns on the desk lamp power supply. In response to the user operation, the desk lamp system starts, the device application service starts, and the device interconnection service is loaded. The device interconnection service may be used to establish a physical transmission channel between the tablet and the desk lamp and to provide data transmission capability.
S0.2, the device interconnection service of the desk lamp loads the hardware abstraction service.
The device interconnection service may also control the opening of the hardware abstraction service. For example, after the device interconnection service is started, it may load the hardware abstraction service in the form of a plug-in. The hardware abstraction service can be used to establish a logical channel between the tablet and the desk lamp, provide the virtualized camera capability, and provide an open interface for the desk lamp camera.
Referring to the module interaction diagram shown in fig. 4b, the hardware abstraction service may include at least a base component and a camera component. In the device service initialization phase, the device interconnection service first loads and initializes the base component. After initialization, the base component can exchange information with the device adaptation module of the desk lamp and obtain the device information and virtualization capability information. Exemplary device information includes, but is not limited to, the device name, device identifier, device type, and the like. Exemplary virtualization capability information includes, but is not limited to, whether a virtualized camera is supported, whether a virtualized microphone is supported, and the like.
The desk lamp's support for a virtualized camera can be understood as follows: the camera of the desk lamp allows other electronic devices (such as the tablet) to invoke it; that is, the camera of the desk lamp can act as a virtual camera of those other electronic devices.
After the base component obtains the device information and capability information of the desk lamp, if the desk lamp supports a virtualized camera, the base component loads the camera component to provide the virtualized camera capability. At this point, the base component prepares to establish a negotiation channel, over which network connection information (including but not limited to the IP address and port) is negotiated with the tablet. During this preparation, the base component creates a session service (Session Server) and sends the session name (Session Name) of the session service to the device interconnection service, so that a negotiation channel can be established between the transmission management service on the tablet side and the device interconnection service on the desk lamp side.
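For intuition only, the negotiation-channel preparation can be sketched as below, assuming the channel ultimately rests on a socket listener whose session name is published out of band; the class and all names are illustrative, not the actual session service:

import java.io.IOException;
import java.net.ServerSocket;

// Illustrative sketch only: the negotiation channel is modeled as a plain
// socket listener whose "session name" is published out of band.
public final class NegotiationSessionServer {
    private final String sessionName;
    private final ServerSocket listener;

    public NegotiationSessionServer(String sessionName) throws IOException {
        this.sessionName = sessionName;
        this.listener = new ServerSocket(0);  // ephemeral port; the peer learns it during negotiation
    }

    public String sessionName() { return sessionName; }

    public int port() { return listener.getLocalPort(); }

    public static void main(String[] args) throws IOException {
        NegotiationSessionServer server = new NegotiationSessionServer("lamp_camera_negotiation");
        // In the flow above, the session name would now be handed to the device
        // interconnection service so the tablet side can establish the channel.
        System.out.println(server.sessionName() + " listening on port " + server.port());
    }
}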
1. Device discovery phase
S1.1, in response to a user operation, the education APP of the tablet transmits a device discovery instruction to the device management service.
The user operation may be an operation in which the user clicks, in the education APP, a function option that needs to call the virtual camera. For example, the user may click the click-to-read, word search, job, or photographing function option in the education APP.
The education APP of the tablet receives the user operation, and in response to the operation, transmits a device discovery instruction to the device management service of the tablet. The device discovery instruction is used for indicating to search the IOT device capable of establishing connection with the tablet. By way of example, the device discovery instructions may include, but are not limited to, an instruction type and a device type to be discovered. In this embodiment, the device discovery instruction is specifically configured to find a desk lamp that can establish a connection with the tablet.
S1.2, the device management service in the tablet invokes an authentication service to authenticate the education APP, and an authentication result of the education APP is obtained.
After receiving the device discovery instruction, the device management service can acquire the name (or identification) of the education APP based on the existing mechanism of the android system, and perform APP authentication on the education APP according to the name of the education APP. The device management service can invoke the authentication service to authenticate the education APP so as to obtain an authentication result of the education APP.
In this embodiment, the tablet side application framework layer is further provided with a device management API corresponding to the device management service, and a hardware virtualization API corresponding to the hardware virtualization service. In order to implement the technical solution provided in this embodiment, the education APP needs to register on a related platform (for example, a platform provided by a tablet manufacturer), adapt a framework of a device management service, a hardware virtualization service, and a transmission management service, and apply for rights of a device management API and a hardware virtualization API.
Illustratively, the authentication service accesses an authentication server to authenticate the educational APP through the authentication server, including, but not limited to, whether the authentication is registered on the relevant platform, whether the relevant framework is adapted, and whether the relevant API rights are applied.
Also exemplary, the authentication service may authenticate the educational APP according to a local whitelist.
After obtaining the authentication result of the education APP (authentication success or authentication failure), the authentication service sends the authentication result to the device management service.
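A minimal sketch of the local-whitelist path might look as follows; the whitelist entry is an invented example, not an APP named by this embodiment:

import java.util.Set;

// Minimal sketch of local-whitelist APP authentication.
public final class LocalAppAuthenticator {
    private static final Set<String> WHITELIST = Set.of("com.example.education");

    /** Returns true (authentication success) when the APP name is whitelisted. */
    public static boolean authenticate(String appName) {
        return appName != null && WHITELIST.contains(appName);
    }
}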
S1.3, the device management service in the tablet transmits a device search instruction to the transmission management service when the education APP authentication is successful.
If the education APP authentication is successful, the device management service sends a device search instruction to the transmission management service. The device search instruction may include, but is not limited to, an instruction type, a device type to be searched, and a search mode. Exemplary search means include, but are not limited to, near field device scanning and obtaining device information from a cloud server. In this embodiment, the device to be searched is a desk lamp.
S1.4, the transmission management service in the tablet acquires a far-near field device list according to the device search instruction, and sends the far-near field device list to the device management service.
The near-far field device list includes a far-field device list and a near-field device list. The far-field devices included in the far-field device list refer to registered devices acquired from the cloud server, and the near-field devices included in the near-field device list refer to devices scanned through near-field communication. In the far field device list and the near field device list, the device information includes, but is not limited to, a device name, a device identification, a device type, and the like.
When the transmission management service receives the device search instruction, it performs the corresponding device search operations according to the device type to be searched and the search mode carried in the instruction, such as performing a near-field device scan and acquiring device information from the cloud server; it then obtains the far-field device list and the near-field device list and sends them to the device management service.
S1.5, the device management service in the tablet performs device filtering according to the far-near field device lists and reports the filtered device information to the education APP.
The device management service performs device filtering according to the far-field device list and the near-field device list, determines the desk lamp information that can be linked with the tablet, and sends the desk lamp information to the education APP. The device management service may perform an intersection operation on the two lists, filtering out desk lamps that exist in only the far-field list or only the near-field list, and treating desk lamps present in both lists as desk lamps that can be linked with the tablet. In this way, the device management service filters out desk lamps that are not registered in the cloud server, as well as desk lamps that cannot perform near-field communication with the tablet.
In another alternative embodiment, if the tablet and the desk lamp are under the same LAN, the transmission management service of the tablet may obtain a communication device list and a registered device list according to the device search instruction. The devices in the communication device list are devices scanned through near-field or far-field communication, and the devices in the registered device list are registered devices acquired from the cloud server. In both lists, the device information includes, but is not limited to, the device name, device identifier, device type, and the like.
The transmission management service in the tablet sends the communication device list and the registered device list to the device management service, which performs device filtering according to the two lists and reports the filtered device information to the education APP. The device management service may perform an intersection operation on the communication device list and the registered device list, filtering out desk lamps that exist in only one of the lists, and treating desk lamps present in both lists as desk lamps that can be linked with the tablet. In this way, the device management service filters out desk lamps that are not registered in the cloud server, as well as desk lamps that cannot communicate with the tablet.
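Both variants reduce to the same intersection operation, which might be sketched as follows; the Device record and its fields are illustrative, not taken from this embodiment:

import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch of the intersection filtering used in both variants above: a desk
// lamp is linkable only if its identifier appears in both lists.
public final class DeviceFilter {
    public record Device(String id, String name, String type) {}

    /** Keeps the devices whose identifiers appear in both lists. */
    public static List<Device> linkable(List<Device> listA, List<Device> listB) {
        Set<String> idsInB = new HashSet<>();
        for (Device d : listB) {
            idsInB.add(d.id());
        }
        List<Device> result = new ArrayList<>();
        for (Device d : listA) {
            if (idsInB.contains(d.id())) {
                result.add(d);  // registered AND reachable, so it survives the filter
            }
        }
        return result;
    }
}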
2. Virtual camera enabled phase
S2.1, the education APP of the tablet determines the desk lamp to be linked.
The device management service may filter out one or more desk lamps that can be linked with the tablet. When there is exactly one desk lamp, the education APP uses it as the desk lamp to be linked by default; when there are multiple desk lamps, the education APP may display a list of candidate desk lamps so that the user can select one, and the education APP determines the desk lamp to be linked in response to the user's selection operation.
It should be noted that the step of the education APP determining the desk lamp to be linked may also be attributed to the device discovery phase, which this embodiment does not limit.
S2.2, the education APP of the tablet performs device verification and device connection for the desk lamp, and learns that the desk lamp supports a virtualized camera.
S2.3, the education APP of the tablet sends a virtual camera enabling request to the hardware virtualization service.
After learning that the desk lamp supports a virtualized camera, the education APP of the tablet sends a virtual camera enable request to the hardware virtualization service. The virtual camera enable request instructs that the virtual camera be registered in the virtual camera HAL, and may include, but is not limited to, the request type, device name, device identifier, device type, and the identifier of the virtual camera.
S2.4, the hardware virtualization service of the tablet registers the virtual camera with the virtual camera HAL.
After receiving the virtual camera enabling request, the hardware virtualization service registers the corresponding virtual camera with the virtual camera HAL according to the virtual camera enabling request.
S2.5, the tablet's virtual camera HAL sends a virtual camera enable success indication to the education APP after the virtual camera registration is completed.
The flow of the virtual camera enabled phase is explained in detail below with reference to the schematic interaction diagram of the modules shown in fig. 4 b. Referring to fig. 4b, the flow of the virtual camera enable phase mainly includes a device check sub-phase (S301-S309), a device connection sub-phase (S310-S321), a device service capability request sub-phase (S322-S325), and a virtual camera enable sub-phase (S326-S331).
Referring to fig. 4b, the process of the virtual camera enabling phase specifically includes the following steps:
S301, the education APP in the tablet sends a virtual camera enabling instruction to the hardware virtualization API.
The virtual camera enabling instruction is used for indicating to enable the virtual camera, and the virtual camera enabling instruction can include but is not limited to an instruction type, a device name, a device identifier and a device type.
S302, after receiving the virtual camera enabling instruction, the hardware virtualization API in the tablet sends a device checking instruction to an interface scheduling module of the hardware virtualization service.
The device checking instruction instructs that the device information carried in the virtual camera enabling instruction be checked, and may include, but is not limited to, the instruction type, device name, device identifier, and device type.
S303, the interface scheduling module of the hardware virtualization service in the tablet sends an APP authentication instruction to the rights management module of the hardware virtualization service.
After receiving the device checking instruction, the interface scheduling module of the hardware virtualization service first sends an APP authentication instruction to the rights management module so as to authenticate the APP that initiated the virtual camera enabling instruction. The APP authentication instruction may include, but is not limited to, the name of the APP.
S304, the rights management module of the hardware virtualization service in the tablet performs APP authentication on the education APP.
Illustratively, the rights management module may access an authentication server to authenticate the educational APP through the authentication server, including, but not limited to, whether the authentication is registered on the relevant platform, whether the relevant framework is adapted, and whether the relevant API rights are applied. The rights management module may access the authentication server through the authentication service, which is not limited in this embodiment.
S305, when the rights management module of the hardware virtualization service in the tablet successfully authenticates the education APP, it sends an authentication success indication to the interface scheduling module.
After the rights management module obtains the authentication result of the education APP: if authentication succeeds, it sends an authentication success indication to the interface scheduling module; if authentication fails, it sends an authentication failure indication to the hardware virtualization API, so that the hardware virtualization API returns, to the education APP, indication information that the APP lacks the required rights.
S306, when the interface scheduling module of the hardware virtualization service in the tablet determines that the education APP authentication is successful, a device verification instruction is sent to the device management module.
Upon receiving the authentication success indication and thereby determining that the education APP has been authenticated successfully, the interface scheduling module of the hardware virtualization service sends a device verification instruction to the device management module. The device verification instruction instructs that the state of the device to be linked be verified; in this embodiment, it specifically instructs that the state of the desk lamp to be linked be verified. The device verification instruction may include, but is not limited to, the instruction type, device name, device identifier, and device type.
S307, the device management module of the hardware virtualization service in the tablet sends a device information inquiry instruction to the device profile module of the device management service.
The device profile module of the device management service stores the information of the devices that are currently online.
After receiving the device verification instruction, the device management module of the hardware virtualization service sends a device information inquiry instruction to the device profile module of the device management service, wherein the device information inquiry instruction can include, but is not limited to, a device name, a device identifier and a device type.
S308, the device profile module of the device management service in the tablet returns device information to the device management module of the hardware virtualization service.
If the device profile module of the device management service finds the corresponding device according to the device information query instruction, it returns the device information to the device management module of the hardware virtualization service. The returned device information may include, but is not limited to, the device name, device identifier, device type, and online status.
If no corresponding device is found, the device profile module returns a null value to the device management module of the hardware virtualization service to indicate that no matching device was found. In that case, the device management module of the hardware virtualization service may send a device verification failure indication to the hardware virtualization API, so that the hardware virtualization API returns, to the education APP, indication information of the device verification failure.
S309, after receiving the device information, the device management module of the hardware virtualization service in the tablet sends a device verification success indication to the hardware virtualization API.
After the device management module of the hardware virtualization service receives the device information returned by the device profile module of the device management service, it sends a device verification success indication to the hardware virtualization API to indicate that verification of the desk lamp to be linked has succeeded.
S310, the hardware virtualization API in the tablet sends a device connection request to a device management module of the hardware virtualization service.
After confirming that the verification of the desk lamp to be linked is successful, the hardware virtualization API sends a device connection request to a device management module of the hardware virtualization service. The device connection request is used for indicating that network connection is established with the device to be linked, and in this embodiment, the device connection request is specifically used for indicating that network connection is established with the desk lamp to be linked. The device connection request may include, but is not limited to, a request type, a device name, a device identification, and a device type.
S311, after receiving the device connection request, the device management module of the hardware virtualization service executes a negotiation channel setup preparation operation and sends a negotiation channel opening request to the transmission management service.
After receiving the device connection request, the device management module of the hardware virtualization service prepares the negotiation channel: it creates a Session Server and sends the Session Name of the session service to the transmission management service. Once the negotiation channel preparation is complete, it sends a negotiation channel opening request to the transmission management service. The negotiation channel opening request instructs that the negotiation channel be established, and may include, but is not limited to, the peer device identifier (i.e., the desk lamp identifier) and the Session Name.
In this embodiment, the negotiation channel opening request is actively initiated by the tablet side, that is, the tablet needs to establish a connection with the desk lamp. At this time, the desk lamp may be understood as a server, and the tablet may be understood as a client that needs to access the server.
S312, the transmission management service in the tablet establishes a negotiation channel with the device interconnection service in the desk lamp.
After receiving the negotiation channel opening request, the transmission management service interacts with the device interconnection service in the desk lamp according to the Session Name to establish the negotiation channel. Establishing the negotiation channel specifically means establishing a session and determining the session identifier.
S313, the device interconnection service in the desk lamp sends a negotiation channel successful establishment instruction to the camera component in the hardware abstraction service.
After the negotiation channel has been established, the device interconnection service in the desk lamp sends a negotiation channel successful establishment indication to the camera component in the hardware abstraction service, indicating that the negotiation channel is complete and that a device currently needs to establish a connection. The indication may include, but is not limited to, the device information of the device needing to establish a connection (i.e., the tablet's device information) and the session identifier.
S314, the transmission management service in the tablet sends a negotiation channel successful establishment instruction to the device management module of the hardware virtualization service.
After the negotiation channel has been established, the transmission management service in the tablet sends a negotiation channel successful establishment indication to the device management module of the hardware virtualization service, indicating that the negotiation channel is complete and that a device currently needs to establish a connection. The indication may include, but is not limited to, the device information of the device needing to establish a connection (i.e., the desk lamp's device information) and the session identifier.
The present embodiment does not limit the execution order of S313 and S314.
S315, the device management module of the hardware virtualization service in the tablet sends a device negotiation request to the camera component of the hardware abstraction service in the desk lamp over the negotiation channel.
The device negotiation request may include, but is not limited to, device information (such as the device name, device identifier, and device type) and a control channel connection request.
S316, after the camera component of the hardware abstraction service in the desk lamp receives the device negotiation request, a control channel is prepared, and device negotiation information is returned to the device management module of the hardware virtualization service in the tablet.
After receiving the device negotiation request, the camera component of the hardware abstraction service in the desk lamp parses it to obtain the peer device information, determines the IP address and port to be monitored according to the control channel connection request, adds that IP address and port to the device negotiation information, and returns the device negotiation information to the device management module of the hardware virtualization service in the tablet.
It is noted that the device negotiation request and the device negotiation information are transmitted based on the established negotiation channel.
S317, after the device management module of the hardware virtualization service in the tablet receives the returned device negotiation information, the negotiation channel is closed.
Wherein closing the negotiation channel may specifically be closing the session. After the device management module of the hardware virtualization service in the tablet receives the returned device negotiation information, the session ends, and the device management module of the hardware virtualization service can close the corresponding session according to the session identifier.
S318, the device management module of the hardware virtualization service in the tablet sends a control channel opening request to the transmission management service.
The control channel opening request instructs that a network communication connection be established with the desk lamp. It may include, but is not limited to, the communication protocol, source IP, source port, destination IP, and destination port, where the destination IP and destination port are the IP and port monitored by the camera component of the hardware abstraction service in the desk lamp.
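For illustration, the fields listed above can be pictured as a simple value object; the names are assumptions rather than the actual wire format of the request:

// Illustrative value object for the control channel opening request fields.
public record ControlChannelOpenRequest(
        String protocol,    // the communication protocol, e.g. an assumed "tcp"
        String sourceIp,    // tablet side
        int sourcePort,
        String destIp,      // IP monitored by the lamp's camera component
        int destPort) {     // port monitored by the lamp's camera component
}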
S319, the transmission management service in the tablet is connected with a control channel of a camera component of the hardware abstraction service in the desk lamp, and a successful connection instruction of the control channel is sent to a device management module of the hardware virtualization service in the tablet.
The transmission management service in the tablet receives the control channel opening request, and establishes control channel connection with the table lamp according to the information carried by the control channel opening request, namely, establishes network communication connection between the tablet and the table lamp. Furthermore, the device management module of the hardware virtualization service in the tablet and the camera component of the hardware abstraction service in the desk lamp can perform network communication based on the control channel.
After the control channel is successfully established, the transmission management service in the tablet sends a control channel successful connection indication to the device management module of the hardware virtualization service in the tablet. The indication of successful connection of the control channel may include, but is not limited to, a connection success identifier and information related to the control channel.
S321, the device management module of the hardware virtualization service in the tablet sends a device connection success indication to the hardware virtualization API.
The device connection success indication may include, but is not limited to, a connection success identifier and connected device information.
S322, the hardware virtualization API in the tablet sends a device capability request to a device management module of the hardware virtualization service.
After receiving the device connection success indication, the hardware virtualization API sends a device capability request to a device management module of the hardware virtualization service. The device capability request may be used to request to obtain virtualization capability information of the peer device (i.e., the desk lamp). Exemplary virtualized device capability information includes, but is not limited to, whether a virtualized camera is supported, whether a virtualized microphone is supported, and the like.
S323, the device management module of the hardware virtualization service in the tablet sends a device capability request, over the control channel, to the camera component of the hardware abstraction service in the desk lamp.
S324, the camera component of the hardware abstraction service in the desk lamp returns the device capability information, over the control channel, to the device management module of the hardware virtualization service in the tablet.
In this embodiment, the returned device capability information of the desk lamp may include at least the capability of supporting the virtualized camera and the camera identifier of the desk lamp.
S325, the device management module of the hardware virtualization service in the tablet sends device capability information to the hardware virtualization API.
The device management module of the hardware virtualization service in the tablet sends the received device capability information to the hardware virtualization API so that the hardware virtualization API can know whether the desk lamp has the capability of supporting the virtualized camera.
S326, the hardware virtualization API in the tablet sends a virtual camera enabling request to the device management module of the hardware virtualization service.
After learning that the desk lamp supports a virtualized camera, the hardware virtualization API in the tablet sends a virtual camera enabling request to the device management module of the hardware virtualization service. The virtual camera enabling request may include, but is not limited to, the request type and the camera identifier of the desk lamp.
S327, the device management module of the hardware virtualization service in the tablet registers the virtual camera in the virtual camera HAL.
After receiving the virtual camera enabling request, the device management module of the hardware virtualization service sends a virtual camera registration request to the virtual camera HAL. The virtual camera registration request may include, but is not limited to, the request type and the camera identifier of the desk lamp. Upon receiving the registration request, the virtual camera HAL registers a virtual camera driver for the desk lamp's camera, assigns that camera a camera ID (i.e., the virtual camera ID), and registers the camera ID in the system. A mapping between the desk lamp camera and the virtual camera is thus established in the virtual camera HAL.
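The mapping step might be sketched as a small registry such as the following; this is an illustration under assumed names, not the actual HAL code:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Hedged sketch of the mapping step: a virtual camera ID is assigned for the
// lamp's camera and the association recorded for later lookups.
public final class VirtualCameraRegistry {
    private final AtomicInteger nextId = new AtomicInteger(100);  // assumed ID space for virtual cameras
    private final Map<Integer, String> virtualToRemote = new ConcurrentHashMap<>();

    /** Registers the remote camera and returns the virtual camera ID exposed to the system. */
    public int register(String remoteCameraId) {
        int cameraId = nextId.getAndIncrement();
        virtualToRemote.put(cameraId, remoteCameraId);
        return cameraId;
    }

    /** Resolves a virtual camera ID back to the lamp-side camera identifier. */
    public String resolve(int cameraId) {
        return virtualToRemote.get(cameraId);
    }
}

The resolve direction corresponds to what the virtual camera HAL does later, in the preview access phase, when it maps a virtual camera ID back to the desk lamp camera.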
S328, the device management module of the hardware virtualization service in the tablet sends a service state update indication to the camera component of the hardware abstraction service in the desk lamp.
The service state update indication instructs the camera component of the hardware abstraction service in the desk lamp to update its virtualized service state. The virtualized service state may include an occupied/unoccupied state, or a registered/unregistered state, among others. An exemplary service state update indication may include, but is not limited to, the device information of the peer device (i.e., the desk lamp), a hardware identifier (e.g., the desk lamp camera identifier), and the virtualized service state corresponding to that hardware identifier.
S329, the camera component of the hardware abstraction service in the desk lamp updates the service state according to the service state update instruction.
When the service state update indication marks the virtualized service state corresponding to the desk lamp camera as occupied (or registered), the camera component updates the virtualized service state corresponding to the desk lamp camera to occupied (or registered).
S330, the device management module of the hardware virtualization service in the tablet sends a virtual camera enabling success indication to the hardware virtualization API.
The virtual camera enabling success indication may include, but is not limited to, an enabling success identifier (or called a virtualization success identifier), a camera identifier of the desk lamp, and a camera ID corresponding to the virtual camera (or called a camera ID corresponding to the desk lamp camera).
The present embodiment does not limit the execution order of S328 and S330.
S331, the hardware virtualization API in the tablet sends a virtual camera enabling success indication to the education APP.
3. Virtual camera preview access phase
S3.1, the hardware virtualization API in the tablet sends a virtual camera access instruction to the camera service.
A virtual camera access instruction is an instruction for invoking the virtual camera. It may include, but is not limited to, the instruction type, the virtual camera ID, and camera configuration parameters, where the configuration parameters include, but are not limited to, the camera resolution and the acquisition frame rate.
S3.2, the camera service in the tablet sends an image preview request to the virtual camera HAL according to the virtual camera access instruction.
After receiving the virtual camera access instruction, the camera service generates a corresponding image preview request according to the virtual camera ID and sends the corresponding image preview request to the virtual camera HAL. Wherein the image preview request is for requesting a preview image data stream. Illustratively, the image preview request may include, but is not limited to, a request identification, a virtual camera ID, camera configuration parameters, and the like.
S3.3, the virtual camera HAL in the tablet sends an image preview request to the hardware virtualization service.
After receiving the image preview request, the virtual camera HAL determines the matching virtualized hardware identifier according to the virtual camera ID carried in the request. In this embodiment, the virtual camera HAL determines the linked desk lamp camera according to the virtual camera ID and the mapping between the virtual camera ID and the desk lamp camera, generates a corresponding image preview request with the determined virtualized hardware identifier, and sends it to the hardware virtualization service. By way of example, this image preview request may include, but is not limited to, the request identifier, device information (i.e., desk lamp information), virtualized hardware identifier (i.e., desk lamp camera identifier), camera configuration parameters, and the like.
S3.4, the hardware virtualization service in the tablet sends an image preview request to the transmission management service.
The image preview request may include, but is not limited to, the request identifier, device information (i.e., desk lamp information), virtualized hardware identifier (i.e., desk lamp camera identifier), and camera configuration parameters.
When the hardware virtualization service in the tablet sends the image preview request to the transmission management service, if it finds that no data channel has been established with the desk lamp, it generates a data channel establishment request and sends it to the transmission management service. The data channel establishment request instructs that a channel for transmitting data with the desk lamp be established, and may include, but is not limited to, the session identifier, connection information, data codec mode, and the like.
The transmission management service in the tablet receives the data channel establishment request and establishes a data channel connection with the desk lamp according to the information carried in the request, i.e., establishes a data channel between the tablet and the desk lamp. The transmission management service in the tablet and the device interconnection service in the desk lamp may then transmit various data, including but not limited to image data, over the data channel.
After the data channel is successfully established, the transmission management service in the tablet sends a data channel successful connection indication to the hardware virtualization service in the tablet, and the device interconnection service in the desk lamp sends a data channel successful connection indication to the camera component in the hardware abstraction service. The indication may include, but is not limited to, a connection success identifier and information related to the data channel.
S3.5, the transmission management service in the tablet transmits the image preview request to the device interconnection service of the desk lamp.
The transmission management service in the tablet determines the corresponding control channel according to the device information carried in the image preview request, and transmits the image preview request, over the control channel, to the device interconnection service of the desk lamp.
S3.6, the device interconnection service in the desk lamp sends an image preview request to the camera driver.
After receiving the image preview request, the device interconnection service in the desk lamp determines a hardware driver (in this embodiment, determines a camera driver) according to the virtualized hardware identifier, and sends the corresponding image preview request to the camera driver.
S3.7, the camera driver in the desk lamp drives the camera to collect images, and the preview image data is transmitted to the hardware virtualization service of the tablet through the data channel.
The camera driver starts the camera and drives it to collect images according to the camera configuration parameters carried in the image preview request, obtaining a preview image data stream. The stream is sent through the hardware abstraction service to the device interconnection service, which continuously transmits it over the data channel to the hardware virtualization service of the tablet. The packetization, encoding, and decoding of the preview image data stream are not described here.
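The continuous forwarding described here can be pictured as a simple pump loop; in the sketch below, FrameSource and DataChannel are illustrative placeholders for the camera driver output and the established data channel:

// Illustrative pump loop for the continuous preview forwarding.
public final class PreviewForwarder implements Runnable {
    public interface FrameSource {
        byte[] nextFrame() throws InterruptedException;  // blocks until a frame is available
    }

    public interface DataChannel {
        void send(byte[] frame);
    }

    private final FrameSource camera;
    private final DataChannel channel;
    private volatile boolean running = true;

    public PreviewForwarder(FrameSource camera, DataChannel channel) {
        this.camera = camera;
        this.channel = channel;
    }

    @Override
    public void run() {
        try {
            while (running) {
                channel.send(camera.nextFrame());  // stream frames until stopped
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();  // preview torn down
        }
    }

    public void stop() { running = false; }
}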
S3.8, the hardware virtualization service in the tablet sends the preview image data to the virtual camera HAL.
The hardware virtualization service continues to receive the preview image data stream and sends the preview image data stream to the virtual camera HAL.
S3.9, the virtual camera HAL in the tablet sends the preview image data to the camera service.
At this time, the virtual camera HAL continuously acquires preview image data acquired by the desk lamp camera, and continuously transmits the preview image data to the camera service.
And S3.10, the camera service in the tablet sends the preview image data to the education APP.
S3.11, displaying preview images by the education APP in the tablet.
After the education APP receives the preview image data stream through the camera service, the preview image can be displayed in the corresponding interface.
4. Virtual camera photographing stage
S4.1, in response to the received user operation, the education APP in the tablet sends a photographing request to the hardware virtualization service.
The user operation may be, for example, an operation of clicking a photographing option. In response to the received user operation, the education APP in the tablet sends a photographing request to the hardware virtualization service. The photographing request may include, but is not limited to, a photographing image sequence number, device information (i.e., table lamp information), a virtualized hardware identifier (i.e., table lamp camera identifier), a camera configuration parameter, and the like. Camera configuration parameters include, but are not limited to, image resolution.
The photographing request can also carry a task identifier so as to ensure orderly management of multiple photographing tasks.
S4.2, the hardware virtualization service in the tablet sends a photographing request to the transmission management service.
S4.3, the transmission management service in the tablet transmits a photographing request to the device interconnection service of the desk lamp.
The transmission management service in the tablet determines the corresponding control channel according to the device information carried in the photographing request, and transmits the photographing request, over the control channel, to the device interconnection service of the desk lamp.
S4.4, the device interconnection service in the desk lamp sends the photographing request to the camera driver.
After receiving the photographing request, the device interconnection service in the desk lamp determines the responding hardware driver (in this embodiment, the camera driver) according to the virtualized hardware identifier, and sends the corresponding photographing request to the camera driver.
S4.5, the camera driver in the desk lamp drives the camera to capture an image, and the photographed image data is transmitted to the hardware virtualization service of the tablet through the data channel.
The camera driver drives the camera to capture images according to the camera configuration parameters carried in the photographing request, obtaining the photographed image data. The data is sent through the hardware abstraction service to the device interconnection service, which continuously transmits it over the data channel to the hardware virtualization service of the tablet. The packetization, encoding, and decoding of the photographed image data are not described here.
S4.6, the hardware virtualization service in the tablet sends the photographed image data to the education APP.
S4.7, the education APP in the tablet displays the photographed image.
After the education APP receives the photographed image through the hardware virtualization service, it can display the image in the corresponding interface.
In this embodiment, the virtual camera preview access stage is implemented based on the native android camera framework, while the virtual camera photographing stage is implemented based on a private virtualized camera framework; the processing path involved in the photographing stage is therefore shorter, and the photographing delay is smaller. Meanwhile, because image preview is implemented based on the native android camera framework, the education APP requires few changes to adapt to the technical solution provided in this embodiment.
It should be noted that the above phase division of the flow is merely exemplary, and the embodiment of the present application is not limited thereto. In addition, after the preview image is displayed on the tablet in the virtual camera preview access stage, the real-time preview display and the virtual camera photographing stage can proceed simultaneously. Where the above processes are not explained in detail, reference may be made to the prior art, and they are not described further here.
Fig. 4a shows a communication architecture of the collaborative system, which is used to complete the management of the tablet to the virtual camera (i.e. the desk lamp camera), the control command interaction between the tablet and the desk lamp, and the return and processing of the image data.
It should be noted that, instructions, requests, etc. transmitted across devices (i.e. transmitted between the tablet and the desk lamp) need to be encapsulated based on a communication protocol and a parameter sequence, etc., which will not be described in detail in this embodiment. The hardware virtualization service in the tablet may also manage the life cycle of previewing image streams and capturing images by dynamically allocating memory and dynamically destroying memory.
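Such a lifecycle could, for instance, be realized with a fixed-depth buffer pool; the following is a sketch under invented sizes, not the actual memory management of the hardware virtualization service:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hedged sketch of the allocate/destroy lifecycle: preview frames borrow
// byte[] buffers from a fixed-depth pool and return them once consumed.
public final class FrameBufferPool {
    private final BlockingQueue<byte[]> free;
    private final int frameSize;

    public FrameBufferPool(int frames, int frameSize) {
        this.free = new ArrayBlockingQueue<>(frames);
        this.frameSize = frameSize;
        for (int i = 0; i < frames; i++) {
            free.offer(new byte[frameSize]);
        }
    }

    /** Blocks until a free buffer is available. */
    public byte[] acquire() throws InterruptedException {
        return free.take();
    }

    /** Returns a buffer to the pool; foreign-sized buffers are dropped. */
    public void release(byte[] buffer) {
        if (buffer.length == frameSize) {
            free.offer(buffer);
        }
    }
}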
In addition, it should be noted that, before executing the cooperative working method provided in this embodiment, the education APP binds with the desk lamp, and registers the desk lamp in the cloud server.
The embodiment of the application provides a framework scheme in which an android system device takes pictures using the camera of an external device. The scheme is not only applicable to educational scenarios; it also applies to other camera-equipped devices, which can share their camera capability with android system devices such as mobile phones and tablets, thereby realizing interconnection and intercommunication between the android system devices and these devices.
Figs. 5a-5b illustrate an application scenario. As shown in (1) of fig. 5a, the tablet displays the interface 401, on which a plurality of application icons are displayed, and the user clicks the education application icon 4011. In response to the received user operation, the tablet opens the education application and displays an education application interface, for example as shown in (2) of fig. 5a: the tablet displays the education application interface 402, on which various function options of the education application are shown, including but not limited to a word search function, a click-to-read function, a job function, a photographing function, and the like. To use the photographing function, the user clicks the photographing function option 4021, and in response to the user operation, the tablet executes the flows of the device discovery phase, the virtual camera service enabling phase, and the virtual camera preview access phase.
In the device discovery phase, if the device management service of the tablet filters out exactly one desk lamp that can be linked with the tablet, the tablet automatically executes the flows of the virtual camera service enabling phase and the virtual camera preview access phase and displays, for example, the interface shown in (1) of fig. 5b. If the device management service filters out multiple linkable desk lamps, the tablet displays a desk lamp selection interface, for example one showing a list of the candidate desk lamps; the user may perform a selection operation, and in response, the education application determines one desk lamp to be linked and continues with the virtual camera service enabling phase and the virtual camera preview access phase, displaying, for example, the interface shown in (1) of fig. 5b.
As shown in (1) of fig. 5b, an image preview window 4031 and a photographing option 4032 are displayed in the interface 403, and the preview image acquired by the desk lamp camera in real time is displayed in the image preview window 4031. If the user clicks the photographing option 4032, the tablet performs the flow of the virtual camera photographing stage in response to the user operation and displays, for example, the interface shown in (2) of fig. 5b. As shown in the interface 404 of (2) of fig. 5b, the image captured by the desk lamp camera is displayed in the image preview window 4041. If the user then clicks the confirm option 4041, the tablet saves the captured image in response to the user operation and resumes displaying, for example, the preview interface shown in (1) of fig. 5b. If the user clicks the cancel option 4042, the tablet may display, for example, the preview interface shown in (1) of fig. 5b in response to the user operation.
It should be noted that the interface shown in (2) of fig. 5b is merely an example: the image captured by the desk lamp camera may instead be displayed in another area of the interface while the image preview window 4041 continues to display the preview image acquired by the desk lamp camera in real time, which this application does not limit.
The cooperative working method provided by the embodiment mainly explains a low-cost technical scheme for realizing the online education function based on the combination of the tablet equipment and the desk lamp equipment. The technical solution provided in this embodiment is described below in connection with several different functions related to online education, respectively.
Scene one
Referring to the application scenario diagram shown in fig. 1a, the technical scheme is illustrated by taking the word search function as an example. When a student encounters an unrecognized word, the student can point a finger below the word; the desk lamp camera captures a picture, the tablet recognizes the image to determine the content of the word, and after the online word search is completed, the meaning of the word is fed back to the student through the display screen, for example by displaying and reading out the word's definition on the interface.
FIG. 6a is a schematic diagram showing the interaction of the modules. Referring to fig. 6a, the embodiment of the present application provides a method flow for cooperative work of a tablet and a desk lamp, specifically including:
S501, in response to the user clicking the word search function, the tablet and the desk lamp execute the flows of the device discovery phase, the virtual camera enabling phase, and the virtual camera preview access phase, and the tablet displays a preview interface.
For the flow of the device discovery phase, the virtual camera enabling phase, and the virtual camera preview access phase, reference may be made to the foregoing, and details thereof will not be repeated herein.
It should be noted that, in the virtual camera preview access phase, the hardware virtualization API in the tablet carries the camera configuration parameters when sending the virtual camera access instruction to the camera service. The camera configuration parameters may include, but are not limited to, the image resolution and the image acquisition frame rate. The desk lamp camera configures itself according to the received parameters and acquires preview image data at the corresponding image resolution and acquisition frame rate.
In this scenario, the tablet needs to accurately recognize the preview image to determine the text the user is pointing at, so this scene places high demands on preview image quality; for example, the image resolution may be set to 1080P. This helps ensure a higher word search success rate.
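For illustration, the scene-specific configuration might look like the following sketch, where 1920x1080 follows the 1080P figure above and the 30 fps frame rate is an assumption:

// Illustrative scene-specific camera configuration.
public record CameraConfig(int width, int height, int fps) {
    public static CameraConfig forWordSearch() {
        return new CameraConfig(1920, 1080, 30);  // high resolution for reliable text recognition
    }
}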
S502, the education APP in the tablet performs finger recognition on the preview image.
The education APP may perform finger recognition for each received preview image, or may perform finger recognition for the latest received preview image at regular time, which is not limited in this embodiment.
For example, the education APP may integrate an image recognition algorithm to implement an image recognition operation, and the education APP may also call an image recognition service to perform the image recognition operation, which is not limited in this embodiment.
The image recognition algorithm may refer to the prior art, and this embodiment is not described herein.
S503, in response to the user's finger-pointing operation, the education APP in the tablet recognizes the user's finger in the preview image.
When the user points a finger at a word in the book, the desk lamp camera acquires a preview image containing the pointing finger, and the education APP in the tablet can then recognize the user's finger in the preview image.
When the education APP carries out finger recognition on the preview image, if a finger is recognized, position information, such as coordinate information, of the finger in the preview image can be obtained.
S504, the education APP in the tablet determines an ROI (region of interest) image according to the position of the finger in the preview image.
After the education APP recognizes the user's finger in the preview image, the ROI image may be determined from the finger's position information. The education APP can determine the ROI information from the coordinates of the finger in the preview image; the ROI information includes, but is not limited to, the center-point coordinates and the region extent (such as width and height). The education APP then crops the ROI image out of the preview image based on the ROI information.
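A hedged sketch of this ROI derivation, using the android.graphics types, might look as follows; the placement of the ROI above the fingertip and all sizes are assumptions:

import android.graphics.Bitmap;
import android.graphics.Rect;

// Hedged sketch: build a region centered above the detected fingertip and
// crop it from the preview frame.
public final class RoiExtractor {

    /** ROI of size roiW x roiH centered above the fingertip, clamped to the frame. */
    public static Rect roiAboveFinger(int fingerX, int fingerY, int roiW, int roiH,
                                      int frameW, int frameH) {
        int centerX = fingerX;
        int centerY = fingerY - roiH / 2;  // the pointed word lies above the finger
        int left = clamp(centerX - roiW / 2, 0, frameW - roiW);
        int top = clamp(centerY - roiH / 2, 0, frameH - roiH);
        return new Rect(left, top, left + roiW, top + roiH);
    }

    /** Crops the ROI out of the preview frame. */
    public static Bitmap crop(Bitmap frame, Rect roi) {
        return Bitmap.createBitmap(frame, roi.left, roi.top, roi.width(), roi.height());
    }

    private static int clamp(int value, int low, int high) {
        return Math.max(low, Math.min(value, high));
    }
}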
S505, the education APP in the tablet accurately identifies the ROI image and determines the word to be interpreted.
The education APP may integrate an image recognition algorithm to accurately recognize the ROI image, and may call an image recognition service to accurately recognize the ROI image to determine the word to be interpreted, which is not limited in this embodiment.
For the image recognition algorithm, reference may be made to the prior art, which is not described in detail in this embodiment.
S506, the education APP in the tablet performs word searching operation on the word to be interpreted, and displays the paraphrasing of the word to be interpreted.
After determining the word to be interpreted, the education APP can perform online word searching operation or perform word searching operation in a database to obtain the paraphrasing of the word to be interpreted. Furthermore, the educational APP can display the paraphrasing of the new word to be interpreted for the user to view. In addition, the education APP may read the paraphrasing of the displayed new word, which is not limited in this embodiment.
Similarly, the user may perform the pointing operation using a word-pointing tool such as a stylus, which is not limited in this embodiment. Correspondingly, the education APP identifies the word-pointing tool in the preview image to determine whether the user has word searching intention, and determines the ROI image according to the position information of the word-pointing tool in the preview image.
Similarly, the user may also point to a picture in the book using a finger or a word-pointing tool such as a stylus. Correspondingly, the education APP determines the ROI image according to the position information of the finger or word-pointing tool, performs picture content recognition on the ROI image, displays the paraphrasing corresponding to the picture, and may read the displayed picture paraphrasing aloud. This case is not described in detail in this embodiment.
Fig. 1a and fig. 7a-7b show an exemplary application scenario. As shown in (1) in fig. 7a, the tablet displays the education APP interface 701, on which various function options of the education application are displayed, including but not limited to a word search function, a click-to-read function, a job function, a photo function, etc. The user clicks the word search function option 7011, and in response to the user operation, the tablet executes the flows of the device discovery phase, the virtual camera service enabling phase, and the virtual camera preview access phase.
In the device discovery stage, if the device management service of the tablet filters out only one desk lamp that can be linked with the tablet, the tablet automatically executes the flows of the virtual camera service enabling phase and the virtual camera preview access phase, and displays, for example, the interface shown in (2) in fig. 7a. If the device management service of the tablet filters out a plurality of desk lamps that can be linked with the tablet, the tablet displays a desk lamp selection interface. For example, a list of desk lamps to be linked is displayed on the desk lamp selection interface; the user may perform a selection operation, and in response to the user's selection operation, the education application determines one desk lamp to be linked and continues to execute the flows of the virtual camera service enabling phase and the virtual camera preview access phase, to display, for example, the interface shown in (2) in fig. 7a.
As shown in (2) in fig. 7a, an image preview window 7021 and a word searching function operation diagram 7022 are displayed in the interface 702. The preview window 7021 displays the preview image acquired by the desk lamp camera in real time. The user may perform a word or picture pointing operation with reference to the word searching function operation diagram 7022 to trigger the word searching function, and the education APP performs word or picture recognition according to the preview image. With continued reference to (2) in fig. 7a, when the user points a finger at a word in the book, the desk lamp camera captures a preview image of the pointing finger, which is displayed in the image preview window 7021. Further, the education APP can identify the user's finger in the preview image and determine the position information of the finger in the preview image, such as coordinate information. The education APP determines the ROI image according to the position of the finger in the preview image, and accurately identifies the ROI image to determine the word to be interpreted. After the education APP queries the paraphrasing of the new word, the corresponding paraphrasing is displayed on the interface, as shown in fig. 7b.
However, in the above procedure, in order to ensure the success rate of word searching, the desk lamp camera needs to continuously return a high-resolution (for example, 1080P) preview image stream, which requires a relatively high bandwidth of 4-8 Mbps. This raises the desk lamp's requirements on the hardware chip and thus increases the cost of the desk lamp.
In order to implement the word searching function through cooperation of the tablet and the desk lamp while reducing the hardware cost of the desk lamp, this embodiment further provides another technical scheme. In this scheme, because finger or word-pointing recognition in the image is a position-recognition task with low demands on image resolution, the desk lamp camera continuously returns a low-resolution (for example, 480P) preview image stream in the virtual camera preview access phase. If the education APP recognizes the user's pointing operation based on the preview image, it triggers the desk lamp to capture a high-resolution (for example, 1080P) image, so that the education APP can accurately recognize and determine the word or picture to be interpreted from the high-resolution image. In this way, the continuously returned low-resolution preview image stream requires only 0.5-1 Mbps of bandwidth, and more bandwidth is needed only when a high-resolution image is transmitted. This technical scheme therefore not only lowers the desk lamp's requirements on the hardware chip and reduces the cost of the desk lamp, but also ensures the success rate of word searching.
Fig. 6b shows an interaction diagram of the modules. Referring to fig. 6b, the embodiment of the present application provides a method flow for cooperative work of a tablet and a desk lamp, specifically including:
S601, in response to the user's operation of clicking the word searching function, the tablet and the desk lamp execute the flows of the device discovery phase, the virtual camera enabling phase, and the virtual camera preview access phase, and the tablet displays a preview interface.
For the flow of the device discovery phase, the virtual camera enabling phase, and the virtual camera preview access phase, reference may be made to the foregoing, and details thereof will not be repeated herein.
It should be noted that, in the virtual camera preview access phase, the hardware virtualization API in the tablet carries the camera configuration parameters when sending the virtual camera access instruction to the camera service. The camera configuration parameters may include, but are not limited to, the image resolution and the image acquisition frame rate. The desk lamp camera is configured according to the received configuration parameters and acquires preview image data at the corresponding image resolution and image acquisition frame rate.
In this scenario, in order to reduce the bandwidth occupied by the preview image stream, the desk lamp camera may collect the preview image at a low resolution in the virtual camera preview access phase. For example, the hardware virtualization API in the tablet may send a virtual camera access instruction to the camera service with a first configuration parameter, where the first configuration parameter includes a first image resolution (e.g., 480P). The desk lamp camera is configured according to the received configuration parameters and acquires preview image data at the first image resolution and the corresponding image acquisition frame rate.
S602, the education APP in the tablet performs finger recognition on the preview image.
S603, in response to the operation of using finger words by the user, the education APP in the tablet identifies the fingers of the user in the preview image.
S604, the education APP in the tablet determines ROI information according to the position of the finger in the preview image, and generates a photographing request according to the ROI information.
The ROI information refers to information for determining the ROI, and may include, but is not limited to, center point coordinates and region ranges (e.g., width-height information).
S605, the tablet transmits a photographing request to the desk lamp side.
The photographing request may include, but is not limited to, a virtual camera ID corresponding to the desk lamp camera, a second configuration parameter of the desk lamp camera, and ROI information. The second configuration parameters include, but are not limited to, a second image resolution that is higher than the first image resolution, such as the second image resolution set to 1080P. Therefore, the education APP can accurately identify and determine the word or picture to be interpreted according to the high-resolution image.
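A hedged sketch of what such a photographing request might look like, reusing the CameraConfig and RoiInfo classes from the earlier sketches; the field names are assumptions:

```java
/**
 * Hypothetical photographing request carrying the fields listed above.
 * Names are illustrative assumptions, not taken from this application.
 */
final class PhotoRequest {
    final String virtualCameraId;     // identifies the desk lamp camera
    final CameraConfig secondConfig;  // higher resolution than the preview, e.g. 1920x1080
    final RoiInfo roiInfo;            // may be omitted in the alternative embodiment below

    PhotoRequest(String virtualCameraId, CameraConfig secondConfig, RoiInfo roiInfo) {
        this.virtualCameraId = virtualCameraId;
        this.secondConfig = secondConfig;
        this.roiInfo = roiInfo;
    }
}
```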
S606, the camera of the desk lamp is set according to a second configuration parameter carried in the photographing request, images are photographed according to a second image resolution, and the photographed images are sent to the hardware abstraction service.
S607, the hardware abstraction service determines the ROI image according to the ROI information.
The hardware abstraction service may crop the ROI image in the captured image based on the ROI information.
In an alternative embodiment, the photographing request includes, but is not limited to, the second configuration parameter, but does not include the ROI information. In this case, the tablet transmits the photographing request to the desk lamp side, the desk lamp camera is set according to the second configuration parameter carried in the photographing request, captures an image at the second image resolution, and returns the captured image to the education APP in the tablet. The education APP may then determine the ROI image according to the ROI information, for example, crop the ROI image out of the captured image based on the ROI information.
S608, the desk lamp transmits the ROI image to the education APP in the tablet.
Compared with the desk lamp directly returning the high-resolution captured image to the education APP in the tablet, returning the cropped ROI image to the education APP in the tablet reduces the amount of data transmitted and the bandwidth occupied.
S609, the education APP in the tablet accurately identifies the ROI image and determines the word to be interpreted.
S610, the education APP in the tablet performs word searching operation on the word to be interpreted, and displays the paraphrasing of the word to be interpreted.
For parts of this process that are not explained in detail, reference may be made to the foregoing, which is not repeated here.
Similarly, the user may use a pointing tool such as a stylus to perform a pointing operation, which is not limited in this embodiment. Correspondingly, the education APP identifies the word pointing tool on the preview image to determine whether the user has word searching intention, and determines the ROI information according to the position information of the word pointing tool on the preview image.
Similarly, the user may also point to a picture in the book using a finger or a word-pointing tool such as a stylus. Correspondingly, the education APP determines the ROI information according to the position information of the finger or word-pointing tool, determines the ROI image in the captured image according to the ROI information, performs picture content recognition on the ROI image, displays the paraphrasing corresponding to the picture, and may read the displayed picture paraphrasing aloud. This case is not described in detail in this embodiment.
For the application scenario of this flow, reference may be made to the application scenario shown in fig. 1a and fig. 7a-7b. Referring to (2) in fig. 7a, when the user points a finger at a word in the book, the desk lamp camera captures a preview image of the pointing finger, which is displayed in the image preview window 7021. Further, the education APP can identify the user's finger in the preview image, determine the ROI information, and generate a photographing request according to the ROI information and the high image resolution to trigger the desk lamp camera to capture an image. The desk lamp camera captures an image at the high image resolution, the desk lamp side crops the high-resolution captured image according to the ROI information to obtain the ROI image, and returns the ROI image to the education APP in the tablet. The education APP accurately identifies the ROI image and determines the word to be interpreted. After the education APP queries the paraphrasing of the new word, the corresponding paraphrasing is displayed on the interface, as shown in fig. 7b.
Scene two
Referring to the application scenario schematic diagram shown in fig. 1a, the present scenario is illustrated by taking the job function as an example. When students need to submit homework online, a student can click the photographing option in the education APP, capture a job image with the desk lamp camera, and upload the job image to the database through the education APP.
FIG. 8 is a schematic diagram showing the interaction of the modules. Referring to fig. 8, the embodiment of the application provides a method flow for cooperative work of a tablet and a desk lamp, specifically including:
S801, in response to the user's operation of clicking the job function, the education APP in the tablet displays a job submission list.
Note that the job submission list refers to a list including a plurality of job submission options, one job submission option corresponding to each job; see interface 704 in fig. 9a. The job options may be divided by discipline or by time, which is not limited in this embodiment.
If the job function of the education APP only involves submitting one job image for one job, the education APP does not display the job submission list. In this case, in response to the user's operation of clicking the job function, the tablet and the desk lamp execute the flows of the device discovery phase, the virtual camera enabling phase, and the virtual camera preview access phase, and the tablet displays the preview interface.
S802, in response to the user clicking a job submission option, the tablet and the desk lamp execute the flows of the device discovery phase, the virtual camera enabling phase, and the virtual camera preview access phase, and the tablet displays a preview interface.
For the flow of the device discovery phase, the virtual camera enabling phase, and the virtual camera preview access phase, reference may be made to the foregoing, and details thereof will not be repeated herein.
It should be noted that, in the virtual camera preview access phase, the hardware virtualization API in the tablet carries the camera configuration parameters when sending the virtual camera access instruction to the camera service. The camera configuration parameters may include, but are not limited to, the image resolution and the image acquisition frame rate. The desk lamp camera is configured according to the received configuration parameters and acquires preview image data at the corresponding image resolution and image acquisition frame rate.
In this scenario, in order to reduce the bandwidth occupied by the preview image stream, the desk lamp camera may collect the preview image at a low resolution in the virtual camera preview access phase. For example, the hardware virtualization API in the tablet may send a virtual camera access instruction to the camera service with a first configuration parameter, where the first configuration parameter includes a first image resolution (e.g., 480P). The desk lamp camera is configured according to the received configuration parameters and acquires preview image data at the first image resolution and the corresponding image acquisition frame rate.
S803, in response to the user's operation of clicking the photographing option, the education APP in the tablet generates a photographing request.
When a user places a job or a book and the like in the acquisition area of the desk lamp camera, the user can click a photographing option to trigger the desk lamp camera to photograph a job image.
The photographing request may include, but is not limited to, the virtual camera ID corresponding to the desk lamp camera and the second configuration parameter of the desk lamp camera. The second configuration parameter includes, but is not limited to, a second image resolution that is higher than the first image resolution, such as a second image resolution set to 1080P. Thus, the education APP can upload high-resolution job images.
S804, the tablet transmits a photographing request to the desk lamp side.
S805, the camera of the desk lamp is set according to a second configuration parameter carried in the photographing request, and images are photographed according to a second image resolution.
S806, transmitting the shot image to the education APP in the flat plate by the table lamp.
S807, education APP in the tablet displays the photographed image.
The education APP in the tablet receives the job image captured by the desk lamp camera and displays it. If the user is satisfied with the captured job image, the user may click the submit option to upload the job image to the database. If the user is not satisfied with the captured job image, the user may click the photographing option again to trigger the desk lamp camera to re-capture the job image.
S808, in response to the user clicking the submit option, the education APP in the tablet uploads the shot image to the database.
For parts of this process that are not explained in detail, reference may be made to the foregoing, which is not repeated here.
It should be noted that the above-mentioned job images are merely examples; the user may click the photographing option to trigger the desk lamp camera to capture other images. After the desk lamp sends the captured image to the education APP in the tablet, the education APP can upload the received captured image to the corresponding database.
Fig. 1a and fig. 9a-9c show an exemplary application scenario. As shown in (1) in fig. 9a, the tablet displays the education APP interface 701, on which various function options of the education application are displayed, including but not limited to a word search function, a click-to-read function, a job function, a photo function, etc. The user clicks the job function option 7012, and in response to the user operation, the education APP in the tablet displays a job submission list, as shown in (2) in fig. 9a. In the job submission list interface 704 shown in (2) in fig. 9a, a plurality of job submission options (e.g., Submit Job 1, Submit Job 2, Submit Job 3, Submit Job 4, etc.) are displayed, with different job submission options corresponding to different jobs. Taking a user who needs to upload a job image for Submit Job 4 as an example, the user clicks the Submit Job 4 option 7042. In response to the user operation, the tablet may display a job submission interface 705 as shown in (1) in fig. 9b.
With continued reference to (1) in fig. 9b, an image preview window 7041, a photographing option 7051, and a submission option 7052 are displayed in the job submission interface 705. The preview window 7041 displays the preview image acquired by the desk lamp camera in real time. When the user places a job or a book or the like in the desk lamp camera acquisition area, the user can click the photographing option 7051 to trigger the desk lamp camera to capture a job image. In response to the user operation, the education APP generates a photographing request and sends it to the desk lamp side to call the desk lamp camera to capture the job image. The desk lamp camera captures the job image at the high image resolution carried in the photographing request and returns the captured job image to the education APP in the tablet for display; see interface 706 shown in (2) in fig. 9b.
With continued reference to (2) in fig. 9b, an image preview window 7041, a photographing option 7051, a submission option 7052, and a job image 7061 captured by the desk lamp camera are displayed in the interface 706. A close option 7062 is also displayed in the job image 7061. If the user is not satisfied with the job image 7061, the close option 7062 may be clicked, and the job image 7061 will no longer be displayed on the interface. At this time, the user may click the photographing option 7051 to trigger the desk lamp camera to re-capture the job image. If the user is satisfied with the job image 7061, the submission option 7052 may be clicked. In response to the user operation, the tablet may display a to-be-confirmed interface 707 as shown in fig. 9c, in which the job image 7061 to be submitted and a confirm-submission window 7071 are displayed. If the user clicks the confirmation option 7072 in the confirm-submission window 7071, the education APP uploads the job image 7061 to the database in response to the user operation. If the user clicks the cancel option 7073 in the confirm-submission window 7071, the education APP may, in response to the user operation, display the interface shown in (1) in fig. 9b, so that the user can click the photographing option 7051 again to trigger the desk lamp camera to re-capture the job image.
Note that, with continued reference to (2) in fig. 9b, if the user is satisfied with the job image 7061, the submission option 7052 may be clicked. In response to the user operation, the education APP may skip the interface shown in fig. 9c and directly upload the job image 7061 to the database. This embodiment is not limited thereto.
Scene three
Referring to the application scenario schematic diagram shown in fig. 1a, the present scenario is illustrated by taking the point-to-read function (or called the reading function) as an example. When students need the education APP to read the content of a book aloud, the desk lamp camera can collect book images in real time, so that the education APP can load the corresponding book content according to the book images, determine the book content to be read according to the student's finger position or page-turning operation, and read it aloud.
An interaction diagram of the modules is shown in fig. 10. Referring to fig. 10, the embodiment of the application provides a method flow for cooperative work of a tablet and a desk lamp, specifically including:
S901, in response to the user's operation of clicking the click-to-read function, the tablet and the desk lamp execute the flows of the device discovery phase, the virtual camera enabling phase, and the virtual camera preview access phase, and the tablet displays a preview interface.
For the flow of the device discovery phase, the virtual camera enabling phase, and the virtual camera preview access phase, reference may be made to the foregoing, and details thereof will not be repeated herein.
It should be noted that, in the virtual camera preview access phase, the hardware virtualization API in the tablet carries the camera configuration parameters when sending the virtual camera access instruction to the camera service. The camera configuration parameters may include, but are not limited to, the image resolution and the image acquisition frame rate. The desk lamp camera is configured according to the received configuration parameters and acquires preview image data at the corresponding image resolution and image acquisition frame rate.
In this scenario, in order to reduce the bandwidth occupied by the preview image stream, the desk lamp camera may collect the preview image at a low resolution in the virtual camera preview access phase. For example, the hardware virtualization API in the tablet may send a virtual camera access instruction to the camera service with a first configuration parameter, where the first configuration parameter includes a first image resolution (e.g., 480P). The desk lamp camera is configured according to the received configuration parameters and acquires preview image data at the first image resolution and the corresponding image acquisition frame rate.
S902, the education APP in the tablet identifies the preview image and determines the book name.
The education APP may perform book information identification on each received preview image, or may perform book information identification on the latest received preview image at regular intervals, which is not limited in this embodiment.
For example, the education APP may integrate an image recognition algorithm to implement an image recognition operation, and the education APP may also call an image recognition service to perform the image recognition operation, which is not limited in this embodiment.
For the image recognition algorithm, reference may be made to the prior art, which is not described in detail in this embodiment.
S903, the education APP in the tablet retrieves the database according to the book name and loads the book content corresponding to the book name.
If the education APP in the tablet retrieves the database according to the book name and finds books of different versions, it may display a corresponding book list for the user to select from. Further, in response to the user's selection of a certain version of the book, the education APP loads the content corresponding to that version of the book.
S904, in response to a page-turning or finger click operation by the user, the education APP in the tablet identifies the preview image, determines the paragraph to be read, and reads the corresponding paragraph aloud.
After the book content is loaded, the education APP may perform finger or stylus pointing-tool recognition on each received preview image, or may periodically perform such recognition on the latest received preview image, which is not limited in this embodiment. In response to the user's click operation, the education APP can identify the book page number and the user's click position information, such as coordinate information, according to the preview image, and can then determine the paragraph to be read in the loaded book content according to the book page number and the click position information, and read the corresponding paragraph aloud.
After the book content is loaded, the education APP can also perform page turning identification according to the preview image stream. In response to a page turning operation of a user, the education APP can determine paragraphs to be read in the loaded book content according to the identified book page numbers, and read corresponding paragraphs.
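Purely as an illustration of how a recognized page number and click position might be mapped to a paragraph of the loaded book content (all types and names below are hypothetical), a JAVA sketch:

```java
import android.graphics.Rect;
import java.util.List;
import java.util.Map;

/** Hypothetical paragraph model: its position on the page image and the text to read aloud. */
final class Paragraph {
    final Rect bounds;
    final String text;

    Paragraph(Rect bounds, String text) {
        this.bounds = bounds;
        this.text = text;
    }
}

final class ReadAloudResolver {
    /** Loaded book content: page number mapped to the paragraphs on that page. */
    private final Map<Integer, List<Paragraph>> loadedBook;

    ReadAloudResolver(Map<Integer, List<Paragraph>> loadedBook) {
        this.loadedBook = loadedBook;
    }

    /** Picks the paragraph whose bounding box contains the recognized click position. */
    Paragraph resolve(int pageNo, int clickX, int clickY) {
        for (Paragraph p : loadedBook.getOrDefault(pageNo, List.of())) {
            if (p.bounds.contains(clickX, clickY)) {
                return p; // paragraph under the user's finger or stylus
            }
        }
        return null; // the click did not land on any paragraph
    }
}
```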
For parts of this process that are not explained in detail, reference may be made to the foregoing, which is not repeated here.
Fig. 1a and fig. 11a-11b show an exemplary application scenario. As shown in (1) in fig. 11a, the tablet displays the education APP interface 701, on which various function options of the education application are displayed, including but not limited to a word search function, a click-to-read function, a job function, a photo function, etc. The user clicks the click-to-read function option 7013, and in response to the user operation, the tablet executes the flows of the device discovery phase, the virtual camera service enabling phase, and the virtual camera preview access phase.
In the device discovery stage, if the device management service of the tablet filters out only one desk lamp that can be linked with the tablet, the tablet automatically executes the flows of the virtual camera service enabling phase and the virtual camera preview access phase, and displays, for example, the interface shown in (2) in fig. 11a. If the device management service of the tablet filters out a plurality of desk lamps that can be linked with the tablet, the tablet displays a desk lamp selection interface. For example, a list of desk lamps to be linked is displayed on the desk lamp selection interface; the user may perform a selection operation, and in response to the user's selection operation, the education application determines one desk lamp to be linked and continues to execute the flows of the virtual camera service enabling phase and the virtual camera preview access phase, to display, for example, the interface shown in (2) in fig. 11a.
As shown in fig. 11a (2), an image preview window 7081 is displayed in the interface 708. The preview window 7081 displays a preview image acquired by the desk lamp camera in real time. The educational APP identifies the preview image to determine the book name. After identifying the book name, the educational APP retrieves the database based on the book name. If a corresponding book is retrieved, a loading operation of the book contents is performed, as shown in (1) of fig. 11 b.
With continued reference to fig. 11b (1), an image preview window 7081, an identified book name 7091, and a book content loading progress identifier 7092 are displayed in the interface 709. After the book content is loaded, the education APP can identify the click-to-read operation or page turning operation of the user according to the preview image. Taking the example of a user performing a page turning operation, referring to the interface 710 as shown in (2) in fig. 11b, a page turning action of the user may be displayed in the image preview window 7081. In response to a user operation, the education APP recognizes the page number of the book according to the preview image. Furthermore, the education APP can determine the paragraphs to be read in the loaded book content according to the identified book page numbers, and read the corresponding paragraphs.
In the collaborative work method provided by the embodiment of the application, a home tablet is combined with a camera-equipped desk lamp to provide a professional online education experience. The camera of the desk lamp device cooperates with the education APP in the tablet to complete scenarios in which students need photographing, such as finger word searching, job submission, and book reading.
In the photographing application scenario, after the transmission channel module of the hardware virtualization service in the tablet receives the captured image collected by the desk lamp camera, the captured image is transmitted to the education APP through the device management module of the hardware virtualization service.
With continued reference to fig. 12, the education APP and the hardware virtualization service in the tablet are implemented based on different processes. For example, the education APP runs in a first process (which may be referred to as a three-party application layer) and the hardware virtualization service runs in a second process. The second process may be further divided into a JAVA-based device management layer and a C++-based native transmission channel layer; that is, the above-mentioned device management module of the hardware virtualization service runs in the JAVA-based device management layer, and the transmission channel module of the hardware virtualization service runs in the C++-based native transmission channel layer. The first process and the second process communicate through AIDL (Android Interface Definition Language), and within the second process the device management module and the transmission channel module communicate through JNI (Java Native Interface). The device management layer bridges the three-party application layer and the native transmission channel layer.
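For orientation only, a sketch of what the JNI boundary between the two layers of the second process might look like on the JAVA side; the native library name and the method signatures are assumptions for illustration:

```java
/**
 * Illustrative JNI bridge between the JAVA-based device management layer and the
 * C++-based native transmission channel layer. The library name and the method
 * signatures are assumptions, not taken from this application.
 */
final class TransmissionChannelBridge {
    static {
        System.loadLibrary("transmission_channel"); // assumed native library name
    }

    /** Forwards a photographing request to the native transmission channel layer (C++ side). */
    static native void sendPhotoRequest(String taskId, byte[] requestPayload);

    /** Invoked from the C++ side via JNI when captured data arrives from the desk lamp. */
    static void onCapturedData(String taskId, int frameSize) {
        // The device management layer would create the shared memory here (see the flow below).
    }
}
```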
In the photographing application scenario, the user requests captured image data at the three-party application layer; the photographing request is transmitted through the device management layer to the native transmission channel layer, and the native transmission channel layer interacts with the desk lamp side. After receiving the data from the desk lamp side, the native transmission channel layer transmits it to the device management layer, and the device management layer transmits the image data to the three-party application layer process for display to the user.
In this way, when the captured image data (or image file) collected by the desk lamp camera is transmitted to the education APP, it crosses three tiers, namely the native transmission channel layer, the device management layer, and the three-party application layer. Transferring the data from the native transmission channel layer to the device management layer, or from the device management layer to the three-party application layer, cannot be done without copying.
Because processes in the Android system cannot directly share memory, some mechanism must be provided for data communication between different processes. In the conventional communication mechanism, cross-process data transmission requires 4 copy operations; transferring data between the native transmission channel layer and the device management layer, which communicate through JNI, requires 1 more copy operation; and transferring data from the desk lamp side to the native transmission channel layer requires another copy operation. Transferring data to the three-party application layer through the transmission channel layer and the device management layer therefore requires 6 copy operations in total.
Referring to fig. 13, the 6 data copy operations involved in transferring data (or a file) to the three-party application layer through the transmission channel layer and the device management layer include:
1. the transmission channel module of the hardware virtualization service copies the received photographed image data into a transmission channel buffer (buffer).
2. The transmission channel module of the hardware virtualization service copies the photographed image data from the transmission channel buffer memory to the device management buffer memory.
The above two copy operations are performed in the user space of the second process.
3. The hardware virtualization service copies the captured image data from the device management cache into the kernel space of the second process.
4. The hardware virtualization service copies the captured image data from the kernel space of the second process into the memory.
5. The education APP copies the shot image data from the memory to the kernel space of the first process.
6. The educational APP copies the captured image data from the kernel space of the first process into the user space of the first process.
Thus, the educational APP can display the photographed image data.
In the tablet and desk lamp collaborative scenario provided by the embodiment of the application, the resolution of the captured image is generally high (such as 1080P), so the corresponding captured-image file is large. When a large image file needs to be transferred across tiers, the more times the data is copied, the lower the cross-tier transfer rate of the large image file and the longer the time consumed. In the conventional scheme, transferring the large image file to the three-party application layer through the transmission channel layer and the device management layer requires 6 copy operations, which inevitably reduces the transmission speed of the large image file and prolongs the time consumed. Moreover, the captured image data is a common resource operated on by multiple processes, which easily leads to memory errors and makes memory management more complex.
Therefore, how to increase the transmission rate of large image files across the hierarchy and reduce the communication time consumption is a technical problem to be solved.
The embodiment of the application also provides a data transmission method, which improves the transmission rate of large image files across the layers by establishing a shared memory mechanism, reduces the communication time consumption and solves the problems of excessive copying times and long time consumption of the traditional scheme.
In the data transmission method provided by the embodiment of the application, transferring the large image file to the three-party application layer through the transmission channel layer and the device management layer requires only two copy operations, which increases the cross-tier transmission rate of the large image file and reduces the communication time consumed. As shown in fig. 14, the two data copy operations involved in transferring data to the three-party application layer via the transmission channel layer and the device management layer include:
1. the transmission channel module of the hardware virtualization service copies the received shot image data to the shared memory.
2. The educational APP copies the captured image data from the shared memory into the user space of the first process.
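A minimal sketch of these two copies using Android's android.os.SharedMemory (available since API level 27); this illustrates the mechanism rather than the application's actual implementation. Note that mapping the shared memory is an address-space operation, not an extra data copy:

```java
import android.os.SharedMemory;
import android.system.ErrnoException;
import java.nio.ByteBuffer;

final class TwoCopyDemo {
    /** Copy 1: the transmission channel side writes the received image bytes into shared memory. */
    static SharedMemory fill(String taskId, byte[] imageData) throws ErrnoException {
        SharedMemory shm = SharedMemory.create(taskId, imageData.length);
        ByteBuffer mapped = shm.mapReadWrite();
        mapped.put(imageData);             // the only write-side data copy
        SharedMemory.unmap(mapped);
        return shm;
    }

    /** Copy 2: the application side reads the bytes back into its own user space. */
    static byte[] read(SharedMemory shm, int length) throws ErrnoException {
        ByteBuffer mapped = shm.mapReadOnly();
        byte[] out = new byte[length];
        mapped.get(out);                   // the only read-side data copy
        SharedMemory.unmap(mapped);
        return out;
    }
}
```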
FIG. 15 is a schematic diagram showing the interaction of the modules. Referring to fig. 15, a flow of a data transmission method provided in an embodiment of the present application specifically includes:
S1001, the flat education APP sends a photographing request to a device management module in the hardware virtualization service.
Illustratively, in the submit job scenario, in response to a user clicking on a photographing option, the tablet education APP sends a photographing request to a device management module in the hardware virtualization service.
Illustratively, in the word searching scene, in response to the operation of using finger words by the user, the education APP in the tablet identifies the finger of the user in the preview image, determines ROI information according to the position of the finger in the preview image, generates a photographing request and sends the photographing request to a device management module in the hardware virtualization service.
In this embodiment, the photographing request may include, but is not limited to, a task identifier (such as PictureNo), a callback function (callback), and photographing parameters (camera). The photographing parameters include, but are not limited to, the photographing mode, such as a continuous photographing mode and a normal photographing mode.
Since each photographing request involves different image data, a task identification may be used as a unique identification of the photographing request.
S1002, a device management module of the hardware virtualization service in the tablet correspondingly stores a task identifier and a callback function carried in a photographing request into a callback buffer area, and sends the photographing request to a transmission channel module of the hardware virtualization service.
After receiving the photographing request, the device management module parses the task identifier and the callback function written in the photographing request, and stores the callback function in the callback buffer with the task identifier as its key. The callback function information stored in the callback buffer may exist in the form of key-value pairs, where the key is the task identifier and the value is the callback function.
Because the photographing request and the transmission of the captured image data are asynchronous processes involving the transmission channel layer, the device management layer, and the three-party application layer on the tablet side, after the device management module receives the captured data from the desk lamp, it can query the matching callback function in the callback buffer through the task identifier and transfer the captured image data to the three-party application layer through the callback function.
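A minimal JAVA sketch of such a callback buffer; the callback signature is a hypothetical stand-in, since this application only states that a callback function is carried and later looked up by the task identifier:

```java
import java.util.concurrent.ConcurrentHashMap;

/** Hypothetical callback signature; the real parameters are not specified here. */
interface PictureCallback {
    void onPicture(String sharedMemoryAddress);
}

/** Callback buffer: the key is the task identifier (PictureNo), the value is the callback. */
final class CallbackBuffer {
    private final ConcurrentHashMap<String, PictureCallback> buffer = new ConcurrentHashMap<>();

    void store(String pictureNo, PictureCallback callback) {
        buffer.put(pictureNo, callback);
    }

    /** Looked up when captured data arrives; removing the entry keeps the buffer bounded. */
    PictureCallback take(String pictureNo) {
        return buffer.remove(pictureNo);
    }
}
```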
S1003, a transmission channel module of the hardware virtualization service in the tablet transmits a photographing request to the desk lamp side in the control channel.
S1004, the desk lamp calls the camera to shoot an image according to the shooting request, and returns shooting image data to a transmission channel module of the hardware virtualization service in the tablet in the data channel.
Wherein, the shot image data can include but is not limited to task identification, image frame size, shooting parameters and image data.
Illustratively, the captured image data may be transmitted back to the transmission channel module of the hardware virtualization service in the tablet in the form of a data packet, where the data packet includes, but is not limited to, a task identifier, an image frame size, a capturing parameter, and image data.
It should be noted that, for the captured image data fed back for a certain photographing request, the task identifier and the photographing parameters carried in the corresponding data message are the same as those carried in the photographing request, so that the captured image data returned by the desk lamp can be accurately matched to the corresponding photographing request sent by the education APP.
S1005, a transmission channel module of the hardware virtualization service in the tablet acquires a task identifier, an image frame size and shooting parameters corresponding to the shot image data, and sends the task identifier, the image frame size and the shooting parameters to the equipment management module.
S1006, the device management module of the hardware virtualization service in the tablet creates a shared memory according to the image frame size and the task identifier, and stores the task identifier and the shared memory address into a shared memory pool corresponding to the shooting parameters.
In this embodiment, a section of shared memory (whose address may be identified by ShareM) is created for each photographing request for inter-process communication, where the size of the shared memory is determined by the image frame size, and its identifier may be determined by the unique identifier (i.e., the task identifier) of the photographing request transferred from the three-party application layer.
After receiving the captured image collected by the desk lamp side, the transmission channel module transmits the image frame size (frame size) and the task identifier (PictureNo) of the captured image to the device management module through the JNI. The device management module creates a shared memory with PictureNo as its identifier and frame size as its size, and stores the shared memory address in the shared memory pool, where PictureNo serves as the unique identifier of the shared memory address.
Because the hardware space of the tablet is limited, shared memory cannot be created without bound. A shared memory pool management module may therefore be provided in the JAVA-based device management layer to manage the shared memory pool, so as to ensure that the three-party application can obtain image data in time and to release unneeded shared memory in real time according to the request volume.
The shared memory pool management module may be integrated in the device management module, or may be an independent module in the device management layer, which is not limited in this application.
For example, the device management module may invoke the shared memory pool management module to complete the operation of storing the shared memory address in the shared memory pool.
As an optional implementation manner, after the device management module creates the shared memory according to the image frame size and the task identifier, the task identifier and the shared memory address are sent to the shared memory pool management module, and the shared memory pool management module stores the task identifier and the shared memory address in the shared memory pool correspondingly.
Illustratively, in the shared memory pool, the task identifier and the shared memory address exist in the form of a key value pair, the key value is the task identifier, and the value is the shared memory address.
For example, the shared memory pool may be created during the virtual camera enabling phase; for example, the device management module in the hardware virtualization service creates the shared memory pool when receiving a virtual camera enabling request sent by the education APP, or creates the shared memory pool when confirming that the virtual camera is successfully enabled.
Also illustratively, the shared memory pool may be created by the device management module in the hardware virtualization service during the virtual camera preview access phase.
As another example, the shared memory pool may also be created by a shared memory pool management module in the device management layer during the virtual camera enabled phase or the virtual camera preview access phase.
This embodiment does not specifically limit the creation timing of the shared memory pool; the shared memory pool only needs to be created before the user first initiates a photographing request or before the education APP first initiates a photographing request automatically. The creation subject of the shared memory pool is likewise not specifically limited; a related module in the JAVA-based device management layer may complete the creation operation of the shared memory pool.
The following explanation will be given taking the creation and management of the shared memory pool by the shared memory pool management module as an example.
In order to avoid unbounded expansion of the shared memory, the shared memory pool management module may define a shared memory maintenance quantity threshold for the shared memory pool when creating it. For example, assuming the threshold maintained by the shared memory pool is n+1, when n+1 shared memory addresses are already stored in the pool, each time a new task identifier and shared memory address are correspondingly stored into the pool, the task identifier and shared memory address stored earliest in the pool are moved out of the pool, and the corresponding shared memory is released.
As shown in fig. 16, it is assumed that the threshold value of the number of shared memory maintenance for the shared memory pool is n+1. After the device management module newly creates a section of shared memory according to the image frame size and the task identifier, and sends the shared memory information (PictureNo x, shareM x) to the shared memory pool management module, the shared memory pool management module performs an operation of storing (PictureNo x, shareM x) into the shared memory pool.
The shared memory pool management module first determines the amount of shared memory information currently stored in the shared memory pool after receiving the shared memory information, and if the amount of shared memory information currently stored does not reach the shared memory maintenance amount threshold, the shared memory pool management module directly stores the shared memory information (PictureNo x, shareM x) into the shared memory pool.
When the number of currently stored shared memory information reaches the shared memory maintenance quantity threshold, as shown in fig. 16, n+1 pieces of shared memory address information (e.g., (PictureNo 0, shareM 0) to (PictureNo n, shareM n)) are stored in the shared memory pool. The shared memory pool management module first moves the piece of shared memory address information stored earliest among the n+1 pieces out of the shared memory pool, and releases the shared memory corresponding to the shared memory address in that information. For example, when the shared memory information (PictureNo x, shareM x) is stored in the shared memory pool, the shared memory information (PictureNo 0, shareM 0) is moved out of the shared memory pool, and the shared memory corresponding to ShareM0 is released.
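A compact JAVA sketch of such a capped pool, assuming android.os.SharedMemory as the stored handle; a LinkedHashMap in insertion order makes the eldest entry the earliest-stored one, and the evicted shared memory is released via close():

```java
import android.os.SharedMemory;
import java.util.LinkedHashMap;
import java.util.Map;

/** Sketch of a shared memory pool capped at a maintenance quantity threshold. */
final class SharedMemoryPool {
    private final LinkedHashMap<String, SharedMemory> pool;

    SharedMemoryPool(int maxEntries) {
        // Insertion-ordered map: the eldest entry is the one stored earliest.
        pool = new LinkedHashMap<String, SharedMemory>(16, 0.75f, false) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, SharedMemory> eldest) {
                if (size() > maxEntries) {
                    eldest.getValue().close(); // release the evicted shared memory
                    return true;
                }
                return false;
            }
        };
    }

    synchronized void put(String pictureNo, SharedMemory shm) {
        pool.put(pictureNo, shm);
    }

    synchronized SharedMemory get(String pictureNo) {
        return pool.get(pictureNo);
    }
}
```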
As an alternative implementation, in order to be compatible with different photographing modes, for example, the continuous photographing mode and the normal photographing mode, the shared memory pool management module may create a shared memory pool for each photographing mode, where different shared memory pools have different shared memory maintenance quantity thresholds. For example, the threshold of the shared memory pool corresponding to the continuous photographing mode is n1, and the threshold of the shared memory pool corresponding to the normal photographing mode is n2, where n1 is greater than n2.
Illustratively, after the device management module creates the shared memory according to the image frame size and the task identifier, it sends the task identifier, the shared memory address, and the shooting parameters (including but not limited to the shooting mode) to the shared memory pool management module, and the shared memory pool management module stores the task identifier and the shared memory address in the shared memory pool matching the shooting mode.
Illustratively, in the present embodiment, n1 may be set to 15 and n2 may be set to 5.
As shown in fig. 17, assume that the first shared memory pool is the shared memory pool corresponding to the normal photographing mode, maintaining 5 shared memories, and the second shared memory pool is the shared memory pool corresponding to the continuous photographing mode, maintaining 15 shared memories. After the device management module newly creates a section of shared memory according to the image frame size and the task identifier, and sends the shared memory information (PictureNo x, shareM x) and the shooting parameter information to the shared memory pool management module, the shared memory pool management module stores (PictureNo x, shareM x) into the matched shared memory pool according to the shooting parameter.
When the photographing parameter indicates the normal photographing mode, the shared memory pool management module performs an operation of storing (PictureNo x, shareM x) into the first shared memory pool. For example, the shared memory pool management module first determines the amount of shared memory information currently stored in the first shared memory pool, and if the amount of shared memory information currently stored in the first shared memory pool does not reach the shared memory maintenance amount threshold 5, the shared memory pool management module directly stores the shared memory information (PictureNo x, shareM x) into the first shared memory pool. If the number of the pieces of the shared memory information currently stored in the first shared memory pool reaches the threshold value 5 of the number of shared memory maintenance, as shown in fig. 17, the shared memory pool management module first moves out of the first shared memory pool one piece of shared memory address information stored in the first shared memory pool earliest among the 5 pieces of shared memory address information, and releases the shared memory corresponding to the shared memory address in the shared memory address information. For example, when the shared memory information (PictureNo x, shareM x) is stored in the first shared memory pool, the shared memory information (PictureNo 0, shareM 0) is moved out of the first shared memory pool, and the shared memory corresponding to ShareM 0 is released.
When the photographing parameter indicates the continuous photographing mode, the shared memory pool management module performs an operation of storing (PictureNo x, shareM x) into the second shared memory pool. The shared memory pool management module first determines the amount of shared memory information currently stored in the second shared memory pool, and if it does not reach the shared memory maintenance quantity threshold 15, the shared memory pool management module directly stores the shared memory information (PictureNo x, shareM x) into the second shared memory pool. If the number of pieces of shared memory information currently stored in the second shared memory pool reaches the threshold 15, as shown in fig. 17, the shared memory pool management module first moves the piece of shared memory address information stored earliest among the 15 pieces out of the second shared memory pool, and releases the shared memory corresponding to the shared memory address in that information. For example, when the shared memory information (PictureNo x, shareM x) is stored in the second shared memory pool, the shared memory information (PictureNo 0, shareM 0) is moved out of the second shared memory pool, and the shared memory corresponding to ShareM 0 is released.
In this way, the shared memory pool management module is able to maintain the size and lifecycle of the shared memory.
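Building on the SharedMemoryPool sketch above, a hedged illustration of routing entries to the pool matching the photographing mode, with the example thresholds n2 = 5 and n1 = 15:

```java
/** Photographing modes named in this application. */
enum PhotoMode { NORMAL, CONTINUOUS }

/** Routes each (PictureNo, shared memory) pair to the pool matching the photographing mode. */
final class PoolRouter {
    private final SharedMemoryPool normalPool = new SharedMemoryPool(5);      // n2 = 5
    private final SharedMemoryPool continuousPool = new SharedMemoryPool(15); // n1 = 15

    void store(PhotoMode mode, String pictureNo, android.os.SharedMemory shm) {
        SharedMemoryPool target = (mode == PhotoMode.CONTINUOUS) ? continuousPool : normalPool;
        target.put(pictureNo, shm);
    }
}
```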
S1007, the device management module of the hardware virtualization service in the tablet returns the shared memory address corresponding to the task identifier to the transmission channel module through a file descriptor.
The device management module creates a shared memory with PictureNo as an identifier and with a frame size as a size, and returns a file descriptor to the transmission channel module, and meanwhile, the shared memory address is stored in a shared memory pool and is identified by PictureNo as a unique identifier.
S1008, the transmission channel module of the hardware virtualization service in the tablet fills the shot image data according to the shared memory address.
The transmission channel module obtains the shared memory address through the file descriptor, and fills the shot image data according to the shared memory address.
S1009, the transmission channel module of the hardware virtualization service in the tablet returns the task identifier, the shooting parameters and the shared memory filling result to the device management module.
The shared memory filling result may include, but is not limited to, an identification of whether the filling was successful.
S1010, when the filling result of the shared memory indicates that the filling is successful, the equipment management module of the hardware virtualization service in the tablet searches the shared memory address corresponding to the task identifier in the matched shared memory pool according to the task identifier and the shooting parameter.
When the number of the shared memory pools is 1, the device management module can query the shared memory addresses corresponding to the task identifiers in the shared memory pools according to the task identifiers.
When there are multiple shared memory pools associated with shooting parameters, the device management module can query the pool matching the shooting parameter for the shared memory address corresponding to the task identifier. For example, when the photographing parameter indicates the normal photographing mode, the device management module queries the first shared memory pool according to the task identifier; when the photographing parameter indicates the continuous photographing mode, it queries the second shared memory pool.
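The pool selection itself is a simple dispatch on the shooting mode. In the sketch below, ShootingMode and PoolSelector are assumed names introduced for illustration, reusing the SharedMemoryPool sketch from above.

```java
import android.os.SharedMemory;

// Illustrative lookup for S1010: choose the pool by shooting mode, then
// query it by task identifier.
class PoolSelector {
    enum ShootingMode { NORMAL, CONTINUOUS }

    static SharedMemory lookup(ShootingMode mode, String pictureNo,
            SharedMemoryPool normalPool, SharedMemoryPool continuousPool) {
        SharedMemoryPool pool =
                (mode == ShootingMode.NORMAL) ? normalPool : continuousPool;
        return pool.get(pictureNo);
    }
}
```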
S1011, when the filling result of the shared memory indicates that the filling is successful, the device management module of the hardware virtualization service in the tablet queries a callback function corresponding to the task identifier in the callback buffer according to the task identifier.
The present embodiment does not limit the execution order of S1010 and S1011.
S1012, the device management module of the hardware virtualization service in the tablet returns a shared memory address to the education APP through a callback function.
The device management module acquires the shared memory address from the matched shared memory pool using PictureNo as the key, acquires the callback function from the callback buffer using PictureNo as the key, and then returns the shared memory address to the education APP through the callback function.
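The callback buffer can be modeled as a concurrent map keyed by PictureNo. Both the PictureCallback interface and the CallbackBuffer class below are assumptions standing in for whatever callback type the education APP actually registers.

```java
import android.os.SharedMemory;
import java.util.concurrent.ConcurrentHashMap;

// Assumed callback type registered by the APP when it issues a shooting request.
interface PictureCallback {
    void onPictureReady(SharedMemory region);
}

// Illustrative callback buffer used in S1011-S1012.
class CallbackBuffer {
    private final ConcurrentHashMap<String, PictureCallback> callbacks =
            new ConcurrentHashMap<>();

    void register(String pictureNo, PictureCallback cb) {
        callbacks.put(pictureNo, cb);
    }

    // Remove and invoke the callback for this task, if one is still registered.
    void dispatch(String pictureNo, SharedMemory region) {
        PictureCallback cb = callbacks.remove(pictureNo);
        if (cb != null) {
            cb.onPictureReady(region);
        }
    }
}
```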
S1013, the education APP in the tablet reads the shot image data according to the shared memory address, and copies the shot image data to the user space of the process where the education APP is located.
At this point, in response to the shooting request initiated by the education APP, the shot image data collected by the camera has been returned to the education APP by the desk lamp side. The education APP can then display the corresponding shot image to the user, or perform image recognition based on the shot image to determine, for example, the text or picture to be interpreted.
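On the APP side, step S1013 can be sketched as mapping the region read-only and copying the bytes into the APP's own heap before decoding. The assumption that the frame is stored as a compressed image is ours; the patent does not fix a format.

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.os.SharedMemory;
import android.system.ErrnoException;
import java.nio.ByteBuffer;

// Illustrative S1013: second copy, shared memory -> the APP's user space.
class EducationAppReader {
    static Bitmap readCapturedImage(SharedMemory region) throws ErrnoException {
        ByteBuffer buf = region.mapReadOnly();
        byte[] copy = new byte[buf.remaining()];
        buf.get(copy);                        // copy into this process's heap
        SharedMemory.unmap(buf);
        return BitmapFactory.decodeByteArray(copy, 0, copy.length);
    }
}
```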
Creating the shared memory, filling data into it, and acquiring data from it are all asynchronous cross-process operations. Maintaining the shared memory pool and managing the lifecycle of the shared memory therefore both ensures that the third-party application acquires the data accurately and prevents the shared memory from growing without bound.
Steps of this process that are not explained in detail follow the foregoing description and are not repeated here.
In this way, with the data transmission method provided by this embodiment, transmitting a file across processes requires only two data copies: from the user space of the second process into the shared memory, and from the shared memory into the user space of the first process. This improves the speed of cross-process file transmission and reduces the time consumed by communication.
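Chaining the illustrative helpers from the sketches above gives a concrete picture of this two-copy path on the tablet side; every identifier here is still an assumption rather than the patent's API.

```java
import android.os.SharedMemory;
import android.system.ErrnoException;

// End-to-end sketch of the two-copy path, reusing the assumed helpers
// SharedMemoryPool, CallbackBuffer, and TransmissionChannel.fill from above.
class FrameDelivery {
    static void deliverFrame(SharedMemoryPool pool, CallbackBuffer callbacks,
            String pictureNo, byte[] frame) throws ErrnoException {
        // S1005/S1007: create a region sized to the frame and register it.
        SharedMemory region = SharedMemory.create(pictureNo, frame.length);
        pool.put(pictureNo, region);

        // S1008: first copy, captured bytes -> shared memory.
        boolean filled = TransmissionChannel.fill(region, frame);

        // S1010-S1012: on success, hand the region to the registered callback;
        // the APP then performs the second copy into its user space (S1013).
        if (filled) {
            callbacks.dispatch(pictureNo, region);
        }
    }
}
```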
The present embodiment also provides a computer storage medium in which computer instructions are stored; when the computer instructions are executed on an electronic device, the electronic device is caused to execute the above related method steps to implement the cooperative method or the data transmission method in the above embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-mentioned related steps to implement the cooperative method or the data transmission method in the above-mentioned embodiments.
In addition, embodiments of the present application further provide an apparatus, which may specifically be a chip, a component, or a module, and may include a processor and a memory connected to each other. The memory is configured to store computer-executable instructions; when the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip performs the cooperative method or the data transmission method in the above method embodiments.
The electronic device (such as a tablet, a mobile phone, or an IoT device), the computer storage medium, the computer program product, and the chip provided in this embodiment are all used to execute the corresponding methods provided above; for their beneficial effects, reference may be made to the beneficial effects of the corresponding methods provided above, which are not repeated here.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above division of the functional modules is illustrated. In practical application, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules or units is merely a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (14)

1. The data transmission method is characterized by being applied to electronic equipment, wherein the electronic equipment comprises a first process and a second process, the first process is used for running a target application, and the second process is used for transmitting target data to the first process; the second process comprises a first module based on JAVA and a second module based on C++, the first module is used for adapting the first process and the second module, and the second module is used for receiving the target data; the first process and the second process communicate through AIDL, and the first module and the second module communicate through JNI;
The method comprises the following steps:
the second module obtains the frame size of the target data and sends the frame size to the first module;
the first module creates a shared memory according to the frame size and sends a shared memory address to the second module;
the second module fills the target data according to the shared memory address and sends a filling result to the first module;
the first module sends the shared memory address to the target application when the filling result indicates that filling is successful;
and the target application acquires the target data according to the shared memory address.
2. The method of claim 1, wherein the electronic device establishes a communication connection with an internet of things device, the target application being bound to the internet of things device;
before the second module obtains the frame size of the target data, the method further includes:
the target application sends a data request to the internet of things equipment through the second process;
and the second module receives the target data fed back by the Internet of things equipment according to the data request.
3. The method according to claim 2, wherein the data request includes a task identifier, and the target data carries the task identifier;
After the first module creates the shared memory according to the frame size, the method further includes:
the first module correspondingly stores the task identifier and the shared memory address in a shared memory pool;
the second module sends the filling result to the first module, including:
the second module sends the filling result and the task identifier to the first module;
before the first module sends the shared memory address to the target application, the method further includes:
and the first module queries the corresponding shared memory address in the shared memory pool according to the task identifier.
4. The method of claim 3, wherein the first module storing the task identity and the shared memory address in a shared memory pool, comprising:
when the number of the shared memory addresses stored in the shared memory pool is smaller than a shared memory maintenance number threshold, the first module correspondingly stores the task identifier and the shared memory addresses into the shared memory pool; the shared memory maintenance number threshold is used for indicating the total number of memory addresses which can be maintained by the shared memory pool;
When the number of the shared memory addresses stored in the shared memory pool is greater than or equal to the shared memory maintenance number threshold, the first module moves the target shared memory address out of the shared memory pool, and correspondingly stores the task identifier and the shared memory address into the shared memory pool; among the plurality of shared memory addresses currently stored in the shared memory pool, the target shared memory address is the one that was stored earliest.
5. A method according to claim 3, wherein the data request further comprises a data parameter, and the target data carries the data parameter;
the first module correspondingly stores the task identifier and the shared memory address in a shared memory pool, including:
the first module correspondingly stores the task identifier and the shared memory address in a shared memory pool matching the data parameter value.
6. The method of claim 5, wherein different shared memory pools matching different data parameter values have different shared memory maintenance number thresholds.
7. The method of claim 1, wherein the frame size of the target data is greater than a preset threshold.
8. The method of any one of claims 2-6, wherein the internet of things device is a desk lamp; the desk lamp is provided with a camera; the second process is used for running hardware virtualization service, the first module is an equipment management module, and the second module is a transmission channel module; the data request comprises a photographing request, and the target data comprises photographed image data;
the method further comprises the steps of:
the hardware virtualization service registers a virtual camera corresponding to the camera in a system;
and generating a photographing request when the target application calls the virtual camera.
9. The method of claim 8, wherein the data request further includes data parameters, the data parameters being a photographing mode, the photographing mode including a normal photographing mode and a continuous photographing mode; wherein,
and the shared memory maintenance number threshold of the first shared memory pool matched with the common photographing mode is smaller than that of the second shared memory pool matched with the continuous photographing mode.
10. The method of claim 8, wherein the electronic device comprises a cell phone or a tablet computer.
11. An electronic device, comprising:
a memory and a processor, the memory coupled with the processor;
the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the data transmission method of any of claims 1-10.
12. A cooperative work system, comprising: the electronic device for executing the data transmission method according to any one of claims 1-10, and the internet of things device, wherein a camera is arranged on the internet of things device, the camera is used for collecting image data, and a target application in the electronic device is bound with the internet of things device;
the electronic device is used for: registering a virtual camera corresponding to the camera in a system, and sending an image preview request to the internet of things equipment by calling the virtual camera;
the internet of things device is used for: invoking the camera to collect preview image data according to an image preview request of the electronic equipment, and sending the preview image data to a target application of the electronic equipment for preview display;
the electronic device is further configured to: generating a photographing request when the virtual camera is called, and sending the photographing request to the Internet of things equipment;
The internet of things device is further configured to: and calling the camera to shoot an image according to the shooting request of the electronic equipment, and sending shooting image data to the target application of the electronic equipment for display.
13. The system of claim 12, wherein the internet of things device is a desk lamp and the camera is configured to collect image data downward.
14. A computer readable storage medium comprising a computer program, characterized in that the computer program, when run on an electronic device, causes the electronic device to perform the data transmission method of any one of claims 1-10.
CN202210859322.6A 2022-07-21 2022-07-21 Data transmission method, electronic equipment and cooperative work system Pending CN117472603A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210859322.6A CN117472603A (en) 2022-07-21 2022-07-21 Data transmission method, electronic equipment and cooperative work system

Publications (1)

Publication Number Publication Date
CN117472603A (en) 2024-01-30

Family

ID=89622523

Country Status (1)

Country Link
CN (1) CN117472603A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination