CN115002384B - Method for transmitting data, electronic device and readable storage medium


Info

Publication number
CN115002384B
CN115002384B
Authority
CN
China
Prior art keywords
virtual camera
service
camera
controlling
adaptation
Prior art date
Legal status
Active
Application number
CN202111608704.3A
Other languages
Chinese (zh)
Other versions
CN115002384A (en)
Inventor
滕智飞
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202111608704.3A priority Critical patent/CN115002384B/en
Publication of CN115002384A publication Critical patent/CN115002384A/en
Application granted granted Critical
Publication of CN115002384B publication Critical patent/CN115002384B/en

Classifications

    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142 Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • H04N7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • G06F9/544 Buffers; Shared memory; Pipes
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Abstract

The application discloses a method for transmitting data, an electronic device, and a readable storage medium, belonging to the field of terminal technologies. The method includes the following steps. During a video call, if the condition for switching to the camera of the second electronic device is met, a first instruction of a first application is converted by the virtual camera process according to a first data format defined by the virtual camera adaptation process, where the virtual camera adaptation process is used to convert data transmitted between the virtual camera process and the interconnection service process. The virtual camera process is controlled to send the converted first instruction to the virtual camera adaptation process, which forwards it to the interconnection service process; the interconnection service process is used to receive and cache the video stream sent by the second electronic device. The interconnection service process is then controlled to execute, based on the video stream, the operation corresponding to the first instruction. In this way, the virtual camera process does not need to be aware of upper-layer logic, and a large amount of code adaptation can be avoided.

Description

Method for transmitting data, electronic device and readable storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method for transmitting data, an electronic device, and a readable storage medium.
Background
With the rapid development of terminal technology, multi-screen collaboration has become widely used. Multi-screen collaboration means that, once a first electronic device (e.g., a mobile phone) is connected to a second electronic device (e.g., a tablet computer), the screen of the first electronic device is displayed in a display window of the second electronic device, and a user can operate in that window to make the first electronic device execute a corresponding function; for example, triggering instant messaging software in the display window causes the first electronic device to initiate a video call.
In a multi-screen collaboration scenario, during a video call the first electronic device can capture video frames with its own camera, or switch to capturing video frames with the camera of the second electronic device. In both cases, the first electronic device typically implements this through a virtual camera process in the hardware abstraction layer.
However, the implementation of the virtual camera process in the hardware abstraction layer differs significantly across chip platforms, so large-scale logic conversion is needed for each type of chip platform to adapt the code, resulting in high development cost.
Disclosure of Invention
The application provides a method for transmitting data, an electronic device, and a readable storage medium, which can solve the problem in the related art that development cost is high because large-scale logic conversion of the virtual camera process is required for different types of chip platforms.
The technical solutions are as follows:
in a first aspect, a method for transmitting data is provided, where the method is applied to a first electronic device, and the first electronic device is connected to a second electronic device, and the method includes:
during a video call, if a condition for switching to the camera of the second electronic device is met, converting, by the virtual camera process, a first instruction of a first application according to a first data format defined by a virtual camera adaptation process, where the virtual camera adaptation process is used to convert data transmitted between the virtual camera process and an interconnection service process;
controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process, so that the first instruction is sent to the interconnection service process through the virtual camera adaptation process, where the interconnection service process is used to receive and cache a video stream sent by the second electronic device;
and controlling the interconnection service process to execute the operation corresponding to the first instruction based on the video stream.
In this way, the virtual camera process does not need to be aware of upper-layer logic; it only needs to convert the data according to the general data format defined by the virtual camera adaptation process and hand it to that process, which then converts the data according to upper-layer requirements and sends it to the interconnection service process. A large amount of code adaptation work is thus avoided, and development cost can be reduced.
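The two-stage conversion described above can be sketched as follows. This is a minimal conceptual model, not the patented implementation: every function, field, and format (`to_generic_format`, `to_service_format`, the dict keys) is a hypothetical stand-in for the "first data format" and the service-side format mentioned in the text.

```python
# Hypothetical sketch: the adaptation process defines one generic instruction
# format, so the virtual camera process never needs to know the upper-layer
# (interconnection-service-side) format.

def to_generic_format(app_instruction: dict) -> dict:
    """Virtual camera process side: wrap the first instruction in the
    generic format defined by the virtual camera adaptation process."""
    return {"op": app_instruction["name"], "args": app_instruction.get("params", {})}

def to_service_format(generic: dict) -> dict:
    """Virtual camera adaptation process side: convert the generic
    instruction into one the interconnection service process can execute."""
    return {"command": generic["op"].upper(), "payload": generic["args"]}

class InterconnectionService:
    """Interconnection service process: caches the remote video stream and
    executes instructions against it."""
    def __init__(self):
        self.video_buffer = []
        self.log = []

    def execute(self, instruction: dict):
        self.log.append(instruction["command"])
        if instruction["command"] == "REQUEST_FRAME" and self.video_buffer:
            return self.video_buffer.pop(0)
        return None

service = InterconnectionService()
service.video_buffer.append(b"frame-0")          # cached stream data
first_instruction = {"name": "request_frame"}    # from the first application
frame = service.execute(to_service_format(to_generic_format(first_instruction)))
```

The point of the indirection is that porting to a new chip platform only requires reimplementing `to_generic_format`, while `to_service_format` and the service stay unchanged.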
As an example of the present application, the controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process includes:
controlling the virtual camera process to acquire a first service handle;
controlling the virtual camera process to obtain a corresponding first service based on the first service handle, wherein the first service supports the first data format;
and controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process through the first service.
Therefore, by acquiring the first service of the virtual camera adaptation process and sending the converted first instruction to the virtual camera adaptation process, the first electronic device only needs to convert the first instruction into the first data format supported by the first service, and a large amount of logic conversion is avoided.
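Handle-based service lookup can be illustrated with a toy registry. This is an assumption-laden sketch loosely modelled on service-manager lookup; `ServiceManager`, `FirstService`, and the integer handle are all invented for illustration.

```python
# Hypothetical sketch: the virtual camera process obtains an opaque service
# handle, resolves it to the first service, and sends instructions through it.

class ServiceManager:
    _services = {}

    @classmethod
    def register(cls, name: str, service) -> int:
        handle = len(cls._services) + 1   # opaque integer handle
        cls._services[handle] = service
        return handle

    @classmethod
    def get_service(cls, handle: int):
        return cls._services[handle]

class FirstService:
    """Adapter-side first service; accepts instructions that are already
    in the first data format it supports."""
    def __init__(self):
        self.received = []

    def send(self, instruction: dict):
        self.received.append(instruction)

adapter_service = FirstService()
first_service_handle = ServiceManager.register("vcam_adapter.first", adapter_service)

# Virtual camera process side: resolve the handle, then send via the service.
svc = ServiceManager.get_service(first_service_handle)
svc.send({"op": "request_frame"})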
As an example of the present application, the sending, by the virtual camera process, the first instruction to the internet service process through the virtual camera adaptation process includes:
controlling the virtual camera adaptation process to convert the converted first instruction again to obtain an instruction which can be processed by the interconnection service process;
and controlling the virtual camera adaptation process to send an instruction which can be processed by the interconnection service process to the interconnection service process through a first callback function.
Therefore, the virtual camera adaptation process performs format conversion, so that the virtual camera process is prevented from being required to perform a large amount of code adaptation, namely, the virtual camera process does not need to pay attention to upper-layer logic, and the development cost can be reduced.
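The downstream delivery via the first callback function can be sketched like this; the class and method names are hypothetical, and the "re-conversion" is reduced to a trivial transform for clarity.

```python
# Hypothetical sketch: the interconnection service registers a first callback
# with the adaptation process; the adapter re-converts each instruction and
# invokes the callback to deliver it.

class VirtualCameraAdapter:
    def __init__(self):
        self.first_callback = None

    def register_first_callback(self, cb):
        self.first_callback = cb

    def on_instruction(self, generic: dict):
        # Convert again into a form the interconnection service can process.
        service_instr = {"command": generic["op"].upper()}
        self.first_callback(service_instr)

received = []                                     # interconnection service inbox
adapter = VirtualCameraAdapter()
adapter.register_first_callback(received.append)  # the first callback function
adapter.on_instruction({"op": "start_stream"})
```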
As an example of the application, after the controlling the interconnection service process to execute the operation corresponding to the first instruction based on the video stream, the method further includes:
controlling the interconnection service process to convert response data according to a second data format defined by the virtual camera adaptation process, wherein the response data is generated after the operation corresponding to the first instruction is executed;
and controlling the interconnection service process to send the converted response data to the virtual camera adaptation process so as to transmit the response data to the virtual camera process through the virtual camera adaptation process.
The interconnection service process, for its part, does not need to be aware of lower-layer logic: it converts the response data according to the general data format defined by the virtual camera adaptation process and hands it to that process, which converts it according to lower-layer requirements and sends the converted response data to the virtual camera process. A large amount of code adaptation work is thus avoided, and development cost can be reduced.
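The return path mirrors the forward path. A minimal sketch, with the "second data format" and the HAL-side tuple shape both invented as assumptions:

```python
# Hypothetical sketch of the upward path: the interconnection service wraps
# response data in the adapter-defined second data format; the adaptation
# process then reshapes it into what the virtual camera process expects.

def to_second_format(raw_response: bytes) -> dict:
    """Interconnection service: convert the response data generated after
    executing the first instruction into the generic second data format."""
    return {"status": "ok", "data": raw_response}

def to_hal_format(generic_response: dict) -> tuple:
    """Adaptation process: convert again so the virtual camera process can
    process the result (an error code plus a payload, for illustration)."""
    code = 0 if generic_response["status"] == "ok" else -1
    return (code, generic_response["data"])

code, payload = to_hal_format(to_second_format(b"frame-1"))
```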
As an example of the present application, the controlling the internet service process to send the converted response data to the virtual camera adaptation process includes:
controlling the interconnection service process to acquire a second service handle;
controlling the interconnection service process to acquire a corresponding second service based on the second service handle, wherein the second service supports the second data format;
and controlling the interconnection service process to send the converted response data to the virtual camera adaptation process through the second service.
Therefore, by acquiring the second service of the virtual camera adaptation process and sending the converted response data to the virtual camera adaptation process, the first electronic device only needs to convert the response data into the second data format supported by the second service, and a large amount of logic conversion is avoided.
As an example of the present application, the transmitting, by the internet service process, the response data to the virtual camera process through the virtual camera adaptation process includes:
controlling the virtual camera adaptation process to convert the converted response data again so as to obtain data which can be processed by the virtual camera process;
and controlling the virtual camera adaptation process to send the data which can be processed by the virtual camera process to the virtual camera process through a second callback function.
In this way, the virtual camera adaptation process performs the format conversion, and the interconnection service process does not need to care about the characteristics of the underlying chip, so development cost can be reduced.
As an example of the present application, the method further comprises:
when the camera process receives a camera opening instruction of the first application, controlling the camera process to execute an operation of opening a camera;
controlling the camera process to send an open camera notification to the virtual camera process if the opening of the camera is successful;
controlling the virtual camera process and the camera process to establish a binding relationship;
and controlling the virtual camera process to request to establish a binding relationship with the virtual camera adaptation process for data transmission, where the virtual camera adaptation process is started after the first electronic device boots.
Therefore, after the camera is opened, the virtual camera process and the virtual camera adaptation process are controlled to establish a binding relationship, so that a foundation is established for interaction between the two processes in a camera virtualization service scene subsequently.
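The setup sequence above (open camera → notify → bind) can be sketched as an ordered series of events. This is a schematic, not the actual process model; the classes and the event strings are hypothetical.

```python
# Hypothetical sketch: the camera process opens the camera, notifies the
# virtual camera process on success, and the virtual camera process then
# binds to the camera process and to the adaptation process (started at boot).

events = []

class CameraProcess:
    def open_camera(self) -> bool:
        events.append("camera_opened")
        return True

    def notify(self, vcam):
        events.append("open_camera_notification")
        vcam.on_camera_opened(self)

class VirtualCameraProcess:
    def on_camera_opened(self, camera_proc):
        self.camera = camera_proc                 # bind to the camera process
        events.append("bound_camera<->vcam")
        events.append("bound_vcam<->adapter")     # request binding to adapter

cam, vcam = CameraProcess(), VirtualCameraProcess()
if cam.open_camera():                             # only notify on success
    cam.notify(vcam)
```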
As an example of the present application, the controlling the virtual camera process to request to establish a binding relationship with the virtual camera adaptation process for data transmission includes:
controlling the virtual camera process to request to acquire a first service handle of the virtual camera adaptation process, wherein a first service corresponding to the first service handle is used for the virtual camera process to transmit data to the virtual camera adaptation process;
and controlling the virtual camera process to register a second callback function in the virtual camera adaptation process, wherein the second callback function is used for the virtual camera adaptation process to transmit data to the virtual camera process.
In this way, by obtaining the first service handle and registering the callback, a binding relationship is established so that data interaction can subsequently be performed using the first service and the second callback function.
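The two-part binding, a service handle for the downlink plus a registered callback for the uplink, can be sketched as follows; the handle value and method names are illustrative assumptions.

```python
# Hypothetical sketch: binding is two-way. The virtual camera process fetches
# the adapter's first service handle (for sending data down) and registers a
# second callback (so the adapter can push data back up).

class Adapter:
    def __init__(self):
        self.second_callback = None

    def get_first_service_handle(self) -> int:
        return 42                         # opaque handle, for illustration

    def register_second_callback(self, cb):
        self.second_callback = cb

class VirtualCamera:
    def __init__(self):
        self.received = []
        self.handle = None

    def bind(self, adapter: Adapter):
        self.handle = adapter.get_first_service_handle()
        adapter.register_second_callback(self.on_data)

    def on_data(self, data):
        self.received.append(data)

adapter, vcam = Adapter(), VirtualCamera()
vcam.bind(adapter)
adapter.second_callback(b"downstream-data")   # adapter pushes data upward
```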
As an example of the present application, after controlling the camera process to perform an operation of opening a camera when the camera process receives a camera opening instruction of the first application, the method further includes:
under the condition that the camera is opened successfully, controlling the interconnection service process to request to acquire a second service handle of the virtual camera adaptation process, wherein a second service corresponding to the second service handle is used for the interconnection service process to transmit data to the virtual camera adaptation process;
and controlling the interconnection service process to register a first callback function in the virtual camera adaptation process, wherein the first callback function is used for the virtual camera adaptation process to transmit data to the interconnection service process.
In this way, by obtaining the second service handle and the callback, a binding relationship is established, so that data interaction can be performed subsequently by using the second service and the first callback function.
As an example of the present application, the method further comprises:
controlling the virtual camera adaptation process to send a binding success notification to the interconnection service process, wherein the binding success notification is used for indicating that the virtual camera adaptation process is respectively bound with the virtual camera process and the interconnection service process;
and controlling the interconnection service process to start receiving the video stream.
Therefore, after the channel for transmitting data between the upper layer and the lower layer is established, the data transmission can be performed, so that the interconnection service process can be controlled to start receiving the video stream of the second electronic device, the switching of the video stream is realized, and the successful transmission of the video stream is ensured.
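Gating stream reception on the binding-success notification can be sketched as a simple state flag; the class and method names are invented for illustration.

```python
# Hypothetical sketch: the interconnection service only starts accepting the
# remote video stream after the adaptation process reports that it is bound
# to both the virtual camera process and the interconnection service process.

class InterconnectionService:
    def __init__(self):
        self.receiving = False
        self.buffer = []

    def on_binding_success(self):
        """Binding-success notification from the adaptation process."""
        self.receiving = True

    def on_frame(self, frame: bytes):
        if self.receiving:          # drop frames until the channel is ready
            self.buffer.append(frame)

ic = InterconnectionService()
ic.on_frame(b"early")               # dropped: channel not yet established
ic.on_binding_success()             # adapter: bound to both peers
ic.on_frame(b"frame-0")             # cached for the virtual camera process
```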
As an example of the present application, the method further comprises:
when the virtual camera adaptation process detects that the virtual camera process is abnormal, controlling the virtual camera adaptation process to remove the binding relationship with the virtual camera process; and/or,
and when the virtual camera process monitors that the virtual camera adaptation process is abnormal, controlling the virtual camera process to remove the binding relation with the virtual camera adaptation process.
In this way, each process manages its own life cycle and exception handling, preventing an abnormality in one process from affecting the other.
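The mutual death-monitoring can be sketched as follows. Each side only releases its own end of the binding when it observes the peer has died, so a crash does not cascade. The `Process` class here is a toy model, not an OS process.

```python
# Hypothetical sketch of mutual monitoring: when either side observes that its
# peer is abnormal, it removes only its local binding, so one process's crash
# cannot take down the other.

class Process:
    def __init__(self, name: str):
        self.name = name
        self.peer = None
        self.alive = True

    def bind(self, other: "Process"):
        self.peer, other.peer = other, self

    def on_peer_died(self):
        self.peer = None            # remove only the local binding relationship

vcam, adapter = Process("virtual_camera"), Process("vcam_adapter")
vcam.bind(adapter)
adapter.alive = False               # adaptation process crashes...
vcam.on_peer_died()                 # ...virtual camera process just unbinds
```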
In a second aspect, an apparatus for transmitting data is provided, where the apparatus for transmitting data structurally includes a processor and a memory, and the memory is used for storing a program for supporting the apparatus for transmitting data to execute the method for transmitting data provided in the first aspect, and storing data involved in implementing the method for transmitting data in the first aspect. The apparatus for transferring data may further include a communication bus for establishing a connection between the processor and the memory. The processor is configured to:
in the process of video call, if a switching condition of using a camera of the second electronic device is met, converting a first instruction according to a first data format defined by a virtual camera adaptation process through the virtual camera process according to the first instruction of a first application, wherein the virtual camera adaptation process is used for converting data transmitted between the virtual camera process and an interconnection service process;
controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process so as to send the first instruction to the interconnection service process through the virtual camera adaptation process, wherein the interconnection service process is used for receiving and caching a video stream sent by second electronic equipment;
and controlling the interconnection service process to execute the operation corresponding to the first instruction based on the video stream.
As an example of the present application, the processor is configured to:
controlling the virtual camera process to acquire a first service handle;
controlling the virtual camera process to acquire a corresponding first service based on the first service handle, wherein the first service supports the first data format;
and controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process through the first service.
As an example of the present application, the processor is configured to:
controlling the virtual camera adaptation process to convert the converted first instruction again to obtain an instruction which can be processed by the interconnection service process;
and controlling the virtual camera adaptation process to send an instruction which can be processed by the interconnection service process to the interconnection service process through a first callback function.
As an example of the present application, the processor is further configured to:
controlling the interconnection service process to convert response data according to a second data format defined by the virtual camera adaptation process, wherein the response data is generated after the operation corresponding to the first instruction is executed;
and controlling the interconnection service process to send the converted response data to the virtual camera adaptation process so as to transmit the response data to the virtual camera process through the virtual camera adaptation process.
As an example of the present application, the processor is configured to:
controlling the interconnection service process to acquire a second service handle;
controlling the interconnection service process to acquire a corresponding second service based on the second service handle, wherein the second service supports the second data format;
and controlling the interconnection service process to send the converted response data to the virtual camera adaptation process through the second service.
As an example of the present application, the processor is configured to:
controlling the virtual camera adaptation process to convert the converted response data again so as to obtain data which can be processed by the virtual camera process;
and controlling the virtual camera adaptation process to send data which can be processed by the virtual camera process to the virtual camera process through a second callback function.
As an example of the present application, the processor is further configured to:
when the camera process receives a camera opening instruction of the first application, controlling the camera process to execute an operation of opening a camera;
controlling the camera process to send a camera opening notification to the virtual camera process if the camera opening is successful;
controlling the virtual camera process and the camera process to establish a binding relationship;
and controlling the virtual camera process to request to establish a binding relationship with the virtual camera adaptation process for data transmission, where the virtual camera adaptation process is started after the first electronic device boots.
As an example of the present application, the processor is configured to:
controlling the virtual camera process to request to acquire a first service handle of the virtual camera adaptation process, wherein a first service corresponding to the first service handle is used for the virtual camera process to transmit data to the virtual camera adaptation process;
and controlling the virtual camera process to register a second callback function in the virtual camera adaptation process, wherein the second callback function is used for the virtual camera adaptation process to transmit data to the virtual camera process.
As an example of the present application, the processor is further configured to:
under the condition that the camera is opened successfully, controlling the interconnection service process to request to acquire a second service handle of the virtual camera adaptation process, wherein a second service corresponding to the second service handle is used for the interconnection service process to transmit data to the virtual camera adaptation process;
and controlling the interconnection service process to register a first callback function in the virtual camera adaptation process, wherein the first callback function is used for the virtual camera adaptation process to transmit data to the interconnection service process.
As an example of the present application, the processor is further configured to:
controlling the virtual camera adaptation process to send a binding success notification to the interconnection service process, wherein the binding success notification is used for indicating that the virtual camera adaptation process is respectively bound with the virtual camera process and the interconnection service process;
and controlling the interconnection service process to start receiving the video stream.
As an example of the present application, the processor is further configured to:
when the virtual camera adaptation process detects that the virtual camera process is abnormal, controlling the virtual camera adaptation process to remove the binding relationship with the virtual camera process; and/or,
and when the virtual camera process monitors that the virtual camera adaptation process is abnormal, controlling the virtual camera process to remove the binding relation with the virtual camera adaptation process.
In a third aspect, there is provided a computer readable storage medium having stored therein instructions which, when run on a computer, cause the computer to perform the method of the first aspect described above.
In a fourth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of the first aspect described above.
The technical effects obtained by the second, third and fourth aspects are similar to the technical effects obtained by the corresponding technical means in the first aspect, and are not described herein again.
Drawings
FIG. 1 is a schematic diagram illustrating modules in an electronic device in accordance with an exemplary embodiment;
FIG. 2 is a schematic diagram illustrating modules in an electronic device in accordance with another exemplary embodiment;
FIG. 3 is a schematic diagram illustrating an interface in a collaboration scenario in accordance with an illustrative embodiment;
FIG. 4 is a schematic diagram illustrating the structure of an electronic device in accordance with one illustrative embodiment;
FIG. 5 is a software architecture diagram of an electronic device shown in accordance with an exemplary embodiment;
FIG. 6 is an interface schematic diagram of a tablet computer according to an exemplary embodiment;
FIG. 7 is an interface schematic diagram of a cell phone, according to an example embodiment;
FIG. 8 is an interface schematic of a tablet computer according to another exemplary embodiment;
FIG. 9 is an interface schematic of a cell phone according to another exemplary embodiment;
FIG. 10 is a schematic diagram of a tablet computer shown in accordance with an exemplary embodiment;
FIG. 11 is a schematic diagram illustrating an interface in a collaboration scenario in accordance with an illustrative embodiment;
FIG. 12 is a schematic diagram of an interface in a collaboration scenario shown in accordance with another illustrative embodiment;
FIG. 13 is a flow chart illustrating a method for establishing a channel between an upper layer and a lower layer in accordance with an exemplary embodiment;
FIG. 14 is a diagram illustrating a virtual camera process establishing a binding relationship with a virtual camera adaptation process in accordance with an illustrative embodiment;
FIG. 15 is an initialization schematic diagram of a virtual camera adaptation process, shown in accordance with an exemplary embodiment;
FIG. 16 is a diagram illustrating an interconnection services process establishing a binding relationship with a virtual camera adaptation process in accordance with an illustrative embodiment;
FIG. 17 is a flow chart illustrating a method for establishing a channel between an upper layer and a lower layer in accordance with another exemplary embodiment;
FIG. 18 is a schematic flow chart diagram illustrating a method of transmitting data in accordance with an exemplary embodiment;
FIG. 19 is a schematic diagram illustrating the transfer of data between a virtual camera process and an interconnection service process in accordance with an illustrative embodiment;
FIG. 20 is a schematic diagram illustrating an exception handling flow in accordance with an illustrative embodiment;
FIG. 21 is a flowchart illustrating a method of transmitting data in accordance with another exemplary embodiment;
fig. 22 is a schematic diagram illustrating a structure of an apparatus for transmitting data according to an exemplary embodiment.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that reference to "a plurality" in this application means two or more. In the description of the present application, "/" means "or" unless otherwise stated; for example, A/B may mean A or B. "And/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. In addition, for clarity in describing the technical solutions of the present application, words such as "first" and "second" are used to distinguish between identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that such words do not limit quantity or execution order and do not denote relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In a possible home scenario, a first user in a first room makes a video call with a second user through a mobile phone. If, during the call, the second user wants to see what a third user is doing, and the third user is in a second room, the first user has to carry the mobile phone from the first room to the second room and keep the lens pointed at the third user. The chatting between the first user and the second user during the video call may then disturb what the third user is doing.
In a possible conference scenario, a first employee makes a video call through a mobile phone with a second employee from a different company. If, during the call, the first employee wants to share the content of a blackboard bulletin with the second employee, the first employee has to hold the mobile phone and keep the lens aimed at the bulletin, which makes for a poor user experience.
With the popularization of multi-screen collaboration technology, when a multi-screen collaborative connection is established between a mobile phone and a tablet computer, the mobile phone can acquire video frames through the camera of the tablet computer during a video call, so that the user no longer needs to hold the mobile phone and keep its camera aimed at the target.
In a multi-screen collaboration scenario, the mobile phone acquiring a video stream through the tablet computer during a video call is called a camera virtualization service. At present, the camera virtualization service is generally implemented jointly by a collaboration service process and a plurality of modules of the hardware abstraction layer. Referring to fig. 1 (a), the plurality of modules includes at least a camera module and a virtual camera module. After the camera virtualization service is started, the collaboration service process receives and caches the video stream captured by the camera of the tablet computer; the virtual camera module obtains that video stream from the collaboration service process and uses it to replace the video stream that the camera module acquires from the camera of the mobile phone; the camera module then feeds the replaced video stream back to the application layer. However, since the camera module and the virtual camera module run in the same process and are coupled to each other, an abnormal interruption of the virtual camera module causes an abnormality in the camera module as well.
To this end, in some embodiments, the camera module and the virtual camera module are decoupled. For example, as shown in fig. 1 (b), the functions of the two modules are executed by separate processes, namely a camera process and a virtual camera process, and a communication connection is established between the two processes through a virtual camera interface layer. The camera process executes the functions of the camera module, and the virtual camera process executes the functions of the virtual camera module. With the two modules decoupled in this way, the problem of a camera module abnormality caused by an abnormal interruption of the virtual camera module can be avoided.
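The isolation benefit of this decoupling can be sketched in a few lines. The following is a minimal illustration, not the actual HAL implementation: the camera side reaches the virtual camera only through a narrow interface, so a failure on the virtual side is contained at that boundary and the camera path falls back to the local frame. All class and method names here are illustrative assumptions.

```java
import java.util.Optional;

// Stands in for the virtual camera interface layer between the two processes.
interface VirtualCameraInterface {
    Optional<byte[]> fetchRemoteFrame();    // frame captured by the collaborative device
}

class CameraProcess {
    private final VirtualCameraInterface virtualCamera;

    CameraProcess(VirtualCameraInterface virtualCamera) {
        this.virtualCamera = virtualCamera;
    }

    /** Returns the remote frame when available, otherwise the local frame. */
    byte[] nextFrame(byte[] localFrame) {
        try {
            return virtualCamera.fetchRemoteFrame().orElse(localFrame);
        } catch (RuntimeException virtualCameraFailure) {
            // The virtual camera side failed; the camera process itself survives
            // and keeps serving the local stream.
            return localFrame;
        }
    }
}
```

In the coupled single-process design, the same failure would propagate into the camera module; here it is absorbed at the interface boundary.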
However, since the implementation of the virtual camera process in the hardware abstraction layer differs greatly across chip platforms, a large amount of adaptation code is usually required. For example, referring to fig. 2 (a), the underlying chips include a first type of chip corresponding to a first camera interface and a second type of chip corresponding to a second camera interface. According to the specific scheme of each underlying chip, an intermediate service module and a top-layer conversion interface usually need to be added on top of it to perform code adaptation for each type of chip platform separately, which makes the scheme poorly compatible and hard to extend. To this end, this embodiment provides a method in which, referring to fig. 2 (b), a chip adaptation layer module is added. The chip adaptation layer module is independent of the underlying chip and is used to open a channel between the upper layer and the lower layer. It provides a top-layer conversion interface to the upper layer and a bottom-layer conversion interface to the lower layer, and includes a converter for converting the data transmitted between the two layers. In this way, data between the upper layer and the lower layer is converted by the chip adaptation layer module; that is, the upper layer only needs to convert the data to be forwarded into a data format supported by the top-layer conversion interface, without paying attention to the characteristics of the underlying chip.
Likewise, the lower layer does not need to be concerned with upper-layer logic: whether for a first type of chip corresponding to the first camera interface, a second type of chip corresponding to the second camera interface, or another type of chip corresponding to another camera interface, the virtual camera process only needs to convert the data to be forwarded into a data format supported by the bottom-layer conversion interface. This avoids the need for a large amount of code adaptation for different chip platforms. A specific implementation of the method is described below.
Before describing the method provided by the embodiments of the present application in detail, the execution entity involved in the embodiments is introduced. The method provided by the embodiments of the present application may be executed by an electronic device. The electronic device supports the video call function; for example, the electronic device is provided with instant messaging software through which a video call can be made. The electronic device is configured with one or more cameras, through which video frames can be captured during a video call. In one embodiment, when the electronic device includes a plurality of cameras, these may include front cameras and rear cameras, and the number of each may be one or more. In addition, the electronic device has multi-screen collaboration capability and can establish a multi-screen collaborative relationship with another electronic device. By way of example and not limitation, the electronic device may include, but is not limited to, a mobile phone, a tablet computer, a laptop computer, or a smart watch. In an example, referring to fig. 3, taking the case where the electronic device is a mobile phone and the other electronic device is a tablet computer, after the mobile phone and the tablet computer establish a multi-screen collaborative relationship, the mobile phone may project its display onto the tablet computer, and during a video call, the mobile phone may acquire a video stream through the camera of the tablet computer and send the video stream to the peer device with which it is making the video call.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to fig. 4, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces, such as an inter-integrated circuit (I2C) interface, an inter-integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G and the like applied on the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 may receive an electromagnetic wave through the antenna 1, perform processing such as filtering and amplification on the received electromagnetic wave, and transmit the processed signal to the modem processor for demodulation. The mobile communication module 150 may also amplify a signal modulated by the modem processor and convert it into an electromagnetic wave for radiation through the antenna 1.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives an electromagnetic wave via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signal, and sends the processed signal to the processor 110. The wireless communication module 160 may also receive a to-be-sent signal from the processor 110, perform frequency modulation and amplification on it, and convert it into an electromagnetic wave for radiation via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter opens, light is transmitted through the lens to the photosensitive element of the camera, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin tone of the image through algorithms, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being an integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. Thus, the electronic device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving files of music, video, etc. in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created by the electronic device 100 during use, and the like.
Next, the software system of the electronic device 100 will be described.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In the embodiment of the present application, an Android (Android) system with a layered architecture is taken as an example to exemplarily describe a software system of the electronic device 100.
Fig. 5 is a block diagram of a software system of an electronic device 100 according to an embodiment of the present disclosure. Referring to fig. 5, the layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into an application layer, an application framework layer, a system layer, an extension layer, and a kernel layer from top to bottom.
The application layer may include a series of application packages. As shown in fig. 5, the application package may include, but is not limited to, instant messenger, multi-screen collaboration, camera, bluetooth, gallery, call, map. The instant messaging software can be used for realizing video call; and the multi-screen cooperation is used for starting the multi-screen cooperation function. For convenience of description, the other electronic devices that establish a cooperative relationship with the home terminal will be referred to as cooperative electronic devices, and the connection established between the home terminal and the cooperative electronic devices will be referred to as a cooperative connection hereinafter.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 5, the application framework layer may include a service discovery process, a collaborative helper process, and an interconnection services process.
The service discovery process is used to monitor, after Bluetooth or NFC is turned on, a connection instruction for indicating multi-screen collaboration, and to notify the collaboration assistant process once the instruction is detected. The collaboration assistant process is used to establish a collaborative connection, after receiving the notification from the service discovery process, by exchanging information with the collaboration assistant process in the other electronic device.
As an example of the present application, during a video call in a multi-screen collaboration scenario, the interconnection service process is configured to receive and cache the video stream sent by the collaborative electronic device, and to provide corresponding camera services to the bottom layer according to the bottom layer's requests.
In one example, the interconnection service process includes a data processing module, a transmission channel module, a flow control module, and a capability collection module. The data processing module may be used to process video frames according to the requirements of the bottom layer, such as format conversion; the transmission channel module is used to configure the transmission channel; the flow control module is used to cache the video stream; and the capability collection module is used to collect the camera capabilities of the home electronic device and the collaborative electronic device, so as to match the cameras of the home electronic device with the cameras of the collaborative electronic device according to the collected camera capabilities. For example, when it is determined from the collected camera capabilities that the home electronic device and the collaborative electronic device each include a front camera and a rear camera, the front camera of the home electronic device corresponds to the front camera of the collaborative electronic device, and the rear camera of the home electronic device corresponds to the rear camera of the collaborative electronic device.
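The front-to-front and rear-to-rear matching step described above can be sketched as follows. This is an illustrative simplification, not the patent's actual capability collection module: each local camera is paired with the first camera of the collaborative device that has the same facing. The type and method names are assumptions made for the example.

```java
import java.util.ArrayList;
import java.util.List;

enum Facing { FRONT, REAR }

// One entry of collected camera capability: a camera id and which way it faces.
record CameraInfo(String id, Facing facing) {}

class CapabilityCollector {
    /**
     * Pairs each home-device camera with the first collaborative-device
     * camera that has the same facing (front with front, rear with rear).
     */
    static List<String[]> matchCameras(List<CameraInfo> local, List<CameraInfo> remote) {
        List<String[]> pairs = new ArrayList<>();
        for (CameraInfo l : local) {
            for (CameraInfo r : remote) {
                if (l.facing() == r.facing()) {
                    pairs.add(new String[] { l.id(), r.id() });
                    break; // take the first remote camera with the same facing
                }
            }
        }
        return pairs;
    }
}
```

A fuller implementation would also consult other collected capabilities (resolution, frame rate) when pairing; the facing check above is the minimum described in the text.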
As an example of the present application, the system layer includes a virtual camera adaptation process, a multimedia platform, and the like. For example, referring to fig. 2 (b), the virtual camera adaptation process includes the chip adaptation layer module. In one example, the virtual camera adaptation process is started after the electronic device is powered on; that is, it is a resident process. The virtual camera adaptation process is mainly used to perform format conversion on data transmitted between the interconnection service process and the extension layer, and can be understood as a bridge for data transmission between the interconnection service process and the virtual camera process. Illustratively, when the interconnection service process needs to transmit data to the virtual camera process, it first sends the data to the virtual camera adaptation process; the virtual camera adaptation process performs format conversion, that is, converts the data into a format that the virtual camera process can recognize and handle, and then sends the converted data to the virtual camera process. Conversely, when the virtual camera process needs to transmit data to the interconnection service process, it first sends the data to the virtual camera adaptation process; the virtual camera adaptation process converts the data into a format that the interconnection service process can recognize and handle, and then sends the converted data to the interconnection service process. Therefore, owing to the virtual camera adaptation process, the virtual camera process does not need to pay attention to upper-layer logic, and the interconnection service process does not need to pay attention to the characteristics of the underlying chip.
In one example, a virtual camera adaptation process includes a first service, a second service, and a translator.
The first service is used to provide services for the lower layer and supports a first data format, which may be set by technical personnel according to actual requirements. For example, the first service is the TRANSLATOR service; referring to fig. 2 (b), the TRANSLATOR service may be the bottom-layer conversion interface. For example, if the virtual camera process needs to send data to the interconnection service process in the upper layer, it may convert the data according to the first data format and then send the converted data to the virtual camera adaptation process through the TRANSLATOR service, to be forwarded to the interconnection service process by the virtual camera adaptation process.
The second service is used to provide services for the upper layer and supports a second data format, which may be set by technical personnel according to actual requirements. For example, the second service is the CHANNEL service; referring to fig. 2 (b), the CHANNEL service may be the top-layer conversion interface. For example, if the interconnection service process needs to send data to the virtual camera process in the lower layer, it may convert the data according to the second data format and then send the converted data to the virtual camera adaptation process through the CHANNEL service, to be forwarded to the virtual camera process by the virtual camera adaptation process.
The converter is used to perform format conversion on data: for example, converting bottom-layer data into a format that the upper layer can recognize and process, or converting upper-layer data into a format that the bottom layer can recognize and process.
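The division of labor among the two services and the converter can be sketched as follows. This is a minimal illustration under assumed data formats (simple string prefixes stand in for the real first and second data formats); the class and method names are not from the patent.

```java
// Sketch of the chip adaptation layer: a CHANNEL-like entry point accepts
// upper-layer data, a TRANSLATOR-like entry point accepts lower-layer data,
// and the converter in the middle translates between the two formats, so
// neither side needs to know anything about the other's format.
class ChipAdaptationLayer {
    // --- Converter: translates between the assumed formats ---
    static String toLowerFormat(String upperData) {
        return upperData.replaceFirst("^UPPER:", "LOWER:");
    }

    static String toUpperFormat(String lowerData) {
        return lowerData.replaceFirst("^LOWER:", "UPPER:");
    }

    // Second service (CHANNEL-like): used by the interconnection service
    // to send data downward; the result would be delivered to the virtual
    // camera process.
    static String channelSend(String upperData) {
        return toLowerFormat(upperData);
    }

    // First service (TRANSLATOR-like): used by the virtual camera process
    // to send data upward; the result would be delivered to the
    // interconnection service process.
    static String translatorSend(String lowerData) {
        return toUpperFormat(lowerData);
    }
}
```

Because all conversion lives in this one layer, supporting a new chip platform only requires teaching the converter its format, rather than adapting the upper layer.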
As an example of the present application, the extension layer mainly includes a camera process and a virtual camera process, and the camera process and the virtual camera process are located in a HAL (hardware abstraction layer). The camera process is used to open a local camera according to the business requirements of the application layer, and capture video frames through the local camera, for example, referring to fig. 2 (b), the camera process may include a first camera interface or a second camera interface or other camera interfaces. By way of example and not limitation, the camera process is initiated upon power-up. In one embodiment, in a multi-screen collaborative scenario, a camera process requests to acquire a video stream captured by a collaborative electronic device through a virtual camera process. Correspondingly, the virtual camera process acquires the video frame acquired by the cooperative electronic equipment from the interconnection service process according to the request of the camera process, and sends the video frame acquired by the cooperative electronic equipment to the camera process so as to replace the local video stream, thereby realizing the switching of the video stream.
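The stream-switching step described above can be sketched as follows. This is an illustration only: a queue stands in for the video stream that the interconnection service process has cached from the collaborative device, and the names are assumptions rather than the patent's implementation.

```java
import java.util.ArrayDeque;
import java.util.Deque;

class VirtualCameraProcess {
    // Stands in for the video stream cached by the interconnection service.
    private final Deque<String> cachedRemoteFrames = new ArrayDeque<>();

    /** Called as frames from the collaborative device arrive. */
    void onRemoteFrame(String frame) {
        cachedRemoteFrames.addLast(frame);
    }

    /**
     * Called by the camera process for each local frame: a cached remote
     * frame, when available, replaces the local one; otherwise the local
     * frame passes through unchanged.
     */
    String substitute(String localFrame) {
        String remote = cachedRemoteFrames.pollFirst();
        return remote != null ? remote : localFrame;
    }
}
```

The pass-through branch matters in practice: before the collaborative device's first frame arrives, the application layer still receives a continuous stream.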
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, and the like. The camera driver is used to drive the camera hardware so that the camera starts, allowing the electronic device to capture images through the camera.
It should be noted that the internal structure of the cooperative electronic device (e.g., a tablet computer) is the same as the internal structure of the electronic device (e.g., a mobile phone), and details are not repeated in this application.
For the convenience of understanding, taking the cooperation between the mobile phone and the tablet pc as an example, several possible connection modes of multi-screen cooperation will be described first.
1. The connection is established via bluetooth.
In one embodiment, when the user wants to use the mobile phone and the tablet computer collaboratively, Bluetooth can be turned on in both devices. In addition, on the mobile phone, the user manually turns on the multi-screen collaboration function; for example, the user may find the "multi-device coordination" switch through the path "set" - "more connect" - "multi-device coordination" and set the switch to the on state, so that the multi-screen collaboration function of the mobile phone is turned on.
Referring to fig. 6 (a), the user slides down a notification panel from the status bar of the tablet computer, and the notification panel includes a "multi-screen cooperation" option 61. The user can click the "multi-screen cooperation" option 61, and in response to the user's trigger operation on the option, the tablet computer displays a first prompt window, which includes first operation prompt information instructing the user how to operate. Illustratively, as shown in fig. 6 (b), the first operation prompt information includes the prompt "1. Turn on your mobile phone's Bluetooth and bring it near this computer; after this computer is found, click "connect". 2. After connection, you can operate the mobile phone on the tablet computer to share data between devices." The user can then operate according to the first operation prompt information in the first prompt window, for example, by bringing the mobile phone close to the tablet computer.
In one example, when the tablet is found by the mobile phone during the process of the mobile phone approaching the tablet, the mobile phone displays a second prompt window, such as please refer to fig. 7 (a), which includes a "connect" option 71 and a "cancel" option 72. When the user clicks the "connect" option 71, it indicates that the user confirms that a connection is to be established, and at this time, in response to the triggering operation of the user on the "connect" option 71, the mobile phone establishes a connection with the tablet computer through bluetooth. When the user clicks the cancel option 72, it indicates that the user wants to cancel the connection establishment, and at this time, the mobile phone does not execute the operation of establishing the connection in response to the user's trigger operation on the cancel option 72. In another example, during the process that the mobile phone approaches the tablet computer, when the mobile phone finds the tablet computer, the second prompt window may not be displayed, and the connection with the tablet computer is automatically established through bluetooth.
By way of example and not limitation, in the process of establishing a connection between the mobile phone and the tablet computer through bluetooth, in order to display the progress of establishing the connection, the mobile phone may further display a third prompt window for indicating that the connection is being made, for example, the third prompt window is shown in (b) of fig. 7. Optionally, a "cancel" option is included in the third prompt window, so that the user can cancel the connection in the process of establishing the connection if necessary.
2. And establishing connection in a code scanning mode.
At the tablet computer end, the user can operate on the tablet computer following the path "my mobile phone" - "immediate connection" - "code scanning connection", and in response to the user's operation, the tablet computer displays a two-dimensional code for establishing a connection, as shown in fig. 8. Optionally, the tablet computer may further display second operation prompt information to prompt the user how to operate, for example, "scan the code with a mobile phone browser to connect".
At the mobile phone end, an interface including a "scan" option may be entered in the browser (or smart vision), for example, please refer to fig. 9 (a), and the interface of the browser is entered, which includes a "scan" option 91. The user can click the "scan" option 91, and in response to the user's trigger operation on the "scan" option 91, the mobile phone starts the camera, as shown in fig. 9 (b), so that the user can align the camera with the two-dimensional code displayed at the tablet computer terminal to perform code scanning operation.
In one example, after the mobile phone successfully scans the code, it sends a connection request to the tablet computer. After receiving the request, the tablet computer may display a fourth prompt window that includes prompt information asking the user whether to agree to establish the connection, for example, "Device xx requests to establish a connection with this device. Agree to connect?", and the fourth prompt window may also include an "agree" option and a "reject" option. When the user clicks the "agree" option, this indicates that the user allows the mobile phone to connect to the tablet computer, and in response to the user's trigger operation on the "agree" option, the tablet computer establishes the connection with the mobile phone. When the user clicks the "reject" option, this indicates that the user does not allow the mobile phone to connect to the tablet computer, and in response to the user's trigger operation on the "reject" option, the tablet computer notifies the mobile phone that the connection has failed.
It should be noted that the above description takes opening the two-dimensional code at the tablet computer end through the path "my mobile phone" - "immediate connection" - "code scanning connection" as an example. In another embodiment, the two-dimensional code may be opened through another path. Exemplarily, referring to fig. 6 (b), the first prompt window includes, in addition to the first operation prompt information, second operation prompt information, such as "Cannot find a local device? You can also connect by scanning a code", where the phrase "code scanning connection" is set to be tappable. The user may click the "code scanning connection" content in the first prompt window. In response to the user's trigger operation on "code scanning connection", the tablet computer displays the two-dimensional code interface shown in fig. 8, so that the user can scan the two-dimensional code on the tablet computer with the mobile phone and establish the connection by scanning the code.
3. The connection is established by a tap-to-connect (NFC touch) manner.
The user can enable NFC and the multi-screen collaboration function on the mobile phone and the tablet computer respectively. Then, the user taps the NFC region on the back of the mobile phone (around the rear camera) against the NFC region on the keyboard (usually located in the lower right corner of the tablet computer, as shown at 101 in fig. 10); in response to this operation, a connection is established between the mobile phone and the tablet computer through NFC. Optionally, before the connection is established through NFC, the tablet computer and the mobile phone may each prompt the user to confirm the connection, and the two devices perform the connection operation after the user agrees. In one example, the mobile phone may also alert the user by vibrating or ringing when it is connected to the tablet computer.
It should be noted that the above connection manners are all described taking a wireless implementation as an example. In another embodiment, the screen projection may be performed in a wired manner, for example through a Type-C to high-definition multimedia interface (HDMI) cable, which is not limited in the embodiments of the present application.
After the mobile phone is successfully connected to the tablet computer, the screen of the mobile phone is mirrored in a display window of the tablet computer, for example as shown in fig. 11, and the user can operate in the window as needed. In one example, when the user wants to make a video call through the instant messaging software, the user can click the icon of instant messaging software A in the window of the tablet computer to open it and then click the "video call" option. In response to the user's trigger operation on the "video call" option, the tablet computer sends a video call control instruction to the mobile phone; after receiving it, the mobile phone initiates a video call request and conducts the video call with the other user. Referring to fig. 3, during this process, the pictures in the windows of the mobile phone and the tablet computer are displayed synchronously.
In one embodiment, referring to fig. 12, after the notification bar of the tablet computer is pulled down, the multi-screen collaboration notification bar of the tablet computer shows that collaboration with the mobile phone is in progress. In addition, the multi-screen collaboration notification bar includes a state switch, which the user can toggle as needed. For example, if the user wants to capture the picture through the camera of the tablet computer during a video call, the state switch may be turned to the on state; the switch then reads "audio/video switched to the tablet", meaning that the picture for the mobile phone's video call is captured by the camera of the tablet computer. Conversely, if the user only wants to use the camera of the mobile phone to capture the picture, the state switch can be turned to the off state, so that the picture in the video call is captured by the camera of the mobile phone. The specific implementation principle can be seen in the following embodiments.
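By way of illustration only, the effect of the state switch on the frame source can be sketched as follows (a minimal Python sketch; the function and return strings are hypothetical, not the actual implementation):

```python
def video_frame_source(state_switch_on: bool) -> str:
    """Sketch of the state-switch decision: when the switch is on, frames for
    the phone's video call come from the tablet's camera; when it is off,
    they come from the phone's own camera."""
    return "tablet_camera" if state_switch_on else "phone_camera"
```

The rest of this description explains how this single toggle is realized by binding (or not binding) the virtual camera processes.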
Based on the software architecture diagram provided in fig. 5, the method flow provided in the embodiment of the present application is described in detail below. Referring to fig. 13, fig. 13 is a schematic flow chart illustrating a channel establishment between an upper layer and a lower layer according to an exemplary embodiment, which is described as an example and not a limitation, where the method is applied to a mobile phone, and the mobile phone is implemented by interaction of multiple processes shown in fig. 5. Assume that a cooperative connection is established between the mobile phone and the tablet computer. The method may include some or all of the following:
step A1: and the service discovery process of the mobile phone starts an instruction monitoring function.
As an example of the present application, when a user wants to perform multi-screen collaboration between a mobile phone and a tablet computer, Bluetooth and NFC may be turned on in the mobile phone and in the tablet computer. For either device, the service discovery process starts the instruction monitoring function once Bluetooth or NFC is turned on. That is, the service discovery process of the mobile phone starts executing the instruction monitoring function when it detects that Bluetooth or NFC is turned on, and similarly, the service discovery process of the tablet computer starts executing the instruction monitoring function when it detects that Bluetooth or NFC is turned on.
Step A2: and when the service discovery process of the mobile phone monitors the connection instruction, sending the connection instruction to the cooperative assistant process.
In one example, when the user triggers the "multi-screen collaboration" option on the tablet computer, the multi-screen collaboration application in the tablet computer sends a connection instruction to the service discovery process of the application framework layer. The service discovery process then broadcasts the connection instruction, so that the service discovery process of the mobile phone can monitor the connection instruction as the mobile phone approaches the tablet computer.
In one example, after the service discovery process of the mobile phone listens to the connection instruction, the connection instruction is sent to the cooperative assistant process of the mobile phone.
In another example, referring to fig. 7 (a), after the service discovery process of the mobile phone monitors the connection instruction, a second prompt window may be displayed on the mobile phone, so that the user can confirm on the mobile phone, through the second prompt window, whether to approve the cooperative connection between the mobile phone and the tablet computer. When a confirmation instruction based on the second prompt window is received, the user agrees to establish the cooperative connection; in this case, the service discovery process of the mobile phone sends the connection instruction to the cooperative assistant process of the mobile phone.
Step A3: and the cooperative assistant process of the mobile phone and the cooperative assistant process of the tablet computer exchange equipment information.
Illustratively, the exchanged device information may include, but is not limited to, device location information and device capability information. The device location information may be coordinate information used to establish a data transmission channel between the mobile phone and the tablet computer. The device capability information of the mobile phone may include the number of cameras, the attribute of each camera, and the capability information of each camera: the number of cameras refers to how many cameras the mobile phone has, the camera attribute may indicate a front camera or a rear camera, and the camera capability information may describe the resolutions, frame rates, and the like that the camera supports. Similarly, the device capability information of the tablet computer may also include its number of cameras, the attribute of each camera, and the capability information of each camera.
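The exchanged information described above can be sketched with simple data structures (a Python sketch with illustrative field names; the patent does not specify the actual wire format):

```python
from dataclasses import dataclass, field

@dataclass
class CameraInfo:
    position: str        # camera attribute: "front" or "rear"
    resolutions: list    # capability info: supported resolutions
    max_fps: int         # capability info: supported frame rate

@dataclass
class DeviceInfo:
    coordinates: tuple                      # used to set up the transport channel
    cameras: list = field(default_factory=list)  # one CameraInfo per camera

def exchange_device_info(phone: DeviceInfo, tablet: DeviceInfo) -> dict:
    # Each cooperative assistant process hands its own device info to the peer.
    return {"phone_received": tablet, "tablet_received": phone}
```

After this exchange, each side knows the other's camera count and capabilities, which the interconnection service process later collects during initialization.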
That is, the mobile phone sends its coordinate information and device capability information to the cooperative assistant process of the tablet computer. After receiving this device information, the cooperative assistant process of the tablet computer sends the coordinate information and device capability information of the tablet computer to the mobile phone. The two sides thus complete the exchange of device information.
Step A4: the cooperative assistant process of the mobile phone notifies the interconnection service process to start.
In one example, the cooperative assistant process of the mobile phone may send a start notification to the interconnection service process of the mobile phone to pull up the interconnection service process of the mobile phone.
In addition, after the device information is exchanged, the cooperative assistant process of the tablet computer may also send a start notification to the interconnection service process of the tablet computer, so as to pull up the interconnection service process of the tablet computer.
Step A5: and carrying out initialization configuration on the interconnection service process.
In implementation, after the interconnection service process of the mobile phone receives the start notification, it begins the initialization configuration operation so as to lay the foundation for subsequent camera virtualization services.
In one example, the initialization configuration of the interconnection service process includes collecting the device capability information of the tablet computer and of the mobile phone. As described above, since the cooperative assistant processes of the mobile phone and the tablet computer have already exchanged device information, the cooperative assistant process of the mobile phone can provide the device capability information of both devices to the interconnection service process.
In one example, the initialization configuration of the interconnection service process further includes pre-enabling the audio and telephony modules so that they are operational, providing the base conditions for implementing the video call. In addition, a data transmission channel for the interconnection service is established between the interconnection service process of the mobile phone and the interconnection service process of the tablet computer.
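The initialization steps just described can be sketched as follows (a Python sketch; class and attribute names are illustrative assumptions, not the actual process implementation):

```python
class InterconnectionService:
    """Sketch of the interconnection service process's initialization configuration."""

    def __init__(self, assistant):
        self.assistant = assistant        # stands in for the cooperative assistant process
        self.capabilities = None
        self.audio_enabled = False
        self.telephony_enabled = False
        self.transport_channel = None

    def initialize(self):
        # 1. Collect the device capability info already exchanged by the assistants.
        self.capabilities = self.assistant.get_capabilities()
        # 2. Pre-enable the audio and telephony modules for the video call.
        self.audio_enabled = self.telephony_enabled = True
        # 3. Open the data transmission channel used for the interconnection service.
        self.transport_channel = "phone<->tablet"
        return self
```

The channel opened in step 3 is the one later used to forward camera-open notifications and to carry the video stream from the tablet computer.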
The above describes the process of establishing a cooperative connection between a mobile phone and a tablet computer, taking the interaction of multiple processes in the mobile phone as an example; this process is an optional embodiment of the present application. Based on the established cooperative connection, the binding process between the processes during a video call is introduced next.
Step A6: and the instant messaging software sends a camera opening instruction to the camera process.
In one example, the user triggers the "video call" option in the instant messaging software. The instant messaging software detects the trigger operation, which indicates that the user requests a video call; in response, the instant messaging software of the mobile phone initiates a video call request and asks the camera process to open the camera, for example by sending a camera-open instruction to the camera process.
Step A7: the camera process controls the camera driver to open the camera.
After the camera process receives the camera-open instruction, it controls the camera driver to load the camera. Illustratively, the camera process sends an open-camera instruction to the camera driver, and after receiving the instruction the camera driver loads the camera, completing the open operation.
Step A8: after the camera is opened, the camera process pulls up the virtual camera process.
In one embodiment, for a camera process, when a camera open event occurs, the camera process notifies the virtual camera process to start, regardless of whether the application requesting the opening of the camera is instant messaging software. Correspondingly, the virtual camera process starts initialization after receiving the notification of the camera process, and the main purpose of the initialization is to establish a binding relationship with the camera process so as to perform data interaction with the camera process subsequently through the binding relationship.
Step A9: and establishing a binding relationship between the virtual camera process and the virtual camera adaptation process.
In one example, referring to fig. 14, establishing the binding relationship between the virtual camera process and the virtual camera adaptation process includes: at 141, the virtual camera process requests the service handle of the TRANSLATOR service from the virtual camera adaptation process; at 142, the virtual camera process registers a second callback function with the virtual camera adaptation process; at 143, the virtual camera process can acquire the TRANSLATOR service. That is, the virtual camera process can subsequently acquire the TRANSLATOR service using the service handle and thereby send data to the virtual camera adaptation process, while the virtual camera adaptation process can return data to the virtual camera process by invoking the second callback function.
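The handle-plus-callback binding of fig. 14 can be sketched as follows (a Python sketch; class names and the service behavior are illustrative assumptions standing in for the inter-process mechanism):

```python
class VirtualCameraAdaptation:
    """Stands in for the virtual camera adaptation process."""

    def __init__(self):
        # The TRANSLATOR service is modeled as a callable handle.
        self._services = {"TRANSLATOR": lambda data: f"translated:{data}"}
        self._second_callback = None

    def get_service_handle(self, name):
        # 141: hand out the service handle for the requested service.
        return self._services[name]

    def register_callback(self, callback):
        # 142: remember the second callback function.
        self._second_callback = callback

    def push(self, data):
        # Later: return data to the virtual camera process via the callback.
        if self._second_callback is not None:
            self._second_callback(data)

class VirtualCameraProcess:
    def __init__(self, adaptation):
        self.received = []
        self.translator = adaptation.get_service_handle("TRANSLATOR")  # 141
        adaptation.register_callback(self.received.append)             # 142

    def send(self, data):
        # 143: use the handle to send data through the TRANSLATOR service.
        return self.translator(data)
```

Data thus flows downward through the service handle and upward through the registered callback, which is exactly the two-way pattern fig. 16 repeats for the CHANNEL service.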
Step A10: and under the condition that a camera opening event occurs, if the state switch is in an opening state, the cooperative assistant process notifies the interconnection service process.
In one example, there is a first listening process in the mobile phone, and the first listening process is configured to send a camera open notification to a process registered in the listening list after listening for a camera open event. Therefore, the cooperative helper process may register in the listening list in advance. In this way, when the first listening process monitors the camera open event, it sends a camera open notification to the cooperative assistant process, so that the cooperative assistant process can know that the camera open event exists.
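The listening-list mechanism can be sketched as follows (a Python sketch; the class name and callback shape are illustrative assumptions):

```python
class FirstListeningProcess:
    """Sketch of the first listening process and its listening list."""

    def __init__(self):
        self._listening_list = []

    def register(self, callback):
        # Processes (e.g. the cooperative assistant process) register in advance.
        self._listening_list.append(callback)

    def on_camera_open_event(self):
        # When a camera-open event is monitored, notify every registered process.
        for callback in self._listening_list:
            callback("camera_open")
```

Because the cooperative assistant process registered ahead of time, it receives the notification the moment the camera-open event occurs.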
At this time, if the cooperative assistant process determines that the current state switch is in the on state, it determines that the mobile phone uses the video stream captured by the camera of the tablet computer; therefore, the cooperative assistant process notifies the interconnection service process that the camera virtualization service has been started.
Step A11: and establishing a binding relationship between the interconnection service process and the virtual camera adaptation process.
As can be seen from the foregoing, the virtual camera adaptation process is started after the mobile phone is powered on. In one example, referring to fig. 15, after the virtual camera adaptation process starts, the CHANNEL service 150 and the TRANSLATOR service 151 may be object-bound through the intermediate CSwitch instance 152 to facilitate bidirectional communication; that is, a channel for data transmission is established between the CHANNEL service 150 and the TRANSLATOR service 151 after the virtual camera adaptation process starts.
For the interconnection service process, when it determines that a camera-open event has occurred, it actively establishes a binding relationship with the started virtual camera adaptation process. In one example, referring to fig. 16, establishing the binding relationship between the interconnection service process and the virtual camera adaptation process includes: at 161, the interconnection service process obtains the service handle of the CHANNEL service 150 from the virtual camera adaptation process; at 162, the interconnection service process registers a first callback function with the virtual camera adaptation process; at 163, the interconnection service process can obtain the CHANNEL service 150. That is, the interconnection service process can subsequently acquire the CHANNEL service 150 using the service handle and thereby send data to the virtual camera adaptation process; in addition, the virtual camera adaptation process returns data to the interconnection service process by invoking the first callback function.
In addition, after receiving the camera-open notification from the cooperative assistant process, the interconnection service process also sends the camera-open notification to the interconnection service process of the tablet computer through the pre-established data transmission channel for the interconnection service. Correspondingly, after the interconnection service process of the tablet computer receives the camera-open notification, it calls the camera driver of the tablet computer to open the tablet computer's camera.
After the binding between the upper end and the lower end is completed, that is, after the binding relationship between the interconnection service process and the virtual camera adaptation process and the binding relationship between the virtual camera process and the virtual camera adaptation process have both been established, the virtual camera adaptation process returns a binding-success notification to the interconnection service process. After receiving the binding-success notification, the interconnection service process sends it to the interconnection service process of the tablet computer. After receiving the binding-success notification, the interconnection service process of the tablet computer calls the camera driver to continuously capture the video stream and sends it to the interconnection service process of the mobile phone through the data transmission channel for the interconnection service established between the interconnection service process of the tablet computer and that of the mobile phone.
Of course, the above description takes the state switch in the on state as an example. In another embodiment, if the state switch is in the off state, the user only needs the camera of the mobile phone to capture the video frames. In this case, the cooperative assistant process does not notify the interconnection service process after receiving the camera-open notification, that is, the interconnection service process is not bound to the virtual camera adaptation process, and the video pictures in the video call are acquired by the camera process from the local camera.
In addition, as an example of the present application, when the mobile phone uses the tablet computer to capture the video stream during a video call, a status label may be set in each layer after the camera virtualization operation is completed. The status label is then in the virtual state, which means that the video stream transmitted through each layer in the mobile phone is not captured by the mobile phone's own camera but is captured cooperatively by the tablet computer. When the local camera is used to capture video frames, the status label can be set to the physical state, which means that the video stream used in the video call is captured by the camera of the mobile phone.
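The per-layer status label can be sketched as follows (a Python sketch; the layer names and label values are illustrative assumptions):

```python
VIRTUAL_STATE = "virtual"    # frames captured cooperatively by the tablet computer
PHYSICAL_STATE = "physical"  # frames captured by the phone's own camera

def tag_layers(layers, use_remote_camera: bool) -> dict:
    """Set the status label in each layer once camera virtualization completes
    (virtual state) or when the local camera is used (physical state)."""
    label = VIRTUAL_STATE if use_remote_camera else PHYSICAL_STATE
    return {layer: label for layer in layers}
```

Any layer can then tell at a glance where the frames it is relaying originated.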
In the embodiment of the present application, if the video stream captured by the camera of the tablet computer is to be used during a video call, a channel for data transmission among the interconnection service process, the virtual camera adaptation process, and the virtual camera process is opened first to realize the camera virtualization operation, so that video frames can be transmitted from the interconnection service process to the camera process of the extension layer and displayed.
The above embodiment is described taking an example in which the virtual camera adaptation process is loaded at boot. In another embodiment, the virtual camera adaptation process may instead be loaded after the camera virtualization service is turned on. For example, referring to fig. 17, fig. 17 is a schematic flowchart illustrating channel establishment between the upper layer and the lower layer according to another exemplary embodiment. By way of example and not limitation, the method is applied to a mobile phone and is implemented by the interaction of the multiple processes shown in fig. 5. Assume that a cooperative connection has been established between the mobile phone and the tablet computer. The method may include some or all of the following:
For steps B1 to B7, see steps A1 to A7 in the above embodiment.
Step B8: when a camera-open event occurs, if the video collaboration switch is in the on state, the cooperative assistant process notifies the interconnection service process and the virtual camera process.
By way of example and not limitation, a first listening process exists in the handset, and the first listening process is used for sending a camera open notification to processes registered in a listening list after a camera open event is listened to. Therefore, the cooperative helper process may register in the listening list in advance. In this way, when the first listening process monitors the camera open event, it sends a camera open notification to the cooperative assistant process, so that the cooperative assistant process can know that the camera open event exists.
At this time, if the cooperative assistant process determines that the current video collaboration switch is in the on state, it determines that the camera virtualization service has been turned on, that is, the mobile phone needs to capture the video stream using the camera of the tablet computer. The cooperative assistant process therefore notifies the interconnection service process and the virtual camera process. Upon receiving the notification, the interconnection service process proceeds to step B9 below, and the virtual camera process proceeds to step B12 below.
Step B9: the interconnection service process pulls up the virtual camera adaptation process.
After the interconnection service process receives the camera-open notification, it sends a loading notification to the virtual camera adaptation process so that the virtual camera adaptation process starts loading.
In addition, after receiving the camera-open notification from the cooperative assistant process, the interconnection service process also sends the camera-open notification to the interconnection service process of the tablet computer through the pre-established data transmission channel for the interconnection service. Correspondingly, after the interconnection service process of the tablet computer receives the camera-open notification, it calls the camera driver of the tablet computer to open the tablet computer's camera.
Step B10: the virtual camera adaptation process is initialized.
In one example, referring to fig. 15, after the virtual camera adaptation process starts, the CHANNEL service 150 and the TRANSLATOR service 151 are object-bound through the intermediate CSwitch instance 152 to facilitate bidirectional communication, that is, a channel for data transmission is established between the CHANNEL service 150 and the TRANSLATOR service 151.
Step B11: and after the initialization of the virtual camera adaptation process is finished, establishing a binding relationship between the interconnection service process and the virtual camera adaptation process.
The specific implementation of the binding relationship between the interconnection service process and the virtual camera adaptation process can be seen in fig. 16.
Step B12: the virtual camera process updates the target state attribute to a first state value.
The target state attribute is used to indicate whether the camera virtualization service has been turned on. When the target state attribute is the first state value, it indicates that the camera virtualization service has been started; when it is the second state value, it indicates that the camera virtualization service has not been started. As an example of the present application, the target state attribute exists among the system attributes and is a global attribute. Illustratively, the first state value is 1 and the second state value is 0.
That is, after receiving the notification of the cooperative assistant process, the virtual camera process updates the target state attribute to the first state value to indicate that the camera virtualization service is currently started in the video call. Illustratively, the virtual camera process may modify the state value of the target state attribute through the CHANNEL service 150.
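The global flag can be sketched as follows (a Python sketch; the dictionary stands in for the system attribute store, and the property key is a hypothetical name):

```python
_system_properties = {}                   # stand-in for the global system attributes
TARGET_STATE_KEY = "vcam.service.state"   # hypothetical property key
FIRST_STATE, SECOND_STATE = "1", "0"      # service on / service off

def property_set(key, value):
    _system_properties[key] = value

def property_get(key, default=SECOND_STATE):
    return _system_properties.get(key, default)

def on_virtualization_notification():
    # Step B12: the virtual camera process flips the flag to the first state value.
    property_set(TARGET_STATE_KEY, FIRST_STATE)
```

Because the attribute is global, any process that can read system attributes (here, the camera process) can observe the change without a direct channel to the writer.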
Step B13: and when the camera process senses that the target state attribute is the first state value, pulling up the virtual camera process.
In one embodiment, the camera process and the virtual camera process can be connected through a virtual camera interface layer, which includes a virtual camera service awareness module. After the camera process opens the local camera, it can notify the virtual camera service awareness module to start the awareness function; correspondingly, the module scans the state value of the target state attribute. In one example, the virtual camera service awareness module scans the state value of the target state attribute once every period duration threshold; for example, the scanning operation may be implemented by calling a system interface, such as a property_get interface, to obtain the state value of the target state attribute. In this way, the camera process can perceive the state value of the target state attribute through the virtual camera service awareness module. The period duration threshold may be set according to actual requirements; for example, it may be 100 milliseconds.
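The awareness module's periodic scan can be sketched as a simple polling loop (a Python sketch; the function name, period, and timeout are illustrative, and `read_state` stands in for the property_get call):

```python
import time

def wait_for_state(read_state, target="1", period_s=0.1, timeout_s=3.0):
    """Poll the target state attribute once per period until it reads the
    target value; return True if observed, False on timeout."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_state() == target:
            return True
        time.sleep(period_s)
    return False
```

With `period_s=0.1` this matches the 100-millisecond example above; the camera process pulls up the virtual camera process as soon as the loop returns True.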
In one case, if the camera process senses that the target state attribute is the first state value, it indicates that the mobile phone needs to use the virtual camera process, and therefore, the camera process sends a load instruction to the virtual camera process through the virtual camera interface layer to pull up the virtual camera process.
In another case, if the camera process perceives that the target state attribute is the second state value, the upper layer has not triggered the camera virtualization service, that is, the video collaboration switch is in the off state. Specifically, when the video collaboration switch is in the off state, the cooperative assistant process does not notify the virtual camera process to update the target state attribute to the first state value after receiving the camera-open notification, so the state value perceived by the camera process is the second state value. In this case, the video frames in the video call are acquired by the camera process from the local camera.
Step B14: the virtual camera process begins to initialize.
After receiving the loading instruction from the camera process, the virtual camera process starts the initialization operation, which mainly establishes a binding relationship with the camera process through the virtual camera interface layer. For example, the camera process obtains the services of the virtual camera process to establish the binding relationship, so that the camera process can subsequently exchange data with the virtual camera process through it.
Step B15: and establishing a binding relationship between the virtual camera process and the virtual camera adaptation process.
That is, after the virtual camera process starts, in addition to establishing a binding relationship with the camera process, it also establishes a binding relationship with the virtual camera adaptation process. As an example of the present application, in order to determine whether the virtual camera adaptation process has completed initialization, the virtual camera process may start a scanning thread after starting, so as to scan the initialization status of the virtual camera adaptation process. If the scan determines that the virtual camera adaptation process has not finished initializing, the virtual camera process enters a waiting state; once it scans that initialization has finished, the virtual camera process actively requests the virtual camera adaptation process to establish the binding relationship.
In one example, an implementation of the virtual camera process to establish a binding relationship with the virtual camera adaptation process can be seen in fig. 14.
After the binding between the upper layer and the lower layer is completed, that is, after the binding relationship between the interconnection service process and the virtual camera adaptation process and the binding relationship between the virtual camera process and the virtual camera adaptation process have been established, the virtual camera adaptation process returns an enabling-success notification to the interconnection service process, indicating that the channel for data transmission between the upper layer and the lower layer has been established. After receiving the enabling-success notification, the interconnection service process sends it to the interconnection service process of the tablet computer. After receiving the enabling-success notification, the interconnection service process of the tablet computer calls the camera driver to continuously capture the video stream and sends it to the interconnection service process of the mobile phone through the data transmission channel for the interconnection service established between the interconnection service process of the tablet computer and that of the mobile phone. Correspondingly, the interconnection service process of the mobile phone receives and caches the video stream sent by the tablet computer.
In another embodiment, the virtual camera process may never initiate the binding request to the virtual camera adaptation process, for example because it never scans that the virtual camera adaptation process has finished initializing. For the virtual camera adaptation process, if no binding request from the virtual camera process is received within a preset duration, an enabling-failure notification is returned to the interconnection service process. After receiving the enabling-failure notification, the interconnection service process sends it to the interconnection service process of the tablet computer, which then calls the camera driver to close the camera of the tablet computer.
The preset duration can be set according to actual requirements. For example, the preset time period may be 3 seconds.
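The adaptation process's guard against a missing binding request can be sketched as follows (a Python sketch; the function name and return strings are illustrative, and `bind_requested` stands in for checking whether the virtual camera process has asked to bind):

```python
import time

def await_binding_request(bind_requested, preset_s=3.0, poll_s=0.1):
    """Wait up to the preset duration for a binding request from the virtual
    camera process; report failure on timeout so the interconnection services
    can tear the session down and close the tablet's camera."""
    deadline = time.monotonic() + preset_s
    while time.monotonic() < deadline:
        if bind_requested():
            return "enable_success"
        time.sleep(poll_s)
    return "enable_failure"
```

On `"enable_failure"`, the notification propagates to the tablet computer, which stops capturing rather than streaming into a channel that was never completed.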
In one embodiment, if the user switches the video collaboration switch from the on state to the off state during the video call, the user only needs the camera of the mobile phone to capture the video frames, and the mobile phone temporarily no longer needs the virtual camera process, the virtual camera adaptation process, or the interconnection service process. At this point, the cooperative assistant process notifies the virtual camera process that the camera virtualization service has been closed. Accordingly, the virtual camera process receives the notification and updates the target state attribute from the first state value to the second state value, for example through the CHANNEL service 150.
In one embodiment, if the user ends the video call during the video call, the instant messaging application requests the camera process to turn off the camera, and the camera process calls the camera driver to turn off the camera. After the camera is closed, the camera process sends a camera closing instruction to the virtual camera process, where the camera closing instruction is used for instructing the virtual camera process to terminate receiving video frames, and the camera process releases the service of the virtual camera process, so that the binding relationship between the camera process and the virtual camera process is released. For the virtual camera process, in the case of receiving the camera closing instruction, the target state attribute is updated from the first state value to the second state value, which indicates to the interconnection service process that the video stream sent by the second electronic device no longer needs to be received. After that, the virtual camera process notifies the virtual camera adaptation process that the camera virtualization service is closed, and releases the TRANSLATOR service 151, thereby releasing the binding relationship with the virtual camera adaptation process. For the virtual camera adaptation process, after receiving the notification that the camera virtualization service is closed, the interconnection service process is notified. Accordingly, the interconnection service process releases the CHANNEL service 150 of the virtual camera adaptation process, thereby releasing the binding relationship with the virtual camera adaptation process. In this manner, the virtual camera adaptation process is restored to an unloaded state.
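The release sequence described in this embodiment can be summarized as an ordered checklist. The function name and step strings below are a paraphrase of the description above, not actual code from the application.

```python
def teardown_on_call_end(log):
    """Append the release steps, in the order described above, to `log`.

    Illustrative only: the ordering is taken from the embodiment text,
    while the step strings themselves are assumptions.
    """
    log.append("camera_process: call camera driver to close the camera")
    log.append("camera_process -> virtual_camera_process: close instruction")
    log.append("camera_process: release virtual camera service (unbind)")
    log.append("virtual_camera_process: target state attribute -> second value")
    log.append("virtual_camera_process -> adaptation_process: virtualization service closed")
    log.append("virtual_camera_process: release TRANSLATOR service 151 (unbind)")
    log.append("adaptation_process -> interconnection_process: virtualization service closed")
    log.append("interconnection_process: release CHANNEL service 150 (unbind)")
    return log
```

After the last step, the virtual camera adaptation process is back in the unloaded state.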
It is worth mentioning that, because the virtual camera adaptation process and the virtual camera process both occupy a certain amount of memory once loaded, in the embodiment of the present application the two processes are loaded only when used and are not loaded when not used. Therefore, the two processes do not occupy a large amount of memory in a non-camera-virtualization-service scene, the degree of decoupling between the processes is high, and a certain amount of memory can be saved.
In one example, if the virtual camera process is closed but the camera process is not closed, the camera process turns on the virtual camera service awareness module again to detect whether the camera virtualization service is started again.
In the embodiment of the application, if the video stream acquired by the camera of the tablet computer needs to be used during a video call, a channel for data transmission among the virtual camera process, the virtual camera adaptation process, and the interconnection service process is opened first to realize the camera virtualization operation, so that a video frame can be transmitted from the interconnection service process to the camera process of the extension layer and displayed conveniently.
The above embodiments describe the establishment process of the channel; this process only needs to be completed before data transmission, and all of the above are optional implementations. On the basis of the above embodiments, the flow of data transmission is described in detail below. Please refer to fig. 18. Fig. 18 is a flowchart illustrating a method of transmitting data according to an exemplary embodiment. By way of example and not limitation, the method is applied to a mobile phone and is described by taking the interaction of multiple processes in the mobile phone as an example. The method may include the following:
step C1: the virtual camera process converts the first instruction according to a first data format defined by the virtual camera adaptation process.
As one example of the present application, the first instruction may include, but is not limited to, an image acquisition instruction, a configuration instruction, a close instruction, and a flush instruction. The image acquisition instruction is used for requesting to acquire a video frame, and the configuration instruction is used for requesting to configure video parameters, such as resolution, frame rate, and the like. The close instruction is used to instruct the camera to be closed. The flush instruction is used to indicate that instructions that have been cached are to be flushed.
As an example of the present application, the first instruction comes from the instant messaging software. Illustratively, the first instruction is an image acquisition instruction: for example, during a video call made by the mobile phone through the instant messaging software, the instant messaging software requests the camera process to capture a video frame, and the camera process sends the image acquisition instruction to the virtual camera process. The virtual camera process converts the image acquisition instruction according to the generic data format defined by the virtual camera adaptation process to obtain data supported by the TRANSLATOR service 151.
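The conversion of step C1 can be illustrated with a minimal sketch. The application does not specify the fields of the generic data format, so the dictionary shape below is an assumption for illustration only.

```python
# The four first-instruction types described above.
FIRST_INSTRUCTIONS = {"image_acquisition", "configuration", "close", "flush"}


def to_generic_format(instruction_type, payload=None):
    """Convert a first instruction into the generic data format defined
    by the virtual camera adaptation process.

    Illustrative sketch: the field names ("type", "payload") are
    assumptions, not the actual format used by the TRANSLATOR service.
    """
    if instruction_type not in FIRST_INSTRUCTIONS:
        raise ValueError(f"unknown instruction: {instruction_type}")
    return {"type": instruction_type, "payload": payload or {}}
```

For instance, a configuration instruction carrying resolution and frame rate would be wrapped as `to_generic_format("configuration", {"resolution": "1280x720", "frame_rate": 30})` before being handed to the TRANSLATOR service.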
In one example, the TRANSLATOR service 151 may be obtained by secondary development based on the Android hardware abstraction layer interface definition language (HIDL) interface, and has good compatibility and extensibility.
And step C2: and the virtual camera process sends the converted first instruction to the virtual camera adaptation process.
In one embodiment, referring to fig. 19, the virtual camera process acquires a service handle of the TRANSLATOR service 151, acquires the TRANSLATOR service 151 according to the service handle, and then sends the converted first instruction to the virtual camera adaptation process through the TRANSLATOR service 151.
In one example, the TRANSLATOR service 151 defines several request access interfaces, and the virtual camera process may call the corresponding request access interface according to the type of the first instruction (such as an image acquisition instruction or a configuration instruction) to send the converted first instruction to the virtual camera adaptation process through the called request access interface.
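The per-type dispatch to request access interfaces can be sketched as follows. The interface names `request_capture` and `request_configure` are hypothetical stand-ins, since the application does not name the request access interfaces.

```python
class TranslatorServiceSketch:
    """Illustrative sketch of the TRANSLATOR service's request access
    interfaces; method names are assumptions."""

    def __init__(self, adaptation_process):
        self.adaptation = adaptation_process

    def request_capture(self, msg):    # for image acquisition instructions
        return self.adaptation.handle(msg)

    def request_configure(self, msg):  # for configuration instructions
        return self.adaptation.handle(msg)

    def dispatch(self, msg):
        # Pick the request access interface matching the instruction type.
        routes = {
            "image_acquisition": self.request_capture,
            "configuration": self.request_configure,
        }
        return routes[msg["type"]](msg)
```

In the real system each route would be a separate HIDL method on the service, and `handle` stands in for the adaptation process's receive path.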
And C3: and the virtual camera adaptation process carries out conversion processing on the converted first instruction.
In one example, after receiving the converted first instruction, the virtual camera adaptation process performs format conversion or logic conversion on the first instruction to convert it into a data format that can be recognized and processed by the interconnection service process.
And C4: and the virtual camera adaptation process sends the first instruction obtained after the conversion processing to the interconnection service process.
As described above, in the binding process, the interconnection service process registers the first callback function with the virtual camera adaptation process. Therefore, when the virtual camera adaptation process needs to send data to the interconnection service process, a callback mode can be adopted, and the first instruction obtained after conversion processing is sent to the interconnection service process through the first callback function.
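The callback-based delivery of step C4 can be sketched as follows; the method names are illustrative assumptions, and in the real system the callback would be a HIDL callback object registered during binding.

```python
class AdaptationDeliverySketch:
    """Illustrative sketch: the interconnection service process registers
    a first callback during binding, and the adaptation process later
    invokes it to push converted instructions upward."""

    def __init__(self):
        self._first_callback = None

    def register_first_callback(self, cb):
        # Done once, during the binding process.
        self._first_callback = cb

    def deliver_to_interconnection(self, instruction):
        if self._first_callback is None:
            raise RuntimeError("interconnection service process not bound")
        return self._first_callback(instruction)
```

Because the callback is registered ahead of time, the adaptation process never needs to look the interconnection service up per instruction; it simply invokes the stored callback.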
And C5: and the interconnection service process executes the operation corresponding to the first instruction.
Illustratively, assuming that the first instruction is an image acquisition instruction, the interconnection service process acquires, from the cached video stream, the video frame corresponding to the picture requested by the first instruction. If the first instruction is a configuration instruction, the interconnection service process configures the video frame according to the configuration instruction.
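Step C5 can be sketched with a bounded frame cache. The cache size, class name, and field names are assumptions for illustration; the application only states that the video stream is received and cached.

```python
from collections import deque


class InterconnectionServiceSketch:
    """Illustrative sketch: cache the video stream sent by the second
    electronic device and serve frames on image acquisition instructions."""

    def __init__(self, max_frames=30):
        self._frames = deque(maxlen=max_frames)  # bounded frame cache
        self.config = {}

    def on_stream_frame(self, frame):
        # Called as the video stream from the tablet computer arrives.
        self._frames.append(frame)

    def execute(self, instruction):
        # Perform the operation corresponding to the first instruction.
        if instruction["type"] == "image_acquisition":
            return self._frames.popleft() if self._frames else None
        if instruction["type"] == "configuration":
            self.config.update(instruction["payload"])
            return None
```

The frame returned by `execute` plays the role of the response data fed back in steps C6 to C10 below.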
After the operation corresponding to the first instruction is executed, response data (such as a video frame) is usually generated, and the interconnection service process usually needs to feed back the response data to the virtual camera process, so in one example, the mobile phone further performs the following operations.
And C6: and the interconnection service process converts the response data according to the second data format.
In one example, the response data is the video frame acquired in step C5.
In an implementation, the interconnection service process converts the response data into a data format supported by the CHANNEL service 150 so that the response data can be transmitted to the virtual camera process through the virtual camera adaptation process.
In one example, the CHANNEL service 150 may be obtained by secondary development based on the Android HIDL interface, and has good compatibility and scalability.
It should be noted that the interconnection service process of the mobile phone may not perform step C6 immediately after performing the operation corresponding to the first instruction; that is, it may not feed back the response data immediately, but may feed back the response data after a certain time interval, which may be set according to actual requirements.
Step C7: and the interconnection service process sends the converted response data to the virtual camera adaptation process.
Illustratively, the interconnection service process acquires a service handle of the CHANNEL service 150, then acquires the CHANNEL service 150 based on the service handle, and sends the converted response data to the virtual camera adaptation process through the CHANNEL service 150.
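The handle-then-service lookup used in steps C2 and C7 can be sketched as follows. Modeling the handle as the service name is an assumption; in the real system it would be an HIDL service binder handle.

```python
class ServiceRegistrySketch:
    """Illustrative sketch: a process first obtains a service handle,
    then resolves it to the service object before sending data."""

    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service
        return name  # the handle, modeled here as the service name

    def get_service(self, handle):
        return self._services[handle]


class ChannelServiceSketch:
    """Stand-in for the CHANNEL service 150 (illustrative)."""

    def __init__(self):
        self.sent = []

    def send(self, data):
        self.sent.append(data)
```

The interconnection service process would thus obtain the handle of the CHANNEL service 150, resolve it, and call `send` with the converted response data.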
And C8: and the virtual camera adaptation process carries out conversion processing on the received response data.
The virtual camera adaptation process converts the response data into a data format that the virtual camera process can recognize and process.
Step C9: and sending the response data after the conversion processing to the virtual camera process.
As described above, in the binding process, the virtual camera process registers the second callback function with the virtual camera adaptation process, so that the virtual camera adaptation process can send the response data to the virtual camera process by using the second callback function registered in advance by the virtual camera process in a callback manner.
Step C10: the virtual camera process processes the response data.
In one example, taking the response data as a video frame as an example, the virtual camera process performs format conversion processing on the video frame to obtain data that can be processed by the camera process. And then, the virtual camera process sends the processed data to the camera process, the camera process fills the data, and sends the filled data to an application layer, for example, the camera process sends the filled data to instant messaging software of the application layer, thereby realizing the video call service.
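Step C10 can be sketched as two stages: the virtual camera process converts the frame, and the camera process fills it and passes it upward. The record fields are illustrative stand-ins; real code would perform pixel-format conversion and buffer filling at the HAL layer.

```python
def process_response(video_frame):
    """Sketch of the virtual camera process's conversion: wrap the frame
    as data the camera process can handle (field names are assumptions)."""
    return {"format": "camera_hal", "data": video_frame}


def camera_process_fill(converted):
    """Sketch of the camera process filling the data before sending it
    to the application layer (modeled as tagging the record)."""
    filled = dict(converted)
    filled["filled"] = True
    return filled
```

Chaining the two reproduces the path from response data to the instant messaging software of the application layer.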
By way of example and not limitation, in the case that the video stream is switched to be captured by the camera of the tablet computer, the local camera process may turn off the camera of the mobile phone, for example by controlling the camera driver, so as to avoid tying up camera resources. Further, if the video stream is switched back to the camera of the mobile phone, for example because the user turns off the video collaboration switch, the collaboration helper process may notify the local camera process that the camera virtualization service has been closed. If the video call is still in progress, that is, the local camera process has not received an instruction of the instant messaging application to turn off the camera, the local camera process turns on the local camera again through the camera driver, so as to capture the video stream through the local camera.
As an example of the present application, the virtual camera process, the virtual camera adaptation process, and the interconnection service process each control their own lifecycle and perform their own exception handling. For example, referring to fig. 2 (b), the chip adaptation layer module includes a module for lifecycle management and a module for exception handling. When any one of the three processes is abnormal, the other process bound with that process actively releases the binding relationship with the abnormal process. Illustratively, a second monitoring process exists in the mobile phone, and the second monitoring process is used for monitoring the running state of each process. When the second monitoring process monitors that the virtual camera process is abnormal, it notifies the virtual camera adaptation process that has a binding relationship with the virtual camera process, so the virtual camera adaptation process receives the exception notification. After receiving the exception notification, the virtual camera adaptation process releases the second callback function registered by the virtual camera process, so as to release the binding relationship with the virtual camera process. In addition, the virtual camera adaptation process modifies the state label to the physical state.
Similarly, when the virtual camera adaptation process is abnormal, the second monitoring process notifies the virtual camera process. And the virtual camera process actively releases the binding relation with the virtual camera adaptation process. For another example, when the virtual camera adaptation process is abnormal, the interconnection service process actively releases the binding relationship with the virtual camera adaptation process. And when the interconnection service process is abnormal, the virtual camera adaptation process releases the binding relation with the interconnection service process.
For example, as shown in fig. 20, in 201, a binding relationship is established between the virtual camera process and the virtual camera adaptation process, and a binding relationship is established between the virtual camera adaptation process and the interconnection service process. In 202, an exception occurs in the virtual camera process. In 203, the mobile phone notifies the virtual camera adaptation process through the second monitoring process, so that the virtual camera adaptation process releases the binding relationship with the virtual camera process, such as releasing the callback and resetting the status tag. In 204, an exception occurs in the virtual camera adaptation process. In 205, the mobile phone notifies the interconnection service process through the second monitoring process, so that the interconnection service process releases the binding relationship with the virtual camera adaptation process, such as releasing the CHANNEL service 150 and resetting the status tag.
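The monitoring flow of fig. 20 can be sketched as a watcher that, on an exception, notifies the crashed process's bound peer so the peer releases the binding. The class and method names are illustrative assumptions.

```python
class SecondMonitoringProcessSketch:
    """Illustrative sketch of the second monitoring process: track the
    binding relationships and, on an exception, tell the bound peer to
    release the binding (callback, service, status tag)."""

    def __init__(self):
        self._peers = {}  # process name -> bound peer

    def bind(self, a, b):
        self._peers[a] = b
        self._peers[b] = a

    def on_exception(self, crashed, actions):
        # Notify the peer (if any) and record the release action.
        peer = self._peers.pop(crashed, None)
        if peer is not None:
            self._peers.pop(peer, None)
            actions.append(f"{peer}: release binding with {crashed}")
        return actions
```

Because each process only releases its own side of the binding, an exception in one process never cascades into its peer, matching the decoupling property described below.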
It should be noted that each process controls its own lifecycle and exception handling, so as to prevent one process from being affected by an exception in another process. For example, when the virtual camera process is abnormal, the virtual camera adaptation process is not affected, and the virtual camera adaptation process restores its initial state after detecting the exception. When the virtual camera adaptation process is abnormal, the virtual camera process is not affected; the virtual camera service provided by the virtual camera process is closed, and the virtual camera process is restored to the physical state.
In the embodiment of the application, the virtual camera process does not need to pay attention to upper-layer logic, only needs to convert data according to a general data format defined by the virtual camera adaptation process, then delivers the data to the virtual camera adaptation process, and the virtual camera adaptation process performs conversion processing according to upper-layer requirements and then sends the data to the interconnection service process. Similarly, for the interconnection service process, the underlying logic is not required to be concerned, the data is converted according to a general data format defined by the virtual camera adaptation process and then is delivered to the virtual camera adaptation process, and the virtual camera adaptation process performs conversion processing according to the underlying requirements and then sends the converted data to the virtual camera process. Therefore, a large amount of code adaptation work is avoided, and development cost can be reduced.
In addition, the method provided by the embodiment of the application is applicable to executing the camera virtualization operation on the HAL layer of various types of chips. If another type of chip is added, only the data interface of the virtual camera module needs to be processed on the HAL layer: the data is converted into the generic format supported by the virtual camera adaptation process, and the acquisition of the TRANSLATOR service and the callback registration are completed; the virtual camera adaptation module does not need to be additionally modified.
Referring to fig. 21, fig. 21 is a flowchart illustrating a method for transmitting data according to another exemplary embodiment. By way of example and not limitation, the method is applied to a first electronic device, which is connected to a second electronic device. The first electronic device may be the mobile phone in the above embodiment, and the second electronic device may be the tablet computer in the above embodiment. The method may include the following.
Step 2101: in the process of video call, if the switching condition of the camera using the second electronic device is met, the first instruction is converted through the virtual camera process according to the first instruction of the first application and according to a first data format defined by the virtual camera adaptation process, and the virtual camera adaptation process is used for converting data transmitted between the virtual camera process and the interconnection service process.
The first application can be used for video calls. For example, the first application is the instant messaging software in the above embodiment.
In one example, when the camera process receives a camera opening instruction of the first application, the camera process is controlled to perform the operation of opening the camera. In the case that the camera is opened successfully, the camera process is controlled to send a camera opened notification to the virtual camera process. The virtual camera process is controlled to establish a binding relationship with the camera process. And the virtual camera process is controlled to request establishment of a binding relationship for transmitting data with the virtual camera adaptation process, wherein the virtual camera adaptation process is started after the first electronic device is started.
In one example, controlling the virtual camera process to request establishment of a binding relationship with the virtual camera adaptation process for transmitting data includes: and controlling the virtual camera process to request to acquire a first service handle of the virtual camera adaptation process, wherein a first service corresponding to the first service handle is used for the virtual camera process to transmit data to the virtual camera adaptation process. And controlling the virtual camera process to register a second callback function in the virtual camera adaptation process, wherein the second callback function is used for the virtual camera adaptation process to transmit data to the virtual camera process.
In an example, when the camera process receives a camera opening instruction of a first application, after the camera process is controlled to execute a camera opening operation, and the camera is successfully opened, the interconnection service process is controlled to request to acquire a second service handle of the virtual camera adaptation process, where a second service corresponding to the second service handle is used for the interconnection service process to transmit data to the virtual camera adaptation process. And controlling the interconnection service process to register a first callback function in the virtual camera adaptation process, wherein the first callback function is used for the virtual camera adaptation process to transmit data to the interconnection service process.
In one example, the virtual camera adaptation process is controlled to send a binding success notification to the interconnection service process, where the binding success notification is used to indicate that the virtual camera adaptation process has completed binding with the virtual camera process and the interconnection service process, respectively. And the interconnection service process is controlled to start receiving the video stream.
Step 2102: and controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process so as to send the first instruction to the interconnection service process through the virtual camera adaptation process, wherein the interconnection service process is used for receiving and caching the video stream sent by the second electronic equipment.
In one embodiment, controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process includes: the virtual camera process is controlled to obtain a first service handle. And controlling the virtual camera process to acquire a corresponding first service based on the first service handle, wherein the first service supports a first data format. And controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process through the first service. Illustratively, the first service handle is a service handle of the TRANSLATOR service 151.
In one example, the virtual camera process sending the first instruction to the interconnection service process through the virtual camera adaptation process includes: and controlling the virtual camera adaptation process to convert the converted first instruction again so as to obtain an instruction which can be processed by the interconnection service process. And controlling the virtual camera adaptation process to send an instruction which can be processed by the interconnection service process to the interconnection service process through the first callback function.
Step 2103: and controlling the interconnection service process to execute the operation corresponding to the first instruction based on the video stream.
In one embodiment, after step 2103, the interconnection service process is controlled to convert the response data according to the second data format defined by the virtual camera adaptation process, where the response data is generated after the operation corresponding to the first instruction is executed. And controlling the interconnection service process to send the converted response data to the virtual camera adaptation process so as to transmit the response data to the virtual camera process through the virtual camera adaptation process.
In one example, controlling the internet service process to send the converted response data to the virtual camera adaptation process includes: and controlling the interconnection service process to acquire the second service handle. And controlling the interconnection service process to acquire the corresponding second service based on the second service handle, wherein the second service supports a second data format. And controlling the interconnection service process to send the converted response data to the virtual camera adaptation process through a second service. Illustratively, the second service handle is the service handle of CHANNEL service 150.
In one example, the interconnection service process transmits the response data to the virtual camera process through the virtual camera adaptation process, including: controlling the virtual camera adaptation process to convert the converted response data again so as to obtain data which can be processed by the virtual camera process; and controlling the virtual camera adaptation process to send the data which can be processed by the virtual camera process to the virtual camera process through the second callback function.
In one example, when the virtual camera adaptation process monitors that the virtual camera process is abnormal, the virtual camera adaptation process is controlled to release the binding relation with the virtual camera process; and/or when the virtual camera process monitors that the virtual camera adaptation process is abnormal, controlling the virtual camera process to remove the binding relation with the virtual camera adaptation process.
In the embodiment of the application, the virtual camera process does not need to pay attention to upper layer logic, only needs to convert the data according to a general data format defined by the virtual camera adaptation process, then delivers the converted data to the virtual camera adaptation process, and the virtual camera adaptation process converts the data according to upper layer requirements and then sends the converted data to the interconnection service process. Therefore, a large amount of code adaptation work is avoided, and development cost can be reduced.
Fig. 22 is a schematic structural diagram of an apparatus for transmitting data according to an embodiment of the present application, where the apparatus may be implemented by software, hardware, or a combination of the two as part or all of an electronic device, which may be the electronic device shown in fig. 4. Referring to fig. 22, the apparatus for transmitting data structurally includes a processor 2210 and a memory 2220, where the memory 2220 is used for storing a program for supporting the apparatus for transmitting data to execute the methods provided in the foregoing embodiments, and storing data involved in implementing the methods described in the foregoing embodiments. The means for transferring data may further comprise a communication bus 2230, said communication bus 2230 being used for establishing a connection between said processor and said memory. The number of the processors 2210 may be one or more, the processors 2210 being configured to:
in the process of video call, if a switching condition of using a camera of the second electronic device is met, converting a first instruction according to a first data format defined by a virtual camera adaptation process through the virtual camera process according to the first instruction of a first application, wherein the virtual camera adaptation process is used for converting data transmitted between the virtual camera process and an interconnection service process;
controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process so as to send the first instruction to the interconnection service process through the virtual camera adaptation process, wherein the interconnection service process is used for receiving and caching a video stream sent by second electronic equipment;
and controlling the interconnection service process to execute the operation corresponding to the first instruction based on the video stream.
As an example of the present application, the processor 2210 is configured to:
controlling the virtual camera process to acquire a first service handle;
controlling the virtual camera process to obtain a corresponding first service based on the first service handle, wherein the first service supports the first data format;
and controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process through the first service.
As an example of the present application, the processor 2210 is configured to:
controlling the virtual camera adaptation process to convert the converted first instruction again to obtain an instruction which can be processed by the interconnection service process;
and controlling the virtual camera adaptation process to send an instruction which can be processed by the interconnection service process to the interconnection service process through a first callback function.
As an example of the present application, the processor 2210 is further configured to:
controlling the interconnection service process to convert response data according to a second data format defined by the virtual camera adaptation process, wherein the response data is generated after the operation corresponding to the first instruction is executed;
and controlling the interconnection service process to send the converted response data to the virtual camera adaptation process so as to transmit the response data to the virtual camera process through the virtual camera adaptation process.
As an example of the present application, the processor 2210 is configured to:
controlling the interconnection service process to acquire a second service handle;
controlling the interconnection service process to acquire a corresponding second service based on the second service handle, wherein the second service supports the second data format;
and controlling the interconnection service process to send the converted response data to the virtual camera adaptation process through the second service.
As an example of the present application, the processor 2210 is configured to:
controlling the virtual camera adaptation process to convert the converted response data again so as to obtain data which can be processed by the virtual camera process;
and controlling the virtual camera adaptation process to send data which can be processed by the virtual camera process to the virtual camera process through a second callback function.
As an example of the present application, the processor 2210 is further configured to:
when the camera process receives a camera opening instruction of the first application, controlling the camera process to execute an operation of opening a camera;
controlling the camera process to send an open camera notification to the virtual camera process if the opening of the camera is successful;
controlling the virtual camera process and the camera process to establish a binding relationship;
and controlling the virtual camera process request and the virtual camera adaptation process to establish a binding relationship for data transmission, wherein the virtual camera adaptation process is started after the first electronic device is started.
As an example of the present application, the processor is configured to:
controlling the virtual camera process to request to acquire a first service handle of the virtual camera adaptation process, wherein a first service corresponding to the first service handle is used for the virtual camera process to transmit data to the virtual camera adaptation process;
and controlling the virtual camera process to register a second callback function in the virtual camera adaptation process, wherein the second callback function is used for the virtual camera adaptation process to transmit data to the virtual camera process.
As an example of the present application, the processor 2210 is further configured to:
under the condition that the camera is opened successfully, controlling the interconnection service process to request to acquire a second service handle of the virtual camera adaptation process, wherein a second service corresponding to the second service handle is used for the interconnection service process to transmit data to the virtual camera adaptation process;
and controlling the interconnection service process to register a first callback function in the virtual camera adaptation process, wherein the first callback function is used for the virtual camera adaptation process to transmit data to the interconnection service process.
As an example of the present application, the processor is further configured to:
controlling the virtual camera adaptation process to send a binding success notification to the interconnection service process, wherein the binding success notification is used for indicating that the virtual camera adaptation process is respectively bound with the virtual camera process and the interconnection service process;
and controlling the interconnection service process to start receiving the video stream.
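The gating behavior described here — the interconnection service process only starts receiving the video stream after the adaptation process reports that both bindings exist — can be sketched as follows. The names are illustrative assumptions, not identifiers from the patent.

```python
# Hypothetical sketch: frames are buffered only after a binding-success
# notification arrives from the adaptation process.

class InterconnectService:
    def __init__(self):
        self.receiving = False
        self.buffer = []

    def on_binding_success(self):
        # Adaptation process reports it is bound to both the virtual camera
        # process and this interconnection service process.
        self.receiving = True

    def on_frame(self, frame):
        if self.receiving:      # frames arriving before binding are dropped
            self.buffer.append(frame)

svc = InterconnectService()
svc.on_frame("frame-0")         # ignored: binding not yet confirmed
svc.on_binding_success()
svc.on_frame("frame-1")         # buffered
```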
As an example of the present application, the processor 2210 is further configured to:
when the virtual camera adaptation process monitors that the virtual camera process is abnormal, controlling the virtual camera adaptation process to release the binding relationship with the virtual camera process; and/or,
when the virtual camera process monitors that the virtual camera adaptation process is abnormal, controlling the virtual camera process to release the binding relationship with the virtual camera adaptation process.
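The mutual liveness monitoring above (each side drops the binding when its peer becomes abnormal, so a crashed process never leaves a stale binding behind) resembles a death-recipient pattern. This is a minimal sketch under that assumption; the names are invented for illustration.

```python
# Hypothetical sketch: a binding table that releases a peer's binding when
# that peer is reported dead (e.g. by a binder death notification on Android).

class BindingTable:
    def __init__(self):
        self.peers = {}   # peer name -> cleanup callback

    def bind(self, name, on_death):
        self.peers[name] = on_death

    def report_death(self, name):
        # Release the binding and run the registered cleanup exactly once.
        on_death = self.peers.pop(name, None)
        if on_death:
            on_death(name)

released = []
table = BindingTable()
table.bind("virtual_camera", released.append)
table.report_death("virtual_camera")   # binding released, cleanup runs
table.report_death("virtual_camera")   # second report is a no-op
```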
In the embodiment of the application, the virtual camera process does not need to be aware of upper-layer logic: it only needs to convert data into the general data format defined by the virtual camera adaptation process and deliver the converted data to the adaptation process, which then converts the data according to upper-layer requirements and sends it to the interconnection service process. A large amount of code adaptation work is thereby avoided, reducing development cost.
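The decoupling just described is essentially an adapter: the producer emits one fixed general format, and a single adaptation layer re-shapes it for whatever the upper layer currently expects. A minimal sketch, with field names and the wire format invented purely for illustration:

```python
# Hypothetical sketch of the general-format / adapter split. The virtual
# camera side knows only one schema; the adaptation side is the only place
# that changes when upper-layer requirements change.

def to_general_format(cmd, args):
    # Virtual camera process side: fixed schema, no upper-layer knowledge.
    return {"cmd": cmd, "args": list(args)}

def adapt_for_interconnect(msg):
    # Adaptation process side: convert the general format into the shape the
    # interconnection service process can process (invented here).
    return f'{msg["cmd"]}({",".join(msg["args"])})'

general = to_general_format("switch_camera", ["remote", "front"])
wire = adapt_for_interconnect(general)
```

If the interconnection service later expects a different shape, only `adapt_for_interconnect` changes; the virtual camera side is untouched, which is the cost saving the paragraph above claims.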
It should be noted that the apparatus for transmitting data provided in the above embodiment is illustrated only by the division into functional modules described; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above.
Each functional unit and module in the above embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used to limit the protection scope of the embodiments of the present application.
The embodiments of the apparatus for transmitting data and of the method for transmitting data belong to the same concept; for the specific working processes of the units and modules in the above embodiments and the technical effects they bring, reference may be made to the method embodiments, which are not described again here.
In the above embodiments, the implementation may be realized wholly or partly by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., digital versatile disc (DVD)), or a semiconductor medium (e.g., solid state disk (SSD)), among others.
The above description is not intended to limit the present application to the particular embodiments disclosed, but rather, the present application is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present application.

Claims (13)

1. A method for transmitting data, which is applied to a first electronic device connected with a second electronic device, the method comprising:
during a video call, if a switching condition for using a camera of the second electronic device is met, controlling a virtual camera process to convert a first instruction of a first application according to a first data format defined by a virtual camera adaptation process, wherein the virtual camera adaptation process is used for converting data transmitted between the virtual camera process and an interconnection service process;
controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process, so as to send the first instruction to the interconnection service process through the virtual camera adaptation process, wherein the interconnection service process is used for receiving and caching a video stream sent by the second electronic device;
and controlling the interconnection service process to execute the operation corresponding to the first instruction based on the video stream.
2. The method according to claim 1, wherein the controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process comprises:
controlling the virtual camera process to acquire a first service handle;
controlling the virtual camera process to acquire a corresponding first service based on the first service handle, wherein the first service supports the first data format;
and controlling the virtual camera process to send the converted first instruction to the virtual camera adaptation process through the first service.
3. The method according to claim 1 or 2, wherein the sending, by the virtual camera process, the first instruction to the interconnection service process through the virtual camera adaptation process comprises:
controlling the virtual camera adaptation process to convert the converted first instruction again to obtain an instruction which can be processed by the interconnection service process;
and controlling the virtual camera adaptation process to send an instruction which can be processed by the interconnection service process to the interconnection service process through a first callback function.
4. The method according to any one of claims 1 to 3, wherein after controlling the interconnection service process to perform the operation corresponding to the first instruction based on the video stream, the method further comprises:
controlling the interconnection service process to convert response data according to a second data format defined by the virtual camera adaptation process, wherein the response data is generated after the operation corresponding to the first instruction is executed;
and controlling the interconnection service process to send the converted response data to the virtual camera adaptation process so as to transmit the response data to the virtual camera process through the virtual camera adaptation process.
5. The method according to claim 4, wherein the controlling the interconnection service process to send the converted response data to the virtual camera adaptation process comprises:
controlling the interconnection service process to acquire a second service handle;
controlling the interconnection service process to acquire a corresponding second service based on the second service handle, wherein the second service supports the second data format;
and controlling the interconnection service process to send the converted response data to the virtual camera adaptation process through the second service.
6. The method of claim 5, wherein the transmitting, by the interconnection service process, the response data to the virtual camera process through the virtual camera adaptation process comprises:
controlling the virtual camera adaptation process to convert the converted response data again so as to obtain data which can be processed by the virtual camera process;
and controlling the virtual camera adaptation process to send data which can be processed by the virtual camera process to the virtual camera process through a second callback function.
7. The method of claim 1, further comprising:
when the camera process receives a camera opening instruction of the first application, controlling the camera process to execute an operation of opening a camera;
controlling the camera process to send an open camera notification to the virtual camera process if the opening of the camera is successful;
controlling the virtual camera process and the camera process to establish a binding relationship;
and controlling the virtual camera process to request to establish a binding relationship with the virtual camera adaptation process for data transmission, wherein the virtual camera adaptation process is started after the first electronic device is started.
8. The method of claim 7, wherein the controlling the virtual camera process to request to establish a binding relationship with the virtual camera adaptation process for data transmission comprises:
controlling the virtual camera process to request to acquire a first service handle of the virtual camera adaptation process, wherein a first service corresponding to the first service handle is used for the virtual camera process to transmit data to the virtual camera adaptation process;
and controlling the virtual camera process to register a second callback function in the virtual camera adaptation process, wherein the second callback function is used for the virtual camera adaptation process to transmit data to the virtual camera process.
9. The method according to claim 7, wherein after controlling the camera process to perform the operation of opening the camera when the camera process receives the instruction of opening the camera of the first application, the method further comprises:
under the condition that the camera is opened successfully, controlling the interconnection service process to request to acquire a second service handle of the virtual camera adaptation process, wherein a second service corresponding to the second service handle is used for the interconnection service process to transmit data to the virtual camera adaptation process;
and controlling the interconnection service process to register a first callback function in the virtual camera adaptation process, wherein the first callback function is used for the virtual camera adaptation process to transmit data to the interconnection service process.
10. The method according to any one of claims 7-9, further comprising:
controlling the virtual camera adaptation process to send a binding success notification to the interconnection service process, wherein the binding success notification is used for indicating that the virtual camera adaptation process is respectively bound with the virtual camera process and the interconnection service process;
and controlling the interconnection service process to start receiving the video stream.
11. The method according to any one of claims 1-10, further comprising:
when the virtual camera adaptation process monitors that the virtual camera process is abnormal, controlling the virtual camera adaptation process to release the binding relationship with the virtual camera process; and/or,
when the virtual camera process monitors that the virtual camera adaptation process is abnormal, controlling the virtual camera process to release the binding relationship with the virtual camera adaptation process.
12. An electronic device comprising a processor and a memory, wherein the memory is configured to store a program that enables the electronic device to perform the method of any of claims 1-11 and to store data that is used to implement the method of any of claims 1-11; the processor is configured to execute programs stored in the memory.
13. A computer-readable storage medium having stored therein instructions which, when executed on a computer, cause the computer to perform the method of any one of claims 1-11.
CN202111608704.3A 2021-12-24 2021-12-24 Method for transmitting data, electronic device and readable storage medium Active CN115002384B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111608704.3A CN115002384B (en) 2021-12-24 2021-12-24 Method for transmitting data, electronic device and readable storage medium


Publications (2)

Publication Number Publication Date
CN115002384A CN115002384A (en) 2022-09-02
CN115002384B true CN115002384B (en) 2023-01-31

Family

ID=83018528


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102215373A (en) * 2010-04-07 2011-10-12 Apple Inc. In conference display adjustments
CN102215369A (en) * 2011-04-27 2011-10-12 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Mobile terminal and data interactive method of video call
CN112584049A (en) * 2020-12-22 2021-03-30 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Remote interaction method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040235413A1 (en) * 2003-01-21 2004-11-25 Min Dong Uk Mobile terminal having image processing function and method therefor



Similar Documents

Publication Publication Date Title
WO2020238871A1 (en) Screen projection method and system and related apparatus
JP7378576B2 (en) Terminal device, method and system for implementing one-touch screen projection using remote control device
EP3944063A1 (en) Screen capture method and electronic device
WO2021051989A1 (en) Video call method and electronic device
WO2020014880A1 (en) Multi-screen interaction method and device
WO2021185244A1 (en) Device interaction method and electronic device
JP7369281B2 (en) Device capacity scheduling method and electronic devices
WO2021258809A1 (en) Data synchronization method, electronic device, and computer readable storage medium
WO2022100304A1 (en) Method and apparatus for transferring application content across devices, and electronic device
US20230422154A1 (en) Method for using cellular communication function, and related apparatus and system
WO2022105803A1 (en) Camera calling method and system, and electronic device
EP4224307A1 (en) Screen projection method for application window and electronic devices
WO2022042326A1 (en) Display control method and related apparatus
US20240095000A1 (en) Plug-In Installation Method, Apparatus, and Storage Medium
CN115022570B (en) Method for acquiring video frame, electronic equipment and readable storage medium
CN115379126B (en) Camera switching method and related electronic equipment
CN115002384B (en) Method for transmitting data, electronic device and readable storage medium
CN114928900B (en) Method and apparatus for transmission over a WiFi direct connection
WO2021218544A1 (en) Wireless connection providing system, method, and electronic apparatus
WO2024087900A1 (en) Camera switching method and related electronic device
CN117119295B (en) Camera control method and electronic device
WO2023045966A1 (en) Capability sharing method, electronic devices and computer-readable storage medium
EP4351181A1 (en) Bluetooth communication method and system
CN114827514B (en) Electronic device, data transmission method and medium for electronic device and other electronic devices
WO2023051204A1 (en) Cross-device connection method, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant