CN117499780A - Photographing method, electronic equipment and collaborative work system - Google Patents


Info

Publication number
CN117499780A
Authority
CN
China
Legal status
Pending
Application number
CN202210865708.8A
Other languages
Chinese (zh)
Inventor
滕智飞
李裕
白帆
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Application filed by Honor Device Co Ltd

Abstract

A photographing method, an electronic device, and a collaborative work system, relating to the technical field of terminals and the technical field of the Internet of things. The method comprises the following steps: sequentially receiving i photographing requests, where the i photographing requests are sent by j application programs, each application program sends at least one photographing request, i and j are integers greater than or equal to 1, and i is greater than or equal to j; inserting the photographing requests into a photographing request queue in the order in which the i photographing requests are received, where the first photographing request in the photographing request queue is the photographing request received first; and sending an nth photographing request to the Internet of things device to acquire photo data, and deleting the nth photographing request from the photographing request queue when the photographing task of the nth photographing request is completed, until no photographing request remains in the photographing request queue, where n takes the values 1, 2, …, i in sequence. With this method, when one Internet of things device has a plurality of pending photographing requests at the same time, the photo data can still be correctly returned to the corresponding electronic devices, improving the user experience.

Description

Photographing method, electronic equipment and collaborative work system
Technical Field
The application relates to the technical field of terminals, in particular to a photographing method, electronic equipment and a collaborative work system.
Background
Currently, by installing components such as a device virtualization software development kit (DVSDK) and a distributed mobile sensing development platform (DMSDP) on a center-side device such as a mobile phone or a tablet computer, and performing the corresponding DMSDP adaptation on an Internet of things (IoT) device that can use a camera, such as a desk lamp, the center-side device can be connected with the IoT device and can use the virtualized camera function of the IoT device.
A typical application scenario for the above technique is education. An education application installed on the center-side device can send out a photographing request; after an IoT device such as a desk lamp takes a photo with its camera, the photo is transmitted back to the center-side device, implementing functions such as answering questions and submitting homework.
However, in the existing solution, after the IoT device establishes a transmission channel with the center-side device, the IoT device can only process the photographing requests of one application program on the center-side device. When multiple application programs on the center-side device send photographing requests to the IoT device, the photo data may be returned to the wrong application, and the virtual camera photographing service may even be interrupted, degrading the user experience.
Disclosure of Invention
In order to solve the above problems, the present application provides a photographing method, an electronic device, and a collaborative work system, which can enable an IoT device to correctly return photograph data to a corresponding application program when receiving photographing requests sent by a plurality of application programs on the electronic device, thereby improving user experience.
In a first aspect, the present application provides a photographing method applied to an electronic device, that is, a center-side device, where the electronic device is configured to photograph with the camera of an Internet of things device to obtain photo data. The method includes: sequentially receiving i photographing requests, where the i photographing requests are sent by j application programs, each application program sends at least one photographing request, i and j are integers greater than or equal to 1, and i is greater than or equal to j; inserting the photographing requests into a photographing request queue in the order in which the i photographing requests are received, where the first photographing request in the photographing request queue is the photographing request received first; and sending an nth photographing request to the Internet of things device to acquire photo data, and deleting the nth photographing request from the photographing request queue when the photographing task of the nth photographing request is completed, until no photographing request remains in the photographing request queue, where n takes the values 1, 2, …, i in sequence.
According to the scheme provided by the application, the photographing tasks are arranged in a photographing request queue that is executed in first-in, first-out (FIFO) order. Completing the photographing tasks in the queue one at a time avoids the problem of multiple tasks being issued to the Internet of things device side simultaneously, which can cause the Internet of things device side to fail or the photo data to be returned in error. Completing photographing tasks one by one means: the photographing request at the head of the photographing request queue is sent to the Internet of things device, and after the photo data returned by the Internet of things device is received and returned to the correct corresponding application program, the photographing task at the head of the queue is considered complete. The completed task is then deleted from the head of the queue, the second task in the original queue becomes the new head, and the new head task is issued to the Internet of things device side. In summary, with this method, by completing the photographing tasks one by one, when the IoT device receives photographing requests sent by a plurality of application programs on the electronic device, each photo can be correctly returned to its corresponding application program, improving the user experience.
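The first-in, first-out handling described above can be sketched in Python as follows. This is an illustrative sketch, not the patent's implementation: `take_photo` and `deliver` are hypothetical stand-ins for the call to the IoT device and the per-application callback.

```python
from collections import deque

def process_photo_requests(requests, take_photo, deliver):
    """Execute queued photographing requests strictly one by one:
    send the head request to the IoT device, return the photo data
    to the requesting application, then delete the head request."""
    queue = deque(requests)        # insertion order = receiving order
    while queue:                   # until no request remains
        head = queue[0]            # current head of the queue
        photo = take_photo(head)   # only one request is in flight at a time
        deliver(head, photo)       # photo data goes back to the right app
        queue.popleft()            # task complete: remove it from the queue
```

Because the next request is issued only after the previous task completes, the IoT device never sees two outstanding requests at once, which is what prevents the misrouted photo data described in the background.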
In a possible implementation manner, the inserting the photographing requests into the photographing request queue sequentially according to the receiving order of the i photographing requests specifically includes:
Creating the photographing queue;
and inserting the photographing requests into the photographing request queue in sequence according to the receiving sequence of the i photographing requests, and storing the corresponding relation among photographing request codes, time information and request queue numbers, wherein the photographing request codes are codes of the photographing requests sent by the application program, and the request queue numbers are sequence numbers of the photographing requests in the photographing request queue.
In one possible implementation manner, the creating the photographing queue specifically includes:
when a transmission channel of the photo data is not established between the electronic equipment and the Internet of things equipment, establishing the transmission channel and establishing the photographing queue;
and when the transmission channel is established between the electronic equipment and the Internet of things equipment, the photographing queue is created.
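A minimal Python sketch of this implementation follows, with the stored correspondence represented as one dictionary per request. The class and field names are illustrative assumptions, not taken from the patent.

```python
import time

class PhotoRequestQueue:
    """Create the photographing queue, establishing the photo-data
    transmission channel first only if it does not yet exist, and
    store the correspondence among photographing request code,
    time information, and request queue number."""

    def __init__(self, channel_established=False):
        if not channel_established:
            channel_established = self._establish_channel()
        self.channel_established = channel_established
        self.entries = []              # FIFO: index 0 is the queue head
        self._next_queue_no = 0

    def _establish_channel(self):
        # Placeholder for setting up the transmission channel.
        return True

    def insert(self, request_code):
        """Insert a request in receiving order and record the
        code / time / queue-number correspondence."""
        entry = {"request_code": request_code,     # code sent by the app
                 "received_at": time.time(),       # time information
                 "queue_no": self._next_queue_no}  # request queue number
        self._next_queue_no += 1
        self.entries.append(entry)
        return entry
```

Storing the request code alongside the queue number is what later allows the photo data to be matched back to the application that issued the request.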
In one possible implementation, the photographing request code counts cyclically between a first preset value and a second preset value.
In one possible implementation, the request queue number is counted in cycles between a third preset value and a fourth preset value.
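The cyclic counting of both the photographing request code and the request queue number can be sketched as a simple wrap-around counter; the bounds here are illustrative stand-ins for the preset values.

```python
def cyclic_counter(low, high):
    """Yield low, low+1, ..., high, then wrap back to low, so the
    photographing request code (or request queue number) never
    grows without bound."""
    value = low
    while True:
        yield value
        value = low if value == high else value + 1
```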
In one possible implementation, the time information includes:
at least one of the time consumed by a photographing task or the time at which a photographing request is received.
In one possible implementation, the method further includes:
when the transmission channel between the electronic device and the Internet of things device is destroyed, deleting the photographing request queue to release space and reduce resource occupation.
In one possible implementation manner, the sending an nth photographing request to the Internet of things device to obtain photo data, and deleting the nth photographing request from the photographing request queue when the photographing task of the nth photographing request is completed, specifically includes:
sending an nth photographing request to the internet of things equipment through the transmission channel to acquire photographing data;
returning the photographing data to the application program sending the nth photographing request according to the corresponding relation between the callback function of the application program and the photographing request code so as to complete the photographing task of the nth photographing request;
and deleting the corresponding relation among the photographing request code of the nth photographing request, the time information of the nth photographing request and the request queue number of the nth photographing request from the photographing request queue.
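The callback-based return of photo data described above can be sketched as a lookup from request code to the registered application callback. Function and variable names are illustrative assumptions, not the patent's API.

```python
def return_photo_data(photo_data, request_code, callbacks):
    """Route photo data returned by the IoT device back to the
    application that sent the request, using the stored
    request-code -> callback correspondence."""
    callback = callbacks.get(request_code)
    if callback is None:
        raise KeyError(f"no callback registered for request code {request_code}")
    callback(photo_data)  # delivering the data completes the photographing task
```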
In a second aspect, the present application further provides an electronic device, including: one or more processors; a memory; and one or more computer programs, where the one or more computer programs are stored in the memory and, when executed by the one or more processors, cause the electronic device to perform the photographing method of the first aspect and any of the implementations of the first aspect.
The electronic device arranges the photographing tasks in a photographing request queue that is executed in first-in, first-out (FIFO) order. Completing the photographing tasks in the queue one at a time avoids the problem of multiple tasks being issued to the Internet of things device side simultaneously, which can cause the Internet of things device side to fail or the photo data to be returned in error. Completing photographing tasks one by one means: the photographing request at the head of the photographing request queue is sent to the Internet of things device, and after the photo data returned by the Internet of things device is received and returned to the correct corresponding application program, the photographing task at the head of the queue is considered complete. The completed task is then deleted from the head of the queue, the second task in the original queue becomes the new head, and the new head task is issued to the Internet of things device side. In summary, by completing the photographing tasks one by one, when the IoT device receives photographing requests sent by a plurality of application programs on the electronic device, each photo can be correctly returned to its corresponding application program, improving the user experience.
In a third aspect, the present application further provides a collaborative work system, where the collaborative work system includes one or more electronic devices provided in the second aspect and one or more Internet of things devices.
In one possible implementation manner, the collaborative work system may include at least two electronic devices, each of which establishes a transmission channel with the Internet of things device.
Drawings
FIG. 1 is a schematic diagram of a first scenario;
FIG. 2a is a schematic diagram of a center-side device provided in the present application;
FIG. 2b is a schematic diagram of a software structure of the center-side device provided in the present application;
FIG. 3a is a schematic diagram of an IoT device provided in the present application;
FIG. 3b is a schematic diagram of a software structure of an IoT device provided in the present application;
FIG. 4 is a first flowchart of a photographing method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a photographing method according to an embodiment of the present application;
FIG. 6a is a second flowchart of a photographing method according to an embodiment of the present application;
FIG. 6b is a flowchart of another photographing method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of another photographing method according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a collaborative work system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. Apparently, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone.
The terms "first", "second", and the like in the description and claims of the embodiments of the present application are used to distinguish between different objects, not necessarily to describe a particular order of the objects.
To help those skilled in the art understand the solution of the present application more clearly, the application scenario of the technical solution of the present application is first described below.
At present, online education is popular with more and more people, and the demands of students for online education are also increasing. In some application scenarios, when a new word is encountered, students can perform online word searching to obtain relevant interpretation; in some application scenarios, online reading of book content is more convenient for students to learn knowledge and pronunciation; in some application scenarios, students need to submit their work online. Therefore, how to meet the online education needs of users based on intelligent devices is a problem to be solved.
Referring to fig. 1, a scene diagram one is shown.
An education application is installed on the center-side device 10. An IoT device 20 such as a desk lamp takes a photo with its camera and transmits the photo back to the center-side device to implement functions such as answering questions and submitting homework; alternatively, the center-side device 10 previews images through the camera of the IoT device 20 and, based on the images acquired by the IoT device 20, provides the user with various online education functions such as online word searching and online reading.
The center-side device 10 may be a mobile phone, a tablet computer, a notebook computer, a desktop computer, or the like, which is not particularly limited in the embodiments of the present application.
The center-side device 10 and the IoT device 20 may communicate via near-field communication and/or far-field communication. In near-field communication, devices exchange information through a router or similar device; in far-field communication, devices exchange information through a cloud server.
The hardware architecture of the center side device 10 will be first described below.
Referring to fig. 2a, a schematic diagram of a center-side device is provided herein.
The center side device 10 shown in fig. 2a is only one example, and the center side device 10 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 2a may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The center side apparatus 10 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor, a gyroscope sensor, an acceleration sensor, a temperature sensor, a motion sensor, a barometric sensor, a magnetic sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc.
Wherein the different processing units may be separate devices or may be integrated in one or more processors.
Wherein the controller may be a neural hub and command center of the hub-side device 10. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the center-side device 10, or may be used to transfer data between the center-side device 10 and peripheral devices. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130.
In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the hub-side device 10. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the center-side device 10 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the center-side device 10 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the center side device 10. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wi-Fi network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., applied on the center-side device 10.
In some embodiments, the antenna 1 of the center-side device 10 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the center-side device 10 can communicate with the network and other devices through wireless communication technology.
The center side device 10 implements display functions by a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the center-side device 10 may include 1 or N display screens 194, N being a positive integer greater than 1.
In the embodiment of the present application, the display screen 194 may display a photographing preview interface, a photographing image interface, and the like. It should be noted that, in the embodiment of the present application, the shooting preview interface refers to an interface where a user can view an image acquired by a camera of the IoT device in real time through the display screen 194.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capability of the center-side device 10. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the center side device 10 and data processing by executing instructions stored in the internal memory 121, for example, to cause the center side device 10 to implement the cooperative working method in the embodiment of the present application. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc.
The storage data area may store data (such as audio data, phonebook, etc.) created during use of the center-side device 10, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The center-side device 10 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The center-side device 10 can listen to music through the speaker 170A or listen to handsfree talk. In some embodiments, the center-side device 10 may be provided with a plurality of speakers 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the center-side device 10 is used to answer a call or receive voice information, the receiver 170B can be brought close to the ear to hear the voice.
The microphone 170C, also referred to as a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C to input a sound signal. The center-side device 10 may be provided with at least one microphone 170C. In other embodiments, the center-side device 10 may be provided with two microphones 170C, which, in addition to collecting sound signals, can implement a noise reduction function. In still other embodiments, the center-side device 10 may be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The software structure of the center side device 10 is explained below.
Referring to fig. 2b, a schematic diagram of a software structure of a center side device provided in the present application is shown.
The software system of the center-side device 10 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. Taking an Android system with a layered architecture as an example, the embodiments of the present application illustrate the software structure of the center-side device 10.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the layers are, from top to bottom, the application layer, the application framework layer, the system library, the extension layer, and the kernel layer.
The application layer may include a series of application packages. For example, the application package may include applications such as gallery, map, wi-Fi, bluetooth, SMS, music, talk, navigation, video, camera, device management application, educational application, and the like.
Wherein the device management application may perceive, discover, and register IoT devices within the region.
Educational applications may be developed by third-party vendors to provide online education functions to users, such as, but not limited to, viewing, broadcasting, annotating, and querying.
In some embodiments, the device management application may bind with IoT devices such as a desk lamp; the educational application may also bind with IoT devices such as desk lamps.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 2b, the application framework layer may include DVSDK, DMSDK, AUTH, intelligent interconnection services, and the like.
Authentication (AUTH) refers to confirming a user's identity by certain means when an interface is called, providing security permission management capability; that is, it provides an authentication service.
The DVSDK is used to provide a device virtualization service, which may also be referred to as a hardware virtualization service; the two terms are not distinguished in the following description. It provides an entry point to the interconnection service for applications of third-party vendors and performs security permission management for interface access. In particular, it may be used to establish a logical channel between the center-side device and the IoT device, providing the capability to virtualize the camera.
The DMSDK is configured to provide a device management service, supplying far-field (i.e., cloud) IoT device information and near-field (i.e., nearby connectable) IoT device information to third-party vendors.
The intelligent interconnection service provides the physical transmission channel and data transmission capability, and also manages the startup of the interconnection service.
The interconnection service establishes a logical channel between the center-side device and the IoT device, providing virtualized camera capabilities. It is specifically used to implement data processing, transmission channels, flow control, and capability acquisition.
In addition, a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, etc. may be included.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, determine whether a status bar exists, lock the screen, capture screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc. The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views.
For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture. The telephony manager is used to provide the communication functions of the electronic device 100, such as the management of call status (including connected, hung up, etc.). The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files. The notification manager allows an application to display notification information in the status bar; it can be used to convey notification-type messages, which can automatically disappear after a short stay without user interaction.
The system layer includes a system library and Android Runtime (Android run time).
Android Runtime includes a core library and a virtual machine. Android Runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the function interfaces that the Java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine performs functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
In the embodiment of the present application, the Android Runtime further comprises a virtual camera adaptation layer, which provides the virtual camera registration capability.
The system library may include a plurality of functional modules. For example: the system comprises a multimedia platform, an audio framework, a graphic image processing library, a decoding module, a virtual camera adaptation layer and the like.
The multimedia platform can be used for managing multimedia and supporting various common audio, video format playback and recording, still image files and the like. The multimedia platform may support a variety of audio and video coding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
Graphics image processing libraries may be used to implement graphics drawing, image rendering, compositing, and layer processing, among others.
The decoding module may be used to implement encoding and decoding operations on audio data and video data.
The extension layer comprises a hardware abstraction layer (Hardware Abstraction Layer, HAL). The HAL is a program package at the software layer and serves as an interface layer between the operating system kernel and the hardware circuitry; it abstracts the details of a specific hardware platform so that programs can directly access hardware resources.
HAL layers include, but are not limited to: audio (Audio) HAL, sensor (Sensor) HAL, modem (Modem) HAL, camera HAL, virtual camera HAL.
Wherein the audio HAL is used for processing the audio stream, for example, noise reduction, directional enhancement, etc. of the audio stream. The camera HAL is used for processing the image stream corresponding to the camera at the electronic equipment side, and the virtual camera HAL is used for processing the image stream corresponding to the virtual camera registered at the electronic equipment side, namely, the image stream acquired by the camera at the IOT equipment side.
The kernel layer is a layer between hardware and software. The kernel layer may include a display driver, a camera driver, a USB driver, a CPU driver, an audio driver, a network driver (e.g., wi-Fi driver), a storage driver, a print driver, and the like.
The hardware structure of IoT device 20 is described below.
Referring to fig. 3a, a schematic diagram of an IoT device is provided herein.
In some embodiments, the IoT device is a table lamp with a camera. It should be understood that the IoT device 20 shown in fig. 3a is only one example, and may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 3a may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The IoT device may include: a processor 210, a camera 201, a wireless communication module 202, a memory 203, an audio module 204, a USB interface 205, a charge management module 206, a power management module 207, a battery 208, a lighting device 209, keys 211, and the like.
The processor 210 may include one or more processing units. For example, the processor 210 may include a GPU, an ISP, a controller, a memory, a video codec, and the like. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the IoT device 20. The controller can generate operation control signals according to the instruction operation code and timing signals, completing the control of instruction fetching and instruction execution.
The camera 201 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor.
The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. Taking a desk lamp as an example, the camera 201 may be disposed on a desk lamp stand for capturing images downwards.
IoT device 20 may implement shooting functionality through an ISP, camera 201, video codec, GPU, or the like.
The ISP is used to process the data fed back by the camera 201. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, an ISP may be provided in the camera 201.
The wireless communication module 202 may provide solutions for wireless communication, including WLAN (e.g., wi-Fi network), bluetooth (BT), etc., applied on the IoT device 20. In some embodiments, the antenna of the IoT device 20 and the wireless communication module 202 are coupled such that the IoT device 20 may communicate with a network and other devices through wireless communication techniques.
Memory 203 may be used to store computer executable program code that includes instructions. The processor 210 executes the instructions stored in the memory 203 to perform various functional applications and data processing of the IoT device 20, for example, to cause the IoT device 20 to implement the cooperative methods in the embodiments of the present application.
The IoT device 20 may implement audio functions, such as music playing, etc., through the audio module 204, speakers 212, etc.
The USB interface 205 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 205 may be used to connect a charger to charge the IoT device 20 and may also be used to transfer data between the IoT device 20 and a peripheral device.
The charge management module 206 is configured to receive a charge input from a charger. The charging management module 206 may also power the IoT device 20 through the power management module 207 while charging the battery 208.
The power management module 207 is used to connect the battery 208, the charge management module 206 and the processor 210. The power management module 207 receives input from the battery 208 and/or the charge management module 206 and provides power to the processor 210, the memory 203, the camera 201, the wireless communication module 202, the lighting device 209, and the like.
The keys 211 include a power-on key (or power key), and the like.
The software structure of IoT device 20 is described below.
Referring to fig. 3b, this figure is a schematic diagram of the software architecture of the IoT device provided herein.
The layers of the IoT device communicate through a software interface. In some embodiments, from top to bottom, there are an application layer, an application framework layer, a system library, and a kernel layer, respectively.
The application layer may include a device application service, which may be understood as a system-level application that is started after the IoT device's system starts.
The application framework layer includes a number of predefined functions. As shown in fig. 3b, the application framework layer may include intelligent interconnection services, resource managers, interconnection services, and the like.
Resource managers are used to provide various resources such as localization strings, icons, pictures, layout files, audio files, video files, and the like.
The intelligent interconnection service provides a physical transmission channel and data transmission capability, while managing the start-up switch of the interconnection service.
Interconnection service: establishes a logical channel between the hub-side device and the IoT device, providing the virtualized camera capability while providing a device camera open interface; it may include data processing, capability acquisition, a virtual audio module, a virtual camera module, and so on.
The system layer may include a plurality of functional modules. For example: including multimedia platforms, audio frameworks, graphics image processing libraries, decoding modules, ioT vendor-adapted camera modules, and the like.
The multimedia platform can be used for managing multimedia and supporting various common audio, video format playback and recording, still image files and the like. The multimedia platform may support a variety of audio and video coding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
Graphics image processing libraries may be used to implement graphics drawing, image rendering, compositing, and layer processing, among others.
The decoding module may be used to implement encoding and decoding operations on audio data and video data.
The IoT vendor-adapted camera module implements an interface to the interconnection module, providing functions such as opening the camera, photographing, and previewing.
The kernel layer is a layer between hardware and software. The kernel layer may include a camera driver, a USB driver, a CPU driver, an audio driver, a network driver, a storage driver, and the like.
At present, components such as the DVSDK and DMSDP are installed on hub-side devices such as mobile phones and tablet computers, and a corresponding DMSDP adaptation is arranged on IoT devices such as desk lamps, so that the hub-side device can establish a connection with the IoT device and use the IoT device's virtualized camera function.
At present, a typical application scenario of the above technology is the education scenario. An educational application is installed on the hub-side device 10, and the IoT device 20, such as a desk lamp, includes a camera. After the hub-side device 10 establishes a connection with the IoT device 20, the virtualized camera function of the IoT device 20 may be used on the hub-side device 10. For example, the camera of the IoT device 20 captures the text of a book on the desktop; when the user's finger points at the text, the educational application on the hub-side device 10 automatically recognizes the text, displays an annotation, and broadcasts it by voice. The educational application can also send out a photographing request; after an IoT device such as a desk lamp takes a photo with its camera, the photo is returned to the hub-side device, enabling functions such as question answering and homework submission.
However, in the present solution, after the IoT device establishes a transmission channel with the hub-side device, the IoT device can only process the photographing requests of one application on the hub-side device. When multiple applications on the hub-side device all send photographing requests to the IoT device, photos may be returned to the wrong application, or the virtual camera photographing service may even be interrupted, degrading the user experience.
To solve the above technical problems, the present application provides a photographing method, an electronic device, and a collaborative work system, which enable an IoT device that receives photographing requests sent by multiple applications on an electronic device to correctly return the photos to the corresponding applications, thereby improving the user experience.
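Before turning to the detailed flow, the queue-based handling summarized in the abstract can be sketched as follows. This is a minimal, hypothetical illustration (names such as PhotoRequestQueue are not from the patent): requests from several applications are inserted into a FIFO queue in arrival order, only the head request is forwarded to the IoT device, and a request is deleted once its photographing task completes.

```python
from collections import deque

class PhotoRequestQueue:
    """FIFO queue of photographing requests: requests from several
    applications are inserted in arrival order, and only the head
    request is forwarded to the IoT device at a time."""

    def __init__(self):
        self._queue = deque()  # head = earliest received request

    def receive(self, request):
        # Insert in the order the requests are received.
        self._queue.append(request)

    def process_all(self, send_to_iot_device):
        results = []
        while self._queue:
            head = self._queue[0]             # nth photographing request
            photo = send_to_iot_device(head)  # returns the photo data
            results.append((head["app"], photo))
            self._queue.popleft()             # delete once the task completes
        return results

# Example: i = 3 requests from j = 2 applications.
q = PhotoRequestQueue()
q.receive({"app": "education", "seq": 1})
q.receive({"app": "device_mgmt", "seq": 2})
q.receive({"app": "education", "seq": 3})
served = q.process_all(lambda r: f"photo-{r['seq']}")
```

Because requests are dequeued strictly in arrival order, each application receives the photo for its own request even when several applications issue requests concurrently.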
The following description is made in connection with specific implementations.
In order to enable those skilled in the art to more clearly understand the technical solutions of the present application, the following first describes a process in which a central side device establishes a connection with an IoT device and performs photographing cooperatively. In the following description, the center side device is taken as a tablet computer, and the IoT device is taken as a desk lamp for example.
See also fig. 4 and 5. Fig. 4 is a flowchart of a photographing method according to an embodiment of the present application; fig. 5 is a schematic diagram of a photographing method according to an embodiment of the present application.
Fig. 5 illustrates only the correspondence between part of the flow of fig. 4 and the modules involved.
0. The device service initialization stage specifically comprises the following steps:
S0.1: In response to a user operation, the desk lamp's device application service is started and the interconnection service is loaded.
For example, the user operation may be an operation in which the user turns on the desk lamp power supply. In response to user operation, the desk lamp system is started, the equipment application service is started accordingly, and the interconnection service is loaded. Wherein the interconnection service may be used to establish a physical transmission channel between the tablet and the desk lamp for providing data transmission capability.
S0.2: The interconnection service of the desk lamp loads the hardware abstraction service.
The interconnection service may also control the opening of the hardware abstraction service. For example, after the interconnection service is started, the interconnection service may load the hardware abstraction service in the form of a plug-in. The hardware abstraction service can be used for establishing a logic channel between the tablet and the desk lamp, providing the capability of virtualizing a camera, and simultaneously providing an open interface of the desk lamp camera.
The hardware abstraction service may include at least a base component and a camera component. In the device service initialization stage, the interconnection service first loads the base component and initializes it. After initialization, the base component can exchange information with the desk lamp's device adaptation module to obtain device information and virtualization capability information. Exemplary device information includes, but is not limited to, the device name, device identification, and device type. Exemplary virtualization capability information includes, but is not limited to, whether a virtualized camera is supported and whether a virtualized microphone is supported, and may further include the functions of the supported virtualized camera, such as supporting only video preview, only photographing, or both photographing and video preview.
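As a rough sketch of the device information and virtualization capability information exchanged above (all field names are illustrative assumptions, not the patent's actual data format):

```python
from dataclasses import dataclass, field

@dataclass
class VirtualizationCapability:
    supports_camera: bool = False
    supports_microphone: bool = False
    # Functions of the supported virtualized camera, e.g. "photo", "preview".
    camera_functions: list = field(default_factory=list)

@dataclass
class DeviceInfo:
    name: str
    device_id: str
    device_type: str
    capability: VirtualizationCapability

lamp = DeviceInfo(
    name="desk lamp",
    device_id="lamp-001",
    device_type="desk_lamp",
    capability=VirtualizationCapability(
        supports_camera=True,
        camera_functions=["photo", "preview"],  # supports both modes
    ),
)

# The base component loads the camera component only when the
# virtualized camera capability is supported.
load_camera_component = lamp.capability.supports_camera
```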
The desk lamp's support for the virtualized camera capability can be understood as follows: the camera of the desk lamp allows other electronic devices (such as the tablet) to invoke it; that is, the desk lamp's camera can serve as a virtual camera of those other electronic devices.
After the base component obtains the device information and the capability information of the desk lamp, if the desk lamp has the capability of supporting the virtualized camera, the base component loads the camera component to provide the capability of the virtualized camera. At this time, the base component may perform negotiation channel establishment preparation to negotiate network connection related information (including but not limited to IP address and port, etc.) with the tablet. When the base component performs negotiation channel establishment preparation, a Session service (Session Server) is created, and a Session Name (Session Name) of the Session service is sent to the interconnection service, so that a negotiation channel is established between the transmission management service on the tablet side and the interconnection service on the desk lamp side.
1. The device discovery stage specifically comprises the following steps:
S1.1: In response to a user operation, the tablet's educational APP sends a device discovery instruction to the device management service.
The user operation may be an operation in which the user clicks a function option in the education APP that needs to call the virtual camera. By way of example, the user operation may be an operation of a click-to-read function, a word search function, a job function, a photographing function in the click education APP.
The education APP of the tablet receives the user operation, and responds to the operation, and sends a device discovery instruction to the device management service of the tablet. Wherein the device discovery instructions are to instruct to find IoT devices that are capable of establishing a connection with the tablet. By way of example, the device discovery instructions may include, but are not limited to, an instruction type and a device type to be discovered. In this embodiment, the device discovery command is specifically used to find a desk lamp that can establish a connection with the tablet.
S1.2: the device management service in the tablet invokes an authentication service to authenticate the education APP, and an authentication result of the education APP is obtained.
After receiving the device discovery instruction, the device management service carries out APP authentication on the education APP according to the name of the education APP.
The authentication service obtains the authentication result of the education APP (authentication success or authentication failure) and sends the result to the device management service.
S1.3: and the device management service in the tablet sends a device search instruction to the transmission management service when the education APP is successfully authenticated.
The device search instruction may include, but is not limited to, an instruction type, a device type to be searched, and a search mode. Exemplary search means include, but are not limited to, near field device scanning and obtaining device information from a cloud server. In this embodiment, the device to be searched is a desk lamp.
S1.4: the transmission management service in the tablet acquires a far-near field device list according to the device search instruction, and sends the far-near field device list to the device management service.
The near-far field device list includes a far-field device list and a near-field device list. The far field devices included in the far field device list refer to registered devices acquired from the cloud server, and the near field devices included in the near field device list refer to devices scanned through near field communication. In the far field device list and the near field device list, the device information includes, but is not limited to, a device name, a device identification, a device type, and the like.
When the transmission management service receives the device search instruction, it performs the relevant device search operations (such as near-field device scanning and retrieval of device information from the cloud server) according to the device type and search mode carried in the instruction, obtains the far-field device list and the near-field device list, and sends both lists to the device management service.
S1.5: and the device management service in the tablet performs device filtering according to the far-near field device list, and reports the device information obtained after the filtering to the education APP.
The device management service filters the devices, determines the information of the desk lamps that can be linked with the tablet, and sends that information to the education APP.
The device management service may perform an intersection operation on the far-field device list and the near-field device list, filter out the table lamps only existing in the far-field device list or only in the near-field device list, and use the table lamps existing in both the far-field device list and the near-field device list as table lamp information capable of being linked with the tablet. Thus, the device management service can filter out the desk lamp which is not registered in the cloud server, and also can filter out the desk lamp which cannot perform near field communication with the tablet.
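The intersection-based filtering described above can be sketched as follows (a hypothetical illustration; the real service operates on richer device records):

```python
def filter_linkable_lamps(far_field, near_field):
    """Keep only devices present in BOTH lists (intersection by device id):
    this drops lamps not registered in the cloud server as well as lamps
    that cannot be reached over near-field communication."""
    near_ids = {dev["id"] for dev in near_field}
    return [dev for dev in far_field if dev["id"] in near_ids]

far_field = [{"id": "lamp-1", "name": "desk lamp A"},
             {"id": "lamp-2", "name": "desk lamp B"}]   # registered in cloud
near_field = [{"id": "lamp-2", "name": "desk lamp B"},
              {"id": "lamp-3", "name": "desk lamp C"}]  # scanned nearby
linkable = filter_linkable_lamps(far_field, near_field)
```

Only "lamp-2" appears in both lists, so it alone is reported to the education APP as linkable.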
In another alternative embodiment, if the tablet and the desk lamp are on the same LAN, the tablet's transmission management service may obtain a communication device list and a registered device list according to the device search instruction. The devices in the communication device list are devices scanned via near-field or far-field communication, and the devices in the registered device list are registered devices obtained from the cloud server. In the communication device list and the registered device list, the device information includes, but is not limited to, the device name, device identification, device type, and the like.
The transmission management service in the tablet sends the communication equipment list and the registration equipment list to the equipment management service, and the equipment management service performs equipment filtering according to the communication equipment list and the registration equipment list and reports the equipment information obtained after filtering to the education APP. The device management service may perform an intersection taking operation on the communication device list and the registration device list, filter out the table lamps only existing in the communication device list or only existing in the registration device list, and use the table lamps existing in both the communication device list and the registration device list as table lamp information capable of being linked with the tablet. Thus, the device management service can filter out the desk lamp which is not registered in the cloud server, and also can filter out the desk lamp which cannot perform near field communication with the tablet.
2. A virtual camera enabled phase comprising the steps of:
S2.1: The tablet's education APP determines the desk lamp to be linked.
S2.2: the device verification and the device connection are carried out on the desk lamp by the tablet education APP, and the desk lamp is obtained to have the capability of supporting the virtualized camera.
S2.3: the tablet's educational APP sends a virtual camera enable request to the hardware virtualization service.
The virtual camera enable request is used to indicate that the virtual camera is registered in the virtual camera HAL. The virtual camera enable request may include, but is not limited to, a request type, a device name, a device identification, a device type, and an identification of the virtual camera.
S2.4: the hardware virtualization service of the tablet registers the virtual camera with the virtual camera HAL.
After receiving the virtual camera enabling request, the hardware virtualization service registers the corresponding virtual camera with the virtual camera HAL according to the virtual camera enabling request.
S2.5: the tablet's virtual camera HAL sends a virtual camera enable success indication to the educational APP after virtual camera registration is complete.
3. Virtual camera preview access phase:
S3.1: The hardware virtualization API in the tablet sends a virtual camera access instruction to the camera service.
Virtual camera access instructions refer to instructions for invoking a virtual camera. The virtual camera access instruction may include, but is not limited to, an instruction type, a virtual camera ID, and a camera configuration parameter.
S3.2: the in-tablet camera service sends an image preview request to the virtual camera HAL according to the virtual camera access instruction.
After receiving the virtual camera access instruction, the camera service generates a corresponding image preview request according to the virtual camera ID and sends the corresponding image preview request to the virtual camera HAL.
Wherein the image preview request is for requesting a preview image data stream. Illustratively, the image preview request may include, but is not limited to, a request identification, a virtual camera ID, camera configuration parameters, and the like.
S3.3: the in-panel virtual camera HAL sends an image preview request to the hardware virtualization service.
After receiving the image preview request, the virtual camera HAL determines the matching virtualized hardware identifier according to the virtual camera ID carried in the request.
In this embodiment, the virtual camera HAL determines the linked desk-lamp camera according to the virtual camera ID and the mapping relationship between virtual camera IDs and desk-lamp cameras, generates a corresponding image preview request according to the determined virtualized hardware identifier, and sends it to the hardware virtualization service.
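The mapping lookup described here can be sketched as follows (identifiers such as "virtual-cam-7" and the field names are invented for illustration):

```python
# Mapping maintained by the virtual camera HAL from the virtual camera ID
# exposed to the camera service to the actual IoT-side camera identifier.
virtual_to_hardware = {
    "virtual-cam-7": {"device": "lamp-001", "hw_id": "lamp-cam-0"},
}

def resolve_preview_target(request):
    """Translate a preview request addressed to a virtual camera into a
    request addressed to the linked desk-lamp camera."""
    target = virtual_to_hardware[request["virtual_camera_id"]]
    return {
        "request_id": request["request_id"],
        "device_info": target["device"],       # desk-lamp information
        "virtualized_hw_id": target["hw_id"],  # desk-lamp camera identifier
        "camera_config": request["camera_config"],
    }

out = resolve_preview_target({
    "request_id": "req-1",
    "virtual_camera_id": "virtual-cam-7",
    "camera_config": {"resolution": "1920x1080"},
})
```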
By way of example, the image preview request may include, but is not limited to, a request identifier, device information (i.e., table lamp information), virtualized hardware identifier (i.e., table lamp camera identifier), camera configuration parameters, and the like.
S3.4: the hardware virtualization service in the tablet sends an image preview request to the transport management service.
The hardware virtualization service sends an image preview request to the transport management service. The image preview request may include, but is not limited to, a request identifier, device information (i.e., table lamp information), a virtualized hardware identifier (i.e., table lamp camera identifier), and a camera configuration parameter.
When the hardware virtualization service in the tablet sends the image preview request to the transmission management service, if it finds that no data channel has been established with the desk lamp, it generates a data channel establishment request and sends it to the transmission management service. The data channel establishment request is used to indicate that data is to be transmitted with the desk lamp. The data channel establishment request may include, but is not limited to, a session identifier, connection information, a data codec mode, and the like.
The transmission management service in the tablet receives the data channel establishment request and establishes a data channel connection with the desk lamp according to the information carried in the request; that is, a data channel is established between the tablet and the desk lamp. Further, the in-tablet transmission management service and the in-lamp interconnection service may transmit various data, including but not limited to image data, over this channel.
After the data channel is successfully established, the transmission management service in the tablet sends a successful connection indication of the data channel to the hardware virtualization service in the tablet, and the interconnection service in the desk lamp sends a successful connection indication of the data channel to the camera component in the hardware abstraction service. The indication of successful connection of the data channel may include, but is not limited to, a connection success identifier and information related to the data channel.
S3.5: the in-plane transmission management service transmits an image preview request to the interconnection service of the desk lamp.
And the transmission management service in the tablet determines a corresponding control channel according to the equipment information carried in the image preview request, and transmits the image preview request to the interconnection service of the desk lamp in the control channel.
S3.6: the interconnection service in the desk lamp sends an image preview request to the camera driver.
After receiving the image preview request, the interconnection service in the desk lamp determines a hardware driver (in this embodiment, determines a camera driver) according to the virtualized hardware identifier, and sends the corresponding image preview request to the camera driver.
S3.7: and the camera in the desk lamp drives the camera to collect images, and the preview image data is transmitted to the hardware virtualization service of the tablet through the data channel.
The camera driver starts the camera and drives it to collect images according to the camera configuration parameters carried in the image preview request, obtaining a preview image data stream. The stream is sent through the hardware abstraction service to the interconnection service, so that the interconnection service continuously transmits the preview image data stream to the tablet's hardware virtualization service over the data channel. The packetization, encoding, and decoding of the preview image data stream are not described here.
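The continuous forwarding of the preview stream through the lamp-side and tablet-side services can be sketched as a simple pipeline (a toy model; packetization and codecs are omitted, as in the text, and all names are illustrative):

```python
def camera_driver_frames(n):
    # The camera driver collects images per the camera configuration and
    # yields a continuous preview image data stream.
    for i in range(n):
        yield f"frame-{i}"

def interconnection_service(frames):
    # Lamp side: the interconnection service forwards the stream over the
    # data channel, frame by frame.
    for f in frames:
        yield f

def hardware_virtualization_service(frames):
    # Tablet side: receive continuously and hand the frames onward to the
    # virtual camera HAL / camera service.
    return list(frames)

preview = hardware_virtualization_service(
    interconnection_service(camera_driver_frames(3)))
```

Because every stage is a generator, frames flow through the chain one at a time rather than being buffered whole, mirroring the continuous transmission described in S3.7 through S3.10.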
S3.8: the in-panel hardware virtualization service sends preview image data to the virtual camera HAL.
The hardware virtualization service continues to receive the preview image data stream and sends the preview image data stream to the virtual camera HAL.
S3.9: the virtual camera HAL in the tablet sends the preview image data to the camera service.
At this time, the virtual camera HAL continuously acquires preview image data acquired by the desk lamp camera, and continuously transmits the preview image data to the camera service.
S3.10: the in-tablet camera service sends preview image data to the educational APP.
S3.11: education APP in the tablet displays the preview image.
4. Virtual camera photographing stage
S4.1: in response to the received user operation, the education APP in the tablet sends a photographing request to the hardware virtualization service.
The user operation may be, for example, an operation of clicking a photographing option.
The photographing request may include, but is not limited to, a photographing image sequence number, device information (i.e., table lamp information), a virtualized hardware identifier (i.e., table lamp camera identifier), a camera configuration parameter, and the like. Camera configuration parameters include, but are not limited to, image resolution.
The photographing request can also carry a task identifier so as to ensure orderly management of multiple photographing tasks.
S4.2: the hardware virtualization service in the tablet sends a photographing request to the transmission management service.
S4.3: the transmission management service in the tablet transmits a photographing request to the interconnection service of the desk lamp.
The transmission management service in the tablet determines the corresponding control channel according to the device information carried in the photographing request, and transmits the photographing request to the interconnection service of the desk lamp over that control channel.
S4.4: the interconnection service in the desk lamp sends a photographing request to the camera driver.
After receiving the photographing request, the interconnection service in the desk lamp determines a hardware driver (in this embodiment, the camera driver) according to the virtualized hardware identifier, and sends the photographing request to the camera driver.
S4.5: the camera in the desk lamp drives the camera to shoot images, and shot image data are transmitted to the hardware virtualization service of the tablet through the data channel.
The camera driver drives the camera to capture images according to the camera configuration parameters carried in the photographing request, obtaining captured image data, and sends the captured image data to the interconnection service through the hardware abstraction service, so that the interconnection service continuously transmits the captured image data to the hardware virtualization service of the tablet over the data channel. The packetization, encoding, and decoding of the captured image data are not described here.
S4.6: the hardware virtualization service in the tablet sends the captured image data to the education APP.
S4.7: education APP in the plate displays the photographed image.
In the above flow, the hardware abstraction service on the IoT device side is, in the scheme of the present application, the hardware virtualization service.
According to the above technical scheme, the center side device can directly be an electronic device such as a mobile phone or a tablet personal computer, and the IoT device and the center side device can be arranged separately. The center side device therefore does not need a thick base with a lifting camera, IoT devices such as the desk lamp can be placed freely, and the screen angle can be adjusted as needed. Moreover, an education application program installed on the center side device has high compatibility and expansibility, is easy to implement and popularize, and can provide great convenience for education.
The above scheme describes the process in which a center side device running one APP uses an IoT device to take a photograph through the virtual camera service, that is, the process in which a single APP requests photographing. The following description takes as an example the case where the center side device runs two APPs and the IoT device receives photographing requests from both. It will be appreciated that the principle is similar when the IoT device receives photographing requests from three or more APPs, which will not be described in detail here.
The terms involved in the method are first explained below.
The device number (device Id) identifies an individual camera. An IoT device may include one or more cameras, each with a different device number; for example, the device Id of camera 1 is 1 and the device Id of camera 2 is 2. When there is only one camera on the IoT device, its device Id is fixed, e.g., 2. The center side device also has a device Id, for example, 1.
The service sequence number (service Id) identifies a virtualized service that the IoT device can provide. For example, a service Id of 1 indicates the virtual camera service, and a service Id of 2 indicates the virtual microphone service.
The following method takes the center side device being a tablet personal computer (tablet for short) and the IoT device being an electronic desk lamp as an example.
Referring to fig. 6a, a second flowchart of a photographing method according to an embodiment of the present application is shown.
The method comprises the following steps:
S601: Sequentially receiving i photographing requests.
i photographing requests are sent by j application programs, each application program sends at least one photographing request, i and j are integers which are larger than or equal to 1, and i is larger than or equal to j.
S602: and inserting the photographing requests into a photographing request queue in sequence according to the receiving sequence of the i photographing requests, wherein the first photographing request of the photographing request queue is the photographing request received first.
S603: and sending an nth photographing request to the Internet of things equipment to acquire photographing data, and deleting the nth photographing request from a photographing request queue when the photographing task of the nth photographing request is completed until no photographing request exists in the photographing request queue, wherein n is 1,2 and … i in sequence.
According to the scheme provided by this embodiment of the application, the photographing tasks are arranged using a photographing request queue, which is executed in first-in first-out order. The photographing tasks in the queue are completed one by one, which avoids the problem that multiple tasks issued to the Internet of things device side at the same time cause the device side to fail or the photo data to be returned incorrectly. Completing photographing tasks one by one means: the photographing request at the head of the queue is sent to the Internet of things device, and after the photo data returned by the device has been received and returned to the correct corresponding application program, the head photographing task is considered complete. The head photographing task is then deleted from the queue, the second task of the original queue becomes the new head task, and the new head task is issued to the Internet of things device side. In summary, by completing photographing tasks one by one, when the IoT device receives photographing requests sent by a plurality of application programs on the electronic device, each photo can be correctly returned to its corresponding application program, improving user experience.
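The first-in first-out handling of S601–S603 can be sketched as follows. This is a minimal Python illustration under assumed names (`PhotoRequestQueue`, `receive`, and `on_photo_returned` are illustrative, not part of the patent):

```python
from collections import deque

class PhotoRequestQueue:
    """FIFO queue of photographing requests (sketch of S601-S603).

    Requests from any number of APPs are enqueued in arrival order;
    only the head request is forwarded to the IoT device, and it is
    deleted once its photo data has been returned to the right APP.
    """

    def __init__(self, send_to_iot_device):
        self._queue = deque()
        self._send = send_to_iot_device  # forwards one request to the IoT device

    def receive(self, request):
        """S601/S602: enqueue a request in the order it was received."""
        self._queue.append(request)
        if len(self._queue) == 1:        # it is the head request: issue it now
            self._send(self._queue[0])

    def on_photo_returned(self, photo_data):
        """S603: complete the head task, then issue the next one."""
        done = self._queue.popleft()     # delete the finished head request
        done.callback(photo_data)        # return the photo to the issuing APP
        if self._queue:                  # new head request, if any, goes out
            self._send(self._queue[0])
        return done
```

With two pending requests, only the first is sent to the device; the second is dispatched automatically after the first photo comes back.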
The following description is made in connection with specific implementations.
See also fig. 6b and fig. 7. Fig. 6b is a flowchart of another photographing method according to an embodiment of the present application;
fig. 7 is a schematic diagram of a photographing method according to an embodiment of the present application.
Specific implementation manners of establishing connection and transmission channels between IoT devices and hub devices may be referred to the description in the above embodiments, and the embodiments of the present application are not described herein again.
The method comprises the following steps:
S5.1: The remote device management object receives the photographing request 10001 of APP1.
In this embodiment of the present application, each APP corresponds to a Callback function of an APP, or simply called Callback (cb), which is used for returning to the main function after being called by the main function.
The callback function of APP1 is cb1, the callback function of APP2 is cb2, and similarly, a mapping relation exists between the callback function and the APP.
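The mapping between callback functions and photographing requests can be illustrated with a small registry. This is a hypothetical Python sketch (`CallbackRegistry` and its method names are assumptions for illustration, not from the patent):

```python
class CallbackRegistry:
    """Maps each photographing request to the callback (cb) of the APP
    that issued it, so photo data can be returned to the right APP."""

    def __init__(self):
        self._callbacks = {}  # (app_name, request_code) -> callback

    def register(self, app_name, request_code, callback):
        """Record the APP's callback when its request is received."""
        self._callbacks[(app_name, request_code)] = callback

    def dispatch(self, app_name, request_code, photo_data):
        """Return photo data via the registered callback, then forget it."""
        cb = self._callbacks.pop((app_name, request_code))
        cb(photo_data)
```

Because the key includes the APP, two APPs may share the same request code (e.g. both start at 10001) without colliding.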
The photographing request sent by each APP corresponds to a request code.
In some embodiments, for a given APP, the request code starts from a first preset value; each time the APP issues a photographing request, the corresponding request code is incremented by 1 until it reaches a second preset value, after which counting restarts cyclically from the first preset value. This embodiment does not specifically limit the first and second preset values. Taking a first preset value of 10001 and a second preset value of 20000 as an example, the request code of the first photographing request sent by the APP is 10001, that of the second is 10002, and so on; when 20000 is reached, the cycle count starts again from 10001.
The request codes of the photographing requests sent by each APP are counted independently and do not affect each other. The first and second preset values corresponding to each APP may be the same or different, which is not specifically limited in this embodiment of the application.
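The cyclic request-code counting described above can be sketched as a one-line helper. The function name and the default bounds (10001/20000, taken from the example values) are illustrative only:

```python
def next_request_code(current, first=10001, last=20000):
    """Per-APP cyclic request code: increments by 1 on each
    photographing request, wrapping back to `first` after `last`."""
    return first if current == last else current + 1
```

Each APP would keep its own counter, since the codes are counted independently per APP.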
S5.2: the remote device object receives the photographing request 10001 of APP 1.
The remote Device object determines, according to the photographing request 10001 of APP1, that the object to be controlled is the remote IoT device, namely the desk lamp Device2.
S5.3: the remote service receives the photographing request 10001 of APP 1.
The remote service determines that the requested service serial number (service Id) is service 1 according to the photographing request 10001 of APP1, that is, the requested service is a virtualized camera service.
S5.4: the remote service checks whether the current camera logical channel exists.
The camera logical channel, i.e., the data transmission channel established between the hub device and the IoT device.
The collaborative work system supports idle channel release: when a camera logical channel established between the center side device and the IoT device has not transmitted data for a long time, the channel is destroyed to release space and reduce resource occupation. Therefore, when a previously established camera logical channel has been destroyed because no data was transmitted for a long time, the channel needs to be re-established.
Therefore, it is necessary to judge whether the current camera logical channel exists. If it does, it does not need to be established again, and the photographing instruction is transmitted directly over that channel. If it does not, S5.5 is performed.
S5.5: the remote service determines that the current camera logical channel does not exist.
S5.6: create a camera logical channel and cache the photo request 10001 of APP 1.
For a specific implementation of creating the camera logic channel, reference may be made to step S3.4 in the above description related to fig. 4, that is, the process of creating the data channel is not repeated herein.
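The check-or-create handling of the camera logical channel in S5.4–S5.6 can be sketched as follows. This is an illustrative Python sketch with assumed names; in the patent the check runs inside the remote service:

```python
class RemoteService:
    """S5.4-S5.6 sketch: reuse the camera logical channel if it still
    exists, otherwise (re)create it before sending the request."""

    def __init__(self, create_channel):
        self._channel = None              # None: never created or idle-destroyed
        self._create_channel = create_channel

    def get_channel(self):
        """Return the current channel, creating it only when absent."""
        if self._channel is None:
            self._channel = self._create_channel()
        return self._channel
```

Setting `_channel` back to `None` models the idle destruction of the channel, after which the next request transparently triggers re-establishment.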
In this embodiment of the present application, a photographing management module is added to the hardware virtualization service of the center side device to manage photographing tasks: each photographing request is filled into a photographing request queue, ensuring a unique photographing order. The photographing request queue is first-in first-out, i.e., earlier photographing requests are processed first and later photographing requests are processed later.
The maximum number of photographing requests allowed to queue, which is pre-allocated for the queue, is not specifically limited in this embodiment of the application.
The contents of the cache may be as shown in Table 1; the request queue numbers and time consumption in Table 1 are only illustrative and do not limit the technical solution of the present application. In other embodiments, the time consumption may be replaced with a specific timestamp, or both may be cached. The timestamp may record the sending time and/or the caching time of the corresponding photographing request, which is not limited here. Adding time-consumption or timestamp information prevents newly cached content from overlapping with cached content corresponding to an old channel after the camera logical channel is destroyed and rebuilt.
Table 1: cache content table 1
Request queue number Status of Time consuming Request encoding
00001 Waiting to send 0ms 10001
In some embodiments, the request queue number starts from a third preset value; each time a photographing request fills a row of the request queue, the corresponding request queue number is incremented by 1 until it reaches a fourth preset value, after which counting restarts cyclically from the third preset value. This embodiment does not specifically limit the third and fourth preset values. Taking a third preset value of 00001 and a fourth preset value of 10000 as an example, the request queue number of the first photographing request in the queue is 00001, that of the second is 00002, and so on; when 10000 is reached, the cycle count starts again from 00001. Unlike the per-APP request codes, the photographing requests sent by all APPs are recorded with a single shared request queue number: photographing request 10001 sent by APP1 is first in the queue, with request queue number 00001; photographing request 10001 sent by APP2 is second, with request queue number 00002.
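The shared, cyclically numbered cache rows of Table 1 can be sketched as follows. This is a hypothetical Python illustration (`RequestCache` is an assumed name; the field names mirror the table columns):

```python
class RequestCache:
    """Cache rows like Table 1: queue number, status, elapsed time,
    request code. Queue numbers are shared by all APPs and wrap
    cyclically from `first` to `last`."""

    def __init__(self, first=1, last=10000):
        self._first, self._last = first, last
        self._next = first
        self.rows = []

    def add(self, request_code):
        """Append one 'waiting to send' row and advance the shared counter."""
        row = {"queue_no": f"{self._next:05d}",
               "status": "waiting to send",
               "elapsed_ms": 0,
               "request_code": request_code}
        self.rows.append(row)
        self._next = self._first if self._next == self._last else self._next + 1
        return row
```

Note that two APPs whose own request codes both start at 10001 still get distinct queue numbers, which is what keeps the shared queue unambiguous.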
S5.7: a creation success message of the camera logical channel is returned to the remote service.
S5.8: the remote service reads the cache state.
The remote service reads the photographing request 10001 sent by APP1 at the head of the photographing request queue.
In some embodiments, after the remote service reads the cache state, the camera data processing may also update the cache state to refresh the time-consumption records in the cache.
The process of establishing the buffer and the camera logic channel when the first photographing request comes is described above, and the flow when other subsequent photographing requests come is continuously described below.
5.9: the remote service sends a photographing request 10001 sent by APP1 to the IoT device side.
To cause the IoT device to perform photographing.
S6.1: the remote device management object receives the photographing request 10001 of APP 2.
The callback function of APP2 is cb2.
S6.2: the remote device object receives the photographing request 10001 of APP 2.
The remote Device object determines, according to the photographing request 10001 of APP2, that the object to be controlled is the remote IoT device, namely the desk lamp Device2.
S6.3: the remote service receives the photographing request 10001 of APP 2.
The remote service determines that the requested service serial number (service Id) is service 1 according to the photographing request 10001 of APP2, that is, the requested service is a virtualized camera service.
That is, the photographing request 10001 of APP2 may be arranged in a photographing request queue with the photographing request 10001 of APP 1.
S6.4: and sending a cache request, and requesting to cache the photographing request 10001 of the APP 2.
S6.5: updating the cache state.
That is, the camera data processing updates the buffer status, and the updated buffer status is referred to table 2.
Table 2: cache content table two
Request queue number Status of Time consuming Request encoding
00001 Waiting for return 500ms 10001
00002 Waiting to send 0ms 10001
S6.6: the cache is returned to the remote service.
At this time, according to the state shown in Table 2, the photographing data has not yet been acquired, so waiting continues.
S7.1: the remote device management object receives the photographing request 10002 of APP 1.
APP1 sends a photographing request again; its request code is incremented by 1 at this time.
S7.2: the remote device object receives the photographing request 10002 of APP 1.
The remote Device object determines, according to the photographing request 10002 of APP1, that the object to be controlled is the remote IoT device, namely the desk lamp Device2.
S7.3: the remote service receives the photographing request 10002 of APP 1.
The remote service determines that the requested service serial number (service Id) is service 1 according to the photographing request 10002 of APP1, that is, the requested service is a virtualized camera service.
S7.4: and sending a cache request, and requesting to cache the photographing request 10002 of the APP 1.
S7.5: updating the cache state.
That is, the camera data processing updates the buffer status, and the updated buffer status is referred to in table 3.
Table 3: cache content table three
Request queue number Status of Time consuming Request encoding
00001 Waiting for return 550ms 10001
00002 Waiting to send 50ms 10001
00003 Waiting to send 0ms 10002
S7.6: ioT devices take photographs to obtain photograph data.
S7.7: the IoT device returns image data to the hub device over the camera logic channel.
S7.8: the image data is returned to the camera data of the center-side device.
At this time, the image data corresponding to the photographing request 10001 of APP1 has been acquired, and the ending process of the photographing request 10001 of APP1 is described below.
S8.1: and returning image data corresponding to the photographing request 10001 of the APP1 to the remote service, and updating the cache.
The updated cache state is shown in table 4.
Table 4: cache content table four
Request queue number Status of Time consuming Request encoding
00002 Waiting to send 350ms 10001
00003 Waiting to send 300ms 10002
That is, the cached content corresponding to the photographing request 10001 of APP1 may be deleted from the request queue at this time, implementing first-in first-out of photographing requests.
S8.2: the remote service returns image data corresponding to the photographing request 10001 of the APP1 to the remote device object.
S8.3: the remote device object returns image data corresponding to the photographing request 10001 of the APP1 to the remote device management object.
S8.4: the remote device management object returns image data corresponding to the photographing request 10001 of the APP1 to the application program.
At this point the photographing task required by the photographing request 10001 of APP1 has been completed.
S8.5: the remote service sends a photographing request 10001 sent by APP2 to the IoT device side.
S8.6: ioT devices take photographs to obtain photograph data.
The completion process of the subsequent photographing task is similar to the above steps, and follows the rule of first in first out of the photographing request queue, and will not be described in detail here.
It will be appreciated that the above steps are merely for convenience of description, and are not limited to the technical solution of the present application, and in practical application, the order of the steps may be appropriately adjusted. For example, the step of "the remote service sends the photographing request 10001 sent by APP2 to the IoT device side" in S8.5 may be performed in advance, for example, after S8.1. For another example, the step of establishing a logical transmission channel starts when the first photographing request in the photographing request queue is received, but the time when the channel establishment is completed may be after receiving a plurality of photographing requests, that is, the channel establishment time is not limited.
Furthermore, since the logical transmission channel can be dynamically created and destroyed, the photographing request queue adopted in the scheme of the present application can also be dynamically created and destroyed. That is, when a photographing request is received and creation of the logical transmission channel begins, the photographing request queue is generated; when the logical transmission channel is destroyed because no data has been transmitted for a long time, there is no photographing task at that point, and the photographing request queue can be deleted to release space and reduce resource occupation.
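The coupled lifecycle of the logical transmission channel and the photographing request queue can be sketched as follows (an illustrative Python sketch; the class and method names are assumptions, not from the patent):

```python
class ChannelBoundQueue:
    """Sketch: the photographing request queue is created together with
    the logical transmission channel and deleted when the idle channel
    is destroyed, releasing its resources."""

    def __init__(self):
        self.channel_open = False
        self.queue = None

    def on_request(self, request_code):
        """A photographing request arrives: create channel and queue
        if they do not exist yet, then enqueue the request."""
        if not self.channel_open:
            self.channel_open = True
            self.queue = []
        self.queue.append(request_code)

    def on_idle_destroy(self):
        """Channel idle for a long time: at this point no photographing
        task remains, so destroy the channel and delete the queue."""
        if self.queue is not None and not self.queue:
            self.channel_open = False
            self.queue = None
```

A subsequent request after idle destruction simply recreates both the channel and the queue, matching the re-establishment flow described above.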
In summary, by using the scheme provided by the embodiment of the present application, when the IoT device receives the photographing request sent by the plurality of applications on the electronic device, the photograph is correctly returned to the corresponding application, thereby improving user experience.
Based on the photographing method provided by the embodiment, the application also provides electronic equipment.
With continued reference to the schematic diagram of the electronic device shown in fig. 2 a.
The electronic device provided by this embodiment of the application comprises: a memory, namely the internal memory 121, and the processor 110.
Wherein an internal memory 121 is coupled to the processor 110.
The internal memory 121 stores program instructions that, when executed by the processor 110, cause the electronic device to perform the photographing method described in the above embodiments.
Specifically, the electronic device sequentially receives i photographing requests, wherein the i photographing requests are sent by j application programs, each application program sends at least one photographing request, i and j are integers greater than or equal to 1, and i is greater than or equal to j.
And when the transmission channel of the photo data is not established between the electronic equipment and the Internet of things equipment, establishing the transmission channel and establishing the photographing queue.
And when the transmission channel is established between the electronic equipment and the Internet of things equipment, the photographing queue is created. And inserting the photographing requests into the photographing request queue in sequence according to the receiving sequence of the i photographing requests, and storing the corresponding relation among photographing request codes, time information and request queue numbers, wherein the photographing request codes are codes of the photographing requests sent by the application program, and the request queue numbers are sequence numbers of the photographing requests in the photographing request queue.
In some embodiments, the photographing request code loops counting between the first preset value and the second preset value.
In some embodiments, the request queue number is counted in cycles between the third preset value and the fourth preset value.
In some embodiments, the time information includes at least one of a photographing task time consumption or a photographing request receiving time.
The electronic equipment sends an nth photographing request to the Internet of things equipment through the transmission channel so as to acquire photographing data;
the electronic equipment returns the photographing data to the application program for sending the nth photographing request according to the corresponding relation between the callback function of the application program and the photographing request code so as to complete the photographing task of the nth photographing request;
The electronic device deletes the corresponding relation among the photographing request code, the time information, and the request queue number of the nth photographing request from the photographing request queue. This repeats until no photographing request remains in the photographing request queue, where n takes the values 1, 2, … i in sequence.
Further, the embodiment of the application also provides a cooperative work system.
Referring to fig. 8, a schematic diagram of a cooperative system according to an embodiment of the present application is provided.
The cooperative work system 30 includes: an electronic device 10 and an IoT device 20.
The electronic device 10 is also referred to as a center-side device. Reference may be made to the related descriptions in the above embodiments regarding specific implementations of the electronic device 10 and the IoT device 20, and the embodiments of the present application are not described herein.
With the scheme provided by the embodiment of the present application, when the IoT device receives photographing requests sent by a plurality of applications on the electronic device, the photos can be correctly returned to the corresponding applications, improving user experience.
The electronic device may be a tablet computer, a mobile phone, a notebook computer or a desktop computer, which is not particularly limited in the embodiments of the present application. The internet of things device may be a desk lamp.
The embodiment also provides a computer storage medium, in which computer instructions are stored, and when the computer instructions run on the internet of things device, the internet of things device is caused to execute the related method steps to implement the photographing method in the above embodiment.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-described related steps to implement the photographing method in the above-described embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer-executable instructions, and when the device is operated, the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the shooting method in each method embodiment.
The IOT device, the computer storage medium, the computer program product, or the chip provided in this embodiment are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects of the corresponding methods provided above, and will not be described herein.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The above embodiments are merely for illustrating the technical solution of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (10)

1. The photographing method is characterized by being applied to electronic equipment, wherein the electronic equipment is used for photographing through a camera of the Internet of things equipment to obtain photo data, and the method comprises the following steps:
sequentially receiving i photographing requests, wherein the i photographing requests are sent by j application programs, each application program sends at least one photographing request, i and j are integers which are larger than or equal to 1, and i is larger than or equal to j;
inserting photographing requests into a photographing request queue in sequence according to the receiving sequence of the i photographing requests, wherein the first photographing request of the photographing request queue is the photographing request received first;
and sending an nth photographing request to the Internet of things equipment to obtain photographing data, deleting the nth photographing request from the photographing request queue when the photographing task of the nth photographing request is completed until no photographing request exists in the photographing request queue, wherein n is 1,2 and … i in sequence.
2. The photographing method according to claim 1, wherein the sequentially inserting the photographing requests into the photographing request queue according to the receiving order of the i photographing requests comprises:
creating the photographing queue;
And inserting the photographing requests into the photographing request queue in sequence according to the receiving sequence of the i photographing requests, and storing the corresponding relation among photographing request codes, time information and request queue numbers, wherein the photographing request codes are codes of the photographing requests sent by the application program, and the request queue numbers are sequence numbers of the photographing requests in the photographing request queue.
3. The photographing method according to claim 2, wherein said creating said photographing queue specifically comprises:
when a transmission channel of the photo data is not established between the electronic equipment and the Internet of things equipment, establishing the transmission channel and establishing the photographing queue;
and when the transmission channel is established between the electronic equipment and the Internet of things equipment, the photographing queue is created.
4. The photographing method of claim 2, wherein the photographing request codes a cycle count between a first preset value and a second preset value.
5. The photographing method of claim 2, wherein the request queue number is counted in a cycle between a third preset value and a fourth preset value.
6. The photographing method of claim 2, wherein the time information comprises:
At least one of a photographing task time consumption or a photographing request receiving time.
7. A photographing method as claimed in claim 3, further comprising:
and deleting the photographing request queue when the transmission channel between the electronic equipment and the Internet of things equipment is destroyed.
8. The photographing method according to claim 3, wherein the sending the nth photographing request to the Internet of Things device to obtain photographing data, and deleting the nth photographing request from the photographing request queue when the photographing task of the nth photographing request is completed, specifically comprises:
sending the nth photographing request to the Internet of Things device through the transmission channel to obtain the photographing data;
returning the photographing data to the application program that sent the nth photographing request according to the correspondence between the callback function of the application program and the photographing request code, so as to complete the photographing task of the nth photographing request;
and deleting, from the photographing request queue, the correspondence among the photographing request code, the time information, and the request queue number of the nth photographing request.
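The three steps of claim 8 — send, route back via the callback keyed by the request code, then delete the queue entry — can be sketched as a simple drain loop. All names here (`process_queue`, `send_over_channel`, the tuple layout) are illustrative assumptions, not the claimed interface:

```python
from collections import deque

def process_queue(queue, callbacks, send_over_channel):
    """Drain the photographing request queue in FIFO order (illustrative).

    `queue` holds (request_code, time_info, queue_number) tuples;
    `callbacks` maps request_code -> the requesting app's callback;
    `send_over_channel` stands in for sending one request to the
    IoT device over the transmission channel and returning photo data.
    """
    while queue:
        request_code, time_info, queue_number = queue[0]
        photo = send_over_channel(request_code)   # send the nth request, obtain data
        callbacks[request_code](photo)            # return data to the app that asked
        queue.popleft()                           # delete the completed entry
```

Because the callback is looked up by request code, photo data from one shared IoT camera is returned to the correct one of several requesting applications, which is the problem the abstract says the method solves.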
9. An electronic device, the electronic device comprising:
a memory and a processor, the memory coupled with the processor;
the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the photographing method according to any one of claims 1 to 8.
10. A collaborative work system, characterized in that the system comprises the electronic device according to claim 9, and further comprises an Internet of Things device.
CN202210865708.8A 2022-07-21 2022-07-21 Photographing method, electronic equipment and collaborative work system Pending CN117499780A (en)

Priority Applications (1)

Application Number: CN202210865708.8A · Priority Date: 2022-07-21 · Filing Date: 2022-07-21 · Title: Photographing method, electronic equipment and collaborative work system

Applications Claiming Priority (1)

Application Number: CN202210865708.8A · Priority Date: 2022-07-21 · Filing Date: 2022-07-21 · Title: Photographing method, electronic equipment and collaborative work system

Publications (1)

Publication Number: CN117499780A · Publication Date: 2024-02-02

Family

ID=89669516

Family Applications (1)

Application Number: CN202210865708.8A (Pending) · Priority Date: 2022-07-21 · Filing Date: 2022-07-21

Country Status (1)

Country Link
CN (1) CN117499780A (en)

Similar Documents

Publication Publication Date Title
WO2020244495A1 (en) Screen projection display method and electronic device
WO2020244492A1 (en) Screen projection display method and electronic device
CN112291764B (en) Content connection system
CN112527174B (en) Information processing method and electronic equipment
JP7369281B2 (en) Device capacity scheduling method and electronic devices
CN112394895A (en) Cross-equipment display method and device of picture and electronic equipment
US12010257B2 (en) Image classification method and electronic device
WO2022078295A1 (en) Device recommendation method and electronic device
CN112130788A (en) Content sharing method and device
WO2022143883A1 (en) Photographing method and system, and electronic device
CN114845035B (en) Distributed shooting method, electronic equipment and medium
WO2022135156A1 (en) Distributed cross-device collaboration method, and electronic device and communication system
WO2022135527A1 (en) Video recording method and electronic device
CN113961157A (en) Display interaction system, display method and equipment
US20230273902A1 (en) File Opening Method and Device
CN116056076B (en) Communication system, method and electronic equipment
WO2023005711A1 (en) Service recommendation method and electronic device
WO2022052706A1 (en) Service sharing method, system and electronic device
CN117499780A (en) Photographing method, electronic equipment and collaborative work system
CN113835802A (en) Device interaction method, system, device and computer readable storage medium
CN117499781A (en) Photographing method, internet of things equipment and collaborative work system
CN116366957B (en) Virtualized camera enabling method, electronic equipment and cooperative work system
WO2024140279A1 (en) File transfer method and electronic device
CN117473465A (en) Compatibility verification method, device and system for device virtualization service
CN117478682A (en) Method, equipment and cooperative work system for establishing point-to-point channel

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination