WO2022166521A1 - Cross-Device Collaborative Shooting Method, Related Apparatus, and System

Cross-device collaborative shooting method, related apparatus, and system (跨设备的协同拍摄方法、相关装置及系统)

Info

Publication number
WO2022166521A1
Authority
WIPO (PCT)
Prior art keywords
image, shooting, slave device, master device, slave
Application number
PCT/CN2022/070618
Other languages
English (en), French (fr)
Inventor
冯可荣
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to CN202280009678.9A (published as CN116724560A)
Publication of WO2022166521A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632: Graphical user interfaces [GUI] specially adapted for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters

Definitions

  • the present application relates to the field of photographing technologies, and in particular, to a cross-device collaborative photographing method, related apparatus, and system.
  • the present application provides a cross-device collaborative shooting method, related apparatus, and system, which solve the problem that the master device cannot control the shooting effect of the slave device during cross-device collaborative shooting.
  • an embodiment of the present application provides a cross-device collaborative shooting method.
  • the method is applied to a master device.
  • the master device has established communication connections with m slave devices, where m is an integer greater than or equal to 1.
  • the method includes: the master device displays an interface of an application; the master device receives first images sent by the slave devices, where a first image is an image collected and processed by a slave device according to first shooting parameters; the master device displays the m first images on the interface; the master device receives at least one operation; in response to the at least one operation, the master device sends a control command carrying second shooting parameters to a slave device, where the second shooting parameters are used to adjust the shooting effect of the slave device; the master device receives a second image sent by the slave device, where the second image is obtained by the slave device according to the second shooting parameters; the master device displays the second image on the interface and no longer displays the first image.
  • In this way, the master device can first obtain an image collected and processed by the slave device according to the first shooting parameters; through this image, the user can find the shooting parameter that needs to be adjusted, and the user can then send a control command for adjusting the shooting effect to the slave device through the master device. After the slave device adjusts the shooting parameters, the master device can display the image that the slave device returns after adjustment according to the control command, so that the user can control the shooting effect of the slave device through the master device.
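  • As a concrete illustration of this flow, the following minimal Java sketch models the master-side steps (display the first image, send a control command carrying second shooting parameters, display the returned second image). Every name here (SlaveLink, ShootingParams, ControlCommand, MasterDevice) is a hypothetical illustration for this sketch, not an API from the patent or any real SDK.

```java
// A minimal, self-contained sketch of the master-side flow described above.
// All names are assumptions made for illustration.
import java.util.Map;

record ShootingParams(Map<String, String> values) {}      // e.g. {"zoom": "2x", "flash": "on"}
record ControlCommand(ShootingParams secondParams) {}     // carries the second shooting parameters
record CapturedImage(byte[] data) {}

interface SlaveLink {                                     // the established communication connection
    CapturedImage receiveImage();                         // blocking receive, kept simple here
    void sendControlCommand(ControlCommand cmd);
}

class MasterDevice {
    private final SlaveLink slave;
    MasterDevice(SlaveLink slave) { this.slave = slave; }

    void collaborativeShoot(ShootingParams adjusted) {
        CapturedImage first = slave.receiveImage();       // taken with the first shooting parameters
        display(first);
        // the user inspects the preview and triggers an adjustment operation:
        slave.sendControlCommand(new ControlCommand(adjusted));
        CapturedImage second = slave.receiveImage();      // taken with the second shooting parameters
        display(second);                                  // replaces the first image on the interface
    }

    private void display(CapturedImage img) { /* render on the application interface */ }
}
```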
  • In some embodiments, the interface of the application displayed by the master device further includes multiple shooting options corresponding to the slave device, and the multiple shooting options respectively correspond to the shooting capabilities of the slave device; the operation includes an operation acting on one of the multiple shooting options, and the second shooting parameters include the shooting parameters corresponding to the shooting option acted on by the operation.
  • the user can directly view the shooting capability of the slave device on the master device, and control the slave device to make corresponding adjustments through user operations on the master device, so as to realize the effect of the master device remotely controlling the shooting of the slave device.
  • the master device may also acquire the shooting capability of the slave device; wherein, the second shooting parameter is within the shooting capability range of the slave device.
  • the master device can acquire the shooting capability of the slave device, so that the master device can display the capability of the slave device to the user for viewing.
  • In some embodiments, the method further includes: the master device collects and processes images to obtain a third image; and the master device further displays the third image on the interface.
  • the master device can also display the images collected and processed by itself.
  • the user can see the images shot by the master device and the slave device on the display screen of the master device at the same time, thereby improving the user experience and meeting the needs of multi-angle and multi-screen shooting.
  • In some embodiments, the interface further includes a plurality of shooting options corresponding to the master device, and these shooting options correspond to the shooting capabilities of the master device; the master device receives another operation acting on one of the shooting options corresponding to the master device, collects and processes images according to the shooting parameters corresponding to the shooting option acted on by that operation to obtain a fourth image, displays the fourth image on the interface, and no longer displays the third image.
  • In some embodiments, before the master device sends the control command carrying the second shooting parameters to the slave device in response to the at least one operation, the method further includes: the master device determines a first quantity and a first type, where the first quantity is the number of image streams required to display the second image, and the first type includes the types of image streams required to display the second image; the master device determines a second quantity and a second type, where the second quantity is less than the first quantity, and the first type includes the second type. The control command also carries the second quantity and the second type, and the second image includes image streams of the second quantity and the second type collected and processed by the slave device according to the second shooting parameters.
  • the master device can reduce the number of image streams requested from the slave device.
  • When the slave device sends image streams to the master device, the number of streams to be sent can therefore be reduced, which reduces the network data transmission load and improves transmission efficiency.
  • In some embodiments, the method further includes: the master device processes the second image into image streams of the first quantity and the first type; and displaying the second image on the interface by the master device includes: the master device displays the second image on the interface according to the image streams of the first quantity and the first type.
  • the master device can replicate more image streams from a smaller number of image streams received.
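  • A hedged sketch of this stream-count reduction and replication, under the simplifying assumption that streams are distinguished only by type (the names and types here are invented for illustration, not taken from the patent):

```java
import java.util.ArrayList;
import java.util.List;

enum StreamType { PREVIEW, RECORD }

class StreamNegotiator {
    // First quantity/type: what the interface needs. Second quantity/type: what is
    // actually requested from the slave. One stream per distinct type suffices,
    // since identical streams can be replicated on the master.
    static List<StreamType> reduce(List<StreamType> required) {
        return required.stream().distinct().toList();
    }

    // Replicate the received streams back up to the originally required layout.
    static List<byte[]> replicate(List<StreamType> required,
                                  List<StreamType> received,
                                  List<byte[]> latestFrames) {
        List<byte[]> out = new ArrayList<>();
        for (StreamType t : required) {
            int i = received.indexOf(t);             // reuse the one received stream of this type
            out.add(latestFrames.get(i).clone());    // one copy per consumer
        }
        return out;
    }
}
```

  • For example, if the interface needs [PREVIEW, PREVIEW, RECORD] (first quantity 3), the master requests only [PREVIEW, RECORD] (second quantity 2) from the slave and copies the preview frames locally.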
  • In some embodiments, the master device runs a photographing application, and the interface is provided by the photographing application.
  • In some embodiments, the master device runs a live broadcast application, and the interface is provided by the live broadcast application; after the master device receives the first image sent by the slave device, the method further includes: the master device sends the first image to the server corresponding to the live broadcast application, and the server sends the first image to a first device.
  • an embodiment of the present application provides a cross-device collaborative shooting method.
  • the method is applied to a slave device, and the slave device establishes a communication connection with the master device.
  • the method includes: the slave device collects and processes images according to first shooting parameters to obtain a first image; the slave device sends the first image to the master device; the slave device receives a control command sent by the master device carrying second shooting parameters, where the second shooting parameters are used to adjust the shooting effect of the slave device; the slave device collects and processes images according to the second shooting parameters to obtain a second image; and the slave device sends the second image to the master device.
  • the slave device can first send the image acquired and processed according to the first shooting parameter to the master device for the master device to use. Then, the slave device can also respond to the control command sent by the master device to adjust the shooting effect. After the slave device adjusts the shooting parameters, the slave device can send the adjusted image to the master device, so that the user can use the image of the slave device on the master device, and can control the shooting effect of the slave device through the master device.
  • In some embodiments, the method further includes: the slave device displays an interface of an application; the slave device displays the first image on the interface; the slave device collects and processes the second image according to the second shooting parameters; and the slave device displays the second image on the interface of the application and no longer displays the first image.
  • the slave device can not only display the image acquired by its own camera according to the first shooting parameter, but also display the adjusted image after responding to the control command sent by the master device to adjust the shooting effect. Therefore, the user of the slave device can view the images captured by the slave device at any time after agreeing to perform collaborative shooting with the master device.
  • an embodiment of the present application provides a cross-device collaborative shooting method, and the method is applied to a communication system including one master device and m slave devices.
  • the method includes: the m slave devices collect and process first images according to first shooting parameters; the m slave devices send the first images to the master device; the master device displays the m first images on an interface; in response to at least one operation, the master device sends a control command carrying second shooting parameters to n slave devices, where the second shooting parameters are used to adjust the shooting effect of the slave devices; the n slave devices respectively collect and process their second images according to the second shooting parameters; the n slave devices send their respective second images to the master device; and the master device replaces, on the above interface, the first image from the i-th slave device with the second image from the i-th slave device, where n is less than or equal to m and 1 ≤ i ≤ n.
  • the master device can establish a connection with multiple slave devices, and send a control command to adjust the shooting effect to the multiple slave devices.
  • Each slave device adjusts the shooting effect according to the shooting parameters carried in the control command, and transmits the image collected and processed after the adjustment back to the master device. Therefore, the user can control the shooting effects of multiple slave devices at the same time through the master device, and view the images captured by multiple slave devices on one master device.
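  • The one-master, m-slave flow can be sketched as follows, reusing the hypothetical types from the earlier master-side sketch (again, illustration only; the patent does not define these interfaces):

```java
import java.util.List;

class MultiSlaveMaster {
    private final List<SlaveLink> slaves;                 // m established connections
    MultiSlaveMaster(List<SlaveLink> slaves) { this.slaves = slaves; }

    // Send the adjustment to n of the m slaves (n <= m) and replace the i-th
    // first image with the i-th second image. The text above allows different
    // second parameters per slave; the same ones are sent to all n here for brevity.
    void adjust(ShootingParams secondParams, int n, CapturedImage[] displayed) {
        for (int i = 0; i < n; i++) {
            slaves.get(i).sendControlCommand(new ControlCommand(secondParams));
        }
        for (int i = 0; i < n; i++) {
            displayed[i] = slaves.get(i).receiveImage();  // swaps out the i-th first image
        }
    }
}
```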
  • the first shooting parameters corresponding to different slave devices are different.
  • the first shooting parameters may be default shooting parameters, or may be shooting parameters carried in a control command sent by the master device that was previously received by the slave device.
  • the second shooting parameters corresponding to different slave devices may be the same or different, and the second shooting parameters corresponding to one slave device depend on the user's operation on the slave device.
  • each slave device corresponds to a first image, and the first image corresponding to the slave device is acquired and processed according to the first shooting parameters corresponding to the slave device.
  • each slave device corresponds to a second image, and the second image corresponding to the slave device is acquired and processed according to the second shooting parameters corresponding to the slave device.
  • In some embodiments, the interface further includes a plurality of shooting options, where the multiple shooting options respectively correspond to the m slave devices, and each shooting option corresponds to the shooting capabilities of a slave device; the operation in the third aspect includes an operation acting on a shooting option, and the second shooting parameters include the shooting parameters corresponding to the shooting option acted on by the operation.
  • the user can directly view the shooting capabilities of multiple slave devices on one master device, and then control the multiple slave devices through the master device to make corresponding adjustments.
  • In some embodiments, before the master device displays the multiple shooting options, the master device may also acquire the shooting capabilities of the m slave devices; the second shooting parameters corresponding to one slave device are within the shooting capability range of that slave device.
  • the master device can acquire the shooting capabilities of multiple slave devices, so that the master device can display the shooting capabilities of each slave device to the user for viewing.
  • In some embodiments, the method further includes: the slave device displays an interface of an application; the slave device displays the first image on the interface; the slave device collects and processes the second image according to the second shooting parameters; and the slave device displays the second image on the interface of the application and no longer displays the first image.
  • the slave device can not only display the image obtained by its own camera according to the first shooting parameter, but also display the adjusted image after responding to the control command sent by the master device to adjust the shooting effect. Therefore, the user of the slave device can view the images captured by the slave device at any time after agreeing to perform collaborative shooting with the master device.
  • In some embodiments, the method further includes: the master device collects and processes images to obtain a third image; and the master device further displays the third image on the interface.
  • the master device can display images collected and processed by itself while displaying images collected and processed by multiple slave devices.
  • the user can see the images shot by the master device and multiple slave devices at the same time on the display screen of the master device, thereby improving the user experience and meeting the needs of multi-angle and multi-screen shooting.
  • In some embodiments, the interface further includes a plurality of shooting options corresponding to the master device, and these shooting options correspond to the shooting capabilities of the master device; the master device receives another operation acting on one of the shooting options corresponding to the master device, collects and processes images according to the shooting parameters corresponding to the shooting option acted on by that operation to obtain a fourth image, displays the fourth image on the interface, and no longer displays the third image.
  • In some embodiments, the method further includes: the master device determines a first quantity and a first type, where the first quantity is the number of image streams required to display the second image, and the first type includes the types of image streams required to display the second image; the master device determines a second quantity and a second type, where the second quantity is less than the first quantity, and the first type includes the second type. The control command also carries the second quantity and the second type, and the second image includes image streams of the second quantity and the second type collected and processed by the slave device according to the second shooting parameters.
  • the master device can reduce the number of image streams requested from the slave device.
  • When the slave device sends image streams to the master device, the number of streams to be sent can therefore be reduced, which reduces the network data transmission load and improves transmission efficiency.
  • In some embodiments, the method further includes: the master device processes the second image into image streams of the first quantity and the first type; and displaying the second image on the interface by the master device includes: the master device displays the second image on the interface according to the image streams of the first quantity and the first type.
  • the master device can replicate more image streams from a smaller number of image streams received.
  • Embodiments of the present application provide an electronic device. The electronic device includes one or more processors and one or more memories; the one or more memories are coupled with the one or more processors and are used to store computer program code, the computer program code comprising computer instructions which, when executed by the one or more processors, cause the electronic device to perform the method described in the first aspect or any one of the embodiments of the first aspect, or the method described in the third aspect or any one of the embodiments of the third aspect.
  • An embodiment of the present application provides an electronic device. The electronic device includes one or more processors and one or more memories; the one or more memories are coupled with the one or more processors and are used to store computer program code, the computer program code comprising computer instructions which, when executed by the one or more processors, cause the electronic device to perform the method described in the second aspect or any one of the embodiments of the second aspect.
  • Embodiments of the present application provide a computer program product including instructions; when the computer program product runs on an electronic device, the electronic device is made to execute the method described in the first aspect or any one of the implementation manners of the first aspect, or the method described in the third aspect or any one of the embodiments of the third aspect.
  • An embodiment of the present application provides a computer-readable storage medium including instructions; when the instructions are executed on an electronic device, the electronic device is made to execute the method described in the first aspect or any one of the implementation manners of the first aspect, or the method described in the third aspect or any one of the embodiments of the third aspect.
  • An embodiment of the present application provides a computer program product containing instructions; when the computer program product runs on an electronic device, the electronic device is made to execute the method described in the second aspect or any one of the implementation manners of the second aspect.
  • An embodiment of the present application provides a computer-readable storage medium including instructions; when the instructions are executed on an electronic device, the electronic device is made to execute the method described in the second aspect or any one of the implementation manners of the second aspect.
  • an embodiment of the present application provides a communication system, where the communication system includes: a master device and a slave device.
  • The master device is configured to execute the method described in the first aspect or any embodiment of the first aspect, or the method described in the third aspect or any embodiment of the third aspect; the slave device is configured to execute the method described in the second aspect or any one of the embodiments of the second aspect.
  • the user can connect one or more slave devices through the master device, which can not only provide the user with a multi-view shooting experience, but also control the shooting effect of the slave devices, so as to satisfy the user's need to control the effect of remote shooting.
  • Implementing the cross-device collaborative shooting method can also enable distributed control among devices running operating systems, extend the functions of an electronic device to other common camera hardware, and flexibly extend the available lenses.
  • FIGS. 1A-1B are schematic diagrams of two types of cross-device collaborative shooting provided by an embodiment of the present application.
  • FIG. 2A is a system structure diagram provided by an embodiment of the present application.
  • FIG. 2B is a schematic diagram of a hardware structure of an electronic device 400 provided by an embodiment of the present application.
  • FIG. 3 is a software structural framework of the master device 100 provided by an embodiment of the present application.
  • Fig. 4 is the software structural framework of the slave device 200 provided by the embodiment of the present application.
  • FIG. 5 is a schematic diagram of a service scenario provided by an embodiment of the present application.
  • FIGS. 6A-6D, 7A-7B, 8A-8B, 9A-9D, 10A-10C, and 11A-11D are schematic diagrams of some user interfaces provided by embodiments of the present application.
  • FIG. 13 is a schematic diagram of a dynamic pipeline processing principle provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of a multiplexing and stream splitting processing principle provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of a frame synchronization processing principle provided by an embodiment of the present application.
  • the term "user interface (UI)" in the description, claims and drawings of this application is a medium interface for interaction and information exchange between an application program or an operating system and a user, and it realizes the internal form of information Conversion to and from user-acceptable forms.
  • the user interface of an application is source code written in a specific computer language, such as Java or extensible markup language (XML).
  • the interface source code is parsed and rendered on the terminal device, and finally presented as content that the user can recognize.
  • Controls, also known as widgets, are the basic elements of the user interface. Typical controls include toolbars, menu bars, text boxes, buttons, scroll bars, pictures, and text.
  • the attributes and content of controls in the interface are defined by tags or nodes.
  • XML specifies the controls contained in the interface through nodes such as <Textview>, <ImgView>, and <VideoView>.
  • a node corresponds to a control or property in the interface, and the node is rendered as user-visible content after parsing and rendering.
  • Applications, such as hybrid applications, often contain web pages in their interfaces.
  • A web page, also known as a page, can be understood as a special control embedded in an application interface.
  • A web page is source code written in a specific computer language, such as hypertext markup language (HTML), cascading style sheets (CSS), and JavaScript (JS).
  • the specific content contained in a web page is also defined by tags or nodes in the source code of the web page.
  • HTML defines the elements and attributes of a web page through tags such as <p>, <img>, <video>, and <canvas>.
  • A GUI refers to a user interface, related to computer operations, that is displayed graphically. It can be an icon, a window, a control, or another interface element displayed on the display screen of the electronic device, where controls can include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • FIG. 1A exemplarily shows a process of cross-device collaborative shooting.
  • a cross-device collaborative shot involves a master device and a slave device.
  • the master device can discover the slave device, and then the slave device can register with the master device.
  • the master device can send commands to the slave device through the wireless network, such as a command to turn on the camera; the slave device can start the camera to capture images in response to the command, compress the captured images, and send them to the master device.
  • the master device can display the images captured by the slave device and the images captured by the camera of the master device on the preview interface, so as to realize collaborative shooting across devices.
  • In the process shown in FIG. 1A, however, the master device cannot obtain the capability information of the slave device, so it cannot control or adjust the picture shot by the slave device; for example, it cannot issue control commands, such as zoom or flash, to adjust the shooting effect of the slave device.
  • FIG. 1B shows a cross-device collaborative shooting scene involved in the instant messaging process.
  • Device A and device B can make a video call through an instant messaging server, such as the video call function provided by WeChat.
  • device A may capture an image through a camera, and then send the image to device B through an instant messaging server.
  • Device B can directly display the image sent by device A on the video call interface, and can also display the image collected by device B itself, so as to realize cross-device collaborative shooting.
  • the method shown in FIG. 1B is similar to that in FIG. 1A.
  • Device A cannot obtain the capability information of device B, so it cannot control or adjust the picture shot by device B; for example, it cannot issue control commands, such as zoom or flash, to adjust the shooting effect of device B.
  • In addition, because device A and device B communicate through the network, a large delay is introduced in the process of cross-device collaborative shooting, which affects the user experience.
  • the cross-device collaborative shooting method involves a master device and a slave device.
  • In the process of collaborative shooting between the master device and the slave device, the master device can receive a user's control operation directed at the slave device; the slave device can adjust the shooting effect in response to the control operation, and then send the adjusted image to the master device.
  • the master device can render the image returned from the slave device.
  • The master device can also display images captured and processed by itself at the same time.
  • In response to user operations, the master device can also perform processing such as photographing, video recording, and forwarding on the displayed images.
  • the master device can not only provide the user with a multi-view shooting experience, but also control the shooting effect of the slave device, so as to meet the user's requirement of controlling the remote shooting effect.
  • The master device can also perform processing such as previewing, photographing, video recording, forwarding, and editing on the respective pictures of the slave device and the master device.
  • Implementing the cross-device collaborative shooting method can meet the control requirements of various cross-device collaborative shooting scenarios, such as a mobile phone controlling the shooting effect of a TV, a watch controlling the shooting effect of a mobile phone, a mobile phone controlling the shooting effect of a tablet computer, and so on.
  • the number of master devices is one.
  • the number of slave devices is not limited, and there can be one or more.
  • the image finally presented by the master device may include images from multiple devices, for example, the preview screen finally displayed by the master device includes: the image of the master device and images returned from multiple slave devices.
  • The collaborative shooting mentioned in the following embodiments of this application means that the master device and the slave device establish a communication connection, both the master device and the slave device use a camera to shoot and process images, the slave device sends the captured image to the master device based on the communication connection, and the master device displays the captured image of the slave device.
  • the master device may also simultaneously display images collected and processed by itself.
  • the communication connection between the master device and the slave device may be a wired connection or a wireless connection.
  • The wireless connection may be a short-range connection such as a wireless fidelity (Wi-Fi) connection, a Bluetooth connection, an infrared connection, an NFC connection, or a ZigBee connection, or a long-distance connection; long-distance connections include but are not limited to mobile networks supporting 2G, 3G, 4G, 5G, and subsequent standard protocols.
  • the master device and the slave device can log in to the same user account (such as a Huawei account), and then connect remotely through a server (such as a multi-device collaborative shooting server provided by Huawei).
  • the adjustment of the shooting effect involved in the following embodiments of the present application refers to the adjustment of the shooting parameters of the electronic device.
  • The shooting parameters of the electronic device include: hardware parameters of the camera involved in capturing images, and/or software parameters involved in processing images.
  • The shooting parameters also include some combined parameters of hardware parameters and software parameters, such as hybrid zoom range, night mode, portrait mode, time-lapse, slow motion, panorama mode, HDR, and more.
  • Hardware parameters include one or more of the following: number of cameras, camera type, optical zoom value, optical image stabilization, aperture size, flash, fill light, shutter time, ISO sensitivity, pixels, and video frame rate, etc.
  • The types of cameras may include but are not limited to ordinary cameras, wide-angle cameras, and ultra-wide-angle cameras; the optical zoom value may be 1x zoom, 2x zoom, or 5x zoom; the aperture size may be f/1.8, f/1.9, or f/3.4; the shutter time may be 1/40, 1/60, 1/200, etc.
  • Software parameters include one or more of the following: digital zoom value, image crop size, image color temperature calibration mode, whether to denoise the image, beauty/body beauty type, filter type, sticker option, whether to enable selfie mirroring, and so on.
  • the digital zoom value can be 10x zoom, 15x zoom
  • the image crop size can be 3:3, 3:4, 9:16
  • the color temperature calibration mode can be daylight, fluorescent, incandescent, shadow, cloudy calibration mode
  • Beauty/body types can be face-lifting, slimming, microdermabrasion, whitening, big eyes, acne removal, etc.
  • filter types can be Japanese, texture, bright, soft light, cyberpunk, etc.
  • sticker options can include stickers of expressions, animals, landscapes, illustrations, and more.
  • When the electronic device responds to a specific shooting mode or activates a specific algorithm, it adjusts the shooting parameters accordingly. For example, when the camera uses the face mode function, the electronic device can reduce the focal length parameter among the shooting parameters, increase the aperture, turn on the fill light, and apply a default beauty algorithm at the same time.
  • the parameter ranges of the shooting parameters and the processing parameters can be determined according to the shooting capability.
  • the default capture parameters indicate the parameters used by the master and slave devices when enabling the camera.
  • the default shooting parameters may be parameters preset by the camera at the factory, or may be parameters used by the user when the camera was used last time.
  • the above parameters include multiple hardware parameters used by the camera to capture images, and multiple software parameters used by the image processing module to process images.
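  • To make the hardware/software split above concrete, the following sketch groups the listed parameters into illustrative record types (the field names, types, and grouping are assumptions for this sketch, not the patent's data model):

```java
import java.util.List;

record HardwareParams(
        int cameraCount, String cameraType, double opticalZoom,
        boolean opticalStabilization, double aperture, boolean flash,
        boolean fillLight, double shutterSeconds, int iso, int videoFps) {}

record SoftwareParams(
        double digitalZoom, String cropRatio, String colorTempMode,
        boolean denoise, String beautyType, String filterType,
        List<String> stickers, boolean selfieMirror) {}

// Combined modes (night mode, portrait mode, HDR, ...) correspond to a
// coordinated choice of hardware and software parameters, as in the face-mode
// example above (shorter focal length + larger aperture + fill light + beauty).
record CombinedMode(String name, HardwareParams hw, SoftwareParams sw) {}
```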
  • FIG. 2A exemplarily shows the structure of the system 10 .
  • the system 10 includes: a master device 100 and a slave device 200 .
  • the number of slave devices 200 may be one or more, and FIG. 2A takes one slave device 200 as an example for description.
  • Both the master device 100 and the slave device 200 are electronic devices equipped with cameras. This embodiment of the present application does not limit the number of cameras that the master device 100 and the slave device 200 have.
  • slave device 200 may be configured with five cameras (2 front cameras and 3 rear cameras).
  • Electronic devices include but are not limited to smart phones, tablet computers, personal digital assistants (PDAs), wearable electronic devices with wireless communication functions (such as smart watches, smart glasses), augmented reality (AR) devices , virtual reality (virtual reality, VR) equipment, etc.
  • Exemplary embodiments of electronic devices include, but are not limited to, portable electronic devices running Linux or other operating systems.
  • The above-mentioned electronic device may also be another portable electronic device, such as a laptop computer. It should also be understood that, in some other embodiments, the above-mentioned electronic device may not be a portable electronic device but, for example, a desktop computer.
  • a communication connection is established between the master device 100 and the slave device 200, and the communication connection may be a wired connection or a wireless connection.
  • the wireless connection may be a proximity connection such as a high-fidelity wireless communication (Wi-Fi) connection, a Bluetooth connection, an infrared connection, an NFC connection, a ZigBee connection, or the like.
  • the master device 100 may directly send a control command to adjust the shooting effect to the slave device 200 through the short-range connection.
  • The slave device 200 may respond to the above-mentioned control command issued by the master device 100 and transmit the adjusted image back to the master device 100.
  • The master device 100 may display the image transmitted back by the slave device 200.
  • The master device 100 can also use the above-mentioned images to complete tasks such as video recording, photographing, and forwarding.
  • the master device 100 sends a control command to adjust the shooting effect to the slave device 200, and the slave device 200 adjusts the image according to the control command.
  • the wireless connection may also be a long-distance connection, and the long-distance connection includes but is not limited to mobile networks supporting 2G, 3G, 4G, 5G and subsequent standard protocols.
  • The system 10 shown in FIG. 2A may also include a server 300; the master device and the slave device may log in to the same user account (for example, a Huawei account) and then establish a long-distance connection through the server 300 (for example, a multi-device collaborative shooting server provided by Huawei).
  • the server 300 may be used for data transmission between the master device 100 and the slave device 200 . That is, the master device 100 can send a control command to the slave device 200 through the server 300 . Likewise, the slave device 200 may send an image to the master device 100 through the server 300 .
  • FIG. 2B is a schematic structural diagram of an electronic device 400 provided by an embodiment of the present application.
  • the electronic device 400 may be the master device 100 or the slave device 200 in the system 10 shown in FIG. 2A .
  • the electronic device 400 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • The structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the electronic device 400.
  • The electronic device 400 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the command operation code and the timing signal, and complete the control of acquiring and executing the command.
  • a memory may also be provided in the processor 110 for storing commands and data.
  • the memory in processor 110 is cache memory.
  • the memory may hold commands or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the command or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the wireless communication function of the electronic device 400 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 400 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed into a diversity antenna of the wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the electronic device 400 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • LNA low noise amplifier
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs a sound signal through an audio output device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 400, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technologies.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signal, and sends the processed signal to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2.
  • the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, and the like.
  • the antenna 1 of the electronic device 400 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 400 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), 5G and subsequent standard protocols, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 400 realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program commands to generate or change display information.
  • Display screen 194 is used to display images, videos, and the like. Display screen 194 includes a display panel. In some embodiments, the electronic device 400 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 400 can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used to process the data fed back by the camera 193 .
  • When shooting, the shutter is opened, and light is transmitted through the lens to the camera photosensitive element; the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • An optical image of an object is generated through the lens and projected onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the electronic device 400 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 400 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and the like.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 400 may support one or more video codecs.
  • the electronic device 400 can play or record videos in various encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • The touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the mobile communication module 150 and the wireless communication module 160 may be used to provide communication services for the main device 100 .
  • the master device 100 may establish a communication connection with other electronic devices having the camera 193 (ie, the slave device 200 ) through the mobile communication module 150 or the wireless communication module 160 .
  • the master device 100 can send a control command to the slave device 200 and receive images transmitted back from the slave device 200 .
  • The ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, etc. provide the master device 100 with the function of capturing and displaying images.
  • When the master device 100 turns on the camera 193, the master device 100 can obtain the optical image captured by the camera 193 and convert the optical signal into an electrical signal through the ISP.
  • the ISP can also adjust the shooting parameters such as exposure and color temperature of the shooting scene, and optimize the image processing parameters for the noise, brightness, and skin color of the image.
  • video codecs can be used to compress or decompress digital video.
  • The master device 100 may encode the files shot collaboratively across the devices into video files in multiple formats through a video codec.
  • The master device 100 can realize the display function. Specifically, the electronic device may display the images captured by the camera 193 through the display screen 194. In addition, the images sent by the slave device 200 and received by the master device 100 can also be displayed through the display screen 194 and other components. In some embodiments, the display screen 194 may display only the images sent by the slave device 200. Meanwhile, through the touch sensor 180K, that is, the "touch panel", the master device 100 can respond to user operations acting on user interface controls.
  • the mobile communication module 150 and the wireless communication module 160 may be used to provide communication services for the slave device 200 .
  • the slave device 200 may establish a communication connection with the master device 100 through the mobile communication module 150 or the wireless communication module 160 .
  • the slave device 200 receives the control command sent by the master device 100 for controlling the shooting effect, can capture an image in response to the control command, and send the image captured and processed by the camera 193 to the master device 100 .
  • the ISP, the camera 193, and the video codec can provide the slave device 200 with the function of capturing and sending images.
  • the video codec may be used to compress or decompress digital video when sending images from the device 200 to the master device 100 .
  • the slave device 200 can encode the shot file that is collaboratively shot across the devices into an image stream through a video codec, and then send it to the master device 100 .
  • the slave device 200 may also display images captured by its own camera 193 .
  • the display function can be implemented for the slave device 200 through the GPU, the display screen 194, the application processor, and the like.
  • the slave device 200 can respond to user operations on the user interface controls.
  • the software systems of the master device 100 and the slave device 200 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present invention take an Android system with a layered architecture as an example to exemplarily describe the software structures of the master device 100 and the slave device 200 .
  • It should be noted that the solution of the present application can also be implemented on other operating systems (e.g., the Hongmeng system, the Linux system, etc.).
  • FIG. 3 is a block diagram of the software structure of the master device 100 according to the embodiment of the present invention.
  • the software structural block diagram of the master device 100 may include an application layer, a framework layer, a service layer, and a hardware abstraction layer (HAL).
  • the framework layer may further include a device virtualization kit (device virtual kit, DVKit) and a device virtualization platform (distributed mobile sensing development platform, DMSDP).
  • DVKit is a software development kit (SDK). DVKit can provide capability interfaces to the application layer; through these interfaces, the application layer can invoke the services and capabilities provided in DVKit, such as discovering slave devices.
  • DMSDP is a framework-layer service. When DVKit initiates a connection to the slave device, DVKit can start the DMSDP service, and DMSDP then carries the control session and the data session in the process of connecting the slave device.
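  • A purely hypothetical illustration of how an application might drive this discovery-and-connect flow; these interfaces are invented for the sketch and are not the real DVKit or DMSDP APIs:

```java
import java.util.function.Consumer;

interface DeviceDiscovery {                 // stands in for the DVKit capability interface
    void discoverSlaves(Consumer<String> onFound);
    void connect(String slaveId);           // assumed to start the DMSDP service internally
}

class CollaborationSetup {
    private final DeviceDiscovery dvkit;
    CollaborationSetup(DeviceDiscovery dvkit) { this.dvkit = dvkit; }

    void start() {
        // discover slave devices, then connect; the connection carries the
        // control session and data session described above
        dvkit.discoverSlaves(dvkit::connect);
    }
}
```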
  • the application layer can include a series of application packages. For example, it can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, SMS, etc.
  • the application layer includes various types of applications that use cameras, such as a camera application, a live broadcast application, a video call application, and the like.
  • A video calling application refers to an application that supports both voice calls and video calls, such as the instant messaging application WeChat (not shown in Figure 3).
  • Camera applications can include native camera applications and third-party camera applications.
  • the application layer can request the ability to use the camera from the framework layer.
  • the framework layer provides an application programming interface (API) and a programming framework for applications in the application layer.
  • the framework layer includes some predefined functions.
  • the framework layer may include a camera kit (CameraKit) and a camera interface (Camera API).
  • the camera kit may include a mode management module, and the mode management module may be used to adjust the shooting mode used when the master device 100 runs various types of applications that use the camera.
  • the shooting mode may include, but is not limited to, a preview mode, a photographing mode, a video recording mode, a cross-device mode, and the like.
  • the device virtualization kit can be used to discover the slave device 200 .
  • These shooting modes can be implemented by calling the camera interface (Camera API).
  • the camera interface can include two parts: camera management (CameraManager) and camera device (CameraDevice).
  • the camera management may be used to manage the shooting capability of the master device 100 and the shooting capability of the slave device 200 connected to the master device 100 .
  • the shooting capability of the device may include the hardware capability of the camera and the software capability of image processing software such as ISP/GPU.
  • Hardware capability refers to the adjustable capabilities of the camera itself.
  • Software capability refers to the capability of image processing modules such as the ISP and GPU to process the image converted from the electrical signal.
  • Shooting capability also includes capabilities that combine hardware capability and software capability at the same time, such as hybrid zoom range, night mode, portrait mode, time-lapse, slow motion, panorama mode, and more. Taking portrait mode as an example: the master device 100 (or the slave device 200) can adjust the focal length of the camera while applying a beauty algorithm.
  • Hardware capabilities include one or more of the following: number of cameras, camera type, optical zoom range, optical image stabilization, aperture adjustment range, flash, fill light, shutter time, ISO sensitivity, pixels, video frame rate, and the like.
  • the types of cameras may include, but are not limited to, ordinary cameras, wide-angle cameras, ultra-wide-angle cameras, and the like.
  • the optical zoom range can be 1x-5x zoom; the aperture size can be f/1.8-f/17; the shutter time can be 1/40, 1/60, 1/200 and so on.
  • Software capabilities include one or more of the following: digital zoom range, supported image cropping specifications, supported image color temperature calibration methods, supported image noise reduction, supported beauty/body-shaping types, supported filter types, supported stickers, and selfie mirroring.
  • For example, the digital zoom range can be 10x-15x zoom; the image cropping specifications can be 3:3, 3:4, or 9:16; the color temperature calibration modes can include daylight, fluorescent, incandescent, shadow, and cloudy; the beauty algorithms can include face-lifting, slimming, skin smoothing, whitening, eye enlargement, and blemish removal, alongside body-shaping algorithms; the filter algorithms can include Japanese-style, texture, bright, soft light, and cyberpunk; and the stickers can include expressions, animals, landscapes, illustrations, and the like.
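  • As an illustration of the capability split above, the following Java sketch models a device's shooting capability; the class and field names are assumptions rather than the patent's actual data structures, and the example values follow the ranges just listed.

```java
import java.util.List;

// Illustrative model of a device's shooting capability (cf. Table 1); names are assumptions.
final class ShootingCapability {
    // Hardware capabilities of the camera itself
    int cameraCount;
    List<String> cameraTypes;          // e.g. "ordinary", "wide-angle", "ultra-wide-angle"
    float minOpticalZoom = 1f, maxOpticalZoom = 5f;
    float minAperture = 1.8f, maxAperture = 17f;   // f-numbers
    boolean hasFlash;

    // Software capabilities of ISP/GPU image processing
    float maxDigitalZoom = 15f;
    List<String> cropSpecifications;   // e.g. "3:4", "9:16"
    List<String> whiteBalanceModes;    // e.g. "daylight", "incandescent", "cloudy"
    List<String> filters;              // e.g. "soft light", "cyberpunk"

    // Combined capabilities that require hardware and software together
    List<String> shootingModes;        // e.g. "night mode", "portrait mode", "panorama"
}
```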
  • Table 1 exemplarily shows the respective shooting capabilities of the master device 100 and the slave device 200 .
  • the camera management can record the numbers of the cameras of the master device 100 and the cameras of the slave device 200 .
  • the three cameras of the main device 100 may be numbered 1, 2, and 3, respectively.
  • the three cameras of the slave device 200 may be numbered as 1001, 1002, and 1003, respectively. This number is used to uniquely identify the camera.
  • the numbering of the cameras of the slave device 200 can be completed through the virtual camera HAL of the master device 100 .
  • the specific numbering rules refer to the subsequent introduction to the virtual HAL, which will not be repeated here.
  • the above-mentioned hardware capability and software capability may further include other capabilities beyond those listed; this is not limited in this embodiment of the present application.
  • the camera device can be used to respond to each application in the application layer and forward the flow control commands to the service layer for further processing.
  • a stream in this embodiment of the present application refers to a group of data sequences that arrive in order, in large quantities, quickly and continuously. In general, a data stream can be viewed as a dynamic collection of data that grows infinitely over time.
  • the stream of the present application is an image stream composed of frame-by-frame images.
  • the flow control commands are created by each application in the application layer and sent to the rest of the modules below the application layer.
  • the camera device can also cooperate with the device virtualization kit (DVKit) and the device virtualization platform (DMSDP) to establish a communication connection between the master device 100 and the slave device 200, and bind the slave device 200 to the master device 100.
  • The device virtualization kit (DVKit) can be used to discover the slave device 200.
  • The device virtualization platform (DMSDP) can be used to establish a session channel with the discovered slave device 200.
  • DMSDP may include a control session and a data session.
  • the control session is used to transmit control commands between the master device 100 and the slave device 200 (for example, a request to use the camera of the slave device 200, a photographing command, a video recording command, a control command for adjusting the shooting effect, etc.).
  • the data session is used to transmit streams returned by the slave device 200, such as preview streams, photo streams, video streams, and the like.
  • the service layer may include modules such as CameraService, CameraDeviceClient, Camera3Device, CameraProviderManager, device collaborative management, dynamic pipeline and stream processing.
  • CameraService provides the services that implement the interface functions of the camera interface (Camera API).
  • CameraDeviceClient is an instantiation of a camera; one CameraDeviceClient corresponds to one camera.
  • CameraService includes functions to create a CameraDeviceClient, such as connectDevice().
  • CameraService calls such a function to create the CameraDeviceClient instance corresponding to the camera.
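  • connectDevice() is internal to the Android camera framework; an application reaches it indirectly through the public android.hardware.camera2 API, as in the sketch below. The camera ID "1001" standing for a slave-device camera registered by the virtual camera HAL is an assumption that follows the numbering rule described in this document.

```java
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.os.Handler;

final class OpenVirtualCamera {
    // Opening a camera by ID goes through CameraService, which internally calls
    // connectDevice() to create the CameraDeviceClient for that camera.
    static void open(Context context, Handler handler) throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        // "1001" stands for a slave-device camera; local cameras would use low IDs such as "1".
        manager.openCamera("1001", new CameraDevice.StateCallback() {
            @Override public void onOpened(CameraDevice camera) { /* ready to create sessions */ }
            @Override public void onDisconnected(CameraDevice camera) { camera.close(); }
            @Override public void onError(CameraDevice camera, int error) { camera.close(); }
        }, handler);
    }
}
```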
  • Camera3Device can be used to manage the life cycle of various types of streams, including but not limited to creating, stopping, clearing, destroying stream information, etc.
  • the CameraProviderManager can be used to obtain virtual camera HAL information, where the virtual camera HAL information includes the shooting capability of the slave device 200 .
  • Refer to Table 1 for a detailed description of the shooting capability of the slave device 200.
  • the device collaborative management can be used to control the time delay between the respective pictures of the master device 100 and the slave device 200.
  • When the master device 100 and the slave device 200 perform cross-device collaborative shooting, it takes a certain amount of time for the slave device 200 to send the captured image to the master device 100, so the two preview images displayed by the master device 100 may not correspond to the same moment, i.e., there is a time delay between them.
  • To compensate, the device collaborative management module can add a buffer, for example, duplicating some frames of the images captured by the master device 100, so that the two displayed previews correspond to similar collection times; the resulting time delay can thereby be kept within the range of the user's visual perception and will not affect the user experience.
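  • A minimal sketch of this buffering idea, assuming simple per-frame timestamps: the master keeps a short history of local frames and pairs each remote frame with the local frame whose capture time is closest. All type and method names are illustrative.

```java
import java.util.ArrayDeque;

// Illustrative frame-alignment buffer for the device collaborative management module.
final class FrameAligner {
    static final class Frame {
        final long timestampMs;
        final byte[] pixels;
        Frame(long timestampMs, byte[] pixels) { this.timestampMs = timestampMs; this.pixels = pixels; }
    }

    private static final int CAPACITY = 30;   // about one second of history at 30 fps
    private final ArrayDeque<Frame> localBuffer = new ArrayDeque<>();

    void onLocalFrame(Frame f) {
        if (localBuffer.size() == CAPACITY) localBuffer.removeFirst();
        localBuffer.addLast(f);
    }

    // Pick the buffered local frame captured closest in time to the remote frame, so the
    // two previews shown together were collected at a similar point in time.
    Frame pairWithRemote(Frame remote) {
        Frame best = null;
        long bestDelta = Long.MAX_VALUE;
        for (Frame f : localBuffer) {
            long delta = Math.abs(f.timestampMs - remote.timestampMs);
            if (delta < bestDelta) { bestDelta = delta; best = f; }
        }
        return best;
    }
}
```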
  • the dynamic pipeline can be used to generate a pending command queue in response to the stream creation command issued by the camera device (CameraDevice). Specifically, the dynamic pipeline can generate one or more command queues to be processed according to the type of the flow creation command and the device it acts on.
  • the type of the stream creation command may include, for example, a preview command, a video recording command, a photographing command, and the like.
  • the devices to which the stream creation command acts may include the master device 100, the slave device 200, and may even be further refined to a specific camera of the device. The specific work flow of the dynamic pipeline will be described in detail in the subsequent method embodiments, which will not be repeated here.
  • the dynamic pipeline can add, in each command in the generated pending command queue, the identification (ID) or tag (tag) that the command acts on the device.
  • the dynamic pipeline can also add a request tag of a single frame or a continuous frame to each command in the generated pending command queue to indicate the type of the command.
  • For example, the photographing command is a single-frame command, while the preview command or the video recording command is a continuous-frame command, as sketched below.
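  • The patent does not show the command format, so the following Java sketch assumes a simple structure just to illustrate the two tags the dynamic pipeline attaches to each queued command: the device the command acts on and whether it is a single-frame or continuous-frame request.

```java
// Illustrative shape of a command in the pending command queue; all names are assumptions.
final class PipelineCommand {
    enum FrameType { SINGLE, CONTINUOUS }

    final String payload;       // e.g. "create preview stream"
    final int deviceCameraId;   // which camera the command acts on (1.. local, 1001.. slave)
    final FrameType frameType;  // single-frame vs. continuous-frame request tag

    PipelineCommand(String payload, int deviceCameraId, FrameType frameType) {
        this.payload = payload;
        this.deviceCameraId = deviceCameraId;
        this.frameType = frameType;
    }

    static PipelineCommand photograph(int cameraId) {
        return new PipelineCommand("take photo", cameraId, FrameType.SINGLE);
    }

    static PipelineCommand preview(int cameraId) {
        return new PipelineCommand("preview", cameraId, FrameType.CONTINUOUS);
    }
}
```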
  • the dynamic pipeline can also be used to distribute the commands in the pending command queue that act on the slave device 200 to the stream processing module, and to distribute the commands that act on the master device 100 to the local camera HAL module of the HAL layer.
  • the dynamic pipeline can also refresh or add, in the commands of the pending command queue, various parameters used to control the shooting effect, such as zoom value, beauty algorithm parameters (e.g., skin-smoothing level, whitening level), filters, color temperature, exposure, and the like.
  • Stream processing may include: life cycle management module, pre-processing, post-processing, frame synchronization and other modules.
  • the lifecycle management module can be used to monitor the entire lifecycle of a flow.
  • the lifecycle management can record the information of the stream, such as the timestamp of the request to create the stream, whether the slave device 200 responds to create the stream, and so on.
  • the life cycle management module can record the end time of the stream.
  • the pre-processing module is used to process each command issued by the dynamic pipeline, including modules such as multi-stream configuration and multiplexing control.
  • the multi-stream configuration can be used to configure the type and quantity of the required streams according to the stream type of the stream creation command issued by the camera device (CameraDevice).
  • Different types of control commands correspond to different types and numbers of required streams.
  • Types of streams may include, but are not limited to, preview streams, photo streams, video streams, analysis streams, and the like.
  • For example, a camera device can issue a control command of "create photo stream"; the multi-stream configuration can configure four streams for the above control command: one preview stream, one analysis stream, one photo stream, and one video stream.
  • Refer to Table 2 for the method by which the multi-stream configuration configures multiple streams for a stream creation command.
  • the multi-stream configuration may also have other configuration methods, which are not limited in this embodiment of the present application.
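  • Expressed with the real android.hardware.camera2 and android.media APIs, a multi-stream configuration of this kind might look like the sketch below, which registers a preview surface plus analysis and photo streams in a single capture session. The stream sizes and counts are assumptions; the patent's Table 2 defines the actual mapping.

```java
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.media.ImageReader;
import android.os.Handler;
import android.view.Surface;
import java.util.Arrays;

final class MultiStreamConfig {
    // Configure a preview stream, an analysis stream and a photo stream for one
    // "create photo stream" command; sizes here are illustrative assumptions.
    static void configure(CameraDevice camera, Surface previewSurface, Handler handler)
            throws CameraAccessException {
        ImageReader analysis = ImageReader.newInstance(1280, 720, ImageFormat.YUV_420_888, 2);
        ImageReader photo = ImageReader.newInstance(1920, 1080, ImageFormat.JPEG, 1);

        camera.createCaptureSession(
                Arrays.asList(previewSurface, analysis.getSurface(), photo.getSurface()),
                new CameraCaptureSession.StateCallback() {
                    @Override public void onConfigured(CameraCaptureSession session) {
                        // Session ready: issue continuous preview requests and, on demand,
                        // a single-frame still-capture request targeting the photo reader.
                    }
                    @Override public void onConfigureFailed(CameraCaptureSession session) { }
                },
                handler);
    }
}
```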
  • the multiplexing control can be used to multiplex multiple streams requested by the camera device (CameraDevice), that is, to simplify the multiple streams configured by the multi-stream configuration module. For example, a preview stream with a requested picture quality of 1080P and an analysis stream with a requested picture quality of 720P can be multiplexed into a 1080P preview stream.
  • the post-processing module is used to process the image stream returned by the slave device 200.
  • the post-processing module may include smart diversion and multi-stream output modules.
  • Smart diversion can be used to expand the stream returned by the slave device 200 into streams consistent with the type and number of streams requested by the camera device (CameraDevice).
  • the camera device (CameraDevice) requests a 1080P preview stream and a 720P analysis stream.
  • the above control commands requesting two streams are multiplexed into a control command requesting one 1080P preview stream.
  • the slave device 200 may return a 1080P preview stream to the master device 100 .
  • the stream processing can restore the above-mentioned 1080P preview stream into a 1080P preview stream and a 720P analysis stream through the smart diversion module.
  • the multi-stream output can be used to output the actual streams produced by smart diversion and send them to the camera device (CameraDevice).
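  • A minimal sketch of the smart diversion step, assuming a simple frame type: the single 1080P stream returned by the slave device is fanned out into the originally requested 1080P preview stream and a downscaled 720P analysis stream. The downscale() helper is a stand-in for real ISP/GPU scaling.

```java
import java.util.function.Consumer;

// Illustrative fan-out of one returned stream into the streams originally requested.
final class SmartDiversion {
    static final class Frame {
        final int width, height;
        final byte[] pixels;
        Frame(int w, int h, byte[] p) { width = w; height = h; pixels = p; }
    }

    private final Consumer<Frame> previewOut;    // consumer expecting 1080P frames
    private final Consumer<Frame> analysisOut;   // consumer expecting 720P frames

    SmartDiversion(Consumer<Frame> previewOut, Consumer<Frame> analysisOut) {
        this.previewOut = previewOut;
        this.analysisOut = analysisOut;
    }

    // Multi-stream output: every remote 1080P frame feeds both requested streams.
    void onRemoteFrame(Frame frame1080p) {
        previewOut.accept(frame1080p);
        analysisOut.accept(downscale(frame1080p, 1280, 720));
    }

    private static Frame downscale(Frame src, int w, int h) {
        // Placeholder: a real implementation would scale on the GPU/ISP.
        return new Frame(w, h, src.pixels);
    }
}
```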
  • the post-processing module may further include image mirroring, rotation and other processing, which is not limited here.
  • the frame synchronization module can be used to perform frame synchronization processing on the image frames when taking pictures.
  • the cross-device transmission will cause a delay when the instruction sent by the master device 100 reaches the slave device 200 . That is, the slave device 200 will receive the same instruction later than the master device 100 . Therefore, the execution result obtained by the slave device 200 when executing the above-mentioned instruction may also be different from the result expected by the master device 100 .
  • For example, the slave device 200 may respond to the control command and return the image stream at a second moment, while the master device 100 expected to receive the image stream from the slave device 200 at a first moment. Frame synchronization can therefore move the second-moment result sent back by the slave device 200 forward, to obtain a result closer to the first moment (the user-expected result), thereby reducing the influence of network delay.
  • a hardware abstraction layer may include a local camera HAL and a virtual camera HAL.
  • the local camera HAL may include a camera session and a camera provider module.
  • the camera session can be used by the master device 100 to issue control commands to the hardware.
  • the camera provider module can be used to manage the shooting capability of the camera of the main device 100 .
  • For the shooting capability of the master device 100, reference may be made to Table 1 above.
  • the camera provider can only manage the shooting capability of the local camera of the main device 100.
  • the virtual camera HAL also includes camera session (camera session) and camera provider modules.
  • the camera session can also be used to register the slave device 200 that has established a communication connection with the master device 100 to the local, feedback the connection status of the slave device 200 to the DVkit, and send the control commands issued by the master device 100 to slave device 200.
  • the camera provider module is responsible for managing the shooting capability of the slave device 200 . Similarly, for the shooting capability of the above-mentioned management slave device 200, reference may be made to Table 1, which will not be repeated here.
  • the virtual camera HAL also provides the function of numbering the cameras of the registered slave device 200.
  • the virtual camera HAL of the master device obtains the shooting capability of the slave device, the virtual camera HAL can obtain the number of cameras possessed by the slave device, and establish an ID for each camera. The ID can be used by the master device 100 to distinguish multiple cameras of the slave device 200 .
  • When the virtual camera HAL numbers the above-mentioned cameras, a numbering method different from that used for the cameras of the master device 100 may be adopted. For example, when the cameras of the master device 100 are numbered from 1, the cameras of the slave device 200 can be numbered from 1000.
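  • A sketch of this numbering rule with illustrative names: local cameras count up from 1 and slave-device cameras from 1000 (yielding 1001, 1002, 1003 as in the example above), so an ID alone tells the master device which side a camera belongs to.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative allocator for the camera-numbering rule; names are assumptions.
final class CameraIdAllocator {
    private static final int SLAVE_ID_BASE = 1000;
    private int nextLocalId = 1;                    // master's own cameras: 1, 2, 3, ...
    private int nextSlaveId = SLAVE_ID_BASE + 1;    // slave cameras: 1001, 1002, 1003, ...
    private final Map<Integer, String> registry = new LinkedHashMap<>();

    int registerLocalCamera(String description) {
        registry.put(nextLocalId, description);
        return nextLocalId++;
    }

    int registerSlaveCamera(String description) {
        registry.put(nextSlaveId, description);
        return nextSlaveId++;
    }

    String describe(int cameraId) { return registry.get(cameraId); }

    static boolean isSlaveCamera(int cameraId) { return cameraId > SLAVE_ID_BASE; }
}
```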
  • the structure of the master device 100 shown in FIG. 3 does not constitute a specific limitation on the master device 100.
  • the master device 100 may include more or fewer modules than shown, or combine some modules, or split some modules, or arrange different modules.
  • FIG. 4 exemplarily shows a system frame diagram of the slave device 200 .
  • the slave device 200 may include an application layer, a framework layer, a service layer, and a hardware abstraction layer (HAL).
  • HAL hardware abstraction layer
  • the application layer may include a series of application packages, which may include, for example, a camera proxy service.
  • the camera proxy service may include modules such as pipeline control, multi-stream adaptation, and multi-operating system (Operating System, OS) adaptation.
  • the pipeline control can be used to establish a communication connection with the master device 100 (including establishing a control session and a data session) and to transmit control commands and image streams.
  • Multi-stream adaptation can be used for the camera proxy service to return stream data according to the stream configuration information sent by the master device 100 .
  • the master device 100 may configure a desired stream for a camera device session in response to a request to create a stream.
  • Performing the above configuration process can generate corresponding flow configuration information.
  • the camera proxy service can generate corresponding stream creation commands according to the above configuration information. Therefore, in response to the above-mentioned control command to create a flow, the underlying service of the slave device 200 can create a flow that matches the above-mentioned command.
  • Multi-operating system (Operating System, OS) adaptation can be used to solve compatibility problems between different operating systems, such as android system and Hongmeng system.
  • the framework layer includes camera management (CameraManager) and camera device (CameraDevice);
  • the service layer includes camera service (CameraService), CameraDeviceClient, Camera3Device and CameraProviderManager modules.
  • The camera management (CameraManager), camera device (CameraDevice), camera service (CameraService), CameraDeviceClient, Camera3Device, and CameraProviderManager modules have the same functions as the corresponding modules in the master device 100; please refer to the introduction to FIG. 3 above.
  • For the local camera HAL of the slave device 200, reference may likewise be made to the local camera HAL of the master device 100; this embodiment of the present application will not repeat the details here.
  • the camera HAL layer of slave device 200 may also include camera session and camera provider modules.
  • a camera session can be used to control the communication between commands and hardware.
  • the camera provider module may be used to manage the capture capability of the slave device 200 .
  • For the shooting capability of the slave device 200 refer to Table 1 above.
  • the camera provider of the slave device 200 can only manage the shooting capability of the local camera of the slave device 200.
  • the structure of the slave device 200 illustrated in FIG. 4 does not constitute a specific limitation on the slave device 200 .
  • the slave device 200 may include more or less modules than shown, or combine some modules, or split some modules, or arrange different modules.
  • the cross-device collaborative shooting method provided by the embodiments of the present application can be applied to various scenarios, including but not limited to:
  • In a live broadcast scenario, the master device 100 can be connected to the slave device 200; the master device 100 and the slave device 200 can shoot the same object or different objects at different angles, and the slave device 200 can send the captured image to the master device 100.
  • The master device 100 can then upload the images of both parties to the live server, and the live server distributes the images to more users for viewing.
  • the master device 100 can control the shooting effect of the slave device 200 , for example, parameters such as the focal length of the camera of the slave device 200 or the filters used can be adjusted by the master device 100 .
  • In this way, the host who initiates the live broadcast can conveniently control the shooting effect of the slave device 200 on the master device 100, and users watching the live broadcast can see multiple images showing the same object from different angles.
  • In a photographing scenario, after the master device 100 starts the camera application, it can connect with the slave device 200 and capture images from different angles.
  • the slave device 200 can send the captured image to the master device 100 .
  • the master device 100 can control the shooting effect of the slave device 200, and can also display the captured images of both parties at the same time, and perform processing such as previewing, photographing, and video recording for the captured images.
  • the above camera application may be a native camera application or a third-party camera application.
  • the master device 100 may be a mobile phone, and the slave device 200 may be a large-screen TV.
  • the mobile phone can make video calls with other devices. During this process, the mobile phone can connect to the large-screen TV and control the camera of the large-screen TV to shoot.
  • The large-screen TV can send the captured image to the mobile phone, and the mobile phone then sends the image to the device on the other end of the video call.
  • In this way, the mobile phone can realize a video call through the large-screen TV, and the user does not need to hold the mobile phone to capture an image at a specific angle, which gives the user a more convenient video call experience.
  • Wearable devices such as smart watches can be connected to smart electronic devices such as mobile phones, and control the camera of the mobile phone to shoot.
  • the mobile phone can send the captured images to the smart watch, so that the user can directly view the pictures captured and processed by the mobile phone on the smart watch.
  • the user can also control the shooting effect of the mobile phone through the smart watch.
  • the user can process the shooting screen of the mobile phone through the smart watch, and can conveniently and quickly complete the shooting without the help of others in the process of taking a group photo or shooting a scene.
  • the following takes the live broadcast scene as an example, and combines the UI in the live broadcast scene to describe the cross-device collaborative shooting method.
  • FIG. 5 exemplarily shows a live broadcast scenario provided by an embodiment of the present application.
  • the live broadcast scene may include a master device 100 , a slave device 200 , object A, and object B.
  • the master device 100 and the slave device 200 establish a communication connection, and the master device 100 and the slave device 200 may be in different positions or angles.
  • For example, the master device 100 photographs subject A, and the slave device 200 photographs subject B.
  • the slave device 200 may display the captured image, and send the image to the master device 100 after processing.
  • the master device 100 can simultaneously display the image sent by the slave device 200 and the image it obtains by photographing subject A itself.
  • the main device 100 may also upload the two displayed images to the live broadcast server. Further, the server may distribute the above two images to the devices of other users entering the live room.
  • FIGS. 6A-6D, 7A-7B, 8A-8B, 9A-9D, and 10A-10C exemplarily show some user interfaces implemented on the master device 100 and the slave device 200 in a live broadcast scenario.
  • FIGS. 6A-6C illustrate a manner in which the master device 100 and the slave device 200 establish a communication connection.
  • FIGS. 6A-6C are user interfaces implemented on the master device 100, and FIG. 6D is a user interface implemented on the slave device 200.
  • FIG. 6A shows an exemplary user interface 60 on the host device 100 for presenting installed applications.
  • the user interface 60 displays: a status bar, a calendar indicator, a weather indicator, a tray with icons of frequently used applications, a navigation bar, icons 601 of live broadcast applications, icons 602 of camera applications, and icons of other applications.
  • the status bar may include: one or more signal strength indicators of mobile communication signals (also known as cellular signals), the operator name (such as "China Mobile"), one or more signal strength indicators of Wi-Fi signals, a battery status indicator, a time indicator, and the like.
  • the navigation bar may include system navigation keys such as the back key, the home screen key, and the multitasking key.
  • the user interface 60 exemplarily shown in FIG. 6A may be the Home screen.
  • the main device 100 may detect a user operation acting on the icon 601 of the live broadcast application, and in response to the user operation, display the user interface 61 shown in FIG. 6B .
  • the user interface 61 may be a main interface provided by a live broadcast application.
  • the user interface 61 may include: an area 611 , an interactive message window 612 , a preview box 613 , an add control 614 , and a setting control 615 .
  • the area 611 can be used to display some information of the host, such as avatar, live broadcast duration, number of viewers, live broadcast account and so on.
  • the interactive message window 612 can be used to display messages sent by the host or viewers during the live broadcast, or system messages generated by interactive operations such as "like".
  • the preview frame 613 may be used to display images captured and processed in real time by the camera of the main device 100 .
  • the main device 100 may refresh the displayed content in real time, so that the user can preview the images captured and processed in real time by the camera of the main device 100 .
  • the camera may be a rear camera of the main device 100 or a front camera.
  • the setting controls 615 can be used to adjust the shooting effect of the main device 100 .
  • the main device 100 may display options for adjusting the shooting parameters and/or image processing parameters of the main device 100 .
  • For these options, please refer to the related description of the subsequent user interfaces, which will not be repeated here.
  • Add control 614 may be used to find slave devices 200 .
  • the master device 100 can use the aforementioned short-range communication technologies such as Bluetooth, Wi-Fi, and NFC to discover other nearby electronic devices, and can also use the aforementioned long-range communication technologies to discover remote electronic devices, and inquire whether the discovered electronic devices have cameras.
  • the main device 100 may display the discovered electronic devices with cameras on the user interface 62 .
  • the main device 100 may display a window 622, which may include information of two electronic devices, including respective: icons, names, distances, locations, and the like of the electronic devices.
  • Icon 623 may be used to display the type of electronic device.
  • the first slave device displayed by the above-mentioned master device 100 may be a tablet computer. The user can quickly and easily identify whether the slave device is the device he wants to connect to through the icon 623 .
  • Name 624 can be used to display the name of the slave device.
  • the name 624 may be the model of the slave device.
  • the name may also be a user-defined name of the slave device.
  • the above name can also be a combination of device model and user-defined name. This application does not limit this. It will be appreciated that listings such as "pad C1", “Phone P40-LouS”, etc. in the user interface 62 are exemplary names.
  • the master device 100 can detect a user operation acting on the icon 623 of the electronic device, and in response to the user operation, the master device 100 sends a request to establish a communication connection to the electronic device (slave device 200 ) corresponding to the icon 623 .
  • FIG. 6D shows the user interface 63 displayed by the electronic device (slave device 200 ) after the slave device 200 receives the request for establishing a communication connection sent by the master device 100 .
  • the user interface 63 includes: device information 631 of the main device 100 , a confirmation control 632 and a cancel control 633 .
  • the device information 631 can be used to reveal the identity information of the master device 100 . That is, the user can determine the information of the master device 100 that issued the above request through the device information 631 . When the user can determine the information of the main device 100 through the device information 631 and trust the main device, the above-mentioned user can approve the main device 100 to use the camera of the electronic device through the confirmation control 632 .
  • the electronic device can detect the operation acting on the confirmation control 632, and in response to the user operation, the slave device 200 can agree to the master device 100 to use its own camera, that is, the master device 100 can establish a communication connection with the electronic device.
  • the above-mentioned communication connection may be the aforementioned wired connection or wireless connection.
  • the master device and the slave device can log in to the same user account (such as a Huawei account), and then connect remotely through a server (such as a multi-device collaborative shooting server provided by Huawei).
  • This embodiment of the present application does not limit this. It should be understood that the electronic device is equivalent to a slave device of the master device 100 .
  • The user interface 63 also includes a cancel control 633.
  • When the master device 100 cannot be identified through the device information 631, or when the master device 100 is not trusted, the user can use the cancel control 633 to reject the request sent by the master device 100 to use the camera of the electronic device.
  • the electronic device can detect the operation acting on the cancel control 633, and in response to the user operation, the electronic device can refuse the main device 100 to use its own camera, that is, the electronic device does not agree to establish a communication connection with the main device 100.
  • FIG. 7A-7B illustrate another manner in which the master device 100 and the slave device 200 establish a communication connection.
  • FIG. 7A is a user interface 71 implemented on the slave device 200, and FIG. 7B is a user interface 72 implemented on the master device 100.
  • When the master device 100 detects an operation acting on the icon 623 in the user interface 62, in response to the user operation, the master device 100 sends a request to establish a communication connection to the electronic device (slave device 200) corresponding to the icon 623.
  • the user interface 71 may include a verification code 712 and a cancel control 714 .
  • the verification code 712 can be used to confirm the connection between the master device 100 and the slave device 200 .
  • the slave device 200 may generate a verification code 712.
  • the verification code 712 can also be generated by the server 300 and then sent to the slave device 200 through the wireless network. Then, the slave device 200 may display the above verification code on the user interface 71 .
  • Cancel control 714 may be used to deny the request sent by master device 100 to use the camera of slave device 200 .
  • the slave device 200 may detect a user operation acting on the cancel control 714, and in response to the user operation, the slave device 200 may close the dialog box 711.
  • FIG. 7B exemplarily shows the user interface 72 for the main device 100 to input the verification code. While the slave device 200 displays the user interface 71 , the master device 100 may display the user interface 72 . User interface 72 may display dialog 721 .
  • Dialog 721 may include verification code 7211, confirmation control 7212.
  • the verification code 7211 may represent a verification code input by the user to the main device 100 .
  • the master device 100 can detect the operation acting on the confirmation control 7212. In response to the user operation, the master device 100 may transmit a verification code 7211 to the slave device 200 .
  • the above-mentioned user operation is, for example, a click operation, a long-press operation, and the like.
  • the slave device 200 can check whether the received verification code 7211 is consistent with the verification code 712 displayed by itself. If the two verification codes are the same, the slave device 200 allows the master device 100 to use its own camera. Further, the slave device 200 can turn on its own camera, and transmit the image captured by the camera and processed by the ISP to the master device 100 . On the contrary, the slave device 200 may reject the request of the master device 100 to use the camera of the slave device 200 .
  • Alternatively, the slave device 200 may keep displaying the verification code 712 and wait for the master device 100 to input a new verification code.
  • In some embodiments, when the verification code 7211 sent by the master device 100 is different from the verification code 712 displayed by the slave device 200, the slave device 200 can regenerate another verification code M, and the master device 100 can obtain the verification code M again.
  • Once the master device 100 correctly enters the regenerated verification code, the slave device 200 may allow the master device 100 to use its own camera.
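  • A sketch of the slave-side check, assuming a six-digit code; the class and method names are illustrative, and MessageDigest.isEqual is used only to make the comparison constant-time.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.SecureRandom;

// Illustrative verification-code handshake on the slave device.
final class PairingCheck {
    private static final SecureRandom RANDOM = new SecureRandom();
    private String displayedCode = newCode();   // the code shown on the slave's screen

    private static String newCode() {
        return String.format("%06d", RANDOM.nextInt(1_000_000));
    }

    // Returns true if the code entered on the master matches the one displayed here.
    // On mismatch the slave may keep waiting, or regenerate a fresh code that the
    // master must obtain again.
    boolean verify(String codeFromMaster, boolean regenerateOnMismatch) {
        boolean ok = MessageDigest.isEqual(
                displayedCode.getBytes(StandardCharsets.UTF_8),
                codeFromMaster.getBytes(StandardCharsets.UTF_8));
        if (!ok && regenerateOnMismatch) {
            displayedCode = newCode();
        }
        return ok;
    }
}
```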
  • a communication connection can also be established in other ways, for example, by using near field communication (near field communication, NFC) technology, the master device 100 and the slave device 200 can complete the authentication through user operations such as touch.
  • The authentication methods of this application are not limited to the two described above.
  • the master device 100 and the slave device 200 may display prompt information respectively.
  • the above prompt information may prompt the user that the master device 100 and the slave device 200 have established a communication connection.
  • The user interface 81 is a user interface on the slave device 200 for displaying prompt information.
  • the slave device 200 may display the user interface 81 .
  • User interface 81 may include a prompt box 811 and a preview box 812 .
  • Preview box 812 may be used to display images captured by the camera of the slave device 200.
  • the prompt box 811 can be used to display prompt information.
  • the above prompt information is, for example, "the camera of the slave device is being used by the master device 100".
  • After the slave device 200 grants the master device 100 permission to use its camera, the slave device 200 can turn on its own camera. Then, the slave device 200 may display the picture captured and processed by its own camera in the preview box 812. On top of the display layer of the preview box 812, the slave device 200 may display the prompt box 811.
  • The slave device 200 can also display the image captured and processed by its own camera through a floating window. Specifically, the slave device 200 may display a floating window in the upper right corner of the user interface 60 shown in FIG. 6A. The floating window can display images captured and processed by the camera of the slave device 200.
  • the master device 100 may display the user interface 82 as shown in FIG. 8B.
  • User interface 82 may include prompt window 821 , window 822 and window 823 .
  • the prompt window 821 can be used to display prompt information.
  • the above prompt information is, for example, "the master device 100 has connected to the slave device 200".
  • Window 822 may be used to display images captured and processed by the camera of the slave device 200.
  • the window 823 may display images captured and processed by the camera of the main device 100 .
  • the master device 100 may obtain from the slave device 200 the images captured and processed by the camera of the slave device 200 . Then, the main device 100 may display the above-mentioned image on the window 822 . Meanwhile, the main device 100 may also display a prompt window 821 . The user can know that the master device 100 has been connected to the slave device 200 through the prompt content displayed in the prompt window 821 .
  • a setting control 824 may be added to the user interface 82 .
  • the settings controls may be used to display capture capability options of the slave device 200 .
  • master device 100 may also exchange the content displayed by window 823 and window 822 .
  • the master device 100 may detect a user operation acting on the window 822, and in response to the user operation, the master device 100 may display the image captured and processed by the camera of the slave device 200 in the window 823.
  • the main device 100 may display the image captured and processed by the camera of the main device 100 in the window 822 .
  • the above-mentioned user operations may be operations such as clicks, left swipes, and the like.
  • the master device may also divide the window 823 into two separate parts. One part is used to display the image captured and processed by the camera of the master device 100 , and the other part is used to display the image captured and processed by the camera of the slave device 200 .
  • the present application does not limit the display arrangement of the preview images of the master device 100 and the slave device 200 on the master device 100 .
  • FIGS. 6A-8B exemplarily illustrate a set of user interfaces in which the master device 100 establishes a communication connection with the slave device 200 and displays images captured and processed by the camera of the slave device 200 .
  • the master device 100 may acquire the capability of the slave device 200 to control the shooting effect, and may send a command to control the shooting effect to the slave device 200 .
  • FIGS. 9A-9D exemplarily show a set of user interfaces for the master device 100 to control the shooting effect of the slave device 200 .
  • FIGS. 9A-9C are user interfaces on the master device 100, and FIG. 9D is a user interface on the slave device 200.
  • the main device 100 may display the user interface 91 shown in FIG. 9A .
  • User interface 91 may include window 911, window 912, delete control 913, set control 915, set control 916.
  • the window 911 can display the images captured and processed by the camera of the main device 100 .
  • Window 912 may display images captured and processed by the camera of the slave device 200.
  • Delete control 913 can be used to close window 912 .
  • the main device 100 may detect a user operation acting on the delete control 913 , and in response to the operation, the main device 100 may close the window 912 .
  • the settings controls 915 may be used to display options for the ability of the main device 100 to control the shooting effects.
  • Settings control 916 may be used to display options for the slave device 200's ability to control the shooting effect.
  • a prompt message, such as "click to adjust the remote picture" may also be displayed beside the setting control 916 .
  • the master device 100 may detect a user operation acting on the setting control 916, and in response to the user operation, the master device 100 may display a capability option to control the shooting effect of the slave device 200, referring to FIG. 9B.
  • the user interface 91 may further include a delete control 913 , an add control 914 .
  • Delete control 913 may be used to close one or more windows in user interface 91, e.g., window 911 or window 912.
  • Add control 914 can be used to find other slave devices to connect. After the main device 100 detects the user operation acting on the add control 914, the main device 100 may display the query result shown in the window 622 in FIG. 6C.
  • the master device 100 may send a request to use the camera to the slave device.
  • the above-mentioned other slave devices can agree to the request sent by the master device 100, and then the slave device can enable its own camera to capture and process images according to default shooting parameters, and further, send the processed images to the master device.
  • the main device 100 may add a window to display the above image.
  • the master device 100 can display a plurality of images sent from the slave devices, thereby providing the user with a richer shooting experience.
  • the master device 100 may also multiplex the setting controls 915 and 916 .
  • Specifically, the user interface 91 may display a general settings control.
  • The setting control may display both the capability options of the master device 100 to control the shooting effect and the capability options of the slave device 200 to control the shooting effect.
  • FIG. 9B exemplarily shows the user interface 92 of the master device 100 displaying the capability options of the slave device 200 to control the shooting effect.
  • the user interface 92 may include a capture effect window 921 of the slave device 200 .
  • the window 921 may display various capability options of the slave device 200 to control shooting effects, such as aperture, flash, smart follow, white balance 922, ISO sensitivity, zoom range, beauty, filters, and the like.
  • This application takes the adjustment of the white balance 922 as an example, and specifically introduces a user interface for the master device 100 to send a control command to the slave device 200 and the slave device 200 to execute the above control command.
  • White balance can be used to calibrate the color temperature deviation of the camera.
  • White balance 922 may include daylight mode, incandescent light mode 923, fluorescent mode, cloudy sky mode, shadow mode.
  • the main device 100 can detect user operations acting on any of the above-described modes. When the master device 100 detects a user operation acting on the incandescent lamp mode 923 , in response to the user operation, the master device 100 may issue a control command to change the white balance mode to the incandescent lamp mode to the slave device 200 .
  • the slave device 200 receiving the above command can change the white balance 922 to the incandescent mode 923 .
  • the master device 100 may receive and display the image, captured after switching to the incandescent lamp mode 923, sent by the slave device 200. See FIG. 9C.
  • The image displayed in the viewfinder of the slave device 200 can also be adjusted to the image after changing the white balance mode. Refer to FIG. 9D.
  • the master device 100 may also set a dedicated page to display the capability options of the slave device 200 to control the shooting effect. That is, the main device 100 may display the capability options in the window 921 using a separate page. This embodiment of the present application does not limit this.
  • user interface 93 may include window 931 .
  • Window 931 may be used to display images captured and processed by the camera of the slave device 200. After the master device 100 sends the control command to change the white balance mode to the incandescent lamp mode to the slave device 200, the master device 100 can receive the image captured by the slave device 200 with the white balance mode changed to incandescent.
  • the window 931 may display the above-mentioned images.
  • the slave device 200 may also display the user interface 94 after adjusting the white balance. See Figure 9D.
  • User interface 94 is the user interface displayed on slave device 200 .
  • User interface 94 may include preview window 941 .
  • the preview window 941 may be used to display images captured and processed by the camera of the slave device 200. After receiving the control command sent by the master device 100 to change the white balance mode to the incandescent lamp mode, the slave device 200 can change the white balance mode to the incandescent lamp mode. The preview window 941 may then display the image captured and processed by the camera of the slave device 200 in incandescent mode. Meanwhile, the slave device 200 may transmit the above-mentioned image to the master device 100. The master device 100 may display the above image, as shown in FIG. 9C.
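  • On an Android slave device, applying this command can map directly onto the real camera2 auto-white-balance control, as sketched below; how the command travels from the master to the slave is simplified away here.

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.view.Surface;

final class IncandescentWhiteBalance {
    // Switch the repeating preview request to the incandescent AWB mode.
    static void apply(CameraDevice camera, CameraCaptureSession session,
                      Surface previewSurface, Handler handler) throws CameraAccessException {
        CaptureRequest.Builder builder =
                camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        builder.set(CaptureRequest.CONTROL_AWB_MODE,
                CameraMetadata.CONTROL_AWB_MODE_INCANDESCENT);
        session.setRepeatingRequest(builder.build(), null, handler);
    }
}
```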
  • the ability of the slave device 200 to control the shooting effect may have some or all of the capabilities in the above list, or also have other capabilities not mentioned in the window 921 to control the shooting effect. This application does not limit this.
  • The master device 100 may also acquire its own capability to control the shooting effect and issue control instructions to itself.
  • The following describes a set of user interfaces in which the master device 100 delivers control commands to itself to control the shooting effect.
  • FIG. 10A exemplarily shows a user interface 101 in which the main device 100 acquires the ability to control the shooting effect by itself.
  • User interface 101 may include settings controls 1011 .
  • the setting control 1011 may be used to display the capability options of the main device 100 to control the shooting effect.
  • a prompt message may also be displayed beside the setting control 1011, such as "click to adjust the local screen”.
  • the main device 100 may detect a user operation acting on the setting control 1011, and in response to the user operation, the main device 100 may display a shooting effect capability list of the main device 100, as shown in the user interface 102 in FIG. 10B.
  • User interface 102 may include window 1021 .
  • the window may display options for the ability of the main device 100 to control the shooting effects, such as aperture 1022, flash, smart follow, beauty, filters, and the like.
  • The aperture is taken as an example to illustrate how the master device 100 issues a control command to itself to adjust the shooting effect.
  • the size of the aperture 1022 can be adjusted by the dial 1024 .
  • the main device 100 may detect a user operation acting on the dial 1024 , and in response to the user operation, the main device 100 may issue a control command to adjust the aperture to the main device 100 .
  • the initial scale of the dial 1024 may be "f/8".
  • the user can slide the float on the dial 1024 to the aperture scale of "f/17" by swiping right.
  • the main device 100 may detect this user operation, and in response to the user operation, the camera of the main device 100 may replace the aperture of "f/8" with "f/17".
  • Changing the aperture to "f/17" produces a deeper depth of field; correspondingly, the window displaying the image captured by the master device 100 can display the image with a deeper depth of field, as shown in window 1031 in FIG. 10C.
  • FIG. 10C exemplarily shows the user interface 103 in which the depth of field in the preview window of the master device 100 becomes deeper.
  • User interface 103 may include preview window 1031 .
  • the preview window 1031 can be used to display the images captured and processed by the camera of the main device 100 .
  • After the master device 100 replaces the aperture "f/8" with "f/17", the preview window 1031 may display an image captured and processed by the master device 100 with an aperture size of "f/17".
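  • In camera2 terms, the dial's effect corresponds to the real LENS_APERTURE request key, as sketched below. Only f-numbers advertised in CameraCharacteristics.LENS_INFO_AVAILABLE_APERTURES take effect, and phones with a fixed aperture ignore the key, so this is illustrative only.

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.view.Surface;

final class ApertureControl {
    // Re-issue the repeating preview request with the dial's "f/17" position.
    static void setAperture(CameraDevice camera, CameraCaptureSession session,
                            Surface previewSurface, Handler handler)
            throws CameraAccessException {
        CaptureRequest.Builder builder =
                camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        builder.set(CaptureRequest.LENS_APERTURE, 17f);   // f-number as a float
        session.setRepeatingRequest(builder.build(), null, handler);
    }
}
```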
  • FIGS. 6A-6D, 7A-7B, 8A-8B, 9A-9D, and 10A-10C describe a series of user interfaces in which, in the live broadcast scenario, the master device 100 establishes a communication connection with the slave device 200 and controls the shooting effect of the slave device 200.
  • the above-mentioned method for controlling the shooting effect of the slave device 200 can also be used in a shooting scene.
  • the following introduces a series of user interfaces in which the master device 100 establishes a communication connection with the slave device 200 and controls the shooting effect of the slave device 200 in a photographing application scenario.
  • FIG. 11A exemplarily shows the master device 100 displaying a user interface 111 for adding a slave device.
  • User interface 111 may include add controls 1112 , dialog boxes 1113 .
  • the main device 100 may detect a user operation acting on the add control 1112, and in response to the user operation, the main device 100 may query the electronic device having a camera.
  • the master device 100 may display a dialog box 1113 when it receives messages returned by the above electronic devices indicating that they have cameras.
  • Dialog 1113 may be used to present information for electronic devices with cameras.
  • dialog 1113 exemplarily shows information of two electronic devices (electronic device 1114, electronic device 1115) with cameras.
  • the above-mentioned information is, for example, the name, location and other information of the slave device.
  • the master device 100 may detect a user operation acting on the electronic device 1115 , and in response to the user operation, the master device 100 may send a request to use the camera of the electronic device 1115 to the slave device 200 .
  • the electronic device 1115 may detect an operation of the user agreeing to the main device 100 to use its own camera, and in response to the operation, the electronic device 1115 may allow the main device 100 to use the camera of the electronic device 1115 .
  • For the user interface of the process in which the electronic device 1115 grants the use permission to the master device 100, reference may be made to the user interfaces in the live broadcast scenario, as shown in FIGS. 6C-6D or FIGS. 7A-7B; this application will not repeat this.
  • the slave device 200 may display the user interface 112, referring to FIG. 11B .
  • the main device 100 may display the user interface 113, as shown in FIG. 11C.
  • User interface 112 is the user interface on slave device 200 .
  • User interface 112 exemplarily illustrates a user interface for displaying prompt information from device 200 .
  • the user interface 112 may include a prompt window 1121 and a preview window 1122 .
  • the slave device 200 can turn on its own camera, and further, the preview window 1122 can display the image collected and processed by the camera of the slave device 200.
  • the user interface 112 may also display a prompt window 1121 .
  • the prompt window 1121 prompts the user that the camera of the device is being used by another device, for example, "The current screen is being used by LISA".
  • the user interface 113 is the user interface on the main device 100 .
  • the user interface 113 exemplarily shows a user interface in which the main device 100 displays prompt information.
  • the master device 100 may display the user interface 113 .
  • the user interface 113 may include a window 1131 , a window 1132 , and a prompt window 1134 .
  • the window 1131 can be used to display images captured and processed by the camera of the main device 100 .
  • Window 1132 may display images captured and processed by the camera of the slave device 200.
  • the window 1132 may display the above-mentioned image.
  • the prompt window 1134 can be used to display prompt information, for example, the above prompt information is "connected camera: Phone P40-LouS".
  • the above prompt information can be used to prompt the user that the master device 100 has connected to the slave device 200 .
  • window 1131 and window 1132 may also exchange display content. That is, the window 1131 can display the image captured and processed by the camera of the slave device 200.
  • the window 1132 can be used to display images captured and processed by the camera of the main device 100 .
  • the master device 100 can set a delete key 1133 in the window 1131. In response to a user operation acting on the delete key 1133, the master device 100 may close the window 1131.
  • window 1132 may be displayed over window 1131 in the form of a floating window in user interface 113 .
  • the window 1132 can also be displayed tiled with the window 1131 . Refer to Figure 11D.
  • FIG. 11D exemplarily shows another user interface 114 in which the main device 100 displays prompt information.
  • the user interface 114 may include a window 1141 , a window 1142 , a prompt window 1143 , a setting control 1147 , and a setting control 1148 .
  • the window 1141 can be used to display images captured and processed by the camera of the main device 100 .
  • Window 1142 may be used to display images captured and processed by the camera of the slave device 200.
  • the window 1141 and the window 1142 can also exchange display contents. That is, the window 1141 may display the image captured and processed by the camera of the slave device 200.
  • the window 1142 may be used to display images captured and processed by the camera of the main device 100 . This application does not limit this.
  • the prompt window 1143 is used to display prompt information.
  • the settings control 1147 may be used to display options for capturing capabilities of the camera of the main device 100 .
  • Settings controls 1148 may be used to display capture capability options for the camera of slave device 200 .
  • the master device 100 may detect a user operation on the setting control 1148, and in response to the user operation, the master device 100 may display the shooting capability options of the slave device 200.
  • For example, these may be the shooting capability options displayed in the window 921 of the user interface 92 shown in FIG. 9B; this application will not repeat this.
  • the master device 100 may detect a user operation acting on a certain shooting capability option, and in response to the user operation, the master device 100 may send a control command to adjust the shooting effect to the slave device 200 .
  • the master device 100 may send a control command to the slave device 200 to adjust the white balance mode to the incandescent lamp mode.
  • the slave device 200 may perform color temperature calibration of the incandescent lamp mode on the image captured by the camera. Then, the slave device 200 may transmit the color temperature calibrated image in the incandescent lamp mode to the master device 100 .
  • the main device 100 may display the above-mentioned image.
  • user interface 114 may also include delete controls 1144 , add controls 1145 , and capture controls 1146 .
  • the delete control 1144 can be used to close the window 1142 .
  • Addition controls 1145 may be used by the main device 100 to discover other electronic devices with cameras.
  • the main device 100 may detect a user operation acting on the photographing controls 1146 . In response to the user operation, the main device 100 may store the content displayed in the window 1131 and the window 1132 as a picture or a video. In the scenario of live broadcast or video call, the live broadcast application/video call application may also obtain the above picture or video, and send it to the server that provides live broadcast/video call.
  • the number of slave devices may not be limited to one.
  • the main device 100 may detect a user operation of adding a control, such as the adding control 914 of the user interface 91, the adding control 1145 of the user interface 114, and the like. In response to the user operation, the main device 100 may establish a connection with other electronic devices with cameras. Further, the master device 100 may establish connections with multiple slave devices, and use the images sent by the multiple slave devices.
  • the master device 100 may also only display images sent from the slave devices. For example, after the master device 100 establishes a communication connection with the slave device 200 , the master device 100 can turn off its own camera and only display the image sent by the slave device 200 .
  • FIG. 12 shows the detailed flow of the cross-device collaborative shooting method. As shown in Figure 12, the method may include the following steps:
  • the master device 100 establishes a communication connection with the slave device 200.
  • the communication connection established between the master device 100 and the slave device 200 may be the aforementioned wired connection or wireless connection.
  • The master device 100 may first, in response to a received user operation (e.g., the user operation acting on the add control 614 shown in FIG. 6B), discover other electronic devices with cameras, and then send a connection request to the electronic device selected by the user.
  • When the electronic device responds to the user operation for agreeing to the request (e.g., the user operation acting on the confirmation control 632 shown in FIG. 6D), the master device 100 and the electronic device successfully establish a communication connection.
• the main device 100 may scan the two-dimensional code of the electronic device 200 to establish a connection with the electronic device 200.
• the above-mentioned electronic device can display a two-dimensional code for using the camera of the electronic device.
• the main device 100 can scan the above two-dimensional code, and in response to the above operation, the main device 100 can send a use request to the above electronic device and obtain the consent of the electronic device.
  • the master device 100 may also establish a communication connection in other ways.
• such as a touch operation based on NFC technology; this embodiment of the present application does not limit this.
• the master device 100 may search for other electronic devices with cameras through the device virtualization kit (DVKit) and send a request to establish a communication connection to a discovered electronic device; after DVKit receives a message fed back by the other electronic device agreeing to establish the communication connection, a communication connection can be established with the electronic device. Further, DVKit establishes the communication connection with the above electronic device through the device virtualization platform (DMSDP), and DMSDP is specifically used to establish a session with the above electronic device.
  • the above sessions include control sessions and data sessions.
  • the above-mentioned electronic device that establishes a session with the master device 100 may be referred to as the slave device 200 of the master device 100 .
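• A minimal sketch of this handshake is given below, assuming hypothetical DvKit and Dmsdp interfaces that only mirror the roles the text assigns to DVKit and DMSDP (discovery, consent, and the control/data sessions); none of these names are a published SDK surface.

```java
import java.util.List;

interface DvKit {
    List<String> discoverCameraDevices();        // devices with cameras found nearby
    boolean requestConnection(String deviceId);  // true once the peer user agrees
}

interface Dmsdp {
    void openControlSession(String deviceId);    // carries control commands
    void openDataSession(String deviceId);       // carries image streams back
}

class ConnectionSetup {
    static void connectFirstSlave(DvKit dvkit, Dmsdp dmsdp) {
        for (String id : dvkit.discoverCameraDevices()) {
            if (dvkit.requestConnection(id)) {   // e.g. confirmed via control 632
                dmsdp.openControlSession(id);
                dmsdp.openDataSession(id);
                break;                           // this device becomes slave device 200
            }
        }
    }
}
```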
• S102: The master device 100 acquires the shooting capability information of the slave device 200 through the communication connection.
• DMSDP may register the slave device into the virtual camera HAL. Meanwhile, DMSDP may request the shooting capability information of the slave device 200 from the slave device 200.
• the slave device 200 may obtain its own shooting capability from its own camera service (CameraService) module.
• the above shooting capabilities include the hardware capabilities of the camera and the software capabilities of image processing modules such as the ISP and GPU, as well as some shooting capabilities that combine hardware capabilities and software capabilities.
• for details, please refer to the introduction of Table 1 in FIG. 3.
  • the slave device 200 may transmit the above-mentioned shooting capability to the master device 100 .
• the DMSDP may send the above-mentioned shooting capability information to the virtual camera HAL module at the HAL layer.
  • the virtual camera HAL may also send the above shooting capability to the camera manager (CameraManager).
• Table 1 shows the hardware capabilities that the slave device 200 may have, and the shooting functions that combine hardware capabilities and software capabilities, for example, the number of cameras, camera IDs, pixels, aperture size, zoom range, filters, beauty modes, and various shooting modes of the slave device 200. In particular, shooting modes such as the night scene mode and the portrait mode involve not only the hardware capability of the camera of the slave device 200, but also the image processing capability of the slave device 200.
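• A data structure along the following lines could hold such capability information once it reaches the CameraManager. Every field name here is an assumption loosely based on the items listed for Table 1, not the actual implementation.

```java
import java.util.List;

class ShootingCapability {
    List<Integer> cameraIds;      // e.g. [1001, 1002, 1003] for the slave device 200
    float[] opticalZoomRange;     // hardware capability, e.g. {1f, 5f}
    float[] apertureRange;        // e.g. {1.8f, 17f}
    int[] maxPixels;              // e.g. {8192, 6144}
    List<String> filters;         // software capability
    List<String> beautyModes;     // software capability
    List<String> shootingModes;   // combined capabilities: "night scene", "portrait", ...
}
```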
  • the device cooperative management and dynamic pipeline can learn that the slave device 200 is currently registered in the virtual camera HAL.
  • the device collaborative management needs to perform collaborative management on the slave device 200 .
• the device collaborative management can distinguish, according to the ID of the camera, which camera of which device the above-mentioned image comes from, such as the camera 1 of the master device and the camera 1001 of the slave device.
• the device cooperative management can make the time delay of displaying the images from the master device 100 and the slave device 200 fall within a range acceptable to human eyes by repeating or buffering the images of the master device 100.
  • the dynamic pipeline can also distinguish the control commands flowing to the master device 100 or the slave device 200 according to information such as the camera ID, and then send the control commands sent to the slave device 200 to the virtual camera HAL.
  • the virtual HAL can send notifications to DVKit to change the connection state. Specifically, the virtual HAL can notify DVKit to change the unconnected state to the connected state.
• the above unconnected state means that the main device 100 has not established a connection for using the camera of another electronic device.
• the above-mentioned connected state means that the main device 100 has established a connection for using the camera of another electronic device.
• S103: The slave device 200 collects and processes an image, and then sends the collected and processed image to the master device 100.
  • the slave device 200 can automatically start capturing and processing images.
  • the slave device 200 can open its own camera application.
  • the user interface of the camera application is shown in Figure 8A.
  • the slave device 200 can display the image captured and processed by its own camera, referring to the preview box 812 in FIG. 8A .
  • the slave device 200 may transmit the above-mentioned image to the master device 100 .
  • the main device 100 may display the above-described image.
  • the main device 100 may add a window 822 to the user interface 82 .
• the preview window may display the images captured and processed by the slave device 200.
  • the master device 100 can realize collaborative shooting across devices, and simultaneously display images captured and processed by multiple cameras on the display screen.
  • the master device 100 may also only control and use the camera of the slave device 200 , that is, the master device 100 only displays the image of the slave device 200 , and does not display the captured and processed image of the master device 100 .
  • the master device 100 may send a control command to turn on the camera to the slave device 200 in response to the user's operation of turning on the camera of the slave device 200 .
  • the slave device 200 can respond to the above command and turn on the camera.
  • the main device 100 may also display a prompt window.
  • the prompt window may ask the user whether to turn on the camera of the slave device 200 .
  • the master device 100 may send a control command to turn on the camera to the slave device 200 .
• the slave device 200 may turn on the camera and obtain the image captured and processed by the camera.
  • the slave device 200 may send the above image to the master device 100 .
  • the main device 100 may display the above-mentioned image, such as the window 822 shown in FIG. 8B.
  • the slave device 200 can also display the above image on its own display screen.
• refer to the user interface 81 shown in FIG. 8A.
  • the slave device 200 may also not display images captured and processed by its own camera.
• the shooting parameters used by the camera of the slave device 200 can be the default ones. For example, the slave device 200 can by default capture with the common rear camera, using 2x focal length, the default daylight color temperature calibration mode, an aperture size of f/1.6, optical image stabilization on, flash off, a shutter time of 1/60, an ISO sensitivity of 400, 8192×6144 pixels, a 3:4 crop frame size, the beauty/body-shaping algorithms turned off, no filter, no stickers, and so on.
  • the master device 100 may send a series of control commands for coordinated shooting to the slave device 200, such as control commands for taking pictures, recording videos, or adjusting the shooting effect, such as changing filters.
  • the slave device 200 can adjust image acquisition and processing according to the above commands.
• S104: In response to the received user operation, the master device 100 sends a command for controlling the shooting effect to the slave device 200.
• the control command includes the following information: the shooting parameters adjusted in response to the specific user operation, and the type of the stream creation command (i.e., a preview command, a video recording command, or a photographing command).
  • Multi-stream configuration configures different numbers and types of streams according to different stream creation commands.
  • the shooting parameters adjusted by the user depend on the user operation received by the main device 100, and the shooting parameters may include but are not limited to: hardware parameters of the camera involved in capturing images and/or software parameters involved in processing images.
• the shooting parameters also include some combined parameters of hardware parameters and software parameters, such as hybrid zoom range, night mode, portrait mode, time-lapse shooting, slow motion, panorama mode, HDR, and so on.
  • the hardware parameters include one or more of the following: camera ID, optical zoom range, whether to enable optical image stabilization, adjustment range of aperture size, whether to enable flash, whether to enable fill light, shutter time, ISO sensitivity value, pixels and video frame rate, etc.
• the software parameters include one or more of the following: digital zoom value, image crop size, image color temperature calibration mode, whether to denoise the image, beauty/body-shaping type, filter type, sticker options, whether to enable selfie mirroring, and so on.
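• The split between hardware and software parameters, together with the default values quoted earlier, might be modeled as below; this is a sketch with assumed names and fields, not the patent's actual data model.

```java
class HardwareParams {
    int cameraId = 1002;           // common rear camera of the slave device
    float zoom = 2.0f;             // "double the focal length"
    boolean ois = true;            // optical image stabilization on
    boolean flash = false;         // flash off
    double shutterTime = 1.0 / 60;
    int iso = 400;
    int frameRate = 30;
}

class SoftwareParams {
    String whiteBalance = "daylight"; // default color temperature calibration
    String cropRatio = "3:4";
    String filter = null;             // no filter
    boolean beauty = false;           // beauty/body-shaping off
    boolean selfieMirror = false;
}

class ShootingParams {
    final HardwareParams hw = new HardwareParams();
    final SoftwareParams sw = new SoftwareParams();
}
```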
  • control commands may further include default values of other shooting parameters.
  • the above-mentioned other shooting parameters refer to other shooting parameters except the parameters adjusted by the user operation.
  • the control command carrying the shooting parameters may be sent to the slave device 200 through the communication connection established by the master device 100 and the slave device 200 .
• the virtual camera HAL may send the created control command to the device virtualization platform (DMSDP), and the DMSDP may include a data session channel and a control session channel between the master device 100 and the slave device 200.
  • the above control command may be sent to the slave device 200 through the control session channel.
  • the stream creation command may be a preview command.
• the above-mentioned control commands may also include default shooting parameters, such as 2x focal length, no filter, beauty off, and so on.
• the dialog 921 is shown in the user interface 92.
  • the master device 100 may display shooting capability options corresponding to the above shooting capabilities on the display screen.
  • the main device 100 may detect a user operation acting on a certain shooting capability option.
  • the master device 100 may transmit a command to control the photographing effect to the slave device 200 .
• the master device 100 can detect a user operation acting on the incandescent lamp mode, such as a user operation acting on the incandescent lamp option 923 in the user interface 92. In response to the user operation, the master device 100 may send the above control command to the slave device 200, where the control command is used to control the slave device 200 to set the white balance mode to the incandescent lamp mode.
• in the process of creating and sending the above-mentioned control commands, the master device 100 also performs various processing on the above-mentioned commands, such as dynamic pipeline, multi-stream configuration, and multiplexing control.
• the camera application 131 may generate a control command containing a preview control 1311 and a shooting effect adjustment 1312.
  • This control command corresponds to a repeat frame control 1314 .
  • Repeated frame control may indicate that the control command is applied to multiple frames.
• the repeat frame control may include the fields Cmd and Surfaces. The Cmd field can be used to represent the control command; in some embodiments, the number of the control command may also be included in Cmd. Surfaces can be used to receive the views of the rendered picture and send the resulting effects to SurfaceFlinger for image synthesis and display on the screen.
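• For comparison, the public Android camera2 API expresses the same idea of a frame control bound to Surfaces and repeated on every frame. The following is standard camera2 usage, shown only as an analogy to the Cmd and Surfaces fields, not as the implementation described here.

```java
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.view.Surface;

class RepeatingPreview {
    static void startIncandescentPreview(CameraDevice device,
                                         CameraCaptureSession session,
                                         Surface previewSurface,
                                         Handler handler) throws Exception {
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);   // the Surface the rendered frames go to
        builder.set(CaptureRequest.CONTROL_AWB_MODE,
                CameraMetadata.CONTROL_AWB_MODE_INCANDESCENT);
        // The request is applied to every subsequent frame until replaced:
        // the same behavior the repeat frame control 1314 describes.
        session.setRepeatingRequest(builder.build(), null, handler);
    }
}
```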
  • Dynamic pipelining can add labels to the above control commands.
  • the above labels may include on-demand delivery labels, flow direction labels, and repeat frame labels.
  • Camera service 132 may include queue 1321 of pending commands.
• the repeated frame control command can replace the basic command 1322 (for example, "cmd+streamIds+buffer") in the original pending command queue 1321, and the above-mentioned basic command also includes other default parameters of the shooting effect controls, such as the default white balance mode (daylight mode), the default filter (no filter), and so on.
• a repeating frame control command 1324 (for example, "cmd+streamIds+buffer+incandescent mode+IsNew+Repeating") issued on demand is added to the pending command queue 1321.
  • the repeat frame control command 1324 can add two fields, IsNew and Repeating. IsNew can be used to indicate that the command is an on-demand control issued by the application. Repeating can be used to indicate that the command is a repeating frame control.
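• The queue bookkeeping can be sketched as follows, assuming hypothetical CaptureCommand and PendingQueue types: a repeating on-demand command replaces the previous repeating basic command in the pending queue.

```java
import java.util.ArrayDeque;
import java.util.Deque;

class CaptureCommand {
    final String cmd;
    final boolean isNew;      // issued on demand by the application
    final boolean repeating;  // applied to every following frame

    CaptureCommand(String cmd, boolean isNew, boolean repeating) {
        this.cmd = cmd;
        this.isNew = isNew;
        this.repeating = repeating;
    }
}

class PendingQueue {
    private final Deque<CaptureCommand> pending = new ArrayDeque<>();

    void submit(CaptureCommand c) {
        if (c.repeating) {
            // The new repeating control replaces the basic repeating command
            // (e.g. command 1322) that was previously applied to every frame.
            pending.removeIf(old -> old.repeating);
        }
        pending.addLast(c);
    }
}
```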
  • the dynamic pipeline can also mark the flow-direction tags that send the above-mentioned control commands to the slave device 200 .
  • the dynamic pipeline can add the flow direction label Device to the above control commands.
• Device can represent the object of the control command through the camera ID.
• for example, when Device is the ID of the front lens of the main device 100, the above control command represents a control command flowing to the front lens of the main device 100.
  • the local camera HAL of the main device 100 can receive the above control command.
• when Device is 1002, the above-mentioned control command may represent a control command flowing to the common rear lens of the slave device 200.
• the virtual camera HAL of the main device 100 can receive the above control command. Therefore, the Device field of the control command that sets the white balance mode of the slave device 200 to the incandescent lamp mode may be set to 1002.
  • the multi-stream configuration in the stream processing module can add stream configuration information to the above control command.
• the above control command includes the preview control 1311 and the shooting effect 1312 of "set white balance mode to incandescent mode". Then, the multi-stream configuration can configure one preview stream (1080P) and one analysis stream (720P) for the preview control 1311.
• for the specific configuration rules of the multi-stream configuration, reference may be made to the introduction of Table 2 in FIG. 3, and details are not repeated here.
• multiplexing control can multiplex the multiple streams configured by the multi-stream configuration module. After multiplexing control, the number of streams transmitted back from the slave device 200 to the master device 100 can be reduced, thereby reducing the network load and improving transmission efficiency. Specifically, multiplexing control can use a high-quality stream to cover a low-quality stream; for example, an analysis stream with a picture quality of 720P can be multiplexed onto the 1080P preview stream. Therefore, for a control command requiring a preview stream (1080P) and an analysis stream (720P), the slave device 200 may return only a 1080P preview stream.
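• A sketch of this covering rule is shown below. The grouping of preview/analysis streams versus video/photo streams is an assumption drawn from the FIG. 14 example (four configured streams reduced to two), and all names are illustrative.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class StreamSpec {
    final String type;   // "preview", "analysis", "video", "photo"
    final int height;    // 720, 1080, 2160 (4K), ...
    StreamSpec(String type, int height) { this.type = type; this.height = height; }
}

class MultiplexControl {
    // Assumed grouping: preview/analysis frames are substitutable, as are
    // video/photo frames, matching the FIG. 14 example.
    static List<StreamSpec> multiplex(List<StreamSpec> required) {
        Map<String, StreamSpec> best = new HashMap<>();
        for (StreamSpec s : required) {
            String group = (s.type.equals("video") || s.type.equals("photo"))
                    ? "capture" : "preview";
            // Keep only the highest-quality stream of each group: the
            // high-quality stream "covers" the lower-quality ones.
            best.merge(group, s, (a, b) -> a.height >= b.height ? a : b);
        }
        return new ArrayList<>(best.values());
    }
}
```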
  • the above control command can be sent to the virtual camera HAL module of the HAL layer.
  • the virtual camera HAL can filter out the control commands 1324 issued on demand.
  • the filtered on-demand control commands may be stored in the send command queue 1331 .
• the master device 100 may transmit the control commands in the send command queue 1331 to the slave device 200.
  • the multi-stream configuration can also configure a photographing stream and a video recording stream.
  • the multiplexing control changes accordingly.
• FIG. 14 exemplarily shows an example of the multiplexing and splitting of different streams when the control command is a photographing command.
  • Figure 14 may include a pre-processing portion and a post-processing portion.
• the pre-processing part can be completed by the pre-processing module of stream processing, and the post-processing part can be completed by the post-processing module of stream processing; for the above two modules, refer to the introduction in FIG. 3.
  • the multi-stream configuration 142 module can configure a preview stream (1080P), an analysis stream (720P), a video stream (4K), and a photo stream (4K) for the above command.
  • the multiplexing control 143 module can multiplex the information for configuring the four streams to configure two streams: a preview stream (1080P) and a video stream (4K).
  • the master device 100 sends the split and multiplexed shooting control commands to the slave device 200 .
  • the master device 100 may multiplex the image stream (stream) required in the control command.
  • the slave device 200 can only send the multiplexed image stream, thereby reducing the image stream transmitted in the network, reducing the network load, and further improving the transmission efficiency.
• S105: The slave device 200 receives the command for controlling the photographing effect, and in response to the command, the slave device 200 may collect and process an image.
  • the camera proxy service of the slave device 200 may receive a control command sent by the master device 100 to adjust the shooting effect.
  • the DMSDP can establish a data session channel and a control session channel between the master device 100 and the slave device 200 .
• the camera proxy service module of the slave device 200 can receive the control command sent through the above-mentioned control session channel.
  • the slave device 200 may acquire and process images according to the shooting parameters carried in the control command.
  • the slave device 200 may perform corresponding operations according to the shooting parameters carried in the control command.
  • the control command carries hardware parameters
• the slave device 200 can collect images according to the hardware parameters, which may include but are not limited to: the slave device 200 uses the front or rear camera indicated in the control command, uses the indicated optical zoom range, enables optical image stabilization, turns on the flash, turns on the fill light, and captures images at a frame rate of 30 fps.
  • the control command carries software parameters
• the slave device 200 can process the collected images according to the software parameters, including but not limited to: cropping the collected images, color temperature calibration, noise reduction, adding a filter effect, and so on.
• the slave device 200 can adapt and parse the above command, and then send the above command to the local camera HAL of the slave device 200, which finally obtains the processing result.
• FIG. 13 also partially and exemplarily shows the processing procedure of the above-mentioned control command in the slave device.
• the camera proxy service 134 may include a receive command queue 1341, a surface mapping table 1344, and a repeat frame control 1342.
  • the receive command queue 1341 may receive a control command sent by the master device 100 to set the white balance mode of the slave device 200 to incandescent mode.
• for the control command, refer to the command 1343 in the figure (i.e., "cmd+streamIds+Repeating+incandescent mode+1002, a 1080P preview stream").
  • the incandescent lamp mode may indicate that the white balance mode of the slave device 200 is set to the incandescent lamp mode, and 1002 may identify that the object sent by the command is the camera 1002 of the slave device 200 .
  • the camera proxy service 134 can convert the StreamIds of the control commands in the received command queue into the Surface corresponding to the slave device 200 according to the surface mapping table 1344 .
• in the camera service, there is a one-to-one correspondence between streamId and surfaceId, that is, one streamId corresponds to one surfaceId.
  • surfaceId can be used to identify surfaces.
• the above-mentioned surfaces may serve as carriers for displaying the images of the streams generated by the slave device 200.
  • streamId may indicate a specific stream, and surfaces may indicate a carrier that specifically displays image rendering.
• the slave device 200 needs to generate the corresponding streams according to the request sent by the master device 100. Therefore, after receiving the control command, the slave device 200 needs to generate the corresponding surfaces from the streamIds according to the mapping table 1344.
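• The surface mapping table might be sketched as a simple one-to-one map, as below. SurfaceMap and its methods are assumed names, with plain strings standing in for real Surface objects.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

class SurfaceMap {
    // One streamId maps to exactly one surfaceId.
    private final Map<Integer, String> streamToSurface = new HashMap<>();

    void register(int streamId, String surfaceId) {
        streamToSurface.put(streamId, surfaceId);
    }

    // Rewrite the streamIds carried in a received control command into the
    // local surfaces that will carry the rendered images of those streams.
    List<String> resolve(List<Integer> streamIds) {
        return streamIds.stream()
                .map(streamToSurface::get)
                .collect(Collectors.toList());
    }
}
```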
  • the pending command queue 1351 can use the above-mentioned repeated frame control to replace the basic command in the original queue to obtain the pending command 1352 .
  • the ISP of the camera 1002 can set the white balance mode to the incandescent lamp mode, so that the ISP of the camera 1002 can perform incandescent lamp color temperature calibration on the image captured by the camera 1002 .
• the above-mentioned ISP may also be an image video processor (IVP)/neural-network processing unit (NPU)/digital signal processor (DSP), etc., which is not limited in this application.
  • the image collected and processed by the camera 1002 of the slave device 200 can be displayed with reference to the window 912 .
• in the window 912, under the default white balance option (e.g., the daylight mode), the image captured and processed by the camera 1002 may have low brightness and a grayish tone.
• the image captured and processed by the camera 1002 after the adjustment can be referred to in the window 931 in the user interface 93; at this time, the brightness of the image is higher, and the overall tone of the picture is closer to the colors observed by the human eye.
• S106: The slave device 200 displays the processed image.
• the slave device 200 may display the image captured and processed by its own camera on the display screen of the device; refer to the user interface 81 shown in FIG. 8A. Therefore, after responding to the control command for adjusting the photographing effect, the slave device 200 can also display the updated image.
• refer to the user interface 94 shown in FIG. 9D.
  • the slave device 200 may not display the image captured and processed by its own camera when the camera is turned on. Therefore, when the slave device 200 responds to the control command sent by the master device 100 to adjust the photographing effect, the slave device 200 may not display the adjusted image. That is, the slave device only sends the obtained image to the master device 100 for use, and its own display screen does not display the image collected and processed by its own camera.
• S107: The slave device 200 sends the processed image to the master device 100.
  • the slave device 200 may obtain a new image.
  • the slave device 200 may generate the streams requested by the master device 100 accordingly.
• the slave device 200 may first generate a set of image streams.
• the set of image streams has the highest image quality among the types and numbers of streams required by the main device 100.
  • the above image quality can be the highest resolution and so on.
• the slave device 200 can copy the above-mentioned set of image streams into multiple streams, and perform processing such as compression and adjustment on the copied streams according to the types and numbers of streams required by the master device 100; for example, a group of 720P analysis streams is copied from the captured 1080P stream.
  • the multiplexing control module can multiplex the two streams configured by the multi-stream configuration module into one stream. Therefore, the master device 100 finally sends a control command requesting a preview stream of 1080P to the slave device 200 .
• for details, refer to the introduction of S103.
  • the local camera HAL of the slave device 200 may generate a 1080P preview stream.
  • the preview stream above has the white balance mode set to incandescent mode.
  • the local camera HAL can then send the above-mentioned stream to the camera proxy service of the slave device 200 , and further, the camera proxy service can send the above-mentioned stream back to the master device 100 .
• in some cases, more than one stream is transmitted back from the slave device 200 to the master device 100, such as the multi-stream situation exemplarily shown in FIG. 14.
• when the master device 100 sends a photographing control command to the slave device 200, the multi-stream configuration and multiplexing control finally configure a preview stream (1080P) and a video stream (4K) for the above photographing control command.
  • the slave device 200 can generate a 1080P preview stream (stream 1), a 4K video stream (stream 2) and a photographed image stream (stream 3).
  • the slave device 200 may acquire a set of image streams according to stream 2 .
  • the slave device 200 can copy the other two groups of streams (stream 1', stream 2').
• the slave device 200 can process stream 1' and stream 2', thereby obtaining the required streams; for example, stream 1 (1080P) is obtained by compressing the 4K stream 1', the photographed image stream is obtained from the 4K stream 2', and so on.
  • the camera proxy service of the slave device 200 may send the above-mentioned 3 streams to the master device 100 .
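• The slave-side generation in this example can be sketched as follows, with FrameSet standing in for a captured group of frames and compressTo for the real scaling or compression step; all names here are assumptions.

```java
import java.util.ArrayList;
import java.util.List;

class FrameSet {
    final int height;                 // quality of this copy, e.g. 2160 for 4K
    FrameSet(int height) { this.height = height; }
    FrameSet copy() { return new FrameSet(height); }
    FrameSet compressTo(int h) { return new FrameSet(h); } // stands in for real scaling
}

class SlaveStreamFactory {
    // Capture once at the highest required quality, then derive each stream.
    static List<FrameSet> produce(FrameSet captured4k) {
        List<FrameSet> out = new ArrayList<>();
        out.add(captured4k.copy().compressTo(1080)); // stream 1: 1080P preview
        out.add(captured4k);                         // stream 2: 4K video
        out.add(captured4k.copy());                  // stream 3: photographed image
        return out;
    }
}
```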
• S108: The master device 100 displays the image sent by the slave device 200.
• the master device 100 can restore the streams transmitted back by the slave device 200 to the number and types of streams actually required by the master device 100.
  • the analysis stream can be used for image processing, the preview stream can be used for display, etc.
  • the master device 100 can send each stream to a corresponding module for processing and utilization according to the type of the restored stream. By transmitting the preview stream from the slave device 200, the master device 100 can realize the function of displaying images.
• the post-processing module of the master device 100 can perform smart stream splitting and multi-stream output on the streams returned by the slave device 200.
• the smart stream splitting module can record the number and types of streams configured by the multi-stream configuration module. After receiving the multiplexed streams returned by the slave device 200, the smart stream splitting module can copy the received streams into multiple streams according to the aforementioned record.
• the above-mentioned duplicated streams may be of the same quality as the received stream, or may be of slightly lower quality than the received stream, so as to obtain multiple streams of the same number and types as the streams configured by the multi-stream configuration module. Then, the multiple streams obtained after splitting can be sent to corresponding modules for processing and utilization.
• the master device 100 may receive a 1080P preview stream transmitted back by the slave device 200.
• the smart stream splitting module can restore the preview stream to the preview stream (1080P) and the analysis stream (720P) configured by the multi-stream configuration module.
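• A sketch of this splitting step: the module records what was configured and copies the returned stream back out, never above the received quality. StreamSpec repeats the shape used in the multiplexing sketch above; names are illustrative.

```java
import java.util.ArrayList;
import java.util.List;

class StreamSpec {
    final String type;
    final int height;
    StreamSpec(String type, int height) { this.type = type; this.height = height; }
}

class SmartSplitter {
    private final List<StreamSpec> configured = new ArrayList<>();

    void record(StreamSpec spec) {           // noted at multi-stream configuration time
        configured.add(spec);
    }

    // Copy the single returned stream back out into the recorded number and
    // types of streams; a copy is never better than the received quality.
    List<StreamSpec> split(StreamSpec received) {
        List<StreamSpec> restored = new ArrayList<>();
        for (StreamSpec want : configured) {
            restored.add(new StreamSpec(want.type,
                    Math.min(want.height, received.height)));
        }
        return restored;
    }
}
```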
• FIG. 14 also shows another example of smart stream splitting. The figure shows the splitting process in which the master device 100 receives multiple streams returned by the slave device 200.
  • the master device 100 can receive a 1080P stream (stream 1), a 4K stream (stream 2), and a picture stream (stream 3).
• the smart stream splitting module can split stream 1 into a 1080P preview stream and a 720P analysis stream.
• stream 2 can be split into a 1080P stream (stream 1) and a 4K video stream, and the above-mentioned 1080P stream (stream 1) can continue to be split; refer to the splitting of stream 1.
• stream 3 can be restored to a 4K photo stream.
• the stream splitting process can restore the multiplexed streams to the streams originally required by the master device 100; this retains the reduced network load and improved transmission efficiency brought about by multiplexing while meeting the original requirements of the application program, without affecting the normal use of the application.
• the multiple streams obtained after the splitting can be respectively sent to the corresponding modules of the application layer for use by the application program.
  • the application can display a preview stream.
• the master device 100 may only display the images returned by the slave device 200.
• the main device 100 can simultaneously display the image returned by the slave device 200 and the image captured and processed by its own camera; refer to the user interface 93 shown in FIG. 9C.
• the main device 100 displays the image returned by the slave device 200 in the form of a floating window; refer to the window 931.
  • the main device 100 may detect a user operation acting on the window 931 , and in response to the operation, the main device 100 may move the position of the floating window to any position of the user interface 93 .
  • the above operation is, for example, a long-press and drag operation.
• the main device 100 may detect another user operation acting on the window 931, and in response to the operation, the main device 100 may exchange the content displayed in the floating window and the preview window.
  • the above operation is, for example, a click operation.
  • the main device 100 can also adjust the size of the floating window and so on in response to other operations of the user.
• when the main device 100 simultaneously displays the image returned by the slave device 200 and the image captured and processed by its own camera, the images can also be displayed in two tiled windows, such as in the user interface 114.
• when the master device 100 is connected to multiple slave devices, the master device 100 can display the images returned by the aforementioned multiple slave devices. Similarly, the master device 100 may display them through floating windows, or may display them tiled, which is not limited in this embodiment of the present application.
• before the post-processing splits the streams returned by the slave device 200, the frame synchronization module may synchronize the received streams returned by the slave device 200.
  • Frame synchronization reduces delay errors due to network transmission.
  • the photographing result obtained through the frame-synchronized photographing stream can be closer to the user's requirement.
  • FIG. 15 exemplarily shows a schematic diagram of a frame synchronization method.
  • the frame synchronization may consist of three parts.
  • the first portion 151 and the third portion 153 may represent processes taking place on the master device 100 .
  • the second portion 152 may represent a process that occurs on the slave device 200 .
• the following takes a stream composed of 7 frames and a photographing command as an example to describe the frame synchronization process of the photographing command by the master device 100.
  • the master device 100 may create a photographing command 1512. Then, the master device 100 may transmit the above-mentioned command to the slave device 200 through a wired connection or a wireless connection. The slave device 200 receives the above command in the third frame 1521 . Therefore, the slave device 200 takes the image of the third frame 1521 as a result of executing the photographing command 1512 . After processing the photographing command, the slave device 200 may send the above processing result back to the master device 100 . The processing result received by the master device 100 is the third frame 1521 image. At this time, the master device 100 may perform synchronization processing on the above results.
  • frame synchronization can advance the processing results.
  • the processing result received by the master device 100 is the third frame 1521 image.
  • the master device 100 may move forward by one frame, and use the second frame 1511 as the processing result after synchronization.
• the one-frame delay in the slave device 200 receiving the command is merely an example of network delay. That is, in some embodiments, the time when the slave device 200 receives the command may also be the fourth frame, the fifth frame, and so on.
• the delay in the slave device 200 receiving the command varies according to the actual network communication quality.
• likewise, that the master device 100 moves forward by one frame as the synchronization result is also merely an example.
• when the master device 100 moves forward, the processed result after synchronization may not be the second frame 1511. For example, at the second frame, the master device 100 may create a photographing command. Then, the slave device 200 receives the above command at the fourth frame. Therefore, the slave device 200 takes the image of the fourth frame as the result of executing the photographing command. The master device 100 performs forward synchronization processing on the received fourth frame of image to obtain the third frame of image.
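• The arithmetic in these two examples reduces to subtracting the estimated delay, as the sketch below shows; the fixed one-frame delay is only the example value from FIG. 15, and the class name is illustrative.

```java
class FrameSync {
    // Shift the frame index of the returned result forward by the estimated
    // network delay, expressed in frames.
    static int synchronize(int executedFrame, int estimatedDelayFrames) {
        return Math.max(0, executedFrame - estimatedDelayFrames);
    }

    public static void main(String[] args) {
        System.out.println(synchronize(3, 1)); // command at frame 2, executed at 3 -> frame 2
        System.out.println(synchronize(4, 1)); // command at frame 2, executed at 4 -> frame 3
    }
}
```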
• in some embodiments, the main device 100 may not perform frame synchronization on the received image.
• S109: The main device 100 performs processing such as photographing, video recording, or forwarding on the displayed image.
• after the application program of the main device 100 obtains the image sent by the slave device 200, the main device 100 can further utilize the above-mentioned image.
  • the live broadcast application program of the main device 100 can forward the obtained image.
  • the main device 100 may forward the above-mentioned image to the server of the live broadcast.
  • the server may distribute the above image to user devices watching the live broadcast.
• the master device 100 can detect a user operation acting on the shooting control 1146, and in response to the operation, the master device 100 can perform photo storage or video storage on the images obtained from the camera of the slave device 200.
  • the image acquired and processed by the slave device may be called the first image, such as the image shown in the window 912 in the user interface 91 .
  • the image collected and processed by the slave device according to the control command may be referred to as a second image.
• for example, the image shown in the window 921 in the user interface 93.
• the shooting parameters used by the slave device to collect and process the first image may be called the first shooting parameters. The first shooting parameters may be default parameters, or may be the shooting parameters carried in a control command previously received from the master device.
• the shooting parameters used by the slave device to collect and process the second image may be called the second shooting parameters, that is, the shooting parameters carried in the control command sent by the master device to the slave device to adjust the shooting effect.
  • the image acquired and processed by the main device according to the default shooting parameters of its own camera may be called a third image, for example, the image displayed in the window 911 of the user interface 91 .
  • the user can adjust the shooting effect of the main device, and in response to the above adjustment operation, the main device can control its own camera to adjust the shooting parameters.
  • the main device may acquire and process a new image according to the above-mentioned adjusted shooting parameters, and the above-mentioned new image may be referred to as a fourth image.
• for example, the image displayed in the window 1031 in the user interface 103.
• the number of streams configured by the master device according to the requirements of the application layer may be called the first number; the number of streams sent to the slave device, obtained by the master device after multiplexing the multiplexable streams, may be called the second number; the types of the streams configured by the master device according to the requirements of the application layer may be called the first type; and the types of the streams sent to the slave device, obtained by the master device after multiplexing the multiplexable streams, may be called the second type.
• the software framework provided by this embodiment implements the management of the entire stream life cycle at the HAL layer; that is, in this embodiment of the present application, the stream processing module of the service layer in FIG. 3 is moved to the HAL layer, while the other modules remain unchanged.
• for the other modules, refer to the introduction in FIG. 3, which is not repeated in this embodiment of the present application.
• the master device can be connected to one or more slave devices, which can not only provide users with a multi-view shooting experience, but can also control the shooting effects of the slave devices, such as controlling the focus, exposure, and zoom of the slave devices, so as to meet users' needs to control the remote shooting effect.
  • the master device can obtain all the preview images, photographing results, and video recording results of the slave devices.
• the master device can store the pictures of the slave device by taking photos or videos, or forward the pictures of the slave device to a third-party server in a live broadcast application scenario.
  • Implementing the cross-device collaborative shooting method can also solve distributed control among devices with operating systems, extend the functions of electronic devices to other common hardware camera devices, and flexibly expand lenses.
• the above method can support acquiring the data streams of multiple devices under the control of a mobile phone, and realize collaborative recording between the mobile phone and a large screen, a watch, an in-vehicle device, and the like.
• the multiplexing and splitting of streams in cross-device collaborative shooting can also reduce the network transmission load and improve transmission efficiency, thereby reducing transmission delay and ensuring clear and smooth image quality.
  • the above-mentioned method, system, and device for collaborative shooting across devices can also be extended to distributed audio scenarios, for example, applying a distributed camera framework to a distributed audio framework.
  • the distributed audio scene can realize the unification of distributed audio and video, and improve the efficiency of the entire cross-device communication.
  • the term “when” may be interpreted to mean “if” or “after” or “in response to determining" or “in response to detecting" depending on the context.
  • the phrases “in determining" or “if detecting (the stated condition or event)” can be interpreted to mean “if determining" or “in response to determining" or “on detecting (the stated condition or event)” or “in response to the detection of (the stated condition or event)”.
• all or part of the above-mentioned embodiments may be implemented by software, hardware, firmware, or any combination thereof.
• when software is used for implementation, the embodiments may be implemented in whole or in part in the form of a computer program product.
• the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
• the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave, etc.).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media (eg, solid state drives), and the like.
• the aforementioned storage media include: a ROM, a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

Provided herein are a cross-device collaborative shooting method, a related apparatus, and a system. In the method, a master device can perform collaborative shooting with a slave device; the master device can receive a control operation by which a user adjusts the shooting effect of the slave device, generate a control command, and send it to the slave device. The slave device can adjust its shooting effect in response to the control command, and then send the image obtained after the adjustment to the master device. By implementing the cross-device collaborative shooting method, during collaborative shooting the master device can not only provide the user with a multi-view shooting experience, but can also control the shooting effect of the slave device, thereby meeting the user's need to control the remote shooting effect.

Description

Cross-device collaborative shooting method, related apparatus, and system
This application claims priority to Chinese Patent Application No. 202110154962.2, entitled "Cross-device collaborative shooting method, related apparatus, and system" and filed with the China National Intellectual Property Administration on February 4, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of photographing technologies, and in particular, to a cross-device collaborative shooting method, a related apparatus, and a system.
Background
With the development of smart mobile devices, the shooting functions of the cameras of smart mobile devices have become more and more powerful.
Because the shooting angle of single-device shooting is limited, current smart mobile devices can perform cross-device collaborative shooting with other devices, so as to obtain more diverse shooting angles and achieve better shooting effects. During cross-device collaborative shooting, users want to be able to control the shooting effect of the other devices. How to meet such shooting needs of users is a problem that urgently needs to be solved.
Summary of the Invention
The present application provides a cross-device collaborative shooting method, a related apparatus, and a system, which solve the problem that a master device cannot control the shooting effect of a slave device during cross-device collaborative shooting.
In a first aspect, an embodiment of the present application provides a cross-device collaborative shooting method. The method is applied to a master device that has established communication connections with m slave devices, where m is an integer greater than or equal to 1. The method includes: the master device displays an interface of an application; the master device receives a first image sent by a slave device, where the first image is an image obtained by the slave device according to first shooting parameters; the master device displays the m first images on the interface; the master device receives at least one operation; in response to the at least one operation, the master device sends a control command carrying second shooting parameters to the slave device, where the second shooting parameters are used to adjust the shooting effect of the slave device; the master device receives a second image sent by the slave device, where the second image is an image obtained by the slave device according to the shooting parameters; the master device displays the second image on the interface and no longer displays the first image.
By implementing the method provided in the first aspect, during cross-device collaborative shooting the master device can first obtain the image collected and processed by the slave device according to the first shooting parameters; through this image the user can discover the shooting parameters that need to be adjusted, and the user can then send, through the master device, a control command for adjusting the shooting effect to the slave device. After the slave device adjusts the shooting parameters, the master device can display the image returned by the slave device after the adjustment according to the control command, so that the user can control the shooting effect of the slave device through the master device.
With reference to the first aspect, the method further includes: the interface of the application displayed by the master device further includes multiple shooting options corresponding to the slave device, where the multiple shooting options respectively correspond to the shooting capabilities of the slave device; the operation includes an operation acting on one of the multiple shooting options; and the second shooting parameters include the shooting parameters corresponding to the shooting option on which the operation acts.
In this way, the user can directly view, on the master device, the shooting capabilities of the slave device, and control the slave device to adjust accordingly through user operations acting on the master device, so that the master device remotely controls the shooting effect of the slave device.
With reference to the foregoing implementation, before the master device displays the multiple shooting options, the master device may further acquire the shooting capabilities of the slave device, where the second shooting parameters are within the range of the shooting capabilities of the slave device.
In this way, the master device can acquire the shooting capabilities of the slave device and can thus display the capabilities of the slave device for the user to view.
With reference to the first aspect, in some implementations, the method further includes: the master device collects and processes an image to obtain a third image; and the master device further displays the third image on the interface.
In this way, while displaying the image collected and processed by the slave device, the master device can also display the image collected and processed by itself. The user can see the pictures shot by the master device and the slave device on the display screen of the master device at the same time, which improves the user experience and meets the need for multi-angle, multi-picture shooting.
With reference to the foregoing implementation, the interface further includes multiple shooting options corresponding to the master device, where the multiple shooting options corresponding to the master device correspond to the shooting capabilities of the master device; the master device receives another operation acting on one of the multiple shooting options corresponding to the master device, and collects and processes an image according to the shooting parameters corresponding to the shooting option on which the other operation acts, to obtain a fourth image; the master device displays the fourth image on the interface and no longer displays the third image.
In this way, the user can also control the shooting effect of the master device itself on the master device, providing the user with more shooting choices.
With reference to the first aspect, in some implementations, before the master device sends, in response to the at least one operation, the control command carrying the second shooting parameters to the slave device, the method further includes: the master device determines a first number and a first type, where the first number is the number of image streams required to display the second image, and the first type includes the types of the image streams required to display the second image; the master device determines a second number and a second type, where the second number is smaller than the first number and the first type includes the second type; the control command further carries the second number and the second type; and the second image includes the image streams of the second number and the second type that the slave device collects and processes according to the shooting parameters.
In this way, the master device can reduce the number of image streams requested from the slave device; correspondingly, when sending image streams to the master device, the slave device can reduce the number of streams sent, thereby reducing the network data transmission load and improving transmission efficiency.
With reference to the foregoing implementation, after the master device receives the second image sent by the slave device, the method further includes: the master device processes the second image into image streams of the first number and the first type; and the master device displaying the second image on the interface includes: the master device displays the second image on the interface according to the image streams of the first number and the first type.
In this way, the master device can duplicate a larger number of image streams from the smaller number of image streams received.
With reference to the first aspect, in some implementations, the method further includes: the master device runs a shooting-type application, and the first interface is provided by the shooting-type application.
With reference to the first aspect, in some implementations, the method further includes: the master device runs a live-broadcast-type application, and the first interface is provided by the live-broadcast-type application; after the master device receives the first image sent by the slave device, the method further includes: the master device sends the first image to a server corresponding to the live-broadcast-type application, and the server sends the first image to a first device.
In a second aspect, an embodiment of the present application provides a cross-device collaborative shooting method. The method is applied to a slave device that has established a communication connection with a master device. The method includes: the slave device collects and processes a first image according to first shooting parameters; the slave device sends the first image to the master device; the slave device receives a control command carrying second shooting parameters sent by the master device, where the second shooting parameters are used to adjust the shooting effect of the slave device; the slave device collects and processes a second image according to the second shooting parameters; and the slave device sends the second image to the master device.
By implementing the method provided in the second aspect, during cross-device collaborative shooting the slave device can first send the image collected and processed according to the first shooting parameters to the master device for the master device's use. The slave device can then also respond to the control command for adjusting the shooting effect sent by the master device. After adjusting the shooting parameters, the slave device can send the adjusted image to the master device, so that the user can use the image of the slave device on the master device and control the shooting effect of the slave device through the master device.
With reference to the second aspect, in some implementations, the method further includes: the slave device displays an interface of an application; the slave device displays the first image on the interface; the slave device collects and processes the second image according to the second shooting parameters; and the slave device displays the second image on the application interface and no longer displays the first image.
In this way, the slave device can display both the image obtained by its own camera according to the first shooting parameters and, after responding to the control command for adjusting the shooting effect sent by the master device, the adjusted image. Therefore, after agreeing to perform collaborative shooting with the master device, the user of the slave device can view the images collected by the slave device at any time.
In a third aspect, an embodiment of the present application provides a cross-device collaborative shooting method. The method is applied to a communication system including one master device and m slave devices. The method includes: the m slave devices collect and process first images according to first shooting parameters; the m slave devices send the first images to the master device; the master device displays the m first images on an interface; in response to at least one operation, the master device sends a control command carrying second shooting parameters to n slave devices, where the second shooting parameters are used to adjust the shooting effect of the slave devices; the n slave devices respectively collect and process n second images according to the second shooting parameters; the n slave devices respectively send the second images they obtain to the master device; on the above interface, the master device uses the second image from the i-th slave device to replace the first image of the i-th slave device; n is less than or equal to m, and 1≤i≤n.
By implementing the method provided in the third aspect, during cross-device collaborative shooting the master device can establish connections with multiple slave devices and send control commands for adjusting the shooting effect to the multiple slave devices; the slave devices that receive the control commands adjust their shooting effects according to the shooting parameters carried in the control commands, and return the images collected and processed after the adjustment to the master device. In this way, the user can control the shooting effects of multiple slave devices at the same time through the master device, and view the images shot by multiple slave devices on one master device.
With reference to the third aspect, different slave devices correspond to different first shooting parameters. The first shooting parameters may be default shooting parameters, or may be the shooting parameters carried in the control command previously sent by the master device and received by the slave device.
With reference to the third aspect, the second shooting parameters corresponding to different slave devices may be the same or different; the second shooting parameters corresponding to a slave device depend on the user's operations directed at that slave device.
Among the m slave devices, each slave device corresponds to one first image, and the first image corresponding to a slave device is collected and processed according to the first shooting parameters corresponding to that slave device.
Among the n slave devices, each slave device corresponds to one second image, and the second image corresponding to a slave device is collected and processed according to the second shooting parameters corresponding to that slave device.
With reference to the third aspect, the method further includes: the interface further includes multiple shooting options, the multiple shooting options respectively correspond to the m slave devices, and the shooting options correspond to the shooting capabilities of the slave devices; the operation in the third aspect includes an operation acting on a shooting option; and the second shooting parameters include the shooting parameters corresponding to the shooting option on which the operation acts.
In this way, the user can directly view the shooting capabilities of multiple slave devices on one master device, and then control the multiple slave devices through the master device to adjust accordingly.
With reference to the foregoing implementation, before the master device displays the multiple shooting options, the master device may further acquire the shooting capabilities of the m slave devices, where the second shooting parameters corresponding to one of the slave devices are within the range of the shooting capabilities of that slave device.
In this way, the master device can acquire the shooting capabilities of multiple slave devices, and can thus display the shooting capabilities of each slave device for the user to view.
With reference to the third aspect, in some embodiments the method further includes: a slave device displays an interface of an application; the slave device displays the first image on the interface; the slave device collects and processes the second image according to the second shooting parameters; and the slave device displays the second image on the application interface and no longer displays the first image.
In this way, for each slave device, the slave device can display both the image obtained by its own camera according to the first shooting parameters and, after responding to the control command for adjusting the shooting effect sent by the master device, the adjusted image. Therefore, after agreeing to perform collaborative shooting with the master device, the user of a slave device can view the images collected by that slave device at any time.
With reference to the third aspect, in some embodiments, when the master device displays the m first images on the interface, the method further includes: the master device collects and processes an image to obtain a third image; and the master device further displays the third image on the interface.
In this way, while displaying the images collected and processed by multiple slave devices, the master device can also display the image collected and processed by itself. The user can see the pictures shot by the master device and the multiple slave devices on the display screen of the master device at the same time, which improves the user experience and meets the need for multi-angle, multi-picture shooting.
With reference to the foregoing embodiment, the interface further includes multiple shooting options corresponding to the master device, where the multiple shooting options corresponding to the master device correspond to the shooting capabilities of the master device; the master device receives another operation acting on one of the multiple shooting options corresponding to the master device, and collects and processes an image according to the shooting parameters corresponding to the shooting option on which the second operation acts, to obtain a fourth image; the master device displays the fourth image on the interface and no longer displays the third image.
In this way, the user can also control the shooting effect of the master device itself on the master device, providing the user with more shooting choices.
With reference to the third aspect, before the master device sends, in response to the at least one operation, the control command carrying the second shooting parameters to the slave device, the method further includes: the master device determines a first number and a first type, where the first number is the number of image streams required to display the second image, and the first type includes the types of the image streams required to display the second image; the master device determines a second number and a second type, where the second number is smaller than the first number and the first type includes the second type; the control command further carries the second number and the second type; and the second image includes the image streams of the second number and the second type that the slave device collects and processes according to the shooting parameters.
In this way, the master device can reduce the number of image streams requested from the slave device; correspondingly, when sending image streams to the master device, the slave device can reduce the number of streams sent, thereby reducing the network data transmission load and improving transmission efficiency.
With reference to the foregoing embodiment, before the master device displays the second image on the interface, the method further includes: the master device processes the second image into image streams of the first number and the first type; and the master device displaying the second image on the interface includes: the master device displays the second image on the interface according to the image streams of the first number and the first type.
In this way, the master device can duplicate a larger number of image streams from the smaller number of image streams received.
In a fourth aspect, an embodiment of the present application provides an electronic device. The electronic device includes one or more processors and one or more memories, where the one or more memories are coupled to the one or more processors and are configured to store computer program code, the computer program code includes computer instructions, and when the one or more processors execute the computer instructions, the electronic device is caused to perform the method described in the first aspect or any implementation of the first aspect, or the method described in the third aspect or any implementation of the third aspect.
In a fifth aspect, an embodiment of the present application provides an electronic device. The electronic device includes one or more processors and one or more memories, where the one or more memories are coupled to the one or more processors and are configured to store computer program code, the computer program code includes computer instructions, and when the one or more processors execute the computer instructions, the electronic device is caused to perform the method described in the second aspect or any implementation of the second aspect.
In a sixth aspect, an embodiment of the present application provides a computer program product containing instructions. When the computer program product runs on an electronic device, the electronic device is caused to perform the method described in the first aspect or any implementation of the first aspect, or the method described in the third aspect or any implementation of the third aspect.
In a seventh aspect, an embodiment of the present application provides a computer-readable storage medium, including instructions. When the instructions run on an electronic device, the electronic device is caused to perform the method described in the first aspect or any implementation of the first aspect, or the method described in the third aspect or any implementation of the third aspect.
In an eighth aspect, an embodiment of the present application provides a computer program product containing instructions. When the computer program product runs on an electronic device, the electronic device is caused to perform the method described in the second aspect or any implementation of the second aspect.
In a ninth aspect, an embodiment of the present application provides a computer-readable storage medium, including instructions. When the instructions run on an electronic device, the electronic device is caused to perform the method described in the second aspect or any implementation of the second aspect.
In a tenth aspect, an embodiment of the present application provides a communication system. The communication system includes a master device and a slave device, where the master device is configured to perform the method described in the first aspect or any implementation of the first aspect, or the method described in the third aspect or any implementation of the third aspect, and the slave device is configured to perform the method described in the second aspect or any implementation of the second aspect.
By implementing the cross-device collaborative shooting method provided by the embodiments of the present application, a user can connect one or more slave devices through the master device, which not only provides the user with a multi-view shooting experience but also allows the user to control the shooting effect of the slave devices, thereby meeting the user's need to control the remote shooting effect. Implementing the cross-device collaborative shooting method can also solve distributed control among devices with operating systems, extend the functions of an electronic device to other ordinary hardware camera devices, and flexibly expand lenses.
Brief Description of the Drawings
FIG. 1A and FIG. 1B are schematic diagrams of two kinds of cross-device collaborative shooting provided by embodiments of the present application;
FIG. 2A is a system structure diagram provided by an embodiment of the present application;
FIG. 2B is a schematic diagram of the hardware structure of an electronic device 400 provided by an embodiment of the present application;
FIG. 3 is a software structure framework of the master device 100 provided by an embodiment of the present application;
FIG. 4 is a software structure framework of the slave device 200 provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a service scenario provided by an embodiment of the present application;
FIG. 6A-FIG. 6D, FIG. 7A-FIG. 7B, FIG. 8A-FIG. 8B, FIG. 9A-FIG. 9D, FIG. 10A-FIG. 10C, and FIG. 11A-FIG. 11D are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIG. 12 is a flowchart of a method provided by an embodiment of the present application;
FIG. 13 is a schematic diagram of a dynamic pipeline processing principle provided by an embodiment of the present application;
FIG. 14 is a schematic diagram of a multiplexing and stream-splitting processing principle provided by an embodiment of the present application;
FIG. 15 is a schematic diagram of a frame synchronization processing principle provided by an embodiment of the present application.
Detailed Description of Embodiments
The terms used in the following embodiments of the present application are only for the purpose of describing specific embodiments, and are not intended to limit the present application. As used in the specification and the appended claims of the present application, the singular expressions "a", "an", "the", "the above", "said", and "this" are intended to also include the plural expressions, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" used in the present application refers to and includes any or all possible combinations of one or more of the listed items.
The term "user interface (UI)" in the specification, claims, and drawings of the present application is a medium interface for interaction and information exchange between an application or the operating system and a user; it realizes the conversion between the internal form of information and a form acceptable to the user. The user interface of an application is source code written in specific computer languages such as Java and the extensible markup language (XML); the interface source code is parsed and rendered on a terminal device and finally presented as content that the user can recognize, such as controls like pictures, text, and buttons. A control, also called a widget, is a basic element of a user interface; typical controls include a toolbar, a menu bar, a text box, a button, a scrollbar, pictures, and text. The attributes and content of the controls in an interface are defined by tags or nodes; for example, XML specifies the controls contained in an interface through nodes such as <Textview>, <ImgView>, and <VideoView>. A node corresponds to a control or an attribute in the interface, and after being parsed and rendered, the node is presented as content visible to the user. In addition, the interfaces of many applications, such as hybrid applications, usually also contain web pages. A web page, also called a page, can be understood as a special control embedded in an application interface; a web page is source code written in specific computer languages, such as the hypertext markup language (HTML), cascading style sheets (CSS), and JavaScript (JS); web page source code can be loaded and displayed as user-recognizable content by a browser or by a web page display component with functions similar to a browser. The specific content contained in a web page is also defined by tags or nodes in the web page source code; for example, HTML defines the elements and attributes of a web page through <p>, <img>, <video>, and <canvas>.
A commonly used form of user interface is the graphical user interface (GUI), which refers to a user interface displayed graphically and related to computer operations. It may be an interface element such as an icon, a window, or a control displayed on the display screen of an electronic device, where the control may include visible interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
FIG. 1A exemplarily shows a process of cross-device collaborative shooting.
As shown in FIG. 1A, cross-device collaborative shooting involves a master device and a slave device. First, the master device can discover the slave device, and the slave device can then register with the master device. After registration, the master device can send commands, such as a command for turning on the camera, to the slave device over a wireless network; the slave device can start its camera to capture pictures in response to the command, compress the captured pictures, and send them to the master device. Finally, the master device can display, on a preview interface, the pictures captured by the slave device as well as the pictures captured by the master device's own camera, thereby realizing cross-device collaborative shooting.
However, in the manner shown in FIG. 1A, the master device cannot obtain the capability information of the slave device, and therefore cannot control or adjust the shooting pictures of the slave device; for example, it cannot deliver control commands for adjusting the shooting effect, such as zoom or flash commands, to the slave device.
FIG. 1B shows a cross-device collaborative shooting scenario involved in instant messaging.
As shown in FIG. 1B, device A and device B can conduct a video call through an instant messaging server, such as the video call provided by WeChat. Specifically, device A can capture an image through its camera and then send the image to device B through the instant messaging server. Device B can directly display the image sent by device A on the video call interface, and can also display the image captured by device B itself, thereby realizing cross-device collaborative shooting.
The method shown in FIG. 1B is similar to that in FIG. 1A: device A cannot obtain the capability information of device B, and therefore cannot control or adjust the shooting pictures of device B; for example, it cannot deliver control commands for adjusting the shooting effect, such as zoom or flash commands, to device B. Moreover, since device A and device B communicate through the network, a large delay is caused during cross-device collaborative shooting, which affects the user experience.
To solve the problem that the shooting effect of other devices cannot be controlled in cross-device collaborative shooting, the following embodiments of the present application provide a cross-device collaborative shooting method, a related apparatus, and a system. The cross-device collaborative shooting method involves a master device and a slave device. In the method, during collaborative shooting between the master device and the slave device, the master device can receive a user's control operation directed at the slave device; the slave device can adjust the shooting effect in response to the control operation and then send the image obtained after the adjustment to the master device. The master device can then present the image returned by the slave device. In some cases, the master device can also simultaneously display the image it has collected and processed itself. In addition, the master device can also perform processing such as photographing, video recording, and forwarding on the displayed images in response to user operations.
By implementing the cross-device collaborative shooting method, during collaborative shooting the master device can not only provide the user with a multi-view shooting experience but can also control the shooting effect of the slave device, thereby meeting the user's need to control the remote shooting effect. In addition, the master device can also realize processing such as previewing, photographing, video recording, forwarding, and editing of the respective pictures of the slave device and the master device.
Implementing the cross-device collaborative shooting method can meet the control needs of various cross-device collaborative shooting scenarios, such as a mobile phone controlling the shooting effect of a television, a watch controlling the shooting effect of a mobile phone, a mobile phone controlling the shooting effect of a tablet computer, and so on.
In the following embodiments of the present application, the number of master devices is one. The number of slave devices is not limited and may be one or more. In this way, the images finally presented by the master device may include images from multiple devices; for example, the preview picture finally displayed by the master device includes the image of the master device and the images returned by multiple slave devices.
The collaborative shooting mentioned in the following embodiments of the present application means that the master device and the slave device establish a communication connection, both the master device and the slave device use cameras to shoot and process images, the slave device sends its shot images to the master device based on the communication connection, and the master device displays the shot images of the slave device. In some cases, during collaborative shooting, the master device can also simultaneously display the image it has collected and processed itself.
The communication connection between the master device and the slave device may be a wired connection or a wireless connection. The wireless connection may be a short-range connection such as a wireless fidelity (Wi-Fi) connection, a Bluetooth connection, an infrared connection, an NFC connection, or a ZigBee connection, or may be a long-range connection (including but not limited to mobile networks supporting 2G, 3G, 4G, 5G, and subsequent standard protocols). For example, the master device and the slave device can log in to the same user account (for example, a Huawei account) and then connect over a long distance through a server (for example, a multi-device collaborative shooting server provided by Huawei).
The adjustment of the shooting effect involved in the following embodiments of the present application refers to the adjustment of the shooting parameters of an electronic device. The shooting parameters of an electronic device in turn include: hardware parameters of the camera involved in capturing images, and/or software parameters involved in processing images. The shooting parameters also include some combined parameters of hardware parameters and software parameters, such as hybrid zoom range, night mode, portrait mode, time-lapse shooting, slow motion, panorama mode, HDR, and so on.
The hardware parameters include one or more of the following: the number of cameras, the type of camera, optical zoom value, whether optical image stabilization is enabled, aperture size, whether the flash is enabled, whether the fill light is enabled, shutter time, ISO sensitivity value, pixels, video frame rate, and so on. The types of cameras may include but are not limited to a common camera, a wide-angle camera, and an ultra-wide-angle camera; the optical zoom value may be 1x zoom, 2x zoom, or 5x zoom; the aperture size may be f/1.8, f/1.9, or f/3.4; the shutter time may be 1/40, 1/60, 1/200, and so on.
The software parameters include one or more of the following: digital zoom value, image crop size, color temperature calibration mode of the image, whether to denoise the image, beauty/body-shaping type, filter type, sticker options, whether to enable selfie mirroring, and so on. The digital zoom value may be 10x zoom or 15x zoom; the image crop size may be 3:3, 3:4, or 9:16; the color temperature calibration mode may be the daylight, fluorescent, incandescent, shade, or cloudy calibration mode; the beauty/body-shaping type may be face slimming, body slimming, skin smoothing, skin whitening, eye enlarging, acne removal, and the like; the filter type may be Japanese style, texture, bright, soft light, cyberpunk, and the like; the stickers may be stickers of expressions, animals, scenery, illustrations, and the like.
When an electronic device responds to a specific shooting mode or enables a specific algorithm, the electronic device will adjust the shooting parameters. For example, when the camera uses the face mode function, the electronic device may perform operations such as reducing the focal length parameter among the shooting parameters, enlarging the aperture, turning on the fill light, and using the default beauty algorithm at the same time.
When an electronic device shoots an image, the shooting parameters and processing parameters used may also refer to the shooting capabilities of the camera or camera group recorded in subsequent embodiments. The parameter ranges of the shooting parameters and processing parameters can be determined according to the shooting capabilities.
The default shooting parameters indicate the parameters used by the master device and the slave device when the camera is enabled. The default shooting parameters may be parameters preset at the factory for the camera, or may be the parameters used the last time the user used the camera. Likewise, the above parameters include multiple hardware parameters used by the camera when capturing images, and multiple software parameters used by the image processing module when processing images.
The following first introduces the system provided by the embodiments of the present application. FIG. 2A exemplarily shows the structure of the system 10.
As shown in the figure, the system 10 includes a master device 100 and a slave device 200. The number of slave devices 200 may be one or more; in FIG. 2A, one slave device 200 is taken as an example for description.
Both the master device 100 and the slave device 200 are electronic devices configured with cameras. The embodiments of the present application do not limit the number of cameras of the master device 100 and the slave device 200. For example, the slave device 200 may be configured with five cameras (2 front cameras and 3 rear cameras).
Electronic devices include but are not limited to smartphones, tablet computers, personal digital assistants (PDA), wearable electronic devices with wireless communication functions (such as smart watches and smart glasses), augmented reality (AR) devices, virtual reality (VR) devices, and so on. Exemplary embodiments of electronic devices include but are not limited to portable electronic devices running Linux or other operating systems. The above electronic devices may also be other portable electronic devices, such as laptop computers. It should also be understood that in some other embodiments, the above electronic device may not be a portable electronic device but a desktop computer or the like.
A communication connection is established between the master device 100 and the slave device 200, and the communication connection may be a wired connection or a wireless connection.
In some embodiments, the wireless connection may be a short-range connection such as a wireless fidelity (Wi-Fi) connection, a Bluetooth connection, an infrared connection, an NFC connection, or a ZigBee connection. The master device 100 may directly send a control command for adjusting the shooting effect to the slave device 200 through the short-range connection. The slave device 200 may respond to the above control command delivered by the electronic device 100 and return the adjusted image to the master device 100. The master device 100 may then display the image returned by the slave device 200. In addition, the master device 100 may also use the above image to complete video recording, photographing, and forwarding tasks. For specific implementations of the master device 100 sending the control command for adjusting the shooting effect to the slave device 200 and of the slave device 200 adjusting the image according to the control command, refer to the detailed descriptions of the subsequent method embodiments, which will not be repeated here.
In some other embodiments, the wireless connection may also be a long-range connection, including but not limited to mobile networks supporting 2G, 3G, 4G, 5G, and subsequent standard protocols.
Optionally, the system 10 shown in FIG. 2A may further include a server 300. The master device and the slave device may log in to the same user account (for example, a Huawei account) and then connect over a long distance through the server 300 (for example, a multi-device collaborative shooting server provided by Huawei). The server 300 may be used for data transmission between the master device 100 and the slave device 200; that is, the master device 100 may send control commands to the slave device 200 through the server 300. Likewise, the slave device 200 may send images to the master device 100 through the server 300.
FIG. 2B is a schematic diagram of the structure of the electronic device 400 provided by an embodiment of the present application. The electronic device 400 may be the master device 100 or the slave device 200 in the system 10 shown in FIG. 2A.
The electronic device 400 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment of the present invention does not constitute a specific limitation on the electronic device 400. In some other embodiments of the present application, the electronic device 400 may include more or fewer components than shown in the figure, or combine some components, or split some components, or have a different component arrangement. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and so on. Different processing units may be independent devices or may be integrated in one or more processors.
The controller can generate operation control signals according to command operation codes and timing signals, and complete the control of fetching and executing commands.
A memory may also be provided in the processor 110 for storing commands and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may store commands or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the command or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
The wireless communication function of the electronic device 400 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 400 may be used to cover a single communication frequency band or multiple communication frequency bands. Different antennas may also be multiplexed to improve antenna utilization; for example, the antenna 1 may be multiplexed as a diversity antenna of the wireless local area network. In some other embodiments, an antenna may be used in combination with a tuning switch. The mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied to the electronic device 400. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the same device as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio output device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194. In some embodiments, the modem processor may be an independent device. In some other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 can provide solutions for wireless communication applied to the electronic device 400, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation. Exemplarily, the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, and the like.
In some embodiments, the antenna 1 of the electronic device 400 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 400 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), 5G and subsequent standard protocols, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, and the like. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite based augmentation systems (SBAS).
The electronic device 400 implements the display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing and connects the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program commands to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. In some embodiments, the electronic device 400 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 400 can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, the light signal is converted into an electrical signal, and the camera's photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image. The ISP can also optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or videos. An object generates an optical image through the lens and projects it onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
In some embodiments, the electronic device 400 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 400 selects a frequency point, the digital signal processor is used to perform Fourier transform and the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 400 may support one or more video codecs. In this way, the electronic device 400 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
The touch sensor 180K is also called a "touch panel". The touch sensor 180K may be provided on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch screen". The touch sensor 180K is used to detect touch operations acting on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194.
当图2B所示的电子设备400为图2A中的主设备100时,
移动通信模块150和无线通信模块160可用于为主设备100提供通信服务。具体的,在本申请实施例中,主设备100可通过移动通信模块150或无线通信模块160与具有摄像头193的其他电子设备(即从设备200)建立通信连接。此外,通过上述连接,主设备100可向从设备200发送控制命令,并接收从设备200传回的图像。
ISP，摄像头193，视频编解码器，GPU，显示屏194以及应用处理器等为主设备100提供拍摄并显示图像的功能。当主设备100打开摄像头193时，主设备100可获得摄像头193采集的光学成像，并通过ISP将光信号转换为电信号。在本申请实施例中，对拍摄效果进行控制时，ISP还可以对拍摄场景的曝光、色温等拍摄参数进行调整，对图像的噪点、亮度、肤色进行图像处理参数优化。
在跨设备协同拍摄中,视频编解码器可用于对数字视频压缩或解压缩。主设备100可通过视频编解码器将跨设备协同拍摄的拍摄文件编码成多种格式的视频文件。
通过GPU，显示屏194，以及应用处理器等，主设备100可实现显示功能。具体的，电子设备可将摄像头193采集的图像等通过显示屏194显示。此外，主设备100接收到的从设备200发送的图像，也可通过上述显示屏194等器件完成显示。在一些实施例中，显示屏194也可只显示从设备200发送的图像。同时通过触摸传感器180K，即“触控面板”，主设备100可响应作用于用户界面控件的用户操作。
当图2B所示的电子设备400为图2A中的从设备200时,
移动通信模块150和无线通信模块160可用于为从设备200提供通信服务。具体的,从设备200可通过移动通信模块150或无线通信模块160与主设备100建立通信连接。通过上述连接,从设备200接收主设备100发送的用于控制拍摄效果的控制命令,并可响应于该控制命令拍摄图像,向主设备100发送摄像头193采集并处理后的图像。
与主设备100同样的,ISP,摄像头193,视频编解码器可为从设备200提供拍摄并发送图像的功能。在跨设备协同拍摄中,在从设备200向主设备100发送图像时,视频编解码器可用于对数字视频压缩或解压缩。从设备200可通过视频编解码器将跨设备协同拍摄的拍摄文件编码成图像流,然后再发送给主设备100。
在一些实施例中,从设备200也可显示自身摄像头193采集的图像。此时,通过GPU,显示屏194,以及应用处理器等可为从设备200实现显示功能。
同时通过触摸传感器180K,从设备200可响应作用于用户界面控件的用户操作。
主设备100和从设备200的软件系统均可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说明主设备100和从设备200的软件结构。当然,在其他操作系统中(例如鸿蒙系统、Linux系统等),只要各个功能模块实现的功能和本申请的实施例类似也能实现本申请的方案。
图3是本发明实施例的主设备100的软件结构框图。
如图3所示，主设备100的软件结构框图可包括应用层、框架层、服务层和硬件抽象层（hardware abstraction layer，HAL）。其中，框架层还可包括设备虚拟化套件（device virtual kit，DVKit）和设备虚拟化平台（distributed mobile sensing development platform，DMSDP）。
DVkit是一个软件开发工具包（software development kit，SDK）。DVkit可向应用层提供能力接口。通过上述接口，应用层可调用DVkit中提供的服务与能力，例如发现从设备等等。DMSDP是一个框架层的服务。当DVkit发起连接从设备后，DVkit可将DMSDP服务拉起，然后DMSDP可实现连接从设备过程中的控制会话、数据会话的传输等。应用层可以包括一系列应用程序包。例如可以包括相机，图库，日历，通话，地图，导航，WLAN，蓝牙，音乐，视频，短信息等应用程序。在本申请实施例中，应用层包括各类使用摄像头的应用程序，例如相机应用程序、直播应用程序、视频通话应用程序等等。其中，视频通话应用程序是指同时具有语音通话和视频通话的应用程序，例如即时通讯应用程序微信（图3中未示出）等等。相机应用程序可以包括原生的相机应用程序和第三方的相机应用程序。应用层可向框架层请求使用摄像头的拍摄能力。
框架层为应用层的应用程序提供应用编程接口(application programming interface,API)和编程框架。框架层包括一些预先定义的函数。
如图3所示,框架层可包括相机套件(CameraKit)和相机接口(Camera API)。
其中,相机套件(CameraKit)可包括模式管理模块,该模式管理模块可用于调整主设备100运行各类使用摄像头的应用程序时,所使用的拍摄模式。拍摄模式可包括但不限于预览模式、拍照模式、录像模式、跨设备模式等等。其中,主设备100进入跨设备模式后,设备虚拟化套件(DVKit)可用于发现从设备200。这些拍摄模式可通过调用相机接口(Camera API)实现。
相机接口(Camera API)可包括相机管理(CameraManager)和相机设备(CameraDevice)两部分。
其中,相机管理(CameraManager)可用于管理主设备100的拍摄能力,以及,与主设备100连接的从设备200的拍摄能力。
设备的拍摄能力可包括摄像头的硬件能力和ISP/GPU等图像处理软件的软件能力。硬件能力是指摄像头具备的一些可供调整的能力。软件能力是指ISP、GPU等图像处理模块对电信号图像进行处理的能力。拍摄能力还包括一些同时组合硬件能力和软件能力的能力，例如混合变焦范围、夜间模式、人像模式、延时拍摄、慢动作、全景模式等等。以人像模式为例：主设备100（或从设备200）可调整摄像头焦距，同时添加美颜算法等。
硬件能力包括以下一项或多项:摄像头数量、摄像头的类型、光学变焦范围、光学图像防抖、光圈调节范围、闪光灯、补光灯、快门时间、ISO感光度、像素以及视频帧率等等。其中,摄像头的类型可包括但不限于普通摄像头、广角摄像头、超广角摄像头等等。光学变焦范围可以是1倍-5倍变焦;光圈大小可以是f/1.8-f/17;快门时间可以是1/40、1/60、1/200等等。
软件能力包括以下一项或多项:数码变焦范围、支持的图像裁剪规格、支持的图像的色温校准方式、支持图像降噪、支持的美颜/美体类型、支持的滤镜类型、支持的贴纸、支持自拍镜像。其中,数码变焦范围可以是10倍-15倍变焦;图像剪裁大小可以是3:3、3:4、9:16;色温校准模式可以是日光、荧光、白炽灯、阴影、阴天校准模式,以及瘦脸、瘦身、磨皮、美白、大眼、除痘等美颜算法、美体算法,日系、质感、明亮、柔光、赛博朋克等滤镜算法,表情、动物、风景、插画等贴纸等等。
表1示例性示出了主设备100与从设备200各自的拍摄能力。
表1
如表1所示,相机管理(CameraManager)可记录主设备100的摄像头和从设备200的摄像头的编号。例如主设备100的三个摄像头可分别编号为1、2、3。从设备200的三个摄像头可分别编号为1001、1002、1003。该编号用于唯一标识摄像头。特别的,从设备200的摄像头的编号,可通过主设备100的虚拟相机HAL完成。具体编号规则可参考后续对虚拟HAL的介绍,这里不再赘述。
可以理解的是,上述硬件能力和软件能力还可分别包括其他能力,不限于上述提及的内容,本申请实施例对此不作限制。
相机设备（CameraDevice）可用于响应应用层中各个应用程序，将流的控制命令转发给服务层进行进一步的处理。本申请实施例中的流是指一组顺序、大量、快速、连续到达的数据序列。一般情况下，数据流可被视为一个随时间延续而无限增长的动态数据集合。本申请的流是一帧一帧的图像所组成的图像流。流的控制命令由应用层的各个应用程序创建，并下发至应用层之下的其余模块。
相机设备(CameraDevice)还可以和设备虚拟化套件(DVKit)和设备虚拟化平台(DMSDP)协作建立主设备100和从设备200之间的通信连接,并将从设备200和主设备100进行绑定。
具体的,设备虚拟化套件(DVKit)可用于发现从设备200,设备虚拟化平台(DMSDP)可用于和发现的从设备200建立会话通道。DMSDP可包括控制会话(control session)和数据会话(data session)。其中控制会话用于传输主设备100和从设备200之间的控制命令(例如使用从设备200摄像头的请求、拍照命令、录像命令、调整拍摄效果的控制命令等)。数据会话用于传输从设备200返回的流,例如预览流、拍照流、录像流等等。
服务层可包括相机服务（CameraService）、CameraDeviceClient、Camera3Device、CameraProviderManager、设备协同管理、动态流水线和流处理等模块。
其中,相机服务(CameraService)为相机接口(Camera API)提供各种实现接口功能的服务。
CameraDeviceClient是camera的实例化。一个CameraDeviceClient对应一个camera。CameraService可包括创建CameraDeviceClient的函数，例如connectDevice()。当响应于应用层创建一个camera实例的请求时，CameraService调用上述函数，从而创建一个与上述camera对应的CameraDeviceClient实例。
Camera3Device可用于管理各种类型流的生命周期,包括但不限于创建、停止、清除、销毁流信息等。
CameraProviderManager可用于获取虚拟相机HAL信息,该虚拟相机HAL信息包括从设备200的拍摄能力。从设备200的拍摄能力的详细描述可参考表1。
设备协同管理可用于控制主设备100和从设备200各自画面的时延。在主设备100和从设备200进行跨设备协同拍摄时,由于从设备200将拍摄的图像发送给主设备100时需要一定的时间,因此主设备100显示的双方的预览画面可能不是同一时间的,具有时延。设备协同管理模块可以通过增加缓存(buffer),例如将主设备100采集的画面复制部分帧,以使得双方的预览画面是在相差不大的时间点采集的,这样双方的预览画面产生的时延可以控制在用户视觉感受范围内,不会影响用户体验。
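为便于理解上述“通过加缓存对齐双方预览画面”的思路，下面给出一个极简的Java示意草图（仅为在若干假设下的示意，并非本申请的实际实现，类名、缓存深度等均为假设）：主设备缓存近期的若干帧，收到从设备帧后，挑选采集时刻最接近的一帧主设备画面与之同屏显示。
import java.util.ArrayDeque;
import java.util.Deque;

// 示意性草图：缓存主设备近期帧，按时间戳与从设备帧对齐
public class PreviewAligner {
    // 帧的简化表示：仅携带采集时间戳（毫秒）与内容标识
    public record Frame(long timestampMs, String content) {}

    private final Deque<Frame> masterBuffer = new ArrayDeque<>();
    private static final int MAX_BUFFERED = 8; // 假设的缓存深度

    // 主设备每采集一帧就放入缓存，超出深度时丢弃最旧的一帧
    public void onMasterFrame(Frame f) {
        if (masterBuffer.size() >= MAX_BUFFERED) {
            masterBuffer.pollFirst();
        }
        masterBuffer.addLast(f);
    }

    // 收到从设备帧时，挑选时间戳最接近的一帧主设备画面用于同屏显示
    public Frame pickMasterFrameFor(Frame slaveFrame) {
        Frame best = null;
        long bestDiff = Long.MAX_VALUE;
        for (Frame f : masterBuffer) {
            long diff = Math.abs(f.timestampMs() - slaveFrame.timestampMs());
            if (diff < bestDiff) {
                bestDiff = diff;
                best = f;
            }
        }
        return best;
    }
}
这样，双方预览画面的时间差可被压缩到缓存深度所覆盖的范围内，与正文所述“将主设备采集的画面复制部分帧”的做法属于同一类时延补偿手段。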
动态流水线可用于响应相机设备(CameraDevice)下发的流创建命令,生成待处理的命令队列。具体的,动态流水线可根据流创建命令的类型、所作用的设备生成一个或多个待处理的命令队列。流创建命令的类型例如可包括预览命令、录像命令、拍照命令等。流创建命令所作用的设备可包括主设备100、从设备200,甚至还可以进一步细化到设备的具体摄像头。后续方法实施例中将详细描述动态流水线的具体工作流程,这里暂不赘述。
动态流水线可在生成的待处理命令队列的各个命令中，增加该命令所作用的设备的标识（ID）或标签（tag）。动态流水线还可以在生成的待处理命令队列的各个命令中，添加单帧或连续帧的请求标签，用于指示该命令的类型。其中，拍照命令为单帧命令，预览命令或录像命令为连续帧命令。
动态流水线还可以用于将待处理命令队列中的作用于从设备200的命令分发至流处理模块，将作用于主设备100的命令分发至HAL层的本地相机HAL模块，可参考后续图13的介绍，这里不赘述。
当用户通过应用层的应用程序来控制拍摄效果时,动态流水线还可以在待处理命令队列的命令中,刷新或者添加用于控制拍摄效果的各项参数,例如变焦值、美颜算法(例如磨皮级别、美白级别等等)、滤镜、色温、曝光度等等。
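为便于理解动态流水线为命令补充标签与拍摄参数的过程，下面给出一个极简的Java示意草图（CaptureCommand等类名、字段名均为便于说明而假设，并非本申请或Android框架的实际实现）：
import java.util.HashMap;
import java.util.Map;

// 示意性草图：带有流向标签、按需标签、重复帧标签与拍摄参数的控制命令
public class CaptureCommand {
    public enum Type { PREVIEW, RECORD, PHOTO }

    public final Type type;
    public final int deviceId;      // 流向标签：如1为主设备摄像头，1002为从设备摄像头
    public final boolean repeating; // 预览/录像为连续帧命令，拍照为单帧命令
    public boolean isNew = false;   // 应用按需下发的控制标记
    public final Map<String, String> shootingParams = new HashMap<>();

    public CaptureCommand(Type type, int deviceId) {
        this.type = type;
        this.deviceId = deviceId;
        this.repeating = (type != Type.PHOTO);
    }

    // 用户调整拍摄效果时，刷新或添加对应参数，并标记为按需下发
    public void applyUserSetting(String key, String value) {
        shootingParams.put(key, value);
        this.isNew = true;
    }

    public static void main(String[] args) {
        CaptureCommand cmd = new CaptureCommand(Type.PREVIEW, 1002);
        cmd.applyUserSetting("whiteBalance", "incandescent"); // 白平衡=白炽灯模式
        System.out.println("device=" + cmd.deviceId + " repeating=" + cmd.repeating
                + " isNew=" + cmd.isNew + " params=" + cmd.shootingParams);
    }
}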
流处理可包括:生命周期管理模块、前处理、后处理、帧同步等模块。
生命周期管理模块可用于监控流的整个生命周期。当相机设备(CameraDevice)向流处理发送创建流的命令时,生命周期管理可记录该流的信息,例如请求创建该流的时间戳、从设备200是否响应创建流等。当主设备100关闭摄像头或者结束运行应用程序时,对应的流的生命周期停止,生命周期管理模块可记录该流的结束时间。
前处理模块用于对动态流水线下发的各个命令进行处理,包括多流配置、复用控制等模块。
多流配置可用于根据相机设备（CameraDevice）下发的流创建命令的流的类型，来配置所需的流的类型及数量。不同类型的控制命令对应的所需的流的类型及数量不同。流的类型可包括但不限于预览流、拍照流、录像流、分析流等等。例如：相机设备（CameraDevice）可下发一条“创建拍照流”的控制命令，多流配置可为上述控制命令配置四条流：一条预览流、一条分析流、一条拍照流和一条录像流。
多流配置为创建流的命令配置多条流的方法可参考表2:
表2
可以理解的是,在一些实施例中,多流配置还可以有其他配置方法,本申请实施例对此不作限制。
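“按流创建命令的类型配置多条流”的思路可以用如下Java示意草图表达（拍照命令与预览命令的配置参照正文示例，录像命令的配置为假设，具体数量与规格在不同实现中可能不同，并非本申请的实际实现）：
import java.util.List;

// 示意性草图：多流配置——由命令类型得到所需的流的类型与数量
public class MultiStreamConfig {
    public record StreamSpec(String type, String quality) {}

    public static List<StreamSpec> configure(String commandType) {
        switch (commandType) {
            case "preview": // 预览命令：一条预览流、一条分析流（参照正文示例）
                return List.of(new StreamSpec("preview", "1080P"),
                               new StreamSpec("analysis", "720P"));
            case "photo":   // 拍照命令：预览、分析、拍照、录像各一条（参照正文示例）
                return List.of(new StreamSpec("preview", "1080P"),
                               new StreamSpec("analysis", "720P"),
                               new StreamSpec("photo", "4K"),
                               new StreamSpec("record", "4K"));
            case "record":  // 录像命令：预览、分析、录像各一条（假设）
                return List.of(new StreamSpec("preview", "1080P"),
                               new StreamSpec("analysis", "720P"),
                               new StreamSpec("record", "4K"));
            default:
                return List.of();
        }
    }
}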
复用控制可用于对相机设备(CameraDevice)请求的多条流进行复用,即将多流配置模块配置的多条流进行精简。例如一条请求画面质量为1080P的预览流和一条请求画面质量为720P的分析流可复用为一条1080P的预览流。
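复用控制的核心思路可以用如下Java示意草图表达（这里仅示意“分析流可由分辨率不低于其的预览流覆盖”这一条规则，类名与规则的写法均为假设，并非本申请的实际实现）：
import java.util.ArrayList;
import java.util.List;

// 示意性草图：复用控制——用高质量流覆盖可被其承载的低质量流
public class ReuseControl {
    public record StreamSpec(String type, int height) {} // height：720、1080等

    public static List<StreamSpec> reduce(List<StreamSpec> requested) {
        List<StreamSpec> result = new ArrayList<>();
        for (StreamSpec s : requested) {
            boolean covered = s.type().equals("analysis")
                    && requested.stream().anyMatch(o ->
                            o.type().equals("preview") && o.height() >= s.height());
            if (!covered) {
                result.add(s); // 无法被覆盖的流保留，需要从设备实际传回
            }
        }
        return result; // 例如 {preview 1080, analysis 720} -> {preview 1080}
    }
}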
后续方法实施例将会对多流配置模块和复用控制模块的具体实现做详细的介绍,这里暂不赘述。
后处理模块用于对从设备200传回的图像流进行处理。后处理模块可包括智慧分流和多流输出模块。智慧分流可用于将从设备200传回的流扩充成与相机设备(CameraDevice)请求的流的类型和数量一致的流。例如,相机设备(CameraDevice)请求了一条1080P的预览流、一条720P的分析流。经过复用控制,上述请求两条流的控制命令被复用成请求一条1080P的预览流的控制命令。执行上述请求一条1080P的预览流的控制命令,从设备200可向主设备100传回一条1080P的预览流。当流处理收到上述1080P的预览流后,流处理可通过智慧分流模块将上述1080P的预览流恢复成一条1080P的预览流和一条720P的分析流。
多流输出可用于输出智慧分流后实际所需要的流，并将输出的流发送给相机设备（CameraDevice）。
后处理模块还可包括对图像的镜像、旋转等处理,这里不做限制。
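作为对上述智慧分流思路的一个极简示意（仅为在若干假设下的草图，copyAndScale等方法名为假设，并非本申请的实际实现），可以把“按此前记录的配置，把一条复用流恢复成多条流”写成如下Java代码：
import java.util.ArrayList;
import java.util.List;

// 示意性草图：智慧分流——将传回的复用流恢复成应用请求的多条流
public class SmartSplitter {
    public record Stream(String type, int height, String data) {}
    public record Want(String type, int height) {}

    public static List<Stream> split(Stream received, List<Want> wanted) {
        List<Stream> out = new ArrayList<>();
        for (Want w : wanted) {
            out.add(copyAndScale(received, w)); // 每条所需的流都由收到的流复制得到
        }
        return out;
    }

    // 以字符串模拟“复制并降到目标规格”的处理
    private static Stream copyAndScale(Stream src, Want w) {
        String data = (w.height() == src.height())
                ? src.data()
                : src.data() + "@scaledTo" + w.height() + "p";
        return new Stream(w.type(), w.height(), data);
    }

    public static void main(String[] args) {
        Stream recv = new Stream("preview", 1080, "frames");
        List<Stream> restored = split(recv,
                List.of(new Want("preview", 1080), new Want("analysis", 720)));
        System.out.println(restored); // 恢复出1080P预览流与720P分析流
    }
}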
帧同步模块可用于对拍照时的图像帧进行帧同步处理。具体的，跨设备传输会使得主设备100发送的指令在到达从设备200时产生时延。即从设备200会晚于主设备100收到该同一指令。因此，从设备200在执行上述指令时得到的执行结果也会与主设备100预期得到的结果产生差异。例如，主设备100在第一时刻下发的控制命令，从设备200可能在第二时刻才能响应该控制命令返回图像流，而主设备100希望收到从设备200在第一时刻的图像流。因此，帧同步可以将从设备200传回的第二时刻的结果向前推移，得到一个更接近第一时刻的结果（用户预期结果），从而降低网络时延的影响。
硬件抽象层(HAL)可包括本地相机HAL和虚拟相机HAL。
本地相机HAL可包括相机会话(camera session)和camera provider模块。相机会话(camera session)可用于主设备100下发控制命令给硬件。camera provider模块可用于管理主设备100摄像头的拍摄能力。主设备100的拍摄能力可参考上述表1。在这里camera provider可只管理主设备100的本地摄像头的拍摄能力。
虚拟相机HAL也包括相机会话（camera session）和camera provider模块。其中，相机会话（camera session）还可用于将与主设备100建立通信连接的从设备200注册到本地、将该从设备200的连接状态反馈给DVkit，以及将主设备100下发的控制命令发送至从设备200。camera provider模块负责对从设备200的拍摄能力的管理。同样的，上述管理从设备200的拍摄能力可参考表1，这里不再赘述。
此外，虚拟相机HAL还提供对注册的从设备200的摄像头进行编号的功能。当主设备的虚拟相机HAL获取到从设备的拍摄能力时，虚拟相机HAL可获取从设备具有的摄像头数量，并为每一个摄像头建立一个ID。该ID可用于主设备100区分从设备200的多个摄像头。虚拟相机HAL在对上述摄像头编号时，可采取与主设备100自身摄像头不同的编号方法。例如当主设备100自身的摄像头从1开始编号时，从设备200的摄像头可从1000开始编号。
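从设备摄像头编号的思路可以用如下Java示意草图表达（号段起点1000与类名均为示例性假设，并非本申请的实际实现）：
import java.util.LinkedHashMap;
import java.util.Map;

// 示意性草图：虚拟相机HAL为注册的从设备摄像头分配与本地摄像头不冲突的编号
public class VirtualCameraRegistry {
    private static final int SLAVE_ID_BASE = 1000; // 假设的从设备号段起点
    private int nextSlaveId = SLAVE_ID_BASE + 1;
    private final Map<Integer, String> idToCamera = new LinkedHashMap<>();

    public int register(String slaveDevice, String cameraName) {
        int id = nextSlaveId++;
        idToCamera.put(id, slaveDevice + "/" + cameraName);
        return id; // 该ID用于主设备区分从设备的多个摄像头
    }

    public static void main(String[] args) {
        VirtualCameraRegistry reg = new VirtualCameraRegistry();
        System.out.println(reg.register("slave200", "front")); // 1001
        System.out.println(reg.register("slave200", "back"));  // 1002
    }
}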
可以理解的是，图3示意的主设备100的结构并不构成对主设备100的具体限定。在本申请另一些实施例中，主设备100可以包括比图示更多或更少模块，或者组合某些模块，或者拆分某些模块，或者不同的模块布置。
图4示例性示出了从设备200的系统框架图。
如图4所示,从设备200可包括应用层、框架层、服务层和硬件抽象层(HAL)。
应用层可以包括一系列应用程序包,例如可包括相机代理服务。相机代理服务可包括管道控制、多流适配、多操作系统(Operating System,OS)适配等模块。
其中,管道控制可用于与主设备100建立通信连接(包括建立控制会话和数据会话)、传输控制命令以及图像流。
多流适配可用于相机代理服务根据主设备100发送的流的配置信息进行流数据的返回。参考前述实施例的介绍，主设备100响应于创建流的请求，可为相机设备会话配置需要的流。进行上述配置的过程可产生相应的流配置信息。相机代理服务可根据上述配置信息生成相应的流创建命令。因此，响应于上述创建流的控制命令，从设备200的底层服务可创建与上述命令相匹配的流。
多操作系统（Operating System，OS）适配可用于解决不同操作系统之间的兼容性问题，例如Android系统和鸿蒙系统等等。
框架层包括相机管理(CameraManager)、相机设备(CameraDevice);服务层包括相机服务(CameraService)、CameraDeviceClient、Camera3Device和CameraProviderManager模块。
其中相机管理(CameraManager)、相机设备(CameraDevice)、相机服务(CameraService)、CameraDeviceClient、Camera3Device、CameraProviderManager,和主设备100中对应的各个模块具有相同的作用,可参考前文图3的介绍。
从设备200的本地相机HAL可参考主设备100的本地相机HAL。本申请实施例在此不赘述。从设备200的相机HAL层也可包括相机会话(camera session)和camera provider模块。相机会话(camera session)可用于控制命令和硬件之间的通信。camera provider模块可用于管理从设备200的拍摄能力。从设备200的拍摄能力可参考上述表1,同样的,从设备200的camera provider可只管理从设备200的本地摄像头的拍摄能力。
可以理解的是,图4示意的从设备200的结构并不构成对从设备200的具体限定。在本申请另一些实施例中,从设备200可以包括比图示更多或更少模块,或者组合某些模块,或者拆分某些模块,或者不同的模块布置。
基于前文描述的系统10、主设备100和从设备200的软硬件结构,下面详细介绍本申请实施例提供的跨设备的协同拍摄方法。
本申请实施例提供的跨设备的协同拍摄方法可以应用于多种场景,包括但不限于:
(1)直播场景
在直播场景中,主设备100可以和从设备200连接,主设备100和从设备200可以处于不同的角度拍摄同一物体或不同的物体,从设备200可以将拍摄后的图像发送给主设备100,主设备100可以将双方的图像上传至直播服务器,并由直播服务器分发至更多的用户观看。在此过程中,主设备100可以控制从设备200的拍摄效果,例如可通过主设备100调整从设备200的摄像头焦距或使用的滤镜等参数。
在直播场景中使用本申请实施例提供的跨设备协同拍摄方法,发起直播的主播可以在主设备100上方便地控制从设备200的拍摄效果,并且观看直播的用户可以看到针对同一物体在不同角度展示的多个图像。
(2)相机应用场景
主设备100启动相机应用后,可以和从设备200连接,并各自处于不同的角度拍摄图像,从设备200可以将拍摄后的图像发送给主设备100。主设备100可以控制从设备200的拍摄效果,还可以同时显示双方的拍摄图像,并对该拍摄图像执行预览、拍照、录像等处理。上述相机应用可以是原生相机应用,也可以是第三方相机应用。
这样可以实现跨设备的双景录像、多景录像等功能,可以给用户提供更多更自由的视角,并且用户还可以在主设备100上方便地控制从设备200的拍摄效果,增加了拍摄的趣味性。
(3)视频通话场景
主设备100可以为手机,从设备200可以为大屏电视。手机可以和其他设备进行视频通话,在此过程中,手机可以连接大屏电视并控制大屏电视的摄像头进行拍摄,大屏电视可以将拍摄的图像发送给手机,然后手机将该图像发送给视频通话的另一端设备。这样,手机可以通过大屏电视来实现视频通话,无需用户手持手机在特定角度拍摄图像,可以给用户更加方便的视频通话体验。
(4)可穿戴设备控制智能电子设备拍照的场景
智能手表等可穿戴设备,可以和手机等智能电子设备连接,并控制手机的摄像头进行拍摄,手机可以将拍摄的图像发送给智能手表,使得用户在智能手表上直接观看到手机拍摄并处理的画面。在此过程中,用户还可以通过智能手表控制手机的拍摄效果。
这样,用户可以通过智能手表对手机的拍摄画面进行处理,在拍摄合照、拍摄景物的过程中无需其他人帮助,即可方便快捷地完成拍摄。
可理解的,以上场景仅为示例,本申请实施例提供的跨设备的协同拍摄方法还可以应用到其他场景中,这里不做限制。
下面以直播场景为例,结合该直播场景中的UI来描述跨设备的协同拍摄方法。
图5示例性示出了本申请实施例提供的直播场景。
如图5所示,该直播场景可包括主设备100、从设备200、对象A、对象B。
主设备100和从设备200建立通信连接,主设备100和从设备200可以处于不同的位置或角度。主设备100拍摄对象A,从设备200拍摄对象B。然后,从设备200可显示拍摄到的图像,并且将该图像处理后发送给主设备100。主设备100可以同时显示从设备200发送的图像,和自身拍摄对象A得到的图像。
在直播过程中,主设备100还可将显示的两个图像上传到直播服务器。进一步的,服务器可将上述两个图像分发到进入该直播间的其他用户的设备。
基于上述直播场景,下面介绍本申请实施例提供的主设备100和从设备200上的一些用户界面(user interface,UI)。图6A-图6D、图7A-图7B、图8A-图8B、图9A-图9D、图10A-图10C示例性示出了直播场景中,主设备100与从设备200上实现的一些用户界面。
图6A-图6C示出了一种主设备100和从设备200建立通信连接的方式。其中,图6A-图6C为主设备100上实现的用户界面,图6D为从设备200上实现的用户界面。
图6A示出了主设备100上的用于展示已安装应用程序的示例性用户界面60。该用户界面60显示有:状态栏、日历指示符、天气指示符、具有常用应用程序图标的托盘、导航栏、直播类应用的图标601、相机应用的图标602以及其他应用程序的图标等。其中,状态栏可包括:移动通信信号(又可称为蜂窝信号)的一个或多个信号强度指示符、运营商名称(例如“中国移动”)、Wi-Fi信号的一个或多个信号强度指示符,电池状态指示符、时间指示符等。导航栏可包括返回键、主屏幕键、多任务键等系统导航键。在一些实施例中,图6A示例性所示的用户界面60可以为主界面(Home screen)。
如图6A所示,主设备100可以检测到作用于直播类应用的图标601的用户操作,并响应于该用户操作,显示图6B所示的用户界面61。
用户界面61可以是直播类应用提供的主界面。用户界面61可包括:区域611、互动消息窗612、预览框613、添加控件614、设置控件615。
区域611可用于展示主播的一些信息,例如头像、直播时长、观看人数和直播账号等等。
互动消息窗612可用于显示主播或观众在直播过程中发出的消息,或“点赞”“喜欢”等互动操作产生的系统消息。
预览框613可用于显示主设备100的摄像头实时采集并处理的图像。主设备100可以实时刷新其中的显示内容,以便于用户预览主设备100的摄像头实时采集并处理的图像。该摄像头可以是主设备100的后置摄像头,或者是前置摄像头。
设置控件615可用于调整主设备100的拍摄效果。当检测到作用于设置控件615的用户操作(例如点击操作、触摸操作等),主设备100可显示:用于调节主设备100的拍摄参数,和/或,图像处理参数的选项。这些选项可参考后续用户界面的相关描述,这里不赘述。
添加控件614可用于查找从设备200。当检测到作用于添加控件614的用户操作时,主设备100可利用蓝牙、Wi-Fi、NFC等前述短距离通信技术来发现附近的其他电子设备,也可以利用前述远距离通信技术发现远程的其他电子设备,并查询发现的其他电子设备是否具备摄像头。
在接收到其他电子设备的响应后,主设备100可以在用户界面62上显示发现的具备摄像头的电子设备。例如,参考图6C,主设备100可显示窗口622,窗口622可包括两个电子设备的信息,包括各自的:电子设备的图标、名称、距离、位置等等。
图标623可用于展示该电子设备的类型。例如,上述主设备100显示的第一个从设备可以是平板电脑。用户可通过该图标623快速方便地初步辨认该从设备是否是自己想要连接的设备。
名称624可用于显示该从设备的名称。在一些实施例中,名称624可以是该从设备的型号。在另一些实施例中,该名称也可以是该从设备的用户自定义的名称。上述名称还可以是设备型号和用户自定义名称的组合。本申请对此不作限制。可以理解的是,用户界面62中列举的诸如“pad C1”、“Phone P40-LouS”等均为示例性名称。
如图6C所示,主设备100可以检测到作用于电子设备的图标623上的用户操作,响应该用户操作,主设备100向图标623对应的电子设备(从设备200)发送建立通信连接的请求。
参考图6D，图6D示出了从设备200接收到主设备100发送的建立通信连接的请求后，电子设备（从设备200）所显示的用户界面63。如图6D所示，用户界面63包括：主设备100的设备信息631、确认控件632和取消控件633。
设备信息631可用于展示主设备100的身份信息。即用户可以通过设备信息631确定发出上述请求的主设备100的信息。当用户通过设备信息631能够确定主设备100的信息并且信任该主设备时,则上述用户可以通过确认控件632同意主设备100使用该电子设备的摄像头。该电子设备可以检测到作用于确认控件632的操作,响应于该用户操作,从设备200可同意主设备100使用自身的摄像头,即主设备100可与该电子设备建立通信连接。
上述通信连接可以是前述的有线连接、无线连接。例如,主设备和从设备可以登录同一用户账号(例如华为账号),然后通过服务器(例如华为提供的多设备协同拍摄服务器)进行远距离连接。本申请实施例对此不作限制。应理解,该电子设备相当于主设备100的从设备。
用户界面63还包括取消控件633。当通过设备信息631不能确定主设备100时,或者不信任主设备100时,用户可以通过取消控件633拒绝主设备100发送的使用该电子设备摄像头的请求。该电子设备可以检测到作用于取消控件633的操作,响应于该用户操作,该电子设备可拒绝主设备100使用自身的摄像头,即该电子设备不同意与主设备100建立通信连接。
图7A-图7B示出了另一种主设备100和从设备200建立通信连接的方式。其中,图7A为从设备200上实现的用户界面71,图7B为主设备100上实现的用户界面72。
当主设备100检测到作用于用户界面62中图标623上的操作时,响应该用户操作,主设备100向图标623对应的电子设备(从设备200)发送建立通信连接的请求。
如图7A所示,用户界面71可包括验证码712和取消控件714。
验证码712可用于主设备100与从设备200进行连接确认。响应于主设备100发送的上述请求，从设备200可生成验证码712。在一些实施例中，验证码712也可由服务器300生成，然后通过无线网络发送给从设备200。然后，从设备200可在用户界面71上显示上述验证码。
取消控件714可用于拒绝主设备100发送的使用从设备200摄像头的请求。从设备200可检测到作用于取消控件714的用户操作，响应于该用户操作，从设备200可关闭对话框711。
图7B示例性示出了主设备100输入验证码的用户界面72。在从设备200显示用户界面71时,主设备100可显示用户界面72。用户界面72可显示对话框721。
对话框721可包括验证码7211、确认控件7212。验证码7211为用户在主设备100上输入的验证码。主设备100可检测到作用于确认控件7212的操作。响应于该用户操作，主设备100可将验证码7211发送给从设备200。上述用户操作例如是点击操作、长按操作等。
从设备200可检查收到的验证码7211是否与自身显示的验证码712一致。若两个验证码相同,则从设备200同意主设备100使用自身的摄像头。进一步的,从设备200可开启自身摄像头,并将该摄像头采集并通过ISP处理的图像传输给主设备100。反之,从设备200可拒绝主设备100使用从设备200摄像头的请求。
在一些实施例中,当主设备100输入验证码7211与从设备200显示的验证码712不同时,从设备200可维持显示验证码712,并等待主设备100输入新的验证码。当从设备200收到的新的验证码与验证码712一致时,从设备200也可同意主设备100使用自身的摄像头。
在另一些实施例中,当主设备100发送的验证码7211与从设备200显示的验证码712不同时,从设备200可重新生成另一个验证码M,主设备100可重新获取验证码M。当主设备100获取到的验证码N与验证码M一致时,从设备200也可同意主设备100使用自身的摄像头。
不限于图6D-图7B所示的2种建立通信连接的方式，还可以通过其他方式建立通信连接，例如，利用近场通信（near field communication，NFC）技术，主设备100和从设备200可通过碰一碰等用户操作，完成认证。本申请对于认证的方式不限于上述提及的2种认证方式。
在建立连接后,主设备100和从设备200可以分别显示提示信息。上述提示信息可提示用户主设备100和从设备200已经建立通信连接。如图8A所示,用户界面81示出了从设备200显示提示信息的用户界面。
当图6C所示的授权(或图7A-图7B所示的授权)完成后,从设备200可显示用户界面81。用户界面81可包括提示框811和预览框812。预览框812可用于显示从设备200的摄像头采集的图像。提示框811可用于显示提示信息。上述提示信息例如是“该从设备的摄像头正在被主设备100使用”。
当从设备200授予主设备100使用从设备200的摄像头后,从设备200可打开自身的摄像头。然后,从设备200可在预览框812显示自身摄像头采集并处理的画面。在预览框812的显示层之上,从设备200可显示提示框811。
在一些实施例中,从设备200也可通过浮窗显示自身摄像头采集并处理的图像。具体的,从设备200可在如图6A所示的用户界面60右上角显示一个浮窗。该浮窗可显示从设备200摄像头采集并处理的图像。
在从设备200显示用户界面81时,主设备100可显示如图8B所示的用户界面82。用户界面82可包括提示窗821、窗口822和窗口823。
提示窗821可用于显示提示信息。上述提示信息例如是“主设备100已经连接从设备200”。窗口822可用于显示从设备200摄像头采集并处理的图像。窗口823可显示主设备100摄像头采集并处理的图像。
当从设备200同意主设备100使用从设备200摄像头时,主设备100可从从设备200获得从设备200摄像头采集并处理的图像。然后,主设备100可在窗口822上显示上述图像。同时,主设备100还可显示提示窗821。用户可通过提示窗821显示的提示内容了解到主设备100已连接到从设备200。
此外，用户界面82还可新增一个设置控件824。该设置控件可用于显示从设备200的拍摄能力选项。具体介绍可参考后续实施例，这里暂不赘述。
在一些实施例中,主设备100还可交换窗口823和窗口822显示的内容。具体的,主设备100可检测到作用于窗口822的用户操作,响应于该用户操作,主设备100可在窗口823中显示从设备200摄像头采集并处理的图像。同时,主设备100可在窗口822显示主设备100摄像头采集并处理的图像。上述用户操作可以是点击、左滑等操作。
在一些实施例中，主设备还可将窗口823划分为两个独立的部分。一部分用于显示主设备100摄像头采集并处理的图像，另一部分用于显示从设备200摄像头采集并处理的图像。本申请对主设备100与从设备200的预览图像在主设备100上的显示排列不作限制。
图6A-图8B示例性示出了主设备100与从设备200建立通信连接,并显示从设备200的摄像头采集并处理的图像的一组用户界面。在此之后,主设备100可获取从设备200控制拍摄效果的能力,并且可以向从设备200发送控制拍摄效果的命令。
图9A-图9D示例性示出了主设备100控制从设备200的拍摄效果的一组用户界面。其中图9A-图9C为主设备100上的用户界面,图9D为从设备200上的用户界面。
当图8B所示的显示提示内容的提示窗821关闭后，主设备100可显示如图9A所示的用户界面91。用户界面91可包括窗口911、窗口912、删除控件913、设置控件915、设置控件916。
窗口911可显示主设备100摄像头采集并处理的图像。窗口912可显示从设备200摄像头采集并处理的图像。删除控件913可用于关闭窗口912。主设备100可检测到作用于删除控件913的用户操作，响应于该操作，主设备100可关闭窗口912。
设置控件915可用于显示主设备100控制拍摄效果的能力选项。设置控件916可用于显示从设备200控制拍摄效果的能力选项。设置控件916旁还可显示提示信息,例如“点击调整远端画面”。主设备100可检测到作用于设置控件916的用户操作,响应于该用户操作,主设备100可显示从设备200控制拍摄效果的能力选项,参考图9B。
在一些实施例中,用户界面91还可包括删除控件913、添加控件914。删除控件913可用于关闭用户界面91中的一个或多个窗口,例如窗口911、窗口912。添加控件914可用于查找连接其他从设备。主设备100检测到作用于添加控件914的用户操作后,主设备100可显示图6C中窗口622所示的查询结果。
当主设备100检测到作用于窗口622中显示的其他从设备的用户操作时，主设备100可向该从设备发送使用摄像头的请求。同样的，上述其他从设备可同意主设备100发送的请求，然后，该从设备可启用自身的摄像头采集并按照默认拍摄参数处理图像，进一步的，将处理后的图像发送给主设备。同时，主设备100可增加一个窗口显示上述图像。
实施上述方法,主设备100可显示多个从设备发送的图像,从而为用户提供更丰富的拍摄体验。
在一些实施例中，主设备100还可将设置控件915和设置控件916进行复用。具体的，用户界面91可显示一个总的设置控件。当窗口911中显示主设备100采集并处理的图像时，该设置控件可显示主设备100的控制拍摄效果的能力选项。当窗口911中显示从设备200采集并处理的图像时，该设置控件可显示从设备200的控制拍摄效果的能力选项。
图9B示例性示出了主设备100显示从设备200的控制拍摄效果的能力选项的用户界面92。用户界面92可包括从设备200的拍摄效果窗口921。窗口921可显示从设备200具有的控制拍摄效果的各种能力选项,例如光圈、闪光灯、智能跟随、白平衡922、ISO感光度、变焦范围、美颜、滤镜等。
本申请以白平衡922调节为例，具体介绍主设备100向从设备200发送控制命令，和从设备200执行上述控制命令的用户界面。白平衡可用于校准相机的色温偏差。白平衡922可包括日光模式、白炽灯模式923、荧光模式、阴天模式、阴影模式。主设备100可检测到作用于上述任意模式的用户操作。当主设备100检测到作用于白炽灯模式923的用户操作时，响应于该用户操作，主设备100可向从设备200下发将白平衡模式更换为白炽灯模式的控制命令。收到上述命令的从设备200可将白平衡922更换为白炽灯模式923。然后，主设备100可收到并显示从设备200发送的更换为白炽灯模式923后的图像，参考图9C。同时，从设备200的取景框中显示的图像也可调整为更换白平衡模式之后的图像，参考图9D。
在一些实施例中,主设备100也可设置一个专用的页面显示从设备200的控制拍摄效果的能力选项。即主设备100可使用一个独立的页面显示窗口921中的能力选项。本申请实施例对此不做限制。
如图9C所示,用户界面93可包括窗口931。
窗口931可用于显示从设备200摄像头采集并处理的图像。当主设备100向从设备200下发将白平衡模式更换为白炽灯模式的控制命令后，主设备100可收到从设备200将白平衡模式更换为白炽灯模式后的图像。窗口931可显示上述图像。
在用户界面93显示调整白平衡后的图像时,从设备200也可显示调整白平衡后的用户界面94。参考图9D。用户界面94为从设备200上显示的用户界面。用户界面94可包括预览窗941。
预览窗941可用于显示从设备200摄像头采集并处理的图像。当接收到主设备100向从设备200下发将白平衡模式更换为白炽灯模式的控制命令后,从设备200可将白平衡模式更换为白炽灯模式。然后,预览窗941可显示白炽灯模式下从设备200摄像头采集并处理的图像。同时,从设备200可将上述图像发送给主设备100。主设备100可显示上述图像,如图9C所示。
在一些实施例中,从设备200的控制拍摄效果的能力可能具备上述列表中部分或全部能力,或者还具备窗口921未提及的其他控制拍摄效果的能力。本申请对此不作限制。
主设备100还可获取主设备100控制拍摄效果的能力，并向自身下发控制指令。图10A-图10C示出了主设备100向自身下发控制拍摄效果的控制命令的一组用户界面。图10A示例性示出了主设备100获取自身控制拍摄效果的能力的用户界面101。用户界面101可包括设置控件1011。
设置控件1011可用于显示主设备100控制拍摄效果的能力选项。设置控件1011旁还可显示提示信息,例如“点击调整本端画面”。主设备100可检测到作用于设置控件1011的用户操作,响应于该用户操作,主设备100可显示主设备100的拍摄效果能力列表,如图10B所示用户界面102。
用户界面102可包括窗口1021。该窗口可显示主设备100控制拍摄效果的能力选项，例如光圈1022、闪光灯、智能跟随、美颜、滤镜等。本实施例以光圈为例，说明主设备100向自身发送调整拍摄效果的控制命令。
光圈1022大小可通过刻度盘1024进行调节。主设备100可检测到作用于刻度盘1024的用户操作，响应于该用户操作，主设备100可向自身下发调节光圈的控制命令。
具体的，刻度盘1024初始刻度可以为“f/8”。用户可通过右滑操作将刻度盘1024上的浮标滑动到“f/17”的光圈刻度。主设备100可检测到这一用户操作，响应于该用户操作，主设备100可将摄像头的光圈从“f/8”调整为“f/17”。光圈更换为“f/17”可获得更浅的景深，相应地，显示主设备100采集的图像的窗口可显示主设备100采集的景深更浅的图像，如图10C中窗口1031所示。
图10C示例性示出了主设备100预览窗景深变浅的用户界面103。用户界面103可包括预览窗1031。预览窗1031可用于显示主设备100摄像头采集并处理的图像。
当接收到上述将光圈从“f/8”调整为“f/17”的控制命令后，主设备100可将光圈从“f/8”调整为“f/17”。然后，预览窗1031可显示光圈大小为“f/17”时主设备100采集并处理的图像。
图6A-图6D、图7A-图7B、图8A-图8B、图9A-图9D、图10A-图10C介绍了在直播场景中，主设备100与从设备200建立通信连接并控制从设备200拍摄效果的一系列用户界面。上述控制从设备200拍摄效果的方法还可用于拍照场景中。下面介绍在拍照应用场景中，主设备100与从设备200建立通信连接并控制从设备200拍摄效果的一系列用户界面。
图11A示例性示出了主设备100显示添加从设备的用户界面111。用户界面111可包括添加控件1112、对话框1113。
主设备100可检测到作用于添加控件1112的用户操作，响应于该用户操作，主设备100可查询具有摄像头的电子设备。当收到上述电子设备传回的表明其具有摄像头的消息时，主设备100可显示对话框1113。对话框1113可用于展示具有摄像头的电子设备的信息。例如，对话框1113示例性示出了两个具有摄像头的电子设备（电子设备1114、电子设备1115）的信息。同样的，上述信息例如是该从设备的名称、位置等信息。
以电子设备1115为例，主设备100可检测到作用于电子设备1115的用户操作，响应于该用户操作，主设备100可向电子设备1115发送使用其摄像头的请求。电子设备1115可检测到用户同意主设备100使用自身摄像头的操作，响应于该操作，电子设备1115可同意主设备100使用电子设备1115的摄像头。上述电子设备1115授予主设备100使用权限的过程的用户界面可参考直播场景中的用户界面，如图6C-图6D，或图7A-图7B所示。本申请对此不再赘述。
当从设备200的用户同意主设备100使用从设备200的摄像头后,从设备200可显示用户界面112,参考图11B。同时主设备100可显示用户界面113,如图11C所示。
用户界面112为从设备200上的用户界面。用户界面112示例性示出了从设备200显示提示信息的用户界面。用户界面112可包括提示窗1121和预览窗1122。
当授予主设备100使用从设备200的摄像头后,从设备200可开启自身的摄像头,进一步的,预览窗1122可显示从设备200摄像头采集并处理的图像。用户界面112还可显示提示窗1121。提示窗1121提示用户该设备的相机已经被其他设备使用。例如“当前画面正在被LISA使用”。
如图11C所示,用户界面113为主设备100上的用户界面。用户界面113示例性示出了主设备100显示提示信息的用户界面。
当从设备200显示用户界面112时,主设备100可显示用户界面113。用户界面113可包括窗口1131、窗口1132、提示窗1134。
窗口1131可用于显示主设备100摄像头采集并处理的图像。窗口1132可显示从设备200的摄像头采集并处理的图像。当接收到从设备200发送的图像时，窗口1132可显示上述图像。提示窗1134可用于显示提示信息，上述提示信息例如是“已连接相机：Phone P40-LouS”。上述提示信息可用于提示用户主设备100已经连接从设备200。
在一些实施例中，窗口1131和窗口1132还可交换显示内容。即窗口1131可显示从设备200摄像头采集并处理的图像。窗口1132可用于显示主设备100摄像头采集并处理的图像。特别的，当窗口1131显示从设备200摄像头采集并处理的图像时，主设备100可在窗口1131中设置删除键1133。响应于作用在删除键1133上的用户操作，主设备100可关闭窗口1131。
在一些实施例中，窗口1132可以浮窗的形式显示在用户界面113中的窗口1131之上。在另一些实施例中，窗口1132还可与窗口1131平铺显示，参考图11D。
图11D示例性示出了另一种主设备100显示提示信息的用户界面114。用户界面114可包括窗口1141、窗口1142、提示窗1143、设置控件1147、设置控件1148。
窗口1141可用于显示主设备100摄像头采集并处理的图像。窗口1142可用于显示从设备200摄像头采集并处理的图像。同样的,窗口1141和窗口1142也可交换显示内容。即窗口1141可显示从设备200摄像头采集并处理的图像。窗口1142可用于显示主设备100摄像头采集并处理的图像。本申请对此不做限制。提示窗1143用于显示提示信息。
设置控件1147可用于显示主设备100的摄像头的拍摄能力选项。设置控件1148可用于显示从设备200的摄像头的拍摄能力选项。
主设备100可检测到作用于设置控件1148的用户操作,响应于该用户操作,主设备100可显示从设备200的拍摄能力选项。上述拍摄能力选项可参考图9B所示的用户界面92中对话框921显示的拍摄能力选项。本申请对此不再赘述。
进一步的,主设备100可检测到作用于某一拍摄能力选项的用户操作,响应于该用户操作,主设备100可向从设备200发送调整拍摄效果的控制命令。
同样以图9B所示的用户界面92为例,响应于作用在上述对话框921中白炽灯923的用户操作,主设备100可向从设备200发送将白平衡模式调整为白炽灯模式的控制命令。响应于上述控制命令,从设备200可对摄像头采集的图像进行白炽灯模式的色温校准。然后,从设备200可将采用白炽灯模式进行色温校准的图像发送至主设备100。
响应于从设备200发送的调整拍摄参数后的图像,主设备100可显示上述图像。
在一些实施例中,用户界面114还可包括删除控件1144、添加控件1145和拍摄控件1146。删除控件1144可用于关闭窗口1142。添加控件1145可用于主设备100发现其他带有摄像头的电子设备。
主设备100可检测到作用于拍摄控件1146的用户操作。响应于该用户操作，主设备100可将窗口1141、窗口1142显示的内容存储为图片或视频。在直播或者视频通话的场景中，直播应用/视频通话应用还可获取上述图片或视频，并将其发送至提供直播/视频通话服务的服务器。
在一些实施例中,从设备的数量还可以不限于一个。例如,主设备100可检测到添加控件的用户操作,上述添加控件例如是用户界面91的添加控件914、用户界面114的添加控件1145等。响应于该用户操作,主设备100可与其他带有摄像头的电子设备建立连接。进一步的,主设备100可与多个从设备建立连接,并且使用上述多个从设备发送的图像。
在一些实施例中,主设备100还可只显示从设备发送的图像。例如,当主设备100在与从设备200建立通信连接后,主设备100可关闭自身的摄像头,仅显示从设备200发送的图像。
下面介绍本申请实施例提供的跨设备的协同拍摄方法的详细流程。图12示出了该跨设备的协同拍摄方法的详细流程。如图12所示,该方法可包括如下步骤:
S101:主设备100与从设备200建立通信连接。
具体实现中,主设备100和从设备200之间建立的通信连接可以是前述的有线连接、无线连接。
在一些实施例中,主设备100可以先响应于接收到的用户操作(例如图6B所示的作用于添加控件614的用户操作),发现其他具备摄像头的电子设备,然后再向用户选择的电子设备发送连接请求。在该电子设备响应于同意该请求的用户操作后(例如图6D所示的作用于确认控件632的用户操作),主设备100和该电子设备成功建立通信连接。
在另一些实施例中，主设备100可以扫描上述电子设备的二维码，和上述电子设备建立连接。具体的，上述电子设备可显示用于使用该电子设备摄像头的二维码。主设备100可扫描获取上述二维码，响应于上述操作，主设备100可向上述电子设备发送使用请求并且获得该电子设备的同意。
不限于上述方法,主设备100还可以通过其他方式来建立通信连接。例如基于NFC技术的碰一碰操作。本申请实施例对此不作限制。
具体的，主设备100可通过设备虚拟化套件（DVKit）查找其他具有摄像头的电子设备，并向发现的电子设备发送建立通信连接的请求，在DVKit接收到其他电子设备反馈的同意建立通信连接的消息后，可和该电子设备建立通信连接。进一步地，DVKit通过分布式设备虚拟化平台（DMSDP）来和上述电子设备建立通信连接，DMSDP具体用于与上述电子设备建立会话。上述会话包括控制会话和数据会话。
此时,与主设备100建立会话的上述电子设备可称之为主设备100的从设备200。
S102:主设备100通过通信连接,获取从设备200的拍摄能力信息。
基于主设备100与从设备200建立的会话通道,DMSDP可将从设备注册到虚拟相机HAL中。同时,DMSDP可向从设备200请求从设备200的拍摄能力。
具体的，从设备200可通过自身的相机服务（CameraService）模块获取自身的拍摄能力。上述拍摄能力包括摄像头的硬件能力和ISP、GPU等图像处理模块的软件能力，以及一些组合硬件能力和软件能力的拍摄能力，具体可参考图3中表1的介绍。
然后，从设备200可将上述拍摄能力发送给主设备100。当接收到从设备200发送的上述拍摄能力后，DMSDP可将上述拍摄能力信息发送到HAL层的虚拟相机HAL模块。进一步的，虚拟相机HAL还可将上述拍摄能力发送到相机管理（CameraManager）。
表1示出了从设备200可包括的硬件能力，以及结合硬件能力和软件能力的拍摄功能。例如从设备200具备的摄像头的数量、摄像头ID、像素、光圈大小、变焦范围、滤镜、美颜以及各种拍摄模式。特别的，夜景模式、人像模式等拍摄模式不仅涉及从设备200的摄像头的硬件能力，还涉及从设备200的图像处理能力。
设备协同管理和动态流水线可以获知当前从设备200注册到了虚拟相机HAL中。
当从设备200注册到了虚拟相机HAL中后,设备协同管理需要对从设备200进行协同管理。具体的,在主设备100同时显示来自主设备100和从设备200的图像时,设备协同管理可根据摄像头的ID区分上述图像来自哪一个设备的哪一个摄像头,例如主设备的摄像头1、从设备的摄像头1001。然后,设备协同管理可通过对主设备100的图像进行重复或加缓存的方式,使显示来自主设备100和从设备200的图像的时延在人眼可接受的范围内。
同样的,动态流水线也可根据摄像头的ID等信息,对流向主设备100或从设备200的控制命令进行区分,进而将发送至从设备200的控制命令发送到虚拟相机HAL。
完成注册之后,虚拟HAL可向DVKit发送更改连接状态的通知。具体的,虚拟HAL可通知DVKit将未连接状态更改为已连接状态。上述未连接状态是指主设备100未与其他电子设备建立使用该电子设备摄像头的连接。相应地,上述已连接状态是指主设备100已经与其他电子设备建立使用该电子设备摄像头的连接。
S103:从设备200采集并处理图像,然后将采集并处理后的图像发送给主设备100。
在从设备200和主设备100成功建立通信连接后,从设备200可自动开始采集并处理图像。
参考图6D和图8A所示的用户界面。当从设备200同意与主设备100连接后,从设备200可打开自身的相机应用。相机应用的用户界面如图8A所示。同时,从设备200可显示自身摄像头采集并处理的图像,参考图8A预览框812。
然后,从设备200可将上述图像发送至主设备100。主设备100可显示上述图像。如图8B所示,主设备100可在用户界面82中增加一个窗口822。该预览窗可显示从设备200采集并处理的图像。从而,主设备100可实现跨设备的协同拍摄,并在显示屏上同时显示多个摄像头采集并处的图像。
在一些实施例中,主设备100还可只控制并使用从设备200的摄像头,即主设备100只显示从设备200的图像,而不显示主设备100的采集并处理的图像。
在另一些实施例中,在主设备100和从设备200建立通信连接后,主设备100可响应用户打开从设备200的摄像头的操作,向从设备200发送打开摄像头的控制命令。从设备200可响应上述命令并打开摄像头。
例如在图6D所示的用户界面63后,主设备100还可显示一个提示窗。该提示窗可询问用户是否打开从设备200的摄像头。当主设备100检测作用于打开从设备200的用户操作时,主设备100可向从设备200发送打开摄像头的控制命令。响应于该控制命令,主设备100可打开摄像头,并获得摄像头采集并处理后的图像。
进一步的,从设备200可将上述图像发送至主设备100。主设备100可显示上述图像,如图8B所示的窗口822。同时,从设备200也可在自身的显示屏上显示上述图像。如图8A所示的用户界面81。
在一些实施例中,从设备200也可不显示自身摄像头采集并处理的图像。
在上述两种方案中，从设备200的摄像头使用的拍摄参数可以是默认的，例如从设备200可以默认使用后置普通摄像头采集、摄像头使用一倍焦距、色温校准使用默认的日光模式、光圈大小为f/1.6、开启光学防抖、关闭闪光灯、快门时间1/60、ISO感光度400、像素8192×6144、裁剪框尺寸3:4、关闭美颜/美体算法、无滤镜、无贴纸等等。
然后,主设备100可以向从设备200发送一系列的协同拍摄的控制命令,例如拍照,录像或者例如更换滤镜等调整拍摄效果的控制命令。从设备200可根据上述命令对图像的采集和处理进行调整。
S104:主设备100响应于接收到的用户操作,向从设备200发送用于控制拍摄效果的命令。
控制命令包括以下信息:响应于用户特定操作调整的拍摄参数、流创建命令的类型(即预览命令、录像命令、拍照命令)。多流配置会根据不同的流创建命令配置不同数量和类型的流。
用户调整的拍摄参数取决于主设备100接收到的用户操作,该拍摄参数可包括但不限于:采集图像时涉及的摄像头的硬件参数,和/或,处理图像时涉及的软件参数。拍摄参数还包括一些硬件参数和软件参数的组合参数。例如混合变焦范围、夜间模式、人像模式、延时拍摄、慢动作、全景模式、HDR等等。
其中,硬件参数包括以下一项或多项:摄像头的ID、光学变焦范围、是否开启光学图像防抖、光圈大小调节范围、是否开启闪光灯、是否开启补光灯、快门时间、ISO感光度值、像素以及视频帧率等等。
软件参数包括以下一项或多项:数码变焦值、图像裁剪大小、图像的色温校准模式、是否对图像进行降噪方式、美颜/美体类型、滤镜类型、贴纸选项、是否开启自拍镜像等等。
在一些实施例中,上述控制命令还可包括其他拍摄参数的默认值。上述其他拍摄参数是指除去用户操作调整的参数之外的其他拍摄参数。
携带拍摄参数的控制命令可通过主设备100和从设备200建立的通信连接发送到从设备200。具体的,虚拟相机HAL可将创建的控制命令发送到设备虚拟化平台(DMSDP),DMSDP可包括有主设备100和从设备200的数据会话通道和控制会话通道。上述控制命令可通过控制会话通道,发送至从设备200。
例如,“将从设备200的白平衡模式设置为白炽灯模式”控制命令可包括:修改的拍摄参数(白平衡=白炽灯模式),流创建命令可以是预览命令。上述控制命令还可包括默认的拍摄参数,例如一倍焦距、无滤镜、关闭美颜等。
如用户界面92中对话框921所示，根据S102中获取到的从设备200的拍摄能力信息，主设备100可在显示屏上显示对应上述拍摄能力的拍摄能力选项。主设备100可检测到作用于某一拍摄能力选项的用户操作。响应于该用户操作，主设备100可向从设备200发送控制拍摄效果的命令。
以“将从设备200的白平衡模式设置为白炽灯模式”的控制命令为例,主设备100可检测到作用于白炽灯模式的用户操作,例如作用于用户界面92中白炽灯923的用户操作,响应于该用户操作,主设备100可向从设备200发送上述控制命令,该控制命令用于控制从设备200将白平衡模式设置为白炽灯模式。
在创建并发送上述控制命令的过程中,主设备100还会对上述命令进行多种处理,例如动态流水线、多流配置以及复用控制。
动态流水线处理过程可参考图13。
当应用程序产生一条“将从设备200的白平衡模式设置为白炽灯模式”的控制命令时，相机应用131可生成一条带有预览控制1311和调整拍摄效果1312的控制命令。该控制命令对应一条重复帧控制1314。重复帧控制可表示该控制命令是作用在多个帧上的。重复帧控制可包括字段Cmd和Surfaces。Cmd字段可用于表示控制命令。在一些实施例中，Cmd中还可包括该控制命令的编号。Surfaces可用于接收渲染画面的视图，并将产生的效果送给surfaceflinger进行图像合成后显示到屏幕。
动态流水线可为上述控制命令添加标签。上述标签可包括按需下发标签、流向标签、重复帧标签。相机服务132可包括待处理命令队列1321。当上述重复帧控制命令到达待处理命令队列1321时，该重复帧控制命令可替换原待处理命令队列1321中的基础命令1322（例如“cmd+streamIds+buffer”），上述基础命令还包括其他控制拍摄效果的默认参数，例如白平衡的默认模式：日光模式、默认滤镜：无滤镜等等。
替换后，待处理命令队列1321中增加一条按需下发的重复帧控制命令1324（例如“cmd+streamIds+buffer+白炽灯模式+IsNew+Repeating”）。重复帧控制命令1324可新增两个字段IsNew和Repeating。IsNew可用于表示该命令为应用下发的按需控制。Repeating可用于表示该命令为重复帧控制。同时，重复帧控制命令1324可将原来默认的白平衡模式（例如上述日光模式）替换为白炽灯模式（即白平衡=白炽灯模式）。
此外，动态流水线还可为上述控制命令标记将其发送到从设备200的流向标签。例如，动态流水线可为上述控制命令添加流向标签Device。Device可通过摄像头编号（ID）来表示控制命令作用的对象。参考表1，当Device=1时，上述控制命令可表示流向主设备100前置镜头的控制命令，主设备100的本地相机HAL可收到上述控制命令。当Device=1002时，上述控制命令可表示流向从设备200后置普通镜头的控制命令，主设备100的虚拟相机HAL可收到上述控制命令。因此，将从设备200的白平衡模式设置为白炽灯模式的控制命令的Device可设置为1002。
然后,流处理模块中的多流配置可为上述控制命令添加流配置信息。上述控制命令包括预览控制1311和“将白平衡模式设置为白炽灯模式”的拍摄效果1312。即多流配置可为预览控制1311配置一条预览流(1080P)和一条分析流(720P)。多流配置具体的配置规则可参考图3中表2的介绍,这里不再赘述。
复用控制可将多流配置模块配置的多条流进行复用。经过复用控制，从设备200向主设备100传回的流的数量可以减少，从而降低网络负载，提升传输效率。具体的，复用控制可使用高质量的流覆盖低质量的流。例如，画面质量为720P的分析流可复用1080P的预览流。因此，对于要求预览流（1080P）和分析流（720P）的控制命令，从设备200可只传回1080P的预览流。
进一步的,根据流向标签Device=1002,上述控制命令可被发送至HAL层的虚拟相机HAL模块。如图13所示,虚拟相机HAL可将按需下发的控制命令1324过滤出来。过滤出的按需控制命令可存入发送命令队列1331中。主设备100可将发送队列1331中的控制命令发送至从设备200。
在一些实施例中,当控制命令为开始拍照/录像的命令时,多流配置还可配置拍照流、录像流。根据多流配置的不同,复用控制也会相应地发生变化。
图14示例性示出了当控制命令为拍照命令时,一种不同的流的复用与分流示例。图14可包括前处理部分和后处理部分。前处理部分可通过流处理的前处理模块完成,后处理部分可通过流处理的后处理模块完成,上述两个模块可参考图3中的介绍。
当相机设备会话141发送一条拍照命令时,多流配置142模块可为上述命令配置一条预览流(1080P)、一条分析流(720P)、一条录像流(4K)、一条拍照流(4K)。然后,复用控制143模块可将上述配置四条流的信息复用为配置两条流:一条预览流(1080P)、一条录像流(4K)。最后,主设备100将经过分流与复用的拍摄控制命令发送至从设备200。
在主设备100响应于接收到的用户操作,向从设备200发送用于控制拍摄效果的命令的过程中,主设备100可对控制命令中要求的图像流(流)进行复用。响应于复用后的控制命令,从设备200可以只发送复用后的图像流,从而减少了网络中传输的图像流,降低了网络负载,进而提升传输效率。
S105:从设备200接收到用于控制拍摄效果的命令,响应于该命令,从设备200可采集并处理图像。
从设备200的相机代理服务可接收主设备100发送的调整拍摄效果的控制命令。具体的，DMSDP可建立主设备100与从设备200之间的数据会话通道和控制会话通道。从设备200的相机代理服务模块可接收到上述控制会话通道发送的控制命令。响应于上述控制命令，从设备200可根据该控制命令中携带的拍摄参数来采集并处理图像。
具体的,从设备200可以根据控制命令中携带的拍摄参数来执行相应的操作。当控制命令中携带有硬件参数时,从设备200可以根据该硬件参数来采集图像,可包括但不限于:从设备200使用控制命令中指示的前置或后置摄像头、光学变焦范围、开启光学图像防抖、开启闪光灯、开启补光灯、30fps的帧率来采集图像。当控制命令中携带有软件参数时,从设备200可以根据该软件参数来对采集到的图像进行处理,可包括但不限于:对采集到的图像进行裁剪、色温校准、降噪、添加滤镜效果等等。
同样以上述“将从设备200的白平衡模式设置为白炽灯模式”的控制命令为例，当从设备收到上述命令后，从设备可对上述命令进行适配和解析，然后将上述命令发送至从设备200的本地相机HAL，最后得到处理结果。
图13中的从设备200部分示例性示出了上述控制命令在从设备200上的处理过程。
相机代理服务134可包括接收命令队列1341、surfaces映射表1344以及重复帧控制1342。接收命令队列1341可接收到主设备100发送的将从设备200的白平衡模式设置为白炽灯模式的控制命令。上述控制命令可参考图中命令1343（即“cmd+streamIds+Repeating+白炽灯模式+1002，一条1080P预览流”）。其中白炽灯模式可表示将从设备200的白平衡模式设置为白炽灯模式，1002可标识该命令发送的对象为从设备200的摄像头1002。
首先，相机代理服务134可根据surface映射表1344，将接收命令队列中的控制命令的StreamIds转换成对应从设备200的Surface。在相机服务中，streamId与surfaceId之间存在一一对应关系，即一个streamId对应一个surfaceId。surfaceId可用于标识surfaces。上述surfaces可作为显示从设备200生成的流的图像的载体。参考前述实施例的介绍，streamId可指示一条具体的流，surfaces可指示具体显示图像绘制的载体。从设备200需要根据主设备100发送的请求生成相应的流，因此，从设备200在接收到控制命令后，需要根据映射表1344将StreamIds转换成与之对应的surfaces。
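streamId与surface的一一映射可以用如下Java示意草图表达（类名与数据结构均为假设，并非本申请的实际实现）：首次见到某个streamId时为其创建surface标识，之后同一streamId始终映射到同一surface。
import java.util.HashMap;
import java.util.Map;

// 示意性草图：从设备侧把控制命令中的streamId换成本端surface标识
public class SurfaceMapper {
    private final Map<Integer, Integer> streamToSurface = new HashMap<>();
    private int nextSurfaceId = 1;

    // streamId与surfaceId一一对应；首次出现时创建新的surface标识
    public int surfaceFor(int streamId) {
        return streamToSurface.computeIfAbsent(streamId, id -> nextSurfaceId++);
    }
}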
然后，待处理命令队列1351可使用上述的重复帧控制替换原队列中的基础命令，得到待处理命令1352。从设备200可根据Device=1002，将上述命令发送至摄像头1002的命令处理队列1361。响应上述命令，摄像头1002的ISP可将白平衡模式设置为白炽灯模式，从而摄像头1002的ISP可将摄像头1002采集的图像进行白炽灯色温校准。上述ISP也可以是图像视频处理器（image video processor，IVP）/自然处理单元（natural processing unit，NPU）/数字信号处理（digital signal processing，DSP）等，本申请对此不作限制。
调整白平衡模式后,从设备200可得到新的图像。
参考图9A所示的用户界面91,在响应上述将从设备200的白平衡模式设置为白炽灯模式的控制命令之前,从设备200的摄像头1002采集并处理后的图像可参考窗口912显示内容。如窗口912所示,在默认的白平衡选项(例如日光模式)下,摄像头1002采集并处理后的图像可能亮度较低,图像偏灰。当响应于将从设备200的白平衡模式设置为白炽灯模式的控制命令之后,摄像头1002采集并处理后的图像可参考用户界面93中窗口931所示。此时,该图像亮度较高,画面整体色调更贴近人眼观察到的色彩。
可以理解的是,上述对从设备200响应于调整拍摄效果的控制命令之前和之后的图像效果的描述均为示例。上述描述不应构成对本申请实施例的限制。
S106:从设备200显示处理后的图像。
该步骤是可选的。同S103中所述，在一些实施例中，当从设备200同意主设备100使用自身摄像头后，从设备200可在本设备显示屏上显示自身摄像头采集并处理后的图像，参考图8A所示的用户界面81。因此，当响应调整拍摄效果的控制命令后，从设备200也可显示更新后的图像，如图9D所示的用户界面94。
在另一些实施例中，从设备200在打开摄像头时也可不显示自身摄像头采集并处理的图像。因此，当从设备200响应主设备100发送的调整拍摄效果的控制命令后，从设备200也可不显示调整后的图像。即从设备仅将获得的图像发送给主设备100使用，自身的显示屏并不显示自身摄像头采集并处理的图像。
S107:从设备200将处理后的图像发送给主设备100。
响应于主设备100发送的调整拍摄效果的控制命令，从设备200可得到新的图像。响应于主设备100要求的流的类型和数量，从设备200可相应地生成主设备100要求的流。在一些实施例中，从设备200可采集一组图像流。该组图像流为主设备100要求的流的类型和数量中图像质量最高的一组，上述图像质量最高可以是分辨率最高等等。然后，从设备200可将上述采集的一组图像流复制出多条流，并根据主设备100要求的流的类型和数量对复制出的流进行压缩、调整等处理，例如从采集的一组1080P的录像流中复制出一组720P的分析流。
同样以“将从设备200的白平衡模式设置为白炽灯模式”的控制命令为例,在前述的介绍中,复用控制模块可将多流配置模块配置的两条流复用为一条流。因此,主设备100最终向从设备200发送了要求一条1080P的预览流的控制命令。具体参考S103的介绍。
响应于主设备100发送的上述控制命令,从设备200的本地相机HAL可以产生一条1080P预览流。上述预览流已经将白平衡模式设置为白炽灯模式。然后,本地相机HAL可将上述流发送至从设备200的相机代理服务,进一步的,相机代理服务可将上述流传回主设备100。
在一些实施例中,从设备200向主设备100传回的流大于1条流,例如图14示例性示出的多条流情形。当主设备100向从设备200发送拍照控制命令时,多流配置和复用控制最后为上述拍照控制命令配置了一条预览流(1080P)和一条录像流(4K),具体可参考S103的介绍。
响应于上述拍照控制命令，从设备200可产生一条1080P预览流（流1）、一条4K录像流（流2）和一条拍照图像流（流3）。在可选的实施例中，从设备200可按流2的规格采集一组图像流。根据流2，从设备200可复制出另外两组流（流1'、流2'）。根据主设备100对流1和流3的要求，从设备200可对流1'和流2'进行处理，从而得到流1和流3，例如将4K的流1'进行压缩得到1080P的流1，从4K的流2'中获取拍照图像得到流3等等。然后，从设备200的相机代理服务可将上述3条流发送给主设备100。
S108:主设备100显示从设备200发送的图像。
主设备100可将从设备200传回的流恢复成主设备100实际需要的流的数量和类型。分析流可用于做图像处理，预览流可用于显示等。主设备100可根据恢复后的流的类型将各条流发送至相应的模块进行处理和利用。利用从设备200发送的预览流，主设备100可实现显示图像的功能。
主设备100的后处理模块可对从设备200传回的流进行智慧分流和多流输出。
首先，智慧分流模块可记录多流配置模块配置的流的数量和种类。当收到从设备200传回的经过复用的流后，智慧分流模块可根据前述的记录，将收到的流复制出多条流。上述复制的多条流可以与收到的流的质量相同，也可以比收到的流的质量略低，从而得到与前述多流配置模块配置的流的数量与类型相同的多条流。然后，分流后得到的多条流可被发送到相应的模块进行处理和利用。
例如,在将从设备200的白平衡模式设置为白炽灯模式的示例中,主设备100可收到从设备200传回的一条1080P预览流。此时,智慧分流模块可将上述预览流恢复成前述多流配置模块配置的一条预览流(1080P)、一条分析流(720P)。
图14还示出了另一个智慧分流的示例。该图示出了主设备100收到多条从设备200传回的流的分流过程。
如图所示,主设备100可收到一条1080P流(流1)、一条4K流(流2)和一条图片流(流3)。智慧分流模块可将流1分为一条1080P预览流和一条720P分析流。流2可分为一条1080P流(流1)和一条4K录像流,并且上述1080P流(流1)又可继续分流,参考流1的分流。流3可分为一条4K拍照流。
分流过程可将经过复用的流恢复成原来主设备100要求的流的数量和类型，从而在实现复用带来的降低网络负载、提升传输效率的同时，也满足应用程序原本的需求，不影响应用程序的正常使用。
分流后得到的多条流可分别被发送到应用层相应的模块，供应用程序使用。最基本地，应用程序可显示预览流。在一些实施例中，主设备100可只显示从设备200传回的图像。在另一些实施例中，主设备100可同时显示从设备200传回的图像和自身摄像头采集并处理的图像，参考图9C所示的用户界面93。
如图9C所示，主设备100显示从设备200传回的图像可通过浮窗的形式，参考窗口931。在一些实施例中，主设备100可检测到作用于窗口931的用户操作，响应于该操作，主设备100可移动浮窗的位置到用户界面93的任意位置，上述操作例如是长按拖动操作。在一些实施例中，主设备100可检测到作用于窗口931的另一种用户操作，响应于该操作，主设备100可交换浮窗和预览窗显示的内容，上述操作例如是点击操作。此外，主设备100还可响应用户的其他操作，调整浮窗的大小等等。
在另一些实施例中,主设备100同时显示从设备200传回的图像和自身摄像头采集并处理的图像时,还可划分成两个平铺的窗口显示,例如用户界面114。
可以理解的是,当主设备100连接多个从设备时,主设备100可显示上述多个从设备传回的图像。同样的,主设备100可以通过浮窗显示,可以平铺显示,本申请实施例对此不做限制。
在一些实施例中,当控制命令为调整参数后的拍照命令时,在后处理对从设备200传回的流进行分流之前,帧同步模块可对收到的从设备200传回的流进行同步。帧同步可减小由于网络传输产生的时延误差。特别是对于拍照流,经过帧同步的拍照流获得的拍照结果可更接近用户需求。
参考图15,图15示例性示出了帧同步方法示意图。
如图15所示，帧同步可包括三部分。第一部分151和第三部分153可表示主设备100上发生的过程。第二部分152可表示从设备200上发生的过程。下面以7个帧组成的流和拍照命令为例，介绍主设备100对拍照命令的帧同步过程。
在第二帧1511时,主设备100可创建一个拍照命令1512。然后,主设备100可通过有线连接或无线连接将上述命令发送至从设备200。从设备200在第三帧1521时收到上述命令。因此,从设备200将第三帧1521的图像作为执行拍照命令1512的结果。处理完拍照命令的从设备200可将上述处理结果传回主设备100。主设备100收到的处理结果为第三帧1521图像。此时,主设备100可对上述结果做同步处理。
在拍照场景中，帧同步可将处理结果向前推移。例如主设备100收到的处理结果为第三帧1521图像，主设备100可向前推移一帧，将第二帧1511作为同步后的处理结果。可以理解的是，从设备200延后一个帧时间收到命令仅为网络时延的示例。即在一些实施例中，从设备200收到命令的时间还可能是第四帧、第五帧等，从设备200收到命令的时延依据实际网络通信质量发生变化。同样的，主设备100向前推移一个帧作为同步结果也仅是举例示范。
在一些实施例中,主设备100向前推移也可能并不会得到第二帧1511作为同步后的处理结果。例如在第二帧时,主设备100可创建一个拍照命令。然后,从设备200在第四帧时收到上述命令。因此,从设备200将第四帧的图像作为执行拍照命令的结果。主设备100对收到第四帧图像做向前推移同步处理得到第三帧图像。
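上述“将结果向前推移”的帧同步思路可以用如下Java示意草图表达（其中按单向时延与帧间隔估计推移帧数的方式为假设，并非本申请的实际实现）：
// 示意性草图：按估计的传输时延，把从设备返回结果的帧序号向前推移
public class FrameSync {
    private final long frameIntervalMs; // 帧间隔，例如30fps约33ms

    public FrameSync(long frameIntervalMs) {
        this.frameIntervalMs = frameIntervalMs;
    }

    // resultFrameIndex：从设备实际执行命令的帧序号
    // oneWayDelayMs：估计的单向网络时延
    public long synced(long resultFrameIndex, long oneWayDelayMs) {
        long shift = Math.round((double) oneWayDelayMs / frameIntervalMs);
        return Math.max(0, resultFrameIndex - shift);
    }

    public static void main(String[] args) {
        FrameSync sync = new FrameSync(33);
        // 从设备在第3帧执行了拍照，单向时延约33ms，向前推移1帧
        System.out.println(sync.synced(3, 33)); // 2
    }
}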
当控制命令仅为调整参数的命令，或者录像命令时，例如上述“将白平衡模式设置为白炽灯模式”的控制拍摄效果的控制命令，主设备100可不对收到的图像进行帧同步。
S109:主设备100对显示的图像进行拍照、录像或转发等处理。
当主设备100的应用程序获得从设备200发送的图像后,主设备100还可对上述图像进行进一步的利用。
如图9C所示，在直播应用场景中，主设备100的直播应用程序可对获得的图像进行转发。具体的，主设备100可将上述图像转发到直播的服务器。服务器可将上述图像分发给观看直播的用户设备。
如图11D所示，在拍照或录像的应用场景中，主设备100可检测到作用于拍摄控件1146的用户操作，响应于该操作，主设备100可对获得的从设备200摄像头采集的图像进行拍照存储或录像存储。
在本申请实施例中:
在上述方法S103中,在接收到主设备发送的调整拍摄效果的控制命令之前,从设备采集并处理得到的图像可称为第一图像,例如用户界面91中的窗口912所示的图像。
在上述方法S108中，主设备向从设备发送调整拍摄效果的控制命令之后，从设备根据该控制命令采集并处理的图像，可以被称为第二图像，例如用户界面93中的窗口931所示的图像。
第一拍摄参数:从设备采集并处理得到第一图像使用的拍摄参数可称为第一拍摄参数,第一拍摄参数可以是默认参数,也可以是前一次接收的主设备发送的控制命令中携带的拍摄参数。
第二拍摄参数:从设备采集并处理得到第二图像使用的拍摄参数可称为第二拍摄参数,即主设备向从设备发送调整拍摄效果的控制命令中携带的拍摄参数。
主设备根据自身摄像头的默认拍摄参数采集并处理得到的图像可被称为第三图像,例如用户界面91中窗口911显示的图像。
用户可调整主设备的拍摄效果,响应于上述调整操作,主设备可控制自身的摄像头调整拍摄参数。主设备根据上述调整的拍摄参数采集并处理可得到新的图像,上述新的图像可被称为第四图像。例如用户界面103中窗口1031显示的图像。
在上述方法S104中,主设备根据应用层的要求配置的流的数量为第一数量,主设备将可复用的流进行复用后得到的,发送给从设备的流的数量为第二数量。主设备根据应用层的要求配置的流的类型为第一类型,主设备将可复用的流进行复用后得到的,发送给从设备的流的类型为第二类型。
下面介绍另一种可能的主设备100的软件结构。
与图3所示软件结构的不同之处在于:本实施例提供的软件框架将整个流生命周期的管理放在了HAL层实现。即在本申请实施例中,图3中服务层的流处理模块被移到了HAL层。其他模块不变。本申请实施例提供的另一种软件结构的其他部分可参考图3的介绍,本申请实施例对此不再赘述。
在HAL层实现上述多流配置、复用控制等关于流的前处理、后处理操作,可以实现更高的处理效率。
在跨设备协同拍摄中,主设备可连接一个或多个从设备,不仅可以为用户提供多视角的拍摄体验,还可以控制从设备的拍摄效果,例如控制从设备的对焦、曝光、变焦等等,从而满足用户控制远端拍摄效果的需求。进一步的,实施该跨设备的协同拍摄方法,主设备可以获取从设备所有的预览画面、拍照结果、录像结果。例如,主设备可以通过拍照、录像,对从设备的画面进行存储,或者在直播的应用场景中,将从设备的画面转发第三方服务器等。
实施该跨设备的协同拍摄方法还可以解决带操作系统的设备间的分布式控制,将电子设备的功能延伸到其他普通的硬件摄像设备,灵活拓展镜头。例如上述方法可以支持通过手机控制获得多设备的数据流,实现手机与大屏、手表、车机等的协同录制。
此外，跨设备协同拍摄中流的复用与分流还可降低网络传输的负载，提升传输效率，进而降低传输时延，保证画质清晰流畅。
上述跨设备协同拍摄的方法、系统及装置还可拓展到分布式音频场景中，例如：将分布式相机框架作用于分布式音频框架。分布式音频场景可以实现分布式音频与视频的统一，提升整个跨设备通信的效率。
上述实施例中所用,根据上下文,术语“当…时”可以被解释为意思是“如果…”或“在…后”或“响应于确定…”或“响应于检测到…”。类似地,根据上下文,短语“在确定…时”或“如果检测到(所陈述的条件或事件)”可以被解释为意思是“如果确定…”或“响应于确定…”或“在检测到(所陈述的条件或事件)时”或“响应于检测到(所陈述的条件或事件)”。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机命令。在计算机上加载和执行所述计算机程序命令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机命令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机命令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如DVD)、或者半导体介质(例如固态硬盘)等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来命令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。

Claims (15)

  1. 一种跨设备的协同拍摄方法,其特征在于,所述方法应用于主设备,所述主设备与m个从设备建立有通信连接,m为大于或等于1的整数,所述方法包括:
    所述主设备显示一个应用的界面;
    所述主设备接收到从设备发送的第一图像，所述第一图像是从设备根据第一拍摄参数得到的图像；
    所述主设备在所述界面上显示所述m个第一图像;
    所述主设备接收到至少一个操作;
    所述主设备响应于所述至少一个操作,向所述从设备发送携带第二拍摄参数的控制命令,所述第二拍摄参数用于调整所述从设备的拍摄效果;
    所述主设备接收到所述从设备发送的第二图像，所述第二图像是所述从设备根据所述第二拍摄参数得到的图像；
    所述主设备在所述界面上显示所述第二图像,并不再显示所述第一图像。
  2. 根据权利要求1所述的方法,其特征在于,
    所述界面还包括所述从设备对应的多个拍摄选项,所述多个拍摄选项分别对应于所述从设备的拍摄能力;
    其中,所述操作包括:作用于所述多个拍摄选项中一个拍摄选项的操作;所述第二拍摄参数包括:所述操作所作用的拍摄选项对应的拍摄参数。
  3. 根据权利要求2所述的方法,其特征在于,所述主设备在所述界面上显示多个拍摄选项之前,所述方法还包括:
    所述主设备获取所述从设备的拍摄能力;其中,所述第二拍摄参数在所述从设备的拍摄能力范围内。
  4. 根据权利要求1-3任一项所述的方法,其特征在于,所述方法还包括:
    所述主设备采集并处理图像,得到第三图像;
    所述主设备在所述界面上还显示所述第三图像。
  5. 根据权利要求4所述的方法,其特征在于,
    所述界面还包括所述主设备对应的多个拍摄选项,所述主设备对应的多个拍摄选项对应于所述主设备的拍摄能力;
    所述主设备接收到作用于所述主设备对应的多个拍摄选项中一个拍摄选项的另一个操作,根据所述另一个操作所作用的拍摄选项对应的拍摄参数,采集并处理图像,得到第四图像;
    所述主设备在界面上显示所述第四图像,并不再显示所述第三图像。
  6. 根据权利要求1-5中任一项所述的方法,其特征在于,所述主设备响应于所述至少一个操作,向所述从设备发送携带第二拍摄参数的控制命令之前,所述方法还包括:
    所述主设备确定第一数量和第一类型,所述第一数量为显示所述第二图像所需的图像流的数量,所述第一类型包括显示所述第二图像所需的图像流的类型;
    所述主设备确定第二数量和第二类型,所述第二数量小于第一数量,所述第一类型包含所述第二类型;
    其中,所述控制命令还携带有:所述第二数量、所述第二类型;
    所述第二图像包括：所述从设备根据所述拍摄参数，采集并处理得到的所述第二数量和所述第二类型的图像流。
  7. 根据权利要求6所述的方法,其特征在于,所述主设备接收到所述从设备发送的第二图像之后,所述方法还包括:
    所述主设备将所述第二图像处理为所述第一数量和所述第一类型的图像流;
    所述主设备在界面上显示所述第二图像,包括:所述主设备根据所述第一数量和所述第一类型的图像流,在所述界面上显示所述第二图像。
  8. 根据权利要求1-7任一项所述的方法,其特征在于,所述应用包括:拍摄类应用程序,所述界面包括所述拍摄类应用程序的一个使用界面。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,所述应用包括:直播类应用程序,所述界面包括所述直播类应用的一个使用界面;
    所述主设备接收到所述从设备发送的第二图像之后,所述方法还包括:
    所述主设备将所述第二图像发送给所述直播类应用程序对应的服务器,由所述服务器将所述第二图像发送给其他设备。
  10. 一种跨设备的协同拍摄方法,其特征在于,所述方法应用于从设备,所述从设备与主设备建立有通信连接,所述方法包括:
    所述从设备根据第一拍摄参数采集并处理得到第一图像;
    所述从设备将所述第一图像发送给所述主设备;
    所述从设备接收到所述主设备发送的携带第二拍摄参数的控制命令,所述第二拍摄参数用于调整所述从设备的拍摄效果;
    所述从设备根据所述第二拍摄参数,采集并处理得到第二图像;
    所述从设备将所述第二图像发送给所述主设备。
  11. 根据权利要求10所述的方法,其特征在于,所述方法还包括:
    所述从设备显示一个应用的界面;
    所述从设备在所述界面上显示所述第一图像;
    所述从设备根据所述第二拍摄参数,采集并处理得到第二图像;
    所述从设备在所述应用界面显示所述第二图像,并不再显示所述第一图像。
  12. 一种电子设备，其特征在于，所述电子设备包括一个或多个处理器和一个或多个存储器；其中，所述一个或多个存储器与所述一个或多个处理器耦合，所述一个或多个存储器用于存储计算机程序代码，所述计算机程序代码包括计算机指令，当所述一个或多个处理器执行所述计算机指令时，使得所述电子设备执行如权利要求1-9任一项所述的方法。
  13. 一种电子设备,其特征在于,所述电子设备包括一个或多个处理器和一个或多个存储器;其中,所述一个或多个存储器与所述一个或多个处理器耦合,所述一个或多个存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,当所述一个或多个处理器执行所述计算机指令时,使得所述电子设备执行如权利要求10-11任一项所述的方法。
  14. 一种计算机可读存储介质，包括指令，其特征在于，当所述指令在电子设备上运行时，使得所述电子设备执行如权利要求1-9任一项所述的方法。
  15. 一种计算机可读存储介质,包括指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求10-11任一项所述的方法。
PCT/CN2022/070618 2021-02-04 2022-01-07 跨设备的协同拍摄方法、相关装置及系统 WO2022166521A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202280009678.9A CN116724560A (zh) 2021-02-04 2022-01-07 跨设备的协同拍摄方法、相关装置及系统

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110154962.2 2021-02-04
CN202110154962.2A CN114866681B (zh) 2021-02-04 2021-02-04 跨设备的协同拍摄方法、相关装置及系统

Publications (1)

Publication Number Publication Date
WO2022166521A1 true WO2022166521A1 (zh) 2022-08-11

Family

ID=82623054

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/070618 WO2022166521A1 (zh) 2021-02-04 2022-01-07 跨设备的协同拍摄方法、相关装置及系统

Country Status (2)

Country Link
CN (3) CN114866681B (zh)
WO (1) WO2022166521A1 (zh)


Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5977498B2 (ja) * 2011-10-14 2016-08-24 キヤノン株式会社 撮像装置、撮像装置の制御方法
CN103634524A (zh) * 2013-11-15 2014-03-12 北京智谷睿拓技术服务有限公司 相机系统的控制方法、控制设备及相机系统
KR102145542B1 (ko) * 2014-08-14 2020-08-18 삼성전자주식회사 촬영 장치, 복수의 촬영 장치를 이용하여 촬영하는 촬영 시스템 및 그 촬영 방법
CN104601960B (zh) * 2015-01-30 2018-01-12 深圳市视晶无线技术有限公司 影视拍摄控制管理方法及系统
CN106657791A (zh) * 2017-01-03 2017-05-10 广东欧珀移动通信有限公司 一种合成图像的生成方法及装置
CN106803879A (zh) * 2017-02-07 2017-06-06 努比亚技术有限公司 协同取景拍摄装置及方法
CN107707862A (zh) * 2017-05-25 2018-02-16 北京小米移动软件有限公司 视频远程协助的处理方法和装置、第一终端、第二终端
CN109120504B (zh) * 2017-06-26 2023-05-09 深圳脸网科技有限公司 一种影像设备共享方法及其社交方法
CN111327865B (zh) * 2019-11-05 2021-12-28 杭州海康威视系统技术有限公司 视频传输方法、装置及设备
CN111050072B (zh) * 2019-12-24 2022-02-01 Oppo广东移动通信有限公司 一种异地合拍方法、设备以及存储介质
CN111083379A (zh) * 2019-12-31 2020-04-28 维沃移动通信(杭州)有限公司 拍摄方法及电子设备
CN111988528B (zh) * 2020-08-31 2022-06-24 北京字节跳动网络技术有限公司 拍摄方法、装置、电子设备及计算机可读存储介质
CN112261430A (zh) * 2020-10-21 2021-01-22 深圳市炫刷刷网络科技有限公司 一种带有一个以上移动摄像装置的直播系统及其直播方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104427228A (zh) * 2013-08-22 2015-03-18 展讯通信(上海)有限公司 协作拍摄系统及其拍摄方法
US20150126121A1 (en) * 2013-11-05 2015-05-07 Samsung Electronics Co., Ltd Display apparatus and method of controlling display apparatus
CN104113697A (zh) * 2014-08-01 2014-10-22 广东欧珀移动通信有限公司 协同拍照处理方法和装置、拍照处理方法和装置
CN108668071A (zh) * 2017-03-29 2018-10-16 至美世界(北京)网络科技有限公司 一种拍摄方法、装置、系统及一种移动终端
CN110291774A (zh) * 2018-03-16 2019-09-27 深圳市大疆创新科技有限公司 一种图像处理方法、设备、系统及存储介质
CN108900764A (zh) * 2018-06-06 2018-11-27 三星电子(中国)研发中心 拍摄方法和电子装置以及拍摄控制方法和服务器

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115379126A (zh) * 2022-10-27 2022-11-22 荣耀终端有限公司 一种摄像头切换方法及相关电子设备
CN116471429A (zh) * 2023-06-20 2023-07-21 上海云梯信息科技有限公司 基于行为反馈的图像信息推送方法及实时视频传输系统
CN116471429B (zh) * 2023-06-20 2023-08-25 上海云梯信息科技有限公司 基于行为反馈的图像信息推送方法及实时视频传输系统

Also Published As

Publication number Publication date
CN114866681A (zh) 2022-08-05
CN115514883A (zh) 2022-12-23
CN114866681B (zh) 2023-12-01
CN116724560A (zh) 2023-09-08
CN115514883B (zh) 2023-05-12

Similar Documents

Publication Publication Date Title
WO2020238871A1 (zh) 一种投屏方法、系统及相关装置
US20230055623A1 (en) Video shooting method and electronic device
WO2022166521A1 (zh) 跨设备的协同拍摄方法、相关装置及系统
WO2022105803A1 (zh) 摄像头调用方法、系统及电子设备
WO2022160985A1 (zh) 一种分布式拍摄方法,电子设备及介质
WO2023226612A1 (zh) 一种曝光参数确定方法和装置
WO2023005900A1 (zh) 一种投屏方法、电子设备及系统
WO2022262416A1 (zh) 音频的处理方法及电子设备
CN115359105A (zh) 景深扩展图像生成方法、设备及存储介质
WO2022156721A1 (zh) 一种拍摄方法及电子设备
CN116170629A (zh) 一种传输码流的方法、电子设备及计算机可读存储介质
US20230350629A1 (en) Double-Channel Screen Mirroring Method and Electronic Device
WO2022222773A1 (zh) 拍摄方法、相关装置及系统
WO2023160295A1 (zh) 视频处理方法和装置
CN114466131B (zh) 一种跨设备的拍摄方法及相关设备
US12028300B2 (en) Method, apparatus, and system for sending pictures after thumbnail selections
US20230208790A1 (en) Content sharing method, apparatus, and system
WO2023143171A1 (zh) 一种采集音频的方法及电子设备
WO2023142731A1 (zh) 一种分享多媒体文件的方法、发送端设备和接收端设备
WO2023169237A1 (zh) 一种截屏方法、电子设备及系统
WO2023160224A9 (zh) 一种拍摄方法及相关设备
CN117082295B (zh) 图像流处理方法、设备及存储介质
WO2022206600A1 (zh) 一种投屏方法、系统及相关装置
WO2023231585A1 (zh) 视频拍摄方法、装置、设备和存储介质
WO2022228214A1 (zh) 设备发现方法、系统及其电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22748800

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202280009678.9

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22748800

Country of ref document: EP

Kind code of ref document: A1