WO2022160985A1 - Distributed photographing method, electronic device and medium - Google Patents

Distributed photographing method, electronic device and medium

Info

Publication number
WO2022160985A1
Authority
WO
WIPO (PCT)
Prior art keywords
shooting
mobile phone
camera
image data
image
Application number
PCT/CN2021/137917
Other languages
English (en)
Chinese (zh)
Inventor
冯可荣
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2022160985A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72415: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances
    • H04M 1/7243: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M 1/72439: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules

Definitions

  • the embodiments of the present application relate to the field of photographing technologies, and in particular, to a distributed photographing method, an electronic device, and a medium.
  • the electronic device may be a mobile phone, a tablet computer, a smart watch, or a smart TV.
  • an electronic device (called the local device) can not only present the picture captured by its own camera to the user, but also acquire and display the picture captured by the camera of another electronic device (called the remote device).
  • the local device can obtain the picture of the remote device in the following way: (1) the local device sends a control command to the remote device, instructing the remote device to collect images and return the preview result to the local device; (2) after receiving the control command, the remote device collects images and performs image signal processing (ISP) on the collected images to obtain a preview result; (3) the remote device sends the preview result to the local device.
  • the present application provides a distributed shooting method, an electronic device and a medium, which can reduce the network bandwidth occupied by the remote device and the local device for data transmission during the process of realizing the distributed shooting.
  • the present application provides a distributed shooting method, which is applied to a first device.
  • the first device may receive a first operation of the user selecting to perform synchronous shooting with the second device.
  • the first device may start acquiring the first image data and instruct the second device to start acquiring the image data.
  • the first device may receive second image data from the second device, the second image data including the original image.
  • the first device may display the first shooting picture corresponding to the first image data in the first window, and display the second shooting picture corresponding to the second image data in the second window.
  • the first window and the second window are located on the same display interface.
  • the above-described original image may also be referred to as a RAW image.
  • RAW can be translated as "unprocessed". That is, the above-mentioned original image is an image that has not been processed by the second device. Compared with the image processed by the second device, the data amount of the original image is smaller. In this way, the network bandwidth occupied by the distributed shooting service across the devices can be reduced.
  • the above method may further include: the first device sends first image data to the second device, where the first image data includes an original image.
  • the original image (also referred to as a RAW image) is an image that has not been processed by the first device.
  • the first device can simultaneously display the shooting screen of the first device and the shooting screen of the second device
  • the second device can simultaneously display the shooting screen of the first device and the shooting screen of the second device.
  • the method of the present application may further include: the first device acquires a shooting capability parameter of the second device, where the shooting capability parameter is used to indicate an image processing algorithm supported by the second device.
  • the first device may perform image processing on the first image data according to the shooting capability parameter of the first device to obtain the first shooting picture, and perform image processing on the second image data according to the shooting capability parameter of the second device to obtain the second shooting picture.
  • the shooting capability parameter of the first device is used to indicate an image processing algorithm supported by the first device.
  • the first device performs image processing on the RAW image collected by the second device, and this image processing has the same or a similar effect as the image processing that the second device itself would perform. In this way, the picture effect of the second device can be restored on the first device.
  • alternatively, the first device may perform image processing on the first image data according to the shooting capability parameter of the first device to obtain the first shooting picture, and perform image processing on the second image data also according to the shooting capability parameter of the first device to obtain the second shooting picture.
  • the shooting capability parameter of the first device is used to indicate an image processing algorithm supported by the first device.
  • the first device performs image processing on the RAW images collected by the first device and the second device, so that the image effects of the shooting screen of the second device and the shooting screen of the first device can be consistent.
  • the first device does not need to acquire and save the shooting capability parameters of the second device.
  • the network bandwidth occupied by the distributed shooting service across the devices can be further reduced, and the storage space of the first device can also be saved.
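  • For illustration, the following minimal Java sketch contrasts the two processing designs described above; all type and method names (RawFrame, ShootingCapability, ImagePipeline) are hypothetical, not identifiers from the patent.

```java
// Hypothetical types standing in for the patent's concepts.
interface RawFrame {}                        // an unprocessed (RAW) image
interface Frame {}                           // a processed shooting picture
interface ShootingCapability {
    java.util.List<String> supportedAlgorithms();  // e.g. "beauty", "filter"
}
interface ImagePipeline {
    Frame process(RawFrame raw, java.util.List<String> algorithms);
}

final class DistributedRenderer {
    private final ImagePipeline pipeline;
    DistributedRenderer(ImagePipeline pipeline) { this.pipeline = pipeline; }

    // Design 1: restore the second device's picture effect on the first device
    // by processing its RAW frames with the second device's capability parameter.
    Frame renderPeerEffect(RawFrame peerRaw, ShootingCapability peerCapability) {
        return pipeline.process(peerRaw, peerCapability.supportedAlgorithms());
    }

    // Design 2: process every RAW stream with the first device's own capability,
    // so both windows share one picture effect and the peer's capability
    // parameter never needs to be transferred or stored.
    Frame renderUnified(RawFrame raw, ShootingCapability localCapability) {
        return pipeline.process(raw, localCapability.supportedAlgorithms());
    }
}
```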
  • the above-mentioned first device instructing the second device to start capturing image data includes: the first device sends a shooting instruction to the second device.
  • the shooting instruction includes a preset mark
  • the shooting instruction is used to instruct the second device to collect image data
  • the preset mark is used to instruct the second device to transmit the original image collected by the second device to the first device.
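  • As a hedged illustration of such a shooting instruction, the sketch below assumes a simple JSON wire format; the patent only specifies that the instruction carries a preset mark requesting RAW transfer, so the field names here are assumptions.

```java
// Sketch of a shooting instruction carrying the "preset mark". The field
// names and JSON layout are illustrative assumptions.
final class ShootingInstruction {
    final String command;      // e.g. "START_CAPTURE": instructs image collection
    final boolean rawTransfer; // the preset mark: ask the second device to
                               // return unprocessed RAW frames

    ShootingInstruction(String command, boolean rawTransfer) {
        this.command = command;
        this.rawTransfer = rawTransfer;
    }

    String toJson() {
        return "{\"command\":\"" + command + "\",\"rawTransfer\":" + rawTransfer + "}";
    }
}
```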
  • the first device may receive the second operation before the first device receives the first operation.
  • the second operation is the user's click on a function button for enabling distributed shooting, where the function button is included in at least one of: the preview interface of the camera application of the first device, the chat interface of a video communication application of the first device, the control center of the first device, the drop-down menu, or the negative screen.
  • the first device displays a list of candidate devices in the preview interface.
  • the second device is included in the candidate device list.
  • the first operation is an operation in which the user selects the second device in the candidate device list.
  • the above-mentioned synchronous shooting includes at least one of synchronous video recording, synchronous photographing, synchronous live broadcast, or synchronous video call. That is, the method of the present application can be applied to the process of video recording, photographing, live broadcasting, or video calling with other devices by the first device.
  • the method of the present application further includes: the first device creates a control session and a data session between the first device and the second device.
  • the control session is used to transmit control commands between the first device and the second device, where the control commands include a shooting instruction used to instruct the second device to capture an image; the data session is used to transmit the second image data from the second device.
  • the first device and the second device can distinguish the transmission paths of the control flow and the data flow during the process of executing the distributed photographing service across the devices. In this way, the problem that the control flow and the data flow mutually occupy the bandwidth can be avoided.
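  • A minimal sketch of this separation, assuming plain TCP transport and hypothetical port numbers: keeping commands and bulk image data on distinct connections prevents them from contending for each other's bandwidth.

```java
import java.io.IOException;
import java.net.Socket;

// Illustrative only: the class name and ports are assumptions, not the
// patent's actual transport.
final class DeviceSessions implements AutoCloseable {
    private final Socket controlSession; // small, latency-sensitive commands
    private final Socket dataSession;    // bulk RAW image frames

    DeviceSessions(String peerHost) throws IOException {
        controlSession = new Socket(peerHost, 50010); // hypothetical control port
        dataSession    = new Socket(peerHost, 50011); // hypothetical data port
        controlSession.setTcpNoDelay(true);           // send commands immediately
    }

    void sendCommand(byte[] command) throws IOException {
        controlSession.getOutputStream().write(command);
    }

    void sendFrame(byte[] rawFrame) throws IOException {
        dataSession.getOutputStream().write(rawFrame);
    }

    @Override public void close() throws IOException {
        controlSession.close();
        dataSession.close();
    }
}
```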
  • the present application provides an electronic device; the electronic device is a first device, and includes: one or more cameras, one or more processors, a display screen, a memory, and a communication module.
  • the above camera, display screen, memory, communication module and processor are coupled.
  • the memory is used to store computer program codes, and the computer program codes include computer instructions.
  • when the processor executes the computer instructions, the electronic device is made to execute the method described in the first aspect and any possible design manner thereof.
  • the present application provides a chip system, which is applied to an electronic device including a display screen, a memory, and a communication module. The chip system includes one or more interface circuits and one or more processors; the interface circuits and the processors are interconnected through lines. The interface circuits are configured to receive signals from the memory of the electronic device and send the signals to the processors, the signals including the computer instructions stored in the memory. When the processors execute the computer instructions, the electronic device executes the method described in the first aspect and any possible design manner thereof.
  • the electronic device is the first device.
  • the present application provides a computer storage medium, the computer storage medium including computer instructions; when the computer instructions are run on an electronic device, the electronic device is caused to perform the method described in the first aspect and any possible design manner thereof.
  • the present application provides a computer program product, which, when the computer program product runs on a computer, causes the computer to execute the method described in the first aspect and any possible design manners thereof.
  • the present application provides a distributed shooting system, which includes the first device described in the second aspect and any possible design manner thereof, and the second device involved in the first aspect and any possible design manner thereof.
  • FIG. 1 is a schematic diagram of the architecture of a distributed shooting system provided by an embodiment of the present application.
  • FIG. 2A is a schematic diagram of functions implemented by a distributed shooting method provided by an embodiment of the present application.
  • FIG. 2B is a schematic diagram of a hardware structure of a mobile phone according to an embodiment of the present application.
  • FIG. 2C is a schematic diagram of a software architecture of the local device 101 or the remote device 102.
  • FIG. 2D is a schematic diagram of the software architecture of the local device 101 and the remote device 102.
  • FIG. 2E is a schematic diagram of a software architecture of a local device 101 and a remote device 102 according to an embodiment of the present application.
  • FIG. 2F is a simplified diagram of the software architecture shown in FIG. 2E.
  • FIG. 3A is a schematic diagram of a distributed shooting interface provided by an embodiment of the present application.
  • FIG. 3B is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 6A is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 6B is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of a data structure of a shooting capability parameter provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of a data transmission flow between a local device and a remote device according to an embodiment of the present application.
  • FIG. 14 is a schematic flowchart of data processing and transmission in a remote device provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of another distributed shooting interface provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of a data transmission flow between a local device and a remote device according to an embodiment of the present application.
  • FIG. 17 is a schematic structural diagram of a chip system provided by an embodiment of the present application.
  • the terms "first" and "second" are only used for descriptive purposes, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • a feature defined as "first" or "second" may expressly or implicitly include one or more of that feature.
  • "plural" means two or more.
  • the distributed shooting system 100 may include a local device 101 and N remote devices 102 , where N is an integer greater than 0.
  • the local device 101 and any remote device 102 may communicate in a wired manner or wirelessly.
  • the local device 101 is the first device, and the remote device 102 is the second device.
  • a wired connection may be established between the local device 101 and the remote device 102, for example, using a universal serial bus (USB).
  • a wireless connection may be established between the local device 101 and the remote device 102 using, for example, the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), 5G and subsequent standard protocols, Bluetooth, wireless fidelity (Wi-Fi), NFC, voice over Internet protocol (VoIP), or communication protocols that support a network slicing architecture.
  • the local device 101 may directly communicate with the remote device 102 by using the above-mentioned wireless connection. In other embodiments, the local device 101 may use the above wireless connection to communicate with the remote device 102 through a cloud server, which is not limited in this embodiment of the present application.
  • both the local device 101 and the remote device 102 can be provided with one or more cameras.
  • the local device 101 can use its own camera to collect image data
  • the remote device 102 can also use its own camera to collect image data.
  • the local device 101 and the remote device 102 can use their own cameras to collect image data at the same time; the remote device 102 can send its image data to the local device 101, and the local device 101 can simultaneously display the image data from both the local device 101 and the remote device 102, so as to realize the cross-device distributed shooting function.
  • the local device 101 may include a preset application (application, APP).
  • the local device 101 can implement the above-mentioned distributed shooting function across devices through the application.
  • the preset application may be a system camera application or a third-party camera application.
  • the system camera application or the third-party camera application may be collectively referred to as a camera application.
  • the system application may also be referred to as an embedded application; an embedded application is an application program provided as part of the implementation of an electronic device (e.g., the local device 101 or the remote device 102).
  • Third-party applications may also be referred to as downloadable applications.
  • a downloadable application is an application that can provide its own internet protocol multimedia subsystem (IMS) connection.
  • the downloadable application may be an application pre-installed in the electronic device or may be an application downloaded and installed on the electronic device by a user.
  • the local device 101 can be connected to the remote device 102, and can also control the remote device 102 to open the camera of the remote device 102 through the connection between the local device 101 and the remote device 102.
  • an image preview of the remote device 102 can be implemented on the local device 101 to control the preview effect.
  • the local device 101 can control the remote device 102 to take pictures, and control the photographing effect of the remote device 102 .
  • the local device 101 can control the video recording of the remote device 102 and control the video recording effect of the remote device 102 .
  • in this way, the above-mentioned cross-device distributed shooting function is implemented. As shown in FIG. 2A, the local device 101 can display the shooting pictures of the local device 101 and the remote device 102. Thus, on the local device 101, the user can see not only the shooting picture of the local device 101 but also the shooting picture of the remote device 102.
  • the local device 101 can also control the remote device 102 to turn off the camera of the remote device 102 through the connection between the local device 101 and the remote device 102.
  • the local device 101 may specifically be a mobile phone, a tablet computer, a TV (also referred to as a smart TV, a smart screen, or a large-screen device), a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a camera (such as a digital camera), a video camera (such as a digital video camera), a netbook, a personal digital assistant (PDA), a wearable electronic device (such as a smart watch, a smart bracelet, or smart glasses), a vehicle-mounted device, a virtual reality device, or another electronic device with a shooting function, which is not limited in this embodiment of the present application.
  • FIG. 2B shows a schematic structural diagram of the mobile phone.
  • the mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, and the like.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the mobile phone.
  • the mobile phone may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 is a cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the mobile phone.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the wireless communication module 160 can provide wireless communication solutions applied on the mobile phone, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the antenna 1 of the mobile phone is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone can communicate with the network and other devices through wireless communication technology.
  • the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like. Display screen 194 includes a display panel. In some embodiments, the handset may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the mobile phone may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the mobile phone can implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, and an application processor. Such as music playback, recording, etc.
  • the sensor module 180 may include a pressure sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • the mobile phone may also include a charging management module, a power management module, a battery, a button, an indicator, and one or more SIM card interfaces, which are not limited in this embodiment of the present application.
  • FIG. 2C is a block diagram of a software structure of a mobile phone according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into five layers, which are, from top to bottom: the application layer, the application framework layer, the Android runtime and system libraries, the hardware abstraction layer (HAL), and the kernel layer.
  • the Android system is used here as an example for illustration; in other operating systems (such as HarmonyOS or iOS), as long as the functions implemented by the respective functional modules are similar to those in the embodiments of the present application, the solution of the present application can also be implemented.
  • the application layer can include a series of application packages.
  • applications such as calls, memos, browsers, contacts, gallery, calendar, maps, Bluetooth, music, video, and short messages can be installed in the application layer.
  • an application with a shooting function (for example, a camera application) may be installed in the application layer. In addition, other applications can call the camera application to realize the shooting function, as sketched below.
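  • For example, on stock Android another application can invoke the installed camera application through the standard Intent mechanism; this is ordinary Android usage shown only to illustrate the idea, not an API defined by this application.

```java
import android.app.Activity;
import android.content.Intent;
import android.provider.MediaStore;

public class CaptureLauncher {
    static final int REQUEST_IMAGE_CAPTURE = 1;

    static void launchCamera(Activity activity) {
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        if (intent.resolveActivity(activity.getPackageManager()) != null) {
            // The camera application captures the photo and returns the
            // result to the caller in onActivityResult().
            activity.startActivityForResult(intent, REQUEST_IMAGE_CAPTURE);
        }
    }
}
```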
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a resource manager, a notification manager, etc., which is not limited in this embodiment of the present application.
  • the above-mentioned window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • the above content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system described above can be used to build the display interface of an application.
  • Each display interface can consist of one or more controls.
  • controls may include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the above resource managers provide various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager described above enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll-bar text (such as notifications of applications running in the background), or display notifications on the screen in the form of dialog windows, for example, prompting text information in the status bar, playing a sound, vibrating, or flashing the indicator light.
  • the Android runtime includes core libraries and virtual machines. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules, for example: a surface manager, media libraries (Media Libraries), a 3D graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is located under the HAL and is the layer between hardware and software.
  • the kernel layer at least includes a display driver, a camera driver, an audio driver, a sensor driver, and the like, which are not limited in this embodiment of the present application.
  • a camera service (Camera Service) is set in the application framework layer.
  • the camera application can start the Camera Service by calling the preset API.
  • the Camera Service can interact with the Camera HAL in the hardware abstraction layer (HAL) while running.
  • the Camera HAL is responsible for interacting with the hardware devices (such as cameras) that realize the shooting function in the mobile phone.
  • on the one hand, the Camera HAL hides the implementation details of the relevant hardware devices (such as specific image processing algorithms); on the other hand, it provides the Android system with interfaces for calling the relevant hardware devices.
  • the relevant control commands (such as preview, zoom in, photographing, or video recording instructions) issued by the user may be sent to the Camera Service.
  • the Camera Service can send the received control commands to the Camera HAL, so that the Camera HAL can call the camera driver in the kernel layer according to the received control commands, and the camera driver drives the camera and other hardware devices to collect images in response to the control commands. data.
  • the camera can transmit each frame of image data collected to the Camera HAL through the camera driver at a certain frame rate.
  • for the transfer process of the control command inside the operating system, reference may be made to the specific transfer process of the control flow in FIG. 2C.
  • the Camera Service can determine the shooting strategy at this time according to the received control command, and the specific image processing task that needs to be performed on the collected image data is set in the shooting strategy. For example, in preview mode, Camera Service can set image processing task 1 in the shooting strategy to implement the face detection function. For another example, if the user enables the beauty function in the preview mode, the Camera Service can also set image processing task 2 in the shooting strategy to implement the beauty function. Furthermore, the Camera Service can send the determined shooting strategy to the Camera HAL.
  • after the Camera HAL receives each frame of image data collected by the camera, it can perform the corresponding image processing tasks on the image data according to the shooting strategy issued by the Camera Service, and obtain each frame of the shooting picture after image processing. For example, the Camera HAL can perform image processing task 1 on each received frame of image data according to shooting strategy 1, and obtain each corresponding frame of the shooting picture. After shooting strategy 1 is updated to shooting strategy 2, the Camera HAL can perform image processing task 2 and image processing task 3 on each received frame of image data according to shooting strategy 2, and obtain each corresponding frame of the shooting picture (see the sketch below).
  • the Camera HAL can report each frame of the captured picture after image processing to the camera application through the Camera Service, and the camera application can display each frame of the captured picture on the display interface, or save each frame of the captured picture in the mobile phone in the form of a photo or video.
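  • A hedged sketch of the shooting-strategy idea described above: the Camera Service derives a list of per-frame image processing tasks from the received control command, and the Camera HAL applies those tasks to each frame. The enum values and class names are illustrative assumptions, not actual Camera Service code.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical task names for the examples in the text (face detection in
// preview mode, beauty when the user enables it).
enum ImageProcessingTask { FACE_DETECTION, BEAUTY, FILTER, ANTI_SHAKE }

final class ShootingStrategy {
    final List<ImageProcessingTask> tasks = new ArrayList<>();

    // e.g. preview mode -> face detection; beauty enabled -> add beauty task
    static ShootingStrategy forCommand(String command, boolean beautyEnabled) {
        ShootingStrategy strategy = new ShootingStrategy();
        if ("PREVIEW".equals(command)) {
            strategy.tasks.add(ImageProcessingTask.FACE_DETECTION);
        }
        if (beautyEnabled) {
            strategy.tasks.add(ImageProcessingTask.BEAUTY);
        }
        return strategy;
    }
}
```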
  • FIG. 2D shows a schematic diagram of a software architecture of the local device 101 and the remote device 102 .
  • the solution for realizing cross-device distributed shooting by the local device 101 and the remote device 102 is described with reference to the software architecture shown in FIG. 2D .
  • the local device 101 has the functions of device discovery and remote registration.
  • the local device 101 can discover the remote device 102, such as the TV 1, the watch 2, and the mobile phone 3 shown in FIG. 4.
  • the local device 101 can register the remote device 102 (such as the TV 1) on the local device 101.
  • the local device 101 may create a Distributed Mobile Sensing Development Platform (DMSDP) HAL for the remote device 102 at the hardware abstraction layer (HAL) of the local device 101, which may also be called a virtual camera HAL (i.e., virtual Camera HAL).
  • DMSDP HAL does not correspond to the actual hardware device of the local device 101, but corresponds to the remote device 102 currently connected to the local device 101.
  • the DMSDP HAL shown in FIG. 2D is a HAL created by the local device 101 according to the shooting capability parameters of the remote device 102 .
  • the local device 101 can send and receive data with the remote device 102 through the DMSDP HAL, and the remote device 102, serving as a virtual device of the local device 101, cooperates with the local device 101 to complete the cross-device distributed shooting service.
  • the process may include steps 1 to 10.
  • Step 1 The application layer of the local device 101 issues a control command for controlling the camera of the remote device 102 to the camera service of the service layer.
  • Step 2 The service layer of the local device 101 sends the control command to the DMSDP HAL of the HAL.
  • Step 3 The DMSDP HAL of the local device 101 transmits a control command to the remote device 102.
  • Step 4 After receiving the control command, the remote device 102 transmits the control command to the service layer.
  • Step 5 The service layer of the remote device 102 transmits a control command to the Camera HAL of the HAL.
  • the Camera HAL of the remote device 102 corresponds to the actual hardware device of the remote device 102.
  • the HAL of the remote device 102 can call its bottom layer (eg, the kernel layer Kernel) to perform corresponding shooting tasks (eg, start the camera, switch the camera, or adjust the relevant parameters of the camera, etc.) according to the control command.
  • the data transmitted in the above steps 1-step 5 may be referred to as a control flow.
  • Step 6 The HAL of the remote device 102 uploads the preview stream to the camera service of the service layer.
  • the preview stream includes one or more frames of preview images collected by the camera device at the bottom layer of the remote device 102 and processed by the image signal processor (ISP) at the bottom layer.
  • Step 7 The service layer of the remote device 102 transmits the preview stream to the application layer of the remote device 102.
  • Step 8 The application layer of the remote device 102 transmits the preview stream to the DMSDP HAL of the local device 101.
  • Step 9 The DMSDP HAL of the local device 101 transmits the preview stream to the camera service of the service layer.
  • Step 10 The service layer of the local device 101 reports the preview stream to the application layer.
  • the camera application of the local device 101 can present the preview stream from the remote device 102 to the user (such as the above-mentioned shooting picture 2, shooting picture 3 or shooting picture c).
  • the camera application of the local device 101 can also present the preview stream from the local device 101 to the user, and the specific process can refer to the related introduction of the conventional technology, which will not be repeated here.
  • the data transmitted in the above steps 6-step 10 may be referred to as a preview stream or a data stream.
  • the service layer of the local device 101 or the remote device 102 may be included in the framework layer, that is, the service layer may be implemented in the framework layer.
  • alternatively, the service layer of the local device 101 or the remote device 102 may be independent of the framework layer, which is not limited in this embodiment of the present application.
  • in the above solution, the preview stream transmitted from the remote device 102 to the local device 101 has been through image processing, so its data volume is increased; transmitting it occupies a large amount of network bandwidth and wastes network resources.
  • in addition, the image processing performance of the remote device 102 may be low, so its rendering efficiency is low and a relatively long time delay is incurred.
  • the transmission paths of the control flow and the data flow (or preview flow) are not distinguished, and the control flow and the data flow may occupy the bandwidth of each other.
  • An embodiment of the present application provides a distributed shooting method.
  • in this method, the local device 101 can directly obtain, from the remote device 102, the original image captured by the remote device 102. The original image may also be referred to as a RAW image; RAW can be translated as "unprocessed". That is to say, the original image collected by the remote device 102 is an image that has not been processed by the remote device 102.
  • in the following, a RAW image is used to represent the original image when the method of the embodiments of the present application is introduced.
  • compared with the processed image, the data amount of the RAW image is smaller. In this way, the network bandwidth occupied by the cross-device distributed shooting service can be reduced.
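  • A back-of-the-envelope comparison, assuming the alternative to RAW is an uncompressed, image-processed preview stream (the resolution and pixel formats below are illustrative assumptions; if the preview stream were compressed, the comparison would differ):

```java
// Rough per-stream bandwidth at 1080p / 30 fps for three frame formats.
public class BandwidthEstimate {
    public static void main(String[] args) {
        final long pixels = 1920L * 1080L;  // one 1080p frame
        final int fps = 30;

        double raw10  = pixels * 10 / 8.0;  // RAW10 Bayer: 10 bits/pixel
        double yuv420 = pixels * 12 / 8.0;  // processed YUV420: 12 bits/pixel
        double rgba   = pixels * 32 / 8.0;  // processed RGBA8888: 32 bits/pixel

        System.out.printf("RAW10:  %.1f MB/s%n", raw10 * fps / 1e6);  // ~77.8 MB/s
        System.out.printf("YUV420: %.1f MB/s%n", yuv420 * fps / 1e6); // ~93.3 MB/s
        System.out.printf("RGBA:   %.1f MB/s%n", rgba * fps / 1e6);   // ~248.8 MB/s
    }
}
```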
  • the local device 101 processes the RAW image from the remote device 102, which can avoid the problem of increasing the time delay due to the large performance difference between the two devices.
  • the time delay of the distributed shooting service can be reduced.
  • the transmission paths of the control flow and the data flow can be distinguished. In this way, it is possible to avoid the problem of the control flow and the data flow occupying the bandwidth with each other.
  • FIG. 2E shows a schematic diagram of a software architecture of a local device 101 and a remote device 102 provided by an embodiment of the present application.
  • the local device 101 may include: an application layer 200, a framework layer 201, a service layer 201a, and a HAL 202;
  • the remote device 102 may include: an application layer 210, a framework layer 211, a service layer 211a, and a HAL 212.
  • the service layer can be implemented in the framework layer.
  • the service layer 201 a may be implemented in the framework layer 201
  • the service layer 211 a may be implemented in the framework layer 211 .
  • the application layer may include a series of application packages.
  • the application layer 200 may include multiple applications, such as a camera application and a Dv application 200a.
  • the camera application can also be called to realize the shooting function.
  • the application layer 210 includes a camera proxy service 210a for supporting the remote device 102 and the local device 101 to cooperate to complete a distributed shooting service.
  • the camera proxy service 210a may also be called a camera proxy application, which is not limited in this embodiment of the present application.
  • a device virtualization (device virtualization, Dv) application 200a for realizing a distributed shooting function may also be installed in the application layer 200 .
  • the Dv application 200a may run in the local device 101 as a system application.
  • the functions implemented by the Dv application 200a may also be resident in the local device 101 and run in the form of system services.
  • the framework layer provides APIs and programming frameworks for applications in the application layer.
  • both the framework layer 201 and the framework layer 211 may provide Camera API 2.0.
  • the framework layer 201 is provided with a Camera kit 201c.
  • Camera kit 201c can encapsulate a variety of camera modes. For example, the photographing mode, video recording mode and dual scene mode shown in FIG. 2E.
  • a camera service (Camera Service) 201a1 may be provided in the service layer 201a.
  • Camera applications can start Camera Service 201a1 by calling a preset API (such as Camera API 2.0).
  • the HAL 202 is provided with a Camera HAL 202a and a DMSDP HAL 202b.
  • Camera Service 201a1 can interact with Camera HAL 202a and/or DMSDP HAL 202b in HAL 202 during operation.
  • the Camera HAL 202a is responsible for interacting with the hardware device (such as a camera) that implements the shooting function in the local device 101.
  • on the one hand, the Camera HAL 202a hides the implementation details of the relevant hardware devices (such as specific image processing algorithms); on the other hand, it can provide the Android system with interfaces for calling the relevant hardware devices.
  • for the interaction process among the application layer 200, the framework layer 201, the service layer 201a, the Camera HAL 202a, the DMSDP HAL 202b, and so on, reference may be made to the description of the corresponding software modules in FIG. 2C in the above embodiment, which is not repeated in this embodiment of the present application.
  • the above-mentioned framework layer 201 may also be provided with a Dv kit (kit) 201b for realizing a distributed shooting function.
  • the Dv kit 201b can be called by the Dv application 200a provided in the application layer 200 to realize the function of discovering and connecting to the remote device 102.
  • the functions implemented by the Dv application 200a can also be resident and run in the mobile phone in the form of system services.
  • the local device 101 can discover and connect to the remote device 102 through the Dv kit 201b. After the local device 101 establishes a connection with the remote device 102, the local device 101 can create a DMSDP HAL 202b for the remote device 102 in the HAL 202 of the local device 101. Then the remote device 102 can be registered with the local device 101 .
  • the DMSDP HAL 202b shown in FIG. 2E is a HAL created by the local device 101 according to the shooting capability parameters of the remote device 102.
  • the remote device 102 may also include a Dv application and a Dv kit.
  • a Dv application and a Dv kit for the functions of the Dv application and the Dv kit in the remote device 102, reference may be made to the description of the functions of the Dv application 200a and the Dv kit 201b in the embodiments of the present application, which will not be repeated here.
  • the shooting capability parameters of the remote device 102 can also be sent to the Camera Service 201a1 for saving; that is, the current shooting capability parameters of the remote device 102 (including device capability information and algorithm capability information) are registered in the Camera Service 201a1.
  • the device capability information is used to indicate the hardware capability of the remote device 102 to capture images
  • the algorithm capability information is used to indicate the algorithm capability of the remote device 102 to perform image processing on the captured images.
  • the capability information of the remote device 102 can be registered or mapped to the Camera Service 201a1 through the DMSDP HAL 202b.
  • the "peer Meta synchronization mapping" function can also be set.
  • the peer Meta refers to a tag in the remote device 102 that is used to indicate the shooting capability parameter of the remote device 102 .
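  • A hedged sketch of how the shooting capability parameter could be represented, split into the device capability information and algorithm capability information described above; the field names are assumptions, and FIG. 12 of the application defines the actual data structure.

```java
import java.util.List;

// Illustrative container for a device's shooting capability parameter.
final class ShootingCapabilityParameter {
    // Device capability information: the hardware's ability to capture images.
    int cameraCount;
    int maxWidth;
    int maxHeight;
    int maxFrameRate;

    // Algorithm capability information: image processing algorithms the
    // device supports, e.g. "beauty", "filter", "HDR".
    List<String> supportedAlgorithms;
}
```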
  • the DMSDP HAL 202b may also be called a virtual Camera HAL or a DMSDP Camera HAL. Different from the traditional Camera HAL, the DMSDP HAL 202b does not correspond to the actual hardware device of the local device 101, but corresponds to the remote device 102. As shown in FIG. 2F , the remote device 102 (eg, a device including a camera such as a TV, a tablet computer, a mobile phone, etc.) can be used as a virtual camera of the local device 101 .
  • the local device 101 (the dashed frame shown in FIG. 2F) can send and receive data with the remote device 102 through the DMSDP HAL 202b, and the remote device 102, serving as a virtual camera of the local device 101, cooperates with the local device 101 to complete various services in distributed shooting scenarios.
  • the Dv application 200a of the local device 101 can also obtain other capability parameters of the remote device 102, such as audio capability parameters (e.g., audio playback delay, audio sampling rate, or number of sound channels) and display capability parameters (e.g., screen resolution or the codec algorithm for display data).
  • the remote device 102 can also send the relevant capability parameters to the Dv application 200a of the local device 101 .
  • the DMSDP HAL 202b not only has the image processing capability of the remote device 102, but also has the audio and display capabilities of the remote device 102, so that the remote device 102 can act as the virtual device of the local device 101 to cooperate with the local device 101 Complete various businesses in distributed scenarios.
  • the local device 101 may create a control session (Control Session) for transmitting a control stream and a data session (Data Session) for transmitting a data stream (that is, a preview stream) through the DMSDP HAL 202b.
  • in addition to creating the corresponding DMSDP HAL 202b for the remote device 102 in the HAL, the shooting capability parameters of the remote device 102 can also be sent to the Camera Service 201a1 for saving; that is, the current shooting capability of the remote device 102 is registered in the Camera Service 201a1.
  • the Camera Service 201a1 can determine the shooting strategy in the shooting process in real time according to the control commands (such as preview, zoom, video and other instructions) issued by the camera application, combined with the shooting capability of the remote device 102.
  • the Camera Service can set, in the shooting strategy, the image processing tasks that the mobile phone needs to perform and the image processing tasks that the remote device 102 needs to perform, according to the shooting capability parameters of the remote device 102.
  • the Camera Service can use the DMSDP HAL 202b to send a shooting instruction corresponding to the shooting strategy to the remote device 102 through the Control Session, and trigger the remote device 102 to perform the corresponding image processing task.
  • the Camera HAL 212a of the remote device 102 can call the camera driver in the kernel layer according to the received shooting instruction, and drive hardware devices such as the camera to respond to the shooting instruction to collect image data.
  • the camera can transmit each frame of image data collected to the Camera HAL through the camera driver at a certain frame rate.
  • after receiving each frame of image data collected by the camera, the Camera HAL 212a of the remote device 102 transmits a data stream including the RAW images collected by the camera to the upper layer.
  • the camera proxy service 210a of the application layer 210 (which may also be referred to as the camera proxy application) sends the data stream including the RAW images to the local device 101 through the DMSDP HAL 202b. For example, the camera proxy service 210a may send the data stream including the RAW images to the local device 101 through the Data Session provided by the DMSDP HAL 202b.
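  • The following sketch uses the public Android camera2 ImageFormat.RAW_SENSOR path to show how a remote device could hand unprocessed frames to a proxy that forwards them over the Data Session; sendOverDataSession() is a hypothetical placeholder for the camera proxy service's transport, not an API from the patent.

```java
import android.graphics.ImageFormat;
import android.media.Image;
import android.media.ImageReader;

public class RawForwarder {
    public static ImageReader createRawReader(int width, int height) {
        ImageReader reader = ImageReader.newInstance(
                width, height, ImageFormat.RAW_SENSOR, /* maxImages= */ 2);
        reader.setOnImageAvailableListener(r -> {
            try (Image image = r.acquireLatestImage()) {
                if (image != null) {
                    // RAW_SENSOR delivers one plane of packed Bayer data.
                    java.nio.ByteBuffer raw = image.getPlanes()[0].getBuffer();
                    sendOverDataSession(raw); // hypothetical transport call
                }
            }
        }, /* handler= */ null);
        return reader;
    }

    private static void sendOverDataSession(java.nio.ByteBuffer raw) {
        // Placeholder: in the architecture above, the camera proxy service
        // would write these bytes to the Data Session of the DMSDP HAL.
    }
}
```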
  • after the DMSDP HAL 202b of the local device 101 receives the data stream including the RAW images, it can, according to the shooting capability parameters saved in the Camera Service 201a1, call the relevant algorithms to perform post-processing on the RAW images, and obtain each corresponding frame of the shooting picture. Subsequently, the DMSDP HAL 202b can report each frame of the image-processed shooting picture to the camera application through the Camera Service 201a1, and the camera application can display each frame of the shooting picture on the display interface.
  • when the local device 101 and the remote device 102 implement the distributed shooting function, the local device 101 can, based on the above shooting strategy, perform corresponding image processing on the image data according to its own shooting capability and the shooting capability of the remote device 102, so as to achieve a better shooting effect in distributed shooting scenarios.
  • the local device 101 may also send the RAW image collected by the local device 101 to the remote device 102 .
  • the camera service 211a1 of the remote device 102 can also save the shooting capability parameters of the local device 101.
  • the remote device 102 may also call a related algorithm according to the shooting capability parameter of the local device 101 to perform image processing on the RAW image from the local device 101 to obtain each corresponding frame of shooting images.
  • the remote device 102 can call a relevant algorithm according to the shooting capability parameter of the remote device 102 to perform image processing on the RAW images collected by the remote device 102, and obtain each corresponding frame of shooting images.
  • In this case, the remote device 102 can also display the shooting picture of the local device 101 and the shooting picture of the remote device 102 at the same time.
  • the above-mentioned distributed shooting function across devices can be implemented on the local device 101 and the remote device 102 at the same time.
  • the following examples illustrate application scenarios of the methods provided by the embodiments of the present application.
  • The methods of the embodiments of the present application may be applicable to at least the following application scenarios (1) to (2).
  • Application scenario (1): a cross-device multi-view shooting scenario.
  • For example, the cross-device multi-view shooting scenario may be a cross-device dual-view shooting scenario.
  • The cross-device dual-view shooting scenario may include a cross-device dual-view photography scenario and a cross-device dual-view video recording scenario.
  • The following takes the cross-device dual-view photography scenario as an example to introduce the cross-device multi-view shooting scenario.
  • the above-mentioned preset application may be a camera application for implementing a shooting function.
  • For example, a camera application may be installed in the mobile phone 0 (ie, the local device 101).
  • The mobile phone 0 can perform cross-device multi-view shooting with the remote device 102 through the camera application.
  • As shown in FIG. 3A, after detecting that the user has opened the camera application, the mobile phone 0 can turn on its own camera to start collecting image data, and display the corresponding shooting picture in the preview frame 302 of the preview interface 301 in real time according to the collected image data.
  • the shooting picture displayed by the mobile phone 0 in the preview frame 302 may be different from the image data collected by the mobile phone 0 using the camera.
  • That is, after the mobile phone 0 obtains the image data collected by the camera, it can perform image processing tasks such as anti-shake, focus, soft focus, blur, filter, beauty, face detection, or AR recognition on the collected image data, and obtain the shooting picture after image processing.
  • the mobile phone 0 may display the image-processed shooting screen in the above-mentioned preview frame 302 .
  • Similarly, the remote device 102 can also turn on its own camera to start collecting image data, and perform image processing tasks such as anti-shake, focus, soft focus, blur, filter, beauty, face detection, or AR recognition on the collected image data to obtain a shooting picture after image processing, which is not limited in this embodiment of the present application.
  • the mobile phone 0 may set a function button 303 for cross-device multi-view shooting in the preview interface 301 of the camera application.
  • If the user wants to see the multi-view pictures shot by the mobile phone 0 and the remote device 102 on the mobile phone 0, the user can click the function button 303 to enable the cross-device multi-view shooting function.
  • the mobile phone 0 may also set the function button 304 of the synchronous shooting function in the control center, pull-down menu, negative screen, or other applications (eg, video calling application) of the mobile phone, which is not limited in this embodiment of the present application.
  • the mobile phone can display the control center 305 in response to the user's operation of opening the control center, and the control center 305 is provided with the above-mentioned function buttons 304 . If the user wishes to use the mobile phone and other electronic devices to shoot synchronously, the user can click the function button 304 .
  • In response, the mobile phone 0 can display, in the dialog box 401, one or more candidate devices currently searched by the mobile phone 0 that can collect image data.
  • the mobile phone 0 can query the cloud server for an electronic device with a photographing function that is logged into the same account (for example, a Huawei account) as the mobile phone 0 . Furthermore, the mobile phone 0 may display the queried electronic device as a candidate device in the dialog 401 .
  • Alternatively, the mobile phone 0 can search for electronic devices located in the same Wi-Fi network as the mobile phone 0. Furthermore, the mobile phone 0 can send a query request to each electronic device in the same Wi-Fi network, and each electronic device that receives the query request can send a response message to the mobile phone 0 indicating whether it has a photographing function. Then, the mobile phone 0 can determine, according to the received response messages, the electronic devices with a photographing function in the current Wi-Fi network, and display them as candidate devices in the dialog 401, as sketched below.
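  • Purely as an assumption about one possible transport for the query-request mechanism described above, the following Java sketch broadcasts a query on the local Wi-Fi network and collects replies from devices that report a photographing function; the message strings and port number are invented for the example.

    import java.net.*;
    import java.nio.charset.StandardCharsets;

    // Hypothetical discovery sketch: broadcast "HAS_CAMERA?" and list every
    // device on the local network that answers it can shoot.
    public class CameraDeviceDiscovery {
        public static void main(String[] args) throws Exception {
            try (DatagramSocket socket = new DatagramSocket()) {
                socket.setBroadcast(true);
                socket.setSoTimeout(2000); // stop listening after 2 s

                byte[] query = "HAS_CAMERA?".getBytes(StandardCharsets.UTF_8);
                socket.send(new DatagramPacket(query, query.length,
                        InetAddress.getByName("255.255.255.255"), 48888));

                byte[] buf = new byte[256];
                while (true) {
                    DatagramPacket reply = new DatagramPacket(buf, buf.length);
                    try {
                        socket.receive(reply);
                    } catch (SocketTimeoutException e) {
                        break; // no more candidate devices
                    }
                    String msg = new String(reply.getData(), 0, reply.getLength(),
                            StandardCharsets.UTF_8);
                    if (msg.startsWith("HAS_CAMERA:YES")) {
                        System.out.println("Candidate device: " + reply.getAddress());
                    }
                }
            }
        }
    }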
  • an application for managing smart home devices in the home may be installed in the mobile phone 0 .
  • the user can add one or more smart home devices in the smart home application, so that the smart home device added by the user is associated with the mobile phone 0 .
  • For example, a two-dimensional code containing device information such as a device identifier can be set on the smart home device. After the user scans the two-dimensional code with the smart home application of the mobile phone 0, the corresponding smart home device can be added to the smart home application, thereby establishing the association between the smart home device and the mobile phone 0.
  • When one or more smart home devices added in the smart home application go online, for example, when the mobile phone 0 detects the Wi-Fi signal sent by an added smart home device, the mobile phone 0 can display the smart home device in the dialog 401 as a candidate device, and prompt the user to choose whether to use the corresponding smart home device to shoot synchronously with the mobile phone 0.
  • Taking the case where the candidate devices searched by the mobile phone 0 include the TV 1, the watch 2, and the mobile phone 3 as an example, the user can select, from the TV 1, the watch 2, and the mobile phone 3, one or more remote devices 102 to perform cross-device multi-view shooting with the mobile phone 0 this time.
  • For example, if it is detected that the user selects the TV 1, the mobile phone 0 can use the TV 1 as the remote device 102 and establish a network connection with the TV 1.
  • For example, the mobile phone 0 can establish a Wi-Fi connection with the TV 1 through a router; or the mobile phone 0 can directly establish a Wi-Fi P2P connection with the TV 1; or the mobile phone 0 can directly establish a Bluetooth connection with the TV 1; or the mobile phone 0 can directly establish a short-range wireless connection with the TV 1, where the short-range wireless connection includes but is not limited to a near field communication (NFC) connection, an infrared connection, an ultra-wideband (UWB) connection, or a ZigBee connection; or the mobile phone 0 can establish a mobile network connection with the TV 1, where the mobile network includes but is not limited to mobile networks supporting 2G, 3G, 4G, 5G, and subsequent standard protocols.
  • In some other embodiments, after the mobile phone 0 detects that the user clicks the function button 303, the mobile phone 0 can search for one or more electronic devices with a photographing function according to the above method. Furthermore, the mobile phone 0 can automatically establish a network connection with a searched electronic device. In this case, the user does not need to manually select a specific device for establishing a network connection with the mobile phone 0.
  • Alternatively, the mobile phone 0 may have already established a network connection with one or more electronic devices having a photographing function. For example, before the user opens the camera application, the mobile phone 0 may have established a Bluetooth connection with a tablet computer. Subsequently, after the mobile phone 0 opens the camera application and displays the preview interface 301 of the camera application, if it is detected that the user clicks the function button 303, the mobile phone 0 does not need to search again for an electronic device with a photographing function, but can directly perform the subsequent steps with the connected device.
  • After the mobile phone 0 establishes a network connection with the TV 1, on the one hand, the mobile phone 0 can continue to use its own camera to collect image data and perform image processing on the collected image data to obtain the shooting picture 1; on the other hand, as shown in FIG. 5, the mobile phone 0 can instruct the TV 1 to turn on its own camera to start collecting image data, and perform image processing on the collected image data to obtain the shooting picture 2. Subsequently, the TV 1 can send the shooting picture 2 to the mobile phone 0. In this way, the mobile phone 0 can simultaneously display the shooting picture 1 from the mobile phone 0 and the shooting picture 2 from the TV 1 on the display interface of the camera application.
  • In the above-mentioned cross-device multi-view shooting scenario, multiple cameras can be used to achieve multi-view shooting of the same shooting object, which can make shooting more interesting.
  • multiple cameras can also be used to achieve co-shooting of multiple subjects (such as parent-child co-shooting or friends co-shooting, etc.).
  • For example, the local device 101 is the mobile phone 0 shown in FIG. 5, and the remote device 102 is the mobile phone 3 shown in FIG. 6A.
  • the mobile phone 0 or the mobile phone 3 can use the above-mentioned preset application (such as a camera application or other applications that can be used to realize photo or video shooting) to realize the combined shooting of multiple shooting objects.
  • The following takes the preset application being a camera application as an example.
  • If the user wants to see the co-photographing screen of the mobile phone 0 and the mobile phone 3 on the mobile phone 0, the user can click the function button 303 or the function button 304 to enable the cross-device multi-view co-photography function.
  • In response, the mobile phone 0 can display, in the dialog box 401, one or more candidate devices currently searched by the mobile phone 0 that can collect image data.
  • Taking the case where the candidate devices searched by the mobile phone 0 include the TV 1, the watch 2, and the mobile phone 3 as an example, the user can select, from the TV 1, the watch 2, and the mobile phone 3, the device to perform cross-device multi-view co-photography with the mobile phone 0 this time.
  • For example, if it is detected that the user selects the mobile phone 3, the mobile phone 0 can use the mobile phone 3 as the remote device 102 and establish a network connection with the mobile phone 3.
  • the mobile phone 0 may instruct the mobile phone 3 to turn on its own camera to start collecting image data, and perform image processing on the collected image data to obtain a shooting picture 3 .
  • the mobile phone 3 may send the captured image 3 to the mobile phone 0 .
  • the mobile phone 0 can simultaneously display the shooting screen 1 from the mobile phone 0 and the shooting screen 3 from the mobile phone 3 on the display interface of the camera application.
  • the mobile phone 0 can send the photographed image 1 to the mobile phone 3 .
  • the mobile phone 3 can also display the shooting screen 1 from the mobile phone 0 and the shooting screen 3 from the mobile phone 3 at the same time.
  • That is, the mobile phone 0 can act as the local device and the remote device at the same time: from the perspective of the mobile phone 0, the mobile phone 0 is the local device and the mobile phone 3 is the remote device; from the perspective of the mobile phone 3, the mobile phone 3 is the local device and the mobile phone 0 is the remote device.
  • the above-mentioned cross-device multi-view shooting scenario may be a scenario in which at least two devices perform cross-device multi-view shooting.
  • three electronic devices may also apply the method of the embodiment of the present application to realize the cross-device multi-scene shooting function.
  • For example, the mobile phone 0 shown in FIG. 6B can be used as the local device, and the mobile phone 3 and the TV 1 can be used as the remote devices. As shown in FIG. 6B, the mobile phone 0 can simultaneously display the shooting picture 1 from the mobile phone 0, the shooting picture 3 from the mobile phone 3, and the shooting picture 2 from the TV 1.
  • Application scenario (2): a scenario in which the local device 101 calls the remote device 102 to perform a video call, live broadcast, photographing, or video recording.
  • In this scenario, the local device 101 can call the camera of the remote device 102, or the camera and display screen of the remote device 102, to assist the local device 101 in making a video call, live broadcasting, taking pictures, or recording videos.
  • the application scenario (2) is introduced by taking the local device 101 calling the remote device 102 for a video call as an example.
  • the above-mentioned preset application may be a video communication application (such as a WeChat TM application) for implementing a video call.
  • the mobile phone 0 may have the above video communication application installed.
  • the mobile phone 0 can call the camera and display screen of the remote device 102 (such as the TV 1 ) during the video call with the mobile phone 4 to assist the mobile phone 0 and the mobile phone 4 in the video call.
  • the mobile phone 0 displays a video call interface 701
  • the video call interface 701 is provided with a function button 702 for calling other devices to assist the mobile phone 0 to make a video call.
  • the user wants the mobile phone 0 to call other devices to assist the mobile phone 0 to make a video call, the user can click the function button 702 .
  • In other embodiments, the mobile phone 0 can also call the camera and display screen of the remote device 102 (such as the TV 1) before requesting a video call with the mobile phone 4, to assist the mobile phone 0 in conducting the video call with the mobile phone 4.
  • In response, the mobile phone 0 may display a confirmation window 802, where the confirmation window 802 is used to request the user to confirm whether to use the large screen to assist the mobile phone in making the video call.
  • the user wants the mobile phone 0 to call other devices to assist the mobile phone 0 to make a video call, the user can click the “Yes” button in the confirmation window 802 .
  • a dialog box 401 as shown in FIG. 4 may be displayed.
  • the specific method for the mobile phone to search for candidate devices and display the dialog box 401 may refer to the detailed description in the above-mentioned embodiment, which will not be repeated here.
  • When the mobile phone 0 detects that the user selects the TV 1, the mobile phone 0 can use the TV 1 as the remote device 102 and establish a network connection with the TV 1. After the mobile phone 0 establishes a network connection with the TV 1, on the one hand, the mobile phone 0 can send the shooting picture b from the mobile phone 4 to the TV 1 for display; on the other hand, as shown in FIG. 9, the mobile phone 0 can instruct the TV 1 to turn on its own camera to start collecting image data, and perform image processing on the collected image data to obtain the shooting picture c. Subsequently, the TV 1 can send the shooting picture c to the mobile phone 0, and the mobile phone 0 transmits the shooting picture c to the mobile phone 4. In this way, the TV 1 can simultaneously display the shooting picture b from the mobile phone 4 and the shooting picture c from the TV 1; the mobile phone 4 can also simultaneously display the shooting picture c from the TV 1 and the shooting picture b from the mobile phone 4.
  • In some embodiments, when the mobile phone 0 calls the camera and display screen of the TV 1 to assist the mobile phone 0 in conducting a video call with the mobile phone 4, the mobile phone 0 can also display the shooting picture b from the mobile phone 4 and/or the shooting picture c from the TV 1.
  • the mobile phone 0 may call the camera and the display screen of the TV 1 to assist the mobile phone 0 to conduct a video call with the mobile phone 4, and the mobile phone 0 may display a preset interface.
  • the preset interface may be the main interface 901 shown in FIG. 9 .
  • the preset interface may also be a preset picture or a preset animation or the like.
  • Alternatively, when the mobile phone 0 calls the camera and display screen of the TV 1 to assist the mobile phone 0 in conducting a video call with the mobile phone 4, the screen of the mobile phone 0 may be turned off (black screen). In this way, the power consumption of the mobile phone 0 can be reduced.
  • the shooting picture of the mobile phone 0 may be different from that of the TV 1 .
  • For example, the shooting picture a shown in FIG. 7 is shot by the mobile phone 0, and the shooting picture c shown in FIG. 9 is shot by the TV 1; the shooting picture a is different from the shooting picture c.
  • the local device 101 can use and control the camera of the remote device 102, and use the camera of the remote device 102 to assist the local device 101 to complete shooting tasks, video tasks or other related tasks.
  • the remote device 102 can also use and control the camera of the local device 101, and use the camera of the local device 101 to assist the remote device 102 to complete the above tasks.
  • the mobile phone 3 can also simultaneously display the shooting screen 1 from the mobile phone 0 and the shooting screen 3 from the mobile phone 3 .
  • the local device 101 and the remote device 102 may perform shooting in a distributed shooting scenario (eg, multi-scene shooting).
  • the methods of the embodiments of the present application are described in detail by taking the local device 101 as a mobile phone as an example.
  • the local device 101 can provide a dual-view mode (including a dual-view camera mode and a dual-view video mode).
  • a function button 1001 and a function button 1002 for cross-device multi-view shooting can be set in the preview interface of the camera application on the mobile phone.
  • the function button 1001 is used to trigger the mobile phone to enter the multi-view photographing mode.
  • the function button 1002 is used to trigger the mobile phone to enter the multi-view recording mode.
  • the user can click the function button 1001 or the function button 1002 to enable the cross-device multi-scene shooting function.
  • the user's click operation on the function button 1001 , the function button 1002 , or the function button 303 is the second operation.
  • a preset application of the mobile phone may trigger the mobile phone to search for one or more nearby candidate devices with a shooting function.
  • the preset application of the mobile phone can discover (ie search) one or more candidate devices with shooting capabilities through the Dv kit.
  • the mobile phone can display one or more candidate devices found in the dialog box 1101 .
  • the mobile phone may query the server for an electronic device that is logged in to the same account as the mobile phone and has a photographing function, and displays the electronic device as a candidate device in the dialog 1101 .
  • the user's click operation on a candidate device in the dialog box 1101 and the dialog box 401 (also referred to as a candidate device list) is the first operation.
  • If the candidate devices in the dialog 1101 include the TV 1102, the TV 1103, and the watch 1104, the user can select, in the dialog 1101, the remote device that cooperates with the mobile phone to realize the synchronous shooting function this time.
  • If the mobile phone detects that the user selects the TV 1102 in the dialog 1101, it means that the user wishes to use the mobile phone and the TV 1102 to shoot synchronously.
  • the Dv kit of the mobile phone can use the TV 1102 as a remote device of the mobile phone to establish a network connection with the mobile phone.
  • the TV 1102 can be registered in the HAL of the mobile phone. Specifically, the mobile phone can obtain the shooting capability parameters of the TV 1102 from the TV 1102 based on the network connection; and create a corresponding DMSDP HAL in the HAL according to the shooting capability parameters of the TV 1102 .
  • The shooting capability parameters of the TV 1102 are used to reflect the specific shooting capability of the TV 1102.
  • the shooting capability parameter may include device capability information.
  • the device capability information is used to indicate the hardware capability of the TV 1102 to capture images.
  • the device capability information may indicate parameters such as the number of cameras in the TV 1102, the resolution of the cameras, or the model of the image processor.
  • the device capability information can be used by the mobile phone to determine the shooting strategy of the TV 1102 .
  • the above-mentioned shooting capability parameters of the TV 1102 can be stored in the Camera Service of the mobile phone.
  • the shooting capability parameters of the TV 1102 may be stored in the Camera Service of the mobile phone in the form of a tag.
  • The following takes the device capability information as an example to introduce the storage format of the shooting capability parameters of the TV 1102 in the Camera Service of the mobile phone.
  • FIG. 12 shows a schematic diagram of a format of the device capability information of the TV 1102 stored in the Camera Service of the mobile phone in the embodiment of the present application.
  • the three-layer data structure shown in Figure 12 can be used in the Camera Service of the mobile phone to store the device capability information of the TV 1102.
  • the mobile phone can store each device capability information of the TV 1102 in segments in the Camera Service.
  • Among them, the Section_name shown in FIG. 12 is the storage address of the device capability information of the TV 1102; the Tag_name refers to the multiple Tag names stored under one Section_name address; and the Tag_index is the index of the capability information of a Tag name, where the index is used to indicate the storage address of the specific capability information.
  • For example, the device capability information of the TV 1102 may include the five Section_names shown in FIG. 12: com.huawei.capture.metadata; com.huawei.device.capabilities; android.huawei.device.parameters; android.huawei.stream.info; android.huawei.stream.parameters.
  • the device capability information of the TV 1102 may further include: Tag names of multiple tags stored in the storage address corresponding to the Section_name, such as device_sensor_position, hidden_camera_id, colerBarCheckUnsupport, amoothZoomSupport, and tofType.
  • For example, device_sensor_position represents the sensor position of the camera of the TV 1102.
  • the device capability information of the TV 1102 may further include: an index of the capability information of the tag name, such as CAMERA_HUAWEI_DEVICE_CAPABILITIES_START and CAMERA_HUAWEI_DEVICE_CAPABILITIES_END.
  • CAMERA_HUAWEI_DEVICE_CAPABILITIES_START may indicate the starting address of the capability information of a Tag name stored in the TV 1102 .
  • CAMERA_HUAWEI_DEVICE_CAPABILITIES_END may indicate the end address at which the capability information of a Tag name is stored in the TV 1102. According to the starting address and the ending address, the mobile phone can query the various device capability information of the TV 1102.
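  • The three-layer Section_name / Tag_name / Tag_index structure described above can be illustrated with the following minimal Java sketch; only the Section_name and Tag_name strings are taken from the embodiment, while the CapabilityStore class and the index values are hypothetical.

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch of the three-layer Section_name / Tag_name / Tag_index
    // structure used to store a remote device's capability information.
    public class CapabilityStore {
        // Section_name -> (Tag_name -> [start index, end index])
        private final Map<String, Map<String, int[]>> sections = new HashMap<>();

        public void put(String sectionName, String tagName, int start, int end) {
            sections.computeIfAbsent(sectionName, k -> new HashMap<>())
                    .put(tagName, new int[] {start, end});
        }

        public int[] indexOf(String sectionName, String tagName) {
            Map<String, int[]> tags = sections.get(sectionName);
            return tags == null ? null : tags.get(tagName);
        }

        public static void main(String[] args) {
            CapabilityStore store = new CapabilityStore();
            // Tag name taken from the embodiment; the index values are made up.
            store.put("com.huawei.device.capabilities", "device_sensor_position",
                    /* start address, e.g. ..._CAPABILITIES_START */ 0x8000,
                    /* end address,   e.g. ..._CAPABILITIES_END   */ 0x8001);
            int[] range = store.indexOf("com.huawei.device.capabilities",
                    "device_sensor_position");
            System.out.println("capability stored at [" + range[0] + ", " + range[1] + "]");
        }
    }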
  • Subsequently, the mobile phone can use the above TAGs (that is, the above-mentioned shooting capability parameters) in combination with the post-processing (algorithm) module in the mobile phone to process the RAW images returned by the TV 1102. The specific method for the mobile phone to process the RAW images returned by the TV 1102 according to the above TAGs in combination with the post-processing (algorithm) module is described in detail in the following embodiments, and will not be repeated here.
  • Subsequently, when the mobile phone is running the camera application, on the one hand, the mobile phone can call its own camera to obtain each frame of the shooting picture; on the other hand, the mobile phone can instruct the TV 1102 to obtain RAW images, and send the obtained RAW images to the camera application of the mobile phone through the DMSDP HAL 202b.
  • the mobile phone can call its own camera to acquire each frame of shooting images according to a certain frame rate, and instruct the TV 1102 to acquire RAW images according to the same frame rate.
  • the mobile phone can send a shooting instruction to the TV 1102 through the DMSDP HAL to instruct the TV 1102 to obtain RAW images.
  • the application layer 200 of the mobile phone can transmit the shooting instruction to the DMSDP HAL 202b of the HAL 202 through the framework layer 201 and the service layer 201a; the DMSDP HAL 202b of the HAL 202 can perform step a to transmit the shooting instruction to the TV 1102.
  • After receiving the shooting instruction, the camera proxy service 210a of the application layer 210 of the TV 1102 can add a preset mark to the shooting instruction.
  • The preset mark is used to instruct the underlying Camera device to directly acquire the collected RAW data and transparently transmit the RAW data to the camera proxy service 210a of the application layer 210.
  • The camera proxy service 210a of the application layer 210 can then transmit the shooting instruction carrying the preset mark to the Camera HAL 212a in the HAL 212 through the framework layer 211 and the service layer 211a.
  • After the Camera HAL 212a in the HAL 212 receives the shooting instruction, it can call the Camera device of the kernel layer 213 to execute the shooting instruction.
  • For example, the Camera HAL 212a in the HAL 212 can call the Camera device in the kernel layer 213 to collect RAW images, and perform step b to transparently transmit the RAW data to the camera proxy service 210a of the application layer 210. Then, the camera proxy service 210a of the application layer 210 performs step c to transmit the RAW images to the DMSDP HAL 202b of the mobile phone.
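  • As an illustrative sketch of the preset mark described above, the following Java fragment models how a tagged shooting instruction could make the lower layers skip their own image processing and forward the sensor's RAW data unchanged; ShootingInstruction, handle, and ispProcess are hypothetical names, not this application's interfaces.

    // Hypothetical sketch of the "preset mark" pass-through: tagged shooting
    // instructions bypass the local image processing and the RAW data is
    // transparently transmitted as-is.
    public class RawPassthrough {

        public static class ShootingInstruction {
            final String command;
            final boolean rawPassthrough; // the preset mark described above
            ShootingInstruction(String command, boolean rawPassthrough) {
                this.command = command;
                this.rawPassthrough = rawPassthrough;
            }
        }

        // Stand-in for the Camera HAL of the remote device.
        static byte[] handle(ShootingInstruction instr, byte[] rawFromSensor) {
            if (instr.rawPassthrough) {
                return rawFromSensor;         // transparently transmit RAW data
            }
            return ispProcess(rawFromSensor); // normal local processing path
        }

        // Placeholder for the ISP pipeline that is bypassed in this mode.
        static byte[] ispProcess(byte[] raw) {
            return raw.clone();
        }

        public static void main(String[] args) {
            byte[] raw = {1, 2, 3};
            ShootingInstruction instr = new ShootingInstruction("preview", true);
            System.out.println("forwarded " + handle(instr, raw).length + " RAW bytes");
        }
    }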
  • The Camera device of the kernel layer 213 may include a photosensitive device, a digital signal processor (DSP), and an image processor (ISP), as shown in FIG. 14.
  • the photosensitive device may include a lens, a sensor, and the like. This sensor is used to capture RAW images.
  • a digital signal processor (DSP) is used to perform sampling processing on the multi-frame RAW images collected by the photosensitive device; then, the sampled RAW images are transmitted to an image processor (ISP).
  • an image processor (ISP) can perform image processing on RAW images from the DSP.
  • In this embodiment of the present application, however, when the shooting instruction carries the above preset mark, the ISP does not perform image processing on the RAW image from the DSP, but transparently transmits the RAW image from the DSP to the Camera HAL 212a in the HAL 212.
  • It should be noted that step b in FIG. 13 points directly from the kernel layer 213 to the application layer 210 only to illustrate that the TV 1102 does not perform image processing on the RAW image; it does not mean that the RAW image bypasses the Camera HAL 212a, the service layer 211a, and the framework layer 211 and is transferred directly from the kernel layer 213 to the application layer 210.
  • That is, the RAW images collected by the Camera device in the kernel layer 213 still need to be transmitted to the application layer 210 through the Camera HAL 212a in the HAL 212, the service layer 211a, and the framework layer 211; however, the Camera HAL 212a, the service layer 211a, and the framework layer 211 only transparently transmit the RAW images without performing any processing on them.
  • what the TV 1102 transmits to the mobile phone is not a processed image that can be directly displayed, but a RAW image without image processing.
  • After the TV 1102 uses the camera to capture the RAW image, it can directly transmit the RAW image to the mobile phone without performing image processing on the RAW image.
  • This RAW image is an image that has not been processed by the television 1102 .
  • the data volume of the RAW image is smaller than that of the image processed by the TV 1102 . In this way, the network bandwidth occupied by the distributed shooting service across the devices can be reduced.
  • the mobile phone can perform image processing on the RAW image from the TV 1102 to obtain a corresponding shooting picture.
  • In this way, the camera application of the mobile phone can acquire not only each frame of the shooting picture from the mobile phone, but also each frame of the shooting picture of the TV 1102.
  • the camera application can synchronously display the photographed image of the mobile phone and the photographed image of the TV 1102 on the display interface of the camera application, so as to realize a distributed photographing function across devices.
  • the preview interface of the camera application of the mobile phone includes the shooting screen 1501 of the mobile phone and the shooting screen 1502 of the TV 1102 .
  • Moreover, since the mobile phone performs the image processing on the RAW images from the TV 1102, the problem of an increased time delay caused by a large difference in the performance of the devices at the two ends can be avoided, and the time delay of the distributed shooting service can be reduced.
  • The Control Session and the Data Session may correspond to different transmission paths (also called transmission pipes).
  • For example, when the DMSDP HAL 202b of the mobile phone performs step a shown in FIG. 13, it can transmit the shooting instruction to the camera proxy service 210a of the TV 1102 through the transmission pipe of the above Control Session.
  • For another example, when the camera proxy service 210a of the TV 1102 performs step c shown in FIG. 13, it can transmit the data stream (such as the above RAW data) to the DMSDP HAL 202b of the mobile phone through the transmission pipe of the above Data Session.
  • In this way, the transmission paths of the control flow and the data flow are separated, which can avoid the problem of the control flow and the data flow competing for bandwidth with each other.
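  • Purely as an illustration of this separation, the following Java sketch models the two sessions as two independent TCP server sockets, so that instructions and image data never share one pipe; the port numbers are arbitrary and the class name is hypothetical.

    import java.io.IOException;
    import java.net.ServerSocket;

    // Hypothetical sketch: the control session and the data session use
    // separate transport channels (here, two TCP ports on the remote device).
    public class SessionChannels {
        public static void main(String[] args) throws IOException {
            try (ServerSocket controlSession = new ServerSocket(50001); // shooting instructions
                 ServerSocket dataSession = new ServerSocket(50002)) {  // RAW image stream
                System.out.println("control path on port " + controlSession.getLocalPort());
                System.out.println("data path on port " + dataSession.getLocalPort());
                // A real implementation would accept connections and pump the
                // instruction stream and the RAW stream independently.
            }
        }
    }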
  • the above-mentioned shooting capability parameters of the television 1102 may further include algorithm capability information.
  • the algorithm capability information is used to indicate the algorithm capability of the television 1102 to perform image processing on the captured image.
  • For example, the algorithm capability information may indicate one or more image processing algorithms supported by the TV 1102, such as a face recognition algorithm or an auto-focus algorithm. That is to say, the mobile phone can also synchronize the post-processing algorithm of the TV 1102 to the mobile phone.
  • In this case, the post-processing algorithm of the TV 1102 is reserved in the service layer 201a of the mobile phone.
  • the service layer 201a of the mobile phone can call the reserved post-processing algorithm of the TV 1102 based on the algorithm capability information of the TV 1102 to perform image processing on the RAW image from the DMSDP HAL 202b.
  • the mobile phone can synchronously render the image collected by the mobile phone and the post-processed image from the TV 1102 based on the time stamp to obtain and display the shooting screen 1501 of the mobile phone and the shooting screen 1502 of the TV 1102 shown in FIG. 15 .
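  • The timestamp-based synchronous rendering mentioned above can be sketched as follows: for each frame captured locally, pick the buffered remote frame whose capture timestamp is closest, and hand both to the display together. The FrameSynchronizer class below is a hypothetical illustration, not the application's implementation.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Hypothetical sketch of timestamp-based synchronous rendering: pair the
    // local frame with the remote frame whose capture timestamp is closest.
    public class FrameSynchronizer {
        record Frame(long timestampMs, String source) {}

        private final Deque<Frame> remoteFrames = new ArrayDeque<>();

        void onRemoteFrame(Frame f) {
            remoteFrames.addLast(f);
        }

        // Called for every frame captured by the local camera; the returned
        // remote frame is rendered in the second window at the same time.
        Frame matchRemote(Frame local) {
            Frame best = null;
            long bestDelta = Long.MAX_VALUE;
            for (Frame f : remoteFrames) {
                long delta = Math.abs(f.timestampMs() - local.timestampMs());
                if (delta < bestDelta) {
                    bestDelta = delta;
                    best = f;
                }
            }
            return best;
        }

        public static void main(String[] args) {
            FrameSynchronizer sync = new FrameSynchronizer();
            sync.onRemoteFrame(new Frame(100, "tv"));
            sync.onRemoteFrame(new Frame(133, "tv"));
            System.out.println(sync.matchRemote(new Frame(130, "phone")));
        }
    }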
  • In this solution, the mobile phone calls the reserved post-processing algorithm of the TV 1102 to perform image processing on the RAW images collected by the TV 1102, so the processed images can achieve the same effect as image processing performed on the TV 1102 side. That is to say, by adopting the method of this embodiment of the present application, not only can the network bandwidth occupied by the cross-device distributed shooting service be reduced, but the image effect of the remote device can also be restored.
  • In other embodiments, the above-mentioned shooting capability parameters of the TV 1102 may not include algorithm capability information. That is, the mobile phone does not need to synchronize the post-processing algorithm of the TV 1102 to the mobile phone. For example, as shown in FIG. 16, the post-processing algorithm of the TV 1102 is not reserved in the service layer 201a of the mobile phone.
  • the service layer 201a of the mobile phone can call the post-processing algorithm of the mobile phone based on the algorithm capability information of the mobile phone to perform image processing on the RAW image from the DMSDP HAL 202b. Afterwards, the mobile phone can synchronously render the image collected by the mobile phone and the post-processed image from the TV 1102 based on the time stamp to obtain and display the shooting screen 1501 of the mobile phone and the shooting screen 1502 of the TV 1102 shown in FIG. 15 .
  • In this solution, the mobile phone calls its own post-processing algorithm to perform image processing on the RAW images collected by the TV 1102. Although the image processing effect of the TV 1102 side cannot be restored, the images collected by the mobile phone and the RAW images collected by the TV 1102 are both processed by the post-processing algorithm of the mobile phone, so that the image effects of the shooting picture of the mobile phone and the shooting picture of the TV 1102 can be consistent.
  • In addition, if the algorithm processing capability of the mobile phone is better than that of the TV 1102, calling the post-processing algorithm of the mobile phone to perform image processing on the RAW images collected by the TV 1102, compared with calling the post-processing algorithm of the TV 1102, can also improve the image effect of the shooting picture of the TV 1102 displayed by the mobile phone.
  • Moreover, since the post-processing algorithm of the mobile phone is used to perform image processing on the RAW images collected by the TV 1102, the post-processing algorithm of the TV 1102 does not need to be retained in the mobile phone. In this way, the storage space of the mobile phone can be saved.
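  • The choice between the two solutions above can be summarized in a small Java sketch: use the synchronized remote algorithm when the capability parameters carried algorithm capability information, and fall back to the phone's own algorithm otherwise. The PostProcessorSelector class and its names are hypothetical.

    import java.util.Optional;
    import java.util.function.UnaryOperator;

    // Hypothetical sketch: pick the remote device's synchronized algorithm when
    // available, otherwise fall back to the phone's own algorithm so that both
    // shooting pictures get a consistent look.
    public class PostProcessorSelector {
        static UnaryOperator<byte[]> select(Optional<UnaryOperator<byte[]>> remoteAlgorithm,
                                            UnaryOperator<byte[]> localAlgorithm) {
            // Restores the remote device's image effect when available; saves
            // storage and keeps both pictures consistent when not.
            return remoteAlgorithm.orElse(localAlgorithm);
        }

        public static void main(String[] args) {
            UnaryOperator<byte[]> local = b -> b; // placeholder local algorithm
            UnaryOperator<byte[]> chosen = select(Optional.empty(), local);
            System.out.println(chosen.apply(new byte[3]).length); // 3
        }
    }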
  • In the above embodiments, the mobile phone is used as an example of the local device in the distributed shooting scenario. The local device in the distributed shooting scenario may also be another electronic device with the above shooting functions, such as a tablet computer or a TV, which is not limited in this embodiment of the present application.
  • the Android system is used as an example to illustrate the specific method for realizing the distributed shooting function among the various functional modules. It can be understood that corresponding functional modules can also be set in other operating systems to realize the above-mentioned method. As long as the functions implemented by various devices and functional modules are similar to the embodiments of the present application, they fall within the scope of the claims of the present application and their technical equivalents.
  • In the embodiments of the present application, the image data collected by the local device 101 is the first image data, the shooting picture of the local device 101 is the first shooting picture, and the first shooting picture is displayed in the first window; the image data collected by the remote device 102 is the second image data, the shooting picture of the remote device 102 is the second shooting picture, and the second shooting picture is displayed in the second window. The above-mentioned second image data includes a RAW image.
  • For example, the first shooting picture may be the shooting picture 1 shown in FIG. 5, and the second shooting picture may be the shooting picture 2 shown in FIG. 5; the shooting picture 1 and the shooting picture 2 shown in FIG. 5 are displayed on the same display interface.
  • For another example, the first shooting picture may be the shooting picture 1 shown in FIG. 6A, and the second shooting picture may be the shooting picture 3 shown in FIG. 6A; the shooting picture 1 and the shooting picture 3 shown in FIG. 6A are displayed on the same display interface on both the mobile phone 0 and the mobile phone 3.
  • the electronic device may include: the above-mentioned touch screen, a memory, and one or more processors.
  • the touch screen, memory and processor are coupled.
  • the memory is used to store computer program code comprising computer instructions.
  • When the processor executes the computer instructions, the electronic device can perform the functions or steps performed by the mobile phone in the foregoing method embodiments.
  • For the structure of the electronic device, reference may be made to the structure of the mobile phone shown in FIG. 2B.
  • An embodiment of the present application further provides a chip system. The chip system 1700 includes at least one processor 1701 and at least one interface circuit 1702.
  • the processor 1701 and the interface circuit 1702 may be interconnected by wires.
  • the interface circuit 1702 may be used to receive signals from other devices, such as the memory of an electronic device.
  • the interface circuit 1702 may be used to send signals to other devices (eg, the processor 1701).
  • For example, the interface circuit 1702 can read the instructions stored in the memory and send the instructions to the processor 1701. When the instructions are executed by the processor 1701, the electronic device can be caused to perform the various steps in the above embodiments.
  • the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
  • Embodiments of the present application further provide a computer storage medium. The computer storage medium includes computer instructions; when the computer instructions are run on the above-mentioned electronic device, the electronic device is caused to perform the functions or steps performed by the mobile phone in the above method embodiments.
  • Embodiments of the present application further provide a computer program product, which, when the computer program product runs on a computer, enables the computer to perform various functions or steps performed by the mobile phone in the above method embodiments.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are only illustrative.
  • The division of the modules or units is only a logical function division; in actual implementation, there may be other division manners. For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • The mutual coupling or direct coupling or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be in electrical, mechanical, or other forms.
  • The units described as separate components may or may not be physically separated, and the components shown as units may be one physical unit or multiple physical units, that is, they may be located in one place, or may be distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
  • Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the parts that contribute to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The present application relates to a distributed photographing method, an electronic device, and a medium, capable of reducing the network bandwidth occupied by data transmission between a remote device and a local device in the process of implementing distributed photographing. In the method, a first device may receive a first operation in which a user selects to perform synchronous shooting with a second device. In response to the first operation, the first device may start collecting first image data, and the first device may also instruct the second device to start collecting image data. Then, the first device may receive second image data from the second device, the second image data including an original image collected by a camera of the second device. Finally, the first device may display a first shooting picture corresponding to the first image data in a first window, and display a second shooting picture corresponding to the second image data in a second window. The first window and the second window are arranged in the same display interface.
PCT/CN2021/137917 2021-01-30 2021-12-14 Distributed photographing method, electronic device, and medium WO2022160985A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110131870.2A CN114845035B (zh) 2021-01-30 2021-01-30 Distributed shooting method, electronic device, and medium
CN202110131870.2 2021-01-30

Publications (1)

Publication Number Publication Date
WO2022160985A1 true WO2022160985A1 (fr) 2022-08-04

Family

ID=82561398

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/137917 WO2022160985A1 (fr) 2021-12-14 Distributed photographing method, electronic device, and medium

Country Status (2)

Country Link
CN (2) CN115514882B (fr)
WO (1) WO2022160985A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116320783A (zh) * 2022-09-14 2023-06-23 Honor Device Co., Ltd. Method for capturing an image during video recording, and electronic device
WO2024145063A1 (fr) * 2022-12-29 2024-07-04 Meta Platforms, Inc. Methods, apparatuses and computer program products for providing virtual cameras for hardware inputs

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115514898A (zh) * 2022-11-17 2022-12-23 Shenzhen Kaihong Digital Industry Development Co., Ltd. Shooting method, terminal device, and storage medium
CN117707242A (zh) * 2023-07-11 2024-03-15 Honor Device Co., Ltd. Temperature control method and related apparatus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130329124A1 (en) * 2012-06-08 2013-12-12 Canon Kabushiki Kaisha Image processing apparatus and image processing method
CN108718383A * (zh) 2018-04-24 2018-10-30 Tianjin ByteDance Technology Co., Ltd. Collaborative shooting method and apparatus, storage medium, and terminal device
CN109769087A * (zh) 2017-11-09 2019-05-17 ZTE Corporation Method and apparatus for taking a remote group photo, and mobile terminal
CN110944109A * (zh) 2018-09-21 2020-03-31 Huawei Technologies Co., Ltd. Photographing method, apparatus, and device
CN111083379A * (zh) 2019-12-31 2020-04-28 Vivo Mobile Communication (Hangzhou) Co., Ltd. Shooting method and electronic device

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5796964A (en) * 1996-01-16 1998-08-18 International Business Machines Method for modifying an existing computer bus to enhance system performance
JP2009134391A (ja) * 2007-11-29 2009-06-18 Renesas Technology Corp Stream processing device, stream processing method, and data processing system
CN101404726B (zh) * 2008-10-20 2012-05-02 Huawei Device Co., Ltd. Method, system, and apparatus for controlling a remote camera
CN103336677B (zh) * 2013-06-25 2016-03-30 Xiaomi Technology Co., Ltd. Method, apparatus, and system for outputting an image to a display device
WO2016019495A1 (fr) * 2014-08-04 2016-02-11 Huawei Technologies Co., Ltd. Terminal, server, and terminal control method
CN104284234B (zh) * 2014-10-17 2017-12-12 Huizhou TCL Mobile Communication Co., Ltd. Method and system for sharing synchronized images among multiple terminals
CN105516423A (zh) * 2015-12-24 2016-04-20 Nubia Technology Co., Ltd. Mobile terminal, data transmission system, and mobile terminal shooting method
KR102482860B1 (ko) * 2018-01-02 2022-12-30 Samsung Electronics Co., Ltd. Image processing method based on context information and electronic device using the same
CN110139109B (zh) * 2018-02-08 2023-01-10 Beijing Samsung Telecommunication Technology Research Co., Ltd. Image encoding method and corresponding terminal
KR102495763B1 (ko) * 2018-02-23 2023-02-06 Samsung Electronics Co., Ltd. Electronic device and method for correcting, in an external electronic device using a second image processing scheme, an image corrected with a first image processing scheme
CN110224804A (zh) * 2018-03-01 2019-09-10 Nations Technologies Inc. Data transmission control method, terminal, base station, and computer storage medium
US11917335B2 (en) * 2018-06-25 2024-02-27 Sony Corporation Image processing device, movable device, method, and program
CN109361869B (zh) * 2018-11-28 2021-04-06 Vivo Mobile Communication (Hangzhou) Co., Ltd. Shooting method and terminal
CN110602805B (zh) * 2019-09-30 2021-06-15 Lenovo (Beijing) Co., Ltd. Information processing method, first electronic device, and computer system
CN111860530B (zh) * 2020-07-31 2024-05-28 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Electronic device, data processing method, and related apparatus
CN112004076B (zh) * 2020-08-18 2022-08-16 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Data processing method, control terminal, AR terminal, AR system, and storage medium
CN111988528B (zh) * 2020-08-31 2022-06-24 Beijing ByteDance Network Technology Co., Ltd. Shooting method and apparatus, electronic device, and computer-readable storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130329124A1 (en) * 2012-06-08 2013-12-12 Canon Kabushiki Kaisha Image processing apparatus and image processing method
CN109769087A * (zh) 2017-11-09 2019-05-17 ZTE Corporation Method and apparatus for taking a remote group photo, and mobile terminal
CN108718383A * (zh) 2018-04-24 2018-10-30 Tianjin ByteDance Technology Co., Ltd. Collaborative shooting method and apparatus, storage medium, and terminal device
CN110944109A * (zh) 2018-09-21 2020-03-31 Huawei Technologies Co., Ltd. Photographing method, apparatus, and device
CN111083379A * (zh) 2019-12-31 2020-04-28 Vivo Mobile Communication (Hangzhou) Co., Ltd. Shooting method and electronic device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116320783A (zh) * 2022-09-14 2023-06-23 Honor Device Co., Ltd. Method for capturing an image during video recording, and electronic device
CN116320783B (zh) * 2022-09-14 2023-11-14 Honor Device Co., Ltd. Method for capturing an image during video recording, and electronic device
WO2024145063A1 (fr) * 2022-12-29 2024-07-04 Meta Platforms, Inc. Methods, apparatuses and computer program products for providing virtual cameras for hardware inputs

Also Published As

Publication number Publication date
CN114845035B (zh) 2024-04-26
CN115514882A (zh) 2022-12-23
CN115514882B (zh) 2024-06-14
CN114845035A (zh) 2022-08-02

Similar Documents

Publication Publication Date Title
WO2020233553A1 (fr) Photographing method and terminal
WO2021052147A1 (fr) Data transmission method and related devices
WO2022160985A1 (fr) Distributed photographing method, electronic device, and medium
WO2022257977A1 (fr) Screen projection method for electronic device, and electronic device
CN112398855B (zh) Method and apparatus for transferring application content across devices, and electronic device
WO2022121775A1 (fr) Screen projection method and device
WO2022143077A1 (fr) Photographing method, system, and electronic device
WO2022143883A1 (fr) Photographing method and system, and electronic device
WO2022105803A1 (fr) Camera calling method and system, and electronic device
CN112130788A (zh) Content sharing method and apparatus
WO2022222773A1 (fr) Image capture method, and related apparatus and system
CN115514883B (zh) Cross-device collaborative shooting method, related apparatus, and system
US11895713B2 (en) Data sharing and instruction operation control method and system
WO2022156721A1 (fr) Photographing method and electronic device
CN114201130A (zh) Screen projection method and apparatus, and storage medium
WO2023231697A1 (fr) Photographing method and related device
CN114466131B (zh) Cross-device shooting method and related device
WO2021204103A1 (fr) Image preview method, electronic device, and storage medium
WO2022161058A1 (fr) Photographing method for panoramic image, and electronic device
WO2022111701A1 (fr) Screen projection method and system
WO2024152905A1 (fr) Device interaction method and electronic device
CN117082295B (zh) Image stream processing method, device, and storage medium
WO2022206769A1 (fr) Content combination method, electronic device, and system
WO2023143171A1 (fr) Audio collection method and electronic device
WO2024159925A1 (fr) Screen mirroring method and system, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21922575

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21922575

Country of ref document: EP

Kind code of ref document: A1