WO2022121751A1 - Camera control method and apparatus, and storage medium - Google Patents

Camera control method and apparatus, and storage medium

Info

Publication number
WO2022121751A1
WO2022121751A1 · PCT/CN2021/134826 · CN2021134826W
Authority
WO
WIPO (PCT)
Prior art keywords
operation command
terminal device
camera
status information
image data
Prior art date
Application number
PCT/CN2021/134826
Other languages
English (en)
French (fr)
Inventor
史豪君
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to US18/256,504 priority Critical patent/US20240056673A1/en
Priority to EP21902458.5A priority patent/EP4246940A4/en
Publication of WO2022121751A1 publication Critical patent/WO2022121751A1/zh

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04M — TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026 Details of the structure or mounting of specific components
    • H04M1/0264 Details of the structure or mounting of specific components for a camera module assembly
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/72412 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • H04M1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/52 Details of telephonic subscriber devices including functional features of a camera
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635 Region indicators; Field of view indicators
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming

Definitions

  • The present application relates to the field of electronic equipment, and in particular to a camera control method, apparatus, and storage medium.
  • In a first aspect, an embodiment of the present application provides a camera control method for use in a first terminal device. The method includes: receiving image data from a second terminal device, the image data being collected by the second terminal device during a shooting process; determining an operation command and status information, where the operation command is an operation command for the shooting process of the second terminal device and the status information indicates the execution status of the operation command by the second terminal device; displaying a screen according to the image data; and displaying the execution status of the operation command on the screen according to the operation command and the status information.
  • In this way, the first terminal device receives the image data from the second terminal device, determines the operation command and status information, displays the screen according to the image data, and displays the execution status of the operation command on the screen according to the operation command and status information. The first terminal device thus stays image-synchronized with the second terminal device, and operation commands and status information are synchronized between them, so that both ends can coordinately control the camera. The user can also more clearly understand the current status of the distributed cameras, which improves control accuracy and the user experience.
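  The first-terminal flow just described (receive a frame, exchange the operation command and its status, render both) can be sketched as follows. All class, method, and message names here are illustrative assumptions, not from the patent; the transport `link` is a stand-in for whatever connection the devices use.

```python
from dataclasses import dataclass

@dataclass
class OperationCommand:
    op_type: str       # e.g. "focus", "zoom"
    frame_number: int  # preview frame the command applies to
    params: dict

@dataclass
class StatusInfo:
    command: OperationCommand
    state: str  # e.g. "executing", "done", "cancelled"

class FirstTerminal:
    """Controller side: mirrors the remote preview and its command status."""

    def __init__(self, link):
        self.link = link  # transport to the second terminal (assumed)

    def send_command(self, command: OperationCommand) -> StatusInfo:
        # Forward a locally triggered command; the remote device
        # executes it and returns its execution status.
        self.link.send(command)
        return self.link.receive_status()

    def render(self, image_data: bytes, status: StatusInfo) -> str:
        # Display the preview frame and overlay the execution state
        # of the operation command on top of it.
        return f"frame[{len(image_data)} bytes] + {status.command.op_type}:{status.state}"
```

  The key point of the claim is visible in `render`: the screen is driven by the image data, while the overlay is driven by the command and its status, so both stay synchronized with the second terminal device.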
  • In a possible implementation, the operation command includes a first operation command generated in response to an operation on the first terminal device, and determining the operation command and status information includes: sending the first operation command to the second terminal device; and receiving status information, sent by the second terminal device, that indicates the execution state of the first operation command.
  • In this way, when the operation command is triggered by the first terminal device, camera operation commands and status information are synchronized between the first terminal device and the second terminal device, so the first terminal device can control the camera while the results and status are shared. In the distributed camera scenario, camera control is therefore more flexible, faster, and more accurate.
  • In a possible implementation, the operation command includes a second operation command generated in response to an operation on the second terminal device, and determining the operation command and status information includes: receiving the second operation command and the status information sent by the second terminal device.
  • By receiving the second operation command and the status information sent by the second terminal device, the operation command and status information can be synchronized to the first terminal device when the operation command is triggered by the second terminal device. The user of the first terminal device can thus know the current state of the camera in real time and operate the camera on that basis, realizing the intercommunication of multi-side information.
  • In a possible implementation, determining an operation command and status information includes: receiving status information sent by the second terminal device, where the status information indicates the execution state of a target operation command that the second terminal device determined from multiple operation commands. The multiple operation commands are commands of the same operation type generated in response to operations on the second terminal device or on one or more first terminal devices, and the target operation command is the operation command with the largest corresponding frame number among them.
  • In this way, the latest operation command can be responded to, and either party may cancel a command being executed, so that the user's intention is correctly identified and the corresponding operation command selected. Responses are therefore more flexible and quicker when multiple parties collaborate, and commands from multiple sides can cancel and update each other, improving the user experience.
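  The concurrency rule above (among concurrent commands of one operation type, the command bound to the largest frame number supersedes the others) can be sketched as follows. This is an illustrative sketch, not code from the patent; the `Command` record and field names are assumptions.

```python
from collections import namedtuple

# Illustrative command record: an operation type plus the preview
# frame number at which the command was issued, and its origin device.
Command = namedtuple("Command", ["op_type", "frame_number", "source"])

def select_target_command(commands):
    """Resolve concurrent same-type commands: the command with the
    largest frame number (the most recent user intent) wins; the
    commands it supersedes are treated as cancelled."""
    assert commands and len({c.op_type for c in commands}) == 1
    target = max(commands, key=lambda c: c.frame_number)
    cancelled = [c for c in commands if c is not target]
    return target, cancelled
```

  For example, if a zoom command issued at frame 100 on one device races with a zoom command issued at frame 103 on another, the frame-103 command is executed and the frame-100 command is dropped.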
  • In a possible implementation, the operation type of the operation command includes focusing, zooming, turning the flash on or off, adjusting exposure, applying filters, skin beautification, and skin smoothing.
  • In this way, the operations that can be performed on images on the first terminal device are enriched, and through the interface display between the first terminal device and the second terminal device, a distributed camera control scenario is realized, improving the user experience.
  • In a second aspect, an embodiment of the present application provides a camera control method for use in a second terminal device. The method includes: sending image data to the first terminal device, the image data being collected by the second terminal device during a shooting process; determining an operation command and status information, where the operation command is an operation command for the shooting process of the second terminal device and the status information indicates the execution state of the operation command by the second terminal device; and sending the status information to the first terminal device.
  • In this way, the second terminal device sends the image data to the first terminal device, determines the operation command and status information, and sends the status information to the first terminal device, so that the images, operation commands, and status information of the multi-side devices are synchronized and intercommunicated. The opposite end can then also control the camera collaboratively, realizing distributed camera control.
  • In a possible implementation, the operation command includes a first operation command generated in response to an operation on the first terminal device, and determining the operation command and status information includes: receiving the first operation command sent by the first terminal device, executing the first operation command, and obtaining status information indicating the execution state of the first operation command.
  • In this way, when the operation command is triggered by the first terminal device, the operation command is synchronized and executed, realizing control of the camera by the multi-side devices. The current state is readily available for the user's subsequent operations, making control more accurate and improving the user experience.
  • In a possible implementation, determining the operation command and status information includes: in response to an operation on the second terminal device, generating a second operation command, executing the second operation command, and obtaining status information indicating the execution state of the second operation command.
  • In this way, when the operation command is triggered by the second terminal device, the operation command is executed and the corresponding status information is obtained, so that it can be synchronized to the multi-side devices, enabling them to coordinately control the camera.
  • In a possible implementation, determining an operation command and status information includes: determining a target operation command from a plurality of operation commands, the plurality of operation commands being commands of the same operation type generated in response to operations on the second terminal device or on one or more first terminal devices, and the target operation command being the operation command with the largest corresponding frame number among the plurality of operation commands; and executing the target operation command to obtain status information representing its execution state.
  • In this way, through a concurrent response strategy the second terminal device executes the latest operation command while allowing the opposite side to cancel a command being executed, so that the user's intention is correctly identified and the corresponding operation command selected, making responses more flexible and rapid when multiple parties cooperate.
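  On the camera side, the second-aspect steps (resolve concurrent commands, execute the winner, then push the resulting status to every connected controller) could look like the minimal sketch below. The class, the dict-based message shape, and the broadcast mechanism are assumptions for illustration, not the patent's implementation.

```python
class SecondTerminal:
    """Camera side: executes operation commands and broadcasts status."""

    def __init__(self):
        self.controllers = []   # inboxes of connected first terminal devices
        self.pending = []       # concurrently received same-type commands

    def submit(self, command):
        # Commands may arrive from the local UI or from any controller.
        self.pending.append(command)

    def step(self):
        if not self.pending:
            return None
        # Concurrency resolution: the largest frame number wins.
        target = max(self.pending, key=lambda c: c["frame_number"])
        self.pending.clear()  # superseded commands are cancelled
        status = {"command": target, "state": "done"}  # execute, then report
        for inbox in self.controllers:
            inbox.append(status)  # synchronize status to each controller
        return status
```

  A controller that submitted the superseded command still receives the status of the winning command, which is how both ends stay in a consistent state.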
  • In a possible implementation, the operation type of the operation command includes focusing, zooming, turning the flash on or off, adjusting exposure, applying filters, skin beautification, and skin smoothing.
  • In this way, the operations that can be performed on images on the second terminal device are enriched, and the interface display between the first terminal device and the second terminal device realizes a distributed camera control scenario, improving the user experience.
  • An embodiment of the present application further provides a camera control apparatus for use in a first terminal device. The apparatus includes: an image data receiving module, configured to receive image data from a second terminal device, the image data being collected by the second terminal device during a shooting process; a first information determination module, configured to determine an operation command and status information, where the operation command is an operation command for the shooting process of the second terminal device and the status information indicates the execution status of the operation command by the second terminal device; a screen display module, configured to display a screen according to the image data; and an execution status display module, configured to display the execution status of the operation command on the screen according to the operation command and status information.
  • In a possible implementation, the operation command includes a first operation command generated in response to an operation on the first terminal device, and the first information determination module includes: a first operation command sending sub-module, configured to send the first operation command to the second terminal device; and a first information receiving sub-module, configured to receive status information, sent by the second terminal device, indicating the execution state of the first operation command.
  • In a possible implementation, the operation command includes a second operation command generated in response to an operation on the second terminal device, and the first information determination module includes: a second information receiving sub-module, configured to receive the second operation command and the status information sent by the second terminal device.
  • In a possible implementation, the first information determination module includes: a status information receiving sub-module, configured to receive status information sent by the second terminal device, where the status information indicates the execution state of a target operation command determined by the second terminal device from a plurality of operation commands generated in response to operations on the second terminal device or on one or more first terminal devices, the target operation command being the operation command with the largest corresponding frame number among the plurality of operation commands.
  • In a possible implementation, the operation type of the operation command includes focusing, zooming, turning the flash on or off, adjusting exposure, applying filters, skin beautification, and skin smoothing.
  • An embodiment of the present application further provides a camera control apparatus for use in a second terminal device. The apparatus includes: an image data sending module, configured to send image data to the first terminal device, the image data being collected by the second terminal device during a shooting process; a second information determination module, configured to determine an operation command and status information, where the operation command is an operation command for the shooting process of the second terminal device and the status information indicates the execution state of the operation command by the second terminal device; and a status information sending module, configured to send the status information to the first terminal device.
  • In a possible implementation, the operation command includes a first operation command generated in response to an operation on the first terminal device, and the second information determination module includes: an operation command receiving sub-module, configured to receive the first operation command sent by the first terminal device; and a first operation command execution sub-module, configured to execute the first operation command and obtain status information indicating the execution state of the first operation command.
  • In a possible implementation, the second information determination module includes: an operation command generation sub-module, configured to generate a second operation command in response to an operation on the second terminal device; and a second operation command execution sub-module, configured to execute the second operation command to obtain status information representing the execution state of the second operation command.
  • In a possible implementation, the second information determination module includes: a target operation command determination sub-module, configured to determine a target operation command from a plurality of operation commands, the plurality of operation commands being commands of the same operation type generated in response to operations on the second terminal device or on one or more first terminal devices, and the target operation command being the operation command with the largest corresponding frame number among the plurality of operation commands; and a target operation command execution sub-module, configured to execute the target operation command to obtain status information representing its execution state.
  • In a possible implementation, the operation type of the operation command includes focusing, zooming, turning the flash on or off, adjusting exposure, applying filters, skin beautification, and skin smoothing.
  • Embodiments of the present application further provide a camera control apparatus, the apparatus comprising: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to implement the camera control method of the first aspect or one or more of its possible implementations, or of the second aspect or one or more of its possible implementations.
  • Embodiments of the present application further provide a non-volatile computer-readable storage medium on which computer program instructions are stored. When the computer program instructions are executed by a processor, the camera control method of the first aspect or one or more of its possible implementations, or of the second aspect or one or more of its possible implementations, is implemented.
  • An embodiment of the present application further provides a terminal device, which can execute the camera control method of the first aspect or one or more of its possible implementations, or of the second aspect or one or more of its possible implementations.
  • Embodiments of the present application further provide a computer program product comprising computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code. When the computer-readable code runs in an electronic device, the processor in the electronic device executes the camera control method of the first aspect or one or more of its possible implementations, or of the second aspect or one or more of its possible implementations.
  • FIG. 1 shows a schematic diagram of an application scenario according to an embodiment of the present application.
  • FIG. 2 shows a flowchart of a camera control method according to an embodiment of the present application.
  • FIG. 3 shows a schematic interface diagram of a camera control method according to an embodiment of the present application.
  • FIG. 4 shows a schematic interface diagram of a camera control method according to an embodiment of the present application.
  • FIG. 5 shows a schematic interface diagram of a camera control method according to an embodiment of the present application.
  • FIG. 6 shows a schematic interface diagram of a camera application according to an embodiment of the present application.
  • FIG. 7 shows a schematic interface diagram of a camera application according to an embodiment of the present application.
  • FIG. 8 shows a flowchart of a camera control method according to an embodiment of the present application.
  • FIG. 9 shows a flowchart of a camera control method according to an embodiment of the present application.
  • FIG. 10 shows a flowchart of a camera control method according to an embodiment of the present application.
  • FIG. 11 shows a flowchart of a camera control method according to an embodiment of the present application.
  • FIG. 12 shows a flowchart of a camera control method according to an embodiment of the present application.
  • FIG. 13 shows a flowchart of a camera control method according to an embodiment of the present application.
  • FIG. 14 is a structural diagram of a camera control apparatus according to an embodiment of the present application.
  • FIG. 15 is a structural diagram of a camera control apparatus according to an embodiment of the present application.
  • FIG. 16 shows a schematic structural diagram of a terminal device according to an embodiment of the present application.
  • FIG. 17 shows a block diagram of a software structure of a terminal device according to an embodiment of the present application.
  • Camera control methods in the prior art are often unilateral. In one method, the local device invokes the camera application of the remote device, but the camera cannot be remotely controlled: operation commands can only be triggered on the remote device. The problem is that operation commands can only be triggered on a single-side device, which cannot support coordinated control of distributed cameras.
  • In another prior-art method, the local device shares the preview data of the remote device after invoking the remote device's camera application, and operation commands are relayed through a proxy to the camera system. This approach is in fact exclusive control rather than real control of the remote device's camera. The problem is that the state information produced after the remote device executes a command cannot be synchronized back to the local device, and information is not interoperable between the two, so collaborative control cannot achieve precise synchronization, causing inconvenience to users.
  • In view of this, an embodiment of the present application provides a camera control method.
  • The camera control method of the embodiments of the present application realizes intercommunication between a first terminal device and a second terminal device. The first terminal device receives image data from the second terminal device, determines the operation command and status information, and can display the screen together with the execution status of the operation command, realizing synchronization and intercommunication of information. At the same time, the first terminal device can also send operation commands to the second terminal device to control it.
  • The embodiments of the present application can support camera control of the same second terminal device by multiple first terminal devices, truly achieving shared control of the camera. The devices can perceive each other, the results of executed commands can be shared between devices, and concurrent trigger commands from each device are supported. Through the coordination mechanism, the user's intention can be correctly understood, and concurrent commands between devices can cancel and update each other, for a convenient user experience.
  • FIG. 1 shows a schematic diagram of an application scenario according to an embodiment of the present application.
  • the camera control method provided by the embodiment of the present application may be applied in a live broadcast scenario including a first terminal device and a second terminal device.
  • the live broadcast scenarios may include e-commerce live broadcasts, education live broadcasts, etc.
  • The first terminal device and the second terminal device may be devices with wireless connection functions. They may be connected to each other through short-range wireless methods such as Bluetooth, or through long-distance wireless methods such as General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), digital radio, spread-spectrum microwave, satellite communication, or mobile communication. The above terminal devices may also have a wired connection function for communication.
  • The first terminal device and the second terminal device in this embodiment of the present application may be terminal devices of the same type or of different types, and may have a touch screen, a non-touch screen, or no screen at all. A touch-screen device can be controlled by clicking, sliding, and similar gestures; a device without a screen can be, for example, a Bluetooth speaker without a screen.
  • The terminal device of the present application may be a smart phone, a netbook, a tablet computer, a notebook computer, a television (TV), or a virtual reality device.
  • The present application does not limit the type of live broadcast or the type of terminal equipment, and the embodiments of the present application can also be applied to application scenarios other than live broadcast.
  • the camera control method provided by the embodiment of the present application can be applied in a live broadcast scene.
  • For example, the host can use the camera of a remote main control device to shoot panoramic pictures of the live broadcast scene, and can also use another remote auxiliary device to take close-up pictures of items in the scene, such as close-ups of goods for sale. The user can then see both the panoramic picture of the host during the live broadcast and the close-up picture of the goods for sale.
  • The remote main control device can be connected to one or more local devices, and the remote auxiliary device can also be connected to one or more local devices. Through the camera control method of the embodiments of the present application, the camera commands and status information are synchronized among the multiple devices. On one or more local devices, the staff can help the host control the panoramic picture captured by the camera of the remote main control device, and the host and the staff can jointly operate the panoramic picture. Likewise, on one or more local devices, the staff can help the host control the close-up picture of the item captured by the camera of the remote auxiliary device, and the host and the staff can also jointly operate the close-up picture.
  • For example, the staff can click on a certain part of the shared panoramic image on a local device connected to the remote main control device to trigger a focus operation. The operation command is synchronized to the remote main control device, and the execution status can be displayed on the remote main control device and on each local device connected to it, realizing coordinated control between the devices. The same applies to the remote auxiliary device and the local devices connected to it.
  • The camera control method provided by the embodiments of the present application makes it possible to correctly understand the operation intention of the user of each device performing a coordinated operation, so that control of the connected cameras in this application scenario is more flexible and convenient.
  • FIG. 2 shows a flowchart of a camera control method according to an embodiment of the present application.
  • the camera control system may include a first terminal device and a second terminal device.
  • the first terminal device may include a camera application and a camera system that are connected to each other
  • the second terminal device may also include a camera application and a camera system that are connected to each other.
  • the camera application can be used to display images corresponding to image data on the first terminal device and the second terminal device, and can also be used to generate corresponding operation commands according to user operations
• the camera system can be used to store relevant information of image frames as well as data such as camera attributes and parameters, and can also be used to process image data in the first terminal device and the second terminal device according to operation commands.
• the numbers of first terminal devices and second terminal devices are not limited.
  • the division, function, and communication mode of each module of the first terminal device and the second terminal device shown in FIG. 2 are only examples, which are not limited in this embodiment of the present application.
  • the embodiments of the present application do not limit the operation types of the operation commands.
  • the following uses application scenarios of focus operations and zoom operations as examples for description. Those skilled in the art should understand that the embodiments of the present application are not limited to such application scenarios.
• user A can use the second terminal device to shoot (for example, take photos or record videos), and user B can synchronously see user A's preview screen during the shooting process through the first terminal device and perform a shooting operation on the displayed screen. The operation command of the shooting operation is sent to the second terminal device, which executes it and returns status information on the execution state of the operation command to the first terminal device, so that the first terminal device can display the execution state as well. The first terminal device can thus synchronously reflect the picture and state during the shooting process of the second terminal device, and user B can precisely control the second terminal device.
  • the flow of the camera control method according to an embodiment of the present application includes:
  • Step S101 the camera application of the first terminal device sends an operation command to the camera system of the first terminal device.
  • the camera system of the first terminal device can be connected with the camera system of the second terminal device to send and receive image data, operation commands and execution status information to and from each other.
  • the operation type of the operation command may include one or more of focusing, zooming, turning on or off the flash, adjusting the exposure, using a filter, beautifying the skin, and smoothing the skin.
  • the operation type of the operation command is not limited here.
• the corresponding focus operation command may include the operation type (focus operation), the operation area (the clicked area) and the operation mode (manual focus), and may also include the frame number of the image corresponding to the operation command; the camera application of the first terminal device can issue the focus operation command to the camera system of the first terminal device.
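Such an operation command could be modeled as a small record carrying the operation type, area, mode and frame number; the field names below are illustrative assumptions, not the patent's:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class OperationCommand:
    """Sketch of an operation command as described above (field names are illustrative)."""
    op_type: str                             # e.g. "focus", "zoom", "flash"
    frame_number: int                        # frame number of the image the operation refers to
    area: Optional[Tuple[int, int]] = None   # e.g. the clicked point for manual focus
    mode: Optional[str] = None               # e.g. "manual_focus"

# A manual-focus command triggered by clicking at point (120, 340) on frame 25:
focus_cmd = OperationCommand(op_type="focus", frame_number=25,
                             area=(120, 340), mode="manual_focus")
```

Carrying the frame number inside the command is what later allows the receiving camera system to order concurrent commands.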
  • Step S102 the camera system of the first terminal device sends the operation command to the camera system of the second terminal device, and the operation command is executed in the camera system of the second terminal device.
• the camera system of the second terminal device executes the operation command, which may include controlling the camera according to the operation command, generating image data, and sending the image data to the camera application of the second terminal device; it may also generate execution status information according to the execution of the operation command.
• if the camera system of the second terminal device receives two operation commands of the same operation type within a predetermined time interval, it can compare the frame numbers of the images corresponding to the two operation commands, execute the operation command corresponding to the larger frame number, and cancel the operation command corresponding to the smaller frame number, so as to correctly understand the user's operation intention and give priority to responding to the user's newer operation command.
  • the predetermined time interval may be determined according to the current number of frames per second of the camera system of the second terminal device. For example, in the case of 30 frames per second, the predetermined time interval may be 100/3 milliseconds.
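As a hedged sketch, the predetermined interval could be derived from the current frame rate as the duration of one frame, matching the 30 fps → 100/3 ms example above (the function name is our own assumption):

```python
def predetermined_interval_ms(frames_per_second: int) -> float:
    """Duration of one frame in milliseconds, used as the window within which
    same-type operation commands are treated as concurrent."""
    return 1000.0 / frames_per_second

# At 30 frames per second the window is 100/3 milliseconds, as in the example above.
interval = predetermined_interval_ms(30)
```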
  • the frame number represents a count of image frames.
• the camera system of the second terminal device can execute the focus command, modify the relevant parameters, generate image data, and display the corresponding picture in the camera application of the second terminal device.
  • Step S103 the camera system of the second terminal device sends the operation command to the camera application of the second terminal device.
• the camera application of the second terminal device may display a screen corresponding to the image data on the second terminal device, and after it receives the operation command, it may display on the second terminal device information about the operation command being executed.
• the camera application of the second terminal device can display the relevant information of the current focus command at the corresponding position (for example, highlight the target operation icon, display the manual focus target location, etc.), so that the user using the second terminal device can synchronously know which operation command is currently being executed.
  • Step S301 the camera system of the second terminal device sends execution status information to the camera system of the first terminal device.
  • the execution state information may indicate the execution state corresponding to the operation command.
• during execution of the focus command, the camera system of the second terminal device may successively generate a focus started state, a focusing in progress state, and a focus completed state.
  • the camera system of the second terminal device may respectively send the state information corresponding to the three states to the camera system of the first terminal device.
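The three successive focus states could be reported to the first terminal device as a sequence of status messages; a minimal sketch with assumed state names:

```python
FOCUS_STATES = ("focus_started", "focusing", "focus_completed")

def run_focus(send):
    """Emit the execution states of a focus command, one by one, via the given
    send callback, as the second terminal device's camera system would toward
    the first terminal device's camera system."""
    for state in FOCUS_STATES:
        send({"op_type": "focus", "state": state})

# Collect the messages a peer would receive:
received = []
run_focus(received.append)
```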
• if the camera system of the second terminal device receives multiple operation commands of the same operation type within a predetermined time interval, it determines to execute the operation command corresponding to the larger frame number; for example, it determines to execute the operation command triggered by the camera application of the second terminal device and cancels the operation command triggered by the camera application of the first terminal device.
• in that case, when the camera system of the second terminal device sends the execution status information to the camera system of the first terminal device, what is sent is the status information of the execution state corresponding to the operation command triggered by the camera application of the second terminal device.
  • Step S302 the camera system of the second terminal device sends the execution state information to the camera application of the second terminal device.
• the execution status information corresponds to the operation command actually executed by the camera system of the second terminal device. After the camera system of the second terminal device sends the execution status information to the camera application of the second terminal device, the execution status information corresponding to the actually executed operation command is synchronized on that camera application.
• for example, after the camera application of the second terminal device receives the status information of the double-zoom completed state, it can display the status information on the camera application of the second terminal device.
  • Step S303 the camera system of the first terminal device sends the execution state information to the camera application of the first terminal device.
• the execution status information corresponds to the operation command actually executed by the camera system of the second terminal device. After the camera system of the first terminal device sends the execution status information to the camera application of the first terminal device, the execution status information corresponding to the actually executed operation command is synchronized on that camera application.
• for example, after the camera system of the second terminal device generates the double-zoom completed state in the process of executing the zoom command, and the camera application of the first terminal device receives the status information of that state, it can display the status information on the camera application of the first terminal device.
• user A can use the second terminal device to shoot (for example, take photos or record videos), and user B can synchronously see user A's preview screen during the shooting process through the first terminal device.
  • the operation command of the shooting operation will be synchronized to the first terminal device.
  • the state information of the execution state of the operation command can also be returned.
• the execution state is also displayed on the first terminal device, so that the first terminal device can synchronously reflect the picture and state during the shooting process of the second terminal device, and user A and user B can simultaneously and accurately control the shooting.
  • the flow of the camera control method according to an embodiment of the present application includes:
  • Step S201 the camera application of the second terminal device sends an operation command to the camera system of the second terminal device.
  • the user can click on the zoom icon of the camera application interface of the second terminal device to zoom in by two times, which can trigger the generation of a zoom operation command.
  • the corresponding operation command may include the operation type (zoom operation) and zoom factor (zoom in two times), and may also include the frame number of the image corresponding to the operation command.
• the camera application of the second terminal device can display the relevant information of the currently executed operation command (such as the zoom icon and the identification of the zoom factor), and can deliver the zoom operation command to the camera system of the second terminal device.
  • Step S202 Execute the operation command in the camera system of the second terminal device, and send the operation command to the camera system of the first terminal device.
• if the camera system of the second terminal device receives two operation commands of the same operation type within a predetermined time interval, it can compare the frame numbers of the images corresponding to the two operation commands, execute the operation command corresponding to the larger frame number, and cancel the operation command corresponding to the smaller frame number, so as to correctly understand the user's operation intention and give priority to responding to the user's newer operation command.
  • the camera system of the second terminal device can execute the zoom command, modify the relevant parameters, generate image data, and display the corresponding screen in the camera application of the second terminal device.
• the camera system of the second terminal device can send the operation command to the camera system of the first terminal device.
  • Step S203 the camera system of the first terminal device sends the operation command to the camera application of the first terminal device.
• the camera application of the first terminal device can display the relevant information of the current zoom command at the corresponding position, so that the user of the first terminal device can synchronously know the currently executing operation command.
  • Step S301 the camera system of the second terminal device sends execution status information to the camera system of the first terminal device.
  • Step S302 the camera system of the second terminal device sends the execution state information to the camera application of the second terminal device.
  • Step S303 the camera system of the first terminal device sends the execution state information to the camera application of the first terminal device.
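The command and status propagation in steps S201–S203 and S301–S303 can be sketched as message passing between the applications and camera systems; all class and function names below are illustrative assumptions, not part of the patent:

```python
class CameraApp:
    """Minimal stand-in for a camera application: it just records what it displays."""
    def __init__(self):
        self.displayed = []

    def show(self, kind, payload):
        self.displayed.append((kind, payload))

def synchronize(second_app, first_app, command):
    """Sketch of S202-S203 and S301-S303: the second device's camera system
    executes a locally triggered command, forwards it to the peer, then fans
    out the resulting execution status to both applications."""
    first_app.show("command", command)                    # S202/S203: command reaches the peer app
    status = {"command": command, "state": "completed"}   # result of executing the command
    second_app.show("status", status)                     # S302: local application
    first_app.show("status", status)                      # S301/S303: remote application

first_app, second_app = CameraApp(), CameraApp()
synchronize(second_app, first_app, {"op_type": "zoom", "factor": 2})
```

After the call, both applications hold the same execution status, while only the first (remote) application additionally received the forwarded command.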
• FIG. 3 shows a schematic interface diagram of a camera control method according to an embodiment of the present application. As shown in FIG. 3, during the shooting process of the second terminal device, after the camera captures the image data, it can send the image data to the first terminal device in real time, so that the first terminal device and the second terminal device can synchronously display the shooting preview screen of the second terminal device. The user can also perform operations on the first terminal device and/or the second terminal device to trigger corresponding operation commands.
  • FIG. 4 shows a schematic interface diagram of a camera control method according to an embodiment of the present application.
• the user can click on the image of the first terminal device to trigger a focus operation command, the first terminal device can send the focus operation command to the second terminal device, the second terminal device executes the command, and a corresponding execution state is generated.
  • the corresponding execution state may include starting to focus, focusing in progress, and focusing completed.
• the second terminal device can respectively send the status information of the three execution states to the first terminal device, and the states are displayed on the first terminal device and the second terminal device.
  • a square frame as shown in the figure may be displayed in the corresponding areas of the images of the first terminal device and the second terminal device, indicating the focus area, and the square frame is displayed in white to indicate the start of focusing;
  • the square frame can be zoomed and displayed to indicate that the focusing operation is currently in progress;
• when focusing is completed successfully, the square frame can be displayed in yellow, and when focusing fails, the square frame can be displayed in red.
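The focus-frame rendering described above (white at start, zoomed while focusing, yellow on success, red on failure) could be tabulated as a simple state-to-style mapping; the identifiers are our own sketch:

```python
# How each focus execution state is rendered on the square focus frame,
# following the description above.
FOCUS_FRAME_STYLE = {
    "focus_started": {"color": "white",  "animate_zoom": False},
    "focusing":      {"color": "white",  "animate_zoom": True},   # frame zoomed while focusing
    "focus_success": {"color": "yellow", "animate_zoom": False},
    "focus_failed":  {"color": "red",    "animate_zoom": False},
}

def focus_frame_style(state: str) -> dict:
    """Look up how to draw the focus frame for a given execution state."""
    return FOCUS_FRAME_STYLE[state]
```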
  • the second terminal device may send image data to the first terminal device in real time during the process of performing the focusing operation.
  • the first terminal device and the second terminal device can synchronously display a clearer picture change in a local area of the shooting preview picture after the focusing operation.
  • FIG. 5 shows a schematic interface diagram of a camera control method according to an embodiment of the present application.
• the user can click the zoom icon on the second terminal device and select the desired zoom factor, such as zooming in three times, to trigger a zoom operation command; the second terminal device can execute the zoom operation command, synchronize it to the first terminal device, and the first terminal device can display it on the interface, for example, changing the zoom icon to "3x" as shown in the figure.
• after the second terminal device executes the three-times zoom command, it can generate status information whose execution state is three-times zoom completed, and can send the status information to the first terminal device, and the first terminal device can update its interface display according to the status information.
• in one case, the second terminal device receives only one zoom command within a predetermined time interval, and the first terminal device zooms the image three times when it receives the zoom command and the status information; in another case, the second terminal device receives two commands of the zoom operation type within a predetermined time interval, for example, one command triggered on the second terminal device to zoom in on the image with frame number 30 by three times, and another command triggered on the first terminal device to zoom in on the image with frame number 25 by two times, and then the command with the larger corresponding frame number, that is, the three-times zoom command, is executed.
• in the latter case, the first terminal device may first display the two-times zoom command, such as "2x", and after the status information of the completed three-times zoom is received, the three-times status, such as "3x", can be displayed.
  • the second terminal device may send the image data to the first terminal device in real time in the process of performing the zoom operation.
  • the first terminal device and the second terminal device can synchronously display the zoomed-in or zoomed-out shooting preview screen after the zoom operation.
  • FIG. 6 and FIG. 7 show schematic interface diagrams of a camera application according to an embodiment of the present application.
  • the following interface can be applied to the first terminal device or the second terminal device.
  • the top of the interface can include icons for metering, turning on or off the flash, selecting color mode, setting, etc.
• the bottom of the interface allows selecting the camera mode, which can include aperture, night scene, portrait, photo, video, professional and more camera modes.
  • the operation of triggering the zoom can be performed on the right side of the interface of the terminal device.
• the zoom factor can be selected in the semicircular box, for example, any value from 1x to 10x.
  • the current zoom factor can be displayed in the operation icon.
  • FIG. 7 shows a schematic diagram of an interface of a camera application according to an embodiment of the present application.
  • icons for operations such as metering, turning on or off the flash, selecting a color mode, and setting may be included at the top of the interface.
• at the bottom of the interface, the camera mode can be selected, which can include aperture, night scene, portrait, photo, video, professional and many more camera modes.
  • the operation of triggering zooming can also be performed above the interface of the terminal device, and the user can select the zooming factor by sliding the progress bar, for example, any value from 1 to 10 times. After the user selects the zoom factor, the current zoom factor can be displayed in the operation icon.
• FIG. 6 and FIG. 7 only enumerate two modes of interface display. Those skilled in the art should understand that the embodiments of the present application are not limited to such interface display modes; for example, the display may also be achieved by showing text on the interface.
  • FIG. 8 shows a flowchart of a camera control method according to an embodiment of the present application.
  • the method can be used in a first terminal device. As shown in FIG. 8 , the method may include:
  • the operation command is an operation command for the shooting process of the second terminal device
  • the state information represents the execution state of the operation command by the second terminal device
• by receiving the image data from the second terminal device, determining the operation command and status information, displaying the screen according to the image data, and displaying the execution state of the operation command on the screen according to the operation command and status information, the first terminal device can achieve image synchronization with the second terminal device, and the operation command and status information can be synchronized and communicated, so that the two ends can coordinately control the camera; this also allows the user to understand the current status of the distributed cameras more clearly, improving control accuracy and user experience.
• "in the process of shooting" may indicate that the second terminal device has completed the stage before shooting and is currently shooting.
• for example, after the second terminal device turns on the camera and enters the photo or video mode, and before the shooting button or the button to end video recording is clicked, the second terminal device is in the shooting process, and the second terminal device can display the shooting preview screen during the shooting process.
  • the image data may include image frame data.
• the image frame data may be each frame of image data collected by the camera during the shooting process, where each frame of image data may be of RGB type, YUV type, or JPG format, which is not limited in the embodiments of the present application.
  • the image frame data may also be each frame of image data processed by zooming, focusing, etc.
  • the screen displayed by the first terminal device after receiving the image frame data may be a shooting preview screen synchronized with the second terminal device.
  • the image data may also include frame information (eg, the frame number of each frame of image) and basic image attribute information (eg, image size, resolution, etc.).
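A minimal sketch of such a per-frame payload, combining pixel data with frame information and basic attributes (the field names are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class ImageFrame:
    """Illustrative per-frame payload: pixel data plus frame information
    (frame number) and basic image attributes (size, format)."""
    frame_number: int    # frame information: running count of image frames
    width: int           # basic attribute: image size / resolution
    height: int
    pixel_format: str    # e.g. "RGB", "YUV" or "JPG"
    data: bytes          # the raw or encoded frame data

# A frame as the second terminal device might send it to the first:
frame = ImageFrame(frame_number=25, width=1920, height=1080,
                   pixel_format="YUV", data=b"")
```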
  • the user can trigger the operation command by clicking, dragging, etc.
• the icon corresponding to the operation command can be displayed on the first terminal device and the second terminal device, and the user can trigger the operation command by clicking on the icon or by other operations.
  • the operation command triggered by the user and the execution state after the second terminal device executes the command can be synchronized between the devices.
  • the operation command may include parameters such as type, area, mode, etc.
  • the operation command of the focus operation may correspond to type as focus, area as a certain area clicked by the user, and mode as manual focus.
  • the operation command of the zoom operation may correspond to a type of zoom and a mode of zoom X times.
• the operation type of the operation command may include, but is not limited to, focusing, zooming, turning the flash on or off, adjusting exposure, using filters, beautifying skin, skin smoothing, and the like.
  • the operation command may be triggered by the first terminal device, or may be triggered by the second terminal device.
  • Different operation commands can have their own execution status.
  • the execution status of the focus operation can include the start focus status, the focus in progress status, and the focus completed status.
• the execution status of the zoom operation can include the completion of zooming to a specified multiple; the execution status of the operation of turning the flash on or off can include that the flash has been turned on or that the flash has been turned off. The execution state may be set according to the characteristics of the operation and the need for display, which is not limited in this embodiment of the present application.
• the execution state of the operation command may be displayed on the interface of the first terminal device to indicate the focus started state, the focusing in progress state, and the focus completed state, or that zooming to a specified multiple is completed, or that the flash has been turned on/off, so that the user can know the execution of the operation command on the second terminal device.
  • the manner of display may be to display an icon, pop up a prompt box, display text on an interface, etc., for example, refer to FIG. 3 to FIG. 7 , which is not limited by the embodiments of the present application.
• the position of the manual focus operation on the screen can be obtained according to the operation command, and the icons corresponding to the focus started state, the focusing in progress state, and the focus completed state can be displayed at that position in the preview screen according to the status information.
  • FIG. 9 shows a flowchart of a camera control method according to an embodiment of the present application.
• the operation command may include a first operation command generated in response to an operation on the first terminal device, and determining the operation command and status information can include:
  • S22 Receive state information that is sent by the second terminal device and indicates the execution state of the first operation command.
• in the case where the operation command is triggered by the first terminal device, the synchronization of camera operation commands and status information between the first terminal device and the second terminal device enables the first terminal device to control the camera, with the results and status shared, so that in the distributed camera scenario the camera control is more flexible and fast, and the control is more accurate.
• for example, the first terminal device triggers the first operation command to execute a focus operation, and during the process of the second terminal device executing the first operation command, the first terminal device may receive the status information of focus started, focusing in progress and focus completed (focus success/focus failure); the first terminal device can synchronously display the shooting preview screen of the second terminal device and display the execution state of the focus operation by the second terminal device.
• the operation command may include a second operation command generated in response to an operation on the second terminal device, and determining the operation command and status information may include: receiving the second operation command and the status information sent by the second terminal device.
• by receiving the second operation command and the status information sent by the second terminal device, the operation command and the status information can be synchronized to the first terminal device in the case that the operation command is triggered by the second terminal device, enabling the user of the first terminal device to know the current state of the camera in real time and, on this basis, operate the camera, realizing the intercommunication of multi-side information.
  • determining the operation command and status information may include:
• Receive status information sent by the second terminal device, the status information representing the execution state of a target operation command determined by the second terminal device from a plurality of operation commands, where the plurality of operation commands are operation commands of the same operation type generated in response to operations on the second terminal device or on one or more first terminal devices, and the target operation command is the operation command with the largest corresponding frame number among the plurality of operation commands.
• in this way, the latest operation command can be responded to, and the other party is allowed to cancel the command being executed, so that the user's intention can be correctly understood and the corresponding operation command selected, which makes the response more flexible and quicker when multiple parties collaborate; multi-side commands can cancel and update each other, improving the user experience.
  • the frame number of the image data can be used to determine the sequence in which the two images are generated.
• the frame number of the image frame corresponding to the operation that generates the operation command may be used as the frame number corresponding to the operation command; for example, when the user performs a click operation on a displayed image frame, the frame number of that displayed image frame can be used as the frame number corresponding to the operation command generated by the click operation.
  • the larger the frame number corresponding to the operation command, the later the time when the operation of the operation command occurs, and the operation command with the largest frame number can correspond to the latest operation.
  • the predetermined time interval may be selected as required, for example, determined according to the current number of frames per second of the camera system of the second terminal device. For example, in the case of 30 frames per second, the predetermined time interval may be 100/3 milliseconds.
• there may be multiple first terminal devices that have all generated operation commands of the same type (for example, multiple users have simultaneously issued zoom operation commands on different first terminal devices), and these operation commands are all sent to the second terminal device.
  • the second terminal device can determine the frame number corresponding to the operation command sent by each first terminal device, and select the target operation command with the largest frame number to execute.
• the second terminal device may send status information indicating the execution state of the target operation command to all first terminal devices communicating with the second terminal device, including first terminal devices that triggered an operation command and those that did not. If the second terminal device or a certain first terminal device triggers an operation command, and that operation command does not have the largest frame number among those of the same type, the operation command is not executed, which is equivalent to being cancelled.
• for example, the second terminal device detects two zoom operation commands within 30 milliseconds: one is a command triggered by the second terminal device to zoom in on the image with frame number 30 by three times, and the other is a command triggered by the first terminal device to zoom in on the image with frame number 25 by two times; the target operation command is then the command corresponding to the larger frame number, that is, the command to zoom in three times.
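The selection rule in this example reduces to taking the same-type command with the largest frame number received within the window; a sketch using the example's numbers (3x on frame 30 versus 2x on frame 25), with illustrative field names:

```python
def select_target(commands):
    """Pick the target operation command: the one with the largest frame number.
    All commands in the list are assumed to share the same operation type and
    to have arrived within the predetermined time interval."""
    return max(commands, key=lambda c: c["frame_number"])

cmds = [
    {"source": "second_device", "frame_number": 30, "zoom": 3},
    {"source": "first_device",  "frame_number": 25, "zoom": 2},
]
target = select_target(cmds)   # the three-times zoom command wins; the other is cancelled
```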
• on the first terminal device, the two-times zoom operation command may first be displayed, and after the status information of the completed three-times zoom is received, that status information can be displayed.
  • Fig. 10 shows a flowchart of a camera control method according to an embodiment of the present application.
  • the method can be used for a second terminal device.
  • the method can include:
  • S32 determine an operation command and state information, where the operation command is an operation command for the shooting process of the second terminal device, and the state information indicates the execution state of the operation command by the second terminal device;
  • S33 Send the status information to the first terminal device.
• the second terminal device sends the image data to the first terminal device, determines the operation command and status information, and sends the status information to the first terminal device, so that the images, operation commands and status information of the multi-side devices can be synchronized and intercommunicated, and the opposite end can also control the camera collaboratively, realizing distributed camera control.
  • FIG. 11 shows a flowchart of a camera control method according to an embodiment of the present application.
• the operation command may include a first operation command generated in response to an operation on the first terminal device; determining the operation command and status information can include:
  • S42 Execute the first operation command to obtain status information representing the execution state of the first operation command.
• when the operation command is triggered by the first terminal device, the operation command is synchronized and executed, realizing control of the camera by the multi-side devices.
• displaying the current state is convenient for users performing subsequent operations, making the control more accurate and improving the user experience.
  • FIG. 12 shows a flowchart of a camera control method according to an embodiment of the present application. As shown in FIG. 12 , determining the operation command and status information may include:
  • S52 Execute the second operation command to obtain status information representing the execution state of the second operation command.
• when the operation command is triggered by the second terminal device, the operation command is executed and the corresponding status information is obtained and synchronized to the multi-side devices, so that the multi-side devices can collaboratively control the camera.
  • FIG. 13 shows a flowchart of a camera control method according to an embodiment of the present application. As shown in FIG. 13 , determining an operation command and status information may include:
  • S61 Determine a target operation command from a plurality of operation commands, where the plurality of operation commands are operation commands of the same operation type and generated in response to operations on the second terminal device or one or more first terminal devices.
  • the target operation command is the operation command with the largest corresponding frame number among the plurality of operation commands;
  • S62 Execute the target operation command to obtain status information representing the execution state of the target operation command.
• the second terminal device can execute the latest operation command through a concurrent response strategy while allowing the opposite side to cancel the command being executed, so that the user's intention can be correctly identified and the corresponding operation command selected, making the response more flexible and rapid when multiple parties collaborate.
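The concurrent-response strategy of S61 can be sketched as follows; the command dictionaries and field names are illustrative assumptions, but the selection rule (the command with the largest frame number wins, and the others are treated as cancelled) follows the description above.

```python
def select_target_command(commands):
    """S61 sketch: among operation commands of the same operation type
    that arrive close together, the target operation command is the one
    whose corresponding image frame number is largest; the superseded
    commands are treated as cancelled."""
    target = max(commands, key=lambda c: c["frame"])
    cancelled = [c for c in commands if c is not target]
    return target, cancelled

# The worked example above: a 3x zoom on frame 30 (second terminal)
# and a 2x zoom on frame 25 (first terminal), within 30 ms.
cmds = [
    {"source": "second", "type": "zoom", "frame": 30, "factor": 3.0},
    {"source": "first",  "type": "zoom", "frame": 25, "factor": 2.0},
]
target, cancelled = select_target_command(cmds)
# target is the three-times zoom; the two-times zoom is superseded.
```

The larger frame number is used as a proxy for "latest", which is why this rule lets a newer command from either side override an older in-flight one.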
  • FIG. 14 is a structural diagram of a camera control apparatus according to an embodiment of the present application.
  • the apparatus can be used in a first terminal device. As shown in FIG. 14 , the apparatus may include:
  • an image data receiving module 1401, configured to receive image data from the second terminal device, where the image data is collected by the second terminal device during the shooting process;
• the first information determination module 1402 is configured to determine an operation command and status information, where the operation command is an operation command for the shooting process of the second terminal device, and the status information indicates the execution status of the operation command by the second terminal device;
• an image display module 1403, configured to display an image according to the image data;
• the execution status display module 1404 is configured to display the execution status of the operation command on the screen according to the operation command and the status information.
• the first terminal device receives the image data from the second terminal device, determines the operation command and status information, displays an image according to the image data, and displays the execution status of the operation command on the screen according to the operation command and status information.
• in this way, the first terminal device can achieve image synchronization with the second terminal device, and the operation command and status information can be synchronized and intercommunicated, so that the two ends can collaboratively control the camera; the user can also more clearly understand the current status information of the distributed cameras, improving control accuracy and the user experience.
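On the first-terminal side, the receive-and-display behavior described above can be sketched minimally; the class and the message format are hypothetical placeholders standing in for the image display module 1403 and the execution status display module 1404.

```python
class FirstTerminal:
    """Hypothetical first terminal device: mirrors the remote camera's
    preview frames and overlays the execution status of operation
    commands on the screen."""

    def __init__(self):
        self.current_frame = None  # last frame displayed (module 1403)
        self.overlay = ""          # status text shown on screen (module 1404)

    def on_message(self, kind, payload):
        if kind == "image":
            # Display the image according to the received image data.
            self.current_frame = payload
        elif kind == "status":
            # Display the execution status, e.g. "zoom x3: complete".
            self.overlay = f"{payload['command']}: {payload['state']}"

ui = FirstTerminal()
ui.on_message("image", b"frame-30")
ui.on_message("status", {"command": "zoom x3", "state": "complete"})
```

Keeping the image path and the status path separate mirrors the module split above: image data drives the preview, while status information drives the on-screen feedback.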
  • the operation command includes a first operation command generated in response to an operation on the first terminal device
• the first information determination module includes: a first operation command sending sub-module, configured to send the first operation command to the second terminal device; and a first information receiving sub-module, configured to receive status information sent by the second terminal device that indicates the execution state of the first operation command.
• when the operation command is sent by the first terminal device, the synchronization of camera operation commands and status information between the first terminal device and the second terminal device enables the first terminal device to control the camera, with the results and status shared, so that in the distributed camera scenario the camera control is more flexible and rapid, and the control is more accurate.
  • the operation command includes a second operation command generated in response to an operation on the second terminal device
• the first information determination module includes: a second information receiving sub-module, configured to receive the second operation command and the status information sent by the second terminal device.
• by receiving the second operation command and the status information sent by the second terminal device, the operation command and the status information can, in the case where the operation command is triggered by the second terminal device, be synchronized to the first terminal device, so that the user of the first terminal device can know the current state of the camera in real time and, on this basis, operate the camera, realizing the intercommunication of multi-side information.
• the first information determination module includes: a status information receiving sub-module, configured to receive status information sent by the second terminal device, where the status information indicates the execution state of the target operation command determined by the second terminal device from a plurality of operation commands; the plurality of operation commands are operation commands of the same operation type generated in response to operations on the second terminal device or one or more first terminal devices, and the target operation command is the operation command with the largest corresponding frame number among the plurality of operation commands.
• in this way, the latest operation command can be responded to, and the other party is allowed to cancel the command being executed, so that the user's intention can be correctly identified and the corresponding operation command selected, which makes the response more flexible and rapid when multiple parties collaborate; multi-side commands can cancel and update each other, improving the user experience.
• the operation type of the operation command includes one or more of focusing, zooming, turning on or off the flash, adjusting exposure, applying filters, skin beautification, and skin smoothing.
• the operations that can be performed on images on the first terminal device can be enriched, and through the interface display between the first terminal device and the second terminal device, a distributed camera control scenario can be realized, providing a better user experience.
  • FIG. 15 is a structural diagram of a camera control apparatus according to an embodiment of the present application, the apparatus can be used in a second terminal device, as shown in FIG. 15 , the apparatus may include:
  • an image data sending module 1501 configured to send image data to a first terminal device, where the image data is collected by the second terminal device during a shooting process;
• the second information determining module 1502 is configured to determine an operation command and status information, where the operation command is an operation command for the shooting process of the second terminal device, and the status information indicates the execution status of the operation command by the second terminal device;
  • the status information sending module 1503 is configured to send the status information to the first terminal device.
• the second terminal device sends the image data to the first terminal device, determines the operation command and status information, and sends the status information to the first terminal device, so that the images, operation commands, and status information of the multi-side devices can be synchronized and intercommunicated. The peer end can thus also control the camera collaboratively, which realizes distributed camera control.
  • the operation command includes a first operation command generated in response to an operation on the first terminal device
• the second information determination module includes: an operation command receiving sub-module, configured to receive the first operation command sent by the first terminal device; and a first operation command execution sub-module, configured to execute the first operation command to obtain status information representing the execution state of the first operation command.
• when the operation command is triggered by the first terminal device, the operation command is synchronized and executed, realizing control of the camera by the multi-side devices.
• displaying the current state is convenient for users performing subsequent operations, making the control more accurate and improving the user experience.
  • the second information determination module includes: an operation command generation sub-module, configured to generate a second operation command in response to an operation on the second terminal device,
  • the second operation command execution sub-module is configured to execute the second operation command to obtain state information representing the execution state of the second operation command.
• when the operation command is triggered by the second terminal device, the operation command is executed and the corresponding status information is obtained and synchronized to the multi-side devices, so that the multi-side devices can collaboratively control the camera.
• the second information determination module includes: a target operation command determination sub-module, configured to determine a target operation command from a plurality of operation commands, where the plurality of operation commands are operation commands of the same operation type generated in response to operations on the second terminal device or one or more first terminal devices, and the target operation command is the operation command with the largest corresponding frame number among the plurality of operation commands; and a target operation command execution sub-module, configured to execute the target operation command to obtain status information representing the execution state of the target operation command.
• the second terminal device can execute the latest operation command through a concurrent response strategy while allowing the opposite side to cancel the command being executed, so that the user's intention can be correctly identified and the corresponding operation command selected, making the response more flexible and rapid when multiple parties collaborate.
• the operation type of the operation command includes one or more of focusing, zooming, turning on or off the flash, adjusting exposure, applying filters, skin beautification, and skin smoothing.
• the operations that can be performed on the image on the second terminal device can be enriched, and through the interface display between the first terminal device and the second terminal device, a distributed camera control scenario can be realized, providing a better user experience.
  • FIG. 16 shows a schematic structural diagram of a terminal device according to an embodiment of the present application. Taking the terminal device as a mobile phone as an example, FIG. 16 shows a schematic structural diagram of the mobile phone 200 .
  • the mobile phone 200 may include a processor 210, an external memory interface 220, an internal memory 221, a USB interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 251, a wireless communication module 252, Audio module 270, speaker 270A, receiver 270B, microphone 270C, headphone jack 270D, sensor module 280, buttons 290, motor 291, indicator 292, camera 293, display screen 294, SIM card interface 295, etc.
• the sensor module 280 may include a gyroscope sensor 280A, an acceleration sensor 280B, a proximity light sensor 280G, a fingerprint sensor 280H, and a touch sensor 280K (of course, the mobile phone 200 may also include other sensors, such as a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, etc., not shown in the figure).
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the mobile phone 200 .
• the mobile phone 200 may include more or fewer components than shown, or combine some components, or split some components, or have a different component arrangement.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
• the processor 210 may include one or more processing units; for example, the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the mobile phone 200 . The controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 210 for storing instructions and data.
  • the memory in processor 210 is cache memory.
• the memory may hold instructions or data that the processor 210 has just used or uses cyclically. If the processor 210 needs to use the instructions or data again, it can call them directly from this memory. Repeated accesses are avoided and the waiting time of the processor 210 is reduced, thereby improving the efficiency of the system.
• the processor 210 can run the camera control method provided by the embodiments of the present application, so as to support camera control of the same second terminal device by multiple first terminal devices and truly realize shared control of the camera. Devices can perceive the commands issued on any device, and the results of executing commands are shared among devices. At the same time, each device can trigger commands concurrently; through the coordination mechanism, the user's intention is correctly understood, and concurrent commands between devices can cancel and update each other. The response is flexible and rapid, bringing users a more convenient experience.
• the processor 210 may include different devices. For example, when a CPU and a GPU are integrated, the CPU and the GPU may cooperate to execute the camera control method provided by the embodiments of the present application; for instance, some algorithms in the camera control method are executed by the CPU and others by the GPU, for faster processing.
  • Display screen 294 is used to display images, videos, and the like.
  • Display screen 294 includes a display panel.
• the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • cell phone 200 may include 1 or N display screens 294, where N is a positive integer greater than 1.
  • the display screen 294 may be used to display information entered by or provided to the user as well as various graphical user interfaces (GUIs).
  • display 294 may display photos, videos, web pages, or documents, and the like.
  • display 294 may display a graphical user interface.
  • the GUI includes a status bar, a hideable navigation bar, a time and weather widget, and an application icon, such as a browser icon.
  • the status bar includes operator name (eg China Mobile), mobile network (eg 4G), time and remaining battery.
  • the navigation bar includes a back button icon, a home button icon, and a forward button icon.
  • the status bar may further include a Bluetooth icon, a Wi-Fi icon, an external device icon, and the like.
  • the graphical user interface may further include a Dock bar, and the Dock bar may include commonly used application icons and the like.
  • the display screen 294 may be an integrated flexible display screen, or a spliced display screen composed of two rigid screens and a flexible screen located between the two rigid screens.
• the terminal device can establish a connection with other terminal devices through the antenna 1, the antenna 2, or the USB interface, transmit data according to the camera control method provided by the embodiments of the present application, and control the display screen 294 to display a corresponding graphical user interface.
• Camera 293 (a front camera or a rear camera, or a camera that can serve as both a front camera and a rear camera) is used to capture still images or video.
  • the camera 293 may include a photosensitive element such as a lens group and an image sensor, wherein the lens group includes a plurality of lenses (convex or concave) for collecting the light signal reflected by the object to be photographed, and transmitting the collected light signal to the image sensor .
  • the image sensor generates an original image of the object to be photographed according to the light signal.
  • Internal memory 221 may be used to store computer executable program code, which includes instructions.
  • the processor 210 executes various functional applications and data processing of the mobile phone 200 by executing the instructions stored in the internal memory 221 .
  • the internal memory 221 may include a storage program area and a storage data area.
  • the storage program area may store operating system, code of application programs (such as camera application, WeChat application, etc.), and the like.
  • the storage data area may store data created during the use of the mobile phone 200 (such as images and videos collected by the camera application) and the like.
  • the internal memory 221 may also store one or more computer programs 1310 corresponding to the camera control methods provided in the embodiments of the present application.
• the one or more computer programs 1310 are stored in the aforementioned internal memory 221 and configured to be executed by the one or more processors 210, and the one or more computer programs 1310 include instructions that may be used to perform the steps of the embodiments shown in FIGS. 8 to 13. The computer program 1310 may include: an image data receiving module 1401, configured to receive image data from the second terminal device, the image data being collected by the second terminal device during the shooting process;
• a first information determination module 1402, configured to determine an operation command and status information, where the operation command is an operation command for the shooting process of the second terminal device, and the status information indicates the execution status of the operation command by the second terminal device;
• an image display module 1403, configured to display an image according to the image data;
• an execution status display module 1404, configured to display the execution status of the operation command on the screen according to the operation command and the status information.
  • the computer program 1310 may further include an image data sending module 1501 for sending image data to the first terminal device, the image data being collected by the second terminal device during the shooting process; the second information determining module 1502, is used to determine an operation command and state information, the operation command is an operation command for the shooting process of the second terminal device, and the state information indicates the execution state of the operation command by the second terminal device;
  • the status information sending module 1503 is configured to send the status information to the first terminal device (not shown).
  • the internal memory 221 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the code of the camera control method provided by the embodiment of the present application may also be stored in an external memory.
  • the processor 210 may execute the codes of the camera control method stored in the external memory through the external memory interface 220.
  • the function of the sensor module 280 is described below.
  • the gyro sensor 280A can be used to determine the movement posture of the mobile phone 200 .
• the angular velocity of the mobile phone 200 about three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 280A.
  • the gyro sensor 280A can be used to detect the current motion state of the mobile phone 200, such as shaking or still.
  • the gyro sensor 280A can be used to detect a folding or unfolding operation acting on the display screen 294 .
  • the gyroscope sensor 280A may report the detected folding operation or unfolding operation to the processor 210 as an event to determine the folding state or unfolding state of the display screen 294 .
• the acceleration sensor 280B can detect the magnitude of the acceleration of the mobile phone 200 in various directions (generally along three axes), and, like the gyro sensor 280A, can be used to detect the current motion state of the mobile phone 200, such as shaking or still. When the display screen in the embodiments of the present application is a foldable screen, the acceleration sensor 280B can be used to detect a folding or unfolding operation acting on the display screen 294, and may report the detected folding or unfolding operation to the processor 210 as an event to determine the folded or unfolded state of the display screen 294.
  • Proximity light sensor 280G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the mobile phone emits infrared light outward through light-emitting diodes.
• the mobile phone uses a photodiode to detect reflected infrared light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the phone; when insufficient reflected light is detected, the phone can determine that there is no object nearby.
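The proximity decision described here amounts to a threshold test on reflected-light intensity; the sketch below illustrates it, with the threshold value being an arbitrary illustrative number rather than one specified by this application.

```python
def object_nearby(reflected_intensity, threshold=0.5):
    """Proximity decision sketch: sufficient reflected infrared light
    implies an object near the phone; insufficient light implies none.
    The 0.5 threshold is an illustrative assumption."""
    return reflected_intensity >= threshold

# Strong reflection -> object detected; weak reflection -> nothing nearby.
near = object_nearby(0.9)
far = object_nearby(0.1)
```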
  • the proximity light sensor 280G can be arranged on the first screen of the foldable display screen 294, and the proximity light sensor 280G can detect the first screen according to the optical path difference of the infrared signal.
  • the gyroscope sensor 280A (or the acceleration sensor 280B) may send the detected motion state information (such as angular velocity) to the processor 210 .
  • the processor 210 determines, based on the motion state information, whether the current state is the hand-held state or the tripod state (for example, when the angular velocity is not 0, it means that the mobile phone 200 is in the hand-held state).
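The handheld/tripod decision just described reduces to checking whether the reported angular velocity is (near) zero; the sketch below assumes a scalar angular velocity and an illustrative noise tolerance, both simplifications of what a real gyroscope reports.

```python
def holding_state(angular_velocity, eps=1e-3):
    """Sketch of the decision in the text: a nonzero angular velocity
    from the gyro sensor suggests the phone is handheld; a (near) zero
    value suggests a tripod. `eps` is an illustrative noise tolerance,
    not a value taken from this application."""
    return "handheld" if abs(angular_velocity) > eps else "tripod"

state_moving = holding_state(0.2)   # phone is being held and wobbles
state_still = holding_state(0.0)    # phone is fixed, e.g. on a tripod
```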
  • the fingerprint sensor 280H is used to collect fingerprints.
  • the mobile phone 200 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking photos with fingerprints, answering incoming calls with fingerprints, and the like.
• the touch sensor 280K is also called a "touch panel".
  • the touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touch screen, also called a "touch screen”.
  • the touch sensor 280K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 294 .
  • the touch sensor 280K may also be disposed on the surface of the mobile phone 200 , which is different from the location where the display screen 294 is located.
  • the display screen 294 of the mobile phone 200 displays a main interface, and the main interface includes icons of multiple applications (such as a camera application, a WeChat application, etc.).
  • Display screen 294 displays an interface of a camera application, such as a viewfinder interface.
  • the wireless communication function of the mobile phone 200 can be realized by the antenna 1, the antenna 2, the mobile communication module 251, the wireless communication module 252, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in handset 200 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 251 can provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the mobile phone 200 .
  • the mobile communication module 251 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 251 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 251 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 251 may be provided in the processor 210 .
  • the mobile communication module 251 may be provided in the same device as at least part of the modules of the processor 210 .
  • the mobile communication module 251 may also be used for information interaction with other terminal devices, such as sending operation commands, sending status information, and receiving image data.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 270A, the receiver 270B, etc.), or displays images or videos through the display screen 294 .
  • the modem processor may be a stand-alone device.
  • the modulation and demodulation processor may be independent of the processor 210, and may be provided in the same device as the mobile communication module 251 or other functional modules.
  • the wireless communication module 252 can provide applications on the mobile phone 200 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the wireless communication module 252 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 252 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210 .
  • the wireless communication module 252 can also receive the signal to be sent from the processor 210 , perform frequency modulation on the signal, amplify the signal, and then convert it into an electromagnetic wave for radiation through the antenna 2 .
  • the wireless communication module 252 is used to transmit data between other terminal devices under the control of the processor 210.
• when the processor 210 executes the camera control method provided by the embodiments of the present application, the processor can control the wireless communication module 252 to send operation commands and status information to other terminal devices, and can also receive image data, so as to control the camera, provide users with intuitive visual feedback, avoid user errors and repeated operations, and improve operation efficiency.
• the mobile phone 200 can implement audio functions, such as music playback and recording, through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the earphone interface 270D, and the application processor.
  • the mobile phone 200 can receive key 290 input and generate key signal input related to user settings and function control of the mobile phone 200.
  • the mobile phone 200 can use the motor 291 to generate vibration alerts (e.g., vibration alerts for incoming calls).
  • the indicator 292 in the mobile phone 200 may be an indicator light, which may be used to indicate a charging state or a change in battery level, and may also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 295 in the mobile phone 200 is used to connect a SIM card. The SIM card can be inserted into the SIM card interface 295 or pulled out from it to achieve contact with and separation from the mobile phone 200.
  • the mobile phone 200 may include more or fewer components than those shown in FIG. 16, which is not limited in this embodiment of the present application.
  • the illustrated mobile phone 200 is merely an example; the mobile phone 200 may have more or fewer components than those shown, may combine two or more components, or may have a different component configuration.
  • the various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the software system of the terminal device can adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of a terminal device.
  • FIG. 17 is a block diagram of a software structure of a terminal device according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications such as phone, camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • the window manager may also be used to detect whether there is a camera control operation according to the embodiment of the present application, such as a focus operation, a zoom operation, and an operation of turning on or off the flash.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • for example, a display interface including a short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the telephony manager is used to provide the communication functions of the terminal device, for example, management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the terminal device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the function modules that the Java language needs to call, and the other is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • An embodiment of the present application provides a camera control device, comprising: a processor and a memory for storing instructions executable by the processor; wherein the processor is configured to implement the above method when executing the instructions.
  • Embodiments of the present application provide a non-volatile computer-readable storage medium on which computer program instructions are stored, and when the computer program instructions are executed by a processor, implement the above method.
  • Embodiments of the present application provide a computer program product, including computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code, wherein when the computer-readable code runs in a processor of an electronic device, the processor in the electronic device executes the above method.
  • a computer-readable storage medium may be a tangible device that can hold and store instructions for use by the instruction execution device.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • Computer-readable storage media include: portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random access memory (SRAM), portable compact disc read-only memory (CD-ROM), digital video discs (DVD), memory sticks, floppy disks, mechanically encoded devices such as punch cards or raised structures in grooves on which instructions are stored, and any suitable combination of the foregoing.
  • Computer readable program instructions or code described herein may be downloaded to various computing/processing devices from a computer readable storage medium, or to an external computer or external storage device over a network such as the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer-readable program instructions from a network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in each computing/processing device.
  • the computer program instructions used to perform the operations of the present application may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages.
  • the computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (e.g., through the Internet using an Internet service provider).
  • electronic circuits, such as programmable logic circuits, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA), can be personalized by utilizing the state information of the computer-readable program instructions, and the electronic circuits can execute the computer-readable program instructions to implement various aspects of the present application.
  • These computer-readable program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus to produce a machine, so that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • These computer-readable program instructions may also be stored in a computer-readable storage medium, and these instructions cause a computer, programmable data processing apparatus, and/or other devices to operate in a specific manner, so that the computer-readable medium on which the instructions are stored includes an article of manufacture comprising instructions that implement various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • The computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable data processing apparatus, or other device to produce a computer-implemented process, so that the instructions executing on the computer, other programmable data processing apparatus, or other device implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • each block in the flowcharts or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks therein, can be implemented in hardware that performs the corresponding functions or actions, such as circuits or application-specific integrated circuits (ASICs), or can be implemented by a combination of hardware and software, such as firmware.


Abstract

The present application relates to a camera control method, apparatus, and storage medium. The method is applied to a first terminal device and includes: receiving image data from a second terminal device, the image data being captured by the second terminal device during a shooting process; determining an operation command and status information, the operation command being an operation command for the shooting process of the second terminal device, and the status information indicating the execution status of the operation command by the second terminal device; displaying a picture according to the image data; and displaying the execution status of the operation command on the picture according to the operation command and the status information. According to the embodiments of the present application, shared control of the camera is truly achieved: the user's intention is correctly understood through a coordination mechanism, concurrent commands between devices can cancel and update each other, and the response is fast, more flexible, and more rapid, bringing users a more convenient user experience.

Description

Camera control method, apparatus, and storage medium
This application claims priority to Chinese Patent Application No. 202011451138.5, filed with the China National Intellectual Property Administration on December 9, 2020 and entitled "Camera control method, apparatus, and storage medium", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of electronic devices, and in particular, to a camera control method, apparatus, and storage medium.
Background
With the popularization of smart terminal devices, the number and variety of smart terminal devices owned by individuals keep growing. Most current smart terminal devices are equipped with cameras, which users can use to take photos or record videos, and this brings a demand for exchanging camera information between users.
The rapid development of network technology in recent years has also provided better hardware support for distributed application scenarios. Current distributed camera control scenarios generally support only one-sided camera control, or the camera control information cannot be exchanged between devices. In distributed camera scenarios, more flexible control methods and a more convenient user experience are urgently needed.
Summary
In view of this, a camera control method, apparatus, and storage medium are proposed.
In a first aspect, an embodiment of the present application provides a camera control method applied to a first terminal device, the method including: receiving image data from a second terminal device, the image data being captured by the second terminal device during a shooting process; determining an operation command and status information, the operation command being an operation command for the shooting process of the second terminal device, and the status information indicating the execution status of the operation command by the second terminal device; displaying a picture according to the image data; and displaying the execution status of the operation command on the picture according to the operation command and the status information.
According to the embodiments of the present application, the first terminal device receives image data from the second terminal device, determines an operation command and status information, displays a picture according to the image data, and displays the execution status of the operation command on the picture according to the operation command and the status information, so that the first terminal device can keep its image, operation commands, and status information synchronized and interoperable with the second terminal device. Both ends can thus control the camera cooperatively, users can understand the current status of the distributed camera more clearly, control accuracy is improved, and the user experience is enhanced.
According to the first aspect, in a first possible implementation of the camera control method, the operation command includes a first operation command generated in response to an operation on the first terminal device, and determining the operation command and status information includes: sending the first operation command to the second terminal device; and receiving status information sent by the second terminal device that indicates the execution status of the first operation command.
According to the embodiments of the present application, by sending the first operation command to the second terminal device and receiving the status information sent by the second terminal device indicating the execution status of the first operation command, the operation commands and status information of the camera can be synchronized between the first and second terminal devices when the operation command is triggered by the first terminal device, so that the first terminal device can control the camera with results and status shared, making camera control in distributed camera scenarios more flexible, faster, and more accurate.
According to the first aspect, in a second possible implementation of the camera control method, the operation command includes a second operation command generated in response to an operation on the second terminal device, and determining the operation command and status information includes: receiving the second operation command and the status information sent by the second terminal device.
According to the embodiments of the present application, by receiving the second operation command and the status information sent by the second terminal device, the operation command and status information can be synchronized with the first terminal device when the operation command is triggered by the second terminal device, so that the user of the first terminal device can learn the current camera status in real time and operate the camera on this basis, achieving information interoperability among multiple sides.
According to the first aspect, in a third possible implementation of the camera control method, determining the operation command and status information includes: receiving status information sent by the second terminal device, the status information indicating the execution status of a target operation command determined by the second terminal device from multiple operation commands, the multiple operation commands being operation commands of the same operation type generated in response to operations on the second terminal device or on one or more first terminal devices, and the target operation command being the operation command with the largest corresponding frame number among the multiple operation commands.
According to the embodiments of the present application, when multiple operation commands triggered between devices conflict, the latest operation command is responded to while the other party is allowed to cancel the command being executed, so that the user's intention can be correctly selected and the corresponding operation command chosen. When multiple parties cooperate, responses are more flexible and rapid, commands from multiple sides can cancel and update each other, and the user experience is improved.
According to the first aspect or the first, second, or third possible implementation of the first aspect, in a fourth possible implementation of the camera control method, the operation type of the operation command includes one or more of focusing, zooming, turning the flash on or off, adjusting exposure, applying a filter, skin beautification, and skin smoothing.
According to the embodiments of the present application, the operations that can be performed on the image on the first terminal device are enriched, and distributed camera control scenarios are realized through the interface display of the first and second terminal devices, providing a better user experience.
In a second aspect, an embodiment of the present application provides a camera control method applied to a second terminal device, the method including: sending image data to a first terminal device, the image data being captured by the second terminal device during a shooting process; determining an operation command and status information, the operation command being an operation command for the shooting process of the second terminal device, and the status information indicating the execution status of the operation command by the second terminal device; and sending the status information to the first terminal device.
According to the embodiments of the present application, the second terminal device sends image data to the first terminal device, determines an operation command and status information, and sends the status information to the first terminal device, so that the images, operation commands, and status information of multiple devices are synchronized and interoperable, the peer device can also control the camera cooperatively, and distributed camera control can be realized.
According to the second aspect, in a first possible implementation of the camera control method, the operation command includes a first operation command generated in response to an operation on the first terminal device, and determining the operation command and status information includes: receiving the first operation command sent by the first terminal device, and executing the first operation command to obtain status information indicating the execution status of the first operation command.
According to the embodiments of the present application, by receiving the first operation command sent by the first terminal device and executing it to obtain status information indicating its execution status, the operation command can be synchronized and executed when it is triggered by the first terminal device, enabling multi-side control of the camera. By synchronizing the post-execution status to multiple sides, users of the devices can intuitively understand the current status of the camera, facilitating subsequent operations, making control more accurate, and improving the user experience.
According to the second aspect, in a second possible implementation of the camera control method, determining the operation command and status information includes: generating a second operation command in response to an operation on the second terminal device, and executing the second operation command to obtain status information indicating the execution status of the second operation command.
According to the embodiments of the present application, by generating a second operation command in response to an operation on the second terminal device and executing it to obtain status information indicating its execution status, the operation command can be executed and the corresponding status information obtained when the command is triggered by the second terminal device, so as to be synchronized to multiple devices, enabling cooperative control of the camera.
According to the second aspect, in a third possible implementation of the camera control method, determining the operation command and status information includes: determining a target operation command from multiple operation commands, the multiple operation commands being operation commands of the same operation type generated in response to operations on the second terminal device or on one or more first terminal devices, and the target operation command being the operation command with the largest corresponding frame number among the multiple operation commands; and executing the target operation command to obtain status information indicating the execution status of the target operation command.
According to the embodiments of the present application, when multiple triggered operation commands conflict, the second terminal device can use a concurrent-response strategy to execute the latest operation command while allowing the other side to cancel the command being executed, so that the user's intention can be correctly selected and the corresponding operation command chosen. When multiple parties cooperate, responses are more flexible and rapid, commands from multiple sides can cancel and update each other, and the user experience is improved.
According to the second aspect or the first, second, or third possible implementation of the second aspect, in a fourth possible implementation of the camera control method, the operation type of the operation command includes one or more of focusing, zooming, turning the flash on or off, adjusting exposure, applying a filter, skin beautification, and skin smoothing.
According to the embodiments of the present application, the operations that can be performed on the image on the second terminal device are enriched, and distributed camera control scenarios are realized through the interface display of the first and second terminal devices, providing a better user experience.
In a third aspect, an embodiment of the present application provides a camera control apparatus applied to a first terminal device, the apparatus including: an image data receiving module, configured to receive image data from a second terminal device, the image data being captured by the second terminal device during a shooting process; a first information determining module, configured to determine an operation command and status information, the operation command being an operation command for the shooting process of the second terminal device, and the status information indicating the execution status of the operation command by the second terminal device; a picture display module, configured to display a picture according to the image data; and an execution status display module, configured to display the execution status of the operation command on the picture according to the operation command and the status information.
According to the third aspect, in a first possible implementation of the camera control apparatus, the operation command includes a first operation command generated in response to an operation on the first terminal device, and the first information determining module includes: a first operation command sending submodule, configured to send the first operation command to the second terminal device; and a first information receiving submodule, configured to receive status information sent by the second terminal device that indicates the execution status of the first operation command.
According to the third aspect, in a second possible implementation of the camera control apparatus, the operation command includes a second operation command generated in response to an operation on the second terminal device, and the first information determining module includes: a second information receiving submodule, configured to receive the second operation command and the status information sent by the second terminal device.
According to the third aspect, in a third possible implementation of the camera control apparatus, the first information determining module includes: a status information receiving submodule, configured to receive status information sent by the second terminal device, the status information indicating the execution status of a target operation command determined by the second terminal device from multiple operation commands, the multiple operation commands being operation commands of the same operation type generated in response to operations on the second terminal device or on one or more first terminal devices, and the target operation command being the operation command with the largest corresponding frame number among the multiple operation commands.
According to the third aspect or the first, second, or third possible implementation of the third aspect, in a fourth possible implementation of the camera control apparatus, the operation type of the operation command includes one or more of focusing, zooming, turning the flash on or off, adjusting exposure, applying a filter, skin beautification, and skin smoothing.
In a fourth aspect, an embodiment of the present application provides a camera control apparatus applied to a second terminal device, the apparatus including: an image data sending module, configured to send image data to a first terminal device, the image data being captured by the second terminal device during a shooting process; a second information determining module, configured to determine an operation command and status information, the operation command being an operation command for the shooting process of the second terminal device, and the status information indicating the execution status of the operation command by the second terminal device; and a status information sending module, configured to send the status information to the first terminal device.
According to the fourth aspect, in a first possible implementation of the camera control apparatus, the operation command includes a first operation command generated in response to an operation on the first terminal device, and the second information determining module includes: an operation command receiving submodule, configured to receive the first operation command sent by the first terminal device; and a first operation command execution submodule, configured to execute the first operation command to obtain status information indicating the execution status of the first operation command.
According to the fourth aspect, in a second possible implementation of the camera control apparatus, the second information determining module includes: an operation command generation submodule, configured to generate a second operation command in response to an operation on the second terminal device; and a second operation command execution submodule, configured to execute the second operation command to obtain status information indicating the execution status of the second operation command.
According to the fourth aspect, in a third possible implementation of the camera control apparatus, the second information determining module includes: a target operation command determining submodule, configured to determine a target operation command from multiple operation commands, the multiple operation commands being operation commands of the same operation type generated in response to operations on the second terminal device or on one or more first terminal devices, and the target operation command being the operation command with the largest corresponding frame number among the multiple operation commands; and a target operation command execution submodule, configured to execute the target operation command to obtain status information indicating the execution status of the target operation command.
According to the fourth aspect or the first, second, or third possible implementation of the fourth aspect, in a fourth possible implementation of the camera control apparatus, the operation type of the operation command includes one or more of focusing, zooming, turning the flash on or off, adjusting exposure, applying a filter, skin beautification, and skin smoothing.
In a fifth aspect, an embodiment of the present application provides a camera control apparatus, including: a processor; and a memory for storing instructions executable by the processor, wherein the processor is configured to, when executing the instructions, implement the camera control method of the first aspect or one or more of its possible implementations, or implement the camera control method of the second aspect or one or more of its possible implementations.
In a sixth aspect, an embodiment of the present application provides a non-volatile computer-readable storage medium having computer program instructions stored thereon, which, when executed by a processor, implement the camera control method of the first aspect or one or more of its possible implementations, or implement the camera control method of the second aspect or one or more of its possible implementations.
In a seventh aspect, an embodiment of the present application provides a terminal device that can perform the camera control method of the first aspect or one or more of its possible implementations, or perform the camera control method of the second aspect or one or more of its possible implementations.
In an eighth aspect, an embodiment of the present application provides a computer program product, including computer-readable code or a non-volatile computer-readable storage medium carrying computer-readable code, wherein when the computer-readable code runs in an electronic device, a processor in the electronic device performs the camera control method of the first aspect or one or more of its possible implementations, or performs the camera control method of the second aspect or one or more of its possible implementations.
These and other aspects of the present application will be more clearly understood from the following description of the embodiment(s).
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the present application together with the specification, and serve to explain the principles of the present application.
FIG. 1 shows a schematic diagram of an application scenario according to an embodiment of the present application.
FIG. 2 shows a flowchart of a camera control method according to an embodiment of the present application.
FIG. 3 shows a schematic interface diagram of a camera control method according to an embodiment of the present application.
FIG. 4 shows a schematic interface diagram of a camera control method according to an embodiment of the present application.
FIG. 5 shows a schematic interface diagram of a camera control method according to an embodiment of the present application.
FIG. 6 shows a schematic interface diagram of a camera application according to an embodiment of the present application.
FIG. 7 shows a schematic interface diagram of a camera application according to an embodiment of the present application.
FIG. 8 shows a flowchart of a camera control method according to an embodiment of the present application.
FIG. 9 shows a flowchart of a camera control method according to an embodiment of the present application.
FIG. 10 shows a flowchart of a camera control method according to an embodiment of the present application.
FIG. 11 shows a flowchart of a camera control method according to an embodiment of the present application.
FIG. 12 shows a flowchart of a camera control method according to an embodiment of the present application.
FIG. 13 shows a flowchart of a camera control method according to an embodiment of the present application.
FIG. 14 shows a structural diagram of a camera control apparatus according to an embodiment of the present application.
FIG. 15 shows a structural diagram of a camera control apparatus according to an embodiment of the present application.
FIG. 16 shows a schematic structural diagram of a terminal device according to an embodiment of the present application.
FIG. 17 shows a block diagram of the software structure of a terminal device according to an embodiment of the present application.
Detailed Description of Embodiments
Various exemplary embodiments, features, and aspects of the present application will be described in detail below with reference to the accompanying drawings. The same reference numerals in the drawings denote elements with the same or similar functions. Although various aspects of the embodiments are shown in the drawings, the drawings are not necessarily drawn to scale unless otherwise indicated.
The word "exemplary" used herein means "serving as an example, embodiment, or illustration". Any embodiment described herein as "exemplary" is not necessarily to be construed as superior to or better than other embodiments.
In addition, numerous specific details are given in the following detailed description to better explain the present application. Those skilled in the art should understand that the present application can also be implemented without certain specific details. In some instances, methods, means, elements, and circuits well known to those skilled in the art are not described in detail, so as to highlight the gist of the present application.
For distributed camera applications on terminal devices, the camera control methods in the prior art are often one-sided: on the basis of device interconnection, a local device invokes the camera application of a remote device but cannot control the camera remotely, and operation commands can only be triggered on the remote device. The resulting problem is that operation commands can only be triggered on a single device, which cannot support cooperative control of distributed cameras. In another prior-art approach, after invoking the camera application of the remote device, the local device shares the preview data of the remote device and supports a small number of camera operation commands in a remote-control manner, the operation commands being relayed by the camera application of the remote device and then delivered to the camera system. This approach is in fact exclusive control rather than true control of the remote device's camera; its problem is that the status information after the remote device executes a command cannot be synchronized to the local device, and the two sides cannot exchange information, so cooperative control cannot be precisely synchronized, which is inconvenient for users.
To solve the above technical problem, the embodiments of the present application provide a camera control method that enables interoperability between a first terminal device and a second terminal device. By receiving image data from the second terminal device and determining operation commands and status information, the first terminal device can display the picture and the execution status of operation commands, achieving information synchronization and interoperability; meanwhile, the first terminal device can also send operation commands to the second terminal device, achieving cooperative control between devices. The embodiments of the present application can support camera control of the same second terminal device by multiple first terminal devices, truly achieving shared control of the camera: commands on one device can be perceived by the other devices, the results of executing commands can be shared among devices, and concurrent triggering of commands by each device is supported. The user's intention is correctly understood through a coordination mechanism, concurrent commands between devices can cancel and update each other, and the response is fast, more flexible and rapid, bringing users a more convenient user experience.
FIG. 1 shows a schematic diagram of an application scenario according to an embodiment of the present application. In a possible implementation, the camera control method provided by the embodiments of the present application can be applied to a live-streaming scenario including a first terminal device and a second terminal device. Live-streaming scenarios may include e-commerce live streaming, educational live streaming, and the like. The first and second terminal devices may be devices with a wireless connection function; the wireless connection function means that the devices can be connected to each other through short-range wireless connection methods such as wireless fidelity (Wi-Fi) and Bluetooth, or through long-range wireless connection methods such as General Packet Radio Service (GPRS)/Code Division Multiple Access (CDMA), digital radio, spread-spectrum microwave, satellite communication, and mobile communication. The above terminal devices may also have the function of communicating through wired connections. The first and second terminal devices of the embodiments of the present application may be terminal devices of the same kind or of different kinds, and may have touchscreens, have non-touch screens, or have no screen at all. A touchscreen device can be controlled by clicking or sliding on the display screen with a finger or a stylus; a non-touchscreen device can be connected to input devices such as a mouse, a keyboard, or a touch panel and controlled through the input device; a device without a screen may be, for example, a Bluetooth speaker without a screen. For example, the terminal device of the present application may be a smartphone, a netbook, a tablet computer, a notebook computer, a television (TV), or a virtual reality device. The present application does not limit the type of live streaming or the type of terminal device here, and the embodiments of the present application can also be applied to application scenarios other than live streaming.
The following describes an application scenario in which live streaming is performed through a local device as the first terminal device, and a remote auxiliary device and a remote master device as second terminal devices. Those skilled in the art should understand that the embodiments of the present application are not limited to such an application scenario.
As shown in FIG. 1, the camera control method provided by the embodiments of the present application can be applied to a live-streaming scenario. For example, during e-commerce live streaming, the streamer can use the camera of a remote master device to shoot a panoramic picture of the live scene, and can also use another remote auxiliary device to shoot close-up pictures of items in the live scene, such as close-ups of goods for sale. On the client of a user watching the live stream, the live scene can be presented in a "picture-in-picture" manner: in the same interface, the user can see both the panoramic picture of the streamer and the close-up picture of the goods. In a possible implementation, the remote master device can be connected to one or more local devices, and the remote auxiliary device can also be connected to one or more local devices. Through the camera control method of the embodiments of the present application, camera commands and status information are synchronized among multiple devices. Staff on one or more local devices can help the streamer control the panoramic picture shot by the camera of the remote master device, and the streamer and staff can jointly operate on the panoramic picture; staff on one or more other local devices can also help the streamer control the close-up picture shot by the camera of the remote auxiliary device, and the streamer and staff can jointly operate on the close-up picture. For example, a staff member can click on a certain spot in the shared panoramic picture on a local device connected to the remote master device to trigger a focus operation. This operation command can be synchronized to the remote master device, and during focusing, the remote master device can display the execution status on the remote master device and on each local device connected to it, achieving cooperative control between devices; the same applies to the remote auxiliary device and the local devices connected to it. The camera control method provided by the embodiments of the present application can correctly understand the operation intentions of the users of the cooperating devices, making control of the connected cameras more flexible and convenient in this application scenario.
FIG. 2 shows a flowchart of a camera control method according to an embodiment of the present application. As shown in FIG. 2, the camera control system may include a first terminal device and a second terminal device. The first terminal device may include an interconnected camera application and camera system, and the second terminal device may likewise include an interconnected camera application and camera system. The camera application can be used to display the picture corresponding to the image data on the first and second terminal devices, and can also generate corresponding operation commands according to user operations; the camera system can be used to store information about image frames and data such as camera attributes and parameters, and can also process the image data according to operation commands in the first and second terminal devices. It should be noted that the embodiments of the present application do not limit the number of first terminal devices or second terminal devices. The division, functions, and communication methods of the modules of the first and second terminal devices shown in FIG. 2 are merely examples, and the embodiments of the present application do not limit them.
The embodiments of the present application do not limit the operation type of the operation command. The following description takes the application scenarios of focus and zoom operations as examples; those skilled in the art should understand that the embodiments of the present application are not limited to such application scenarios.
In an exemplary application scenario, user A can use the second terminal device to shoot (e.g., take photos or record videos), and user B can synchronously see the preview picture of user A's shooting process through the first terminal device and perform shooting operations on the displayed picture. The operation command of the shooting operation is sent to the second terminal device, which executes the operation command and returns the status information of its execution status to the first terminal device, so that the execution status is also displayed on the first terminal device. The first terminal device can thus synchronously reflect the picture and status of the second terminal device's shooting process, and user B can also precisely control the second terminal device.
In this application scenario, as shown in FIG. 2, the flow of the camera control method according to an embodiment of the present application includes:
Step S101: The camera application of the first terminal device sends the operation command to the camera system of the first terminal device.
The camera system of the first terminal device can be connected to the camera system of the second terminal device to send and receive image data, operation commands, and execution status information to and from each other. The operation type of the operation command may include one or more of focusing, zooming, turning the flash on or off, adjusting exposure, applying a filter, skin beautification, and skin smoothing. The operation type of the operation command is not limited here.
For example, when the user clicks somewhere on the interface of the camera application of the first terminal device, a focus operation command for that spot can be triggered. The corresponding focus operation command may include the operation type (focus operation), the operation region (the clicked region), and the operation mode (manual focus), and may also include the frame number of the image corresponding to the operation command. The camera application of the first terminal device can deliver the focus operation command to the camera system of the first terminal device.
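To illustrate how such an operation command might be represented in code, the minimal sketch below models the fields described above (operation type, region, mode, and the frame number of the corresponding image) as a simple data structure. All names (`OperationCommand`, `OpType`, the field names) are hypothetical illustrations, not part of the embodiments.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional, Tuple

class OpType(Enum):
    FOCUS = auto()
    ZOOM = auto()
    FLASH = auto()

@dataclass
class OperationCommand:
    op_type: OpType                            # e.g. OpType.FOCUS
    frame_number: int                          # frame displayed when the user acted
    region: Optional[Tuple[int, int]] = None   # tap position for manual focus
    mode: Optional[str] = None                 # e.g. "manual"
    zoom_factor: Optional[float] = None        # e.g. 3.0 for a 3x zoom

# A manual-focus command triggered by tapping the preview at (120, 340)
# while frame 25 was being displayed:
cmd = OperationCommand(OpType.FOCUS, frame_number=25,
                       region=(120, 340), mode="manual")
```

Carrying the frame number inside the command is what later allows the camera system to decide which of several concurrent commands is the newest.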
Step S102: The camera system of the first terminal device sends the operation command to the camera system of the second terminal device, and the operation command is executed in the camera system of the second terminal device.
Executing the operation command by the camera system of the second terminal device may include controlling the camera according to the operation command, changing the image data, sending the image data to the camera application of the second terminal device, and generating execution status information according to the execution of the operation command.
Before the operation command is executed in the camera system of the second terminal device, if two operation commands of the same type exist within a predetermined time interval, the camera system of the second terminal device can compare the frame numbers of the images corresponding to the two operation commands, execute the command with the larger frame number, and cancel the command with the smaller frame number, so as to correctly understand the user's operation intention and respond preferentially to the user's newest operation command.
The predetermined time interval can be determined according to the current frames per second of the camera system of the second terminal device. For example, at 30 frames per second, the predetermined time interval can be 100/3 milliseconds. The frame number represents the count of image frames.
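The relationship between the frame rate and the predetermined interval can be checked with a small calculation. The helper below is a hypothetical illustration, not part of the embodiments:

```python
def interval_ms(fps: int) -> float:
    """Duration of one frame in milliseconds: at `fps` frames per second,
    consecutive frames are 1000/fps milliseconds apart."""
    return 1000.0 / fps

# At 30 frames per second the interval is 1000/30 = 100/3 (about 33.33 ms),
# matching the 100/3 milliseconds mentioned above.
print(interval_ms(30))
```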
For example, after receiving the focus operation command sent by the camera system of the first terminal device, if no other focus-type operation command exists within the predetermined time interval, the camera system of the second terminal device can execute the focus command, modify the relevant parameters, generate image data, and display the corresponding picture in the camera application of the second terminal device.
Step S103: The camera system of the second terminal device sends the operation command to the camera application of the second terminal device.
The camera application of the second terminal device can display the picture corresponding to the image data on the second terminal device. After the camera application of the second terminal device receives the operation command, it can display information about the operation command being executed on the second terminal device.
For example, after receiving the focus command sent by the camera system of the second terminal device, the camera application of the second terminal device can display information about the current focus command at the corresponding position (e.g., highlighting the focus operation icon and displaying the manual focus position), so that the user of the second terminal device can synchronously know which operation command is currently being executed.
Step S301: The camera system of the second terminal device sends the execution status information to the camera system of the first terminal device.
The execution status information can indicate the execution status corresponding to the operation command.
For example, while executing the focus command, the camera system of the second terminal device can generate a focus-started state, a focusing-in-progress state, and a focus-completed state, and can send the status information corresponding to these three states to the camera system of the first terminal device respectively.
In a possible implementation, when the camera system of the second terminal device receives multiple operation commands of the same operation type within the predetermined time interval and determines to execute the command with the larger frame number, for example, determines to execute the operation command triggered by the camera application of the second terminal device and cancels the one triggered by the camera application of the first terminal device, then when the camera system of the second terminal device sends the execution status information to the camera system of the first terminal device, what is sent is the status information of the execution status corresponding to the operation command triggered by the camera application of the second terminal device.
Step S302: The camera system of the second terminal device sends the execution status information to the camera application of the second terminal device.
The execution status information corresponds to the operation command actually executed by the camera system of the second terminal device. After the camera system of the second terminal device sends the execution status information to the camera application of the second terminal device, the execution status information corresponding to the actually executed operation command can be synchronized on the camera application of the second terminal device.
For example, after the camera system of the second terminal device generates a 2x-zoom-completed state while executing a zoom command, and the camera application of the second terminal device receives the corresponding status information, the status information can be displayed on the camera application of the second terminal device.
Step S303: The camera system of the first terminal device sends the execution status information to the camera application of the first terminal device.
The execution status information corresponds to the operation command actually executed by the camera system of the second terminal device. After the camera system of the first terminal device sends the execution status information to the camera application of the first terminal device, the execution status information corresponding to the actually executed operation command can be synchronized on the camera application of the first terminal device.
For example, after the camera system of the second terminal device generates a 2x-zoom-completed state while executing a zoom command, and the camera application of the first terminal device receives the corresponding status information, the status information can be displayed on the camera application of the first terminal device.
In an exemplary application scenario, user A can use the second terminal device to shoot (e.g., take photos or record videos), and user B can synchronously see the preview picture of user A's shooting process through the first terminal device. When user A performs a shooting operation on the displayed picture on the second terminal device, the operation command of the shooting operation is synchronized to the first terminal device. After executing the operation command, the second terminal device can also return the status information of the execution status of the operation command to the first terminal device, so that the execution status is also displayed on the first terminal device. The first terminal device can thus synchronously reflect the picture and status of the second terminal device's shooting process, and users A and B can precisely control the shot picture at the same time.
In this application scenario, as shown in FIG. 2, the flow of the camera control method according to an embodiment of the present application includes:
Step S201: The camera application of the second terminal device sends the operation command to the camera system of the second terminal device.
For example, the user can click the zoom icon on the camera application interface of the second terminal device to zoom in 2x, which can trigger a zoom operation command. The corresponding operation command may include the operation type (zoom operation) and the zoom factor (2x), and may also include the frame number of the image corresponding to the operation command. In a possible implementation, information about the currently executed operation command (e.g., the zoom icon and the zoom factor indicator) can be displayed on the camera application interface of the second terminal device, and the camera application of the second terminal device can deliver the zoom operation command to the camera system of the second terminal device.
Step S202: The operation command is executed in the camera system of the second terminal device, and the operation command is sent to the camera system of the first terminal device.
Before the operation command is executed in the camera system of the second terminal device, if two operation commands of the same type exist within a predetermined time interval, the camera system of the second terminal device can compare the frame numbers of the images corresponding to the two operation commands, execute the command with the larger frame number, and cancel the command with the smaller frame number, so as to correctly understand the user's operation intention and respond preferentially to the user's newest operation command.
For example, after receiving the zoom command sent by the camera application of the second terminal device, if no other zoom-type operation command exists within the predetermined time interval, the camera system of the second terminal device can execute the zoom command, modify the relevant parameters, generate image data, and display the corresponding picture in the camera application of the second terminal device. To synchronize command information between the first and second terminal devices, the camera system of the second terminal device can send the operation command to the camera system of the first terminal device.
Step S203: The camera system of the first terminal device sends the operation command to the camera application of the first terminal device.
For example, after receiving the zoom command sent by the camera system of the first terminal device, the camera application of the first terminal device can display information about the current zoom command at the corresponding position, so that the user of the first terminal device can synchronously know the operation command currently being executed.
Step S301: The camera system of the second terminal device sends the execution status information to the camera system of the first terminal device.
Step S302: The camera system of the second terminal device sends the execution status information to the camera application of the second terminal device.
Step S303: The camera system of the first terminal device sends the execution status information to the camera application of the first terminal device.
FIG. 3 shows a schematic interface diagram of a camera control method according to an embodiment of the present application. As shown in FIG. 3, during shooting, after the camera of the second terminal device captures image data, it can send the image data to the first terminal device in real time, so that the first and second terminal devices synchronously display the shooting preview picture of the second terminal device. The user can also perform operations on the first terminal device and/or the second terminal device to trigger corresponding operation commands.
FIG. 4 shows a schematic interface diagram of a camera control method according to an embodiment of the present application. As shown in FIG. 4, the user can click on the image on the first terminal device to trigger a focus operation command. The first terminal device can send the focus operation command to the second terminal device, which executes the command. After executing the command, the second terminal device generates the corresponding execution statuses. For the focus operation, the corresponding execution statuses may include focus started, focusing in progress, and focus completed. The second terminal device can send these three execution statuses to the first terminal device respectively and display them on the first and second terminal devices.
For example, corresponding to the focus-started execution status, a square frame as shown in the figure can be displayed in the corresponding region of the images on the first and second terminal devices to indicate the focus region, and the square frame can be displayed in white to indicate that focusing has started; corresponding to the focusing-in-progress execution status, the square frame can be displayed with a scaling animation to indicate that the focus operation is currently in progress; corresponding to the focus-completed execution status, there may be two cases, focus success and focus failure: in the case of focus success, the square frame can be displayed in yellow, and in the case of focus failure, the square frame can be displayed in red.
During the focus operation, the second terminal device can send the image data to the first terminal device in real time. The first and second terminal devices can synchronously display the change in the shooting preview picture in which the local region becomes clearer after the focus operation.
FIG. 5 shows a schematic interface diagram of a camera control method according to an embodiment of the present application. As shown in FIG. 5, the user can click the zoom icon on the second terminal device and select the desired magnification, for example 3x, triggering a zoom operation command. The second terminal device can execute the zoom operation command and synchronize it to the first terminal device, which can display it on its interface, for example, changing the zoom icon to "3x" as shown in the figure. After the second terminal device finishes executing the 3x zoom command, it can generate status information whose execution status is 3x zoom completed, and can send the status information to the first terminal device, which can display it on its interface according to the status information.
In a possible implementation, if the second terminal device receives only one zoom-type command within the predetermined time interval, the first terminal device displays a 3x magnification both upon receiving the zoom operation command and upon receiving the status information. If the second terminal device receives two zoom-type commands within the predetermined time interval, for example, one triggered by the second terminal device to magnify the image with frame number 30 by 3x, and the other triggered by the first terminal device to magnify the image with frame number 25 by 2x, then the command with the larger frame number, that is, the 3x command, will be executed. On the first terminal device, before the 3x-completed status information is received, the 2x command can be displayed, e.g., "2x"; after the 3x-completed status information is received, the 3x status information can be displayed, e.g., "3x".
During the zoom operation, the second terminal device can send the image data to the first terminal device in real time. The first and second terminal devices can synchronously display the enlarged or reduced shooting preview picture after the zoom operation.
FIG. 6 and FIG. 7 show schematic interface diagrams of a camera application according to an embodiment of the present application. The following interfaces can be applied to the first terminal device or to the second terminal device; those skilled in the art should understand that the embodiments of the present application are not limited to such application scenarios. As shown in FIG. 6, the top of the interface may include icons for operations such as metering, turning the flash on or off, selecting a color mode, and settings; at the bottom of the interface, the camera mode can be selected, which may include aperture, night scene, portrait, photo, video, professional, and more camera modes.
In a possible implementation, as shown in the figure, the zoom-triggering operation can be performed on the right side of the terminal device interface. By tapping the zoom operation icon, the zoom factor can be selected in a semicircular frame, for example, any value from 1x to 10x. After the user selects the zoom factor, the current zoom factor can be displayed in the operation icon.
FIG. 7 shows a schematic interface diagram of a camera application according to an embodiment of the present application. As shown in FIG. 7, the top of the interface may include icons for operations such as metering, turning the flash on or off, selecting a color mode, and settings; at the bottom of the interface, the camera mode can be selected, which may include aperture, night scene, portrait, photo, video, professional, and more camera modes.
In a possible implementation, the zoom-triggering operation can also be performed at the top of the terminal device interface. The user can select the zoom factor, for example, any value from 1x to 10x, by sliding a progress bar. After the user selects the zoom factor, the current zoom factor can be displayed in the operation icon.
It should be noted that FIG. 6 and FIG. 7 list only two interface display methods. Those skilled in the art should understand that the embodiments of the present application are not limited to such interface display methods; the interface display can also be implemented by popping up a prompt box, displaying text on the interface, and the like.
FIG. 8 shows a flowchart of a camera control method according to an embodiment of the present application. The method can be applied to a first terminal device. As shown in FIG. 8, the method may include:
S11: Receive image data from a second terminal device, the image data being captured by the second terminal device during a shooting process.
S12: Determine an operation command and status information, the operation command being an operation command for the shooting process of the second terminal device, and the status information indicating the execution status of the operation command by the second terminal device.
S13: Display a picture according to the image data.
S14: Display the execution status of the operation command on the picture according to the operation command and the status information.
According to the embodiments of the present application, the first terminal device receives image data from the second terminal device, determines an operation command and status information, displays a picture according to the image data, and displays the execution status of the operation command on the picture according to the operation command and the status information, so that the first terminal device can keep its image, operation commands, and status information synchronized and interoperable with the second terminal device. Both ends can thus control the camera cooperatively, users can understand the current status of the distributed camera more clearly, control accuracy is improved, and the user experience is enhanced.
"During a shooting process" may refer to the stage in which the second terminal device is shooting and has not yet completed shooting. For example, after the second terminal device turns on the camera and enters the process of taking a photo or recording a video, and before the shooting button or the button to end recording is clicked, the second terminal device is in the shooting process, during which the second terminal device can display a shooting preview picture.
The image data may include image frame data. For example, during the shooting process of the second terminal device, when no image processing such as focusing or zooming is performed, the image frame data may be the frames of image data captured by the camera during shooting, where each frame of image data may be of RGB type, YUV type, or JPG format, which is not limited in the embodiments of the present application. Alternatively, the image frame data may also be the frames of image data processed by zooming, focusing, and the like. The picture displayed by the first terminal device after receiving the image frame data may be a shooting preview picture synchronized with the second terminal device. The image data may further include frame information (e.g., the frame number of each image frame) and basic image attribute information (e.g., image size, resolution, etc.).
The user can trigger operation commands through operations such as clicking and dragging. For example, icons corresponding to operation commands can be displayed on the first and second terminal devices, and the user can trigger an operation command by clicking an icon, so as to operate on the image. The operation commands triggered by users and the execution status after the second terminal device executes the commands can be synchronized between devices.
An operation command may include parameters such as type, region, and mode. For example, the operation command of a focus operation may correspond to the type being focus, the region being the region clicked by the user, and the mode being manual focus. The operation command of a zoom operation may correspond to the type being zoom and the mode being magnify by X times. The operation type of the operation command may include, but is not limited to, focusing, zooming, turning the flash on or off, adjusting exposure, applying a filter, skin beautification, skin smoothing, and the like. An operation command may be triggered by the first terminal device or by the second terminal device.
Different operation commands can have their own execution statuses. For example, the execution statuses of a focus operation may include a focus-started state, a focusing-in-progress state, and a focus-completed state; the execution statuses of a zoom operation may include zoom-to-specified-factor completed; and the execution statuses of a flash on/off operation may include flash turned on or flash turned off. The execution statuses can be set according to the characteristics of the operation and the display needs, which is not limited in the embodiments of the present application.
In a possible implementation, the execution status of the operation command can be displayed on the interface of the first terminal device to indicate states such as focus started, focusing in progress, focus completed, zoom-to-specified-factor completed, or flash turned on/off, so that the user understands how the operation command is executed on the second terminal device. The display methods may include displaying icons, popping up prompt boxes, displaying text on the interface, and the like, for example, as shown in FIG. 3 to FIG. 7, which is not limited in the embodiments of the present application. Taking the focus operation as an example, the position of the manual focus operation in the picture can be obtained according to the operation command, and icons corresponding to the focus-started, focusing-in-progress, and focus-completed states can be displayed at that position in the preview picture according to the status information. Taking the zoom operation as an example, the operation type being magnification and the magnification factor being 3x can be obtained according to the operation command, and according to the status information "3x magnification completed", an indication of the current zoom factor such as that shown in FIG. 6 or FIG. 7 can be displayed in the preview picture.
FIG. 9 shows a flowchart of a camera control method according to an embodiment of the present application. As shown in FIG. 9, the operation command may include a first operation command generated in response to an operation on the first terminal device, and determining the operation command and status information may include:
S21: Send the first operation command to the second terminal device.
S22: Receive status information sent by the second terminal device that indicates the execution status of the first operation command.
According to the embodiments of the present application, by sending the first operation command to the second terminal device and receiving the status information sent by the second terminal device indicating the execution status of the first operation command, the operation commands and status information of the camera can be synchronized between the first and second terminal devices when the operation command is triggered by the first terminal device, so that the first terminal device can control the camera with results and status shared, making camera control in distributed camera scenarios more flexible, faster, and more accurate.
For example, the first terminal device triggers a first operation command to perform a focus operation. While the second terminal device executes the first operation command, the first terminal device can receive the status information of focus started, focusing in progress, and focus completed (focus success/focus failure). The first terminal device can synchronously display the shooting preview picture of the second terminal device and display the second terminal device's execution status of the focus operation.
In a possible implementation, the operation command may include a second operation command generated in response to an operation on the second terminal device, and determining the operation command and status information may include: receiving the second operation command and the status information sent by the second terminal device.
According to the embodiments of the present application, by receiving the second operation command and the status information sent by the second terminal device, the operation command and status information can be synchronized with the first terminal device when the operation command is triggered by the second terminal device, so that the user of the first terminal device can learn the current camera status in real time and operate the camera on this basis, achieving information interoperability among multiple sides.
In a possible implementation, determining the operation command and status information may include:
receiving status information sent by the second terminal device, the status information indicating the execution status of a target operation command determined by the second terminal device from multiple operation commands, the multiple operation commands being operation commands of the same operation type generated in response to operations on the second terminal device or on one or more first terminal devices, and the target operation command being the operation command with the largest corresponding frame number among the multiple operation commands.
According to the embodiments of the present application, when multiple operation commands triggered between devices conflict, the latest operation command is responded to while the other party is allowed to cancel the command being executed, so that the user's intention can be correctly selected and the corresponding operation command chosen. When multiple parties cooperate, responses are more flexible and rapid, commands from multiple sides can cancel and update each other, and the user experience is improved.
The frame number of the image data can be used to determine the order in which two images were produced. The frame number of the image frame corresponding to the operation that generated an operation command can be used as the frame number corresponding to that operation command. For example, when the first or second terminal device displays the shooting preview picture and the user clicks the preview picture to focus, the frame number of the image frame displayed when the first or second terminal device detects the click operation can be used as the frame number corresponding to the operation command generated by the click. The larger the frame number corresponding to an operation command, the later the operation of that command occurred; the operation command with the largest frame number can correspond to the newest operation. At every predetermined time interval, the frame numbers corresponding to operation commands of the same type (e.g., all focus commands) can be compared, and the operation command of that type with the largest frame number in the current interval can be selected as the target operation command. For example, among all focus operation commands within the current 30 milliseconds, the one with the largest frame number is selected, and the second terminal device executes that command. The predetermined time interval can be selected as needed, for example, determined according to the current frames per second of the camera system of the second terminal device; for example, at 30 frames per second, the predetermined time interval can be 100/3 milliseconds.
In one example, multiple first terminal devices may all generate operation commands of the same type (for example, multiple users trigger zoom operation commands on different first terminal devices at the same time). These operation commands are all sent to the second terminal device, which can determine the frame numbers corresponding to the operation commands sent by the first terminal devices and select the target operation command with the largest frame number for execution.
In another example, one or more first terminal devices and the second terminal device itself may all generate operation commands of the same type. The second terminal device can determine the frame numbers corresponding to the operation commands sent by the first terminal devices and the commands generated by itself, and select the target operation command with the largest frame number for execution.
The second terminal device can send the status information indicating the execution status of the target operation command to all first terminal devices communicating with the second terminal device, including first terminal devices that triggered operation commands and those that did not. If the second terminal device or a first terminal device triggered an operation command that is not the one with the largest frame number of its type, that operation command is not executed, which is equivalent to being cancelled.
For example, the second terminal device detects two zoom operation commands within 30 milliseconds: one triggered by the second terminal device to magnify the image with frame number 30 by 3x, and the other triggered by the first terminal device to magnify the image with frame number 25 by 2x. The target operation command is then the one with the larger frame number, that is, the 3x command. On the first terminal device, before the 3x-completed status information is received, the 2x operation command can be displayed; after the 3x-completed status information is received, the 3x status information can be displayed.
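The selection rule described above (within one interval, group commands of the same type and keep only the one with the largest frame number) can be sketched as follows. The function name and the tuple layout are illustrative assumptions, not the actual implementation of the embodiments:

```python
def select_target_commands(commands):
    """Given (op_type, frame_number, payload) tuples collected within one
    predetermined time interval, return the target command per operation
    type: the command whose frame number is largest, i.e. the newest
    operation. All other same-type commands are effectively cancelled."""
    targets = {}
    for op_type, frame_number, payload in commands:
        current = targets.get(op_type)
        if current is None or frame_number > current[1]:
            targets[op_type] = (op_type, frame_number, payload)
    return targets

# Two conflicting zoom commands within one interval: 3x at frame 30 (from
# the second terminal device) and 2x at frame 25 (from a first terminal
# device). The 3x command at frame 30 is kept; the 2x command is dropped.
window = [("zoom", 30, "3x"), ("zoom", 25, "2x")]
target = select_target_commands(window)["zoom"]
```

The status information of this single target command is then what gets broadcast to every connected first terminal device.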
图10示出了根据本申请一实施例的相机控制方法的流程图,方法可用于第二终端设备,如图10所示,方法可包括:
S31,将图像数据发送给第一终端设备,所述图像数据是所述第二终端设备在拍摄过程中采集的;
S32,确定操作命令和状态信息,所述操作命令为针对所述第二终端设备的所述拍摄过程的操作命令,所述状态信息表示所述第二终端设备对所述操作命令的执行状态;
S33,将所述状态信息发送给第一终端设备。
根据本申请实施例,通过第二终端设备将图像数据发送给第一终端设备,确定操作命令和状态信息,将所述状态信息发送给第一终端设备,可以使得多侧设备的图像、操作命令和 状态信息同步与互通,使得对端也可以对相机进行协同控制,可以实现分布式的相机控制。
图11示出了根据本申请一实施例的相机控制方法的流程图,如图11所示,操作命令可包括响应于针对第一终端设备的操作生成的第一操作命令,确定操作命令和状态信息可包括:
S41,接收所述第一终端设备发送的所述第一操作命令,
S42,执行所述第一操作命令,得到表示对所述第一操作命令的执行状态的状态信息。
根据本申请实施例,通过接收第一终端设备发送的所述第一操作命令,执行第一操作命令,得到表示对所述第一操作命令的执行状态的状态信息,可实现在操作命令是由第一终端设备触发的情况下,同步操作命令并执行命令,实现多侧设备对相机的控制,通过将执行后的执行状态进行多侧同步,可以使得多侧设备的用户可以直观的了解相机的当前状态,方便用户进行后续的操作,使控制更加准确,提升了用户体验。
图12示出了根据本申请一实施例的相机控制方法的流程图,如图12所示,确定操作命令和状态信息可包括:
S51,响应于针对第二终端设备的操作,生成第二操作命令,
S52,执行所述第二操作命令,得到表示所述第二操作命令的执行状态的状态信息。
根据本申请的实施例,通过响应于针对第二终端设备的操作,生成第二操作命令,执行第二操作命令,得到表示第二操作命令的执行状态的状态信息,可以实现在操作命令是由第二终端设备触发的情况下,执行操作命令并得到相应的状态信息,以同步给多侧设备,使得多侧设备可以对相机进行协同控制。
图13示出了根据本申请一实施例的相机控制方法的流程图,如图13所示,确定操作命令和状态信息可包括:
S61,从多个操作命令中确定目标操作命令,所述多个操作命令是响应于针对第二终端设备、或者一个或多个第一终端设备的操作生成的、操作类型相同的操作命令,所述目标操作命令是所述多个操作命令中、对应帧号最大的操作命令;
S62,执行所述目标操作命令,得到表示所述目标操作命令的执行状态的状态信息。
根据本申请实施例,可以实现在触发的多个操作命令产生冲突时,第二终端设备可以通过并发响应的策略,执行最新的操作命令,同时允许对侧取消正在执行的命令,可以正确选择用户的意图,并选取相应的操作命令,使得多方协同时,响应得更加灵活与迅速,多侧命令可以相互取消,相互更新,提高用户的体验。
关于第二终端设备的示例性描述可参见上文,此处不再重复赘述。
图14示出了根据本申请一实施例的相机控制装置的结构图,该装置可用于第一终端设备,如图14所示,所述装置可包括:
图像数据接收模块1401,用于接收来自第二终端设备的图像数据,所述图像数据是第二终端设备在拍摄过程中采集的;
第一信息确定模块1402,用于确定操作命令和状态信息,所述操作命令为针对所述第二终端设备的所述拍摄过程的操作命令,所述状态信息表示所述第二终端设备对所述操作命令的执行状态;
画面显示模块1403,用于根据所述图像数据显示画面;
执行状态显示模块1404,用于根据所述操作命令和状态信息,在所述画面上显示操作命令的执行状态。
根据本申请实施例,通过第一终端设备接收来自第二终端设备的图像数据,确定操作命令和状态信息,根据所述图像数据显示画面,根据所述操作命令和状态信息,在所述画面上显示操作命令的执行状态,使得第一终端设备可以实现与第二终端设备的图像同步、操作命令和状态信息同步与互通,使得两端可以对相机进行协同控制,也可以使得用户能更清楚的了解分布式相机的当前状态信息,提高控制的准确度,提升用户体验。
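下面的Python片段模拟第一终端设备侧各模块的配合:接收图像数据并显示画面,再根据操作命令和状态信息在画面上叠加执行状态(类名、方法名均为示意性假设,真实实现需调用设备的显示接口):

```python
class FirstTerminal:
    """第一终端设备侧处理流程的示意:以列表模拟显示输出。"""
    def __init__(self):
        self.display_log = []

    def on_image_data(self, frame):
        # 对应图像数据接收模块与画面显示模块:根据图像数据显示画面
        self.display_log.append(("frame", frame["frame_no"]))

    def on_status_info(self, command, status):
        # 对应执行状态显示模块:在画面上显示操作命令的执行状态
        self.display_log.append(("status", command, status))

t = FirstTerminal()
t.on_image_data({"frame_no": 30})
t.on_status_info("zoom_x3", "done")
# display_log 依次记录了画面帧与"放大三倍已完成"的执行状态
```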
在所述相机控制装置的一种可能的实现方式中,所述操作命令包括响应于针对第一终端设备的操作生成的第一操作命令,所述第一信息确定模块,包括:第一操作命令发送子模块,用于将所述第一操作命令发送至所述第二终端设备;第一信息接收子模块,用于接收所述第二终端设备发送的、表示对所述第一操作命令的执行状态的状态信息。
根据本申请实施例,通过将第一操作命令发送至第二终端设备,接收第二终端设备发送的、表示对第一操作命令的执行状态的状态信息,可以实现在操作命令是由第一终端设备触发的情况下,第一终端设备与第二终端设备间相机的操作命令与状态信息的同步,使得第一终端设备可以对相机进行控制,且结果和状态共享,使得分布式相机场景下的相机控制更加灵活与迅速,控制更加准确。
在所述相机控制装置的一种可能的实现方式中,所述操作命令包括响应于针对第二终端设备的操作生成的第二操作命令,所述第一信息确定模块,包括:第二信息接收子模块,用于接收所述第二终端设备发送的所述第二操作命令,和所述状态信息。
根据本申请实施例,通过接收第二终端设备发送的所述第二操作命令,和所述状态信息,可以使得在操作命令是由第二终端设备触发的情况下,操作命令和状态信息可以与第一终端设备同步,使得第一终端设备的用户可以实时的了解当前相机的状态,在此基础上可以实现对相机的操作,实现了多侧信息的互通。
在所述相机控制装置的一种可能的实现方式中,所述第一信息确定模块,包括:状态信息接收子模块,用于接收所述第二终端设备发送的状态信息,所述状态信息表示第二终端设备对多个操作命令中确定的目标操作命令的执行状态,所述多个操作命令是响应于针对第二终端设备、或者一个或多个第一终端设备的操作生成的、操作类型相同的操作命令,所述目标操作命令是所述多个操作命令中、对应帧号最大的操作命令。
根据本申请实施例,可以实现在设备间触发的多个操作命令产生冲突时,响应最新的操作命令,同时允许另外一方取消正在执行的命令,可以正确选择用户的意图,并选取相应的操作命令,使得多方协同时,响应得更加灵活与迅速,多侧命令可以相互取消,相互更新,提高用户的体验。
在所述相机控制装置的一种可能的实现方式中,所述操作命令的操作类型包括对焦、缩放、开启或关闭闪光灯、调整曝光、使用滤镜、美肤、磨皮中的一种或多种。
根据本申请实施例,可以丰富第一终端设备上可对图像进行的操作,并通过第一终端设备与第二终端设备的界面显示,实现分布式相机控制的场景,使得用户的体验更好。
图15示出了根据本申请一实施例的相机控制装置的结构图,该装置可用于第二终端设备,如图15所示,所述装置可包括:
图像数据发送模块1501,用于将图像数据发送给第一终端设备,所述图像数据是所述第二终端设备在拍摄过程中采集的;
第二信息确定模块1502,用于确定操作命令和状态信息,所述操作命令为针对所述第二终端设备的所述拍摄过程的操作命令,所述状态信息表示所述第二终端设备对所述操作命令的执行状态;
状态信息发送模块1503,用于将所述状态信息发送给第一终端设备。
根据本申请实施例,通过第二终端设备将图像数据发送给第一终端设备,确定操作命令和状态信息,将所述状态信息发送给第一终端设备,可以使得多侧设备的图像、操作命令和状态信息同步与互通,使得对端也可以对相机进行协同控制,可以实现分布式的相机控制。
在所述的相机控制装置的一种可能的实现方式中,所述操作命令包括响应于针对第一终端设备的操作生成的第一操作命令,所述第二信息确定模块,包括:操作命令接收子模块,用于接收所述第一终端设备发送的所述第一操作命令,第一操作命令执行子模块,用于执行所述第一操作命令,得到表示对所述第一操作命令的执行状态的状态信息。
根据本申请实施例,通过接收第一终端设备发送的所述第一操作命令,执行第一操作命令,得到表示对所述第一操作命令的执行状态的状态信息,可实现在操作命令是由第一终端设备触发的情况下,同步操作命令并执行命令,实现多侧设备对相机的控制,通过将执行后的执行状态进行多侧同步,可以使得多侧设备的用户可以直观的了解相机的当前状态,方便用户进行后续的操作,使控制更加准确,提升了用户体验。
在所述的相机控制装置的一种可能的实现方式中,所述第二信息确定模块,包括:操作命令生成子模块,用于响应于针对第二终端设备的操作,生成第二操作命令,第二操作命令执行子模块,用于执行所述第二操作命令,得到表示所述第二操作命令的执行状态的状态信息。
根据本申请的实施例,通过响应于针对第二终端设备的操作,生成第二操作命令,执行第二操作命令,得到表示第二操作命令的执行状态的状态信息,可以实现在操作命令是由第二终端设备触发的情况下,执行操作命令并得到相应的状态信息,以同步给多侧设备,使得多侧设备可以对相机进行协同控制。
在所述的相机控制装置的一种可能的实现方式中,所述第二信息确定模块,包括:目标操作命令确定子模块,用于从多个操作命令中确定目标操作命令,所述多个操作命令是响应于针对第二终端设备、或者一个或多个第一终端设备的操作生成的、操作类型相同的操作命令,所述目标操作命令是所述多个操作命令中、对应帧号最大的操作命令;目标操作命令执行子模块,用于执行所述目标操作命令,得到表示所述目标操作命令的执行状态的状态信息。
根据本申请实施例,可以实现在触发的多个操作命令产生冲突时,第二终端设备可以通过并发响应的策略,执行最新的操作命令,同时允许对侧取消正在执行的命令,可以正确选择用户的意图,并选取相应的操作命令,使得多方协同时,响应得更加灵活与迅速,多侧命令可以相互取消,相互更新,提高用户的体验。
在所述的相机控制装置的一种可能的实现方式中,所述操作命令的操作类型包括对焦、缩放、开启或关闭闪光灯、调整曝光、使用滤镜、美肤、磨皮中的一种或多种。
根据本申请实施例,可以丰富第二终端设备上可对图像进行的操作,并通过第一终端设备与第二终端设备的界面显示,实现分布式相机控制的场景,使得用户的体验更好。
图16示出根据本申请一实施例的终端设备的结构示意图。以终端设备是手机为例,图16示出了手机200的结构示意图。
手机200可以包括处理器210,外部存储器接口220,内部存储器221,USB接口230,充电管理模块240,电源管理模块241,电池242,天线1,天线2,移动通信模块251,无线通信模块252,音频模块270,扬声器270A,受话器270B,麦克风270C,耳机接口270D,传感器模块280,按键290,马达291,指示器292,摄像头293,显示屏294,以及SIM卡接口295等。其中传感器模块280可以包括陀螺仪传感器280A,加速度传感器280B,接近光传感器280G、指纹传感器280H,触摸传感器280K(当然,手机200还可以包括其它传感器,比如温度传感器,压力传感器、距离传感器、磁传感器、环境光传感器、气压传感器、骨传导传感器等,图中未示出)。
可以理解的是,本申请实施例示意的结构并不构成对手机200的具体限定。在本申请另一些实施例中,手机200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器210可以包括一个或多个处理单元,例如:处理器210可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(Neural-network Processing Unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。其中,控制器可以是手机200的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器210中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器210中的存储器为高速缓冲存储器。该存储器可以保存处理器210刚用过或循环使用的指令或数据。如果处理器210需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器210的等待时间,因而提高了系统的效率。
处理器210可以运行本申请实施例提供的相机控制方法,以便于支持多个第一终端设备对同一第二终端设备的相机控制,真正做到了相机的共享控制,对于某一设备上的命令,设备间可以互相感知,执行命令的结果设备间可以共享,同时支持各个设备并发触发命令,通过协同机制正确地理解用户的意图,设备间并发的命令可以相互取消、相互更新,响应速度快,更加灵活与迅速,给用户带来了更便捷的用户体验。处理器210可以包括不同的器件,比如集成CPU和GPU时,CPU和GPU可以配合执行本申请实施例提供的相机控制方法,比如相机控制方法中部分算法由CPU执行,另一部分算法由GPU执行,以得到较快的处理效率。
显示屏294用于显示图像,视频等。显示屏294包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Mini-LED,Micro-LED,Micro-OLED,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,手机200可以包括1个或N个显示屏294,N为大于1的正整数。显示屏294可用于显示由用户输入的信息或提供给用户的信息以及各种图形用户界面(graphical user interface,GUI)。例如,显示屏294可以显示照片、视频、网页、或者文件等。再例如,显示屏294可以显示图形用户界面。其中,图形用户界面上包括状态栏、可隐藏的导航栏、时间和天气小组件(widget)、以及应用的图标,例如浏览器图标等。状态栏中包括运营商名称(例如中国移动)、移动网络(例如4G)、时间和剩余电量。导航栏中包括后退(back)键图标、主屏幕(home)键图标和前进键图标。此外,可以理解的是,在一些实施例中,状态栏中还可以包括蓝牙图标、Wi-Fi图标、外接设备图标等。还可以理解的是,在另一些实施例中,图形用户界面中还可以包括Dock栏,Dock栏中可以包括常用的应用图标等。当处理器210检测到用户的手指(或触控笔等)针对某一应用图标的触摸事件后,响应于该触摸事件,打开与该应用图标对应的应用的用户界面,并在显示屏294上显示该应用的用户界面。
在本申请实施例中,显示屏294可以是一个一体的柔性显示屏,也可以采用两个刚性屏以及位于两个刚性屏之间的一个柔性屏组成的拼接显示屏。
当处理器210运行本申请实施例提供的相机控制方法后,终端设备可以通过天线1、天线2或者USB接口与其他的终端设备建立连接,并根据本申请实施例提供的相机控制方法传输数据以及控制显示屏294显示相应的图形用户界面。
摄像头293(前置摄像头或者后置摄像头,或者一个摄像头既可作为前置摄像头,也可作为后置摄像头)用于捕获静态图像或视频。通常,摄像头293可以包括感光元件比如镜头组和图像传感器,其中,镜头组包括多个透镜(凸透镜或凹透镜),用于采集待拍摄物体反射的光信号,并将采集的光信号传递给图像传感器。图像传感器根据所述光信号生成待拍摄物体的原始图像。
内部存储器221可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器210通过运行存储在内部存储器221的指令,从而执行手机200的各种功能应用以及数据处理。内部存储器221可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,应用程序(比如相机应用,微信应用等)的代码等。存储数据区可存储手机200使用过程中所创建的数据(比如相机应用采集的图像、视频等)等。
内部存储器221还可以存储本申请实施例提供的相机控制方法对应的一个或多个计算机程序1310。该一个或多个计算机程序1310被存储在上述内部存储器221中并被配置为被该一个或多个处理器210执行,该一个或多个计算机程序1310包括指令,上述指令可以用于执行如图2、图8-13相应实施例中的各个步骤,该计算机程序1310可以包括图像数据接收模块1401,用于接收来自第二终端设备的图像数据,所述图像数据是第二终端设备在拍摄过程中采集的;第一信息确定模块1402,用于确定操作命令和状态信息,所述操作命令为针对所述第二终端设备的所述拍摄过程的操作命令,所述状态信息表示所述第二终端设备对所述操作命令的执行状态;画面显示模块1403,用于根据所述图像数据显示画面;执行状态显示模块1404,用于根据所述操作命令和状态信息,在所述画面上显示操作命令的执行状态。该计算机程序1310还可以包括图像数据发送模块1501,用于将图像数据发送给第一终端设备,所述图像数据是所述第二终端设备在拍摄过程中采集的;第二信息确定模块1502,用于确定操作命令和状态信息,所述操作命令为针对所述第二终端设备的所述拍摄过程的操作命令,所述状态信息表示所述第二终端设备对所述操作命令的执行状态;状态信息发送模块1503,用于将所述状态信息发送给第一终端设备(未示出)。
此外,内部存储器221可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
当然,本申请实施例提供的相机控制方法的代码还可以存储在外部存储器中。这种情况下,处理器210可以通过外部存储器接口220运行存储在外部存储器中的相机控制方法的代码。
下面介绍传感器模块280的功能。
陀螺仪传感器280A,可以用于确定手机200的运动姿态。在一些实施例中,可以通过陀螺仪传感器280A确定手机200围绕三个轴(即,x,y和z轴)的角速度。即陀螺仪传感器280A可以用于检测手机200当前的运动状态,比如抖动还是静止。
当本申请实施例中的显示屏为可折叠屏时,陀螺仪传感器280A可用于检测作用于显示屏294上的折叠或者展开操作。陀螺仪传感器280A可以将检测到的折叠操作或者展开操作作为事件上报给处理器210,以确定显示屏294的折叠状态或展开状态。
加速度传感器280B可检测手机200在各个方向上(一般为三轴)加速度的大小。即加速度传感器280B可以用于检测手机200当前的运动状态,比如抖动还是静止。当本申请实施例中的显示屏为可折叠屏时,加速度传感器280B可用于检测作用于显示屏294上的折叠或者展开操作。加速度传感器280B可以将检测到的折叠操作或者展开操作作为事件上报给处理器210,以确定显示屏294的折叠状态或展开状态。
接近光传感器280G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。手机通过发光二极管向外发射红外光。手机使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定手机附近有物体。当检测到不充分的反射光时,手机可以确定手机附近没有物体。当本申请实施例中的显示屏为可折叠屏时,接近光传感器280G可以设置在可折叠的显示屏294的第一屏上,接近光传感器280G可根据红外信号的光程差来检测第一屏与第二屏的折叠角度或者展开角度的大小。
陀螺仪传感器280A(或加速度传感器280B)可以将检测到的运动状态信息(比如角速度)发送给处理器210。处理器210基于运动状态信息确定当前是手持状态还是脚架状态(比如,角速度不为0时,说明手机200处于手持状态)。
指纹传感器280H用于采集指纹。手机200可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
触摸传感器280K,也称“触控面板”。触摸传感器280K可以设置于显示屏294,由触摸传感器280K与显示屏294组成触摸屏,也称“触控屏”。触摸传感器280K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏294提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器280K也可以设置于手机200的表面,与显示屏294所处的位置不同。
示例性的,手机200的显示屏294显示主界面,主界面中包括多个应用(比如相机应用、微信应用等)的图标。用户通过触摸传感器280K点击主界面中相机应用的图标,触发处理器210启动相机应用,打开摄像头293。显示屏294显示相机应用的界面,例如取景界面。
手机200的无线通信功能可以通过天线1,天线2,移动通信模块251,无线通信模块252,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。手机200中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块251可以提供应用在手机200上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块251可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块251可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块251还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块251的至少部分功能模块可以被设置于处理器210中。在一些实施例中,移动通信模块251的至少部分功能模块可以与处理器210的至少部分模块被设置在同一个器件中。在本申请实施例中,移动通信模块251还可以用于与其它终端设备进行信息交互,比如发送操作命令、发送状态信息和接收图像数据。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器270A,受话器270B等)输出声音信号,或通过显示屏294显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器210,与移动通信模块251或其他功能模块设置在同一个器件中。
无线通信模块252可以提供应用在手机200上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块252可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块252经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器210。无线通信模块252还可以从处理器210接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。本申请实施例中,无线通信模块252,用于在处理器210的控制下与其他终端设备之间传输数据,比如,处理器210运行本申请实施例提供的相机控制方法时,处理器可以控制无线通信模块252向其他终端设备发送操作命令和状态信息,还可以接收图像数据,以实现相机的控制,为用户提供直观的视觉反馈,避免用户错误操作和反复操作,提高操作效率。
另外,手机200可以通过音频模块270,扬声器270A,受话器270B,麦克风270C,耳机接口270D,以及应用处理器等实现音频功能。例如音乐播放,录音等。手机200可以接收按键290输入,产生与手机200的用户设置以及功能控制有关的键信号输入。手机200可以利用马达291产生振动提示(比如来电振动提示)。手机200中的指示器292可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。手机200中的SIM卡接口295用于连接SIM卡。SIM卡可以通过插入SIM卡接口295,或从SIM卡接口295拔出,实现和手机200的接触和分离。
应理解,在实际应用中,手机200可以包括比图16所示的更多或更少的部件,本申请实施例不作限定。图示手机200仅是一个范例,并且手机200可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。图中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
终端设备的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明终端设备的软件结构。
图17是本申请实施例的终端设备的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图17所示,应用程序包可以包括电话、相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图17所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。窗口管理器还可以用于检测是否存在本申请实施例的相机控制操作,例如对焦操作、缩放操作、开启或关闭闪光灯操作。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供终端设备的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,终端设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
本申请的实施例提供了一种相机控制装置,包括:处理器以及用于存储处理器可执行指令的存储器;其中,所述处理器被配置为执行所述指令时实现上述方法。
本申请的实施例提供了一种非易失性计算机可读存储介质,其上存储有计算机程序指令,所述计算机程序指令被处理器执行时实现上述方法。
本申请的实施例提供了一种计算机程序产品,包括计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,当所述计算机可读代码在电子设备的处理器中运行时,所述电子设备中的处理器执行上述方法。
以上实施例的示例性说明可参见方法实施例部分,此处不再重复描述。
计算机可读存储介质可以是可以保持和存储由指令执行设备使用的指令的有形设备。计算机可读存储介质例如可以是――但不限于――电存储设备、磁存储设备、光存储设备、电磁存储设备、半导体存储设备或者上述的任意合适的组合。计算机可读存储介质的更具体的例子(非穷举的列表)包括:便携式计算机盘、硬盘、随机存取存储器(Random Access Memory,RAM)、只读存储器(Read Only Memory,ROM)、可擦式可编程只读存储器(Erasable Programmable Read-Only Memory,EPROM或闪存)、静态随机存取存储器(Static Random-Access Memory,SRAM)、便携式压缩盘只读存储器(Compact Disc Read-Only Memory,CD-ROM)、数字多功能盘(Digital Video Disc,DVD)、记忆棒、软盘、机械编码设备、例如其上存储有指令的打孔卡或凹槽内凸起结构、以及上述的任意合适的组合。
这里所描述的计算机可读程序指令或代码可以从计算机可读存储介质下载到各个计算/处理设备,或者通过网络、例如因特网、局域网、广域网和/或无线网下载到外部计算机或外部存储设备。网络可以包括铜传输电缆、光纤传输、无线传输、路由器、防火墙、交换机、网关计算机和/或边缘服务器。每个计算/处理设备中的网络适配卡或者网络接口从网络接收计算机可读程序指令,并转发该计算机可读程序指令,以供存储在各个计算/处理设备中的计算机可读存储介质中。
用于执行本申请操作的计算机程序指令可以是汇编指令、指令集架构(Instruction Set Architecture,ISA)指令、机器指令、机器相关指令、微代码、固件指令、状态设置数据、或者以一种或多种编程语言的任意组合编写的源代码或目标代码,所述编程语言包括面向对象的编程语言—诸如Smalltalk、C++等,以及常规的过程式编程语言—诸如“C”语言或类似的编程语言。计算机可读程序指令可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络—包括局域网(Local Area Network,LAN)或广域网(Wide Area Network,WAN)—连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。在一些实施例中,通过利用计算机可读程序指令的状态信息来个性化定制电子电路,例如可编程逻辑电路、现场可编程门阵列(Field-Programmable Gate Array,FPGA)或可编程逻辑阵列(Programmable Logic Array,PLA),该电子电路可以执行计算机可读程序指令,从而实现本申请的各个方面。
这里参照根据本申请实施例的方法、装置(系统)和计算机程序产品的流程图和/或框图描述了本申请的各个方面。应当理解,流程图和/或框图的每个方框以及流程图和/或框图中各方框的组合,都可以由计算机可读程序指令实现。
这些计算机可读程序指令可以提供给通用计算机、专用计算机或其它可编程数据处理装置的处理器,从而生产出一种机器,使得这些指令在通过计算机或其它可编程数据处理装置的处理器执行时,产生了实现流程图和/或框图中的一个或多个方框中规定的功能/动作的装置。也可以把这些计算机可读程序指令存储在计算机可读存储介质中,这些指令使得计算机、可编程数据处理装置和/或其他设备以特定方式工作,从而,存储有指令的计算机可读介质则包括一个制造品,其包括实现流程图和/或框图中的一个或多个方框中规定的功能/动作的各个方面的指令。
也可以把计算机可读程序指令加载到计算机、其它可编程数据处理装置、或其它设备上,使得在计算机、其它可编程数据处理装置或其它设备上执行一系列操作步骤,以产生计算机实现的过程,从而使得在计算机、其它可编程数据处理装置、或其它设备上执行的指令实现流程图和/或框图中的一个或多个方框中规定的功能/动作。
附图中的流程图和框图显示了根据本申请的多个实施例的装置、系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段或指令的一部分,所述模块、程序段或指令的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个连续的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。
也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行相应的功能或动作的硬件(例如电路或ASIC(Application Specific Integrated Circuit,专用集成电路))来实现,或者可以用硬件和软件的组合,如固件等来实现。
尽管在此结合各实施例对本发明进行了描述,然而,在实施所要求保护的本发明过程中,本领域技术人员通过查看所述附图、公开内容、以及所附权利要求书,可理解并实现所述公开实施例的其它变化。在权利要求中,“包括”(comprising)一词不排除其他组成部分或步骤,“一”或“一个”不排除多个的情况。单个处理器或其它单元可以实现权利要求中列举的若干项功能。相互不同的从属权利要求中记载了某些措施,但这并不表示这些措施不能组合起来产生良好的效果。
以上已经描述了本申请的各实施例,上述说明是示例性的,并非穷尽性的,并且也不限于所披露的各实施例。在不偏离所说明的各实施例的范围和精神的情况下,对于本技术领域的普通技术人员来说许多修改和变更都是显而易见的。本文中所用术语的选择,旨在最好地解释各实施例的原理、实际应用或对市场中的技术的改进,或者使本技术领域的其它普通技术人员能理解本文披露的各实施例。

Claims (14)

  1. 一种相机控制方法,其特征在于,所述方法用于第一终端设备,所述方法包括:
    接收来自第二终端设备的图像数据,所述图像数据是第二终端设备在拍摄过程中采集的;
    确定操作命令和状态信息,所述操作命令为针对所述第二终端设备的所述拍摄过程的操作命令,所述状态信息表示所述第二终端设备对所述操作命令的执行状态;
    根据所述图像数据显示画面;
    根据所述操作命令和状态信息,在所述画面上显示操作命令的执行状态。
  2. 根据权利要求1所述的方法,其特征在于,所述操作命令包括响应于针对第一终端设备的操作生成的第一操作命令,
    确定操作命令和状态信息,包括:
    将所述第一操作命令发送至所述第二终端设备;
    接收所述第二终端设备发送的、表示对所述第一操作命令的执行状态的状态信息。
  3. 根据权利要求1所述的方法,其特征在于,所述操作命令包括响应于针对第二终端设备的操作生成的第二操作命令,
    确定操作命令和状态信息,包括:
    接收所述第二终端设备发送的所述第二操作命令,和所述状态信息。
  4. 根据权利要求1所述的方法,其特征在于,确定操作命令和状态信息,包括:
    接收所述第二终端设备发送的状态信息,所述状态信息表示第二终端设备对多个操作命令中确定的目标操作命令的执行状态,所述多个操作命令是响应于针对第二终端设备、或者一个或多个第一终端设备的操作生成的、操作类型相同的操作命令,所述目标操作命令是所述多个操作命令中、对应帧号最大的操作命令。
  5. 根据权利要求1-4任意一项所述的方法,其特征在于,所述操作命令的操作类型包括对焦、缩放、开启或关闭闪光灯、调整曝光、使用滤镜、美肤、磨皮中的一种或多种。
  6. 一种相机控制方法,其特征在于,所述方法用于第二终端设备,所述方法包括:
    将图像数据发送给第一终端设备,所述图像数据是所述第二终端设备在拍摄过程中采集的;
    确定操作命令和状态信息,所述操作命令为针对所述第二终端设备的所述拍摄过程的操作命令,所述状态信息表示所述第二终端设备对所述操作命令的执行状态;
    将所述状态信息发送给第一终端设备。
  7. 根据权利要求6所述的方法,其特征在于,所述操作命令包括响应于针对第一终端设备的操作生成的第一操作命令,
    确定操作命令和状态信息,包括:
    接收所述第一终端设备发送的所述第一操作命令,
    执行所述第一操作命令,得到表示对所述第一操作命令的执行状态的状态信息。
  8. 根据权利要求6所述的方法,其特征在于,确定操作命令和状态信息,包括:
    响应于针对第二终端设备的操作,生成第二操作命令,
    执行所述第二操作命令,得到表示所述第二操作命令的执行状态的状态信息。
  9. 根据权利要求6所述的方法,其特征在于,确定操作命令和状态信息,包括:
    从多个操作命令中确定目标操作命令,所述多个操作命令是响应于针对第二终端设备、或者一个或多个第一终端设备的操作生成的、操作类型相同的操作命令,所述目标操作命令是所述多个操作命令中、对应帧号最大的操作命令;
    执行所述目标操作命令,得到表示所述目标操作命令的执行状态的状态信息。
  10. 根据权利要求6-9任意一项所述的方法,其特征在于,所述操作命令的操作类型包括对焦、缩放、开启或关闭闪光灯、调整曝光、使用滤镜、美肤、磨皮中的一种或多种。
  11. 一种相机控制装置,其特征在于,所述装置用于第一终端设备,所述装置包括:
    图像数据接收模块,用于接收来自第二终端设备的图像数据,所述图像数据是第二终端设备在拍摄过程中采集的;
    第一信息确定模块,用于确定操作命令和状态信息,所述操作命令为针对所述第二终端设备的所述拍摄过程的操作命令,所述状态信息表示所述第二终端设备对所述操作命令的执行状态;
    画面显示模块,用于根据所述图像数据显示画面;
    执行状态显示模块,用于根据所述操作命令和状态信息,在所述画面上显示操作命令的执行状态。
  12. 一种相机控制装置,其特征在于,所述装置用于第二终端设备,所述装置包括:
    图像数据发送模块,用于将图像数据发送给第一终端设备,所述图像数据是所述第二终端设备在拍摄过程中采集的;
    第二信息确定模块,用于确定操作命令和状态信息,所述操作命令为针对所述第二终端设备的所述拍摄过程的操作命令,所述状态信息表示所述第二终端设备对所述操作命令的执行状态;
    状态信息发送模块,用于将所述状态信息发送给第一终端设备。
  13. 一种相机控制装置,其特征在于,所述装置包括:
    处理器;
    用于存储处理器可执行指令的存储器;
    其中,所述处理器被配置为执行所述指令时实现权利要求1-5任意一项所述的方法,或者实现权利要求6-10任意一项所述的方法。
  14. 一种非易失性计算机可读存储介质,其上存储有计算机程序指令,其特征在于,所述计算机程序指令被处理器执行时实现权利要求1-5中任意一项所述的方法,或者,实现权利要求6-10任意一项所述的方法。
PCT/CN2021/134826 2020-12-09 2021-12-01 相机控制方法、装置和存储介质 WO2022121751A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/256,504 US20240056673A1 (en) 2020-12-09 2021-12-01 Camera Control Method and Apparatus, and Storage Medium
EP21902458.5A EP4246940A4 (en) 2020-12-09 2021-12-01 CAMERA CONTROL METHOD AND APPARATUS, AND STORAGE MEDIUM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011451138.5 2020-12-09
CN202011451138.5A CN114615362B (zh) 2020-12-09 2020-12-09 相机控制方法、装置和存储介质

Publications (1)

Publication Number Publication Date
WO2022121751A1 true WO2022121751A1 (zh) 2022-06-16

Family

ID=81857121

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/134826 WO2022121751A1 (zh) 2020-12-09 2021-12-01 相机控制方法、装置和存储介质

Country Status (4)

Country Link
US (1) US20240056673A1 (zh)
EP (1) EP4246940A4 (zh)
CN (1) CN114615362B (zh)
WO (1) WO2022121751A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007028254A (ja) * 2005-07-19 2007-02-01 Nippon Micro Systems Kk 遠隔カメラ制御システム
US20110273570A1 (en) * 2010-05-10 2011-11-10 Sony Corporation Control device, camera, method and computer program storage device
CN102710549A (zh) * 2012-06-12 2012-10-03 上海量明科技发展有限公司 通过摄像建立通信连接关系的方法、终端及系统
CN107992255A (zh) * 2017-12-01 2018-05-04 珠海格力电器股份有限公司 一种生成图像的方法及服务器
CN108076379A (zh) * 2016-11-10 2018-05-25 阿里巴巴集团控股有限公司 多屏互动实现方法及装置
CN110162255A (zh) * 2019-05-30 2019-08-23 腾讯科技(深圳)有限公司 单机程序的运行方法、装置、设备及存储介质
CN110945863A (zh) * 2018-05-23 2020-03-31 华为技术有限公司 一种拍照方法和终端设备

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104349032A (zh) * 2013-07-23 2015-02-11 中兴通讯股份有限公司 一种照相的方法及移动终端
US9516220B2 (en) * 2014-10-02 2016-12-06 Intel Corporation Interactive video conferencing
CN104320586A (zh) * 2014-11-07 2015-01-28 广东欧珀移动通信有限公司 拍照方法、系统及终端
JP6602080B2 (ja) * 2015-07-24 2019-11-06 キヤノン株式会社 撮影システム及びその制御方法、コンピュータプログラム
US11539785B2 (en) * 2019-02-22 2022-12-27 Microsoft Technology Licensing, Llc Simultaneous cross-device application platform
CN110971823B (zh) * 2019-11-29 2021-06-29 维沃移动通信(杭州)有限公司 一种参数调整方法及终端设备
CN111988528B (zh) * 2020-08-31 2022-06-24 北京字节跳动网络技术有限公司 拍摄方法、装置、电子设备及计算机可读存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007028254A (ja) * 2005-07-19 2007-02-01 Nippon Micro Systems Kk 遠隔カメラ制御システム
US20110273570A1 (en) * 2010-05-10 2011-11-10 Sony Corporation Control device, camera, method and computer program storage device
CN102710549A (zh) * 2012-06-12 2012-10-03 上海量明科技发展有限公司 通过摄像建立通信连接关系的方法、终端及系统
CN108076379A (zh) * 2016-11-10 2018-05-25 阿里巴巴集团控股有限公司 多屏互动实现方法及装置
CN107992255A (zh) * 2017-12-01 2018-05-04 珠海格力电器股份有限公司 一种生成图像的方法及服务器
CN110945863A (zh) * 2018-05-23 2020-03-31 华为技术有限公司 一种拍照方法和终端设备
CN110162255A (zh) * 2019-05-30 2019-08-23 腾讯科技(深圳)有限公司 单机程序的运行方法、装置、设备及存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4246940A4

Also Published As

Publication number Publication date
CN114615362A (zh) 2022-06-10
EP4246940A4 (en) 2024-05-08
EP4246940A1 (en) 2023-09-20
CN114615362B (zh) 2023-07-11
US20240056673A1 (en) 2024-02-15

Similar Documents

Publication Publication Date Title
WO2022100237A1 (zh) 投屏显示方法及相关产品
CN114205522B (zh) 一种长焦拍摄的方法及电子设备
WO2022105759A1 (zh) 视频处理方法、装置及存储介质
WO2022100239A1 (zh) 设备协作方法、装置、系统、电子设备和存储介质
JP2023514631A (ja) インタフェースレイアウト方法、装置、及び、システム
CN111666055B (zh) 数据的传输方法及装置
KR20140112914A (ko) 휴대단말기의 어플리케이션 정보 처리 장치 및 방법
WO2022088974A1 (zh) 一种遥控方法、电子设备及系统
WO2022063159A1 (zh) 一种文件传输的方法及相关设备
US20240143262A1 (en) Splicing Display Method, Electronic Device, and System
WO2022105758A1 (zh) 道路识别方法以及装置
WO2022134691A1 (zh) 一种终端设备中啸叫处理方法及装置、终端
CN115442509A (zh) 拍摄方法、用户界面及电子设备
WO2023231697A1 (zh) 一种拍摄方法及相关设备
WO2023202407A1 (zh) 应用的显示方法、装置及存储介质
WO2022194005A1 (zh) 一种跨设备同步显示的控制方法及系统
WO2022105716A1 (zh) 基于分布式控制的相机控制方法及终端设备
WO2022105793A1 (zh) 图像处理方法及其设备
WO2022121751A1 (zh) 相机控制方法、装置和存储介质
WO2022166614A1 (zh) 针对控件操作的执行方法、装置、存储介质和控件
WO2023280077A1 (zh) 图像校正方法、装置及存储介质
WO2023169237A1 (zh) 一种截屏方法、电子设备及系统
WO2023083052A1 (zh) 一种交互方法及装置
WO2023231696A1 (zh) 一种拍摄方法及相关设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21902458

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 18256504

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2021902458

Country of ref document: EP

Effective date: 20230613

NENP Non-entry into the national phase

Ref country code: DE