WO2023065077A1 - Remote control method, photographing device, control device, system and storage medium - Google Patents

Remote control method, photographing device, control device, system and storage medium

Info

Publication number
WO2023065077A1
WO2023065077A1 (PCT/CN2021/124479)
Authority
WO
WIPO (PCT)
Prior art keywords
control
image
control device
display screen
shooting
Prior art date
Application number
PCT/CN2021/124479
Other languages
English (en)
French (fr)
Inventor
冯鲁桥
丁晓飞
耶方明
赵波
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2021/124479
Priority to CN202180100510.4A
Publication of WO2023065077A1

Definitions

  • the present application relates to the technical field of data transmission, and in particular to a remote control method, a photographing device, a control device, a system and a storage medium.
  • In the related art, the shooting device can carry a camera on a gimbal and use an image transmission module to wirelessly transmit real-time image data to a remote device, so that the real-time image captured by the camera is displayed on the display screen of the remote device.
  • In addition, the remote device has built-in control logic for transmitting control commands to the shooting device to control the shooting parameters of the camera or the control parameters of the gimbal.
  • the present application provides a remote control method, a photographing device, a control device, a system, and a storage medium.
  • In a first aspect, the present application provides a remote control method applicable to a shooting device, where the shooting device has established a wireless communication connection with a control device, the shooting device includes a gimbal, and the gimbal is used to carry a camera. The method includes:
  • synthesizing the image captured by the camera and the image corresponding to a user operation interface to obtain a composite image; sending the composite image to the control device so that the display screen of the control device displays the composite image; acquiring touch data sent by the control device, the touch data including the position of a touch point; and generating a control instruction according to the touch data to control the shooting device.
  • In a second aspect, the present application provides a remote control method applicable to a control device, where the control device has established a wireless communication connection with a shooting device, the shooting device includes a gimbal, and the gimbal is used to carry a camera. The method includes:
  • acquiring a composite image sent by the shooting device, the composite image being synthesized from the image captured by the camera and the image corresponding to a user operation interface; displaying the composite image on the display screen of the control device; acquiring, according to a detected touch operation of the user on the display screen of the control device, corresponding touch data including the position of a touch point; and sending the touch data to the shooting device so that the shooting device generates a control instruction according to the touch data to control the shooting device.
  • In a third aspect, the present application provides a photographing device that has established a wireless communication connection with a control device, the photographing device includes a gimbal used to carry a camera, and the photographing device further includes a memory and a processor;
  • the memory is used to store a computer program;
  • the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
  • synthesizing the image captured by the camera and the image corresponding to a user operation interface to obtain a composite image; sending the composite image to the control device so that the display screen of the control device displays the composite image; acquiring touch data sent by the control device, the touch data including the position of a touch point; and generating a control instruction according to the touch data to control the photographing device.
  • In a fourth aspect, the present application provides a control device that has established a wireless communication connection with a shooting device, the shooting device includes a gimbal used to carry a camera, and the control device includes a display screen, a memory and a processor;
  • the memory is used to store a computer program;
  • the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
  • acquiring a composite image sent by the shooting device, the composite image being synthesized from the image captured by the camera and the image corresponding to a user operation interface; displaying the composite image on the display screen of the control device; acquiring, according to a detected touch operation of the user on the display screen of the control device, corresponding touch data including the position of a touch point; and sending the touch data to the shooting device so that the shooting device generates a control instruction according to the touch data to control the shooting device.
  • In a fifth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the remote control method of the first aspect.
  • In a sixth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the remote control method of the second aspect.
  • In a seventh aspect, the present application provides a shooting control system, which includes the shooting device described in the third aspect above and the control device described in the fourth aspect above.
  • In the embodiments of the present application, the image captured by the camera is synthesized with the image corresponding to the user operation interface, and the composite image is sent to the control device for display. The control device detects the user's touch operation, obtains touch data including the position of the touch point, and sends it to the shooting device, and the shooting device generates a control instruction based on that touch data to control the shooting device. As a result, there is no need to build excessive control logic into the control device, local interpretation of the touch data by the control device is reduced, and the implementation is simple and efficient.
  • FIG. 1 is a schematic structural diagram of a photographing device provided by an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of the remote control method at the photographing device end provided by an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of the remote control method at the control device end provided by an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of the system provided by an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a photographing device provided by an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of a control device provided by an embodiment of the present application.
  • In the related art, the shooting device carries a camera on a gimbal and uses an image transmission module to wirelessly transmit real-time image data to a remote device, and the real-time image captured by the camera is displayed on the display screen of the remote device.
  • The remote device has built-in control logic for transmitting control commands to the shooting device to control the shooting parameters of the camera or the control parameters of the gimbal.
  • FIG. 1 is a schematic structural diagram of a shooting device provided by an embodiment of the present application.
  • the shooting device includes a pan-tilt, and the pan-tilt is used to carry a camera.
  • the gimbal can adjust the camera in pitch, yaw, roll and vertical directions, so that the position of the camera remains stable or changes in a desired direction.
  • the camera can shoot the surrounding scenery, and the user needs to adjust the shooting frame rate, resolution, exposure parameters, focus parameters and other parameters based on the specific shooting requirements.
  • For such shooting devices, if the control logic is built into the remote device, many control channels need to be established; the implementation is too complicated, the workload is too large, and control failures are likely.
  • the embodiment of the present application provides a remote control method, a shooting device, a control device, a system, and a storage medium.
  • The shooting device and the control device have established a wireless communication connection, and the shooting device includes a gimbal.
  • The gimbal is used to carry the camera.
  • the remote control method in the embodiment of the present application includes a remote control method at the shooting device end (referred to as the method at the shooting device end) and a remote control method at the control device end (referred to as the method at the control device end).
  • The method at the shooting device end includes: synthesizing the image captured by the camera and the image corresponding to the user operation interface to obtain a composite image; sending the composite image to the control device, so that the display screen of the control device displays the composite image; acquiring touch data sent by the control device, where the touch data includes the position of a touch point; and generating a control instruction according to the touch data to control the shooting device.
  • The method at the control device end includes: acquiring the composite image sent by the shooting device; displaying the composite image on the display screen of the control device; acquiring, according to the detected touch operation of the user on the display screen of the control device, the corresponding touch data including the position of the touch point; and sending the touch data to the shooting device.
  • Because the image captured by the camera is synthesized with the image corresponding to the user operation interface and the composite image is sent to the control device for display, the control device only needs to detect the user's touch operation, obtain touch data including the position of the touch point, and send it to the shooting device; the shooting device then generates control instructions based on the touch data to control itself. There is thus no need to build excessive control logic into the control device, local interpretation of the touch data by the control device is reduced, the implementation is simple, and the efficiency is higher.
  • Although the shooting-device-side method and the control-device-side method are described together below, they are independent of each other: each of them provides part of the technical support for avoiding excessive control logic in the control device and reducing local interpretation of touch data by the control device; when the two methods are combined, no excessive control logic needs to be built into the control device, local interpretation of touch data by the control device is reduced, and the implementation is simple and efficient.
  • Referring to FIG. 2 to FIG. 4: FIG. 2 is a schematic flowchart of the method at the shooting device end provided by an embodiment of the present application, FIG. 3 is a schematic flowchart of the method at the control device end provided by an embodiment of the present application, and FIG. 4 is a schematic flowchart of the system provided by an embodiment of the present application, showing the shooting-device-side and control-device-side methods combined.
  • a wireless communication connection is established between the shooting device and the control device, and the shooting device includes a pan-tilt for carrying a camera.
  • the photographing device may be a handheld photographing device, a drone, an unmanned vehicle, an unmanned ship, and the like.
  • the shooting device method includes: step S101, step S102, step S103 and step S104; the control device method includes: step S201, step S202, step S203 and step S204.
  • Step S101 Synthesize the image captured by the camera and the image corresponding to the user operation interface to obtain a composite image.
  • Step S102 Send the synthesized image to a control device, so that a display screen of the control device displays the synthesized image.
  • The user operation interface is the medium for interaction and information exchange between the device and the user. It converts between the internal form of information and a form acceptable to humans, so that the user can conveniently and effectively operate the hardware, achieving two-way interaction and completing the desired work.
  • the composite image includes both the image captured by the camera and the image corresponding to the user operation interface. After the synthesized image is obtained, it is sent to the control device to be displayed on the display screen of the control device.
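As an illustration of how the composition in step S101 could be carried out, the following Python sketch overlays a partially transparent user-interface image onto a camera frame. It is a minimal example under stated assumptions, not the application's implementation: the function name, the use of Pillow, and the assumption that both inputs are RGBA images are illustrative choices.

```python
from PIL import Image

def compose_frame(camera_frame: Image.Image, ui_overlay: Image.Image) -> Image.Image:
    """Overlay the user-operation-interface image onto the camera frame (cf. step S101).

    Assumes the UI overlay is transparent wherever the camera picture
    should show through; both images are converted to RGBA.
    """
    frame = camera_frame.convert("RGBA")
    overlay = ui_overlay.convert("RGBA")
    if overlay.size != frame.size:
        overlay = overlay.resize(frame.size)      # align the overlay with the frame
    return Image.alpha_composite(frame, overlay)  # the composite image to be sent
```

The composite returned here is what would then be encoded and sent over the image transmission link in step S102.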
  • Step S201 Obtain a synthesized image sent by the photographing device, where the synthesized image is synthesized based on an image captured by the camera and an image corresponding to a user operation interface.
  • Step S202 displaying the synthesized image on a display screen of the control device.
  • Step S203 Obtain corresponding touch data according to the detected touch operation of the user on the display screen of the control device, where the touch data includes the position of the touch point.
  • Step S204 Send the touch data to the photographing device, so that the photographing device generates a control instruction according to the touch data to control the photographing device.
  • After the control device acquires the composite image sent by the shooting device, it displays the composite image on its display screen, and the user can see, on the display screen of the control device, the image captured by the camera of the shooting device together with the image corresponding to the shooting device's user operation interface.
  • When the user wants to remotely control the shooting device from the control device, the user can perform a touch operation on the display screen of the control device; the control device obtains the corresponding touch data, including the position of the touch point, and then sends the touch data to the shooting device.
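A hypothetical sketch of the control-device side (steps S201 to S204) is shown below. Because the text does not describe the actual wireless link or screen API, the transport and display functions are passed in as callables, and the JSON wire format is an assumption.

```python
import json
from typing import Callable, Optional

def control_device_loop(recv_composite: Callable[[], bytes],
                        show_on_screen: Callable[[bytes], None],
                        poll_touch: Callable[[], Optional[dict]],
                        send_to_shooting_device: Callable[[bytes], None]) -> None:
    """Display whatever the shooting device sends and forward raw touch data
    without interpreting it locally (steps S201-S204)."""
    while True:
        composite = recv_composite()     # S201: composite image from the shooting device
        show_on_screen(composite)        # S202: display it as-is
        touch = poll_touch()             # S203: e.g. {"type": "press", "points": [[640, 360]]}
        if touch is not None:
            send_to_shooting_device(json.dumps(touch).encode("utf-8"))  # S204
```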
  • Touch operations may include tap (click) operations and touch operations.
  • A tap operation is the complete process of the user pressing down, staying for a certain period of time (long or short), and then lifting the finger; this whole process is regarded as one tap.
  • the position of the touch point can be the position of a single touch point.
  • the touch data may also include a touch type, such as press and lift.
  • the touch operation involves multi-touch, that is, the user operates multiple points on the display screen. At this time, the position of the touch point may be the position of multiple touch points.
  • In this case, the touch data may also include a touch type, such as press, lift, and move.
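The touch data described above could be represented as in the sketch below. This is only one possible encoding; the field names and the JSON serialization are assumptions for illustration, not a format defined by the application.

```python
import json
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TouchData:
    """Touch data sent from the control device to the shooting device."""
    touch_type: str                    # "press", "lift" or "move"
    points: List[Tuple[int, int]]      # one entry per touch point (pixel coordinates)

    def to_wire(self) -> bytes:
        return json.dumps({"type": self.touch_type,
                           "points": [list(p) for p in self.points]}).encode("utf-8")

single_tap = TouchData("press", [(640, 360)])              # a tap at one point
two_finger_move = TouchData("move", [(300, 400), (700, 400)])  # multi-touch with two points
```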
  • Step S103 Obtain touch data sent by the control device, where the touch data includes the positions of touch points.
  • Step S104 Generate a control instruction according to the touch data to control the photographing device.
  • The photographing device itself stores the correspondence between touch data and control instructions, for example: touch data 1 corresponds to control instruction 1, touch data 2 to control instruction 2, touch data 3 to control instruction 3, and so on. After the photographing device acquires the touch data sent by the control device, it generates the corresponding control instruction from this correspondence and then controls the photographing device in response to that instruction.
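One simple way to realize such a correspondence on the shooting device is to keep a table that maps screen regions of the user operation interface to control instructions and hit-test the received touch position against it. This is a sketch only; the region coordinates and instruction names are purely illustrative and do not come from the application.

```python
from typing import Dict, Optional, Tuple

# (left, top, right, bottom) rectangle of a UI control -> the control instruction it stands for
UI_REGIONS: Dict[Tuple[int, int, int, int], str] = {
    (20, 20, 160, 60):      "SET_RESOLUTION_4K",
    (20, 70, 160, 110):     "SET_RESOLUTION_1080P",
    (20, 120, 160, 160):    "SET_FRAME_RATE_60",
    (1100, 600, 1260, 660): "GIMBAL_RECENTER",
}

def instruction_for_touch(x: int, y: int) -> Optional[str]:
    """Generate the control instruction corresponding to a touch point (cf. step S104)."""
    for (left, top, right, bottom), instruction in UI_REGIONS.items():
        if left <= x <= right and top <= y <= bottom:
            return instruction
    return None   # the touch did not hit any control of the user operation interface
```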
  • In the embodiments of the present application, the image captured by the camera is synthesized with the image corresponding to the user operation interface, and the composite image is sent to the control device for display. The control device detects the user's touch operation, obtains touch data including the position of the touch point, and sends it to the shooting device; the shooting device generates control instructions based on that touch data to control itself. There is therefore no need to build excessive control logic into the control device, local interpretation of the touch data by the control device is reduced, the implementation is simple, and the efficiency is higher.
  • the photographing device is provided with a body display screen for displaying the synthesized image.
  • As shown in FIG. 1, the shooting device is provided with a body display screen. Because the body display screen of the shooting device and the display screen of the remote control device can display the composite image essentially in sync, this provides technical support for convenient multi-person operation.
  • the method at the shooting device side further includes: controlling the shooting device according to the detected operation of the user on the display screen of the body.
  • In this embodiment, the user can operate on the body display screen, and when the user's operation is detected, the shooting device is controlled accordingly. Because the shooting device itself stores the correspondence between touch data and control instructions, touch data produced on the body display screen can be interpreted directly into a control instruction to control the shooting device, which is convenient for the user of the shooting device.
  • In an embodiment, the user of the shooting device and the user of the control device can divide the work between the body display screen and the display screen of the control device. For example, one user performs touch operations on the display screen of the control device to remotely control the shooting device and adjust the control parameters of the gimbal, while another user performs touch operations on the body display screen to control the shooting device at close range and adjust the shooting parameters of the camera; or, conversely, one user remotely adjusts the shooting parameters of the camera from the control device while the other adjusts the control parameters of the gimbal at close range on the body display screen.
  • In the method at the shooting device end, step S104 of generating a control instruction according to the touch data may include: generating a gimbal control instruction and/or a shooting control instruction according to the touch data.
  • In the method at the control device end, in step S204, the control instruction generated according to the touch data may likewise include a gimbal control instruction and/or a shooting control instruction.
  • The gimbal control instruction is used to control the gimbal, and the shooting control instruction is used to control shooting.
  • Control of the gimbal may be control of the pitch axis, the yaw (pan) axis, the roll axis and the Z axis; control of shooting may be control of parameters such as frame rate, resolution, exposure parameters and focus parameters.
  • For example, the user operation interface of the shooting device includes multiple different resolutions and/or multiple different frame rates, which are displayed on the composite image. The user can tap the position of a resolution and/or frame rate on the display screen of the control device, which yields the position of the corresponding touch point; after the shooting device obtains that position, it can generate the control instruction for the resolution and/or frame rate selected by the user.
  • In an embodiment, in addition to the composite image sent by the shooting device, the display screen of the control device may also display a second user operation interface of the control device itself, which includes second operation controls, so that the user can control the control device by operating the second operation controls on the second user operation interface. That is, the image corresponding to the user operation interface above is the image corresponding to a first user operation interface, the first user operation interface includes first operation controls for controlling the shooting device, and the display screen of the control device is also used to display the second user operation interface, which includes second operation controls for controlling the control device.
  • In an embodiment, controlling the control device may be controlling a mode of the control device.
  • For example, when the mode is a viewing mode, the display screen of the control device displays only the image captured by the camera, so that the user of the control device sees a relatively clean picture.
  • When the mode is a control mode, the display screen of the control device displays the composite image, which includes the image captured by the camera and the image corresponding to the user operation interface, so that the user of the control device can control the shooting device in time.
  • In an embodiment, the second user operation interface further includes a download operation control or a frequency-binding operation control, and the method further includes: generating an image download control instruction according to the detected user operation on the download operation control; or generating a frequency-binding control instruction according to the detected user operation on the frequency-binding operation control; and sending the image download control instruction or the frequency-binding control instruction to the shooting device.
  • In this embodiment, the control device retains the commonly used download operation control or frequency-binding operation control; by operating these controls the user remotely controls the shooting device, so as to download images from the shooting device or make the shooting device enter the frequency-binding mode to establish a wireless communication connection with the control device.
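A sketch of how the control device might turn taps on its own second user operation interface into the two retained instructions is given below. The control identifiers and the message format are assumptions for illustration only.

```python
import json
from typing import Callable

def handle_second_ui_tap(control_id: str,
                         send_to_shooting_device: Callable[[bytes], None]) -> None:
    """Generate an image download or frequency-binding control instruction
    from a tap on the control device's own (second) user operation interface."""
    if control_id == "download":
        message = {"command": "IMAGE_DOWNLOAD"}      # download images from the shooting device
    elif control_id == "bind":
        message = {"command": "FREQUENCY_BINDING"}   # make the shooting device enter binding mode
    else:
        return                                       # not a second-interface control
    send_to_shooting_device(json.dumps(message).encode("utf-8"))
```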
  • the second user operation interface is displayed on the periphery of the synthesized image.
  • The size of the display screen of the control device may be larger than the size of the original composite image, so that neither the display of the composite image nor the display of the second user operation interface is affected. This layout also helps the user distinguish the two interfaces and quickly find the required operation control.
  • the shooting device-side method may further include: acquiring a control instruction sent by the control device, where the control instruction includes an image download control instruction or a frequency binding control instruction.
  • In this embodiment, the control device can retain the commonly used image download control instruction or frequency-binding control instruction to remotely control the shooting device, so as to download images from the shooting device or make the shooting device enter the frequency-binding mode to establish a wireless communication connection with the control device.
  • In the method at the shooting device end, step S101 of synthesizing the image captured by the camera and the image corresponding to the user operation interface to obtain the composite image may further include:
  • synthesizing the image captured by the camera and the image corresponding to the user operation interface; and
  • processing the synthesized image according to the size of the display screen of the control device to obtain a composite image suited to that display screen size.
  • the display screen of the control device can display the original composited image in a 1:1 ratio. If the size of the display screen of the control device is greater than or smaller than the size of the original synthesized image, the synthesized image may be processed according to the size of the display screen of the control device to obtain an image suitable for the size of the display screen of the control device. composite image.
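The resizing step could look like the following sketch, which scales the composite image to the control device's screen while preserving aspect ratio. Whether to pad, crop or stretch instead is a product decision not specified in the text; the use of Pillow is an assumption.

```python
from PIL import Image

def fit_to_screen(composite: Image.Image, screen_w: int, screen_h: int) -> Image.Image:
    """Return a version of the composite image suited to the control device's screen size."""
    if composite.size == (screen_w, screen_h):
        return composite                              # 1:1 display, nothing to do
    scale = min(screen_w / composite.width, screen_h / composite.height)
    new_size = (max(1, int(composite.width * scale)), max(1, int(composite.height * scale)))
    return composite.resize(new_size, Image.LANCZOS)
```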
  • the number of the control devices is one or more. In this way, one or more users can remotely control the shooting device through one or more control devices.
  • In the method at the control device end, step S202 of displaying the composite image on the display screen of the control device may include: hiding a corresponding first area of the composite image according to a preset hiding requirement, and displaying the remaining second area of the composite image on the display screen of the control device. In this way, a specific region of the composite image can be hidden when the composite image is displayed, according to the user's needs.
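Hiding the first area before display could be as simple as masking it out, as in the sketch below. The rectangle format and the choice to blacken (rather than crop) the hidden area are illustrative assumptions.

```python
from PIL import Image, ImageDraw

def hide_first_area(composite: Image.Image, hide_box: tuple) -> Image.Image:
    """Mask the preset first area so only the remaining second area is visible.

    hide_box is (left, top, right, bottom), taken from the preset hiding requirement.
    """
    shown = composite.copy()
    ImageDraw.Draw(shown).rectangle(hide_box, fill="black")
    return shown
```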
  • FIG. 5 is a schematic structural diagram of a photographing device provided by an embodiment of the present application.
  • the photographing device and the control device establish a wireless communication connection.
  • The photographing device 100 includes a gimbal 30, and the gimbal 30 is used to carry a camera 40.
  • the photographing device also includes: a memory 10 and a processor 20; the processor 20 is connected to the memory 10, the pan/tilt 30 and the camera 40 through a bus.
  • the processor 20 may be a microcontroller unit, a central processing unit, or a digital signal processor, among others.
  • The memory 10 may be a Flash chip, a read-only memory, a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
  • The photographing device of this embodiment can execute the steps of the above method at the photographing device end; for a detailed description of the relevant content, please refer to the relevant content of the above remote control method at the photographing device end, which will not be repeated here.
  • The memory is used to store a computer program; the processor is used to execute the computer program and, when executing it, implement the following steps: synthesizing the image captured by the camera and the image corresponding to the user operation interface to obtain a composite image; sending the composite image to the control device so that the display screen of the control device displays the composite image; acquiring the touch data sent by the control device, the touch data including the position of the touch point; and generating a control instruction according to the touch data to control the photographing device.
  • the photographing device is provided with a body display screen for displaying the synthesized image.
  • When the processor executes the computer program, the following step is implemented: controlling the photographing device according to the detected operation of the user on the body display screen.
  • the touch data further includes a touch type
  • the touch type includes at least one of pressing, lifting, and moving.
  • When the processor executes the computer program, the following step is implemented: generating a gimbal control instruction and/or a shooting control instruction according to the touch data.
  • When the processor executes the computer program, the following step is implemented: acquiring a control instruction sent by the control device, the control instruction including an image download control instruction or a frequency-binding control instruction.
  • When the processor executes the computer program, the following steps are implemented: synthesizing the image captured by the camera and the image corresponding to the user operation interface; and processing the synthesized image according to the size of the display screen of the control device to obtain a composite image suited to that display screen size.
  • the number of the control device is one or more.
  • FIG. 6 is a schematic structural diagram of an embodiment of the control device of the present application, the control device establishes a wireless communication connection with the shooting device, the shooting device includes a pan-tilt, and the pan-tilt is used to carry a camera.
  • the control device 200 includes: a display screen 33 , a memory 11 and a processor 22 ; the processor 22 is connected to the memory 11 and the display screen 33 through a bus.
  • the processor 22 may be a microcontroller unit, a central processing unit, or a digital signal processor, among others.
  • The memory 11 may be a Flash chip, a read-only memory, a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
  • The control device of this embodiment can execute the steps of the above method at the control device end; for a detailed description of the relevant content, please refer to the relevant content of the above remote control method at the control device end, which will not be repeated here.
  • The memory is used to store a computer program; the processor is used to execute the computer program and, when executing it, implement the following steps:
  • acquiring the composite image sent by the shooting device, the composite image being synthesized from the image captured by the camera and the image corresponding to the user operation interface; displaying the composite image on the display screen of the control device; acquiring, according to the detected touch operation of the user on the display screen of the control device, the corresponding touch data including the position of the touch point; and sending the touch data to the shooting device, so that the shooting device generates a control instruction according to the touch data to control the shooting device.
  • The image corresponding to the user operation interface is the image corresponding to a first user operation interface, and the display screen of the control device is also used to display a second user operation interface; the first user operation interface includes a first operation control for controlling the shooting device, and the second user operation interface includes a second operation control for controlling the control device.
  • The second user operation interface further includes a download operation control or a frequency-binding operation control, and when the processor executes the computer program, the following steps are implemented: generating an image download control instruction according to the detected user operation on the download operation control; or generating a frequency-binding control instruction according to the detected user operation on the frequency-binding operation control; and sending the image download control instruction or the frequency-binding control instruction to the shooting device.
  • the second user operation interface is displayed on the periphery of the synthesized image.
  • When the processor executes the computer program, the following steps are implemented: hiding a corresponding first region of the composite image according to a preset hiding requirement, and displaying the remaining second region of the composite image on the display screen of the control device.
  • the touch data further includes a touch type
  • the touch type includes at least one of pressing, lifting, and moving.
  • The present application also provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement any of the above remote control methods at the shooting device end. For a detailed description of the relevant content, please refer to the relevant content above, which will not be repeated here.
  • the computer-readable storage medium may be an internal storage unit of the above-mentioned photographing device, such as a hard disk or a memory.
  • The computer-readable storage medium may also be an external storage device, such as an equipped plug-in hard disk, a smart memory card, a secure digital card, a flash memory card, and the like.
  • The present application also provides another computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement any of the above remote control methods at the control device end. For a detailed description of the relevant content, please refer to the relevant content above, which will not be repeated here.
  • the computer-readable storage medium may be an internal storage unit of the above-mentioned control device, such as a hard disk or a memory.
  • The computer-readable storage medium may also be an external storage device, such as an equipped plug-in hard disk, a smart memory card, a secure digital card, a flash memory card, and the like.
  • the present application also provides a shooting control system, which includes the shooting device and the control device as described above.

Landscapes

  • Studio Devices (AREA)

Abstract

A remote control method, a photographing device, a control device, a system and a storage medium. The method includes: synthesizing an image captured by a camera and an image corresponding to a user operation interface to obtain a composite image (S101); sending the composite image to a control device (S102); acquiring the composite image sent by the photographing device (S201); displaying the composite image on a display screen of the control device (S202); acquiring, according to a detected touch operation of a user on the display screen of the control device, corresponding touch data including the position of a touch point (S203); sending the touch data to the photographing device (S204); acquiring the touch data sent by the control device (S103); and generating a control instruction according to the touch data to control the photographing device (S104).

Description

Remote control method, photographing device, control device, system and storage medium
Technical Field
The present application relates to the technical field of data transmission, and in particular to a remote control method, a photographing device, a control device, a system and a storage medium.
Background Art
In the related art, a photographing device can carry a camera on a gimbal and use an image transmission module to wirelessly transmit real-time image data to a remote device, so that the real-time image captured by the camera is displayed on the display screen of the remote device. In addition, the remote device has built-in control logic for transmitting control instructions to the photographing device, so as to control the shooting parameters of the camera or the control parameters of the gimbal.
However, because the camera has many shooting parameters and the gimbal has many control parameters, building control logic into the remote device requires establishing many control channels; the implementation is too complicated, the workload is too large, and control failures are likely.
Summary of the Invention
In view of this, the present application provides a remote control method, a photographing device, a control device, a system and a storage medium.
In a first aspect, the present application provides a remote control method applicable to a photographing device, where the photographing device has established a wireless communication connection with a control device, the photographing device includes a gimbal, and the gimbal is used to carry a camera. The method includes:
synthesizing an image captured by the camera and an image corresponding to a user operation interface to obtain a composite image;
sending the composite image to the control device, so that a display screen of the control device displays the composite image;
acquiring touch data sent by the control device, the touch data including the position of a touch point; and
generating a control instruction according to the touch data to control the photographing device.
In a second aspect, the present application provides a remote control method applicable to a control device, where the control device has established a wireless communication connection with a photographing device, the photographing device includes a gimbal, and the gimbal is used to carry a camera. The method includes:
acquiring a composite image sent by the photographing device, the composite image being synthesized from an image captured by the camera and an image corresponding to a user operation interface;
displaying the composite image on a display screen of the control device;
acquiring, according to a detected touch operation of a user on the display screen of the control device, corresponding touch data, the touch data including the position of a touch point; and
sending the touch data to the photographing device, so that the photographing device generates a control instruction according to the touch data to control the photographing device.
In a third aspect, the present application provides a photographing device that has established a wireless communication connection with a control device, the photographing device includes a gimbal used to carry a camera, and the photographing device further includes a memory and a processor;
the memory is used to store a computer program;
the processor is used to execute the computer program and, when executing the computer program, implement the following steps:
synthesizing an image captured by the camera and an image corresponding to a user operation interface to obtain a composite image;
sending the composite image to the control device, so that a display screen of the control device displays the composite image;
acquiring touch data sent by the control device, the touch data including the position of a touch point; and
generating a control instruction according to the touch data to control the photographing device.
In a fourth aspect, the present application provides a control device that has established a wireless communication connection with a photographing device, the photographing device includes a gimbal used to carry a camera, and the control device includes a display screen, a memory and a processor;
the memory is used to store a computer program;
the processor is used to execute the computer program and, when executing the computer program, implement the following steps:
acquiring a composite image sent by the photographing device, the composite image being synthesized from an image captured by the camera and an image corresponding to a user operation interface;
displaying the composite image on the display screen of the control device;
acquiring, according to a detected touch operation of a user on the display screen of the control device, corresponding touch data, the touch data including the position of a touch point; and
sending the touch data to the photographing device, so that the photographing device generates a control instruction according to the touch data to control the photographing device.
In a fifth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the remote control method of the first aspect.
In a sixth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the remote control method of the second aspect.
In a seventh aspect, the present application provides a shooting control system, which includes the photographing device of the third aspect and the control device of the fourth aspect.
In the embodiments of the present application, the image captured by the camera is synthesized with the image corresponding to the user operation interface, and the composite image is sent to the control device for display; the control device detects the user's touch operation, obtains touch data including the position of the touch point, and sends it to the photographing device; the photographing device generates a control instruction according to the touch data sent by the control device to control the photographing device. There is thus no need to build excessive control logic into the control device, local interpretation of the touch data by the control device is reduced, the implementation is simple, and the efficiency is higher.
It should be understood that the above general description and the following detailed description are exemplary and explanatory only, and do not limit the present application.
Brief Description of the Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of a photographing device provided by an embodiment of the present application;
FIG. 2 is a schematic flowchart of the method at the photographing device end provided by an embodiment of the present application;
FIG. 3 is a schematic flowchart of the method at the control device end provided by an embodiment of the present application;
FIG. 4 is a schematic flowchart of the system provided by an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a photographing device provided by an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a control device provided by an embodiment of the present application.
Detailed Description of Embodiments
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
The flowcharts shown in the drawings are only examples; they do not necessarily include all contents and operations/steps, nor do they have to be executed in the order described. For example, some operations/steps may be decomposed, combined or partially merged, so the actual execution order may change according to the actual situation.
In the related art, the photographing device carries a camera on a gimbal and uses an image transmission module to wirelessly transmit real-time image data to a remote device, and the real-time image captured by the camera is displayed on the display screen of the remote device. The remote device has built-in control logic for transmitting control instructions to the photographing device to control the shooting parameters of the camera or the control parameters of the gimbal. However, for some professional photographing devices, the camera has too many shooting parameters and the gimbal has too many control parameters. As an example, FIG. 1 is a schematic structural diagram of a photographing device provided by an embodiment of the present application; the photographing device includes a gimbal, and the gimbal is used to carry a camera. The gimbal can adjust the camera in the pitch, yaw, roll and vertical directions, so that the position of the camera remains stable or changes in a desired direction. The camera can shoot the surrounding scene, and the user needs to adjust parameters such as frame rate, resolution, exposure parameters and focus parameters according to the specific shooting requirements. For such photographing devices, building control logic into the remote device would require establishing many control channels; the implementation is too complicated, the workload is too large, and control failures are likely.
To solve the above problems, the embodiments of the present application provide a remote control method, a photographing device, a control device, a system and a storage medium, where the photographing device has established a wireless communication connection with the control device, the photographing device includes a gimbal, and the gimbal is used to carry a camera.
The remote control method of the embodiments of the present application includes a remote control method at the photographing device end (the photographing-device-side method for short) and a remote control method at the control device end (the control-device-side method for short).
The method at the photographing device end includes: synthesizing the image captured by the camera and the image corresponding to the user operation interface to obtain a composite image; sending the composite image to the control device, so that the display screen of the control device displays the composite image; acquiring touch data sent by the control device, the touch data including the position of a touch point; and generating a control instruction according to the touch data to control the photographing device.
The method at the control device end includes: acquiring the composite image sent by the photographing device; displaying the composite image on the display screen of the control device; acquiring, according to the detected touch operation of the user on the display screen of the control device, corresponding touch data including the position of the touch point; and sending the touch data to the photographing device.
Because the image captured by the camera is synthesized with the image corresponding to the user operation interface and the composite image is sent to the control device for display, the control device detects the user's touch operation, obtains touch data including the position of the touch point and sends it to the photographing device, and the photographing device generates a control instruction according to the touch data sent by the control device to control the photographing device. There is thus no need to build excessive control logic into the control device, local interpretation of the touch data by the control device is reduced, the implementation is simple, and the efficiency is higher.
Some embodiments of the present application are described in detail below with reference to the drawings. The following embodiments and the features in the embodiments may be combined with each other without conflict.
For convenience of description and for a better understanding of the remote control method of the embodiments of the present application, the photographing-device-side method and the control-device-side method are described together in detail below.
It should be noted that, although the photographing-device-side method and the control-device-side method are described together, the two methods are independent of each other. The photographing-device-side method provides part of the technical support for avoiding excessive control logic in the control device and reducing local interpretation of the touch data by the control device; the control-device-side method likewise provides part of that technical support; and when the two methods are combined, no excessive control logic needs to be built into the control device, local interpretation of the touch data by the control device is reduced, and the implementation is simple and efficient.
Referring to FIG. 2 to FIG. 4: FIG. 2 is a schematic flowchart of the photographing device end provided by an embodiment of the present application, FIG. 3 is a schematic flowchart of the control device end provided by an embodiment of the present application, and FIG. 4 is a schematic flowchart of the system provided by an embodiment of the present application, showing the photographing-device-side and control-device-side methods combined.
The photographing device has established a wireless communication connection with the control device, the photographing device includes a gimbal, and the gimbal is used to carry a camera. The photographing device may be a handheld photographing device, an unmanned aerial vehicle, an unmanned vehicle, an unmanned ship, or the like.
The photographing-device-side method includes step S101, step S102, step S103 and step S104; the control-device-side method includes step S201, step S202, step S203 and step S204.
Photographing-device-side method:
Step S101: Synthesize the image captured by the camera and the image corresponding to the user operation interface to obtain a composite image.
Step S102: Send the composite image to the control device, so that the display screen of the control device displays the composite image.
The user operation interface is the medium for interaction and information exchange between the device and the user. It converts between the internal form of information and a form acceptable to humans, so that the user can conveniently and effectively operate the hardware, achieving two-way interaction and completing the desired work.
The composite image includes both the image captured by the camera and the image corresponding to the user operation interface. After the composite image is obtained, it is sent to the control device to be displayed on the display screen of the control device.
Control-device-side method:
Step S201: Acquire the composite image sent by the photographing device, the composite image being synthesized from the image captured by the camera and the image corresponding to the user operation interface.
Step S202: Display the composite image on the display screen of the control device.
Step S203: Acquire, according to the detected touch operation of the user on the display screen of the control device, corresponding touch data, the touch data including the position of a touch point.
Step S204: Send the touch data to the photographing device, so that the photographing device generates a control instruction according to the touch data to control the photographing device.
After the control device acquires the composite image sent by the photographing device, it displays the composite image on its display screen, and the user can see, on the display screen of the control device, the image captured by the camera of the photographing device together with the image corresponding to the photographing device's user operation interface. When the user wants to remotely control the photographing device from the control device, the user can perform a touch operation on the display screen of the control device; the control device obtains the corresponding touch data, including the position of the touch point, and then sends the touch data to the photographing device.
Touch operations may include tap operations and touch operations. A tap operation is the complete process of the user pressing down, staying for a certain period of time (long or short) and lifting the finger; this whole process is regarded as one tap, in which case the position of the touch point may be the position of a single touch point, and the touch data may also include a touch type, such as press and lift. A touch operation involves multi-touch, that is, the user operates multiple points on the display screen; in this case the position of the touch point may be the positions of multiple touch points, and the touch data may also include a touch type, such as press, lift and move.
Photographing-device-side method:
Step S103: Acquire the touch data sent by the control device, the touch data including the position of the touch point.
Step S104: Generate a control instruction according to the touch data to control the photographing device.
The photographing device itself stores the correspondence between touch data and control instructions. For example, touch data 1 corresponds to control instruction 1, touch data 2 corresponds to control instruction 2, touch data 3 corresponds to control instruction 3, and so on. After the photographing device acquires the touch data sent by the control device, it generates the corresponding control instruction from this correspondence and controls the photographing device in response to the control instruction.
In the embodiments of the present application, the image captured by the camera is synthesized with the image corresponding to the user operation interface, and the composite image is sent to the control device for display; the control device detects the user's touch operation, obtains touch data including the position of the touch point and sends it to the photographing device; the photographing device generates a control instruction according to the touch data sent by the control device to control the photographing device. There is thus no need to build excessive control logic into the control device, local interpretation of the touch data by the control device is reduced, the implementation is simple, and the efficiency is higher.
In one embodiment, the photographing device is provided with a body display screen for displaying the composite image.
As shown in FIG. 1, the photographing device is provided with a body display screen. Because the body display screen of the photographing device and the display screen of the remote control device can display the composite image essentially in sync, this provides technical support for convenient multi-person operation.
In one embodiment, the photographing-device-side method further includes: controlling the photographing device according to the detected operation of the user on the body display screen.
In this embodiment, the user can operate on the body display screen, and when the user's operation is detected, the photographing device is controlled. Because the photographing device itself stores the correspondence between touch data and control instructions, after the user operates on the body display screen the touch data can be interpreted directly into a control instruction to control the photographing device, which is convenient for the user of the photographing device.
In one embodiment, the user of the photographing device and the user of the control device can divide the work between the body display screen and the display screen of the control device. For example, one user performs touch operations on the display screen of the control device to remotely control the photographing device and adjust the control parameters of the gimbal, while another user performs touch operations on the body display screen of the photographing device to control the photographing device at close range and adjust the shooting parameters of the camera; or one user performs touch operations on the display screen of the control device to remotely control the photographing device and adjust the shooting parameters of the camera, while another user performs touch operations on the body display screen of the photographing device to control the photographing device at close range and adjust the control parameters of the gimbal.
In one embodiment, in the photographing-device-side method, step S104 of generating a control instruction according to the touch data may include: generating a gimbal control instruction and/or a shooting control instruction according to the touch data. In the control-device-side method, in step S204, the generating of a control instruction according to the touch data may include: generating a gimbal control instruction and/or a shooting control instruction according to the touch data.
The gimbal control instruction is used to control the gimbal, and the shooting control instruction is used to control shooting. Control of the gimbal may be control of the pitch axis, the yaw axis, the roll axis and the Z axis; control of shooting may be control of parameters such as frame rate, resolution, exposure parameters and focus parameters.
For example, the user operation interface of the photographing device includes multiple different resolutions and/or multiple different frame rates, which are displayed on the composite image; the user can tap the position of a resolution and/or frame rate on the display screen of the control device, which yields the position of the corresponding touch point, and after the photographing device obtains the position of the touch point it can generate the control instruction for the resolution and/or frame rate selected by the user.
In one embodiment, in addition to the composite image sent by the photographing device, the display screen of the control device may also display a second user operation interface of the control device itself, which includes second operation controls, so that the user can control the control device by operating the second operation controls on the second user operation interface. That is, the image corresponding to the user operation interface above is the image corresponding to a first user operation interface, the first user operation interface includes first operation controls for controlling the photographing device, and the display screen of the control device is also used to display the second user operation interface, which includes second operation controls for controlling the control device.
In one embodiment, controlling the control device may be controlling a mode of the control device. For example, when the mode is a viewing mode, the display screen of the control device displays only the image captured by the camera, so that the user of the control device sees a relatively clean picture. When the mode is a control mode, the display screen of the control device displays the composite image, which includes the image captured by the camera and the image corresponding to the user operation interface, so that the user of the control device can control the photographing device in time.
In one embodiment, the second user operation interface further includes a download operation control or a frequency-binding operation control, and the method further includes: generating an image download control instruction according to the detected user operation on the download operation control; or generating a frequency-binding control instruction according to the detected user operation on the frequency-binding operation control; and sending the image download control instruction or the frequency-binding control instruction to the photographing device.
In this embodiment, the control device retains the commonly used download operation control or frequency-binding operation control; by operating the download operation control or frequency-binding operation control the user remotely controls the photographing device, so as to download images from the photographing device or make the photographing device enter the frequency-binding mode to establish a wireless communication connection with the control device.
In one embodiment, the second user operation interface is displayed on the periphery of the composite image. The size of the display screen of the control device may be larger than the size of the original composite image, so that neither the display of the composite image nor the display of the second user operation interface is affected. This layout also helps the user distinguish the two interfaces and quickly find the required operation control.
In one embodiment, the photographing-device-side method may further include: acquiring a control instruction sent by the control device, the control instruction including an image download control instruction or a frequency-binding control instruction.
In this embodiment, the control device may retain the commonly used image download control instruction or frequency-binding control instruction to remotely control the photographing device, so as to download images from the photographing device or make the photographing device enter the frequency-binding mode to establish a wireless communication connection with the control device.
In one embodiment, in the photographing-device-side method, step S101 of synthesizing the image captured by the camera and the image corresponding to the user operation interface to obtain the composite image may further include:
synthesizing the image captured by the camera and the image corresponding to the user operation interface; and
processing the synthesized image according to the size of the display screen of the control device to obtain a composite image suited to the display screen size of the control device.
Usually, the display screen of the control device can display the original synthesized image at a 1:1 ratio. If the size of the display screen of the control device is larger or smaller than the size of the original synthesized image, the synthesized image may be processed according to the size of the display screen of the control device to obtain a composite image suited to that display screen size.
In one embodiment, the number of control devices is one or more, so that one or more users can remotely control the photographing device through one or more control devices.
In one embodiment, in the control-device-side method, step S202 of displaying the composite image on the display screen of the control device may include: hiding a corresponding first area of the composite image according to a preset hiding requirement, and displaying the remaining second area of the composite image on the display screen of the control device. In this way, a specific area of the composite image can be hidden when the composite image is displayed, according to the user's needs.
Referring to FIG. 5, FIG. 5 is a schematic structural diagram of a photographing device provided by an embodiment of the present application. The photographing device has established a wireless communication connection with the control device; the photographing device 100 includes a gimbal 30, and the gimbal 30 is used to carry a camera 40.
The photographing device further includes a memory 10 and a processor 20; the processor 20 is connected to the memory 10, the gimbal 30 and the camera 40 through a bus. The processor 20 may be a micro-controller unit, a central processing unit, a digital signal processor, or the like. The memory 10 may be a Flash chip, a read-only memory, a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like. The photographing device of this embodiment can execute the steps of the above photographing-device-side method; for a detailed description of the relevant content, please refer to the relevant content of the above remote control method at the photographing device end, which will not be repeated here.
The memory is used to store a computer program; the processor is used to execute the computer program and, when executing the computer program, implement the following steps:
synthesizing the image captured by the camera and the image corresponding to the user operation interface to obtain a composite image; sending the composite image to the control device, so that the display screen of the control device displays the composite image; acquiring the touch data sent by the control device, the touch data including the position of the touch point; and generating a control instruction according to the touch data to control the photographing device.
The photographing device is provided with a body display screen for displaying the composite image.
When executing the computer program, the processor implements the following step: controlling the photographing device according to the detected operation of the user on the body display screen.
The touch data further includes a touch type, and the touch type includes at least one of press, lift and move.
When executing the computer program, the processor implements the following step: generating a gimbal control instruction and/or a shooting control instruction according to the touch data.
When executing the computer program, the processor implements the following step: acquiring a control instruction sent by the control device, the control instruction including an image download control instruction or a frequency-binding control instruction.
When executing the computer program, the processor implements the following steps: synthesizing the image captured by the camera and the image corresponding to the user operation interface; and processing the synthesized image according to the size of the display screen of the control device to obtain a composite image suited to the display screen size of the control device.
The number of control devices is one or more.
Referring to FIG. 6, FIG. 6 is a schematic structural diagram of an embodiment of the control device of the present application. The control device has established a wireless communication connection with the photographing device; the photographing device includes a gimbal, and the gimbal is used to carry a camera. The control device 200 includes a display screen 33, a memory 11 and a processor 22; the processor 22 is connected to the memory 11 and the display screen 33 through a bus. The processor 22 may be a micro-controller unit, a central processing unit, a digital signal processor, or the like. The memory 11 may be a Flash chip, a read-only memory, a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like. The control device of this embodiment can execute the steps of the above control-device-side method; for a detailed description of the relevant content, please refer to the relevant content of the above remote control method at the control device end, which will not be repeated here.
The memory is used to store a computer program; the processor is used to execute the computer program and, when executing the computer program, implement the following steps:
acquiring the composite image sent by the photographing device, the composite image being synthesized from the image captured by the camera and the image corresponding to the user operation interface; displaying the composite image on the display screen of the control device; acquiring, according to the detected touch operation of the user on the display screen of the control device, corresponding touch data, the touch data including the position of the touch point; and sending the touch data to the photographing device, so that the photographing device generates a control instruction according to the touch data to control the photographing device.
The image corresponding to the user operation interface is the image corresponding to a first user operation interface, and the display screen of the control device is also used to display a second user operation interface; the first user operation interface includes a first operation control for controlling the photographing device, and the second user operation interface includes a second operation control for controlling the control device.
The second user operation interface further includes a download operation control or a frequency-binding operation control, and when executing the computer program, the processor implements the following steps: generating an image download control instruction according to the detected user operation on the download operation control; or generating a frequency-binding control instruction according to the detected user operation on the frequency-binding operation control; and sending the image download control instruction or the frequency-binding control instruction to the photographing device.
The second user operation interface is displayed on the periphery of the composite image.
When executing the computer program, the processor implements the following steps: hiding a corresponding first area of the composite image according to a preset hiding requirement, and displaying the remaining second area of the composite image on the display screen of the control device.
The touch data further includes a touch type, and the touch type includes at least one of press, lift and move.
The present application also provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement any of the above remote control methods at the photographing device end. For a detailed description of the relevant content, please refer to the relevant content above, which will not be repeated here.
The computer-readable storage medium may be an internal storage unit of the above photographing device, such as a hard disk or a memory. The computer-readable storage medium may also be an external storage device, such as an equipped plug-in hard disk, a smart memory card, a secure digital card, a flash memory card, or the like.
The present application also provides another computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement any of the above remote control methods at the control device end. For a detailed description of the relevant content, please refer to the relevant content above, which will not be repeated here.
The computer-readable storage medium may be an internal storage unit of the above control device, such as a hard disk or a memory. The computer-readable storage medium may also be an external storage device, such as an equipped plug-in hard disk, a smart memory card, a secure digital card, a flash memory card, or the like.
The present application also provides a shooting control system, which includes any of the photographing devices and control devices described above.
It should be understood that the terms used in the specification of the present application are only for the purpose of describing particular embodiments and are not intended to limit the present application.
It should also be understood that the term "and/or" used in the specification and the appended claims of the present application refers to any and all possible combinations of one or more of the associated listed items, and includes these combinations.
The above are only specific embodiments of the present application, but the scope of protection of the present application is not limited thereto. Any person skilled in the art can easily think of various equivalent modifications or substitutions within the technical scope disclosed in the present application, and these modifications or substitutions shall all fall within the scope of protection of the present application. Therefore, the scope of protection of the present application shall be subject to the scope of protection of the claims.

Claims (32)

  1. A remote control method, characterized in that the method is applicable to a photographing device, the photographing device has established a wireless communication connection with a control device, the photographing device includes a gimbal, and the gimbal is used to carry a camera, the method comprising:
    synthesizing an image captured by the camera and an image corresponding to a user operation interface to obtain a composite image;
    sending the composite image to the control device, so that a display screen of the control device displays the composite image;
    acquiring touch data sent by the control device, the touch data including the position of a touch point; and
    generating a control instruction according to the touch data to control the photographing device.
  2. The method according to claim 1, characterized in that the photographing device is provided with a body display screen for displaying the composite image.
  3. The method according to claim 2, characterized in that the method further comprises:
    controlling the photographing device according to a detected operation of the user on the body display screen.
  4. The method according to claim 1, characterized in that the touch data further includes a touch type, and the touch type includes at least one of press, lift and move.
  5. The method according to claim 1, characterized in that the generating a control instruction according to the touch data comprises:
    generating a gimbal control instruction and/or a shooting control instruction according to the touch data.
  6. The method according to claim 1, characterized in that the method further comprises:
    acquiring a control instruction sent by the control device, the control instruction including an image download control instruction or a frequency-binding control instruction.
  7. The method according to claim 1, characterized in that the synthesizing the image captured by the camera and the image corresponding to the user operation interface to obtain a composite image comprises:
    synthesizing the image captured by the camera and the image corresponding to the user operation interface; and
    processing the synthesized image according to the size of the display screen of the control device to obtain a composite image suited to the display screen size of the control device.
  8. The method according to claim 1, characterized in that the number of the control devices is one or more.
  9. A remote control method, characterized in that the method is applicable to a control device, the control device has established a wireless communication connection with a photographing device, the photographing device includes a gimbal, and the gimbal is used to carry a camera, the method comprising:
    acquiring a composite image sent by the photographing device, the composite image being synthesized from an image captured by the camera and an image corresponding to a user operation interface;
    displaying the composite image on a display screen of the control device;
    acquiring, according to a detected touch operation of a user on the display screen of the control device, corresponding touch data, the touch data including the position of a touch point; and
    sending the touch data to the photographing device, so that the photographing device generates a control instruction according to the touch data to control the photographing device.
  10. The method according to claim 9, characterized in that the generating a control instruction according to the touch data comprises:
    generating a gimbal control instruction and/or a shooting control instruction according to the touch data.
  11. The method according to claim 9, characterized in that the image corresponding to the user operation interface is an image corresponding to a first user operation interface, the display screen of the control device is also used to display a second user operation interface, the first user operation interface includes a first operation control for controlling the photographing device, and the second user operation interface includes a second operation control for controlling the control device.
  12. The method according to claim 11, characterized in that the second user operation interface further includes a download operation control or a frequency-binding operation control, and the method further comprises:
    generating an image download control instruction according to a detected operation of the user on the download operation control; or
    generating a frequency-binding control instruction according to a detected operation of the user on the frequency-binding operation control; and
    sending the image download control instruction or the frequency-binding control instruction to the photographing device.
  13. The method according to claim 11, characterized in that the second user operation interface is displayed on the periphery of the composite image.
  14. The method according to claim 9, characterized in that the displaying the composite image on the display screen of the control device comprises:
    hiding a corresponding first area of the composite image according to a preset hiding requirement, and displaying the remaining second area of the composite image on the display screen of the control device.
  15. The method according to claim 9, characterized in that the touch data further includes a touch type, and the touch type includes at least one of press, lift and move.
  16. A photographing device, characterized in that the photographing device has established a wireless communication connection with a control device, the photographing device includes a gimbal, and the gimbal is used to carry a camera, the photographing device further comprising a memory and a processor;
    the memory is used to store a computer program;
    the processor is used to execute the computer program and, when executing the computer program, implement the following steps:
    synthesizing an image captured by the camera and an image corresponding to a user operation interface to obtain a composite image;
    sending the composite image to the control device, so that a display screen of the control device displays the composite image;
    acquiring touch data sent by the control device, the touch data including the position of a touch point; and
    generating a control instruction according to the touch data to control the photographing device.
  17. The photographing device according to claim 16, characterized in that the photographing device is provided with a body display screen for displaying the composite image.
  18. The photographing device according to claim 17, characterized in that when executing the computer program, the processor implements the following step:
    controlling the photographing device according to a detected operation of the user on the body display screen.
  19. The photographing device according to claim 16, characterized in that the touch data further includes a touch type, and the touch type includes at least one of press, lift and move.
  20. The photographing device according to claim 16, characterized in that when executing the computer program, the processor implements the following step:
    generating a gimbal control instruction and/or a shooting control instruction according to the touch data.
  21. The photographing device according to claim 16, characterized in that when executing the computer program, the processor implements the following step:
    acquiring a control instruction sent by the control device, the control instruction including an image download control instruction or a frequency-binding control instruction.
  22. The photographing device according to claim 16, characterized in that when executing the computer program, the processor implements the following steps:
    synthesizing the image captured by the camera and the image corresponding to the user operation interface; and
    processing the synthesized image according to the size of the display screen of the control device to obtain a composite image suited to the display screen size of the control device.
  23. The photographing device according to claim 16, characterized in that the number of the control devices is one or more.
  24. A control device, characterized in that the control device has established a wireless communication connection with a photographing device, the photographing device includes a gimbal, and the gimbal is used to carry a camera, the control device comprising a display screen, a memory and a processor;
    the memory is used to store a computer program;
    the processor is used to execute the computer program and, when executing the computer program, implement the following steps:
    acquiring a composite image sent by the photographing device, the composite image being synthesized from an image captured by the camera and an image corresponding to a user operation interface;
    displaying the composite image on the display screen of the control device;
    acquiring, according to a detected touch operation of a user on the display screen of the control device, corresponding touch data, the touch data including the position of a touch point; and
    sending the touch data to the photographing device, so that the photographing device generates a control instruction according to the touch data to control the photographing device.
  25. The control device according to claim 24, characterized in that the image corresponding to the user operation interface is an image corresponding to a first user operation interface, the display screen of the control device is also used to display a second user operation interface, the first user operation interface includes a first operation control for controlling the photographing device, and the second user operation interface includes a second operation control for controlling the control device.
  26. The control device according to claim 25, characterized in that the second user operation interface further includes a download operation control or a frequency-binding operation control, and when executing the computer program, the processor implements the following steps:
    generating an image download control instruction according to a detected operation of the user on the download operation control; or
    generating a frequency-binding control instruction according to a detected operation of the user on the frequency-binding operation control; and
    sending the image download control instruction or the frequency-binding control instruction to the photographing device.
  27. The control device according to claim 25, characterized in that the second user operation interface is displayed on the periphery of the composite image.
  28. The control device according to claim 24, characterized in that when executing the computer program, the processor implements the following steps:
    hiding a corresponding first area of the composite image according to a preset hiding requirement, and displaying the remaining second area of the composite image on the display screen of the control device.
  29. The control device according to claim 24, characterized in that the touch data further includes a touch type, and the touch type includes at least one of press, lift and move.
  30. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement the remote control method according to any one of claims 1 to 8.
  31. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement the remote control method according to any one of claims 9 to 15.
  32. A shooting control system, characterized in that the shooting control system comprises the photographing device according to any one of claims 16 to 23 and the control device according to any one of claims 24 to 29.
PCT/CN2021/124479 2021-10-18 2021-10-18 Remote control method, photographing device, control device, system and storage medium WO2023065077A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/124479 WO2023065077A1 (zh) 2021-10-18 2021-10-18 Remote control method, photographing device, control device, system and storage medium
CN202180100510.4A CN117730541A (zh) 2021-10-18 2021-10-18 Remote control method, photographing device, control device, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/124479 WO2023065077A1 (zh) 2021-10-18 2021-10-18 Remote control method, photographing device, control device, system and storage medium

Publications (1)

Publication Number Publication Date
WO2023065077A1 (zh)

Family

ID=86057841

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/124479 WO2023065077A1 (zh) 2021-10-18 2021-10-18 远程控制方法、拍摄设备、控制设备、系统及存储介质

Country Status (2)

Country Link
CN (1) CN117730541A (zh)
WO (1) WO2023065077A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060150109A1 (en) * 2004-12-30 2006-07-06 Motorola, Inc. Shared user interface
CN105338245A (zh) * 2015-10-30 2016-02-17 努比亚技术有限公司 拍照共享方法、终端及系统
CN106488043A (zh) * 2016-12-20 2017-03-08 努比亚技术有限公司 一种移动终端及拍照方法
CN111212230A (zh) * 2020-01-15 2020-05-29 源德盛塑胶电子(深圳)有限公司 一种拍摄控制系统及其控制方法和自拍杆


Also Published As

Publication number Publication date
CN117730541A (zh) 2024-03-19


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 202180100510.4

Country of ref document: CN