WO2023178670A1 - 可移动平台的图像传输方法、装置及设备 - Google Patents

可移动平台的图像传输方法、装置及设备 Download PDF

Info

Publication number
WO2023178670A1
WO2023178670A1 (PCT/CN2022/083116)
Authority
WO
WIPO (PCT)
Prior art keywords
control terminal
rotation information
image
shooting device
target
Prior art date
Application number
PCT/CN2022/083116
Other languages
English (en)
French (fr)
Inventor
饶雄斌
肖巍
赵巍
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2022/083116 priority Critical patent/WO2023178670A1/zh
Priority to CN202280047437.3A priority patent/CN117730543A/zh
Publication of WO2023178670A1 publication Critical patent/WO2023178670A1/zh

Links

Images

Definitions

  • the present application relates to the field of image transmission technology, and in particular, to an image transmission method, device and equipment for a movable platform.
  • the movable platform (such as a drone) can be equipped with a shooting device, and the shooting device can shoot various environmental images.
  • the movable platform can use its configured pan/tilt to adjust the posture of the shooting device to change the shooting direction of the shooting device.
  • the movable platform can establish a wireless communication connection with the control terminal (such as a head-mounted display device), and the movable platform can send the image collected by the shooting device to the control terminal so that the control terminal displays the image.
  • the control terminal can send the sensed rotation information to the movable platform through a wireless communication connection.
  • the movable platform can control the movement of the pan/tilt based on the rotation information of the control terminal.
  • for example, when the control terminal is a head-mounted display device, the head-mounted display device worn on the user's head rotates as the user's head rotates, so that the shooting direction of the shooting device changes with the rotation of the user's head and the image displayed to the user on the head-mounted display device changes as the head rotates.
  • Embodiments of the present application provide an image transmission method, device and equipment for a movable platform, to solve the problem in the prior art that there is a long delay from the time the body part holding or wearing the control terminal drives the control terminal to start rotating to the time the user starts to see images that change with the rotation of that body part.
  • an embodiment of the present application provides an image transmission method for a movable platform, where the movable platform includes a shooting device that collects and outputs images and a pan/tilt for carrying the shooting device and adjusting the posture of the shooting device.
  • the method includes: obtaining rotation information of a control terminal; cropping a target image output by the shooting device according to the rotation information to obtain a target area image; controlling the movement of the pan/tilt according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information; and sending the target area image to the control terminal.
  • embodiments of the present application further provide an image transmission device for a movable platform.
  • the movable platform includes a shooting device that collects and outputs images, and a pan/tilt (gimbal) for carrying the shooting device and adjusting the posture of the shooting device.
  • the device includes: a memory and a processor;
  • the memory is used to store program code
  • the processor calls the program code, and when the program code is executed, is used to perform the following operations:
  • the movement of the pan/tilt is controlled according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information;
  • embodiments of the present application further provide a movable platform that includes a shooting device that collects and outputs images, a pan/tilt for carrying the shooting device and adjusting the posture of the shooting device, and an image transmission device.
  • the image transmission device includes: a memory and a processor;
  • the memory is used to store program code
  • the processor calls the program code, and when the program code is executed, is used to perform the following operations:
  • the movement of the pan/tilt is controlled according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information;
  • embodiments of the present application provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed, the method described in the first aspect is implemented.
  • embodiments of the present application provide a computer program, which when the computer program is executed by a computer, is used to implement the method described in the first aspect.
  • in the embodiments of the present application, the target image output by the shooting device can be cropped according to the rotation information to obtain a target area image, and the target area image is sent to the control terminal, where the target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information.
  • in this way, even if the pan/tilt has not yet rotated in response to the rotation information to change the shooting direction of the shooting device, the control terminal can already obtain an image that changes according to the rotation information, and the user can see the changed image without waiting for the pan/tilt to move in response to the rotation information to change the shooting direction of the shooting device.
  • this reduces the delay from when the body part holding or wearing the control terminal drives the control terminal to rotate until the user starts to see images that change with the rotation of that body part, which helps improve the user experience.
  • Figure 1 is a schematic diagram of the application scenario of the image transmission method provided by the embodiment of the present application.
  • Figure 2 is a schematic diagram of the prior-art processing from when the body part wearing or holding the control terminal drives the control terminal to start rotating to when the user starts to see images that change with the rotation of the body part;
  • Figure 3 is a schematic flowchart of an image transmission method on a movable platform provided by an embodiment of the present application
  • Figure 4A is a schematic diagram of cropping a target area image from a target image according to an embodiment of the present application
  • Figure 4B is a schematic diagram of cropping a target area image from a target image according to another embodiment of the present application.
  • Figure 5 is a schematic diagram of the processing process provided by the embodiment of the present application from when the body part wearing or holding the control terminal drives the control terminal to start rotating, to when the user starts to see images that change with the rotation of the body part;
  • Figure 6 is a schematic flowchart of an image transmission method on a movable platform provided by another embodiment of the present application.
  • Figure 7A is a schematic diagram of a control terminal provided by an embodiment of the present application.
  • Figure 7B is a schematic diagram illustrating the definition of the attitude angle of the control terminal in its pitch direction and yaw direction provided by an embodiment of the present application;
  • Figure 8A is a schematic diagram of cropping a target area image from a target image according to another embodiment of the present application.
  • Figure 8B is a schematic diagram of cropping a target area image from a target image according to yet another embodiment of the present application.
  • Figure 9 is a schematic flowchart of an image transmission method on a movable platform provided by yet another embodiment of the present application.
  • Figure 10 is a schematic diagram of cropping a target area image from a target image provided by yet another embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of an image transmission device for a movable platform provided by an embodiment of the present application.
  • the image transmission method for a movable platform provided by the embodiment of the present application can be applied to the application scenario shown in Figure 1.
  • this application scenario may include a movable platform 11 and a control terminal 12 wirelessly connected to the movable platform 11 .
  • the movable platform 11 may include a shooting device 121 that collects and outputs images, and a pan/tilt 122 used to carry the shooting device 121 and adjust the posture of the shooting device 121 .
  • the user can wear or hold the control terminal 12, and the control terminal 12 can rotate following the rotation of the body part (such as a hand or head) wearing or holding the control terminal.
  • the control terminal 12 can send the rotation information obtained by sensing its own rotation to the movable platform 11 through the wireless communication connection, and the movable platform 11 can control the movement of the pan/tilt 122 according to the received rotation information of the control terminal 12 to adjust the posture of the shooting device 121.
  • the movable platform 11 may specifically include a shooting device 121 and a pan/tilt 122, and be capable of controlling the movement of the pan/tilt 122 based on the rotation information of the control terminal 12.
  • the movable platform may include at least one of a drone, an unmanned ground vehicle and an unmanned vessel; in some cases, the movable platform may also include a handheld device.
  • the control terminal 12 may specifically be a terminal that can communicate with the movable platform 11 and can rotate with the rotation of body parts.
  • the control terminal 12 may include at least one of a head-mounted display device, a handheld display device, a smartphone, a tablet computer and a remote controller.
  • the control terminal 12 may be a glasses-type display device, and the glasses-type display device may be, for example, First Person View (FPV) glasses.
  • as shown in Figure 2, taking the movable platform 11 being an unmanned aerial vehicle (UAV) and the control terminal 12 being a glasses-type display device as an example, in the conventional technology the following processing is required: when the rotation of the user's head drives the glasses end to start rotating, the glasses end first collects its own rotation information and transmits it to the aircraft end through the wireless communication connection. The gimbal system on the aircraft end then filters the rotation information and uses the filtered rotation information to control the movement of the pan/tilt 122 to adjust the posture of the shooting device 121, thereby changing the shooting direction of the shooting device 121. After the shooting direction of the shooting device 121 changes, the shooting device 121 can output images of the changed shooting direction through exposure and image signal processing (ISP). The aircraft end then encodes and transmits the images output by the shooting device to the glasses end, and finally the glasses end decodes and displays them, so that the user starts to see images that change with the rotation of the body part. It should be noted that the processing flow shown in Figure 2 is only an example.
  • as can be seen from Figure 2, in the conventional technology the processing required from when the body part wearing or holding the control terminal drives the control terminal to start rotating to when the user starts to see images that change with the rotation of the body part includes transmitting the rotation information, controlling the movement of the pan/tilt according to the rotation information, exposing and processing the image in the shooting device, and transmitting the image, so the delay time is long.
  • the target image output by the shooting device can be cropped according to the rotation information to obtain the target area image, and the target area image is sent to the control terminal, where the target image is output by the shooting device before the pan/tilt moves in response to the rotation information.
  • the image transmission method provided by the embodiment of the present application can be executed by an image transmission device of the movable platform, and the device can be included in the movable platform 11 .
  • FIG 3 is a schematic flowchart of an image transmission method for a movable platform provided by an embodiment of the present application.
  • the execution subject of this embodiment may be the image transmission device of the movable platform, specifically the processor of the image transmission device.
  • the method in this embodiment may include:
  • Step 31 Obtain the rotation information of the control terminal.
  • the rotation information may specifically be information that can reflect rotation changes of the control terminal.
  • the rotation information may include attitude, angular velocity or angular acceleration.
  • the control terminal may be provided with a sensor for sensing the rotation of the control terminal and outputting rotation information.
  • the sensor provided on the control terminal 12 may be an inertial measurement unit (IMU).
  • the rotation information may include yaw rotation information and/or pitch rotation information.
  • yaw rotation information refers to information about the rotation of the control terminal in its yaw direction; it can be used to control the yaw-axis motor of the gimbal so as to make the gimbal rotate in its yaw direction.
  • pitch rotation information refers to information about the rotation of the control terminal in its pitch direction; it can be used to control the pitch-axis motor of the gimbal so as to make the gimbal rotate in its pitch direction.
  • the yaw direction of the gimbal can correspond to the U-axis of the pixel plane coordinate system of the shooting device, and the pitch direction of the gimbal can correspond to the V-axis of the pixel plane coordinate system of the shooting device.
  • step 31 may specifically include receiving rotation information sent by the control terminal.
  • the rotation information of the control terminal can also be obtained through other methods, which is not limited in this application.
  • the rotation information can be used to control the movement of the pan/tilt on the one hand, and can be used to crop the image output by the shooting device on the other hand.
  • for the implementation of controlling the motion of the pan/tilt based on the rotation information, reference may be made to the specific descriptions in the related art, which will not be repeated here.
  • Step 32 Crop the target image output by the shooting device according to the rotation information to obtain a target area image.
  • Step 33 Control the movement of the pan/tilt according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information.
  • optionally, since some processing is required between obtaining the rotation information and the pan/tilt moving in response to it (such as filtering the rotation information and calculating motor control amounts from the filtering results), a certain amount of time elapses, and because the shooting frequency of the shooting device is usually high, the shooting device can output one or more images during this period.
  • in addition, when the rotation information is obtained, the movable platform may have cached one or more images output by the shooting device that still need to be sent to the control terminal; therefore the target image may include: images output by the shooting device between obtaining the rotation information and the pan/tilt moving in response to the rotation information, and/or images output by the shooting device before the rotation information was obtained.
  • for images output by the shooting device after the pan/tilt has moved in response to the rotation information, whether to crop them according to the rotation information can be decided flexibly as needed. It can be understood that the field of view (Field of View, FOV) of the cropped target area image is smaller than the FOV of the target image.
  • the specific method of cropping the target image may be related to the method of controlling the movement of the pan/tilt based on the rotation information.
  • when the rotation information includes yaw rotation information, the yaw-axis motor of the gimbal can be controlled according to the yaw rotation information, and the U-axis pixel range that needs to be cropped from the target image can be determined;
  • when the rotation information includes pitch rotation information, the pitch-axis motor of the gimbal can be controlled according to the pitch rotation information, and the V-axis pixel range that needs to be cropped from the target image can be determined.
  • for example, when the rotation information includes yaw rotation information indicating that the control terminal yaws to the left and pitch rotation information indicating that the control terminal pitches upward, the target area image can be cropped from the target image as shown in Figure 4A.
  • as another example, when the rotation information includes yaw rotation information indicating that the control terminal yaws to the right and pitch rotation information indicating that the control terminal pitches downward, the target area image can be cropped from the target image as shown in Figure 4B. It should be noted that in Figures 4A and 4B, 41 represents the target image and 42 represents the target area image.
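  • as a rough illustration of the cropping described above (a sketch only, not the published implementation), the snippet below shifts a fixed-size crop window away from the image center by pixel offsets derived from the yaw (U-axis) and pitch (V-axis) rotation information; the function name, the offset variables du/dv and all numeric values are illustrative assumptions.

```python
import numpy as np

def crop_target_area(target_image, du, dv, crop_w, crop_h):
    """Crop a fixed-size target area image from the target image.

    du, dv: pixel offsets of the crop center from the image center along the
    U (horizontal) and V (vertical) axes, derived from the yaw and pitch
    rotation information respectively (assumed sign convention: right/down
    positive)."""
    h, w = target_image.shape[:2]
    # Center of the target area image = image center shifted by (du, dv),
    # kept inside the frame so the crop window never leaves the target image.
    cu = int(np.clip(w / 2 + du, crop_w / 2, w - crop_w / 2))
    cv = int(np.clip(h / 2 + dv, crop_h / 2, h - crop_h / 2))
    u0, v0 = cu - crop_w // 2, cv - crop_h // 2
    return target_image[v0:v0 + crop_h, u0:u0 + crop_w]

# Example: yawing right and pitching down moves the crop window right and down.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # placeholder target image
area = crop_target_area(frame, du=120, dv=80, crop_w=1280, crop_h=720)
print(area.shape)   # (720, 1280, 3)
```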
  • the target area image can be cropped based on two factors: image center and image size. That is, the target area image can be cropped from the target image based on the size of the target area image and the center of the target area image. Based on this, in one embodiment, the rotation information of the control terminal can be used to determine the deviation between the center of the target area image and the center of the target image.
  • for example, when the rotation information of the control terminal includes the yaw rotation information of the control terminal, the yaw rotation information can be used to determine the deviation between the center of the target area image and the center of the target image on the U axis; and/or, for example, when the rotation information of the control terminal includes the pitch rotation information of the control terminal, the pitch rotation information can be used to determine the deviation between the center of the target area image and the center of the target image on the V axis.
  • the size of the target area image may be fixed.
  • the size of the target area image may be preset, or the size of the target area image may be set by the user.
  • alternatively, the size of the target area image can be variable. It should be understood that the larger the target area image, the larger its data volume and the higher the requirements on the communication connection between the movable platform and the control terminal; the smaller the target area image, the smaller its data volume and the lower those requirements. Therefore, the size of the target area image may be positively related to the state of the communication connection between the movable platform and the control terminal.
  • since the state of that connection can be related to the moving speed of the movable platform and to the distance between the movable platform and the control terminal, and can be characterized by the communication quality, the size of the target area image may be determined based on at least one of the moving speed of the movable platform, the distance between the movable platform and the control terminal, and the communication quality between the movable platform and the control terminal.
  • the communication quality may be determined by at least one of communication bandwidth, transmission bit error rate, signal-to-noise ratio, etc.
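  • as a loose sketch of such a size policy (the weighting of speed, distance and bandwidth below is an assumption for illustration only; the text only states that the size may be positively related to the state of the link), the crop size could be scaled as follows:

```python
def choose_crop_size(base_size, bandwidth_mbps, distance_m, speed_mps,
                     max_bandwidth_mbps=40.0, max_distance_m=5000.0, max_speed_mps=20.0):
    """Scale the target area image size with the state of the wireless link.

    Higher bandwidth allows a larger crop; larger distance or higher platform
    speed (which usually degrade the link) shrink it. Returns (width, height).
    All reference values are illustrative assumptions."""
    w0, h0 = base_size
    quality = bandwidth_mbps / max_bandwidth_mbps                 # higher is better
    penalty = 0.5 * (distance_m / max_distance_m) + 0.5 * (speed_mps / max_speed_mps)
    scale = max(0.3, min(1.0, quality * (1.0 - 0.5 * penalty)))
    # Round to even dimensions, which most video encoders expect.
    return (int(w0 * scale) // 2 * 2, int(h0 * scale) // 2 * 2)

print(choose_crop_size((1920, 1080), bandwidth_mbps=30, distance_m=1000, speed_mps=5))
```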
  • Step 34 Send the target area image to the control terminal.
  • the movable platform can send the target area image obtained by cropping to the control terminal, and the control terminal can receive the target area image and display the target area image. Further, the movable platform can encode the target area image to obtain the encoding result, and send the encoding result to the control terminal through a wireless communication connection with the control terminal, so that the control terminal decodes the encoding result to obtain the target area. image and display the target area image.
  • the FOV of the target area image is smaller than the target image, compared with sending the target image to the control terminal, by sending the target area image to the control terminal, bandwidth usage can be reduced and transmission resources can be saved.
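  • a minimal sketch of the encode-and-send step is given below; JPEG via OpenCV and a length-prefixed socket frame are stand-ins chosen for illustration, not the codec or link protocol actually used by the platform.

```python
import socket
import cv2  # assumed available on the aircraft side; used here only for encoding

def send_target_area_image(area_image, sock: socket.socket):
    """Encode the cropped target area image and push it to the control terminal."""
    ok, payload = cv2.imencode(".jpg", area_image, [cv2.IMWRITE_JPEG_QUALITY, 80])
    if not ok:
        raise RuntimeError("encoding failed")
    data = payload.tobytes()
    # 4-byte big-endian length prefix followed by the encoded image (illustrative framing).
    sock.sendall(len(data).to_bytes(4, "big") + data)
```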
  • as shown in Figure 5, in contrast to Figure 2, in the present application the following processing is required from when the body part holding or wearing the control terminal drives the control terminal to start rotating to when the user starts to see images that change with the rotation of the body part: when the rotation of the user's head drives the glasses end to start rotating, the glasses end first collects its own rotation information and transmits it to the aircraft end through the wireless communication connection. The aircraft end then crops, according to the rotation information, the image output by the shooting device before the pan/tilt moves in response to the rotation information (i.e., the target image) to obtain an image that changes according to the rotation information (i.e., the target area image, that is, an image that changes with the rotation of the body part).
  • the aircraft end then encodes and transmits the target area image to the glasses end, and finally the glasses end decodes and displays it, so that the user begins to see images that change as the body part rotates.
  • as can be seen by comparing Figure 5 with Figure 2, in the present application the processing required includes transmitting the rotation information, cropping the image and transmitting the image, and does not need to include the pan/tilt moving in response to the rotation information or the shooting device exposing and processing the image.
  • since cropping the image takes much less time than the pan/tilt moving in response to the rotation information and the shooting device exposing and processing the image, by not having to wait for those operations the present application can reduce the delay from when the body part wearing or holding the control terminal drives the control terminal to start rotating until the user starts to see images that change with the rotation of the body part, which helps improve the user experience.
  • the image transmission method for a movable platform provided by this embodiment obtains a target area image by cropping, according to the rotation information, the target image output by the shooting device, and sends the target area image to the control terminal, where the cropped target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information.
  • as a result, from when the body part wearing or holding the control terminal drives the control terminal to start rotating (for example, the rotation of the head drives the head-mounted display device to start rotating) until the user begins to see images that change with the rotation of the body part, there is no need to wait for the pan/tilt to move in response to the rotation information or for the shooting device to expose and process the image, which reduces the delay and helps improve the user experience.
  • Figure 6 is a schematic flowchart of an image transmission method for a movable platform provided by another embodiment of the present application. Based on the embodiment shown in Figure 3, this embodiment mainly describes an optional implementation of cropping the target image output by the shooting device according to the rotation information when the rotation information includes the posture (attitude) of the control terminal.
  • as shown in Figure 6, the method provided by this embodiment may include:
  • Step 61 Obtain the rotation information of the control terminal, where the rotation information includes the posture of the control terminal.
  • step 61 is similar to step 31 and will not be described again.
  • Step 62 Obtain the posture of the shooting device when the shooting device collects the target image.
  • Step 63 According to the posture of the control terminal and the posture of the photographing device, crop the target area image from the target image output by the photographing device.
  • Step 64 Control the movement of the pan/tilt according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information.
  • Step 65 Send the target area image to the control terminal.
  • in practical applications, when the image transmission device cannot directly determine which images are target images, the method may further include, before step 62: determining, from the images output by the shooting device, the target image corresponding to the rotation information.
  • in one embodiment, determining the target image corresponding to the rotation information from the images output by the shooting device may specifically include: determining the moment when the movable platform acquires the rotation information (recorded as the first moment) and the moment when the pan/tilt starts to move in response to the rotation information (recorded as the second moment), and taking the images output by the shooting device from the first moment to the second moment as target images.
  • for example, assuming that the movable platform acquires the rotation information at the 10th second and the pan/tilt starts to move in response to the rotation information at the 11th second, the images output by the shooting device from the 10th second to the 11th second can be used directly as target images; that is, from the 10th to the 11th second the latest image output by the shooting device can be cropped directly based on the rotation information.
  • in another embodiment, when the shooting device and the movable platform are time-synchronized, determining the target image corresponding to the rotation information from the images output by the shooting device may specifically include: obtaining the characteristic time of the rotation information, where the characteristic time of the rotation information is the time when the sensor of the control terminal collected the rotation information or the time when the movable platform obtained the rotation information; obtaining the collection times of the images output by the shooting device; and determining the target image from the images output by the shooting device based on the characteristic time and the collection times.
  • it should be noted that, since the time when the sensor of the control terminal collects the rotation information is earlier than the time when the movable platform obtains the rotation information, an image output by the shooting device before the time when the sensor of the control terminal collected the rotation information is also an image output by the shooting device before the movable platform obtained the rotation information, so the target image can be determined from the images output by the shooting device according to the time when the sensor of the control terminal collected the rotation information.
  • an image whose acquisition time output by the shooting device is earlier than or equal to the characteristic time may be determined as the target image, that is, the characteristic time of the rotation information may be earlier than or equal to the acquisition time of the target image.
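  • a minimal sketch of this timestamp matching is shown below, assuming synchronized clocks and image timestamps sorted in ascending order; the function and variable names are illustrative.

```python
from bisect import bisect_right

def select_target_image(image_timestamps, characteristic_time):
    """Return the index of the latest image whose collection time is earlier
    than or equal to the characteristic time of the rotation information."""
    i = bisect_right(image_timestamps, characteristic_time)
    if i == 0:
        return None                      # no image collected yet at that time
    return i - 1

frames = [10.00, 10.03, 10.06, 10.09]      # collection times in seconds
print(select_target_image(frames, 10.05))  # -> 1 (the frame collected at 10.03)
```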
  • the posture of the shooting device when collecting the target image can be obtained.
  • the posture of the shooting device when collecting the target image can be determined based on the posture of the pan/tilt.
  • the image output by the shooting device may have a corresponding collection time. According to the collection time of the target image and the correspondence between different collection times and the posture of the pan and tilt, the posture of the shooting device when collecting the target image can be determined.
  • the posture of the shooting device when collecting the target image can also be obtained through other methods, which is not limited in this application.
  • the attitude of the pan/tilt may be collected by an attitude sensor (such as an inertial measurement unit) configured on the pan/tilt.
  • the target area image can be cropped from the target image output by the shooting device according to the posture in the rotation information of the control terminal and the posture of the shooting device.
  • step 63 may specifically include the following steps A and B.
  • Step A Determine the deviation between the center of the target area image and the center of the target image according to the posture of the control terminal and the posture of the shooting device;
  • Step B Crop the target area image from the target image output by the shooting device according to the deviation.
  • the attitude of the control terminal may include the attitude of the control terminal in its yaw direction and/or the attitude of the control terminal in its pitch direction.
  • the attitude of the control terminal in its yaw direction may be understood as yaw rotation information.
  • the attitude of the control terminal in its pitch direction can be understood as pitch rotation information. The following description mainly takes, as an example, the case where the attitude of the control terminal includes both its attitude in the yaw direction and its attitude in the pitch direction.
  • for example, the attitude of the control terminal in its yaw direction can be represented by the attitude angle of the control terminal in the yaw direction, and the attitude of the control terminal in its pitch direction can be represented by the attitude angle of the control terminal in the pitch direction.
  • taking the control terminal being a head-mounted display device as shown in Figure 7A, with the axis direction of the head-mounted display device being the arrow direction in Figure 7A, as an example, the definitions of the attitude angle of the control terminal in its yaw direction and of the attitude angle θ of the control terminal in its pitch direction can be as shown in Figure 7B, where the XYZ coordinate system in Figure 7B is the coordinate system of the control terminal.
  • step A may specifically include the following steps A1 and A2.
  • Step A1 Determine the deviation in the U-axis direction between the center of the target area image and the center of the target image based on the attitude of the control terminal in the yaw direction and the attitude of the shooting device in the yaw direction when collecting the target image
  • Step A2 Determine the deviation in the V-axis direction between the center of the target area image and the center of the target image based on the attitude of the control terminal in the pitch direction and the attitude of the shooting device in the pitch direction when collecting the target image.
  • when the attitude is represented by attitude angles, the deviation in the U-axis direction between the center of the target area image and the center of the target image can be expressed as the deviation between the attitude angle of the control terminal in the yaw direction and the attitude angle of the shooting device in the yaw direction when the target image was collected;
  • the deviation in the V-axis direction between the center of the target area image and the center of the target image can be expressed as the deviation between the attitude angle of the control terminal in the pitch direction and the attitude angle of the shooting device in the pitch direction when the target image was collected, which can be recorded, for example, as Δθ. The deviation between the center of the target area image and the center of the target image can therefore be recorded as the pair of these two angular deviations.
  • cropping the target area image from the target image output by the shooting device according to the deviation may specifically include: determining the center of the target area image according to the deviation and the center of the target image; and, cropping the target area image from the target area image according to the center of the target area image. Crop the target area of the image.
  • the way of cropping the target area image from the target image output by the shooting device according to the deviation can be shown in Figure 8A, where F can represent the target image, O can represent the center of the target image, F' can represent the target area image, and O' It can represent the center of the target area image.
  • in one embodiment, the attitude angle of the control terminal in the yaw direction, the attitude angle of the shooting device in the yaw direction when the target image was collected, and the deviation between the two can satisfy the following formula (1); the attitude angle of the control terminal in the pitch direction, the attitude angle of the shooting device in the pitch direction when the target image was collected, and the deviation between the two can satisfy the following formula (2). In formulas (1) and (2), the deviation along the U axis is limited by a maximum offset along the U axis and the deviation along the V axis is limited by a maximum offset along the V axis (denoted Δθ_bound); the meaning of these maximum offsets can be as shown in Figure 8B, where F can represent the target image, O the center of the target image, F' the target area image, and O' the center of the target area image.
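  • since the bodies of formulas (1) and (2) are not reproduced in this text, the sketch below uses one plausible reading of them: the crop-center deviation is the control-terminal attitude angle minus the shooting-device attitude angle at the moment the target image was collected, limited to the maximum offsets, and then converted to pixels. The clamping form and the pixels-per-degree factors are assumptions.

```python
def clamp(x, bound):
    return max(-bound, min(bound, x))

def center_deviation_from_attitude(yaw_ct, pitch_ct, yaw_cam, pitch_cam,
                                   yaw_bound, pitch_bound,
                                   px_per_deg_u, px_per_deg_v):
    """Deviation of the target-area-image center from the target-image center.

    yaw_ct/pitch_ct: attitude angles of the control terminal (degrees);
    yaw_cam/pitch_cam: attitude angles of the shooting device when the target
    image was collected; the bounds are the maximum offsets along U and V."""
    d_yaw = clamp(yaw_ct - yaw_cam, yaw_bound)          # assumed form of formula (1)
    d_pitch = clamp(pitch_ct - pitch_cam, pitch_bound)  # assumed form of formula (2)
    return d_yaw * px_per_deg_u, d_pitch * px_per_deg_v

# Head yawed 5 degrees right and pitched 2 degrees down relative to the camera pose:
print(center_deviation_from_attitude(5.0, -2.0, 0.0, 0.0, 15.0, 10.0, 20.0, 20.0))
# -> (100.0, -40.0) pixel offsets along U and V
```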
  • optionally, in order to alleviate the image jitter caused by jitter of the sensor of the control terminal, the posture of the control terminal can be filtered; and/or, optionally, in order to alleviate the image jitter caused by jitter of the sensor of the movable platform, the posture of the shooting device can be filtered.
  • the filtering algorithm may be, for example, a Kalman filtering algorithm.
  • for example, the attitude of the control terminal can be filtered according to the attitudes of the control terminal from time t-1 to time t-N (that is, the attitudes at the N most recent moments, indexed by 0 ≤ i ≤ N-1), and a filtered attitude of the control terminal can be obtained; similarly, the attitude of the shooting device can be filtered according to the attitudes of the shooting device from time t-1 to time t-M (indexed by 0 ≤ j ≤ M-1), and a filtered attitude of the shooting device can be obtained.
  • in one embodiment, the filtered attitude angle of the control terminal in the yaw direction and the filtered attitude angle of the shooting device in the yaw direction when the target image was collected can satisfy the following formula (3); the filtered attitude angle of the control terminal in the pitch direction and the filtered attitude angle of the shooting device in the pitch direction when the target image was collected can satisfy the following formula (4).
  • in this way, the deviation between the center of the area image Y and the center of the image X is determined, so that the area image Y can be cropped from the image X.
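  • as a lightweight stand-in for the Kalman filter mentioned above, the sketch below smooths an attitude-angle sequence with a first-order exponential moving average; the window, the smoothing factor and the function name are assumptions for illustration.

```python
def filter_attitude(history, alpha=0.4):
    """Smooth an attitude-angle sequence to reduce sensor jitter.

    history: attitude angles of the control terminal (or of the shooting
    device) over the most recent moments, oldest first."""
    filtered = history[0]
    for angle in history[1:]:
        filtered = alpha * angle + (1.0 - alpha) * filtered
    return filtered

yaw_history = [10.0, 10.4, 9.8, 10.9, 10.2]   # degrees, jittery samples
print(round(filter_attitude(yaw_history), 2))
```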
  • the image transmission method for a movable platform provided by this embodiment crops the target area image from the target image output by the shooting device according to the posture of the control terminal and the posture of the shooting device, and sends the target area image to the control terminal, where the cropped target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information. As a result, from when the body part wearing or holding the control terminal drives the control terminal to start rotating (for example, the rotation of the head drives the head-mounted display device to rotate) until the user begins to see images that change with the rotation of the body part, there is no need to wait for the pan/tilt to move in response to the rotation information or for the shooting device to expose and process the image, which reduces the delay and helps improve the user experience.
  • Figure 9 is a schematic flowchart of an image transmission method for a movable platform provided by yet another embodiment of the present application. Based on the embodiment shown in Figure 3, this embodiment mainly describes an optional implementation of cropping the target image output by the shooting device according to the rotation information when the rotation information includes angular velocity or angular acceleration.
  • as shown in Figure 9, the method provided by this embodiment may include:
  • Step 91 Obtain rotation information of the control terminal, where the rotation information includes the angular velocity or angular acceleration of the control terminal.
  • Step 92 Crop the target area image from the target image output by the shooting device according to the angular velocity or angular acceleration of the control terminal.
  • Step 93 Control the movement of the pan/tilt according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information.
  • Step 94 Send the target area image to the control terminal.
  • in practical applications, before step 92 the method may further include: determining, from the images output by the shooting device, the target image corresponding to the rotation information; this can be done in the manner described in the foregoing embodiment.
  • the control terminal may send the angular velocity or angular acceleration to the movable platform.
  • the movable platform can control the movement of the pan/tilt based on the angular velocity or angular acceleration of the control terminal to adjust the posture of the shooting device.
  • on the other hand, the movable platform can also crop the target area image from the target image output by the shooting device based on the angular velocity or angular acceleration of the control terminal.
  • step 92 may specifically include the following steps C and D.
  • Step C Determine the deviation between the center of the target area image and the center of the target image according to the angular velocity or angular acceleration of the control terminal;
  • Step D Crop the target area image from the target image output by the shooting device according to the deviation.
  • the angular velocity of the control terminal may include the angular velocity of the control terminal in its yaw direction and/or the angular velocity of the control terminal in its pitch direction; the angular velocity of the control terminal in its yaw direction can be understood as yaw rotation information, and the angular velocity of the control terminal in its pitch direction can be understood as pitch rotation information.
  • the angular acceleration of the control terminal may include the angular acceleration of the control terminal in its yaw direction and/or the angular acceleration of the control terminal in its pitch direction.
  • the angular acceleration of the control terminal in its yaw direction may be understood as yaw rotation information.
  • the angular acceleration of the control terminal in its pitch direction can be understood as pitch rotation information.
  • the following description mainly takes, as an example, the case where the angular velocity of the control terminal includes its angular velocity in the yaw direction and its angular velocity in the pitch direction, and the angular acceleration of the control terminal includes its angular acceleration in the yaw direction and its angular acceleration in the pitch direction.
  • determining the deviation between the center of the target area image and the center of the target image based on the angular velocity of the control terminal may specifically include: determining the deviation in the U-axis direction between the center of the target area image and the center of the target image based on the angular velocity of the control terminal in the yaw direction; and determining the deviation in the V-axis direction between the center of the target area image and the center of the target image based on the angular velocity of the control terminal in the pitch direction.
  • specifically, the angular velocity of the control terminal in the yaw direction can be integrated to obtain the angle through which the control terminal has rotated in the yaw direction from the previous moment to the current moment, and from this angle the deviation in the U-axis direction between the center of the target area image and the center of the target image can be obtained, which can be recorded, for example, as Δu; the angular velocity of the control terminal in the pitch direction can be integrated to obtain the angle through which the control terminal has rotated in the pitch direction from the previous moment to the current moment, and from this angle the deviation in the V-axis direction between the center of the target area image and the center of the target image can be obtained, which can be recorded, for example, as Δv. The deviation between the center of the target area image and the center of the target image can therefore be recorded as (Δv, Δu).
  • determining the deviation between the center of the target area image and the center of the target image based on the angular acceleration of the control terminal may specifically include: determining the deviation in the U-axis direction between the center of the target area image and the center of the target image based on the angular acceleration of the control terminal in the yaw direction; and determining the deviation in the V-axis direction between the center of the target area image and the center of the target image based on the angular acceleration of the control terminal in the pitch direction.
  • specifically, the angular acceleration of the control terminal in the yaw direction can be integrated twice to obtain the angle through which the control terminal has rotated in the yaw direction from the previous moment to the current moment, and from this angle the deviation in the U-axis direction between the center of the target area image and the center of the target image can be obtained, that is, Δu; the angular acceleration of the control terminal in the pitch direction can be integrated twice to obtain the angle through which the control terminal has rotated in the pitch direction from the previous moment to the current moment, and from this angle the deviation in the V-axis direction between the center of the target area image and the center of the target image can be obtained, that is, Δv.
  • cropping the target area image from the target image output by the shooting device according to the deviation may specifically include: determining the center of the target area image according to the deviation and the center of the target image; and, cropping the target area image from the target area image according to the center of the target area image. Crop the target area of the image.
  • the way of cropping the target area image from the target image output by the shooting device according to the deviation can be shown in Figure 10, where F can represent the target image, O can represent the center of the target image, F' can represent the target area image, and O' It can represent the center of the target area image.
  • in one embodiment, the angle through which the control terminal has rotated in the yaw direction and the deviation in the U-axis direction between the center of the target area image and the center of the target image can satisfy the following formula (5); the angle through which the control terminal has rotated in the pitch direction and the deviation in the V-axis direction between the center of the target area image and the center of the target image can satisfy the following formula (6).
  • where u_t can represent the angle of rotation of the control terminal in the yaw direction obtained by integration, Δu_bound can represent the maximum offset along the U axis, v_t can represent the angle of rotation of the control terminal in the pitch direction obtained by integration, and Δv_bound can represent the maximum offset along the V axis.
  • Δu_bound can be equal to the maximum offset along the U axis in formula (1), and Δv_bound can be equal to Δθ_bound in formula (2).
  • in this way, the deviation between the center of the area image Y and the center of the image X is determined based on the angular velocity or angular acceleration, so that the area image Y can be cropped from the image X.
  • the image transmission method for a movable platform provided by this embodiment crops the target area image from the target image output by the shooting device according to the angular velocity or angular acceleration of the control terminal, and sends the target area image to the control terminal, where the cropped target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information.
  • as a result, from when the body part wearing or holding the control terminal drives the control terminal to start rotating (for example, the rotation of the head drives the head-mounted display device to rotate) until the user begins to see images that change with the rotation of the body part, there is no need to wait for the pan/tilt to move in response to the rotation information or for the shooting device to expose and process the image, which reduces the delay from when the body part drives the control terminal to rotate until the user sees the corresponding change in the image displayed on the control terminal, and helps improve the user experience.
  • sending cropped images to the control terminal can be used as a working mode, so that the user can choose whether to use the method provided by the embodiments of the present application for image transmission as needed.
  • the aforementioned step 32 may specifically include: when the working mode is the first working mode, cropping the target image output by the shooting device according to the rotation information to obtain a target area image.
  • sending uncropped images to the control terminal can also be used as another working mode, so that the user can choose to transmit uncropped images as needed for image transmission.
  • the method provided by the embodiment of the present application may also include: when the working mode is the second working mode, sending the target image output by the shooting device to the control terminal, so that the control terminal displays the target image .
  • the control terminal may include an interactive device.
  • the interactive device may include at least one of a function button, a rocker, a touch display, and a touch pad.
  • the interactive device may detect a mode setting operation input by the user, and the control terminal may determine the working mode of the movable platform according to the mode setting operation detected by the interactive device.
  • the control terminal can also detect the mode setting operation input by the user in other ways (for example, detecting the user's mode setting gesture through its configured sensor), which is not limited in this application.
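  • a small sketch of such mode-dependent transmission is given below; the enum values and the function are illustrative, not names used by the platform.

```python
from enum import Enum

class WorkMode(Enum):
    CROP = 1   # first working mode: send the cropped target area image
    FULL = 2   # second working mode: send the uncropped target image

def image_to_send(target_image, rotation_info, mode, crop_fn):
    """Select what to transmit according to the working mode chosen on the
    control terminal; crop_fn stands for the cropping routine of the earlier
    embodiments and is passed in only to keep the sketch self-contained."""
    if mode is WorkMode.CROP:
        return crop_fn(target_image, rotation_info)
    return target_image

print(image_to_send("IMG", None, WorkMode.FULL, crop_fn=lambda img, _: img))  # -> IMG
```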
  • FIG. 11 is a schematic structural diagram of an image transmission device for a movable platform provided by an embodiment of the present application.
  • the device 110 may include: a processor 111 and a memory 112 .
  • the memory 112 is used to store program codes
  • the processor 111 calls the program code, and when the program code is executed, is used to perform the following operations:
  • the movement of the pan/tilt is controlled according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the pan/tilt moves in response to the rotation information;
  • the image transmission device provided in this embodiment can be used to implement the technical solutions of the foregoing method embodiments. Its implementation principles and technical effects are similar to those of the method embodiments, and will not be described again here.
  • Embodiments of the present application also provide a movable platform, including a shooting device that collects and outputs images, a pan/tilt for carrying the shooting device and adjusting the posture of the shooting device, and an image transmission device.
  • the image transmission device in the movable platform provided in this embodiment can be as shown in Figure 11, and will not be described again here.
  • the aforementioned program can be stored in a computer-readable storage medium; when the program is executed, the steps of the above method embodiments are performed.
  • the aforementioned storage media include: ROM, RAM, magnetic disks, optical disks and other media that can store program code.

Landscapes

  • Studio Devices (AREA)

Abstract

An image transmission method, apparatus and device for a movable platform. The method includes: obtaining rotation information of a control terminal; cropping a target image output by a shooting device according to the rotation information to obtain a target area image; controlling the movement of a gimbal according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the gimbal moves in response to the rotation information; and sending the target area image to the control terminal. The method can reduce the delay from when the body part wearing or holding the control terminal drives the control terminal to start rotating until the user starts to see images that change with the rotation of the body part, which helps improve the user experience.

Description

Image transmission method, apparatus and device for a movable platform
Technical Field
The present application relates to the field of image transmission technology, and in particular to an image transmission method, apparatus and device for a movable platform.
Background
A movable platform (for example, a drone) can be equipped with a shooting device, and the shooting device can capture various images of the environment; the movable platform can use the gimbal it is equipped with to adjust the posture of the shooting device so as to change the shooting direction of the shooting device. The movable platform can establish a wireless communication connection with a control terminal (for example, a head-mounted display device), and the movable platform can send the images collected by the shooting device to the control terminal so that the control terminal displays the images.
At present, a user can hold or wear the control terminal and make it rotate; the control terminal can send the sensed rotation information to the movable platform over the wireless communication connection, and the movable platform can control the movement of the gimbal according to the rotation information of the control terminal to adjust the posture of the shooting device and thereby change its shooting direction. In this way, as the user drives the control terminal to rotate, the shooting direction of the shooting device changes with the rotation of the control terminal, and the user can see images collected by the shooting device in the changing shooting direction. For example, when the control terminal is a head-mounted display device, the head-mounted display device worn on the user's head rotates as the user's head rotates, so that the shooting direction of the shooting device changes with the user's head, and the image the user sees on the head-mounted display device changes as the head rotates.
However, from the movable platform receiving the rotation information sent by the control terminal to the gimbal starting to move so that the shooting direction of the shooting device starts to change, the delay is long. As a result, from the time the body part wearing or holding the control terminal starts to drive the control terminal to rotate (for example, the rotation of the user's head drives the worn head-mounted display device to start rotating) to the time the user starts to see images that change with the rotation of the body part, the delay is long, the use is not smooth, the user feels dizzy, and the user experience is poor.
Summary
Embodiments of the present application provide an image transmission method, apparatus and device for a movable platform, to solve the problem in the prior art that the delay from when the body part holding or wearing the control terminal drives the control terminal to start rotating to when the user starts to see images that change with the rotation of the body part is long.
In a first aspect, embodiments of the present application provide an image transmission method for a movable platform, where the movable platform includes a shooting device that collects and outputs images and a gimbal for carrying the shooting device and adjusting the posture of the shooting device, and the method includes:
obtaining rotation information of a control terminal;
cropping a target image output by the shooting device according to the rotation information to obtain a target area image;
controlling the movement of the gimbal according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the gimbal moves in response to the rotation information; and
sending the target area image to the control terminal.
In a second aspect, embodiments of the present application provide an image transmission apparatus for a movable platform, where the movable platform includes a shooting device that collects and outputs images and a gimbal for carrying the shooting device and adjusting the posture of the shooting device, and the apparatus includes: a memory and a processor;
the memory is used to store program code;
the processor calls the program code, and when the program code is executed, is used to perform the following operations:
obtaining rotation information of a control terminal;
cropping a target image output by the shooting device according to the rotation information to obtain a target area image;
controlling the movement of the gimbal according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the gimbal moves in response to the rotation information; and
sending the target area image to the control terminal.
In a third aspect, embodiments of the present application provide a movable platform that includes a shooting device that collects and outputs images, a gimbal for carrying the shooting device and adjusting the posture of the shooting device, and an image transmission apparatus, where the apparatus includes: a memory and a processor;
the memory is used to store program code;
the processor calls the program code, and when the program code is executed, is used to perform the following operations:
obtaining rotation information of a control terminal;
cropping a target image output by the shooting device according to the rotation information to obtain a target area image;
controlling the movement of the gimbal according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the gimbal moves in response to the rotation information; and
sending the target area image to the control terminal.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed, the method described in the first aspect is implemented.
In a fifth aspect, embodiments of the present application provide a computer program which, when executed by a computer, is used to implement the method described in the first aspect.
In the embodiments of the present application, the target image output by the shooting device can be cropped according to the rotation information to obtain a target area image, and the target area image is sent to the control terminal, where the target image is an image output by the shooting device before the gimbal moves in response to the rotation information. In this way, even if the gimbal has not yet rotated in response to the rotation information to change the shooting direction of the shooting device, the control terminal can already obtain an image that changes according to the rotation information; the user can see the changed image without waiting for the gimbal to move in response to the rotation information to change the shooting direction of the shooting device. This can reduce the delay from when the body part holding or wearing the control terminal drives the control terminal to rotate until the user starts to see images that change with the rotation of the body part, which helps improve the user experience.
Brief Description of the Drawings
In order to explain the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Figure 1 is a schematic diagram of an application scenario of the image transmission method provided by an embodiment of the present application;
Figure 2 is a schematic diagram of the prior-art processing from when the body part wearing or holding the control terminal drives the control terminal to start rotating to when the user starts to see images that change with the rotation of the body part;
Figure 3 is a schematic flowchart of an image transmission method for a movable platform provided by an embodiment of the present application;
Figure 4A is a schematic diagram of cropping a target area image from a target image provided by an embodiment of the present application;
Figure 4B is a schematic diagram of cropping a target area image from a target image provided by another embodiment of the present application;
Figure 5 is a schematic diagram of the processing provided by an embodiment of the present application from when the body part wearing or holding the control terminal drives the control terminal to start rotating to when the user starts to see images that change with the rotation of the body part;
Figure 6 is a schematic flowchart of an image transmission method for a movable platform provided by another embodiment of the present application;
Figure 7A is a schematic diagram of a control terminal provided by an embodiment of the present application;
Figure 7B is a schematic diagram of the definition of the attitude angles of the control terminal in its pitch direction and yaw direction provided by an embodiment of the present application;
Figure 8A is a schematic diagram of cropping a target area image from a target image provided by yet another embodiment of the present application;
Figure 8B is a schematic diagram of cropping a target area image from a target image provided by yet another embodiment of the present application;
Figure 9 is a schematic flowchart of an image transmission method for a movable platform provided by yet another embodiment of the present application;
Figure 10 is a schematic diagram of cropping a target area image from a target image provided by yet another embodiment of the present application;
Figure 11 is a schematic structural diagram of an image transmission apparatus for a movable platform provided by an embodiment of the present application.
Detailed Description
To make the purposes, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. Based on the embodiments of the present application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present application.
The image transmission method for a movable platform provided by the embodiments of the present application can be applied to the application scenario shown in Figure 1. As shown in Figure 1, the application scenario may include a movable platform 11 and a control terminal 12 connected to the movable platform 11 by wireless communication.
The movable platform 11 may include a shooting device 121 that collects and outputs images and a gimbal 122 for carrying the shooting device 121 and adjusting the posture of the shooting device 121. The user can wear or hold the control terminal 12, and the control terminal 12 can rotate following the rotation of the body part (for example, a hand or the head) wearing or holding it. The control terminal 12 can send the rotation information obtained by sensing its own rotation to the movable platform 11 over the wireless communication connection, and the movable platform 11 can control the movement of the gimbal 122 according to the received rotation information of the control terminal 12 to adjust the posture of the shooting device 121.
The movable platform 11 may specifically be a platform that includes the shooting device 121 and the gimbal 122 and can control the movement of the gimbal 122 according to the rotation information of the control terminal 12. Exemplarily, the movable platform may include at least one of a drone, an unmanned ground vehicle and an unmanned vessel; in some cases, the movable platform may also include a handheld device. The control terminal 12 may specifically be a terminal that can be communicatively connected to the movable platform 11 and can rotate with the rotation of a body part. Exemplarily, the control terminal 12 may include at least one of a head-mounted display device, a handheld display device, a smartphone, a tablet computer and a remote controller. When the head-mounted display device is a glasses-type display device, the control terminal 12 may specifically be a glasses-type display device, which may be, for example, First Person View (FPV) glasses.
As shown in Figure 2, taking the movable platform 11 being an unmanned aerial vehicle (UAV) and the control terminal 12 being a glasses-type display device as an example, in the conventional technology the following processing is required from when the body part wearing or holding the control terminal drives the control terminal to start rotating to when the user starts to see images that change with the rotation of the body part:
When the rotation of the user's head drives the glasses end to start rotating, the glasses end first collects its own rotation information and transmits it to the aircraft end over the wireless communication connection. The gimbal system on the aircraft end then filters the rotation information and uses the filtered rotation information to control the movement of the gimbal 122 to adjust the posture of the shooting device 121, so as to change the shooting direction of the shooting device 121. After the shooting direction of the shooting device 121 changes, the shooting device 121 can output images of the changed shooting direction through exposure and image signal processing (ISP). The aircraft end then encodes and transmits the images of the changed shooting direction output by the shooting device to the glasses end, and finally the glasses end decodes and displays them, so that the user starts to see images that change with the rotation of the body part. It should be noted that the processing flow shown in Figure 2 is only an example.
As can be seen in conjunction with Figure 2, in the conventional technology the processing from when the body part wearing or holding the control terminal drives the control terminal to start rotating to when the user starts to see images that change with the rotation of the body part includes transmitting the rotation information, controlling the movement of the gimbal according to the rotation information, exposing and processing the image in the shooting device, and transmitting the image, so that the delay is long.
In order to solve this technical problem of the prior art, in the embodiments of the present application the target image output by the shooting device can be cropped according to the rotation information to obtain a target area image, and the target area image is sent to the control terminal, where the target image is an image output by the shooting device before the gimbal moves in response to the rotation information. In this way, from when the body part wearing or holding the control terminal drives the control terminal to start rotating (for example, the rotation of the head drives the head-mounted display device to start rotating) to when the user starts to see images that change with the rotation of the body part, there is no need to wait for the time required for the gimbal to move in response to the rotation information and for the shooting device to expose and process the image, which reduces the delay and helps improve the user experience.
It should be noted that the image transmission method provided by the embodiments of the present application can be executed by an image transmission apparatus of the movable platform, and the apparatus may be included in the movable platform 11.
Some implementations of the present application are described in detail below with reference to the drawings. Where there is no conflict, the following embodiments and the features in the embodiments can be combined with each other.
Figure 3 is a schematic flowchart of an image transmission method for a movable platform provided by an embodiment of the present application. The execution subject of this embodiment may be the image transmission apparatus of the movable platform, and specifically the processor of the image transmission apparatus. As shown in Figure 3, the method of this embodiment may include:
Step 31: obtain rotation information of the control terminal.
The rotation information may specifically be information that can reflect the rotation changes of the control terminal. Exemplarily, the rotation information may include attitude, angular velocity or angular acceleration. The control terminal may be provided with a sensor for sensing the rotation of the control terminal and outputting the rotation information; for example, the sensor provided on the control terminal 12 may be an inertial measurement unit (IMU).
In practical applications, the rotation information may include yaw rotation information and/or pitch rotation information. Yaw rotation information is information about the rotation of the control terminal in its yaw direction, and can be used to control the yaw-axis motor of the gimbal so as to control the gimbal to rotate in its yaw direction. Pitch rotation information is information about the rotation of the control terminal in its pitch direction, and can be used to control the pitch-axis motor of the gimbal so as to control the gimbal to rotate in its pitch direction. The yaw direction of the gimbal can correspond to the U axis of the pixel plane coordinate system of the shooting device, and the pitch direction of the gimbal can correspond to the V axis of the pixel plane coordinate system of the shooting device.
Exemplarily, step 31 may specifically be receiving the rotation information sent by the control terminal. Of course, in other embodiments the rotation information of the control terminal can also be obtained in other ways, which is not limited in this application.
In this application, the rotation information can be used on the one hand to control the movement of the gimbal and on the other hand to crop the image output by the shooting device. For the implementation of controlling the movement of the gimbal according to the rotation information, reference may be made to the specific description in the related art, which will not be repeated here.
Step 32: crop the target image output by the shooting device according to the rotation information to obtain a target area image.
Step 33: control the movement of the gimbal according to the rotation information to adjust the posture of the shooting device, wherein the cropped target image is an image output by the shooting device before the gimbal moves in response to the rotation information.
Optionally, since some processing is required between obtaining the rotation information and the gimbal moving in response to it (for example, filtering the rotation information and calculating motor control amounts from the filtered result), a certain amount of time elapses, and because the shooting frequency of the shooting device is usually high, the shooting device can output one or more images during this period; in addition, when the rotation information is obtained, the movable platform may have cached one or more images output by the shooting device that need to be sent to the control terminal. Therefore the target image may include: images output by the shooting device between obtaining the rotation information and the gimbal moving in response to the rotation information, and/or images output by the shooting device before the rotation information was obtained.
It should be noted that, for images output by the shooting device after the gimbal has moved in response to the rotation information, whether to crop them according to the rotation information can be decided flexibly as needed. It can be understood that the field of view (FOV) of the cropped target area image is smaller than the FOV of the target image.
The specific way of cropping the target image may be related to the way the movement of the gimbal is controlled according to the rotation information. Exemplarily, when the rotation information includes yaw rotation information, the yaw-axis motor of the gimbal can be controlled according to the yaw rotation information, and the U-axis pixel range that needs to be cropped out of the target image can be determined; when the rotation information includes pitch rotation information, the pitch-axis motor of the gimbal can be controlled according to the pitch rotation information, and the V-axis pixel range that needs to be cropped out of the target image can be determined.
For example, when the rotation information includes yaw rotation information indicating that the control terminal yaws to the left and pitch rotation information indicating that the control terminal pitches upward, the target area image can be cropped from the target image as shown in Figure 4A. As another example, when the rotation information includes yaw rotation information indicating that the control terminal yaws to the right and pitch rotation information indicating that the control terminal pitches downward, the target area image can be cropped from the target image as shown in Figure 4B. It should be noted that in Figures 4A and 4B, 41 denotes the target image and 42 denotes the target area image.
Optionally, the target area image can be cropped on the basis of two factors, the image center and the image size; that is, the target area image can be cropped from the target image according to the size of the target area image and the center of the target area image. On this basis, in one embodiment the rotation information of the control terminal can be used to determine the deviation between the center of the target area image and the center of the target image. Exemplarily, when the rotation information of the control terminal includes yaw rotation information of the control terminal, the yaw rotation information can be used to determine the deviation between the center of the target area image and the center of the target image on the U axis; and/or, exemplarily, when the rotation information of the control terminal includes pitch rotation information of the control terminal, the pitch rotation information can be used to determine the deviation between the center of the target area image and the center of the target image on the V axis.
Optionally, the size of the target area image may be fixed; for example, the size of the target area image may be preset, or may be set by the user.
Alternatively and optionally, the size of the target area image may be variable. It should be understood that the larger the target area image, the larger its data volume and the higher the requirements on the communication connection between the movable platform and the control terminal; the smaller the target area image, the smaller its data volume and the lower those requirements. The size of the target area image may therefore be positively related to the state of the communication connection between the movable platform and the control terminal.
Since the state of the communication connection between the movable platform and the control terminal can be related to the moving speed of the movable platform and the distance between the movable platform and the control terminal, and can be characterized by the communication quality, in one embodiment the size of the target area image may be determined according to at least one of the moving speed of the movable platform, the distance between the movable platform and the control terminal, and the communication quality between the movable platform and the control terminal. The communication quality may be determined by at least one of communication bandwidth, transmission bit error rate, signal-to-noise ratio, and the like.
步骤34,将所述目标区域图像发送给所述控制终端。
其中,所述可移动平台可以将裁切处理得到的目标区域图像发送给所述控制终端,所述控制终端可以接收到所述目标区域图像,并显示所述目标区域图像。进一步地,所述可移动平台可以对目标区域图像进行编码得到编码结果, 并将编码结果通过与控制终端之间的无线通信连接发送给控制终端,以由控制终端对编码结果进行解码得到目标区域图像,并显示目标区域图像。
需要说明的是,由于目标区域图像的FOV小于目标图像,因此与将目标图像发送给控制终端相比,通过将目标区域图像发送给控制终端,可以减少带宽占用,节省传输资源。
如图5所示,对比于图2,在本申请中从握持或佩戴控制终端的身体部位带动控制终端开始转动,到用户开始看到随身体部位转动而变化的图像需要经过以下处理过程:
在用户头部转动带动眼镜端开始转动时,首先眼镜端可以采集自身的转动信息,并将转动信息通过无线通信连接传输到飞机端,然后飞机端可以根据转动信息对云台响应于转动信息运动前拍摄装置输出的图像(即目标图像)进行裁切处理,得到根据转动信息而变化的图像(即目标区域图像,也即随身体部分转动而变化的图像),之后飞机端可以对目标区域图像进行编码传输,以传输到眼镜端,最后眼镜端可以进行解码显示,从而用户开始看到随身体部分转动而变化的图像。
结合图2和图5可以看出,本申请中从佩戴或握持控制终端的身体部位带动控制终端开始转动,到用户开始看到随身体部位转动而变化的图像,需要经过的处理过程包括传输转动信息、裁切图像以及传输图像,而无需包括云台响应于转动信息运动、拍摄装置曝光并处理图像,由于裁剪图像相比于云台响应于转动信息运动、拍摄装置曝光并处理图像的耗时要短的多,因此本申请中通过不需要等待云台响应于转动信息运动以及拍摄装置曝光并处理图像所需的时间,能够减少从佩戴或握持控制终端的身体部位带动控制终端开始转动,到用户开始看到随身体部位转动而变化的图像的延迟时间,从而有利于提升用户的使用体验。
本实施例提供的可移动平台的图像传输方法，通过根据转动信息对拍摄装置输出的目标图像进行裁切处理，以得到目标区域图像，并将目标区域图像发送给控制终端，其中，被裁切的所述目标图像是云台响应于转动信息运动前拍摄装置输出的图像，使得从佩戴或握持控制终端的身体部位带动控制终端开始转动（例如头部转动带动头戴式显示装置开始转动），到用户开始看到随身体部位转动而变化的图像，可以不需要等待云台响应于转动信息运动以及拍摄装置曝光并处理图像这部分处理所需的时间，从而能够减少从佩戴或握持控制终端的身体部位带动控制终端开始转动，到用户开始看到随身体部位转动而变化的图像的延迟时间，有利于提升用户的使用体验。
图6为本申请另一实施例提供的可移动平台的图像传输方法的流程示意图,本实施例在图3所示实施例的基础上,主要描述了在转动信息包括姿态时,根据转动信息对拍摄装置输出的目标图像进行裁切处理的一种可选实现方式,如图6所示,本实施例提供的方法可以包括:
步骤61,获取所述控制终端的转动信息,所述转动信息包括所述控制终端的姿态。
需要说明的是,步骤61与步骤31类似,在此不再赘述。
步骤62,获取所述拍摄装置采集目标图像时所述拍摄装置的姿态。
步骤63,根据所述控制终端的姿态和所述拍摄装置的姿态,从所述拍摄装置输出的目标图像中裁切目标区域图像。
步骤64,根据所述转动信息控制所述云台运动,以调节所述拍摄装置的姿态,其中,被裁切的所述目标图像是所述云台响应于所述转动信息运动前所述拍摄装置输出的图像。
步骤65,将所述目标区域图像发送给所述控制终端。
在实际应用中,在图像传输装置无法直接获知哪些是目标图像时,在步骤62之前还可以包括:从拍摄装置输出的图像中确定与转动信息对应的目标图像。
一个实施例中,从拍摄装置输出的图像中确定与转动信息对应的目标图像具体可以包括:确定可移动平台获取到转动信息的时刻(记为第一时刻),以及云台响应于转动信息开始运动的时刻(记为第二时刻),并将第一时刻到第二时刻拍摄装置输出的图像作为目标图像。例如,假设可移动平台在第10秒获取到转动信息,且在第11秒云台响应于转动信息开始运动,则可以直接将第10秒到第11秒拍摄装置输出的图像作为目标图像,即在第10秒到第11秒,可以直接根据转动信息对拍摄装置最新输出的图像进行裁切处理。
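按照上述第一时刻、第二时刻的思路，从缓存图像中筛选目标图像的过程可示意如下，其中缓存结构与时间戳字段均为本文假设：

```python
def select_target_images(buffered_frames, t_first: float, t_second: float):
    """筛选采集时刻落在 [t_first, t_second) 内的图像作为目标图像。

    buffered_frames 为 (采集时刻, 图像) 元组的列表；t_first 为获取到转动信息的第一时刻，
    t_second 为云台响应于转动信息开始运动的第二时刻。
    """
    return [image for (capture_time, image) in buffered_frames
            if t_first <= capture_time < t_second]
```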
另一个实施例中，在拍摄装置与可移动平台的时间同步的情况下，从拍摄装置输出的图像中确定与转动信息对应的目标图像，具体可以包括：获取转动信息的特征时刻，其中，转动信息的特征时刻为控制终端的传感器采集转动信息的时刻或者可移动平台获取到转动信息的时刻；获取拍摄装置输出的图像的采集时刻；根据特征时刻和采集时刻，从拍摄装置输出的图像中确定目标图像。
需要说明的是，由于控制终端的传感器采集转动信息的时刻是早于可移动平台获取到转动信息的时刻，因此，控制终端的传感器采集转动信息的时刻之前拍摄装置输出的图像，为可移动平台获取到转动信息之前拍摄装置输出的图像，从而可以根据控制终端的传感器采集转动信息的时刻，从拍摄装置输出的图像中确定目标图像。
示例性的,可以将拍摄装置输出的采集时刻早于或等于特征时刻的图像,确定为目标图像,即转动信息的特征时刻可以早于或等于目标图像的采集时刻。
针对目标图像,可以获取采集目标图像时拍摄装置的姿态,采集目标图像时拍摄装置的姿态可以根据云台的姿态确定的。例如,拍摄装置输出的图像可以存在对应的采集时刻,根据目标图像的采集时刻以及不同采集时刻与云台姿态的对应关系,可以确定采集目标图像时拍摄装置的姿态。当然,在其他实施例中,也可以通过其他方式获取采集目标图像时拍摄装置的姿态,本申请对此不做限定。所述云台的姿态可以是云台配置的姿态传感器(例如惯性测量单元)采集得到的。
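对于“根据目标图像的采集时刻与不同采集时刻对应的云台姿态确定拍摄装置姿态”这一步，下面给出一个基于时间戳最近邻查找的简化示意（数据结构与查找方式均为假设，也可以改用插值等其他方式）：

```python
import bisect

def attitude_at(capture_time: float, attitude_log):
    """在云台姿态记录中查找与目标图像采集时刻最接近的姿态（最近邻方式，仅作示意）。

    attitude_log 为按时间升序排列的 (时间戳, (偏航角, 俯仰角)) 列表。
    """
    times = [t for t, _ in attitude_log]
    i = bisect.bisect_left(times, capture_time)
    if i == 0:
        return attitude_log[0][1]
    if i == len(times):
        return attitude_log[-1][1]
    before, after = attitude_log[i - 1], attitude_log[i]
    # 取时间上更接近采集时刻的一条姿态记录
    return before[1] if capture_time - before[0] <= after[0] - capture_time else after[1]
```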
在确定采集目标图像时拍摄装置的姿态后,可以根据控制终端的转动信息中的姿态和拍摄装置的姿态,从拍摄装置输出的目标图像中裁切目标区域图像。
在控制终端的姿态用于确定目标区域图像的中心和目标图像的中心之间的偏差的情况下,步骤63具体可以包括如下步骤A和步骤B。
步骤A,根据控制终端的姿态和拍摄装置的姿态,确定目标区域图像的中心与目标图像中心之间的偏差;
步骤B,根据偏差从拍摄装置输出的目标图像中裁切目标区域图像。
其中,控制终端的姿态可以包括控制终端在其偏航方向上的姿态和/或控制终端在其俯仰方向上的姿态,控制终端在其偏航方向上的姿态可以理解为偏航转动信息,控制终端在其俯仰方向上的姿态可以理解为俯仰转动信息。以下主要以控制终端的姿态包括控制终端在其偏航方向上的姿态以及控制终端在其俯仰方向上的姿态为例进行具体说明。
示例性的，控制终端在其偏航方向上的姿态可以使用控制终端在其偏航方向上的姿态角度来表示，控制终端在其俯仰方向上的姿态例如可以使用控制终端在其俯仰方向上的姿态角度来表示。以控制终端为如图7A所示的头戴式显示设备，且头戴显示设备的轴线方向为图7A中的箭头方向为例，则控制终端在其偏航方向上的姿态角度ψ以及控制终端在其俯仰方向上的姿态角度θ的定义可以如图7B所示，图7B中的XYZ坐标系是控制终端的坐标系。
示例性的,步骤A具体可以包括如下步骤A1和A2。步骤A1,根据控制终端在偏航方向上的姿态和采集目标图像时拍摄装置在偏航方向上的姿态,确定目标区域图像的中心与目标图像的中心之间在U轴方向上的偏差;步骤A2,根据控制终端在俯仰方向上的姿态和采集目标图像时拍摄装置在俯仰方向上的姿态,确定目标区域图像的中心与目标图像的中心之间在V轴方向上的偏差。
在通过姿态角度表示姿态时，目标区域图像的中心与目标图像的中心之间在U轴方向上的偏差可以表示为控制终端在偏航方向上的姿态角度和采集目标图像时拍摄装置在偏航方向上的姿态角度的偏差，例如可以记为Δψ；目标区域图像的中心与目标图像的中心之间在V轴方向上的偏差可以表示为控制终端在俯仰方向上的姿态角度和采集目标图像时拍摄装置在俯仰方向上的姿态角度的偏差，例如可以记为Δθ。从而目标区域图像的中心与目标图像中心之间的偏差可以记为(Δθ, Δψ)。
示例性的,根据偏差从拍摄装置输出的目标图像中裁切目标区域图像,具体可以包括:根据偏差以及目标图像的中心,确定目标区域图像的中心;以及,根据目标区域图像的中心从目标图像中裁切目标区域图像。根据偏差从拍摄装置输出的目标图像中裁切目标区域图像的方式可以如图8A所示,其中,F可以表示目标图像,O可以表示目标图像的中心,F’可以表示目标区域图像,O’可以表示目标区域图像的中心。
一个实施例中,控制终端在偏航方向上的姿态角度、采集目标图像时拍摄装置在偏航方向上的姿态角度以及两者的偏差可以满足如下公式(1);控制终端在俯仰方向上的姿态角度、采集目标图像时拍摄装置在俯仰方向上的姿态角度以及两者的偏差可以满足如下公式(2)。
Δψ = max(min(ψ_t - ψ_c, Δψ_bound), -Δψ_bound)       公式(1)
Δθ = max(min(θ_t - θ_c, Δθ_bound), -Δθ_bound)       公式(2)
其中，ψ_t 可以表示控制终端在偏航方向上的姿态角度，ψ_c 可以表示采集目标图像时拍摄装置在偏航方向上的姿态角度，Δψ_bound 可以表示沿U轴偏移的最大值；θ_t 可以表示控制终端在俯仰方向上的姿态角度，θ_c 可以表示采集目标图像时拍摄装置在俯仰方向上的姿态角度，Δθ_bound 可以表示沿V轴偏移的最大值，Δθ_bound 和 Δψ_bound 的含义可以如图8B所示，其中，F可以表示目标图像，O可以表示目标图像的中心，F'可以表示目标区域图像，O'可以表示目标区域图像的中心。
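结合公式(1)、公式(2)，偏差的计算可示意如下，其中角度偏差到像素偏差的换算系数 pixels_per_degree 为本文假设的参数，实际可结合拍摄装置的视场角与图像分辨率确定：

```python
def clamp(x: float, bound: float) -> float:
    """按公式(1)、公式(2)的形式，把偏差限制在 [-bound, bound] 内。"""
    return max(min(x, bound), -bound)

def center_offset_from_attitude(yaw_ctrl, yaw_cam, pitch_ctrl, pitch_cam,
                                yaw_bound, pitch_bound, pixels_per_degree=10.0):
    """由控制终端姿态角与采集目标图像时拍摄装置姿态角计算中心偏差（示意）。

    yaw_ctrl / pitch_ctrl：控制终端在偏航 / 俯仰方向上的姿态角度（度）；
    yaw_cam / pitch_cam：采集目标图像时拍摄装置在偏航 / 俯仰方向上的姿态角度（度）；
    yaw_bound / pitch_bound：沿 U 轴 / V 轴偏移的最大值；
    pixels_per_degree：角度偏差到像素偏差的换算系数（假设值）。
    """
    d_yaw = clamp(yaw_ctrl - yaw_cam, yaw_bound)           # 对应公式(1)
    d_pitch = clamp(pitch_ctrl - pitch_cam, pitch_bound)   # 对应公式(2)
    offset_u = d_yaw * pixels_per_degree    # U 轴方向上的像素偏差
    offset_v = d_pitch * pixels_per_degree  # V 轴方向上的像素偏差
    return offset_u, offset_v
```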
可选的,为了减轻由于控制终端的传感器抖动带来的图像抖动问题,可以对控制终端的姿态进行滤波;和/或可选的,为了减轻由于可移动平台的传感器抖动带来的图像抖动问题,可以对拍摄装置的姿态进行滤波。其中,滤波算法例如可以为卡尔曼滤波算法。
示例性的，可以根据第t-1时刻至第t-N时刻控制终端的姿态对ψ_t进行滤波，第t-1时刻至第t-N时刻控制终端的姿态可以表示为ψ_{t-i}（0<i≤N-1），得到的滤波后的姿态可以表示为ψ'_t；可以根据第t-1时刻至第t-M时刻拍摄装置的姿态对ψ_c进行滤波，第t-1时刻至第t-M时刻拍摄装置的姿态可以表示为ψ_{c,t-j}（0<j≤M-1），得到的滤波后的姿态可以表示为ψ'_c。控制终端与拍摄装置在俯仰方向上的姿态角度θ_t、θ_c可以按同样的方式进行滤波，分别得到θ'_t和θ'_c。
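对于上述滤波处理，除卡尔曼滤波外也可以采用更简单的滑动平均方式，下面给出一个基于最近若干时刻姿态角的滑动平均滤波示意，仅作为一种可能的替代方案，并不构成对滤波算法的限定：

```python
from collections import deque

class AttitudeFilter:
    """用最近 N 个时刻的姿态角做滑动平均，以减轻传感器抖动带来的图像抖动（示意）。"""

    def __init__(self, window: int = 5):
        self.samples = deque(maxlen=window)

    def update(self, angle_deg: float) -> float:
        """输入当前时刻的姿态角，返回滤波后的姿态角。"""
        self.samples.append(angle_deg)
        return sum(self.samples) / len(self.samples)
```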
另一个实施例中,控制终端在偏航方向上的姿态角度与采集目标图像时拍摄装置在偏航方向上的姿态角度可以满足如下公式(3);控制终端在俯仰方向上的姿态角度与采集目标图像时拍摄装置在俯仰方向上的姿态角度可以满足如下公式(4)。
Δψ = max(min(ψ'_t - ψ'_c, Δψ_bound), -Δψ_bound)       公式(3)
Δθ = max(min(θ'_t - θ'_c, Δθ_bound), -Δθ_bound)       公式(4)
其中，ψ'_t 可以表示控制终端在偏航方向上的滤波后的姿态角度，ψ'_c 可以表示采集目标图像时拍摄装置在偏航方向上的滤波后的姿态角度，Δψ_bound 可以表示沿U轴偏移的最大值；θ'_t 可以表示控制终端在俯仰方向上的滤波后的姿态角度，θ'_c 可以表示采集目标图像时拍摄装置在俯仰方向上的滤波后的姿态角度，Δθ_bound 可以表示沿V轴偏移的最大值；Δθ_bound 和 Δψ_bound 的含义可以如图8B所示。
类似的,在需要根据控制终端的转动信息,对云台响应于转动信息运动过程中拍摄装置输出的图像(以下可以记为图像X)进行裁切时,可以根据控制终端的姿态和采集图像X时拍摄装置的姿态,确定区域图像Y的中心与图像X的中心之间的偏差,从而可以实现从图像X中裁切区域图像Y。
本实施例提供的可移动平台的图像传输方法,通过根据控制终端的姿态和拍摄装置的姿态,从拍摄装置输出的目标图像中裁切目标区域图像,并将目标区域图像发送给控制终端,其中,被裁切的所述目标图像是云台响应于转动信息运动前拍摄装置输出的图像,使得从佩戴或握持控制终端的身体部位带动控制终端开始转动(例如头部转动带动头戴式显示装置转动),到用户开始看到随身体部位转动而变化的图像,可以不需要等待云台响应于转动信息运动以及拍摄装置曝光并处理图像这部分处理所需的时间,从而能够减少从佩戴或握持控制终端的身体部位带动控制终端开始转动,到用户开始看到随身体部位转动而变化的图像的延迟时间,有利于提升用户的使用体验。
图9为本申请又一实施例提供的可移动平台的图像传输方法的流程示意图,本实施例在图3所示实施例的基础上,主要描述了在转动信息包括角速度或角加速度时,根据转动信息对拍摄装置输出的目标图像进行裁切处理的一种可选实现方式,如图9所示,本实施例提供的方法可以包括:
步骤91,获取所述控制终端的转动信息,所述转动信息包括所述控制终端的角速度或角加速度。
步骤92,根据所述控制终端的角速度或者角加速度,从拍摄装置输出的目标图像中裁切目标区域图像。
步骤93,根据所述转动信息控制所述云台运动,以调节所述拍摄装置的姿态,其中,被裁切的所述目标图像是所述云台响应于所述转动信息运动前所述拍摄装置输出的图像。
步骤94,将所述目标区域图像发送给所述控制终端。
类似的,在步骤92之前还可以包括:从拍摄装置输出的图像中确定与转动信息对应的目标图像。关于确定目标图像的具体方式可以参考图6所示实施例的相关描述,在此不再赘述。
一个实施例中,在拍摄装置的初始姿态与控制终端的初始姿态对齐的情况下,控制终端可以向可移动平台发送角速度或角加速度。可移动平台可以根据控制终端的角速度或角加速度控制云台运动,以调节拍摄装置的姿态,可移动平台还可以根据控制终端的角速度或角加速度,从拍摄装置输出的目标图像中裁切目标区域图像。
在控制终端的角速度或角加速度用于确定目标区域图像的中心和目标图像的中心之间的偏差的情况下,步骤92具体可以包括如下步骤C和步骤D。
步骤C,根据所述控制终端的角速度或者角加速度,确定目标区域图像的中心与所述目标图像中心之间的偏差;
步骤D,根据所述偏差从拍摄装置输出的目标图像中裁切目标区域图像。
其中,控制终端的角速度可以包括控制终端在其偏航方向上的角速度和/或控制终端在其俯仰方向上的角速度,控制终端在其偏航方向上的角速度可以理解为偏航转动信息,控制终端在其俯仰方向上的角速度可以理解为俯仰转动信息。控制终端的角加速度可以包括控制终端在其偏航方向上的角加速度和/或控制终端在其俯仰方向上的角加速度,控制终端在其偏航方向上的角加速度可以理解为偏航转动信息,控制终端在其俯仰方向上的角加速度可以理解为俯仰转动信息。
以下主要以控制终端的角速度包括控制终端在其偏航方向上的角速度以及控制终端在其俯仰方向上的角速度,控制终端的角加速度包括控制终端在其偏航方向上的角加速度以及控制终端在其俯仰方向上的角加速度为例进行具体说明。
示例性的,根据控制终端的角速度,确定目标区域图像的中心与目标图像中心之间的偏差,具体可以包括:根据控制终端在偏航方向上的角速度,确定目标区域图像的中心与目标图像的中心之间在U轴方向上的偏差;根据控制终端在俯仰方向上的角速度,确定目标区域图像的中心与目标图像的中心之间在V轴方向上的偏差。
其中，可以通过对控制终端在偏航方向上的角速度进行积分，得到前一时刻至当前时刻，控制终端在偏航方向上转动的角度，根据控制终端在偏航方向上转动的角度可以得到目标区域图像的中心与目标图像的中心之间在U轴方向上的偏差，例如可以记为Δu；可以通过对控制终端在俯仰方向上的角速度进行积分，得到前一时刻至当前时刻，控制终端在俯仰方向上转动的角度，根据控制终端在俯仰方向上转动的角度可以得到目标区域图像的中心与目标图像的中心之间在V轴方向上的偏差，例如可以记为Δv。从而目标区域图像的中心与目标图像中心之间的偏差可以记为(Δv,Δu)。
示例性的，根据控制终端的角加速度，确定目标区域图像的中心与目标图像中心之间的偏差，具体可以包括：根据控制终端在偏航方向上的角加速度，确定目标区域图像的中心与目标图像的中心之间在U轴方向上的偏差；根据控制终端在俯仰方向上的角加速度，确定目标区域图像的中心与目标图像的中心之间在V轴方向上的偏差。
其中，可以通过对控制终端在偏航方向上的角加速度进行两次积分，得到前一时刻至当前时刻控制终端在偏航方向上转动的角度，根据控制终端在偏航方向上转动的角度可以得到目标区域图像的中心与目标图像的中心之间在U轴方向上的偏差，即Δu；可以通过对控制终端在俯仰方向上的角加速度进行两次积分，得到前一时刻至当前时刻控制终端在俯仰方向上转动的角度，根据控制终端在俯仰方向上转动的角度可以得到目标区域图像的中心与目标图像的中心之间在V轴方向上的偏差，即Δv。
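上述由角速度（或角加速度）积分得到转动角度的过程，可以用如下的离散累加来示意，其中采样间隔 dt 与初始角速度 w0 均为本文假设的参数：

```python
def integrate_rate(rate_samples, dt: float) -> float:
    """对角速度采样做一次积分，得到该时间段内控制终端转动的角度。"""
    return sum(w * dt for w in rate_samples)

def integrate_accel(accel_samples, dt: float, w0: float = 0.0) -> float:
    """对角加速度采样做两次积分，得到该时间段内控制终端转动的角度。

    w0 为积分起始时刻的角速度，缺省假设为 0。
    """
    angle, w = 0.0, w0
    for a in accel_samples:
        w += a * dt      # 第一次积分：角加速度 -> 角速度
        angle += w * dt  # 第二次积分：角速度 -> 角度
    return angle
```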
示例性的,根据偏差从拍摄装置输出的目标图像中裁切目标区域图像,具体可以包括:根据偏差以及目标图像的中心,确定目标区域图像的中心;以及,根据目标区域图像的中心从目标图像中裁切目标区域图像。根据偏差从拍摄装置输出的目标图像中裁切目标区域图像的方式可以如图10所示,其中,F可以表示目标图像,O可以表示目标图像的中心,F’可以表示目标区域图像,O’可以表示目标区域图像的中心。
一个实施例中,控制终端在偏航方向上转动的角度和目标区域图像的中心与目标图像的中心之间在U轴方向上的偏差可以满足如下公式(5);控制终端在俯仰方向上转动的角度和目标区域图像的中心与目标图像的中心之间在V轴方向上的偏差可以满足如下公式(6)。
Δu = max(min(u_t, Δu_bound), -Δu_bound)       公式(5)
Δv = max(min(v_t, Δv_bound), -Δv_bound)       公式(6)
其中，u_t 可以表示积分得到的控制终端在偏航方向上转动的角度，Δu_bound 可以表示沿U轴偏移的最大值，v_t 可以表示积分得到的控制终端在俯仰方向上转动的角度，Δv_bound 可以表示沿V轴偏移的最大值。Δu_bound 可以等于公式(1)中的 Δψ_bound，Δv_bound 可以等于公式(2)中的 Δθ_bound。
在需要根据控制终端的转动信息,对云台响应于转动信息运动过程中拍摄装置输出的图像(即图像X)进行裁切时,可以根据控制终端的角速度或角加速度和采集图像X时拍摄装置的角速度或角加速度,确定区域图像Y的中心与图像X的中心之间的偏差,从而可以实现从图像X中裁切区域图像Y。
本实施例提供的可移动平台的图像传输方法，通过根据控制终端的角速度或者角加速度，从拍摄装置输出的目标图像中裁切目标区域图像，并将目标区域图像发送给控制终端，其中，被裁切的所述目标图像是云台响应于转动信息运动前拍摄装置输出的图像，使得从佩戴或握持控制终端的身体部位带动控制终端开始转动（例如头部转动带动头戴式显示装置转动），到用户开始看到随身体部位转动而变化的图像，可以不需要等待云台响应于转动信息运动以及拍摄装置曝光并处理图像这部分处理所需的时间，从而能够减少从佩戴或握持控制终端的身体部位带动控制终端转动，到用户看到控制终端显示的图像相应变化的延迟时间，有利于提升用户的使用体验。
可选的,在上述方法实施例的基础上,可以将向控制终端发送裁切图像作为一种工作模式,以便于用户可以根据需要选择是否采用本申请实施例提供的方法进行图像传输。基于此,一个实施例中,前述步骤32具体可以包括:在工作模式是第一工作模式时,根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像。
进一步可选的，还可以将向控制终端发送未裁切图像作为另一种工作模式，以便于用户可以根据需要选择采用发送未裁切图像的方式进行图像传输。基于此，一个实施例中，本申请实施例提供的方法还可以包括：在工作模式是第二工作模式时，将拍摄装置输出的目标图像发送给控制终端，以使控制终端显示所述目标图像。
示例性的，控制终端上可以包括交互装置，例如，所述交互装置可以包括功能按键、摇杆、触摸显示器、触摸板中的至少一种，所述交互装置可以检测用户输入的模式设置操作，控制终端可以根据所述交互装置检测到的模式设置操作确定可移动平台的工作模式。当然，在其他实施例中，控制终端也可以通过其他方式检测用户输入的模式设置操作（例如通过其配置的传感器检测用户的模式设置手势），本申请对此不做限定。
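结合上述两种工作模式，图像发送逻辑可示意如下，其中模式名称与接口均为本文假设的示例：

```python
FIRST_MODE = "crop"   # 第一工作模式：发送根据转动信息裁切得到的目标区域图像
SECOND_MODE = "full"  # 第二工作模式：发送拍摄装置输出的未裁切目标图像

def choose_frame_to_send(mode: str, target_image, rotation_info, crop_fn):
    """根据工作模式决定向控制终端发送裁切图像还是完整图像（示意）。"""
    if mode == FIRST_MODE:
        return crop_fn(target_image, rotation_info)
    return target_image
```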
图11为本申请一实施例提供的可移动平台的图像传输装置的结构示意图,如图11所示,该装置110可以包括:处理器111和存储器112。
所述存储器112,用于存储程序代码;
所述处理器111,调用所述程序代码,当程序代码被执行时,用于执行以下操作:
获取控制终端的转动信息;
根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像;
根据所述转动信息控制所述云台运动,以调节所述拍摄装置的姿态,其中,被裁切的所述目标图像是所述云台响应于所述转动信息运动前所述拍摄装置输出的图像;
将所述目标区域图像发送给所述控制终端。
本实施例提供的图像传输装置,可以用于执行前述方法实施例的技术方案,其实现原理和技术效果与方法实施例类似,在此不再赘述。
本申请实施例还提供一种可移动平台,包括采集并输出图像的拍摄装置,用于承载所述拍摄装置并调节所述拍摄装置的姿态的云台,以及图像传输装置。本实施例提供的可移动平台中的图像传输装置,可以如图11所示,在此不再赘述。
本领域普通技术人员可以理解:实现上述各方法实施例的全部或部分步骤可以通过程序指令相关的硬件来完成。前述的程序可以存储于一计算机可读取存储介质中。该程序在执行时,执行包括上述各方法实施例的步骤;而前述的存储介质包括:ROM、RAM、磁碟或者光盘等各种可以存储程序代码的介质。
最后应说明的是:以上各实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述各实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的范围。

Claims (49)

  1. 一种可移动平台的图像传输方法,所述可移动平台包括采集并输出图像的拍摄装置和用于承载所述拍摄装置并调节所述拍摄装置的姿态的云台,其特征在于,所述方法包括:
    获取控制终端的转动信息;
    根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像;
    根据所述转动信息控制所述云台运动,以调节所述拍摄装置的姿态,其中,被裁切的所述目标图像是所述云台响应于所述转动信息运动前所述拍摄装置输出的图像;
    将所述目标区域图像发送给所述控制终端。
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    从所述拍摄装置输出的图像中确定与所述转动信息对应的目标图像。
  3. 根据权利要求2所述的方法,其特征在于,所述从所述拍摄装置输出的图像中确定与所述转动信息对应的目标图像,包括:
    获取所述转动信息的特征时刻,其中,所述转动信息的特征时刻为所述控制终端的传感器采集所述转动信息的时刻或者所述可移动平台获取到所述转动信息的时刻;
    获取所述拍摄装置输出的图像的采集时刻;
    根据所述特征时刻和所述采集时刻,从所述拍摄装置输出的图像中确定所述目标图像。
  4. 根据权利要求3所述的方法,其特征在于,所述转动信息的特征时刻早于或等于所述目标图像的采集时刻。
  5. 根据权利要求1-4中任一项所述的方法,其特征在于,所述控制终端的转动信息包括所述控制终端的姿态,所述根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像,包括:
    获取所述拍摄装置采集所述目标图像时所述拍摄装置的姿态;
    根据所述控制终端的姿态和所述拍摄装置的姿态,从所述拍摄装置输出的目标图像中裁切目标区域图像。
  6. 根据权利要求5所述的方法，其特征在于，所述根据所述控制终端的姿态和所述拍摄装置的姿态，从所述拍摄装置输出的目标图像中裁切目标区域图像，包括：
    根据所述控制终端的姿态和所述拍摄装置的姿态,确定目标区域图像的中心与所述目标图像中心之间的偏差;
    根据所述偏差从所述拍摄装置输出的目标图像中裁切目标区域图像。
  7. 根据权利要求1-4中任一项所述的方法,其特征在于,所述控制终端的转动信息包括所述控制终端的角速度或者角加速度,所述根据所述转动信息从拍摄装置输出的目标图像中裁切目标区域图像,包括:
    根据所述控制终端的角速度或者角加速度,从拍摄装置输出的目标图像中裁切目标区域图像。
  8. 根据权利要求7所述的方法,其特征在于,所述根据所述控制终端的角速度或者角加速度,从拍摄装置输出的目标图像中裁切目标区域图像,包括:
    根据所述控制终端的角速度或者角加速度,确定目标区域图像的中心与所述目标图像中心之间的偏差;
    根据所述偏差从拍摄装置输出的目标图像中裁切目标区域图像。
  9. 根据权利要求1所述的方法,其特征在于,所述转动信息包括姿态、角速度、角加速度中的至少一种。
  10. 根据权利要求1所述的方法,其特征在于,所述控制终端的转动信息用于确定所述目标区域图像的中心和所述目标图像的中心之间的偏差。
  11. 根据权利要求10所述的方法,其特征在于,所述控制终端的转动信息包括所述控制终端的偏航转动信息和俯仰转动信息,所述偏航转动信息和俯仰转动信息分别用于确定所述目标区域图像的中心和所述目标图像的中心在U轴和V轴上的偏差。
  12. 根据权利要求1所述的方法,其特征在于,所述根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像,包括:
    在工作模式是第一工作模式时,根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像。
  13. 根据权利要求12所述的方法,其特征在于,所述方法还包括:
    在工作模式是第二工作模式时,将拍摄装置输出的目标图像发送给控制终端,以使控制终端显示所述目标图像。
  14. 根据权利要求1所述的方法,其特征在于,所述目标区域图像的尺寸是根据所述可移动平台的移动速度、可移动平台与所述控制终端之间的距离、 可移动平台与所述控制终端之间的通信质量中的至少一个确定的。
  15. 根据权利要求1所述的方法,其特征在于,所述目标区域图像的尺寸是固定不变的。
  16. 根据权利要求1所述的方法,其特征在于,所述控制终端包括头戴式显示装置或者手持式显示装置。
  17. 一种可移动平台的图像传输装置,所述可移动平台包括采集并输出图像的拍摄装置和用于承载所述拍摄装置并调节所述拍摄装置的姿态的云台,其特征在于,所述装置包括:存储器和处理器;
    所述存储器,用于存储程序代码;
    所述处理器,调用所述程序代码,当程序代码被执行时,用于执行以下操作:
    获取控制终端的转动信息;
    根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像;
    根据所述转动信息控制所述云台运动,以调节所述拍摄装置的姿态,其中,被裁切的所述目标图像是所述云台响应于所述转动信息运动前所述拍摄装置输出的图像;
    将所述目标区域图像发送给所述控制终端。
  18. 根据权利要求17所述的装置,其特征在于,所述处理器还用于:
    从所述拍摄装置输出的图像中确定与所述转动信息对应的目标图像。
  19. 根据权利要求18所述的装置,其特征在于,所述处理器用于从所述拍摄装置输出的图像中确定与所述转动信息对应的目标图像,包括:
    获取所述转动信息的特征时刻,其中,所述转动信息的特征时刻为所述控制终端的传感器采集所述转动信息的时刻或者所述可移动平台获取到所述转动信息的时刻;
    获取所述拍摄装置输出的图像的采集时刻;
    根据所述特征时刻和所述采集时刻,从所述拍摄装置输出的图像中确定所述目标图像。
  20. 根据权利要求19所述的装置,其特征在于,所述转动信息的特征时刻早于或等于所述目标图像的采集时刻。
  21. 根据权利要求17-20中任一项所述的装置，其特征在于，所述控制终端的转动信息包括所述控制终端的姿态，所述处理器用于根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理，以得到目标区域图像，包括：
    获取所述拍摄装置采集所述目标图像时所述拍摄装置的姿态;
    根据所述控制终端的姿态和所述拍摄装置的姿态,从所述拍摄装置输出的目标图像中裁切目标区域图像。
  22. 根据权利要求21所述的装置,其特征在于,所述处理器用于根据所述控制终端的姿态和所述拍摄装置的姿态,从所述拍摄装置输出的目标图像中裁切目标区域图像,包括:
    根据所述控制终端的姿态和所述拍摄装置的姿态,确定目标区域图像的中心与所述目标图像中心之间的偏差;
    根据所述偏差从所述拍摄装置输出的目标图像中裁切目标区域图像。
  23. 根据权利要求17-20中任一项所述的装置,其特征在于,所述控制终端的转动信息包括所述控制终端的角速度或者角加速度,所述处理器用于根据所述转动信息从拍摄装置输出的目标图像中裁切目标区域图像,包括:
    根据所述控制终端的角速度或者角加速度,从拍摄装置输出的目标图像中裁切目标区域图像。
  24. 根据权利要求23所述的装置,其特征在于,所述处理器用于根据所述控制终端的角速度或者角加速度,从拍摄装置输出的目标图像中裁切目标区域图像,包括:
    根据所述控制终端的角速度或者角加速度,确定目标区域图像的中心与所述目标图像中心之间的偏差;
    根据所述偏差从拍摄装置输出的目标图像中裁切目标区域图像。
  25. 根据权利要求17所述的装置,其特征在于,所述转动信息包括姿态、角速度、角加速度中的至少一种。
  26. 根据权利要求17所述的装置,其特征在于,所述控制终端的转动信息用于确定所述目标区域图像的中心和所述目标图像的中心之间的偏差。
  27. 根据权利要求26所述的装置,其特征在于,所述控制终端的转动信息包括所述控制终端的偏航转动信息和俯仰转动信息,所述偏航转动信息和俯仰转动信息分别用于确定所述目标区域图像的中心和所述目标图像的中心在U轴和V轴上的偏差。
  28. 根据权利要求17所述的装置，其特征在于，所述处理器用于根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理，以得到目标区域图像，包括：
    在工作模式是第一工作模式时,根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像。
  29. 根据权利要求28所述的装置,其特征在于,所述处理器还用于:
    在工作模式是第二工作模式时,将拍摄装置输出的目标图像发送给控制终端,以使控制终端显示所述目标图像。
  30. 根据权利要求17所述的装置,其特征在于,所述目标区域图像的尺寸是根据所述可移动平台的移动速度、可移动平台与所述控制终端之间的距离、可移动平台与所述控制终端之间的通信质量中的至少一个确定的。
  31. 根据权利要求17所述的装置,其特征在于,所述目标区域图像的尺寸是固定不变的。
  32. 根据权利要求17所述的装置,其特征在于,所述控制终端包括头戴式显示装置或者手持式显示装置。
  33. 一种可移动平台,其特征在于,所述可移动平台包括采集并输出图像的拍摄装置、用于承载所述拍摄装置并调节所述拍摄装置的姿态的云台以及图像传输装置,所述装置包括:存储器和处理器;
    所述存储器,用于存储程序代码;
    所述处理器,调用所述程序代码,当程序代码被执行时,用于执行以下操作:
    获取控制终端的转动信息;
    根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像;
    根据所述转动信息控制所述云台运动,以调节所述拍摄装置的姿态,其中,被裁切的所述目标图像是所述云台响应于所述转动信息运动前所述拍摄装置输出的图像;
    将所述目标区域图像发送给所述控制终端。
  34. 根据权利要求33所述的可移动平台,其特征在于,所述处理器还用于:
    从所述拍摄装置输出的图像中确定与所述转动信息对应的目标图像。
  35. 根据权利要求34所述的可移动平台,其特征在于,所述处理器用于 从所述拍摄装置输出的图像中确定与所述转动信息对应的目标图像,包括:
    获取所述转动信息的特征时刻,其中,所述转动信息的特征时刻为所述控制终端的传感器采集所述转动信息的时刻或者所述可移动平台获取到所述转动信息的时刻;
    获取所述拍摄装置输出的图像的采集时刻;
    根据所述特征时刻和所述采集时刻,从所述拍摄装置输出的图像中确定所述目标图像。
  36. 根据权利要求35所述的可移动平台,其特征在于,所述转动信息的特征时刻早于或等于所述目标图像的采集时刻。
  37. 根据权利要求33-36中任一项所述的可移动平台,其特征在于,所述控制终端的转动信息包括所述控制终端的姿态,所述处理器用于根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像,包括:
    获取所述拍摄装置采集所述目标图像时所述拍摄装置的姿态;
    根据所述控制终端的姿态和所述拍摄装置的姿态,从所述拍摄装置输出的目标图像中裁切目标区域图像。
  38. 根据权利要求37所述的可移动平台,其特征在于,所述处理器用于根据所述控制终端的姿态和所述拍摄装置的姿态,从所述拍摄装置输出的目标图像中裁切目标区域图像,包括:
    根据所述控制终端的姿态和所述拍摄装置的姿态,确定目标区域图像的中心与所述目标图像中心之间的偏差;
    根据所述偏差从所述拍摄装置输出的目标图像中裁切目标区域图像。
  39. 根据权利要求33-36中任一项所述的可移动平台,其特征在于,所述控制终端的转动信息包括所述控制终端的角速度或者角加速度,所述处理器用于根据所述转动信息从拍摄装置输出的目标图像中裁切目标区域图像,包括:
    根据所述控制终端的角速度或者角加速度,从拍摄装置输出的目标图像中裁切目标区域图像。
  40. 根据权利要求39所述的可移动平台,其特征在于,所述处理器用于根据所述控制终端的角速度或者角加速度,从拍摄装置输出的目标图像中裁切目标区域图像,包括:
    根据所述控制终端的角速度或者角加速度,确定目标区域图像的中心与所述目标图像中心之间的偏差;
    根据所述偏差从拍摄装置输出的目标图像中裁切目标区域图像。
  41. 根据权利要求33所述的可移动平台,其特征在于,所述转动信息包括姿态、角速度、角加速度中的至少一种。
  42. 根据权利要求33所述的可移动平台,其特征在于,所述控制终端的转动信息用于确定所述目标区域图像的中心和所述目标图像的中心之间的偏差。
  43. 根据权利要求42所述的可移动平台,其特征在于,所述控制终端的转动信息包括所述控制终端的偏航转动信息和俯仰转动信息,所述偏航转动信息和俯仰转动信息分别用于确定所述目标区域图像的中心和所述目标图像的中心在U轴和V轴上的偏差。
  44. 根据权利要求33所述的可移动平台,其特征在于,所述处理器用于根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像,包括:
    在工作模式是第一工作模式时,根据所述转动信息对所述拍摄装置输出的目标图像进行裁切处理,以得到目标区域图像。
  45. 根据权利要求44所述的可移动平台,其特征在于,所述处理器还用于:
    在工作模式是第二工作模式时,将拍摄装置输出的目标图像发送给控制终端,以使控制终端显示所述目标图像。
  46. 根据权利要求33所述的可移动平台,其特征在于,所述目标区域图像的尺寸是根据所述可移动平台的移动速度、可移动平台与所述控制终端之间的距离、可移动平台与所述控制终端之间的通信质量中的至少一个确定的。
  47. 根据权利要求33所述的可移动平台,其特征在于,所述目标区域图像的尺寸是固定不变的。
  48. 根据权利要求33所述的可移动平台,其特征在于,所述控制终端包括头戴式显示装置或者手持式显示装置。
  49. 一种计算机可读存储介质,其特征在于,其上存储有计算机程序,当所述计算机程序被执行时,实现如权利要求1至16任一项所述的方法。
PCT/CN2022/083116 2022-03-25 2022-03-25 可移动平台的图像传输方法、装置及设备 WO2023178670A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2022/083116 WO2023178670A1 (zh) 2022-03-25 2022-03-25 可移动平台的图像传输方法、装置及设备
CN202280047437.3A CN117730543A (zh) 2022-03-25 2022-03-25 可移动平台的图像传输方法、装置及设备

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/083116 WO2023178670A1 (zh) 2022-03-25 2022-03-25 可移动平台的图像传输方法、装置及设备

Publications (1)

Publication Number Publication Date
WO2023178670A1 true WO2023178670A1 (zh) 2023-09-28

Family

ID=88099589

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/083116 WO2023178670A1 (zh) 2022-03-25 2022-03-25 可移动平台的图像传输方法、装置及设备

Country Status (2)

Country Link
CN (1) CN117730543A (zh)
WO (1) WO2023178670A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160086306A1 (en) * 2014-09-19 2016-03-24 Sony Computer Entertainment Inc. Image generating device, image generating method, and program
CN108093244A (zh) * 2017-12-01 2018-05-29 电子科技大学 一种远程随动立体视觉系统
CN111650967A (zh) * 2020-06-03 2020-09-11 南昌航空大学 一种用于影视拍摄的无人机及云台操控系统
CN112804547A (zh) * 2021-01-07 2021-05-14 河北交通职业技术学院 一种基于无人机vr摄像的交互式直播系统


Also Published As

Publication number Publication date
CN117730543A (zh) 2024-03-19


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 202280047437.3

Country of ref document: CN