CN110786008A - Method for shooting image and mobile platform


Info

Publication number
CN110786008A
CN110786008A
Authority
CN
China
Prior art keywords
eye image
mobile platform
camera
observation
controlling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880038924.7A
Other languages
Chinese (zh)
Inventor
陆真国 (Lu Zhenguo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Shenzhen Dajiang Innovations Technology Co Ltd
Original Assignee
Shenzhen Dajiang Innovations Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Dajiang Innovations Technology Co Ltd
Publication of CN110786008A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N13/296 Synchronisation thereof; Control thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A method for capturing an image and a mobile platform are provided. The method includes: controlling the mobile platform to move to a specified observation position; and controlling a camera carried on the mobile platform to capture a left-eye image and a right-eye image of a scene within a preset viewing angle range centered on the observation position. Because the mobile platform can reach a wide range of observation positions, it can collect image material for stereoscopic images from better vantage points, so the acquisition of image material is less constrained by the scene.

Description

Method for shooting image and mobile platform
Copyright declaration
The disclosure of this patent document contains material which is subject to copyright protection. The copyright is owned by the copyright owner. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the official patent file or records of the patent and trademark office.
Technical Field
The present disclosure relates to the field of image capturing, and more particularly, to a method for capturing an image and a mobile platform.
Background
Compared with planar images, stereoscopic images (especially panoramic stereoscopic images) can convey depth information and can therefore give the viewer a more realistic sense of immersion.
In order to obtain a stereoscopic image, a left-eye image and a right-eye image within a certain viewing angle range must be obtained using a photographing apparatus. In the conventional art, a photographing apparatus generally needs to be carried or operated by a user to reach a designated observation point. Therefore, the image material that can be captured by conventional techniques is limited to areas or scenes that the user can reach.
Disclosure of Invention
The present application provides a method for capturing an image and a mobile platform, which make the acquisition of image material less constrained by the scene.
In a first aspect, a method of capturing an image is provided, including: controlling a mobile platform to move to a specified observation position; and controlling a camera carried on the mobile platform to capture a left-eye image and a right-eye image of a scene within a preset viewing angle range centered on the observation position.
In a second aspect, a mobile platform is provided, where the mobile platform includes a control system, and a camera is mounted on the mobile platform, and the control system is configured to control the mobile platform to move to a specified observation position, and control the camera to capture a left-eye image and a right-eye image of a scene within a preset view angle range with the observation position as a center.
In a third aspect, a computer-readable storage medium is provided, having stored thereon instructions for performing the method of the first aspect.
In a fourth aspect, a computer program product is provided, comprising instructions for performing the method of the first aspect.
In a fifth aspect, there is provided an apparatus for capturing an image, comprising means for performing the steps of the method of the first aspect.
Because the mobile platform can reach a wide range of observation positions, it can collect image material for stereoscopic images from better vantage points, so the acquisition of image material is less constrained by the scene.
Drawings
Fig. 1 is a schematic flowchart of a method for capturing an image according to an embodiment of the present disclosure.
Fig. 2 is an exemplary view of an observation position and a preset viewing angle range.
Fig. 3 is an exemplary diagram of a shooting position corresponding to the example of fig. 2.
Fig. 4 is a schematic structural diagram of a mobile platform provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a method for capturing an image according to an embodiment of the present disclosure. The method may be performed by a mobile platform. The mobile platform may be, for example, a mobile device that is equipped with (or includes) a camera and that can be remotely controlled by a user. For example, the mobile platform may be an unmanned device, such as an unmanned aerial vehicle, an unmanned vehicle, or a mobile robot. The method of fig. 1 includes steps 110 to 120, and each step of fig. 1 is described in detail below.
In step 110, the mobile platform is controlled to move to the specified observation position.
In step 120, the camera of the mobile platform is controlled to capture a left-eye image and a right-eye image of the scene within a preset viewing angle range centered on the observation position.
For example, the camera of the mobile platform may be controlled to reach one or more preset shooting positions, and the left-eye image and the right-eye image of the scene within the preset viewing angle range may be captured at those positions.
The manner in which the camera of the mobile platform reaches the preset shooting position depends on the type of the mobile platform, which is not limited in the embodiments of the present application. Taking an unmanned aerial vehicle (UAV) as an example, the UAV's flight control system may be used to fly the UAV to the observation position. The UAV body may then be controlled to hover at the observation position while a gimbal moves the camera to the preset shooting position; alternatively, the flight control system may move the UAV body itself so that the camera reaches the preset shooting position; or the two approaches may be combined.
The preset viewing angle range is not specifically limited and may be predetermined or designated by a user according to actual needs. It may be, for example, a 180-degree, 210-degree, 270-degree, or 360-degree range centered on the observation position. When the preset viewing angle range is 360 degrees, the stereoscopic image of the scene within that range may be referred to as a panoramic stereoscopic image.
The left-eye image (or right-eye image) of the scene within the preset viewing angle range may be a single complete image, or a set of left-eye images (or right-eye images) observed from a plurality of observation angles within the preset viewing angle range. Taking panoramic shooting as an example, one left-eye image (or right-eye image) may be captured every 15 degrees, yielding 24 left-eye images (or right-eye images); alternatively, the camera may be controlled to perform a rotating capture and directly output a left-eye panoramic image (or right-eye panoramic image).
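The per-view arithmetic above (one shot every 15 degrees yielding 24 views) can be sketched as follows. This is an illustrative sketch, not code from the patent; the function name and defaults are ours:

```python
def capture_headings(fov_deg=360, interval_deg=15):
    """Camera headings (in degrees) for one eye's sweep of the preset
    viewing angle range, taking one shot every `interval_deg` degrees."""
    return [i * interval_deg for i in range(fov_deg // interval_deg)]

headings = capture_headings()
print(len(headings))   # 24 shots cover the full 360-degree panorama
print(headings[:4])    # [0, 15, 30, 45]
```

The same helper covers the narrower 180-, 210-, and 270-degree ranges mentioned earlier by passing a different `fov_deg`.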
Because the mobile platform can reach a wide range of observation positions, it can collect image material for stereoscopic images from better vantage points, so the acquisition of image material is less constrained by the scene.
Taking panoramic stereoscopic image shooting as an example, traditional panoramic stereoscopic capture requires a large number of cameras installed at fixed positions. Such shooting equipment is complex in structure and operation, expensive to manufacture, and bulky; its use scenarios are therefore quite limited, and it is typically used only for shooting or live-broadcasting large-scale sports events.
The method for capturing an image provided by the embodiments of the present application can be used to shoot panoramic stereoscopic images (one need only set the preset viewing angle range to a 360-degree panorama). Compared with traditional panoramic stereoscopic shooting equipment, however, the mobile platform is relatively simple to operate and control and is less expensive. In addition, because the mobile platform can be remotely controlled by a user, it can reach better observation positions, so acquisition of the panoramic stereoscopic image is less limited by the constraints of the scene.
Via step 120, the left-eye image and the right-eye image of the scene within the preset viewing angle range centered on the observation position are obtained. Next, a stereoscopic image of the scene within the preset viewing angle range may be generated from these left-eye and right-eye images. For example, if the camera captures a plurality of left-eye images (or right-eye images), they may first be combined into one complete left-eye image (or right-eye image), and the complete left-eye and right-eye images may then be registered to generate the stereoscopic image of the scene within the preset viewing angle range.
The stereoscopic image of the scene within the preset viewing angle range may be generated on-line by the mobile platform (for example, the mobile platform's image processing system may generate the stereoscopic image directly from the left-eye and right-eye images), or it may be generated off-line. For example, the mobile platform may transmit the left-eye and right-eye images of the scene within the preset viewing angle range to a remote workstation, and the stereoscopic image may then be synthesized at the remote workstation or another image processing device. Generating the stereoscopic image on-line enriches the functions of the mobile platform and simplifies operation for the user.
The implementation of step 120 can be varied and is described in detail below with reference to specific embodiments.
First, the preset viewing angle range may include one or more observation angles from (or centered on) the observation position. The angular interval between adjacent observation angles may be a preset value (e.g., 45 degrees) or may be input by a user. A smaller angular interval generally yields a better stereoscopic effect.
Taking fig. 2 as an example, the viewing position is the position of the point O, and the preset viewing angle range is a 360-degree panoramic range with the point O as the center. As can be seen from fig. 2, a plurality of observation angles OPi (i is a positive integer from 1 to 8, and an observation angle may also be referred to as an observation direction, i.e., a direction indicated by an arrow from point O in fig. 2) are provided within a 360-degree panoramic range, and adjacent observation angles are 45 degrees apart from each other.
In order to obtain a stereoscopic image of a scene within the preset viewing angle range, a left-eye image and a right-eye image corresponding to each observation angle need to be acquired, and this can be done in a number of ways. For example, the user of the mobile platform may control the camera position on-line based on experience, and trigger the camera to capture the left-eye and right-eye images for each observation angle at whatever position the user considers appropriate. As another example, the shooting positions of the left-eye and right-eye images for each observation angle may be determined automatically by the mobile platform from a preset baseline distance and the observation angle. The latter implementation is described in detail below with reference to specific embodiments.
Optionally, step 120 may include: controlling the camera of the mobile platform to capture the left-eye images and right-eye images corresponding to a plurality of observation angles within the preset viewing angle range, where the shooting positions of these left-eye and right-eye images are all located on the circumference of a circle centered on the observation position whose diameter is the preset baseline distance.
It should be understood that the baseline distance characterizes the interpupillary distance between the left and right eyes. If the camera of the mobile platform is a binocular camera, the preset baseline distance may be the distance between the two cameras (e.g., the distance between their optical centers). If the camera is a monocular camera, the baseline distance may be input by a user of the mobile platform, or may be an empirical or default value, such as a statistically obtained average human interpupillary distance.
When shooting the left-eye and right-eye images corresponding to each observation angle, their shooting positions on the circumference of the circle can be determined from the circle's parameters (such as its radius or diameter and the position of its center). Taking observation angle OP1 in fig. 3 as an example, the camera of the mobile platform can be controlled to move to position L and position R and capture one image along direction LP1 and one along direction RP1; the image captured along LP1 is the left-eye image corresponding to observation angle OP1, and the image captured along RP1 is the right-eye image corresponding to observation angle OP1. The left-eye and right-eye images for the other observation angles are captured in the same way and are not described in detail here. Circle C2 in fig. 3 indicates the imaging plane of the panoramic image.
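The positions L and R can be computed from the circle's parameters alone. The sketch below assumes a planar coordinate system with the observation position at the center and places each eye half a baseline to either side of the viewing direction, on the chord perpendicular to it; the function name, the convention that the left eye sits 90 degrees counter-clockwise of the viewing direction, and the 64 mm baseline are our illustrative assumptions, not values from the patent:

```python
import math

def eye_positions(center, baseline, angle_deg):
    """Shooting positions (left, right) for one observation angle.

    Both positions lie on the circle centered at the observation
    position whose diameter is the preset baseline; both shots are
    taken along the observation direction itself."""
    cx, cy = center
    r = baseline / 2.0                 # circle radius = half the diameter
    t = math.radians(angle_deg)
    # Offset perpendicular to the viewing direction on each side.
    left = (cx + r * math.cos(t + math.pi / 2), cy + r * math.sin(t + math.pi / 2))
    right = (cx + r * math.cos(t - math.pi / 2), cy + r * math.sin(t - math.pi / 2))
    return left, right

# Observation angle pointing along +x, with a 64 mm baseline:
L, R = eye_positions((0.0, 0.0), 0.064, 0.0)
print(round(L[1] - R[1], 3))   # 0.064: the two positions are one baseline apart
```

Feeding each observation angle from the preset range into this helper yields the full set of shooting positions on the circumference.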
In the above embodiment, the shooting positions of the camera lie on the circumference of a circle, but this does not mean the camera must move along that circumference; it may move along the circumference or in other ways. Taking fig. 3 as an example, the camera may move around circle C1, taking one or more images each time it reaches a shooting position. Alternatively, after capturing the left-eye image corresponding to observation angle OP1 at position L, the camera may move in a straight line to position R and capture the right-eye image corresponding to observation angle OP1 there. Setting the camera's motion trajectory to the circle's circumference simplifies camera control, and several implementations of this approach are given below.
As an example, assuming that the camera of the mobile platform is a monocular camera, the monocular camera may be controlled to rotate for the first time along a circumferential line of a circle to capture one of a left eye image and a right eye image corresponding to a plurality of observation angles; and controlling the monocular camera to rotate again along the circumferential line of the circle to capture the other of the left-eye image and the right-eye image corresponding to the plurality of observation angles.
Still taking fig. 3 as an example, the monocular camera may be controlled to make one full rotation along the circumference of the circle, sequentially capturing the left-eye images corresponding to the 8 observation angles, and then to make a second full rotation, sequentially capturing the right-eye images corresponding to the 8 observation angles.
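This two-rotation schedule can be sketched as a simple list of (eye, observation angle) shots; the helper name and tuple layout are our own illustration, not the patent's control code:

```python
def monocular_schedule(num_angles=8):
    """Capture plan for a monocular camera: one full pass around the
    circle for all left-eye shots, then a second pass for all
    right-eye shots."""
    step = 360 / num_angles
    plan = [("left", i * step) for i in range(num_angles)]
    plan += [("right", i * step) for i in range(num_angles)]
    return plan

plan = monocular_schedule()
print(len(plan))   # 16 shots in total: 8 left-eye, then 8 right-eye
```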
As another example, if the camera of the mobile platform is a binocular camera, the binocular camera of the mobile platform may be controlled to complete photographing of the left eye image and the right eye image of the scene within the preset viewing angle range in a process of rotating once along a circumferential line of a circle.
Still taking fig. 3 as an example, if the camera is a binocular camera and the diameter of the circle equals the binocular camera's baseline distance, then when the left-eye camera is at position L, the right-eye camera is at position R, and the two cameras can be operated simultaneously to capture the left-eye and right-eye images corresponding to observation angle OP1 in a single shot. The binocular camera can then rotate once about the midpoint of the baseline to complete the shooting for all 8 observation angles.
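By comparison, the binocular schedule needs only one stop per observation angle, since both eye images are captured simultaneously. A sketch under the same assumptions as before (helper name is ours; the left camera is assumed to sit 90 degrees counter-clockwise of the viewing direction on the circle):

```python
def binocular_stops(num_angles=8):
    """Capture plan for a binocular camera whose baseline equals the
    circle's diameter: one stop per observation angle, both cameras
    firing at once."""
    step = 360 / num_angles
    return [{"angle": i * step,
             "left_cam_at": (i * step + 90) % 360,     # position on circle, deg
             "right_cam_at": (i * step - 90) % 360}
            for i in range(num_angles)]

plan = binocular_stops()
print(len(plan))   # 8 stops instead of the 16 monocular shots
```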
The control modes given by the two examples have the advantages of relatively simple control logic and easy implementation.
As yet another example, if the angle difference between a first observation angle and a second observation angle among the plurality of observation angles is 180 degrees, the camera of the mobile platform may be controlled to move to the shooting position of the left-eye image corresponding to the first observation angle, and then, at that single position, to capture both the left-eye image corresponding to the first observation angle and the right-eye image corresponding to the second observation angle.
Still taking fig. 3 as an example, assume the first observation angle is OP1 and the second is OP5. When the camera is at position L, it may be controlled to capture the left-eye image corresponding to observation angle OP1 along LP1, and then the right-eye image corresponding to observation angle OP5 along LP5. Completing the shots for different observation angles at the same position improves shooting efficiency.
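This position reuse can be checked numerically: with the circle geometry of fig. 3, the left-eye shooting position for an observation angle coincides with the right-eye shooting position for the angle 180 degrees opposite. A sketch under our earlier assumptions (function name and the left-eye-at-plus-90-degrees convention are ours):

```python
import math

def eye_pos(center, baseline, angle_deg, eye):
    """Position on the capture circle for one eye at one observation angle."""
    cx, cy = center
    r = baseline / 2.0
    # The left eye sits 90 deg counter-clockwise of the viewing direction.
    offset = math.pi / 2 if eye == "left" else -math.pi / 2
    t = math.radians(angle_deg) + offset
    return (cx + r * math.cos(t), cy + r * math.sin(t))

# Left-eye position for OP1 (0 deg) vs right-eye position for OP5 (180 deg):
a = eye_pos((0, 0), 0.064, 0, "left")
b = eye_pos((0, 0), 0.064, 180, "right")
print(all(abs(x - y) < 1e-9 for x, y in zip(a, b)))  # True: same spot, two shots
```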
The embodiment of the application also provides a mobile platform. As shown in fig. 4, the mobile platform 400 includes a control system 410 and a camera 420. The control system 410 may be used to control the mobile platform 400 to move to a specified observation position and control the camera 420 to capture left-eye and right-eye images of a scene within a preset viewing angle range centered at the observation position.
Optionally, the mobile platform 400 further includes an image processing system for generating a stereoscopic image of the scene within the preset viewing angle range according to the left eye image and the right eye image of the scene within the preset viewing angle range.
Optionally, the preset viewing angle range is 360 degrees.
Optionally, the control system 410 is configured to control the camera 420 to capture left-eye images and right-eye images corresponding to a plurality of observation angles within a preset viewing angle range, where capture positions of the left-eye images and right-eye images corresponding to the plurality of observation angles are all located on a circumferential line of a circle with the observation position as a center and a preset baseline distance as a diameter.
Optionally, the control system 410 is configured to control the camera 420 to rotate along a circumferential line of a circle to capture a left-eye image and a right-eye image of the scene within a preset viewing angle range.
Optionally, the camera 420 is a monocular camera 420, and the control system 410 is configured to control the monocular camera 420 to rotate for the first time along a circumferential line of a circle to capture one of a left-eye image and a right-eye image corresponding to a plurality of observation angles; the monocular camera 420 is controlled to rotate again along the circumferential line of the circle to photograph the other of the left-eye image and the right-eye image corresponding to the plurality of observation angles.
Optionally, the camera 420 is a binocular camera 420, and the control system 410 is configured to control the binocular camera 420 of the mobile platform 400 to complete the photographing of the left eye image and the right eye image of the scene within the preset viewing angle range in a process of rotating once along a circumferential line of a circle.
Optionally, an angle difference between a first observation angle and a second observation angle in the plurality of observation angles is 180 degrees, and the control system 410 is configured to control the camera 420 to move to the shooting position of the left-eye image corresponding to the first observation angle; the control camera 420 captures a left-eye image corresponding to the first observation angle and a right-eye image corresponding to the second observation angle at the capturing position of the left-eye image corresponding to the first observation angle.
Optionally, the diameter of the circle is a default value or determined by input from a user of the mobile platform 400.
Optionally, the interval between adjacent ones of the plurality of observation angles is a preset value or determined by user input of the mobile platform 400.
Optionally, the mobile platform is a drone.
Optionally, the control system 410 is further configured to control the drone to hover at the observation position, and to control, via a gimbal, the camera 420 to reach different shooting positions.
In the above embodiments, all or part of the implementation may be realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions which, when loaded and executed on a computer, produce, in whole or in part, the processes or functions described in the embodiments of the invention. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example from one website, computer, server, or data center to another via a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium can be any available medium accessible by a computer, or a data storage device such as a server or data center incorporating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a digital video disc (DVD)), or a semiconductor medium (e.g., a solid-state drive (SSD)).
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (26)

1. A method of capturing an image, comprising:
controlling the mobile platform to move to a specified observation position;
and controlling a camera carried on the mobile platform to capture a left-eye image and a right-eye image of a scene within a preset viewing angle range centered on the observation position.
2. The method of claim 1, further comprising:
and generating a stereoscopic image of the scene in the preset visual angle range according to the left eye image and the right eye image of the scene in the preset visual angle range.
3. The method according to claim 1 or 2, wherein the preset viewing angle range is 360 degrees.
4. The method according to any one of claims 1-3, wherein said controlling the camera of the mobile platform to capture left-eye and right-eye images of the scene within a preset viewing angle range centered on the observation position comprises:
and controlling a camera of the mobile platform to capture a left-eye image and a right-eye image corresponding to a plurality of observation angles within the preset viewing angle range, wherein shooting positions of the left-eye image and the right-eye image corresponding to the plurality of observation angles are all located on a circumference of a circle centered on the observation position whose diameter is a preset baseline distance.
5. The method of claim 4, wherein the controlling the camera of the mobile platform to capture the left-eye image and the right-eye image corresponding to a plurality of observation angles within the preset viewing angle range comprises:
and controlling the camera of the mobile platform to rotate along the circumferential line of the circle so as to shoot the left eye image and the right eye image corresponding to the plurality of observation angles.
6. The method of claim 5, wherein the camera of the mobile platform is a monocular camera,
the controlling the camera of the mobile platform to rotate along the circumference of the circle to shoot the left eye image and the right eye image corresponding to a plurality of observation angles includes:
controlling the monocular camera to rotate for the first time along the circle to shoot one of a left eye image and a right eye image corresponding to the plurality of observation angles;
and controlling the monocular camera to rotate along the circle again so as to shoot the other one of the left eye image and the right eye image corresponding to the plurality of observation angles.
7. The method of claim 4, wherein the camera of the mobile platform is a binocular camera,
the controlling the camera of the mobile platform to rotate along the circumference of the circle to shoot the left eye image and the right eye image corresponding to a plurality of observation angles includes:
and controlling the binocular camera to complete the shooting of the left eye image and the right eye image corresponding to the plurality of observation angles in the process of rotating once along the circumferential line of the circle.
8. The method of claim 4, wherein an angular difference between a first observation angle and a second observation angle of the plurality of observation angles is 180 degrees,
the controlling the camera of the mobile platform to capture the left-eye image and the right-eye image corresponding to the plurality of observation angles within the preset viewing angle range comprises:
controlling a camera of the mobile platform to move to a shooting position of a left eye image corresponding to the first observation angle;
and controlling a camera of the mobile platform to shoot the left eye image corresponding to the first observation angle and the right eye image corresponding to the second observation angle at the shooting position of the left eye image corresponding to the first observation angle.
9. The method according to any of claims 4-8, wherein the diameter of the circle is a default value or determined by input from a user of the mobile platform.
10. The method of any one of claims 1-9, wherein an angular separation between adjacent observation angles within the preset viewing angle range is a preset value or is determined by input from a user of the mobile platform.
11. The method of any one of claims 1-10, wherein the mobile platform is a drone.
12. The method of claim 11, further comprising:
controlling the drone to hover at the observation position; and
controlling, by using a gimbal, the camera of the drone to move among the different capture positions.
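The capture geometry underlying claims 4-8 can be sketched numerically: for each observation angle, the left-eye and right-eye capture positions are diametrically opposite points on a circle whose diameter equals the baseline and whose center is the observation position, which is why (claim 8) the left-eye position for one angle coincides with the right-eye position for the angle 180 degrees away. The following is a minimal illustrative sketch, not the patent's implementation; the function name, the 0.5 m baseline, and the 2D coordinate convention are assumptions:

```python
import math

def eye_positions(theta_deg, baseline=0.5, center=(0.0, 0.0)):
    """Left- and right-eye capture positions for observation angle
    theta_deg, on a circle of diameter `baseline` centered on the
    observation position (the claimed baseline circle)."""
    r = baseline / 2.0
    t = math.radians(theta_deg)
    # The viewing direction is (cos t, sin t); the two eyes sit on the
    # perpendicular to it, one on each side of the observation position.
    left = (center[0] - r * math.sin(t), center[1] + r * math.cos(t))
    right = (center[0] + r * math.sin(t), center[1] - r * math.cos(t))
    return left, right

# Claim-8 coincidence: the left-eye position for angle a equals the
# right-eye position for angle a + 180 degrees.
l0, _ = eye_positions(30)
_, r180 = eye_positions(30 + 180)
print(all(math.isclose(x, y, abs_tol=1e-12) for x, y in zip(l0, r180)))  # prints True
```

This coincidence is what lets a camera, once positioned for the left-eye image of one angle, also take the right-eye image of the opposite angle without moving.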
13. A mobile platform, comprising a control system, wherein a camera is mounted on the mobile platform, and the control system is configured to control the mobile platform to move to a specified observation position and to control the camera to capture a left-eye image and a right-eye image of a scene within a preset viewing angle range centered on the observation position.
14. The mobile platform of claim 13, further comprising an image processing system configured to generate a stereoscopic image of the scene within the preset viewing angle range according to the left-eye image and the right-eye image of the scene within the preset viewing angle range.
15. The mobile platform of claim 13 or 14, wherein the preset viewing angle range is 360 degrees.
16. The mobile platform of any one of claims 13-15, wherein the control system is configured to control the camera to capture left-eye and right-eye images corresponding to a plurality of observation angles within the preset viewing angle range, and wherein the capture positions of the left-eye and right-eye images corresponding to the plurality of observation angles are located on a circumferential line of a circle having a diameter of a preset baseline distance and centered on the observation position.
17. The mobile platform of claim 16, wherein the control system is configured to control the camera to rotate along a circumferential line of the circle to capture left-eye images and right-eye images of the scene corresponding to the plurality of observation angles.
18. The mobile platform of claim 17, wherein the camera is a monocular camera, and the control system is configured to: control the monocular camera to rotate along the circumferential line of the circle a first time to capture one of the left-eye image and the right-eye image corresponding to the plurality of observation angles; and control the monocular camera to rotate along the circumferential line of the circle a second time to capture the other of the left-eye image and the right-eye image corresponding to the plurality of observation angles.
19. The mobile platform of claim 17, wherein the camera is a binocular camera, and the control system is configured to control the binocular camera to capture the left-eye image and the right-eye image corresponding to the plurality of observation angles in a single rotation along the circumferential line of the circle.
20. The mobile platform of claim 17, wherein an angular difference between a first observation angle and a second observation angle of the plurality of observation angles is 180 degrees, and the control system is configured to: control the camera to move to a capture position of the left-eye image corresponding to the first observation angle; and control the camera to capture, at the capture position of the left-eye image corresponding to the first observation angle, both the left-eye image corresponding to the first observation angle and the right-eye image corresponding to the second observation angle.
21. The mobile platform of any one of claims 16-20, wherein the diameter of the circle is a default value or determined by input from a user of the mobile platform.
22. The mobile platform of any one of claims 13-21, wherein an angular separation between adjacent observation angles within the preset viewing angle range is a preset value or is determined by input from a user of the mobile platform.
23. The mobile platform of any one of claims 13-22, wherein the mobile platform is a drone.
24. The mobile platform of claim 23, wherein the control system is further configured to: control the drone to hover at the observation position; and control, by using a gimbal, the camera to move among the different capture positions.
25. A computer-readable storage medium having stored thereon instructions for performing the method of any one of claims 1-12.
26. A computer program product comprising instructions for performing the method of any one of claims 1-12.
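The three capture strategies claimed above (claims 6/18: a monocular camera making two rotations; claims 7/19: a binocular camera making one rotation; claims 8/20: pairing observation angles 180 degrees apart so one stop yields both images) can be compared as stop schedules. A minimal sketch under illustrative assumptions: the function names and the 45-degree angular spacing are not from the patent.

```python
def monocular_schedule(angles):
    """Claims 6/18: one rotation captures all left-eye images, a second
    rotation captures all right-eye images -> two stops per angle."""
    return [("left", a) for a in angles] + [("right", a) for a in angles]

def binocular_schedule(angles):
    """Claims 7/19: a binocular camera captures both images at each stop
    in a single rotation -> one stop per angle."""
    return [("left", a, "right", a) for a in angles]

def paired_schedule(angles):
    """Claims 8/20: the left-eye position for angle a coincides with the
    right-eye position for a + 180 deg, so a monocular camera at one stop
    can capture both images by pointing in opposite directions."""
    return [("left", a, "right", (a + 180) % 360) for a in angles]

angles = list(range(0, 360, 45))  # 8 observation angles at 45-degree spacing
print(len(monocular_schedule(angles)), len(binocular_schedule(angles)))  # prints 16 8
```

The paired schedule keeps one stop per angle like the binocular case, but needs only a monocular camera, halving the movement of the two-rotation scheme.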
CN201880038924.7A 2018-07-25 2018-07-25 Method for shooting image and mobile platform Pending CN110786008A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/097053 WO2020019201A1 (en) 2018-07-25 2018-07-25 Method for capturing image and mobile platform

Publications (1)

Publication Number Publication Date
CN110786008A true CN110786008A (en) 2020-02-11

Family

ID=69180335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880038924.7A Pending CN110786008A (en) 2018-07-25 2018-07-25 Method for shooting image and mobile platform

Country Status (2)

Country Link
CN (1) CN110786008A (en)
WO (1) WO2020019201A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111935474A (en) * 2020-08-17 2020-11-13 广东申义实业投资有限公司 Spread type light-emitting turntable and image shooting method thereof
WO2022198558A1 (en) * 2021-03-25 2022-09-29 深圳市大疆创新科技有限公司 Movable platform, and control method and apparatus for movable platform assembly
WO2022228119A1 (en) * 2021-04-30 2022-11-03 纵深视觉科技(南京)有限责任公司 Image acquisition method and apparatus, electronic device, and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100295927A1 (en) * 2009-04-17 2010-11-25 The Boeing Company System and method for stereoscopic imaging
CN102111629A (en) * 2009-12-24 2011-06-29 索尼公司 Image processing apparatus, image capturing apparatus, image processing method, and program
CN205019710U (en) * 2015-08-17 2016-02-10 苏州澎逸运动用品有限公司 Flight experience system
CN106303497A (en) * 2016-08-12 2017-01-04 南方科技大学 Virtual reality content generation method and device
CN108289212A (en) * 2018-01-23 2018-07-17 北京易智能科技有限公司 A kind of unmanned plane binocular stereo imaging and the device and method with human eye real-time, interactive


Also Published As

Publication number Publication date
WO2020019201A1 (en) 2020-01-30

Similar Documents

Publication Publication Date Title
CN109064545B (en) Method and device for data acquisition and model generation of house
CN105684415A (en) Spherical omnidirectional video-shooting system
JP7146662B2 (en) Image processing device, image processing method, and program
US20080158345A1 (en) 3d augmentation of traditional photography
CN107205122A (en) The live camera system of multiresolution panoramic video and method
JP2017505565A (en) Multi-plane video generation method and system
CN110786008A (en) Method for shooting image and mobile platform
JP2005092121A (en) Photographing auxiliary device, image processing method, image processing device, computer program, and recording medium storing program
CN104902263A (en) System and method for showing image information
WO2018196658A1 (en) Virtual reality media file generation method and device, storage medium, and electronic device
EP2685707A1 (en) System for spherical video shooting
CN105376554B (en) 3D cameras mechanism, the mobile device with the mechanism and control method
CN105791688A (en) Mobile terminal and imaging method
CN103488040A (en) Stereo panoramic image synthesis method and related stereo camera
KR20190062794A (en) Image merging method and system using viewpoint transformation
JP7269910B2 (en) Shooting control method, device, storage medium and system for intelligent shooting system
KR20150091064A (en) Method and system for capturing a 3d image using single camera
US20130021448A1 (en) Stereoscopic three-dimensional camera rigs
CN108614636A (en) A kind of 3D outdoor scenes VR production methods
CN115769592A (en) Image acquisition method, image acquisition device, electronic device, and medium
CN108419052B (en) Panoramic imaging method for multiple unmanned aerial vehicles
CN108513122B (en) Model adjusting method and model generating device based on 3D imaging technology
CN205754586U (en) Coordinate the system of shooting image
KR20160136893A (en) Mobile time slice camera
CN110086994A (en) A kind of integrated system of the panorama light field based on camera array

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200211)