WO2021217584A1 - Camera calling method and device - Google Patents

Camera calling method and device

Info

Publication number
WO2021217584A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
lens
index
calling
attribute
Prior art date
Application number
PCT/CN2020/088297
Other languages
English (en)
French (fr)
Inventor
杨健
江杨
李兴
郑文强
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/088297
Publication of WO2021217584A1

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 — Control of cameras or camera modules
    • H04N23/70 — Circuitry for compensating brightness variation in the scene
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B64 — AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U — UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00 — Propulsion; Power supply
    • B64U50/30 — Supply or distribution of electrical power
    • B64U50/34 — In-flight charging

Definitions

  • This application relates to the field of camera technology, and in particular to a method and device for invoking a camera.
  • the UAV usually includes a power system, a flight control system, a frame, and a gimbal carried on the frame.
  • the power system provides power for the drone
  • the flight controller is used to measure and control the flight status of the drone
  • the gimbal is used to carry the load.
  • the load type can be a camera.
  • Each pan/tilt is configured with two device identifiers (deviceIDs), and a table of correspondences between pan/tilts and camera indexes is configured.
  • To call a camera, the system obtains the camera call instruction input by the user, extracts the camera index and the calling content from it, determines the pan/tilt according to the camera index and the above correspondence, uses the first deviceID of the pan/tilt as the camera's deviceID, and then generates a camera call instruction from the deviceID and the calling content to complete the call.
  • The above calling method only works for a single-lens camera. When the load is a dual-lens camera, the camera is split into two single-lens cameras according to the number of lenses.
  • One of the single-lens cameras is called in the same way as described above, which is not repeated here.
  • To call the other single-lens camera, a camera index is configured for it, and a correspondence between that camera index and a deviceID is built.
  • When that single-lens camera needs to be called, the system obtains the camera call instruction entered by the user, extracts the camera index and the calling content from it, determines the deviceID from the camera index and the correspondence, and then generates the camera call instruction from the deviceID and the calling content to complete the call.
  • the embodiments of the present application provide a method for invoking a camera and related equipment.
  • this application provides a method for invoking a camera, including:
  • the request type includes a first type that calls the common attributes of all lenses and a second type that calls the private attributes of each lens;
  • the camera call instruction is generated according to the request type, call attribute and call index.
  • this application provides a method for invoking a camera.
  • the method includes:
  • this application provides a control terminal, including:
  • the obtaining unit is used to obtain a call request for calling the camera, where the camera includes a main lens and multiple auxiliary lenses, and the call request includes a call attribute and a call index;
  • the determining unit is used to determine the request type of the call request according to the call attribute, where the request type is a first type that calls the common attributes of all lenses or a second type that calls the private attributes of each lens;
  • the generating unit is used to generate the camera call instruction according to the request type, call index and call attribute.
  • this application provides a camera calling device, including:
  • the receiving unit is used to receive the camera call instruction from the control terminal;
  • the generating unit is used to parse the camera call instruction to generate the device identification and call attributes;
  • the determining unit is used to determine the calling lens according to the calling attribute and device identification;
  • the setting unit is used to set the attributes of the calling lens according to the calling attributes.
  • this application provides an electronic device, including:
  • a memory, used to store a program;
  • the processor is configured to execute the program stored in the memory, and when the program is executed, the processor is configured to execute the camera calling method according to the embodiment of the present application in the first aspect or the second aspect.
  • the present application provides a calling system, including the control terminal related to the third aspect and/or the camera calling device related to the fourth aspect.
  • an embodiment of the present invention provides a computer-readable storage medium, including instructions, which when run on a computer, cause the computer to execute the camera invoking method according to the embodiment of the present application in the first aspect or the second aspect.
  • an embodiment of the present invention provides a computer program product including instructions, which when run on a computer, cause the computer to execute the camera invoking method according to the embodiment of the present application in the first aspect or the second aspect.
  • the embodiments of the application provide a method and device for invoking a camera.
  • According to the calling attribute in the user's call request input at the control terminal, it is determined whether the request type of the call request is to call the common attributes of the camera's lenses or to call the private attributes of each lens.
  • Based on the request type, the method for generating the calling instruction from the user's call request is determined.
  • If the common attributes are called, the call attribute of the common attributes is configured in the request; if the private attributes are called, the call attribute of the private attributes is configured in the request. Only one call request is needed to complete the call to the common attributes of multiple lenses, which simplifies the calling process.
  • FIG. 1 is a schematic architecture diagram of an unmanned aerial system provided by an embodiment of the application
  • Figure 2 is a schematic diagram of camera calling in related technologies
  • FIG. 3 is a schematic diagram of a camera calling method provided by an embodiment of the application.
  • Figure 4 is a schematic diagram of an application scenario provided by an embodiment of the application.
  • FIG. 5 is a schematic flowchart of a camera calling method provided by an embodiment of the application.
  • FIG. 6 is a schematic diagram of the process of determining the device identifier according to the calling index according to an embodiment of the application.
  • FIG. 7 is a schematic diagram of the software architecture of the control terminal provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of the relationship between a multi-lens camera and a device identifier provided by an embodiment of the application;
  • FIG. 9 is an architecture diagram of camera objects and lens objects in the ShareLib layer provided by an embodiment of the application.
  • FIG. 10 is an architecture diagram of a camera object and a lens object in the SDK layer provided by an embodiment of the application.
  • FIG. 11 is a schematic diagram of the principle of a camera calling method provided by an embodiment of the application.
  • FIG. 12 is a schematic flowchart of a camera calling method provided by an embodiment of the application.
  • FIG. 13 is a schematic structural diagram of a control terminal provided by an embodiment of the application.
  • FIG. 14 is a schematic structural diagram of a camera calling device provided by an embodiment of the application.
  • FIG. 15 is a schematic structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 16 is a schematic structural diagram of a camera calling system provided by an embodiment of the application.
  • the embodiments of the present application provide a method, device, and storage medium for invoking a camera.
  • the embodiments of the present application can be applied to various types of drones.
  • the drone can be a small or large drone.
  • the drone may be a rotorcraft, for example, a multi-rotor drone that is propelled through the air by a plurality of propulsion devices.
  • the embodiments of the present application are not limited to this. It will be obvious to the skilled person that other types of drones can be used without restrictions.
  • Fig. 1 is a schematic architecture diagram of an unmanned aerial system provided by an embodiment of the application.
  • the unmanned aerial system 100 may include a drone 110, a display device 130, and a control terminal 140.
  • the UAV 110 may include a power system 150, a flight control system 160, a frame, and a pan/tilt 120 carried on the frame.
  • the drone 110 can wirelessly communicate with the control terminal 140 and the display device 130.
  • the drone 110 further includes a battery (not shown in the figure), and the battery provides electrical energy for the power system 150.
  • the UAV 110 may be an agricultural UAV or an industrial-application UAV that needs to operate in cycles; correspondingly, the battery also needs to support cyclic operation.
  • the frame may include a fuselage and a tripod (also called a landing gear).
  • the fuselage may include a center frame and one or more arms connected to the center frame, and the one or more arms extend radially from the center frame.
  • the tripod is connected with the fuselage and used for supporting the UAV 110 when it is landed.
  • the power system 150 may include one or more electronic governors (referred to as ESCs) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, where each motor 152 is connected between an electronic governor 151 and a propeller 153, and the motors 152 and propellers 153 are arranged on the arms of the UAV 110; the electronic governor 151 is used to receive the driving signal generated by the flight control system 160 and, according to the driving signal, provide driving current to the motor 152 to control the speed of the motor 152.
  • the motor 152 is used to drive the propeller to rotate, thereby providing power for the flight of the drone 110, and the power enables the drone 110 to realize one or more degrees of freedom of movement.
  • the drone 110 may rotate about one or more rotation axes.
  • the aforementioned rotation axis may include a roll axis (Roll), a yaw axis (Yaw), and a pitch axis (pitch).
  • the motor 152 may be a DC motor or an AC motor.
  • the motor 152 may be a brushless motor or a brushed motor.
  • the flight control system 160 may include a flight controller 161 and a sensing system 162.
  • the sensing system 162 is used to measure the attitude information of the drone, that is, the position information and state information of the drone 110 in space, such as three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration, and three-dimensional angular velocity.
  • the sensing system 162 may include, for example, at least one of sensors such as a gyroscope, an ultrasonic sensor, an electronic compass, an inertial measurement unit (IMU), a vision sensor, a global navigation satellite system, and a barometer.
  • the global navigation satellite system may be the Global Positioning System (GPS).
  • the flight controller 161 is used to control the flight of the drone 110, for example, it can control the flight of the drone 110 according to the attitude information measured by the sensor system 162. It should be understood that the flight controller 161 can control the drone 110 according to pre-programmed program instructions, and can also control the drone 110 by responding to one or more remote control signals from the control terminal 140.
  • the pan/tilt head 120 may include a motor 122.
  • the pan/tilt is used to carry a load, and the load may be the camera 123.
  • the flight controller 161 can control the movement of the pan/tilt 120 through the motor 122.
  • the pan/tilt head 120 may further include a controller for controlling the movement of the pan/tilt head 120 by controlling the motor 122.
  • the pan-tilt 120 may be independent of the drone 110 or a part of the drone 110.
  • the motor 122 may be a DC motor or an AC motor.
  • the motor 122 may be a brushless motor or a brushed motor.
  • the pan/tilt may be located on the top of the drone or on the bottom of the drone.
  • the photographing device 123 may be, for example, a device for capturing images, such as a camera or a video camera, and the photographing device 123 may communicate with the flight controller and take pictures under the control of the flight controller.
  • the imaging device 123 of this embodiment at least includes a photosensitive element, and the photosensitive element is, for example, a Complementary Metal Oxide Semiconductor (CMOS) sensor or a Charge-coupled Device (CCD) sensor. It can be understood that the camera 123 can also be directly fixed to the drone 110, so the pan/tilt 120 can be omitted.
  • the display device 130 is located on the ground end of the unmanned aerial vehicle 100, can communicate with the drone 110 in a wireless manner, and can be used to display the attitude information of the drone 110.
  • the image photographed by the photographing device 123 may also be displayed on the display device 130. It should be understood that the display device 130 may be an independent device or integrated in the control terminal 140.
  • the control terminal 140 is located on the ground end of the unmanned aerial vehicle 100, and can communicate with the drone 110 in a wireless manner for remote control of the drone 110.
  • In the related art, the camera call is realized through the device identifier of the pan/tilt (PTZ), where each pan/tilt is configured with two device identifiers (deviceIDs).
  • The specific process is as follows: when the UAV system is initialized, a correspondence table of pan/tilts and camera indexes is configured. When a single-lens camera needs to be called, the camera call instruction input by the user is obtained, the camera index and the calling content are extracted from it, the pan/tilt is determined according to the camera index and the above correspondence, the first deviceID of the pan/tilt is used as the camera's deviceID, and a camera call instruction is generated from the deviceID and the calling content to complete the call.
  • the above calling method is only for a single-lens camera.
  • the load is a dual-lens camera, the camera is divided into two single-lens cameras according to the number of lenses.
  • One of the single-lens cameras is called in the same way as described above, which is not repeated here.
  • To call the other single-lens camera, a camera index is configured for it, and a correspondence between that camera index and a deviceID is built.
  • When that single-lens camera needs to be called, the camera call instruction entered by the user is obtained, the camera index and the calling content are extracted from it, the deviceID is determined from the camera index and the correspondence, and the camera call instruction is generated from the deviceID and the calling content to complete the call.
  • the multi-lens camera ZENMUSE XT2 is taken as an example to illustrate the camera calling method in the related technology.
  • It is abstracted into two single-lens cameras at the software layer: a visible-light lens camera and a thermal-imaging lens camera, and a camera index is configured for the visible-light lens and the thermal-imaging lens respectively.
  • the camera index of the visible light lens camera is bound to the gimbal where the XT2 camera is located, and the default device identifier of the gimbal is assigned to the camera index of the visible light lens.
  • the camera index of the thermal imaging lens camera is defined as 2, and the camera index is directly bound to the device identification.
  • This application provides a camera calling method, device and storage medium. As shown in Fig. 3, this application introduces the concept of lens and no longer splits the multi-lens camera into multiple single-lens cameras.
  • The multi-lens camera is configured with two levels of index: the first level is the camera index and the second level is the lens index. The camera index is used to call the common attributes of multiple lenses, that is, a camera-layer call; the lens index, used in combination with the camera index, calls the private attributes of a single lens, that is, a lens-layer call. A common-attribute calling method is configured for the multiple lenses, and each lens is configured with a private-attribute calling method. The call attribute and the call index are configured in the call request.
  • the calling index is configured as the camera index, and the calling attribute is a common attribute.
  • the main lens of the camera receives the calling instruction generated according to the calling request, and distributes the calling instruction to each lens, so as to realize the calling of common attributes of multiple lenses. Only one instruction is needed to realize the call of common attributes.
  • the calling index is configured as camera index and lens index, and the calling attributes are private attributes.
  • The corresponding lens receives the call instruction generated according to the call request. Multiple lenses are allowed to reuse the same device identifier, and the calling attribute is used to determine the corresponding lens among the lenses sharing that device identifier, which makes it possible to expand the number of lenses on the pan/tilt.
  • the device identification is determined jointly by the camera index and the lens index, so that the two-level structure of the index corresponds to the two-level structure of the camera lens, which is convenient for camera calling.
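The common-attribute path summarized above (the main lens receives the call instruction and distributes it to every lens) can be sketched as follows. This is a minimal illustration of the mechanism only; all class and attribute names are invented, not SDK code.

```python
# Minimal sketch (invented names): a common-attribute call instruction is
# delivered to the main lens, which fans it out to the auxiliary lenses,
# so a single instruction updates a shared attribute on all lenses.
class Lens:
    def __init__(self, name):
        self.name = name
        self.attributes = {}

    def apply(self, attribute, value):
        self.attributes[attribute] = value

class MultiLensCamera:
    def __init__(self, main_lens, auxiliary_lenses):
        self.main_lens = main_lens
        self.auxiliary_lenses = auxiliary_lenses

    def call_common(self, attribute, value):
        # The main lens receives the instruction and distributes it to each lens.
        for lens in [self.main_lens, *self.auxiliary_lenses]:
            lens.apply(attribute, value)

camera = MultiLensCamera(Lens("main"), [Lens("aux1"), Lens("aux2")])
camera.call_common("shooting_mode", "photo")
```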
  • FIG. 4 is a schematic diagram of an application scenario provided by an embodiment of the application.
  • FIG. 4 shows a drone 201 and a control terminal 202 of the drone.
  • the control terminal 202 of the drone 201 may be one or more of a remote control, a smart phone, a desktop computer, a laptop computer, and a wearable device (watch, bracelet).
  • In FIG. 4, the control terminal 202 is illustrated schematically using the remote controller 2021 and the terminal device 2022 as an example.
  • FIG. 5 is a schematic flowchart of a camera calling method provided by an embodiment of the application. As shown in Figure 5, the camera invocation method provided in this embodiment can be applied to a control terminal, and the method includes:
  • the control terminal obtains a call request for calling the camera.
  • the camera carried on the pan/tilt is a multi-lens camera, and the camera includes a main lens and multiple auxiliary lenses.
  • each lens can work independently; for example, the main lens can work alone, or the main lens and multiple auxiliary lenses can work in collaboration.
  • the user can generate a call request by operating the application APP, or generate a call request in other ways, and there is no restriction here.
  • the call request includes the call attribute and the call index.
  • the calling attribute is the specific calling content of the lens, for example: the calling request is used to set the resolution of the lens, and the calling attribute includes the specific method of setting the resolution.
  • the call request is used to set the camera mode of the lens, and the call attribute includes a specific method for setting the camera mode, and the method includes parameter adjustment such as exposure and focal length.
  • the types of call attributes include public attributes and private attributes. Common attributes refer to the common attributes of each lens, such as the camera mode of the lens. Private attributes are the private attributes of each lens individually, for example: lens resolution.
  • the call index is used to determine the device ID of the camera.
  • In the UAV system, the camera is mounted on the pan/tilt, each pan/tilt is configured with a device identifier, and the camera is called through the device identifier.
  • For the safety of drone control, usually only the camera index is disclosed, and the device identifier is determined from the camera index.
  • When the calling attribute is a common attribute, the calling index only includes the camera index.
  • When the calling attribute is a private attribute, the calling index includes the camera index and the lens index. The lens index cannot be used alone and must be used together with the camera index.
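The call-request structure described above can be sketched as follows. The field and attribute names here are illustrative assumptions; the patent does not specify a concrete data format.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the call request: a common-attribute request carries
# only the camera index; a private-attribute request carries both the camera
# index and the lens index (the lens index is never used alone).
@dataclass
class CallRequest:
    attribute: str                      # calling content, e.g. "shooting_mode" or "resolution"
    camera_index: int                   # first-level index: selects the camera
    lens_index: Optional[int] = None    # second-level index: selects one lens (private calls only)

    def is_common(self) -> bool:
        # A request with no lens index targets the common attributes of all lenses.
        return self.lens_index is None

# A common-attribute call: set the shooting mode of every lens of camera 0.
common = CallRequest("shooting_mode", camera_index=0)
# A private-attribute call: set the resolution of lens 1 of camera 0.
private = CallRequest("resolution", camera_index=0, lens_index=1)
```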
  • S302 The control terminal determines the request type of the invocation request according to the invocation attribute.
  • It is determined whether the calling attribute is a common attribute of all lenses; if it is, the request type is set to the first type, and if not, the request type is set to the second type.
  • the first type is to call the common attributes of all lenses.
  • the second type is to call the private attributes of each lens.
  • the control terminal generates a camera call instruction according to the request type, the call attribute, and the call index.
  • When the calling attribute is a common attribute, the calling index only includes the camera index, and the camera call instruction is generated according to the camera index and the common attribute.
  • When the calling attribute is a private attribute, the calling index includes the camera index and the lens index, and the camera call instruction is generated according to the camera index, the lens index, and the private attribute.
  • In the camera calling method, the calling attribute is set in the calling request: if a common attribute is called, the calling attribute of the common attribute is configured in the request; if a private attribute is called, the calling attribute of the private attribute is configured in the request.
  • A lens index is introduced, which forms a two-level index with the camera index to configure the common and private attributes of multiple lenses, so that the index structure matches the actual physical structure of the camera more closely. The specific lens is located according to the camera index and the lens index, so no two identical lenses reuse the same index, and it becomes possible to mount multiple identical multi-lens cameras on the drone.
  • The pan/tilt is associated according to the camera index, and the pan/tilt is either directly associated with the device identifier, or the pan/tilt and the lens index together are associated with the device identifier, which is convenient for users to call.
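Steps S302 and S303 above can be sketched together: classify the request by its calling attribute, then build the instruction from the matching index fields. The set of common attributes and the instruction format below are illustrative assumptions.

```python
# Hypothetical sketch of S302-S303. COMMON_ATTRIBUTES is an assumed
# configuration listing which attributes are shared by all lenses.
COMMON_ATTRIBUTES = {"shooting_mode", "exposure"}

FIRST_TYPE = 1   # call the common attributes of all lenses
SECOND_TYPE = 2  # call the private attributes of a single lens

def request_type(attribute):
    # S302: determine the request type from the calling attribute.
    return FIRST_TYPE if attribute in COMMON_ATTRIBUTES else SECOND_TYPE

def build_instruction(attribute, camera_index, lens_index=None):
    # S303: generate the camera call instruction from type, attribute, and index.
    rtype = request_type(attribute)
    if rtype == FIRST_TYPE:
        # Common attribute: only the camera index is needed.
        return {"type": rtype, "camera_index": camera_index, "attribute": attribute}
    # Private attribute: camera index and lens index together locate the lens.
    return {"type": rtype, "camera_index": camera_index,
            "lens_index": lens_index, "attribute": attribute}
```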
  • the camera invocation method provided in the embodiments of the present application can be applied to a control terminal, and the method includes:
  • S401 The control terminal obtains a calling request for calling a camera.
  • S402 The control terminal determines the request type of the invocation request according to the invocation attribute.
  • the control terminal generates a camera call instruction according to the request type, the call attribute, and the call index.
  • When the call index only includes the camera index, the call attribute is a common attribute of all lenses.
  • The correspondence between the camera index and the pan/tilt identifier is called up from the configuration table, and the pan/tilt carrying the camera is determined according to the camera index and that correspondence; the camera call instruction is then generated according to the pan/tilt and the call attribute.
  • Generating the camera call instruction according to the pan/tilt and the calling attribute specifically includes: using the first device identifier of the pan/tilt as the device identifier associated with the camera index, and generating the camera call instruction according to that device identifier and the calling attribute.
  • the camera call instruction can be generated according to the device identification and the call attribute by a known algorithm, which will not be repeated here.
  • When the calling index includes the camera index and the lens index, the calling attribute is a private attribute of a single lens.
  • To generate the camera call instruction, the device identifier is first determined according to the camera index and the lens index, and the camera call instruction is then generated according to the device identifier and the calling attribute.
  • Determining the device identifier according to the camera index and the lens index specifically includes: calling up the correspondence between the camera index and the pan/tilt identifier from the configuration table, and determining the pan/tilt carrying the camera according to the camera index and that correspondence; the device identifier matching the lens index is then determined according to the pan/tilt and the lens index.
  • Determining the device identifier matching the lens index according to the pan/tilt and the lens index specifically includes: calling up, from the configuration table, the relationship tables indicating the correspondence between device identifiers and lens indexes, and determining the relationship table associated with the pan/tilt; after the relationship table is determined, the device identifier that matches the lens index is determined according to the correspondence between device identifiers and lens indexes.
  • the lens index of the main lens and the camera index correspond to the same device identification.
  • the relationship table includes: the corresponding relationship between the lens index of the main lens and the first device identifier, and the corresponding relationship between the lens index of the auxiliary lens and the second device identifier. There is also a corresponding relationship between the camera index and the first device identifier.
  • the pan/tilt associated with the camera index is determined according to the correspondence between the camera index and the pan/tilt, and then the device identifier is determined as the first device identifier based on the pan/tilt associated with the camera index.
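The two-step lookup described above (camera index → pan/tilt, then pan/tilt plus lens index → device identifier, with the camera index falling back to the first device identifier) can be sketched as follows. All table contents and identifier strings here are invented for illustration; the patent does not disclose concrete values.

```python
# Hypothetical configuration table: camera index -> pan/tilt (gimbal).
CAMERA_TO_GIMBAL = {0: "gimbal1", 1: "gimbal2", 2: "gimbal3"}

# Hypothetical per-gimbal relationship tables: lens index -> device identifier.
# The main lens (index 0) reuses the first device identifier, which is also
# bound to the camera index; the auxiliary lenses share the second identifier.
LENS_TO_DEVICE_ID = {
    "gimbal1": {0: "devA1", 1: "devA2", 2: "devA2"},
    "gimbal2": {0: "devB1", 1: "devB2", 2: "devB2"},
    "gimbal3": {0: "devC1", 1: "devC2", 2: "devC2"},
}

def resolve_device_id(camera_index, lens_index=None):
    # Step 1: the camera index and the configuration table give the pan/tilt.
    gimbal = CAMERA_TO_GIMBAL[camera_index]
    table = LENS_TO_DEVICE_ID[gimbal]
    if lens_index is None:
        # Common-attribute call: the camera index maps to the first device ID
        # (the one shared with the main lens).
        return table[0]
    # Step 2: the pan/tilt's relationship table maps the lens index to the ID.
    return table[lens_index]
```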
  • the control terminal sends a camera calling instruction to the drone.
  • control terminal sends the camera invocation instruction to the controlled drone after generating the camera invocation instruction.
  • The camera's lenses are divided into a main lens and auxiliary lenses; the lens index of the main lens and the camera index are multiplexed onto the first device identifier, and the lens indexes of the auxiliary lenses are multiplexed onto the same second device identifier.
  • the architecture of the mobile-terminal software development tools used to control drones includes: an application layer (Application, abbreviated APP), an SDK layer, a SharedLib layer, a MidWare layer, and a firmware layer.
  • the SDK layer is used to provide external interfaces for APP.
  • the SharedLib layer is an abstraction of all the components on the drone.
  • Each component is configured with a component object DJISDKCacheHWAbstraction, where HW stands for Hardware, and each component object contains methods for setting the attributes of each component.
  • The component attributes may be the inherent attributes of the component and the attributes that control the component's operation.
  • Each component object also contains an index Index, which is used to determine which component is specific when there are multiple identical components.
  • each camera can abstract a camera object CameraAbstraction.
  • CameraAbstraction only includes the camera index Index and the lens index SubIndex of each lens. Combining the camera index Index and the lens index SubIndex can determine the called lens from the lenses of many cameras.
  • the three gimbals of the UAV carry three cameras in a one-to-one correspondence, and each camera is equipped with three lenses. Then the camera index and lens index of each camera are shown in Table 1 below.
  • the user needs to call the first lens of the first camera, enter the camera index as 0 and the lens index as 0 in the camera object CameraAbstraction to realize the call.
  • Table 1 is the camera index and lens index table:

    Table 1
    Camera     Camera index (Index)    Lens indexes (SubIndex)
    Camera 1   0                       0, 1, 2
    Camera 2   1                       0, 1, 2
    Camera 3   2                       0, 1, 2
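The camera-object abstraction described above, with its two-level Index/SubIndex addressing over the three three-lens cameras of Table 1, can be sketched as follows. The Python class is an illustration only; the real component objects are SDK classes.

```python
# Hypothetical sketch of the ShareLib-layer camera object: each camera carries
# a camera index (Index) and one lens index (SubIndex) per lens, and combining
# the two locates a single lens among all cameras (per Table 1).
class CameraAbstraction:
    def __init__(self, index, lens_count):
        self.index = index                           # camera index (Index)
        self.sub_indexes = list(range(lens_count))   # lens indexes (SubIndex)

# Three gimbals carry three cameras, each equipped with three lenses.
cameras = [CameraAbstraction(i, 3) for i in range(3)]

def locate(camera_index, lens_index):
    # Combining Index and SubIndex determines the called lens.
    cam = cameras[camera_index]
    if lens_index not in cam.sub_indexes:
        raise ValueError("no such lens on this camera")
    return (cam.index, lens_index)

# Calling the first lens of the first camera: camera index 0, lens index 0.
first_lens = locate(0, 0)
```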
  • the camera index Index is bound to the gimbal, that is, there is a corresponding relationship between the camera index Index and the gimbal.
  • the corresponding relationship between the camera index Index and the pan/tilt in this embodiment is shown in Table 2 below.
  • Table 2 shows the correspondence between the PTZ and the camera index
  • the camera index Index is 0, and the camera is carried on the No. 1 gimbal.
  • the camera index Index is 1, and the camera is carried on the 2nd gimbal.
  • the camera index Index is 2, and the camera is carried on the 3rd gimbal.
  • The MidWare layer transmits data packets to the firmware layer, and the firmware layer forwards them to the drone, because the UAV uses DeviceType and DeviceId to uniquely identify the hardware devices on the UAV, where:
  • DeviceType refers to the device type;
  • DeviceId refers to the device identifier. From the camera object, the device type can be determined to be a camera, so the ShareLib layer only needs to determine the camera's DeviceId according to the indexes in the camera object.
  • the device identification of the gimbal is usually assigned to the camera.
  • The correspondence between each lens index (SubIndex) and the pan/tilt's device identifier (deviceID) is configured, so that ShareLib can determine the device identifier according to the index.
  • the camera index is also configured with a corresponding device ID deviceID.
  • each pan/tilt only reserves two device identities for the camera, DeviceId.
  • the camera index and the main lens index can be reused with the same device identification, and other auxiliary lenses can reuse the same device identification.
  • Table 3 lists the device identifiers of each gimbal. The correspondence between lens indexes and device identifiers based on the identifiers in Table 3 is shown in Table 4 below. With the camera index and the main lens index reusing the same device identifier, the device identifier obtained from the camera index is shown in Table 5. Combining Tables 4 and 5, the device identifiers generated when calling the common attributes and private attributes of the 3-lens camera carried on the gimbal numbered Gimbal 1 are shown in FIG. 8.
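The device-identifier scheme of Tables 3 to 5 can be sketched as follows; this is an illustrative reconstruction, with the gimbal names and identifier values taken from the tables in the text and the function names invented here:

```python
# Tables 3-5 as data: each gimbal reserves two device identifiers; the
# camera index and the main lens (lens index 0) reuse the first one, and
# all auxiliary lenses reuse the second one.
GIMBAL_DEVICE_IDS = {            # Table 3
    "Gimbal 1": ("0100", "0101"),
    "Gimbal 2": ("0102", "0103"),
    "Gimbal 3": ("0104", "0105"),
}

def device_id_for_camera(gimbal: str) -> str:
    """Table 5: the camera index maps to the gimbal's first device ID."""
    return GIMBAL_DEVICE_IDS[gimbal][0]

def device_id_for_lens(gimbal: str, lens_index: int) -> str:
    """Table 4: main lens -> first device ID; auxiliary lenses -> second."""
    first, second = GIMBAL_DEVICE_IDS[gimbal]
    return first if lens_index == 0 else second
```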
  • the XT2 camera object DJICameraXT2CameraAbstraction contains the visible light lens object DJILensXT2VisibleLensAbstraction and the infrared lens object DJILensXT2ThermalLensAbstraction.
  • the XT2 camera object DJICameraXT2CameraAbstraction inherits the multi-lens camera object DJIMultiLensCameraAbstraction.
  • the multi-lens camera object DJIMultiLensCameraAbstraction further inherits the component object DJISDKCacheHWAbstraction.
  • the visible light lens object DJILensXT2VisibleLensAbstraction inherits the lens object DJILensAbstraction, and the lens object DJILensAbstraction inherits the subcomponent object DJISubComponentHWAbstraction.
  • the infrared lens object DJILensXT2ThermalLensAbstraction inherits the lens object DJILensAbstraction, and the lens object DJILensAbstraction inherits the subcomponent object DJISubComponentHWAbstraction.
  • the sub-component object DJISubComponentHWAbstraction inherits from the component object DJISDKCacheHWAbstraction, but neither of the two carries inherited camera or lens functionality. Therefore, when calling the attributes common to multiple lenses, the XT2 camera object can be called; when calling the private attributes of a single lens, the visible light lens object or the infrared lens object can be called. When multiple lenses reuse the same device identifier, the calling attribute inside the object distinguishes which lens is actually being called.
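The inheritance relationships described above can be mirrored in a minimal sketch; only the hierarchy is taken from the text, and the class bodies are placeholders:

```python
# Minimal mirror of the ShareLib object hierarchy: component and
# sub-component objects at the root, camera objects and lens objects on
# separate branches.
class DJISDKCacheHWAbstraction:                                   # component
    pass

class DJISubComponentHWAbstraction(DJISDKCacheHWAbstraction):     # sub-component
    pass

class DJIMultiLensCameraAbstraction(DJISDKCacheHWAbstraction):    # multi-lens camera
    pass

class DJICameraXT2CameraAbstraction(DJIMultiLensCameraAbstraction):
    pass

class DJILensAbstraction(DJISubComponentHWAbstraction):           # lens
    pass

class DJILensXT2VisibleLensAbstraction(DJILensAbstraction):
    pass

class DJILensXT2ThermalLensAbstraction(DJILensAbstraction):
    pass
```

A common-attribute call goes through DJICameraXT2CameraAbstraction on the camera branch, while a private-attribute call goes through one of the lens classes on the sub-component branch, which is why the two branches share a root but no camera or lens functionality.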
  • the XT2 camera object XT2Camera includes the XT2 camera visible light lens object XT2VisbleLens and the XT2 camera infrared lens object XT2ThermalLens, which conforms to the physical structure of the XT2 camera.
  • the XT2 camera object XT2Camera inherits the basic camera object BaseCamera, and the basic camera object BaseCamera inherits the camera object Camera.
  • the visible light lens object XT2VisbleLens of the XT2 camera and the infrared lens object XT2ThermalLens of the XT2 camera both inherit the base lens BaseLens, and the base lens BaseLens inherits the lens Lens.
  • another camera calling method provided by an embodiment of the present application is described below with reference to the control terminal's software architecture and FIG. 11.
  • Another camera invocation method provided by the embodiment of the present application can be applied to a control terminal, and the method includes:
  • S501 The control terminal obtains a call request for calling the camera.
  • if the call request is of the first type, which calls common attributes,
  • the call request is the call interface of the XT2 camera object XT2Camera
  • the SDK layer parses the call interface, generates a DJIKey, and transfers the DJIKey to the Sharelib layer.
  • S502 The control terminal determines the request type of the invocation request according to the invocation attribute.
  • after receiving the DJIKey, the Sharelib layer parses it to generate the calling attribute FeatureID and the calling index, and determines from the FeatureID whether the request type is a common-attribute call or a private-attribute call. If the FeatureID is a common attribute, the calling index is the camera index; if it is a private attribute, the calling index is the camera index and the lens index.
  • the control terminal generates a camera call instruction according to the request type, the call attribute, and the call index.
  • the Midware layer generates the camera call instruction according to the request type, the calling attribute, and the calling index. If the request type is a common-attribute call, the Midware layer determines the device identifier from the camera index and then generates the camera call instruction from the device identifier and the calling attribute. If the request type is a private-attribute call, the Midware layer determines the device identifier from the camera index and the lens index and then generates the camera call instruction from the device identifier and the calling attribute.
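The Midware-layer branch just described can be sketched as follows; the gimbal and device-identifier tables follow Tables 2 to 4 of the text, while the instruction format and function name are invented for illustration:

```python
# Hypothetical sketch of Midware-layer instruction generation: a
# common-attribute request resolves the device ID from the camera index
# alone, a private-attribute request from the camera index plus lens index.
CAMERA_INDEX_TO_GIMBAL = {0: "Gimbal 1", 1: "Gimbal 2", 2: "Gimbal 3"}
GIMBAL_DEVICE_IDS = {"Gimbal 1": ("0100", "0101"),
                     "Gimbal 2": ("0102", "0103"),
                     "Gimbal 3": ("0104", "0105")}

def generate_call_instruction(request_type, attribute, camera_index, lens_index=None):
    gimbal = CAMERA_INDEX_TO_GIMBAL[camera_index]
    first, second = GIMBAL_DEVICE_IDS[gimbal]
    if request_type == "common":
        device_id = first                 # camera index reuses the first ID
    else:                                 # "private"
        device_id = first if lens_index == 0 else second
    return {"deviceId": device_id, "attribute": attribute}
```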
  • the calling attribute is a common attribute
  • the calling index is the camera index
  • the camera index is 0.
  • the gimbal is Gimbal1.
  • the first device ID of the gimbal Gimbal 1 is 0100, and it is determined that the device ID corresponding to the camera index of 0 is 0100.
  • the calling index is the camera index and the lens index. If the camera index is 0 and the lens index is 0, the gimbal is determined from the camera index to be Gimbal 1. From the relationship between lens indexes and device identifiers for gimbal Gimbal 1, the device identifier is determined to be 0100, the same as the device identifier corresponding to the camera index. Correspondingly, if the camera index is 0 and the lens index is 1, the device identifier is determined to be 0101.
  • FIG. 12 is a schematic flowchart of a camera calling method provided by an embodiment of the application. As shown in FIG. 12, another camera invocation method provided by an embodiment of the present application can be applied to a camera invocation device, and the method includes:
  • the camera calling device receives a camera calling instruction from the control terminal.
  • the camera call instruction is a camera call instruction generated according to the camera call method provided in the foregoing embodiment.
  • the camera calling device parses the camera calling instruction, and generates a device identification and calling attributes.
  • the UAV internally stores a parsing method that can recover the device identifier and the calling attribute according to the algorithm the control terminal used to generate the camera instruction.
  • the camera calling device determines to call the lens according to the calling attribute and the device identifier.
  • determining the calling lens according to the calling attribute and the device identifier specifically includes: determining at least one candidate lens associated with the device identifier, and determining the calling lens from the candidate lenses according to the calling attribute.
  • determining at least one candidate lens associated with the device identifier specifically includes: determining the associated gimbal that matches the device identifier according to the relationship between device identifiers and gimbal identifiers; determining the associated camera that matches the associated gimbal according to the preset relationship between gimbals and camera indexes; and selecting candidate lenses from the lenses of the associated camera according to the device identifier.
  • determining the calling lens from the candidate lenses according to the calling attribute specifically includes: if the calling attribute is an attribute common to all lenses, the calling lenses are all the lenses of the associated camera; if the calling attribute is a private attribute of an individual lens, the calling lens is selected from the candidate lenses according to the relationship between calling attributes and lenses.
  • different types of auxiliary lenses differ in parameters or operating methods, so the calling attributes of the auxiliary lenses also differ.
  • the specific calling lens can therefore be selected, according to the correspondence between calling attributes and auxiliary lenses, from the candidate lenses that share the same device identifier.
  • each gimbal reserves only two device identifiers, DeviceId, for the camera.
  • the camera index and the main lens index can reuse one device identifier, and the other auxiliary lenses can reuse the other device identifier.
  • selecting candidate lenses from the lenses of the associated camera according to the device identifier specifically includes: if the device identifier is the first device identifier, the candidate lenses are all the lenses of the associated camera; if the device identifier is the second device identifier, the candidate lenses are the auxiliary lenses of the associated camera.
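A minimal sketch of this candidate-lens selection; the lens names and function signature are invented for illustration:

```python
# Hypothetical device-side candidate-lens selection: the first device ID
# selects every lens of the associated camera, the second device ID selects
# only the auxiliary lenses.
def candidate_lenses(device_id, first_id, second_id, lenses):
    """`lenses` maps lens index -> lens name; index 0 is the main lens."""
    if device_id == first_id:
        return list(lenses.values())                        # all lenses
    if device_id == second_id:
        return [name for idx, name in lenses.items() if idx != 0]
    return []
```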
  • S604 The camera calling device sets the attributes of the calling lens according to the calling attributes.
  • after the calling lens is determined, the lens is set according to the specific inherent parameters or running process in the calling attribute, completing the camera calling process.
  • when the device identifier corresponds to the main lens and the calling attribute is a common attribute, the main lens distributes the calling attribute to the other auxiliary lenses, and each auxiliary lens performs its parameter settings or running-process settings according to the calling attribute.
  • when the device identifier corresponds to the main lens and the calling attribute is a private attribute, the main lens performs its parameter settings or running-process settings according to the calling attribute.
  • when the device identifier corresponds to an auxiliary lens and the calling attribute is a private attribute, the calling lens is the auxiliary lens corresponding to that private attribute, and it performs its parameter settings or running-process settings according to the calling attribute.
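The three dispatch cases above can be sketched in one function; the attribute and lens names are invented, and the real distribution from the main lens to the auxiliary lenses would happen inside the camera:

```python
# Hypothetical attribute dispatch once the calling lens is known: a common
# attribute arriving at the main lens is fanned out to every lens, while a
# private attribute is applied only to its target lens.
def apply_call(attribute_kind, attribute, target_lens, all_lenses):
    applied = {}
    if attribute_kind == "common" and target_lens == "main":
        for lens in all_lenses:           # main lens distributes the call
            applied[lens] = attribute
    else:                                 # private attribute, single lens
        applied[target_lens] = attribute
    return applied
```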
  • in this camera calling method, a calling attribute is set in the call request: a request that calls a common attribute is configured with the common calling attribute, and a request that calls a private attribute is configured with the private calling attribute, so a single call request suffices to call the common attributes of multiple lenses, simplifying the calling process.
  • the camera's lenses are divided into a main lens and auxiliary lenses: the lens index of the main lens and the camera index reuse the first device identifier, the lens indexes of the auxiliary lenses reuse the same second device identifier, and the calling attribute then distinguishes the lenses that reuse the same device identifier, so the number of lenses is not limited by the device identifiers.
  • FIG. 13 is a schematic structural diagram of a control terminal provided by an embodiment of this application. As shown in FIG. 13, the control terminal provided by an embodiment of this application includes:
  • the acquiring unit 701 is configured to acquire a calling request for calling a camera, where the camera includes a main lens and multiple auxiliary lenses, and the calling request includes a calling attribute and a calling index;
  • the determining unit 702 is configured to determine the request type of the call request according to the calling attribute, where the request type includes a first type that calls the attributes common to all lenses and a second type that calls the private attributes of each lens;
  • the generating unit 703 is configured to generate a camera call instruction according to the request type, call index, and call attribute.
  • the generating unit 703 is specifically configured to:
  • the calling index is the camera index
  • determine the gimbal carrying the camera according to the camera index and the correspondence between camera indexes and gimbal identifiers
  • the generating unit 703 is specifically configured to:
  • a camera calling instruction is generated.
  • the calling attribute is an attribute common to all lenses.
  • the generating unit 703 is specifically configured to:
  • the calling index includes the camera index and the lens index
  • the camera call instruction is generated according to the device identification and the call attribute.
  • the generating unit 703 is specifically configured to:
  • determine the device identifier matching the lens index according to the gimbal and the lens index.
  • the generating unit 703 is specifically configured to:
  • the relationship table is used to indicate the correspondence between device identifiers and lens indexes
  • the device identifier matching the lens index is determined according to the correspondence between device identifiers and lens indexes.
  • the corresponding relationship between the device identifier and the lens index specifically includes:
  • the calling attribute is a private attribute of each lens.
  • the determining unit 702 is specifically configured to:
  • determine whether the calling attribute is an attribute common to all lenses; if so, the request type is set to the first type, and if not, the request type is set to the second type.
  • control terminal further includes a sending unit 704, and the sending unit 704 is configured to:
  • FIG. 14 is a schematic structural diagram of a camera calling device provided by an embodiment of the application. As shown in FIG. 14, the camera invoking device provided by the embodiment of the present application includes:
  • the receiving unit 801 is configured to receive a camera call instruction from the control terminal;
  • the generating unit 802 is used to parse the camera call instruction to generate a device identification and call attributes
  • the determining unit 803 is configured to determine the calling lens according to the calling attribute and the device identifier
  • the setting unit 804 is used to set the attributes of the calling lens according to the calling attributes.
  • the determining unit 803 is specifically configured to:
  • the determining unit 803 is specifically configured to:
  • determine, according to the relationship between device identifiers and gimbal identifiers, the associated gimbal that matches the device identifier;
  • the determining unit 803 is specifically configured to:
  • the candidate lenses are all lenses of the associated camera
  • the candidate lens is the auxiliary lens of the associated camera.
  • the determining unit 803 is specifically configured to:
  • the calling lenses are all the lenses of the associated camera
  • the calling attribute is a private attribute of each lens
  • FIG. 15 is a schematic structural diagram of an electronic device shown in an embodiment of the application.
  • the electronic device 900 provided in this embodiment includes: a transmitter 901, a receiver 902, a memory 903, and a processor 904.
  • the transmitter 901 is used to send instructions and data
  • the receiver 902 is used to receive instructions and data
  • the memory 903 is used to store computer execution instructions
  • the processor 904 is configured to execute the computer-executable instructions stored in the memory to implement the steps of the camera calling method in the foregoing embodiments. For details, please refer to the relevant description in the aforementioned camera calling method embodiments.
  • the foregoing memory 903 may be independent or integrated with the processor 904.
  • the processing device further includes a bus for connecting the memory 903 and the processor 904.
  • FIG. 16 is a schematic structural diagram of a camera calling system provided by an embodiment of the application.
  • the camera calling system 1000 of this embodiment may include: a camera calling device 1002 and/or a control terminal 1003.
  • the camera calling device 1002 can adopt the structure of the embodiment shown in FIG. 15, and correspondingly, it can execute the technical solutions of the drone in the foregoing method embodiments.
  • the implementation principles and technical effects are similar and will not be repeated here.
  • control terminal 1003 can adopt the structure of the embodiment shown in FIG. 15, which can correspondingly execute the technical solutions of the control terminal in the foregoing method embodiments.
  • the implementation principles and technical effects are similar, and will not be repeated here.
  • the camera invoking device 1002 may be set on the drone 1001, and the camera invoking device 1002 is, for example, a part of the drone 1001. Alternatively, a part of the camera invoking device 1002 may be set on the drone 1001, and another part of the camera invoking device 1002 may be set on the control terminal 1003.
  • a person of ordinary skill in the art can understand that all or part of the steps in the above method embodiments can be implemented by a program instructing relevant hardware.
  • the foregoing program can be stored in a computer-readable storage medium; when executed, the program performs the steps of the foregoing method embodiments. The foregoing storage medium includes media that can store program code, such as read-only memory (ROM), random access memory (RAM), magnetic disks, or optical disks.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A camera calling method and device. The method includes: obtaining a call request for calling a camera, where the call request includes a calling attribute and a calling index (S301); determining the request type of the call request according to the calling attribute (S302); and generating a camera call instruction according to the request type, the calling attribute, and the calling index (S303). By setting a calling attribute in the call request, a request that calls a common attribute is configured with the common calling attribute, so a single call request suffices to call the common attributes of multiple lenses, simplifying the calling process.

Description

Camera calling method and device
Technical field
This application relates to the field of camera technology, and in particular to a camera calling method and device.
Background
A UAV typically includes a power system, a flight control system, an airframe, and a gimbal carried on the airframe. The power system provides power for the UAV, the flight controller measures and controls the flight state of the UAV, and the gimbal carries a payload; for example, the payload type may be a camera.
Each gimbal is configured with two device identifiers (deviceIDs) and with a table of correspondences between gimbals and camera indexes. When a camera needs to be called, the camera call instruction input by the user is obtained, the camera index and the call content are extracted from it, the gimbal identifier is determined from the camera index and the above correspondence, the gimbal's first deviceID is used as the camera's deviceID, and a machine call instruction is generated from the deviceID and the call content, completing the camera call. This calling method only targets single-lens cameras. When the payload is a dual-lens camera, the camera is split by lens count into two single-lens cameras, one of which is called in the same way as above, which is not repeated here. For the other single-lens camera, a camera index is configured for it and a correspondence between that camera index and a deviceID is built. When that single-lens camera needs to be called, the camera call instruction input by the user is obtained, the camera index and the call content are extracted, the deviceID is determined from the camera index and the correspondence, and a machine call instruction is generated from the deviceID and the call content, completing the camera call.
However, when the attributes common to the lenses of a dual-lens camera need to be called, the user must input a camera call instruction for each single-lens camera separately, and a machine call instruction for each lens is generated from each camera call instruction, making the calling process cumbersome.
Summary of the invention
The embodiments of this application provide a camera calling method and related devices.
In a first aspect, this application provides a camera calling method, including:
obtaining a call request for calling a camera, where the camera includes one main lens and multiple auxiliary lenses, and the call request includes a calling attribute and a calling index;
determining the request type of the call request according to the calling attribute, where the request type includes a first type that calls the attributes common to all lenses and a second type that calls the private attributes of each lens;
generating a camera call instruction according to the request type, the calling attribute, and the calling index.
In a second aspect, this application provides a camera calling method, including:
receiving a camera call instruction from a control terminal;
parsing the camera call instruction to generate a device identifier and a calling attribute;
determining the calling lens according to the calling attribute and the device identifier;
setting the attributes of the calling lens according to the calling attribute.
In a third aspect, this application provides a control terminal, including:
an obtaining unit, configured to obtain a call request for calling a camera, where the camera includes one main lens and multiple auxiliary lenses, and the call request includes a calling attribute and a calling index;
a determining unit, configured to determine the request type of the call request according to the calling attribute, where the request type includes a first type that calls the attributes common to all lenses and a second type that calls the private attributes of each lens;
a generating unit, configured to generate a camera call instruction according to the request type, the calling index, and the calling attribute.
In a fourth aspect, this application provides a camera calling device, including:
a receiving unit, configured to receive a camera call instruction from a control terminal;
a generating unit, configured to parse the camera call instruction to generate a device identifier and a calling attribute;
a determining unit, configured to determine the calling lens according to the calling attribute and the device identifier;
a setting unit, configured to set the attributes of the calling lens according to the calling attribute.
In a fifth aspect, this application provides an electronic device, including:
a memory, configured to store a program;
a processor, configured to execute the program stored in the memory; when the program is executed, the processor performs the camera calling method of the embodiments of this application according to the first or second aspect.
In a sixth aspect, this application provides a calling system, including the control terminal of the third aspect and/or the camera calling device of the fourth aspect.
In a seventh aspect, an embodiment of the present invention provides a computer-readable storage medium including instructions that, when run on a computer, cause the computer to perform the camera calling method of the embodiments of this application according to the first or second aspect.
In an eighth aspect, an embodiment of the present invention provides a computer program product including instructions that, when run on a computer, cause the computer to perform the camera calling method of the embodiments of this application according to the first or second aspect.
The embodiments of this application provide a camera calling method and device. According to the calling attribute in the user call request input at the control terminal, the request type of the call request is determined to be either a call to the attributes common to the camera's lenses or a call to the private attributes of an individual lens. The request type determines how the call instruction is generated from the user call request. By setting a calling attribute in the call request, a request that calls a common attribute is configured with the common calling attribute and a request that calls a private attribute is configured with the private calling attribute, so a single call request suffices to call the common attributes of multiple lenses, simplifying the calling process.
Brief description of the drawings
FIG. 1 is a schematic architecture diagram of an unmanned flight system provided by an embodiment of this application;
FIG. 2 is a schematic diagram of camera calling in the related art;
FIG. 3 is a schematic diagram of the principle of the camera calling method provided by an embodiment of this application;
FIG. 4 is a schematic diagram of an application scenario provided by an embodiment of this application;
FIG. 5 is a schematic flowchart of a camera calling method provided by an embodiment of this application;
FIG. 6 is a schematic flowchart of determining a device identifier from a calling index, provided by an embodiment of this application;
FIG. 7 is a schematic diagram of the software architecture of a control terminal provided by an embodiment of this application;
FIG. 8 is a schematic diagram of the relationship between a multi-lens camera and device identifiers, provided by an embodiment of this application;
FIG. 9 is an architecture diagram of camera objects and lens objects in the ShareLib layer, provided by an embodiment of this application;
FIG. 10 is an architecture diagram of camera objects and lens objects in the SDK layer, provided by an embodiment of this application;
FIG. 11 is a schematic diagram of the principle of a camera calling method provided by an embodiment of this application;
FIG. 12 is a schematic flowchart of a camera calling method provided by an embodiment of this application;
FIG. 13 is a schematic structural diagram of a control terminal provided by an embodiment of this application;
FIG. 14 is a schematic structural diagram of a camera calling device provided by an embodiment of this application;
FIG. 15 is a schematic structural diagram of an electronic device provided by an embodiment of this application;
FIG. 16 is a schematic structural diagram of a camera calling system provided by an embodiment of this application.
Detailed description
The technical solutions in the embodiments of this application will be described clearly and completely below with reference to the accompanying drawings.
The embodiments of this application provide a camera calling method, device, and storage medium. The embodiments of this application can be applied to various types of UAVs. For example, the UAV may be a small or large UAV. In some embodiments, the UAV may be a rotorcraft, for example, a multi-rotor UAV propelled through the air by multiple propulsion devices, but the embodiments of this application are not limited thereto; as will be apparent to those skilled in the art, other types of UAVs may be used without restriction.
FIG. 1 is a schematic architecture diagram of an unmanned flight system provided by an embodiment of this application. This embodiment is described using a rotorcraft UAV as an example. The unmanned flight system 100 may include a UAV 110, a display device 130, and a control terminal 140. The UAV 110 may include a power system 150, a flight control system 160, an airframe, and a gimbal 120 carried on the airframe. The UAV 110 can communicate wirelessly with the control terminal 140 and the display device 130. The UAV 110 also includes a battery (not shown) that supplies electrical power to the power system 150. The UAV 110 may be an agricultural UAV or an industrial-application UAV that needs to operate in repeated cycles; correspondingly, the battery also needs to work in cycles.
The airframe may include a fuselage and landing gear. The fuselage may include a center frame and one or more arms connected to the center frame, the one or more arms extending radially from the center frame. The landing gear is connected to the fuselage and supports the UAV 110 when it lands.
The power system 150 may include one or more electronic speed controllers (ESCs) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, where a motor 152 is connected between an ESC 151 and a propeller 153, and the motors 152 and propellers 153 are arranged on the arms of the UAV 110. The ESC 151 receives a drive signal generated by the flight control system 160 and provides a drive current to the motor 152 according to the drive signal to control the motor's rotation speed. The motor 152 drives the propeller, thereby powering the flight of the UAV 110, and this power enables the UAV 110 to move with one or more degrees of freedom. In some embodiments, the UAV 110 can rotate about one or more rotation axes. For example, the rotation axes may include the roll axis, the yaw axis, and the pitch axis. It should be understood that the motor 152 may be a DC motor or an AC motor, and may be brushless or brushed.
The flight control system 160 may include a flight controller 161 and a sensing system 162. The sensing system 162 measures the attitude information of the UAV, that is, the position and state information of the UAV 110 in space, for example its three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration, and three-dimensional angular velocity. The sensing system 162 may include, for example, at least one of sensors such as a gyroscope, an ultrasonic sensor, an electronic compass, an inertial measurement unit (IMU), a vision sensor, a global navigation satellite system receiver, and a barometer. For example, the global navigation satellite system may be the Global Positioning System (GPS). The flight controller 161 controls the flight of the UAV 110, for example according to the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 can control the UAV 110 according to pre-programmed instructions, or in response to one or more remote control signals from the control terminal 140.
The gimbal 120 may include motors 122. The gimbal carries a payload, and the payload may be a photographing device 123. The flight controller 161 can control the movement of the gimbal 120 through the motors 122. Optionally, in another embodiment, the gimbal 120 may also include a controller for controlling the movement of the gimbal 120 via the motors 122. It should be understood that the gimbal 120 may be independent of the UAV 110 or may be part of the UAV 110, that the motors 122 may be DC or AC motors and may be brushless or brushed, and that the gimbal may be located on the top or the bottom of the UAV.
The photographing device 123 may be, for example, a camera or video camera for capturing images; it can communicate with the flight controller and shoot under the flight controller's control. The photographing device 123 of this embodiment includes at least a photosensitive element, for example a complementary metal oxide semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor. It is understood that the photographing device 123 may also be fixed directly to the UAV 110, in which case the gimbal 120 can be omitted.
The display device 130 is located at the ground end of the unmanned flight system 100, can communicate wirelessly with the UAV 110, and can be used to display the attitude information of the UAV 110. Images captured by the photographing device 123 can also be displayed on the display device 130. It should be understood that the display device 130 may be an independent device or may be integrated in the control terminal 140.
The control terminal 140 is located at the ground end of the unmanned flight system 100, can communicate wirelessly with the UAV 110, and is used to remotely operate the UAV 110.
It should be understood that the above naming of the components of the unmanned flight system is for identification purposes only and should not be understood as limiting the embodiments of this application.
The problems in the related art are described below. When the payload carried on a gimbal is a camera, camera calling is implemented through the gimbal's device identifier deviceID, with each gimbal configured with two deviceIDs. The specific process is: when the UAV system is initialized, a table of correspondences between gimbals and camera indexes is configured. When a single-lens camera needs to be called, the camera call instruction input by the user is obtained, the camera index and the call content are extracted from it, the gimbal identifier is determined from the camera index and the above correspondence, the gimbal's first deviceID is used as the camera's deviceID, and a machine call instruction is generated from the deviceID and the call content, completing the camera call.
The above calling method only targets single-lens cameras. When the payload is a dual-lens camera, the camera is split by lens count into two single-lens cameras, one of which is called in the same way as above, which is not repeated here. For the other single-lens camera, a camera index is configured for it and a correspondence between that camera index and a deviceID is built. When that single-lens camera needs to be called, the camera call instruction input by the user is obtained, the camera index and the call content are extracted, the deviceID is determined from the camera index and the correspondence, and a machine call instruction is generated from the deviceID and the call content, completing the camera call.
As shown in FIG. 2, the camera calling method in the related art is illustrated with the multi-lens camera ZENMUSE XT2 as an example. To implement more complex functions, the software layer abstracts it into two single-lens cameras: a visible-light-lens camera and a thermal-imaging-lens camera, each configured with its own camera index. The camera index of the visible-light-lens camera is bound to the gimbal on which the XT2 camera sits, and the gimbal's default device identifier is assigned to that camera index. The camera index of the thermal-imaging-lens camera is defined as 2 and is bound directly to a device identifier.
However, when the attributes common to the lenses of the dual-lens camera need to be called, the user must input a camera call instruction for each single-lens camera separately, and a machine call instruction for each lens is generated from each camera call instruction, making the calling process cumbersome. When the gimbals of a UAV carry multiple identical multi-lens cameras, multiple thermal-imaging lenses would use the same camera index, so the specific thermal-imaging lens cannot be determined from the camera index. In addition, the limit on each gimbal's deviceIDs limits the number of lenses a gimbal can carry.
This application provides a camera calling method, device, and storage medium. As shown in FIG. 3, this application introduces the concept of a lens and no longer splits a multi-lens camera into multiple single-lens cameras. A multi-lens camera is configured with a two-level index: the first level is the camera index and the second level is the lens index. The camera index is used to call the attributes common to multiple lenses, that is, camera-level calls; the lens index, used together with the camera index, calls the private attributes of a single lens, that is, lens-level calls. Calling methods for the attributes common to the multiple lenses are configured, and each lens is configured with calling methods for its private attributes. A call request carries a calling attribute and a calling index. To call a common attribute, the calling index is configured as the camera index and the calling attribute is the common attribute; the camera's main lens receives the call instruction generated from the request and distributes it to each lens, so the common attributes of multiple lenses are called with a single instruction. To call a private attribute, the calling index is configured as the camera index plus the lens index, the calling attribute is the private attribute, and the corresponding lens receives the call instruction generated from the request. By letting multiple lenses reuse the same device identifier and using the calling attribute to pick the corresponding lens among those sharing that identifier, the number of lenses a gimbal can carry is expanded. And by determining the device identifier jointly from the camera index and the lens index, the two-level index structure matches the two-level camera-lens structure, which facilitates camera calling.
The embodiments of this application apply to camera calling scenarios. FIG. 4 is a schematic diagram of an application scenario provided by an embodiment of this application, showing a UAV 201 and the UAV's control terminal 202. The control terminal 202 of the UAV 201 may be one or more of a remote controller, a smartphone, a desktop computer, a laptop computer, or a wearable device (watch, wristband). The embodiments of this application are illustrated with the control terminal 202 being a remote controller 2021 and a terminal device 2022.
Some embodiments of this application are described in detail below with reference to the accompanying drawings. Where no conflict arises, the following embodiments and the features in the embodiments may be combined with each other.
FIG. 5 is a schematic flowchart of the camera calling method provided by an embodiment of this application. As shown in FIG. 5, the camera calling method provided by this embodiment can be applied to a control terminal, and the method includes:
S301: The control terminal obtains a call request for calling a camera.
Here, the camera carried on the gimbal is a multi-lens camera that includes one main lens and multiple auxiliary lenses. During UAV operation, each main lens can work independently; for example, a single main lens can operate on its own, or a main lens and multiple auxiliary lenses can operate cooperatively.
A user can generate a call request by operating an application (APP) or in other ways, without limitation here. The call request includes a calling attribute and a calling index. The calling attribute is the specific call content for a lens. For example, a call request for setting the lens resolution has a calling attribute that includes the specific method for setting resolution; a call request for setting the lens photographing mode has a calling attribute that includes the specific method for setting the photographing mode, including adjustments of parameters such as exposure and focal length. Calling attributes are further divided by type into common attributes and private attributes. A common attribute is one that every lens has, for example the lens photographing mode. A private attribute belongs to an individual lens, for example the lens resolution.
The calling index is used to determine the camera's device identifier. In a UAV system, cameras are mounted on gimbals, each gimbal is assigned device identifiers, and camera calling is implemented through the device identifier. For UAV control security, usually only the camera's index is exposed, and the device identifier is determined from the camera index. When the calling attribute is a common attribute, the calling index includes only the camera index. When the calling attribute is a private attribute, the calling index includes the camera index and the lens index; the lens index cannot be used alone and must be used together with the camera index.
S302: The control terminal determines the request type of the call request according to the calling attribute.
Specifically, it is determined whether the calling attribute is an attribute common to all lenses; if so, the request type is set to the first type, and if not, to the second type. The first type calls the attributes common to all lenses; the second type calls the private attributes of each lens.
S303: The control terminal generates a camera call instruction according to the request type, the calling attribute, and the calling index.
When the calling attribute is a common attribute, the calling index includes only the camera index, and the camera call instruction is generated from the camera index and the common attribute. When the calling attribute is a private attribute, the calling index includes the camera index and the lens index, and the camera call instruction is generated from the camera index, the lens index, and the private attribute.
In the camera calling method provided by this embodiment of the application, a calling attribute is set in the call request: a request that calls a common attribute is configured with the common calling attribute, and a request that calls a private attribute is configured with the private calling attribute, so a single call request suffices to call the common attributes of multiple lenses, simplifying the calling process. In addition, a lens index is introduced to form a two-level index with the camera index, and common and private attributes are configured for the multiple lenses, so that the index structure and attributes better match the camera's actual physical structure. A specific lens is located from the camera index and the lens index, so multiple identical lenses never reuse the same index, which makes it possible to install multiple identical multi-lens cameras on a UAV. Associating the gimbal through the camera index, and then the device identifier directly through the gimbal, or through the gimbal and the lens index, is convenient for the user.
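Steps S301 to S303 can be sketched as follows; the attribute names and the request dictionary shape are invented here for illustration and are not part of the application:

```python
# Hypothetical sketch of building a call request (S301) and deciding its
# type (S302): common attributes index only the camera, private attributes
# index the camera and the lens together.
COMMON_ATTRIBUTES = {"photo_mode"}        # example attribute every lens has
PRIVATE_ATTRIBUTES = {"resolution"}       # example attribute of one lens

def request_type(calling_attribute):
    """S302: first type for a common attribute, second type otherwise."""
    return "first" if calling_attribute in COMMON_ATTRIBUTES else "second"

def make_call_request(calling_attribute, camera_index, lens_index=None):
    """S301: the calling index depends on the attribute kind."""
    if request_type(calling_attribute) == "first":
        return {"attribute": calling_attribute, "index": (camera_index,)}
    return {"attribute": calling_attribute, "index": (camera_index, lens_index)}
```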
Another camera calling method provided by an embodiment of this application is described below. The camera calling method provided by this embodiment can be applied to a control terminal, and the method includes:
S401: The control terminal obtains a call request for calling a camera.
This step has been described in detail in S301 and is not repeated here.
S402: The control terminal determines the request type of the call request according to the calling attribute.
This step has been described in detail in S302 and is not repeated here.
S403: The control terminal generates a camera call instruction according to the request type, the calling attribute, and the calling index.
This step has been described in detail in S303; the overlapping parts are not repeated here. The process of generating the camera call instruction is detailed below with reference to FIG. 6, taking the first and second request types as examples.
If the request type is the first type, the calling index includes only the camera index, and the calling attribute is an attribute common to all lenses. The camera call instruction is generated from the camera index and the common attribute: the correspondence between camera indexes and gimbal identifiers is retrieved from the configuration table, the gimbal carrying the camera is determined from the camera index and that correspondence, and the camera call instruction is then generated from the gimbal and the calling attribute.
Generating the camera call instruction from the gimbal and the calling attribute specifically includes: using the gimbal's first device identifier as the device identifier associated with the camera index, and generating the camera call instruction from the device identifier associated with the camera index and the calling attribute. The camera call instruction can be generated from the device identifier and the calling attribute by a known algorithm, which is not detailed here.
If the request type is the second type, the calling index includes the camera index and the lens index, and the calling attribute is a private attribute of an individual lens. The camera call instruction is generated from the camera index, the lens index, and the private attribute: the device identifier is first determined from the camera index and the lens index, and the camera call instruction is then generated from the device identifier and the calling attribute.
Determining the device identifier from the camera index and the lens index specifically includes: retrieving the correspondence between camera indexes and gimbal identifiers from the configuration table, determining the gimbal carrying the camera from the camera index and that correspondence, and then determining the device identifier matching the lens index from the gimbal and the lens index.
Determining the device identifier matching the lens index from the gimbal and the lens index specifically includes: retrieving from the configuration table the relationship tables that express the correspondence between device identifiers and lens indexes, and determining the relationship table associated with the gimbal. After the relationship table is determined, the device identifier matching the lens index is determined from the correspondence between device identifiers and lens indexes.
To make it easy for the UAV to parse the camera call instruction, the main lens's lens index and the camera index correspond to the same device identifier. A relationship table expressing the following correspondences between device identifiers and lens indexes is configured: a correspondence between the main lens's lens index and a first device identifier, and a correspondence between the auxiliary lenses' lens indexes and a second device identifier. There is also a correspondence between the camera index and the first device identifier. That is, when the request type is the first type, the gimbal associated with the camera index is determined from the correspondence between camera indexes and gimbals, and the device identifier is then determined from that gimbal to be the first device identifier.
S404: The control terminal sends the camera call instruction to the UAV.
After generating the camera call instruction, the control terminal sends it to the UAV it controls.
In the camera calling method provided by this embodiment of the application, the camera's lenses are divided into a main lens and auxiliary lenses: the main lens's lens index and the camera index reuse the first device identifier, the auxiliary lenses' lens indexes reuse the same second device identifier, and the calling attribute then distinguishes the lenses that reuse the same device identifier, so the number of lenses is not limited by the device identifiers.
The camera calling method provided by the embodiments of this application is described below with emphasis on the control terminal's software architecture. First the software architecture of the control terminal is described. As shown in FIG. 7, the architecture of the mobile-terminal software development kit used to control the UAV includes: the application layer (APP), the SDK layer, the SharedLib layer, the Midware layer, and the firmware layer.
The SDK layer provides the external interfaces used by the APP. The SharedLib layer is an abstraction of all the components on the UAV. Each component is configured with a component object DJISDKCacheHWAbstraction, where HW stands for Hardware, and each component object contains the methods for setting the component's attributes. The attributes of a component can be its inherent attributes and the attributes controlling its operation. Each component object also contains an index, Index, used to determine which specific component is meant when there are multiple identical components.
Taking a multi-lens camera carried on a gimbal as an example, each camera can be abstracted into a camera object CameraAbstraction. CameraAbstraction includes only the camera index Index and the lens index SubIndex of each lens. Combining the camera index Index with the lens index SubIndex identifies the called lens among the lenses of multiple cameras.
Assume the three gimbals of a UAV carry three cameras in one-to-one correspondence, each camera having three lenses. Then the camera index and lens index of each camera are as shown in Table 1 below. When the user needs to call the first lens of the first camera, entering a camera index of 0 and a lens index of 0 in the camera object CameraAbstraction completes the call.
Table 1: camera index and lens index table
Camera index 0 1 2
Lens index 0, 1, 2 0, 1, 2 0, 1, 2
The camera index Index is bound to the gimbal; that is, there is a correspondence between the camera index Index and the gimbal. In this embodiment that correspondence is shown in Table 2 below.
Table 2: correspondence between gimbals and camera indexes
Gimbal number Gimbal 1 Gimbal 2 Gimbal 3
Camera index 0 1 2
A camera index Index of 0 means the camera is carried on gimbal No. 1; an Index of 1, on gimbal No. 2; an Index of 2, on gimbal No. 3. When developing an APP, the user can conveniently determine the camera index from the gimbal's physical position and then operate the corresponding camera.
The MidWare layer transmits data packets to the firmware layer, and the firmware layer forwards the data packets to the UAV, because the UAV internally uses DeviceType and DeviceId to uniquely identify its hardware devices. Here,
DeviceType refers to the device type and DeviceId to the device identifier. The device type can be determined from the camera object to be a camera, so the ShareLib layer only needs to determine the camera's device identifier DeviceId from the index in the camera object.
Since the camera is carried on a gimbal, the gimbal's device identifier is usually assigned to the camera. A correspondence between each lens index SubIndex and the gimbal's device identifier deviceID is configured so that ShareLib can determine the device identifier from the index. To enable calling the attributes common to multiple lenses, the camera index is also configured with a corresponding device identifier deviceID.
When the number of device identifiers a gimbal reserves for the camera is limited, for example when each gimbal reserves only two DeviceIds for the camera, the camera index and the main lens index can reuse one device identifier, and the other auxiliary lenses can reuse the other. Table 3 lists the device identifiers of each gimbal; the correspondence between lens indexes and device identifiers formed from the identifiers in Table 3 is shown in Table 4 below. With the camera index and the main lens index reusing the same device identifier, the device identifier obtained from the camera index is shown in Table 5. Combining Tables 4 and 5, the device identifiers generated when calling the common attributes and private attributes of the 3-lens camera carried on the gimbal numbered Gimbal 1 are shown in FIG. 8.
Table 3: device identifiers of the gimbals
Gimbal number Gimbal 1 Gimbal 2 Gimbal 3
Device identifier 0100 0102 0104
  0101 0103 0105
Table 4: correspondence between lens indexes and device identifiers
Gimbal number Gimbal 1 Gimbal 2 Gimbal 3
Main lens (lens index 0) 0100 0102 0104
Auxiliary lenses (lens indexes 1, 2) 0101 0103 0105
Table 5: device identifiers determined from the camera index
Camera index 0 1 2
Device identifier 0100 0102 0104
The architecture of the camera objects is described below, taking the XT2 camera as an example. As shown in FIG. 9, in the SharedLib layer the XT2 camera object DJICameraXT2CameraAbstraction contains the visible-light lens object DJILensXT2VisibleLensAbstraction and the infrared lens object DJILensXT2ThermalLensAbstraction.
The XT2 camera object DJICameraXT2CameraAbstraction inherits from the multi-lens camera object DJIMultiLensCameraAbstraction, which in turn inherits from the component object DJISDKCacheHWAbstraction.
The visible-light lens object DJILensXT2VisibleLensAbstraction inherits from the lens object DJILensAbstraction, which inherits from the sub-component object DJISubComponentHWAbstraction.
The infrared lens object DJILensXT2ThermalLensAbstraction inherits from the lens object DJILensAbstraction, which inherits from the sub-component object DJISubComponentHWAbstraction.
The sub-component object DJISubComponentHWAbstraction inherits from the component object DJISDKCacheHWAbstraction, but neither of the two carries inherited camera or lens functionality. Therefore, to call the attributes common to multiple lenses, the XT2 camera object can be called; to call the private attributes of a single lens, the visible-light lens object or the infrared lens object can be called. When multiple lenses reuse the same device identifier, the calling attribute inside the object distinguishes which lens is actually being called.
Correspondingly, as shown in FIG. 10, in the SDK layer the XT2 camera object XT2Camera contains the XT2 visible-light lens object XT2VisbleLens and the XT2 infrared lens object XT2ThermalLens, matching the physical structure of the XT2 camera.
The XT2 camera object XT2Camera inherits from the base camera object BaseCamera, which in turn inherits from the camera object Camera. The XT2 camera's visible-light lens object XT2VisbleLens and infrared lens object XT2ThermalLens both inherit from the base lens BaseLens, which in turn inherits from the lens Lens.
Yet another camera calling method provided by an embodiment of this application is described below with reference to the control terminal's software architecture and FIG. 11. It can be applied to a control terminal, and the method includes:
S501: The control terminal obtains a call request for calling a camera.
This step has been described in detail in S301; the overlapping parts are not repeated here. The description below is combined with the control terminal's specific software architecture and the multi-lens camera ZENMUSE XT2. The call request is read from the application APP. If the call request is of the first type, which calls common attributes, the call request is the call interface of the XT2 camera object XT2Camera, and the calling index in XT2Camera is the camera index Index; to call the camera on gimbal No. 1, the camera index Index is 0. If the call request is of the second type, which calls private attributes, the call request may be the call interface of the XT2 camera's infrared lens object XT2ThermalLens, whose lens index SubIndex is 0, or the call interface of the XT2 camera's visible-light lens object XT2VisbleLens, whose lens index SubIndex is 1. The SDK layer parses the call interface, generates a DJIKey, and transfers the DJIKey to the Sharelib layer.
S502: The control terminal determines the request type of the call request according to the calling attribute.
This step has been described in detail in S302; the overlapping parts are not repeated here. In terms of the control terminal's specific software architecture: after receiving the DJIKey, the Sharelib layer parses it to generate the calling attribute FeatureID and the calling index, and determines from the FeatureID whether the request type is a common-attribute call or a private-attribute call. If the FeatureID is a common attribute, the calling index is the camera index; if it is a private attribute, the calling index is the camera index and the lens index.
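A minimal sketch of this Sharelib-layer parsing step; the string encoding of the DJIKey used here ("FeatureID/cameraIndex[/lensIndex]") is an assumption made for illustration, not the actual DJI SDK format:

```python
# Hypothetical DJIKey parse: split out the calling attribute FeatureID and
# the calling index; one index component means a common-attribute call,
# two components mean a private-attribute call.
def parse_djikey(djikey: str):
    parts = djikey.split("/")
    feature_id = parts[0]
    indices = tuple(int(p) for p in parts[1:])
    kind = "common" if len(indices) == 1 else "private"
    return feature_id, indices, kind
```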
S503: The control terminal generates a camera call instruction according to the request type, the calling attribute, and the calling index.
This step has been described in detail in S303; the overlapping parts are not repeated here. The description below is in terms of the control terminal's specific software architecture.
The Midware layer generates the camera call instruction from the request type, the calling attribute, and the calling index. If the request type is a common-attribute call, the Midware layer determines the device identifier from the camera index and then generates the camera call instruction from the device identifier and the calling attribute. If the request type is a private-attribute call, the Midware layer determines the device identifier from the camera index and the lens index and then generates the camera call instruction from the device identifier and the calling attribute.
The process of determining the device identifier is described below using the correspondences in Tables 1 to 4 as an example. When the calling attribute is a common attribute, the calling index is the camera index, here 0. The gimbal is determined from the camera index to be Gimbal 1. The first device identifier of gimbal Gimbal 1 is 0100, so the device identifier corresponding to camera index 0 is determined to be 0100.
When the calling attribute is a private attribute, the calling index is the camera index and the lens index. If the camera index is 0 and the lens index is 0, the gimbal is determined from the camera index to be Gimbal 1. From the relationship between lens indexes and device identifiers for gimbal Gimbal 1, the device identifier is determined to be 0100, the same as the one corresponding to the camera index. Correspondingly, if the camera index is 0 and the lens index is 1, the device identifier is determined to be 0101.
In the camera calling method provided by this embodiment of the application, a calling attribute is set in the call request: a request that calls a common attribute is configured with the common calling attribute, and a request that calls a private attribute is configured with the private calling attribute, so a single call request suffices to call the common attributes of multiple lenses, simplifying the calling process.
FIG. 12 is a schematic flowchart of a camera calling method provided by an embodiment of this application. As shown in FIG. 12, yet another camera calling method provided by an embodiment of this application can be applied to a camera calling device, and the method includes:
S601: The camera calling device receives a camera call instruction from the control terminal.
The camera call instruction is one generated by the camera calling method provided in the foregoing embodiments.
S602: The camera calling device parses the camera call instruction to generate a device identifier and a calling attribute.
The UAV internally stores a parsing method that can recover the device identifier and the calling attribute according to the algorithm the control terminal used to generate the camera instruction.
S603: The camera calling device determines the calling lens according to the calling attribute and the device identifier.
Determining the calling lens according to the calling attribute and the device identifier specifically includes: determining at least one candidate lens associated with the device identifier, and determining the calling lens from the candidate lenses according to the calling attribute.
Determining at least one candidate lens associated with the device identifier specifically includes: determining the associated gimbal matching the device identifier according to the relationship between device identifiers and gimbal identifiers; determining the associated camera matching the associated gimbal according to the preset relationship between gimbals and camera indexes; and selecting candidate lenses from the lenses of the associated camera according to the device identifier.
Determining the calling lens from the candidate lenses according to the calling attribute specifically includes: if the calling attribute is an attribute common to all lenses, the calling lenses are all the lenses of the associated camera; if the calling attribute is a private attribute of an individual lens, the calling lens is selected from the candidate lenses according to the relationship between calling attributes and lenses.
Different types of auxiliary lenses differ in parameters or operating methods, so the calling attributes of the auxiliary lenses also differ; the specific calling lens can be selected, according to the correspondence between calling attributes and auxiliary lenses, from the candidate lenses sharing the same device identifier.
When the number of device identifiers a gimbal reserves for the camera is limited, for example when each gimbal reserves only two DeviceIds for the camera, the camera index and the main lens index can reuse one device identifier, and the other auxiliary lenses can reuse the other. Selecting candidate lenses from the lenses of the associated camera according to the device identifier then specifically includes: if the device identifier is the first device identifier, the candidate lenses are all the lenses of the associated camera; if the device identifier is the second device identifier, the candidate lenses are the auxiliary lenses of the associated camera.
S604: The camera calling device sets the attributes of the calling lens according to the calling attribute.
After the calling lens is determined, the lens is set according to the specific inherent parameters or running process in the calling attribute, completing the camera calling process.
When the device identifier corresponds to the main lens and the calling attribute is a common attribute, the main lens distributes the calling attribute to the other auxiliary lenses, and each auxiliary lens performs its parameter settings or running-process settings according to the calling attribute. When the device identifier corresponds to the main lens and the calling attribute is a private attribute, the main lens performs its parameter settings or running-process settings according to the calling attribute.
When the device identifier corresponds to an auxiliary lens and the calling attribute is a private attribute, the calling lens is the auxiliary lens corresponding to that private attribute, and it performs its parameter settings or running-process settings according to the calling attribute.
In the camera calling method provided by this embodiment of the application, a calling attribute is set in the call request: a request that calls a common attribute is configured with the common calling attribute, and a request that calls a private attribute is configured with the private calling attribute, so a single call request suffices to call the common attributes of multiple lenses, simplifying the calling process. The camera's lenses are divided into a main lens and auxiliary lenses: the main lens's lens index and the camera index reuse the first device identifier, the auxiliary lenses' lens indexes reuse the same second device identifier, and the calling attribute then distinguishes the lenses that reuse the same device identifier, so the number of lenses is not limited by the device identifiers.
FIG. 13 is a schematic structural diagram of a control terminal provided by an embodiment of this application. As shown in FIG. 13, the control terminal provided by the embodiment of this application includes:
an obtaining unit 701, configured to obtain a call request for calling a camera, where the camera includes one main lens and multiple auxiliary lenses, and the call request includes a calling attribute and a calling index;
a determining unit 702, configured to determine the request type of the call request according to the calling attribute, where the request type includes a first type that calls the attributes common to all lenses and a second type that calls the private attributes of each lens;
a generating unit 703, configured to generate a camera call instruction according to the request type, the calling index, and the calling attribute.
Optionally, the generating unit 703 is specifically configured to:
when the request type is the first type, the calling index being the camera index,
determine the gimbal carrying the camera according to the camera index and the correspondence between camera indexes and gimbal identifiers;
generate the camera call instruction according to the gimbal and the calling attribute.
Optionally, the generating unit 703 is specifically configured to:
use the gimbal's first device identifier as the device identifier associated with the camera index;
generate the camera call instruction according to the device identifier associated with the camera index and the calling attribute.
Optionally, the calling attribute is an attribute common to all lenses.
Optionally, the generating unit 703 is specifically configured to:
when the request type is the second type, the calling index including the camera index and the lens index,
determine the device identifier according to the camera index and the lens index;
generate the camera call instruction according to the device identifier and the calling attribute.
Optionally, the generating unit 703 is specifically configured to:
determine the gimbal carrying the camera according to the correspondence between camera indexes and gimbals;
determine the device identifier matching the lens index according to the gimbal and the lens index.
Optionally, the generating unit 703 is specifically configured to:
determine the relationship table associated with the gimbal, the relationship table being used to indicate the correspondence between device identifiers and lens indexes;
determine the device identifier matching the lens index according to the correspondence between device identifiers and lens indexes.
Optionally, the correspondence between device identifiers and lens indexes specifically includes:
a correspondence between the main lens's lens index and the first device identifier;
a correspondence between the auxiliary lenses' lens indexes and the second device identifier.
Optionally, the calling attribute is a private attribute of each lens.
Optionally, the determining unit 702 is specifically configured to:
determine whether the calling attribute is an attribute common to all lenses; if so, set the request type to the first type, and if not, set the request type to the second type.
Optionally, the control terminal further includes a sending unit 704, and the sending unit 704 is configured to:
send the camera call instruction to the UAV.
FIG. 14 is a schematic structural diagram of a camera calling device provided by an embodiment of this application. As shown in FIG. 14, the camera calling device provided by the embodiment of this application includes:
a receiving unit 801, configured to receive a camera call instruction from the control terminal;
a generating unit 802, configured to parse the camera call instruction to generate a device identifier and a calling attribute;
a determining unit 803, configured to determine the calling lens according to the calling attribute and the device identifier;
a setting unit 804, configured to set the attributes of the calling lens according to the calling attribute.
Optionally, the determining unit 803 is specifically configured to:
determine at least one candidate lens associated with the device identifier;
determine the calling lens from the candidate lenses according to the calling attribute.
Optionally, the determining unit 803 is specifically configured to:
determine the associated gimbal matching the device identifier according to the relationship between device identifiers and gimbal identifiers;
determine the associated camera matching the associated gimbal according to the preset relationship between gimbals and camera indexes;
select candidate lenses from the lenses of the associated camera according to the device identifier.
Optionally, the determining unit 803 is specifically configured to:
if the device identifier is the first device identifier, the candidate lenses are all the lenses of the associated camera;
if the device identifier is the second device identifier, the candidate lenses are the auxiliary lenses of the associated camera.
Optionally, the determining unit 803 is specifically configured to:
if the calling attribute is an attribute common to all lenses, the calling lenses are all the lenses of the associated camera;
if the calling attribute is a private attribute of an individual lens, select the calling lens from the candidate lenses according to the relationship between calling attributes and lenses.
FIG. 15 is a schematic structural diagram of an electronic device shown in an embodiment of this application. As shown in FIG. 15, the electronic device 900 provided in this embodiment includes: a transmitter 901, a receiver 902, a memory 903, and a processor 904.
The transmitter 901 is configured to send instructions and data;
the receiver 902 is configured to receive instructions and data;
the memory 903 is configured to store computer-executable instructions;
the processor 904 is configured to execute the computer-executable instructions stored in the memory to implement the steps of the camera calling method in the foregoing embodiments. For details, refer to the relevant description in the foregoing camera calling method embodiments.
Optionally, the memory 903 may be independent or integrated with the processor 904.
When the memory 903 is set independently, the processing device further includes a bus for connecting the memory 903 and the processor 904.
FIG. 16 is a schematic structural diagram of a camera calling system provided by an embodiment of this application. As shown in FIG. 16, the camera calling system 1000 of this embodiment may include: a camera calling device 1002 and/or a control terminal 1003.
The camera calling device 1002 may adopt the structure of the embodiment shown in FIG. 15 and, correspondingly, may execute the technical solutions of the UAV in the above method embodiments; its implementation principle and technical effect are similar and are not repeated here.
The control terminal 1003 may adopt the structure of the embodiment shown in FIG. 15 and, correspondingly, may execute the technical solutions of the control terminal in the above method embodiments; its implementation principle and technical effect are similar and are not repeated here.
It should be noted that the camera calling device 1002 may be provided on the UAV 1001, for example as a part of the UAV 1001. Alternatively, one part of the camera calling device 1002 may be provided on the UAV 1001 and another part on the control terminal 1003.
A person of ordinary skill in the art can understand that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions. The aforementioned program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
Finally, it should be noted that the above embodiments are merely intended to illustrate, rather than limit, the technical solutions of this application. Although this application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (35)

  1. A camera calling method, comprising:
    acquiring a calling request for calling a camera, wherein the camera comprises one primary lens and multiple secondary lenses, and the calling request comprises a calling attribute and a calling index;
    determining a request type of the calling request according to the calling attribute, wherein the request type comprises a first type that calls a shared attribute of all lenses and a second type that calls a private attribute of an individual lens; and
    generating a camera calling instruction according to the request type, the calling attribute and the calling index.
  2. The method according to claim 1, wherein when the request type is the first type, the calling index comprises a camera index, and generating the camera calling instruction according to the request type, the calling attribute and the calling index specifically comprises:
    determining, according to the camera index and a correspondence between camera indices and gimbal identifiers, a gimbal carrying the camera; and
    generating the camera calling instruction according to the gimbal and the calling attribute.
  3. The method according to claim 2, wherein generating the camera calling instruction according to the gimbal and the calling attribute specifically comprises:
    taking a first device identifier of the gimbal as a device identifier associated with the camera index; and
    generating the camera calling instruction according to the device identifier associated with the camera index and the calling attribute.
  4. The method according to claim 2 or 3, wherein the calling attribute is a shared attribute of all lenses.
  5. The method according to claim 1, wherein when the request type is the second type, the calling index comprises a camera index and a lens index, and generating the camera calling instruction according to the request type, the calling attribute and the calling index specifically comprises:
    determining a device identifier according to the camera index and the lens index; and
    generating the camera calling instruction according to the device identifier and the calling attribute.
  6. The method according to claim 5, wherein determining the device identifier according to the camera index and the lens index specifically comprises:
    determining, according to a correspondence between camera indices and gimbals, a gimbal carrying the camera; and
    determining, according to the gimbal and the lens index, a device identifier matching the lens index.
  7. The method according to claim 6, wherein determining, according to the gimbal and the lens index, the device identifier matching the lens index specifically comprises:
    determining a relation table associated with the gimbal, the relation table representing a correspondence between device identifiers and lens indices; and
    determining, according to the correspondence between device identifiers and lens indices, the device identifier matching the lens index.
  8. The method according to claim 7, wherein the correspondence between device identifiers and lens indices specifically comprises:
    a correspondence between a lens index of the primary lens and a first device identifier; and
    a correspondence between lens indices of the secondary lenses and a second device identifier.
  9. The method according to any one of claims 5 to 8, wherein the calling attribute is a private attribute of an individual lens.
  10. The method according to claim 1, wherein determining the request type of the calling request according to the calling attribute specifically comprises:
    determining whether the calling attribute is a shared attribute of all lenses; if so, setting the request type to the first type, and if not, setting the request type to the second type.
  11. The method according to claim 1, wherein after generating the camera calling instruction, the method comprises:
    sending the camera calling instruction to an unmanned aerial vehicle (UAV).
  12. A camera calling method, comprising:
    receiving a camera calling instruction from a control terminal;
    parsing the camera calling instruction to obtain a device identifier and a calling attribute;
    determining a call lens according to the calling attribute and the device identifier; and
    setting an attribute of the call lens according to the calling attribute.
  13. The method according to claim 12, wherein determining the call lens according to the calling attribute and the device identifier specifically comprises:
    determining at least one candidate lens associated with the device identifier; and
    determining the call lens from the candidate lenses according to the calling attribute.
  14. The method according to claim 13, wherein determining the at least one candidate lens associated with the device identifier specifically comprises:
    determining, according to a relationship between device identifiers and gimbal identifiers, an associated gimbal matching the device identifier;
    determining, according to a preset relationship between gimbals and camera indices, an associated camera matching the associated gimbal; and
    selecting candidate lenses from lenses of the associated camera according to the device identifier.
  15. The method according to claim 14, wherein selecting the candidate lenses from the lenses of the associated camera according to the device identifier specifically comprises:
    if the device identifier is a first device identifier, taking all lenses of the associated camera as the candidate lenses; and
    if the device identifier is a second device identifier, taking secondary lenses of the associated camera as the candidate lenses.
  16. The method according to any one of claims 13 to 15, wherein determining the call lens from the candidate lenses according to the calling attribute specifically comprises:
    if the calling attribute is a shared attribute of all lenses, taking all lenses of the associated camera as the call lenses; and
    if the calling attribute is a private attribute of an individual lens, selecting the call lens from the candidate lenses according to a relationship between calling attributes and lenses.
  17. A control terminal, comprising:
    an acquiring unit, configured to acquire a calling request for calling a camera, wherein the camera comprises one primary lens and multiple secondary lenses, and the calling request comprises a calling attribute and a calling index;
    a determining unit, configured to determine a request type of the calling request according to the calling attribute, wherein the request type is one of a first type that calls a shared attribute of all lenses and a second type that calls a private attribute of an individual lens; and
    a generating unit, configured to generate a camera calling instruction according to the request type, the calling index and the calling attribute.
  18. The terminal according to claim 17, wherein the generating unit is specifically configured to:
    when the request type is the first type, the calling index being a camera index,
    determine, according to the camera index and a correspondence between camera indices and gimbal identifiers, a gimbal carrying the camera; and
    generate the camera calling instruction according to the gimbal and the calling attribute.
  19. The terminal according to claim 18, wherein the generating unit is specifically configured to:
    take a first device identifier of the gimbal as a device identifier associated with the camera index; and
    generate the camera calling instruction according to the device identifier associated with the camera index and the calling attribute.
  20. The terminal according to claim 18 or 19, wherein the calling attribute is a shared attribute of all lenses.
  21. The terminal according to claim 17, wherein the generating unit is specifically configured to:
    when the request type is the second type, the calling index comprising a camera index and a lens index,
    determine a device identifier according to the camera index and the lens index; and
    generate the camera calling instruction according to the device identifier and the calling attribute.
  22. The terminal according to claim 21, wherein the generating unit is further configured to:
    determine, according to a correspondence between camera indices and gimbals, a gimbal carrying the camera; and
    determine, according to the gimbal and the lens index, a device identifier matching the lens index.
  23. The terminal according to claim 22, wherein the generating unit is further configured to:
    determine a relation table associated with the gimbal, the relation table representing a correspondence between device identifiers and lens indices; and
    determine, according to the correspondence between device identifiers and lens indices, the device identifier matching the lens index.
  24. The terminal according to claim 23, wherein the correspondence between device identifiers and lens indices specifically comprises:
    a correspondence between a lens index of the primary lens and a first device identifier; and
    a correspondence between lens indices of the secondary lenses and a second device identifier.
  25. The terminal according to any one of claims 21 to 24, wherein the calling attribute is a private attribute of an individual lens.
  26. The terminal according to claim 17, wherein the determining unit is specifically configured to:
    determine whether the calling attribute is a shared attribute of all lenses; if so, set the request type to the first type, and if not, set the request type to the second type.
  27. The terminal according to claim 17, wherein the control terminal further comprises a sending unit, and the sending unit is configured to:
    send the camera calling instruction to an unmanned aerial vehicle (UAV).
  28. A camera calling device, comprising:
    a receiving unit, configured to receive a camera calling instruction from a control terminal;
    a generating unit, configured to parse the camera calling instruction to obtain a device identifier and a calling attribute;
    a determining unit, configured to determine a call lens according to the calling attribute and the device identifier; and
    a setting unit, configured to set an attribute of the call lens according to the calling attribute.
  29. The device according to claim 28, wherein the determining unit is specifically configured to:
    determine at least one candidate lens associated with the device identifier; and
    determine the call lens from the candidate lenses according to the calling attribute.
  30. The device according to claim 29, wherein the determining unit is specifically configured to:
    determine, according to a relationship between device identifiers and gimbal identifiers, an associated gimbal matching the device identifier;
    determine, according to a preset relationship between gimbals and camera indices, an associated camera matching the associated gimbal; and
    select candidate lenses from lenses of the associated camera according to the device identifier.
  31. The device according to claim 30, wherein the determining unit is specifically configured to:
    if the device identifier is a first device identifier, take all lenses of the associated camera as the candidate lenses; and
    if the device identifier is a second device identifier, take secondary lenses of the associated camera as the candidate lenses.
  32. The device according to any one of claims 29 to 31, wherein the determining unit is specifically configured to:
    if the calling attribute is a shared attribute of all lenses, take all lenses of the associated camera as the call lenses; and
    if the calling attribute is a private attribute of an individual lens, select the call lens from the candidate lenses according to a relationship between calling attributes and lenses.
  33. A camera calling system, comprising:
    the control terminal according to any one of claims 17 to 27; and/or
    the camera calling device according to any one of claims 28 to 32.
  34. A computer-readable storage medium, comprising instructions which, when run on a computer, cause the computer to execute the method according to any one of claims 1 to 16.
  35. A computer program product comprising instructions which, when run on a computer, cause the computer to execute the method according to any one of claims 1 to 16.
PCT/CN2020/088297 2020-04-30 2020-04-30 Camera calling method and device WO2021217584A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/088297 WO2021217584A1 (zh) 2020-04-30 2020-04-30 Camera calling method and device


Publications (1)

Publication Number Publication Date
WO2021217584A1 (zh)

Family

ID=78331674

Country Status (1)

Country Link
WO (1) WO2021217584A1 (zh)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150229889A1 * 2014-02-13 2015-08-13 Semiconductor Components Industries, Llc Adaptive image sensor systems and methods
US9185286B2 * 2013-02-21 2015-11-10 Samsung Electronics Co., Ltd. Combining effective images in electronic device having a plurality of cameras
CN107544298A (zh) * 2017-06-27 2018-01-05 New H3C Cloud Computing Technologies Co., Ltd. Camera calling method and apparatus
CN108702489A (zh) * 2016-02-19 2018-10-23 Samsung Electronics Co., Ltd. Electronic device including multiple cameras and method of operating the same
CN109644229A (zh) * 2016-08-31 2019-04-16 Samsung Electronics Co., Ltd. Method for controlling camera and electronic device therefor
CN109756763A (zh) * 2017-11-03 2019-05-14 Samsung Electronics Co., Ltd. Electronic device for processing images based on priority and operating method thereof
CN110072070A (zh) * 2019-03-18 2019-07-30 Huawei Technologies Co., Ltd. Multi-channel video recording method and device
CN110602408A (zh) * 2019-09-29 2019-12-20 Lenovo (Beijing) Co., Ltd. Electronic device and photographing method performed thereby
CN110691198A (zh) * 2018-07-05 2020-01-14 Hangzhou Hikvision Digital Technology Co., Ltd. Infrared lamp control method, apparatus and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20933087; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20933087; Country of ref document: EP; Kind code of ref document: A1)