WO2022061508A1 - 拍摄控制方法、装置、系统及存储介质 (Shooting control method, device, system and storage medium) - Google Patents

拍摄控制方法、装置、系统及存储介质 (Shooting control method, device, system and storage medium)

Info

Publication number
WO2022061508A1
Authority
WO
WIPO (PCT)
Prior art keywords
shooting
photographing
target
devices
wireless positioning
Prior art date
Application number
PCT/CN2020/116793
Other languages
English (en)
French (fr)
Inventor
彭玄
龚明
王焱
曾文琪
赵巍
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/116793
Publication of WO2022061508A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to the technical field of control, and in particular, to a shooting control method, device, system and storage medium.
  • the photographing device can be carried on a movable platform (such as a drone or a hand-held gimbal) or a fixed platform (such as a mounting pole) to capture images of objects (such as people or vehicles) in the environment, and the posture of the photographing device can be adjusted according to the position of the target object in the image, so that the photographing device can track and photograph the target photographing object.
  • the embodiments of the present application provide a shooting control method, device, system, and storage medium, which can efficiently and accurately realize tracking shooting of a target shooting object by multiple shooting devices.
  • a first aspect of the embodiments of the present application provides a photographing control method, which is applied to a photographing control system, where the photographing control system includes a plurality of photographing devices for photographing an environment and a plurality of wireless positioning base stations set in the environment, and a first wireless positioning tag is set on the target shooting object in the environment, wherein the method includes:
  • the relative orientation between the multiple shooting devices and the target shooting object is determined according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations;
  • the shooting postures of the plurality of shooting devices are controlled according to the relative orientation, so as to track and shoot the target shooting object.
  • a second aspect of an embodiment of the present application provides a photographing control device.
  • the photographing control device is applied to a photographing control system.
  • the photographing control system includes a plurality of photographing devices for photographing an environment.
  • a plurality of wireless positioning base stations are set in the environment, and a first wireless positioning tag is set on the target shooting object in the environment; the shooting control device includes a memory and a processor, wherein,
  • the memory for storing program codes
  • the processor calls the program code in the memory, and when the program code is executed, is used to perform the following operations:
  • the relative orientation between the multiple shooting devices and the target shooting object is determined according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations;
  • the shooting postures of the plurality of shooting devices are controlled according to the relative orientation, so as to track and shoot the target shooting object.
  • a third aspect of the embodiments of the present application provides a shooting control system, where the shooting control system includes:
  • a plurality of photographing devices for photographing an environment; a plurality of wireless positioning base stations arranged in the environment; a first wireless positioning tag set on the target photographing object in the environment, wherein the first wireless positioning tag and the multiple wireless positioning base stations are wirelessly connected; and a photographing control apparatus, wherein
  • the photographing control apparatus is respectively connected in communication with the plurality of photographing devices and the plurality of wireless positioning base stations.
  • a fourth aspect of the embodiments of the present application provides a computer storage medium, where computer program instructions are stored in the computer storage medium, and when executed by a processor, the computer program instructions are used to execute the shooting control method described in the first aspect.
  • the relative orientation of the multiple shooting devices and the target shooting object can be determined in real time according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations, and the shooting postures of the multiple shooting devices can then be controlled according to the relative orientation determined in real time, so as to track and shoot the target shooting object, thereby achieving efficient and accurate tracking shooting of the target shooting object by multiple shooting devices.
  • FIG. 1 is a schematic structural diagram of a shooting control system according to an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of a shooting control method according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of the positions of a shooting device and a target shooting object according to an embodiment of the present invention
  • FIG. 4 is a schematic flowchart of a shooting control method according to another embodiment of the present application.
  • FIG. 5A is a schematic interface diagram of a shooting control device according to an embodiment of the present application.
  • FIG. 5B is a schematic interface diagram of a photographing control device according to another embodiment of the present application.
  • FIG. 6 is a schematic flowchart of a shooting control method according to another embodiment of the present application.
  • FIG. 7 is a schematic diagram of a scene of a shooting control system according to an embodiment of the present application.
  • FIG. 8 is a schematic flowchart of a shooting control method according to another embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a photographing control device according to an embodiment of the present application.
  • the shooting control method provided by the embodiment of the present application can be applied to a shooting control system.
  • the photographing control system may include multiple photographing devices, multiple wireless positioning base stations, a photographing control device and a first wireless positioning tag.
  • the first wireless positioning tag is set on the target shooting object, and the target shooting object is located in a designated environment, and a plurality of wireless positioning base stations are also set in the environment.
  • a wireless connection is established between the first wireless positioning tag and the plurality of wireless positioning base stations, and the first wireless positioning tag can perform wireless signal communication with the wireless positioning base station through the wireless connection.
  • the number of target photographing objects may be one or more, which is not specifically limited by the embodiments of the present application.
  • a communication connection is established between the photographing control device and each wireless positioning base station.
  • the shooting control device may determine the relative orientation of the multiple shooting devices and the target shooting object according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations during the shooting process of the multiple shooting devices.
  • a plurality of photographing devices are used for photographing the environment, and a communication connection is established between the plurality of photographing devices and the photographing control device; through the communication connection, the photographing control device can control the photographing postures of the plurality of photographing devices according to the above-mentioned relative orientation, so as to track and shoot the target subject.
  • during the shooting process of the multiple shooting devices, the shooting control apparatus may determine the relative orientation between the multiple shooting devices and the target shooting object according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations, and then control the shooting postures of the multiple shooting devices according to the relative orientation, so as to track and shoot the target shooting object.
  • the embodiment of the present application can determine the distance between the shooting device and the target shooting object according to the wireless signal communication between the first wireless positioning tag and multiple wireless positioning base stations even under dark light conditions.
  • the orientation of the shooting device is adjusted according to the relative orientation of the shooting device and the target shooting object, so that the shooting device can track and shoot the target shooting object.
  • the embodiments of the present application can efficiently track and shoot the target shooting object. Secondly, if the target shooting object is mixed with other shooting objects (for example, the shape characteristics of the target shooting object and the other shooting objects are similar and cannot be accurately distinguished by appearance), or is blocked by other objects, the embodiment of the present application can still determine the relative orientation of the multiple shooting devices and the target shooting object in real time according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations, and can accurately track and shoot the target shooting object.
  • the embodiment of the present application can adjust the posture of each photographing device according to the relative orientation of the plurality of photographing devices and the target photographing object, so as to realize the tracking and photographing of the target photographing object by the plurality of photographing devices.
  • Each shooting device performs tracking shooting on the target shooting object, so as to meet the requirement of tracking shooting the target shooting object through multiple different viewing angles or orientations.
  • the first wireless positioning tag may be an Ultra Wide Band (UWB) tag, a Radio Frequency Identification (RFID) tag, or a Wi-Fi location tag.
  • the first wireless positioning tag may be set in a wearable device, and the wearable device may be worn on the target photographing object.
  • if the target photographing object is a movable platform, the first wireless positioning tag can be integrated inside the movable platform, or the first wireless positioning tag can be integrated in a payload, and the payload is carried on a gimbal of the movable platform.
  • the first wireless positioning tag is used to send a signal to the wireless positioning base station, and the signal may include the position information of the target photographing object.
  • multiple wireless location base stations may be placed in various areas of the environment.
  • the wireless positioning base stations may be arranged at equal spacing according to their communication range, and are used to receive the signal sent by the first wireless positioning tag.
  • the wireless positioning base station may be independent of the photographing device, or may be integrated in the photographing device, for example, each photographing device integrates a wireless positioning base station.
  • the photographing device may include a camera, a mobile phone, a video camera, or an Augmented Reality (AR) device.
  • the photographing device may be mounted on a fixed platform (e.g., a mounting pole, a wall, or a ceiling), and based on this, a plurality of photographing devices may be installed in various areas of the environment.
  • the position of the photographing device can also be movable; for example, the photographing device can be mounted on a movable platform (such as a drone, a handheld gimbal, or an unmanned vehicle), or the photographing device can be installed on a slide rail.
  • the photographing device may be installed on the fixed platform or the movable platform through an attitude adjustment mechanism, for example, the attitude adjustment mechanism may be a pan/tilt.
  • the photographing control apparatus may run in a device such as a smart phone, a computer, or a ground station.
  • the movable platform in this embodiment of the present application may include an unmanned aerial vehicle, an unmanned vehicle, or a mobile robot, or the like.
  • the communication connection in this embodiment of the present application may be a wired connection or a wireless connection.
  • the wired connection may include a Universal Serial Bus (Universal Serial Bus, USB) connection, a cable connection, or an optical fiber connection, and the like.
  • Wireless connections can include WIFI connections, data network connections, Bluetooth connections, Narrow Band Internet of Things (NB-IoT) connections, LoRa connections, Global System for Mobile Communication (GSM) connections, Zigbee connections, UWB connections, Code Division Multiple Access (CDMA) connections, and so on.
  • the environment in this embodiment of the present application may be an indoor environment or an outdoor environment, such as sports venues such as football fields and basketball courts, performance venues such as concerts and theaters, or competition venues such as unmanned aerial vehicle arenas and racing venues.
  • FIG. 2 is a schematic flowchart of a shooting control method proposed by an embodiment of the present application. As shown in FIG. 2, the method may include:
  • wireless signal communication may be performed between the first wireless positioning tag and multiple wireless positioning base stations, wherein the wireless signal may include UWB signal, WIFI signal, Bluetooth signal or Zigbee signal.
  • S201 During the shooting process of the multiple shooting devices, the relative orientations of the multiple shooting devices and the target shooting object are determined according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations.
  • the relative orientation of the multiple shooting devices and the target shooting object may be obtained based on the Time of Flight (ToF) technology, the Angle of Arrival (AoA) technology, or the Time Difference of Arrival (TDoA) technology.
  • the wireless positioning base station may receive the first wireless signal from the first wireless positioning tag, and further, the relative orientation of the plurality of photographing devices and the target photographing object may be determined according to the received first wireless signal.
  • the first wireless positioning tag can periodically broadcast the first wireless signal, and each wireless positioning base station covering the first wireless positioning tag can receive the first wireless signal and determine the relative orientation of the plurality of shooting devices and the target shooting object according to the first wireless signal.
  • each wireless positioning base station may periodically broadcast a second wireless signal, and the first wireless positioning tag may receive the second wireless signal to obtain the position of the target photographing object; the relative orientation of the plurality of photographing devices and the target photographing object is then determined according to the received second wireless signal.
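  • As a hedged illustration only (not part of the patent text): if each wireless positioning base station measures a Time of Flight to the first wireless positioning tag, the tag position could be estimated by least squares from the resulting ranges. The coordinate frame, function names, and the use of NumPy below are assumptions for the sketch.

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def ranges_from_tof(tof_seconds):
    """Convert measured times of flight (seconds) into tag-to-base-station distances (metres)."""
    return [t * SPEED_OF_LIGHT for t in tof_seconds]

def locate_tag(base_stations, ranges):
    """Least-squares 2-D position of the tag from ranges to base stations at known positions.

    base_stations: list of (x, y) coordinates of the wireless positioning base stations.
    ranges: measured distances from the tag to those base stations, in the same order.
    The circle equations are linearised against the first base station, so at least three
    base stations are needed.
    """
    (x0, y0), r0 = base_stations[0], ranges[0]
    rows, rhs = [], []
    for (xi, yi), ri in zip(base_stations[1:], ranges[1:]):
        rows.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
        rhs.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    solution, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return float(solution[0]), float(solution[1])  # estimated (x, y) of the tag

# Example: four base stations at the corners of a 40 m x 40 m area, tag near (10, 15).
stations = [(0.0, 0.0), (40.0, 0.0), (40.0, 40.0), (0.0, 40.0)]
true_tag = (10.0, 15.0)
measured = [np.hypot(true_tag[0] - x, true_tag[1] - y) for x, y in stations]
print(locate_tag(stations, measured))  # approximately (10.0, 15.0)
```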
  • the shooting control device may determine the position of the target shooting object according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations, and may also determine the position of each shooting device; the relative orientation of each photographing device and the target photographing object is then determined according to the position of the target photographing object and the positions of the photographing devices.
  • for example, the photographing control device can determine, according to position A and position B, that the relative orientation of the photographing device and the target photographing object is: the target photographing object is located in the northeast direction of the photographing device.
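  • A minimal sketch (with an assumed planar coordinate frame: x east, y north) of how the relative orientation in the example above could be computed once the two positions are known; the function name is hypothetical.

```python
import math

def relative_bearing(device_xy, target_xy):
    """Compass bearing from the shooting device to the target (degrees, 0 = north, clockwise)."""
    dx = target_xy[0] - device_xy[0]  # offset to the east
    dy = target_xy[1] - device_xy[1]  # offset to the north
    return math.degrees(math.atan2(dx, dy)) % 360.0

# A target offset equally to the east and north of the device lies to its northeast (bearing 45 deg).
print(relative_bearing((0.0, 0.0), (10.0, 10.0)))  # 45.0
```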
  • the manner in which the photographing control device determines the position of each photographing device may include any of the following:
  • the shooting control device determines the position of each shooting device according to the wireless signal communication between the second wireless positioning tag set on each shooting device and a plurality of wireless positioning base stations.
  • a second wireless positioning tag may be set on each photographing device, respectively.
  • wireless signal communication can be performed between the second wireless positioning tag and the multiple wireless positioning base stations, and each wireless positioning base station can send the signal from the second wireless positioning tag to the shooting control device, so that the shooting control device can determine the position of each photographing device according to the signal.
  • the manner in which the second wireless positioning tag performs wireless signal communication with the multiple wireless positioning base stations is the same as the manner in which the first wireless positioning tag performs wireless signal communication with the multiple wireless positioning base stations, and the description will not be repeated in this embodiment.
  • the photographing control apparatus can determine the position of the target photographing object in real time according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations.
  • the photographing control device may acquire the position of each photographing device in the preset memory, wherein the relative position of each wireless positioning base station and each photographing device is fixed.
  • the photographing control apparatus may acquire the position of each photographing device, and store the position of each photographing device in the preset memory. Based on this, the photographing control apparatus can directly acquire the positions of each photographing device from the preset memory during photographing control.
  • the position of each shooting device may be input by the user into the shooting control device; or, each shooting device is provided with a positioning module, and after any shooting device obtains its position through the positioning module, it can send the position to the shooting control device; or, each shooting device is provided with a second wireless positioning tag, and the shooting control device can determine the position of each shooting device according to the wireless signal communication between the second wireless positioning tag and the multiple wireless positioning base stations.
  • the shooting control device can receive, from the multiple wireless positioning base stations, the relative orientation of the target shooting object with respect to each wireless positioning base station and the relative orientation of each shooting device with respect to each wireless positioning base station, and then determine the relative orientation of each shooting device and the target shooting object according to the relative orientation of the target shooting object with respect to each wireless positioning base station and the relative orientation of each shooting device with respect to each wireless positioning base station.
  • each wireless positioning base station knows its own position; therefore, the wireless positioning base station can determine the position of the target shooting object according to the wireless signal communication between the wireless positioning base station and the first wireless positioning tag, then determine the relative orientation of the target shooting object and the wireless positioning base station according to the position of the target shooting object and the position of the wireless positioning base station, and send the relative orientation of the target shooting object and the wireless positioning base station to the shooting control device.
  • similarly, the wireless positioning base station can determine the position of the shooting device on which the second wireless positioning tag is set according to the wireless signal communication between the wireless positioning base station and the second wireless positioning tag, then determine the relative orientation between the photographing device and the wireless positioning base station according to the position of the shooting device and the position of the wireless positioning base station, and send the relative orientation between the photographing device and the wireless positioning base station to the photographing control device.
  • S202 Control the shooting postures of the plurality of shooting devices according to the relative orientation, so as to track and shoot the target shooting object.
  • the photographing control device may control the photographing direction of the photographing device according to the relative orientation of the target photographing object and a certain photographing device, so as to track and photograph the target photographing object. For example, if the relative orientation of a certain photographing device and the target photographing object is that the target photographing object is located in the northeast direction of the photographing device, the photographing control device may send a photographing posture adjustment instruction to the photographing device, the photographing posture adjustment instruction being used to instruct the photographing device to rotate to the northeast direction; the photographing device can rotate to the northeast direction in response to the photographing posture adjustment instruction and then perform the shooting operation, so as to realize tracking shooting of the target shooting object.
  • the photographing posture of the photographing device may be the posture of the camera in the photographing device.
  • the photographing control device can also control the shooting position and/or shooting direction of the photographing device according to the relative orientation of the target photographing object and a certain photographing device, as well as the current photographing direction of the photographing device, so as to track and shoot the target subject.
  • for example, if the relative orientation of a certain shooting device and the target shooting object is that the target shooting object is located in the northeast direction of the shooting device, and the current shooting direction of the shooting device is facing north, the shooting control device can send a shooting attitude adjustment instruction to the shooting device; the shooting attitude adjustment instruction is used to instruct the shooting device to rotate 80° to the right, and the shooting device can rotate 80° to the right in response to the shooting attitude adjustment instruction, so that the shooting device faces the target shooting object and then carries out the shooting operation, realizing tracking shooting of the target shooting object.
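  • The posture-adjustment instruction in the examples above could, for instance, carry a signed yaw offset computed from the bearing to the target and the device's current heading. This is only an assumed sketch; the instruction format and function names are not specified by the patent, and the 80° in the example would simply be the bearing reported for that particular target.

```python
def yaw_adjustment(target_bearing_deg, current_heading_deg):
    """Signed rotation (degrees) needed to point the shooting device at the target.

    Positive means rotate right (clockwise), negative means rotate left; result lies in (-180, 180].
    """
    delta = (target_bearing_deg - current_heading_deg) % 360.0
    return delta if delta <= 180.0 else delta - 360.0

# A device facing north (heading 0 deg) with a target at bearing 80 deg rotates 80 deg to the right.
print(yaw_adjustment(80.0, 0.0))    # 80.0
print(yaw_adjustment(350.0, 10.0))  # -20.0 (i.e. 20 deg to the left)
```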
  • the relative orientation of the multiple shooting devices and the target shooting object can be determined in real time according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations, and the shooting postures of the multiple shooting devices can then be controlled according to the relative orientation determined in real time, so as to track and shoot the target shooting object, thereby achieving efficient and accurate tracking shooting of the target shooting object by the multiple shooting devices.
  • FIG. 4 is a schematic flowchart of a shooting control method proposed by another embodiment of the present application. As shown in FIG. 4 , the method may include:
  • the environment in which the multiple photographing devices are located may include at least two photographing objects, and each photographing object is respectively provided with a first wireless positioning tag. If tracking shooting of one or more specific shooting objects among the at least two shooting objects is required, the shooting control device may acquire shooting object indication information and determine the target shooting object from the at least two shooting objects according to the shooting object indication information; the target shooting object is the above-mentioned specific shooting object.
  • the manner in which the photographing control device obtains the indication information of the photographing object may include any of the following:
  • the photographing control device detects a user's operation of selecting a photographing object, and determines the indication information of the photographing object according to the detected selecting operation of the photographing object.
  • for example, the shooting control device can display a user interface on the display screen, where the user interface includes an object identification list, and the object identification list can include the object identifier of each shooting object located in the environment; the photographing control device detects the user's operation of selecting a photographing object from the list, and then determines the photographing object indication information according to the selecting operation.
  • the object identifier is used to identify the photographed object, and the object identifier may include the object name, portrait or appearance features of the photographed object.
  • in another example, the shooting control device can display a user interface on the display screen, and the user interface includes the object identifiers of each shooting object located in the environment, where the relative orientation between the object identifiers in the user interface matches the relative orientation between the shooting objects located in the environment; the shooting control device detects the user's shooting object selection operation, and then determines the shooting object indication information according to the shooting object selection operation.
  • the object identifier in this embodiment of the present application may include a preset image, or a social application avatar or a face image of the photographed object, or the like.
  • the preset image can be a water drop image, a star image, or a musical note image.
  • the photographing control device acquires photographing object indication information sent by a first wireless positioning tag set on one photographing object among the at least two photographing objects.
  • each photographing object located in the environment is provided with a first wireless positioning tag, and there is a function button on the device to which the first wireless positioning tag belongs; when the function button is operated, the first wireless positioning tag corresponding to the function button can send the photographing object indication information to the photographing control device.
  • the function keys may be physical keys or virtual keys, which are not specifically limited by the embodiments of the present application.
  • each photographing object located in the environment is provided with a first wireless positioning tag, and there is a microphone on the device to which the first wireless positioning tag belongs; when the microphone collects a voice containing a preset keyword, the first wireless positioning tag of that device can send the shooting object indication information to the shooting control device.
  • the preset keyword may be a keyword used to indicate that it is to be photographed preferentially, for example, "snap me” or "I'm ready", etc., which are not specifically limited by the embodiments of the present application.
  • S402 Determine a target shooting object from at least two shooting objects according to the shooting object indication information.
  • the photographing control device may determine the target photographing object from the at least two photographing objects according to the photographing object indication information. For example, after the user clicks a certain object identifier on the user interface, the shooting control device detects the user's shooting object selection operation, determines the shooting object indication information according to the detected shooting object selection operation, and then, according to the shooting object indication information, determines the shooting object corresponding to the object identifier clicked by the user as the target shooting object. For another example, after a certain first wireless positioning tag sends the photographing object indication information to the photographing control apparatus, the photographing control apparatus may determine the photographing object corresponding to the first wireless positioning tag that sent the photographing object indication information as the target photographing object.
  • for the specific implementation of step S403 in this embodiment of the present application, reference may be made to the description of step S201 in the foregoing embodiment, which is not repeated in this embodiment of the present application.
  • for the specific implementation of step S404 in this embodiment of the present application, reference may be made to the description of step S202 in the foregoing embodiment, which is not repeated in this embodiment of the present application.
  • in a scene where there are at least two shooting objects, the target shooting object can be determined from the at least two shooting objects according to the shooting object indication information; then, during the shooting process of the multiple shooting devices, the relative orientation of the multiple shooting devices and the target shooting object is determined according to the wireless signal communication between the first wireless positioning tag set on the target shooting object and the multiple wireless positioning base stations, and the shooting postures of the multiple shooting devices are controlled according to the relative orientation so as to track and shoot the target shooting object, so that multiple shooting devices can efficiently and accurately track and shoot the target shooting object.
  • FIG. 6 is a schematic flowchart of a shooting control method proposed by another embodiment of the present invention. As shown in FIG. 6 , the method may include:
  • if the number of shooting objects in the environment is one, the shooting control apparatus may directly determine the shooting object as the target shooting object.
  • a first wireless positioning tag is set on the photographing object.
  • if the number of shooting objects is multiple, the photographing control apparatus may determine, for each of the multiple photographing devices, a target shooting object to be tracked and photographed from the multiple photographing objects.
  • the manner in which the photographing control device determines, for each of the plurality of photographing devices, a target photographing object to be tracked and photographed from among the plurality of photographing objects may include any of the following:
  • the shooting control device determines the relative distances between the multiple shooting devices and the respective shooting objects according to the wireless signal communication between the first wireless positioning tags and the multiple wireless positioning base stations, and, for each shooting device in the multiple shooting devices, determines from the multiple shooting objects the shooting object with the smallest relative distance as the target shooting object to be tracked and photographed.
  • the shooting control system includes at least three shooting devices and three shooting objects, and the three shooting objects are respectively provided with first wireless positioning tags.
  • for any one of the shooting devices, the shooting control device determines the relative distance between the shooting device and each shooting object according to the wireless signal communication between the first wireless positioning tag set on each shooting object and the multiple wireless positioning base stations, and determines the shooting object with the smallest relative distance from the shooting device as the target shooting object of the shooting device; it then determines the relative orientation between the shooting device and the target shooting object according to the wireless signal communication between the first wireless positioning tag set on the target shooting object and the multiple wireless positioning base stations, and controls the shooting posture of the shooting device according to the relative orientation between the shooting device and the target shooting object, so as to track and shoot the target shooting object.
  • the photographing object with the smallest relative distance from the photographing device is determined as the target photographing object of the photographing device, and then the photographing device is controlled to track and photograph the target photographing object. Tracking and photographing the target photographing object by the photographing device with the smallest relative distance from the target photographing object can ensure that the target photographing object will not be lost, and can also improve the clarity of the target photographing object in the image obtained by the tracking photographing. In the embodiment of the present application, it is also possible to stop the tracking shooting of the shooting object when the shooting device is far away from the shooting object.
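  • A hedged sketch of this nearest-object assignment, with hypothetical names: each shooting device is paired with the shooting object at the smallest relative distance, and tracking is skipped when even the nearest object is beyond an assumed maximum range (echoing the note above about stopping tracking when the device is far from the object).

```python
import math

def assign_nearest_targets(device_positions, object_positions, max_range=None):
    """Map each shooting-device id to the id of the shooting object nearest to it.

    device_positions / object_positions: dicts of id -> (x, y) position.
    max_range: optional distance beyond which no target is assigned to the device.
    """
    assignments = {}
    for dev_id, dev_pos in device_positions.items():
        best_obj, best_dist = None, float("inf")
        for obj_id, obj_pos in object_positions.items():
            dist = math.dist(dev_pos, obj_pos)
            if dist < best_dist:
                best_obj, best_dist = obj_id, dist
        if max_range is None or best_dist <= max_range:
            assignments[dev_id] = best_obj
    return assignments

# Example: three shooting devices and three tagged shooting objects.
devices = {"cam1": (0, 0), "cam2": (50, 0), "cam3": (0, 50)}
objects = {"obj_a": (5, 5), "obj_b": (45, 2), "obj_c": (2, 60)}
print(assign_nearest_targets(devices, objects))
# {'cam1': 'obj_a', 'cam2': 'obj_b', 'cam3': 'obj_c'}
```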
  • the shooting control device determines the relative orientations of the multiple shooting devices and the multiple shooting objects according to the wireless signal communication between the first wireless positioning tags set on the multiple shooting objects and the multiple wireless positioning base stations; according to the determined relative orientations, it determines the position of each shooting object in the shooting picture of each shooting device, and, for each shooting device in the multiple shooting devices, determines from the multiple shooting objects the shooting object with the smallest distance from the center point of the shooting picture of that shooting device as the target shooting object to be tracked and photographed.
  • for any one of the shooting devices, the shooting control device determines the relative orientation between the shooting device and each shooting object according to the wireless signal communication between the first wireless positioning tag set on each shooting object and the multiple wireless positioning base stations; then, according to the determined relative orientations, it determines the position of each shooting object in the shooting picture of the shooting device, and determines the shooting object with the smallest distance from the center point of the shooting picture of the shooting device as the target shooting object of that shooting device; it then determines the relative orientation of the shooting device and the target shooting object according to the wireless signal communication between the first wireless positioning tag set on the target shooting object and the multiple wireless positioning base stations, and controls the shooting posture of the shooting device according to the relative orientation between the shooting device and the target shooting object, so as to track and shoot the target shooting object.
  • the shooting object with the smallest distance from the center point of the shooting picture of the photographing device is determined as the target photographing object of that photographing device, and the photographing device is then controlled to track and photograph the target photographing object; this can ensure that the target photographing object will not be lost, and can also improve the clarity and integrity of the target photographing object in the image obtained by the tracking shooting.
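  • Below is an assumed sketch of this second manner: the bearing of each shooting object relative to a device is mapped to a horizontal pixel coordinate with a simple pinhole/field-of-view model, and the object whose projection lies closest to the image centre is chosen. The projection model, field of view, and names are illustrative assumptions, not details given in the patent.

```python
import math

def pixel_offset_from_center(bearing_deg, heading_deg, image_width, horizontal_fov_deg):
    """Approximate horizontal pixel offset of an object from the image centre.

    Uses a pinhole camera model; returns None when the object is outside the field of view.
    """
    angle = (bearing_deg - heading_deg + 180.0) % 360.0 - 180.0  # signed angular offset in degrees
    if abs(angle) > horizontal_fov_deg / 2.0:
        return None
    focal_px = (image_width / 2.0) / math.tan(math.radians(horizontal_fov_deg / 2.0))
    return focal_px * math.tan(math.radians(angle))

def closest_to_center(object_bearings, heading_deg, image_width=1920, horizontal_fov_deg=84.0):
    """Pick the shooting object whose projection is nearest the centre of the shooting picture."""
    best_id, best_offset = None, float("inf")
    for obj_id, bearing in object_bearings.items():
        offset = pixel_offset_from_center(bearing, heading_deg, image_width, horizontal_fov_deg)
        if offset is not None and abs(offset) < best_offset:
            best_id, best_offset = obj_id, abs(offset)
    return best_id

# A device facing north sees the object at bearing 10 deg closer to centre than the one at 35 deg.
print(closest_to_center({"obj_a": 10.0, "obj_b": 35.0}, heading_deg=0.0))  # obj_a
```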
  • the photographing control device detects the user's photographing object selection operation, and determines a target photographing object to be tracked and photographed for each photographing device from the plurality of photographing objects according to the detected photographing object selecting operation.
  • the photographing control device may detect the user's shooting object selection operation, determine the target photographing object of the photographing device according to the detected shooting object selection operation, and then determine the relative orientation of the shooting device and the target shooting object according to the wireless signal communication between the first wireless positioning tag set on the target shooting object and the multiple wireless positioning base stations, and control the shooting posture of the shooting device according to the relative orientation between the shooting device and the target shooting object, so as to track and shoot the target shooting object.
  • the photographing control device may display a user interface on the display screen, and the user interface includes the device identifiers of each photographing device located in the environment.
  • the photographing control device detects the user's shooting device selection operation, determines the shooting device selected by the user according to the shooting device selection operation, and displays the object identifiers of each shooting object located in the environment in the user interface; the shooting control device then detects the user's shooting object selection operation and determines the target shooting object of that shooting device according to the shooting object selection operation.
  • the shooting control device may display a user interface on the display screen, and the user interface includes object identifiers of each shooting object located in the environment.
  • the shooting control device detects the user's shooting object selection operation, determines the shooting object selected by the user according to the shooting object selection operation, and displays the device identifiers of each shooting device located in the environment in the user interface; the shooting control device then detects the user's shooting device selection operation, determines the shooting device selected by the user according to the shooting device selection operation, and uses the shooting object determined above as the target shooting object of that shooting device.
  • the photographing control device can detect whether the photographing object indication information sent by the first wireless positioning tag set on any photographing object and the photographing device indication information sent by any photographing device are both received within a preset time period; if so, the photographing object corresponding to the first wireless positioning tag that sent the photographing object indication information is determined as the target photographing object of the photographing device that sent the photographing device indication information.
  • the user can decide which shooting device to use to track which shooting object, so as to ensure that the tracking shooting device and shooting object match the user's needs.
  • the manner in which the photographing control apparatus in the embodiment of the present application determines the relative orientation of the photographing device and the target photographing object of the photographing device may refer to the specific description of step S201 in the foregoing embodiment, which will not be repeated in the embodiment of the present application.
  • S604 Control the photographing posture of the photographing device according to the relative orientation between the photographing device and the target photographing object to be tracked and photographed, so as to track and photograph the target photographing object to be tracked and photographed.
  • for the manner in which the photographing control device controls the photographing posture of the photographing device according to the relative orientation of the photographing device and the target photographing object of the photographing device, reference may be made to the specific description of step S202 in the foregoing embodiment, which will not be repeated in this embodiment of the present application.
  • a target photographing object to be tracked and photographed is determined for each of the multiple photographing devices from the multiple photographing objects; the relative orientation of each shooting device and its target shooting object to be tracked and photographed is determined according to the wireless signal communication between the first wireless positioning tag set on that target shooting object and the multiple wireless positioning base stations, and the shooting posture of the shooting device is controlled according to the relative orientation between the shooting device and the target shooting object to be tracked and photographed, so that the target shooting object to be tracked and photographed is tracked and shot; each shooting device can thus be used to the greatest extent for tracking shooting of the shooting objects.
  • after the shooting control device determines the relative orientation of each of the multiple shooting devices and the target shooting object, a shooting device whose relative orientation to the target shooting object is smaller than a preset threshold may be selected from the multiple shooting devices, and the shooting posture of the selected shooting device is then controlled according to the relative orientation of the selected shooting device and the target shooting object, so as to track and shoot the target shooting object.
  • the photographing device whose relative orientation to the target photographing object is less than a preset threshold may be: a photographing device whose distance from the target photographing object is less than a preset distance threshold, and/or a photographing device whose relative angle to the target photographing object is less than a preset angle threshold.
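  • A short, assumed sketch of this selection step: only shooting devices whose distance to the target and angular offset from the target both fall under the preset thresholds are kept (the patent allows either criterion alone; the combined check and the threshold values here are illustrative).

```python
import math

def select_devices(device_states, target_xy, max_distance=30.0, max_angle=60.0):
    """Return the ids of shooting devices close enough to, and pointed near enough at, the target.

    device_states: dict of id -> {"pos": (x, y), "heading": degrees, 0 = north, clockwise}.
    """
    selected = []
    for dev_id, state in device_states.items():
        dx = target_xy[0] - state["pos"][0]
        dy = target_xy[1] - state["pos"][1]
        distance = math.hypot(dx, dy)
        bearing = math.degrees(math.atan2(dx, dy)) % 360.0
        angle_offset = abs((bearing - state["heading"] + 180.0) % 360.0 - 180.0)
        if distance <= max_distance and angle_offset <= max_angle:
            selected.append(dev_id)
    return selected

# Example: only the first device satisfies both thresholds.
states = {"cam1": {"pos": (0.0, 0.0), "heading": 40.0},
          "cam2": {"pos": (100.0, 0.0), "heading": 270.0}}
print(select_devices(states, target_xy=(10.0, 10.0)))  # ['cam1']
```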
  • the photographing control device may determine the relative orientation of the multiple photographing devices and the target photographing object according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations alone, or it may fuse the positioning data obtained by a positioning module set on the target photographing object with the signals sent by the first wireless positioning tag to the multiple wireless positioning base stations, so as to obtain the relative orientations of the multiple shooting devices and the target shooting object and improve the positioning accuracy.
  • FIG. 8 shows a shooting control method proposed by another embodiment of the present application, and the method includes:
  • for the specific implementation of step S801 in this embodiment of the present application, reference may be made to the description of step S201 in the foregoing embodiment, which is not repeated in this embodiment of the present application.
  • S802 Control the shooting postures of the plurality of shooting devices according to the relative orientation, so as to track and shoot the target shooting object.
  • for the specific implementation of step S802 in this embodiment of the present application, reference may be made to the description of step S202 in the foregoing embodiment, which is not repeated in this embodiment of the present application.
  • after the shooting control device controls the shooting devices to track and shoot the target shooting object, the position of the target shooting object in the shooting picture of each shooting device can be determined according to the relative orientation of the shooting device and the target shooting object and the shooting posture of the shooting device when shooting the target shooting object.
  • for example, if the relative orientation of the shooting device and the target shooting object is that the target shooting object is located 30 m northeast of the shooting device, and the shooting posture of the shooting device when shooting the target shooting object is facing north, then the shooting control device can determine that the position of the target shooting object in the shooting picture of the shooting device is the upper right of the shooting picture.
  • S804 Edit the images captured by the plurality of capturing devices according to the determined position, so as to obtain the image area of the target capturing object.
  • the photographing control device may edit images captured by a plurality of photographing devices by means of manual editing or automatic editing according to the determined positions, so as to obtain the image area of the target photographing object.
  • the shooting control device may determine an initial clipped image region in the image captured by the shooting device according to the determined position, and then run a preset recognition model algorithm for the initial clipped image region to determine the target shooting in the image The image area of the subject and clips the image area of the target subject from the image.
  • the shooting control device determines that the position of the target shooting object in the shooting screen of the shooting device is the upper right of the shooting screen, then the initial clipped image region determined by the shooting control device may be the region located at the upper right in the image.
  • assuming that the target shooting object is a person, the preset recognition model algorithm is used to identify the human body; assuming that the target shooting object is an unmanned aerial vehicle, the preset recognition model algorithm is used to identify the unmanned aerial vehicle. The shooting control device can run the preset recognition model algorithm in the upper-right region of the image to determine the image area of the target photographing object in the image, so that the entire target photographing object in the image is located within the determined image area of the target photographing object.
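  • The following is an illustrative, assumed sketch of this automatic-editing step: the coarse position of the target in the frame (for example, "upper right") is turned into an initial crop rectangle, and a recognition routine, represented here by a caller-supplied placeholder, would then locate the target's image area inside it. The region fractions, names, and NumPy-style indexing are assumptions.

```python
def initial_crop_region(frame_width, frame_height, horizontal, vertical, fraction=0.5):
    """Rectangle (left, top, right, bottom) covering the named region of the frame.

    horizontal: "left", "center" or "right"; vertical: "upper", "middle" or "lower".
    fraction: size of the crop relative to the full frame along each axis.
    """
    crop_w, crop_h = int(frame_width * fraction), int(frame_height * fraction)
    lefts = {"left": 0, "center": (frame_width - crop_w) // 2, "right": frame_width - crop_w}
    tops = {"upper": 0, "middle": (frame_height - crop_h) // 2, "lower": frame_height - crop_h}
    left, top = lefts[horizontal], tops[vertical]
    return left, top, left + crop_w, top + crop_h

def clip_target_area(image, region, recognize):
    """Run a caller-supplied recognition routine inside the initial region and crop its result.

    recognize(image, region) is assumed to return a (left, top, right, bottom) box for the
    target shooting object, or None when nothing is recognised; image is assumed to support
    NumPy-style slicing such as image[top:bottom, left:right].
    """
    box = recognize(image, region)
    if box is None:
        return None
    left, top, right, bottom = box
    return image[top:bottom, left:right]

# Example: the target was determined to be in the upper right of a 1920 x 1080 frame.
print(initial_crop_region(1920, 1080, "right", "upper"))  # (960, 0, 1920, 540)
```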
  • the embodiment of the present application can realize automatic editing of the target shooting object.
  • when the shooting control device is in the automatic editing mode, the images shot by the multiple shooting devices can be edited according to the determined positions, so as to obtain the image area of the target shooting object.
  • editing prompt marks may be displayed in the shooting pictures of the plurality of shooting devices according to the determined positions; the user's image area selection operation, performed in response to the editing prompt marks for the corresponding shooting device, is detected, the image area of the target photographing object in the image captured by the photographing device is determined according to the detected image area selection operation, and the image area of the target photographing object is clipped from the image.
  • for example, the shooting control device can display a clip prompt mark on the shooting picture of a shooting device according to the determined position, detect the user's image area selection operation performed in response to the clip prompt mark for that shooting device, determine the image area of the target shooting object in the image captured by the shooting device according to the detected image area selection operation, and clip the image area of the target shooting object from the image.
  • the relative orientation of the multiple shooting devices and the target shooting object is determined according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations, and the shooting postures of the multiple shooting devices are controlled according to the relative orientation so as to track and shoot the target shooting object; editing the images captured by the multiple shooting devices to obtain the image area of the target shooting object can improve the accuracy and effectiveness of image editing.
  • FIG. 9 is a structural diagram of a photographing control device provided by an embodiment of the present application.
  • the photographing control device includes a memory 901 and a processor 902, wherein program code is stored in the memory 901, and the processor 902 calls the program code in the memory 901; when the program code is executed, the processor 902 performs the following operations:
  • the relative orientation between the multiple shooting devices and the target shooting object is determined according to the wireless signal communication between the first wireless positioning tag and the multiple wireless positioning base stations;
  • the shooting postures of the plurality of shooting devices are controlled according to the relative orientation, so as to track and shoot the target shooting object.
  • when the processor 902 determines the relative orientation between the plurality of photographing devices and the target photographing object according to the wireless signal communication between the first wireless location tag and the plurality of wireless location base stations, it is specifically configured to perform the following operations:
  • the relative orientation of each of the shooting devices and the target shooting object is determined.
  • a second wireless positioning tag is set on each of the photographing devices
  • when the processor 902 determines the position of each photographing device, it is specifically configured to perform the following operations:
  • the position of each of the shooting devices is determined according to the wireless signal communication between the second wireless positioning tag set on each of the shooting devices and the plurality of wireless positioning base stations.
  • when the processor 902 determines the position of each photographing device, it is specifically configured to perform the following operations:
  • the positions of each of the photographing devices are acquired in a preset memory, wherein the relative positions of each of the wireless positioning base stations and each of the photographing devices are fixed.
  • the environment includes at least two photographed objects, wherein a first wireless positioning tag is set on each of the photographed objects, and the processor 902 is further configured to perform the following operations:
  • the target photographing subject is determined from the at least two photographing subjects according to the photographing subject indication information.
  • the processor 902 is specifically configured to perform the following operations when acquiring the indication information of the shooting object:
  • a user's operation of selecting a photographing object is detected, and the photographing object indication information is determined according to the detected operation of selecting a photographing object.
  • the processor 902 is specifically configured to perform the following operations when acquiring the indication information of the shooting object:
  • when determining the target shooting object from the at least two shooting objects according to the shooting object indication information, the processor is specifically configured to perform the following operations:
  • the shooting object corresponding to the first wireless positioning tag sending the shooting object indication information is determined as the target shooting object.
  • the environment includes a photographed object, wherein a first wireless positioning tag is set on the photographed object;
  • the processor 902 is further configured to perform the following operations:
  • if the number of the shooting objects is multiple, determine a target shooting object to be tracked and photographed from the multiple shooting objects for each shooting device in the multiple shooting devices;
  • the photographing posture of the photographing device is controlled according to the relative orientation between each photographing device and the determined target photographing object to be tracked and photographed, so as to perform tracking photographing of the target photographing object to be tracked and photographed.
  • the processor 902 is specifically configured to perform the following operations when determining a target photographing object to be tracked and photographed from the plurality of photographing objects for each photographing device of the plurality of photographing devices :
  • according to the wireless signal communication between the first wireless positioning tags and the multiple wireless positioning base stations, determine the relative distances between the multiple shooting devices and each of the shooting objects;
  • a target photographing object to be tracked and photographed with the smallest relative distance is determined from the plurality of photographing objects.
  • In one embodiment, the processor 902 is specifically configured to perform the following operations when determining a target photographing object to be tracked and photographed from the plurality of photographing objects for each photographing device of the plurality of photographing devices:
  • The relative orientations of the multiple shooting devices and the multiple shooting objects are determined according to the wireless signal communication between the first wireless positioning tags set on the multiple shooting objects and the multiple wireless positioning base stations.
  • According to the determined relative orientations, the position of each shooting object in the shooting picture of each shooting device is determined.
  • A target photographing object to be tracked and photographed with the smallest distance from the center point of the shooting picture of the photographing device is determined for each of the plurality of photographing devices from the plurality of photographing objects.
  • In one embodiment, the processor 902 is specifically configured to perform the following operations when determining a target photographing object to be tracked and photographed from the plurality of photographing objects for each photographing device of the plurality of photographing devices:
  • A user's photographing object selection operation is detected.
  • A target photographing object to be tracked and photographed is determined for each of the plurality of photographing devices from the plurality of photographing objects according to the detected photographing object selection operation.
  • In one embodiment, the processor 902 is further configured to perform the following operations:
  • The position of the target shooting object in the shooting picture of the shooting devices is determined according to the relative orientation and the shooting postures of the plurality of shooting devices.
  • The images captured by the plurality of shooting devices are edited according to the determined position, so as to obtain the image area of the target shooting object.
  • In one embodiment, the processor 902 is specifically configured to perform the following operations when editing the images captured by the plurality of shooting devices according to the determined position to obtain the image area of the target shooting object:
  • An initial clipping image region is determined in the image according to the determined position.
  • A preset recognition model algorithm is run on the initial clipping image region to determine the image region of the target shooting object in the image, and the image region of the target shooting object is clipped from the image.
  • In one embodiment, the processor 902 is specifically configured to perform the following operations when editing the images captured by the plurality of shooting devices according to the determined position to obtain the image area of the target shooting object:
  • When in the automatic clipping mode, the images captured by the plurality of shooting devices are edited according to the determined position, so as to obtain the image area of the target shooting object.
  • In one embodiment, the processor 902 is further configured to perform the following operations:
  • When in the manual clipping mode, a clipping prompt identifier is displayed in the shooting pictures of the plurality of shooting devices according to the determined position.
  • A user's image area selection operation, performed on the user's corresponding shooting device in response to the clipping prompt identifier, is detected.
  • The image area of the target shooting object in the image captured by the shooting device is determined according to the detected image area selection operation.
  • The image area of the target shooting object is clipped from the image.
  • The photographing control device provided in this embodiment can execute the photographing control methods shown in FIG. 2 to FIG. 8 provided in the foregoing embodiments, and the execution manner and beneficial effects are similar, which will not be repeated here.
  • An embodiment of the present application provides a photographing control system, including: a plurality of photographing devices for photographing an environment; a plurality of wireless positioning base stations arranged in the environment; a first wireless positioning tag set on a target photographing object in the environment, the first wireless positioning tag and the plurality of wireless positioning base stations being wirelessly connected; and the photographing control device as described above, the photographing control device being communicatively connected with the plurality of photographing devices and the plurality of wireless positioning base stations, respectively.
  • The operation of the photographing control device is the same as or similar to that described above, and will not be repeated here.
  • In one embodiment, the relative positions of each of the wireless positioning base stations and each of the photographing devices are variable, and each of the photographing devices and the plurality of wireless positioning base stations are wirelessly connected.
  • In one embodiment, the relative positions of each of the wireless positioning base stations and each of the photographing devices are fixed.
  • In one embodiment, the photographing control apparatus is configured with an input device, and the input device is used for acquiring photographing object indication information.
  • In one embodiment, the first wireless positioning tag is configured with an input device, and the input device is used for acquiring the photographing object indication information.
  • In one embodiment, the photographing device is configured with a display screen and an input device, the display screen is used for displaying a shooting picture, and the input device is used for detecting an image area selection operation.
  • An embodiment of the present application further provides a computer storage medium, where computer program instructions are stored in the computer storage medium, and the computer program instructions, when executed by a processor, are used to execute the shooting control methods shown in FIG. 2 to FIG. 8.

Abstract

A shooting control method, apparatus, system and storage medium. The method is applied to a shooting control system, the shooting control system comprising a plurality of shooting devices for shooting an environment, a plurality of wireless positioning base stations being arranged in the environment, and a first wireless positioning tag being set on a target shooting object in the environment. The method comprises: during shooting by the plurality of shooting devices, determining the relative orientations of the plurality of shooting devices with respect to the target shooting object according to wireless signal communication between the first wireless positioning tag and the plurality of wireless positioning base stations (S201); and controlling the shooting postures of the plurality of shooting devices according to the relative orientations, so as to perform tracking shooting of the target shooting object (S202). Tracking shooting of the target shooting object by the plurality of shooting devices can thus be achieved efficiently and accurately.

Description

拍摄控制方法、装置、系统及存储介质 技术领域
本发明涉及控制技术领域,尤其涉及拍摄控制方法、装置、系统及存储介质。
背景技术
目前,拍摄设备可以承载在可移动平台(例如无人机、手持式云台等)或者固定平台(例如安装杆)拍摄获取环境中拍摄对象(例如人、车辆等)的图像,并根据目标对象在图像中的位置调整拍摄设备的姿态实现拍摄设备对目标拍摄对象进行跟踪拍摄。然而,利用图像来实现跟踪拍摄,在某些情况下无法实现,例如暗光条件下。另外,在某些场景中,需要多个不同的视角或者方位对目标拍摄对象进行跟踪拍摄,然而,目前现有技术还不支持多个拍摄设备对目标拍摄对象的跟踪拍摄。因此,如何高效且精准地实现多个拍摄设备对目标拍摄对象进行跟踪拍摄是目前亟需解决的技术问题。
发明内容
有鉴于此,本申请实施例提供了拍摄控制方法、装置、系统及存储介质,可高效且精准地实现多个拍摄设备对目标拍摄对象进行跟踪拍摄。
本申请实施例第一方面提供了一种拍摄控制方法,该方法应用于拍摄控制系统,所述拍摄控制系统包括多个用于对环境进行拍摄的拍摄设备,所述环境中设置多个无线定位基站,所述环境中的目标拍摄对象上设置第一无线定位标签,其中,所述方法包括:
在所述多个拍摄设备拍摄的过程中,根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述目标拍摄对象的相对方位;
根据所述相对方位控制所述多个拍摄设备的拍摄姿态,以对所述目标拍摄对象进行跟踪拍摄。
本申请实施例第二方面提供了一种拍摄控制装置,所述拍摄控制装置应用于拍摄控制系统,所述拍摄控制系统包括多个用于对环境进行拍摄的拍摄设备,所述环境中设置多个无线定位基站,所述环境中的目标拍摄对象上设置第一无线定位标签,所述拍摄控制装置包括存储器和处理器,其中,
所述存储器,用于存储有程序代码;
所述处理器,调用存储器中的程序代码,当程序代码被执行时,用于执行如下操作:
在所述多个拍摄设备拍摄的过程中,根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述目标拍摄对象的相对方位;
根据所述相对方位控制所述多个拍摄设备的拍摄姿态,以对所述目标拍摄对象进行跟踪拍摄。
本申请实施例第三方面提供了一种拍摄控制系统,所述拍摄控制系统包括:
多个用于对环境进行拍摄的拍摄设备;
设置在所述环境中的多个无线定位基站;
设置在所述环境中的目标拍摄对象上的第一无线定位标签,所述第一无线定位标签和所述多个无线定位基站之间无线连接;
如第二方面所述的拍摄控制装置,所述拍摄控制装置分别和所述多个拍摄设备以及所述多个无线定位基站之间通信连接。
本申请实施例第四方面提供了一种计算机存储介质,所述计算机存储介质中存储有计算机程序指令,所述计算机程序指令被处理器执行时,用于执行如第一方面所述的拍摄控制方法。
在本申请实施例中，在多个拍摄设备拍摄的过程中，可以根据第一无线定位标签与多个无线定位基站之间的无线信号通信，实时确定多个拍摄设备与目标拍摄对象的相对方位，然后根据实时确定得到的相对方位控制多个拍摄设备的拍摄姿态，以对目标拍摄对象进行跟踪拍摄，可高效且精准地实现多个拍摄设备对目标拍摄对象进行跟踪拍摄。
附图说明
为了更清楚地说明本申请实施例或现有技术中的技术方案,下面将对实施例中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请实施例的一种拍摄控制系统的架构示意图;
图2是本申请实施例的一种拍摄控制方法的流程示意图;
图3是本发明实施例的一种拍摄设备和目标拍摄对象的位置示意图;
图4是本申请另一实施例的一种拍摄控制方法的流程示意图;
图5A是本申请实施例的一种拍摄控制装置的界面示意图;
图5B是本申请另一实施例的一种拍摄控制装置的界面示意图;
图6是本申请另一实施例的一种拍摄控制方法的流程示意图;
图7是本申请实施例的一种拍摄控制系统的场景示意图;
图8是本申请另一实施例的一种拍摄控制方法的流程示意图;
图9是本申请实施例的一种拍摄控制装置的结构示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
除非另有定义,本文所使用的所有的技术和科学术语与属于本发明的技术领域的技术人员通常理解的含义相同。本文中在本发明的说明书中所使用的术语只是为了描述具体的实施例的目的,不是旨在于限制本发明。本文所使用的术语“及/或”包括一个或多个相关的所列项目的任意的和所有的组合。
下面结合附图，对本发明的一些实施方式作详细说明。在不冲突的情况下，下述的实施例及实施例中的特征可以相互组合。
本申请实施例提供的拍摄控制方法可以应用在拍摄控制系统中。以图1所示的拍摄控制系统的架构示意图为例,拍摄控制系统可以包括多个拍摄设备,多个无线定位基站,拍摄控制装置以及第一无线定位标签。
其中,第一无线定位标签设置在目标拍摄对象上,目标拍摄对象位于指定环境中,该环境中还设置有多个无线定位基站。第一无线定位标签和多个无线定位基站之间建立有无线连接,第一无线定位标签可以通过该无线连接与无线定位基站之间进行无线信号通信。目标拍摄对象的数量可以为一个或者多个,具体不受本申请实施例的限定。
拍摄控制装置和各个无线定位基站之间建立有通信连接。拍摄控制装置可以在多个拍摄设备拍摄的过程中,根据第一无线定位标签与多个无线定位基站之间的无线信号通信,确定多个拍摄设备与目标拍摄对象的相对方位。
多个拍摄设备用于对该环境进行拍摄,多个拍摄设备和拍摄控制装置之间建立有通信连接,拍摄控制装置可以根据上述相对方位,通过该通信连接控制多个拍摄设备的拍摄姿态,以对目标拍摄对象进行跟踪拍摄。
本申请实施例中,拍摄控制装置可以在多个拍摄设备拍摄的过程中,根据第一无线定位标签与多个无线定位基站之间的无线信号通信,确定多个拍摄设备与目标拍摄对象的相对方位,然后根据相对方位控制多个拍摄设备的拍摄姿态,以对目标拍摄对象进行跟踪拍摄。相对传统的利用图像来实现跟踪拍摄,本申请实施例即使在暗光条件下,也可以根据第一无线定位标签与多个无线定位基站之间的无线信号通信,确定拍摄设备与目标拍摄对象的相对方位,并根据拍摄设备与目标拍摄对象的相对方位调整拍摄设备的姿态实现拍摄设备对目标拍摄对象进行跟踪拍摄,基于此,本申请实施例可高效地对目标拍摄对象进行跟踪拍摄。其次,目标拍摄对象如果和其他拍摄对象混在一起(例如目标拍摄对象和其他拍摄对象的外形特征相似,无法从外形上准确区分),或者被其他物体遮挡,本申请实施例还是可以根据第一无线定位标签与多个无线定位基站之间的无线信号通信,实时确定多个拍摄设备与目标拍摄对象的相对方位,可实现精准地对目标拍摄对象进行跟踪拍摄。另外,本申请实施例可根据多个拍摄设备与目标拍摄对象的相对方位调整各个拍摄设备的姿态,实现多个拍摄设备对目标拍摄对象进行跟踪拍摄,也就是说,本申请实施例可实现多个拍摄设备对目标拍摄对象进行跟踪拍摄,以满足通过多个不同的视角或者方位对目标拍摄对象进行跟踪拍摄的需求。
在该实施例中,第一无线定位标签可以为超宽带(Ultra Wide Band,UWB)标签、射频识别(Radio Frequency Identification,RFID)标签或者Wi-Fi Tag(定位标签)。在一个示例性场景中,假设目标拍摄对象为人或者动物,第一无线定位标签可以设置在可穿戴设备内,可穿戴设备可以穿戴在目标拍摄对象上。在另一个示例性场景中,假设目标拍摄对象为可移动平台,第一无线定位标签可以集成在可移动平台内部,或者第一无线定位标签可以集成在负载中,负载搭载在该可移动平台的云台上。第一无线定位标签用于向无线定位基站发送信号,该信号可以包括目标拍摄对象的位置信息。
在该实施例中，多个无线定位基站可以设置于该环境的各个区域。例如，各个无线定位基站之间的间距依据通信范围等距离布置，用于接收第一无线定位标签发送的信号。示例性的，无线定位基站可以独立于拍摄设备，也可以集成在拍摄设备中，例如每个拍摄设备集成了一个无线定位基站。
在该实施例中,拍摄设备可以包括摄像头、手机、摄像机、增强现实(Augmented Reality,AR)设备或者相机。在一个示例性场景中,拍摄设备可以是安装在固定平台(例如安装杆、墙体、天花板上),基于此,多个拍摄设备可以设置于该环境的各个区域。在另一个示例性场景中,拍摄设备的位置可移动,例如拍摄设备可以挂载在可移动平台(例如无人机、手持式云台、无人车)上,又如拍摄设备可以安装在滑动轨道上。拍摄设备可以通过姿态调节机构安装在所述固定平台或可移动平台上,例如,所述姿态调节机构可以是云台。
在该实施例中,拍摄控制装置可以运行在智能手机、计算机或者地面站等设备中。
本申请实施例中的可移动平台可以包括无人飞行器、无人汽车或者可移动机器人等。
本申请实施例中的通信连接可以是有线连接，也可以是无线连接。示例性的，有线连接可以包括通用串行总线（Universal Serial Bus，USB）连接、电缆连接或者光纤连接等。无线连接可以包括WIFI连接、数据网络连接、蓝牙连接、窄带物联网（Narrow Band Internet of Things，NB-IoT）连接、LoRa连接、全球移动通信系统（Global System for Mobile Communication，GSM）连接、Zigbee连接、UWB连接或者码分多址（Code Division Multiple Access，CDMA）等。
本申请实施例中的环境可以是室内环境或者室外环境,例如足球场、篮球场等运动场所,演唱会、剧院等表演场所,或者无人飞行器竞技场地、赛车场地等竞技场所等。
基于图1所示的拍摄控制系统,请参见图2,是本申请实施例提出的一种拍摄控制方法的流程示意图,如图2所示,该方法可包括:
S201,在多个拍摄设备拍摄的过程中,根据第一无线定位标签与多个无线定位基站之间的无线信号通信,确定多个拍摄设备与目标拍摄对象的相对方位。
具体实现中,在多个拍摄设备拍摄的过程中,第一无线定位标签与多个无线定位基站之间可以进行无线信号通信,其中,所述无线信号可以包括UWB信号、WIFI信号、蓝牙信号或者Zigbee信号。根据第一无线定位标签与多个无线定位基站之间的无线信号通信,确定多个拍摄设备与目标拍摄对象的相对方位。多个拍摄设备与目标拍摄对象的相对方位可以是基于飞行时间(Time of Flight,ToF)技术、到达角度(Angle-of-Arrival,AoA)技术或者到达时间差(Time Difference of Arrival,TDoA)技术获取到的。
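As a purely illustrative sketch of the ToF-style positioning mentioned above (it is not part of the original disclosure; the station layout, coordinate frame and function names are assumptions), the tag position can be recovered from tag-to-station ranges by linearizing the range equations into a least-squares problem:
```python
import numpy as np

def estimate_tag_position(stations, ranges):
    """Estimate a tag's 2D position from ToF ranges to known base stations.

    Subtracting the range equation of the first station from the others removes
    the quadratic terms and leaves a linear system solved in a least-squares
    sense (at least three non-collinear stations are required)."""
    s = np.asarray(stations, dtype=float)
    r = np.asarray(ranges, dtype=float)
    a = 2.0 * (s[1:] - s[0])
    b = (r[0] ** 2 - r[1:] ** 2) + np.sum(s[1:] ** 2, axis=1) - np.sum(s[0] ** 2)
    position, *_ = np.linalg.lstsq(a, b, rcond=None)
    return position

# Hypothetical layout: four base stations at the corners of a 40 m x 20 m field.
stations = [(0.0, 0.0), (40.0, 0.0), (40.0, 20.0), (0.0, 20.0)]
true_tag = np.array([12.0, 7.0])
ranges = [float(np.linalg.norm(true_tag - np.array(p))) for p in stations]
print(estimate_tag_position(stations, ranges))  # ~[12.  7.]
```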
在某些情况中,无线定位基站可以接收来自第一无线定位标签的第一无线信号,进一步地,可以根据接收到的第一无线信号确定多个拍摄设备与目标拍摄对象的相对方位。
进一步地,第一无线定位标签可以周期性地广播第一无线信号,覆盖了第一无线定位标签的各个无线定位基站可以接收第一无线信号,并根据第一无线信号确定多个拍摄设备与目标拍摄对象的相对方位。
在某些情况中，各个无线定位基站可以周期性地广播第二无线信号，第一无线定位标签可以接收第二无线信号获取目标拍摄对象的位置，进而可以根据接收到的第二无线信号确定多个拍摄设备与目标拍摄对象的相对方位。
在一种实现方式中，拍摄控制装置可以根据第一无线定位标签与多个无线定位基站之间的无线信号通信，确定目标拍摄对象的位置，拍摄控制装置还可以确定各个拍摄设备的位置，然后根据目标拍摄对象的位置和各个拍摄设备的位置，确定各个拍摄设备与目标拍摄对象的相对方位。以图3所示的拍摄设备和目标拍摄对象的位置示意图为例，假设目标拍摄对象处于位置A，多个拍摄设备中的某个拍摄设备处于位置B，那么拍摄控制装置可以根据位置A和位置B确定该拍摄设备与目标拍摄对象的相对方位为：目标拍摄对象位于该拍摄设备的东北方向处。
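A minimal sketch of the "position A / position B" reasoning above (the local east/north frame, units and names are assumptions made only for illustration): the relative orientation of a shooting device towards the target can be reduced to a compass bearing and a distance.
```python
import math

def relative_bearing(device_xy, target_xy):
    """Compass bearing (degrees, 0 = north, clockwise) and distance from a
    shooting device to the target, in a local east/north metric frame."""
    east = target_xy[0] - device_xy[0]
    north = target_xy[1] - device_xy[1]
    bearing = math.degrees(math.atan2(east, north)) % 360.0
    return bearing, math.hypot(east, north)

# Device at position B = (0, 0), target at position A = (30, 30):
# the target lies to the north-east of the device.
print(relative_bearing((0.0, 0.0), (30.0, 30.0)))  # (45.0, ~42.43)
```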
其中,拍摄控制装置确定各个拍摄设备的位置的方式可以包括如下任一种:
一、拍摄控制装置根据各个拍摄设备上设置的第二无线定位标签和多个无线定位基站之间的无线信号通信,确定各个拍摄设备的位置。
具体实现中，可以在各个拍摄设备上分别设置第二无线定位标签。在多个拍摄设备拍摄的过程中，第二无线定位标签与多个无线定位基站之间可以进行无线信号通信，各个无线定位基站可以将来自第二无线定位标签的信号发送给拍摄控制装置，拍摄控制装置可以根据该信号，确定各个拍摄设备的位置。其中，第二无线定位标签与多个无线定位基站进行无线信号通信的方式与第一无线定位标签与多个无线定位基站进行无线信号通信的方式相同，具体可以参见上述实施例中对第一无线定位标签与多个无线定位基站进行无线信号通信的方式的描述，本实施例不再赘述。
在该实施例中,无论拍摄设备是否可以移动,拍摄装置都可以根据第一无线定位标签与多个无线定位基站之间的无线信号通信,实时确定目标拍摄对象的位置。
二、拍摄控制装置可以在预设存储器中获取各个拍摄设备的位置,其中,各个无线定位基站和各个拍摄设备的相对位置固定。
在该实施例中,在固定安装各个拍摄设备之后,拍摄控制装置可以获取各个拍摄设备的位置,并将各个拍摄设备的位置存储在预设存储器中。基于此,拍摄控制装置在拍摄控制时,可以直接从预设存储器中获取各个拍摄设备的位置。其中,各个拍摄设备的位置可以是用户输入拍摄控制装置的;或者,各个拍摄设备设置了定位模块,任一拍摄设备通过定位模块获取到该拍摄设备的位置之后,可以将该拍摄设备的位置发给拍摄控制装置;或者,各个拍摄设备设置了第二无线定位标签,拍摄控制装置可以根据第二无线定位标签与多个无线定位基站之间的无线信号通信,确定各个拍摄设备的位置。
在一种实现方式中,拍摄控制装置可以接收多个无线定位基站发送的目标拍摄对象与各个无线定位基站的相对方位,以及各个拍摄设备与各个无线定位基站的相对方位,然后根据目标拍摄对象与各个无线定位基站的相对方位,以及各个拍摄设备与各个无线定位基站的相对方位,确定各个拍摄设备与目标拍摄对象的相对方位。
例如，各个无线定位基站知悉该无线定位基站本身的位置，那么无线定位基站可以根据该无线定位基站与第一无线定位标签之间的无线信号通信，确定目标拍摄对象的位置，然后根据目标拍摄对象的位置和无线定位基站的位置，确定目标拍摄对象与该无线定位基站的相对方位，并将目标拍摄对象与该无线定位基站的相对方位发送给拍摄控制装置。同理，无线定位基站可以根据该无线定位基站与第二无线定位标签之间的无线信号通信，确定设置该第二无线定位标签的拍摄设备的位置，然后根据该拍摄设备的位置和无线定位基站的位置，确定该拍摄设备与该无线定位基站的相对方位，并将该拍摄设备与该无线定位基站的相对方位发送给拍摄控制装置。
S202,根据相对方位控制多个拍摄设备的拍摄姿态,以对目标拍摄对象进行跟踪拍摄。
在一种实现方式中,在拍摄设备安装在固定平台上的情况下,拍摄控制装置可以根据目标拍摄对象和某个拍摄设备的相对方位,控制该拍摄设备的拍摄方向,以对目标拍摄对象进行跟踪拍摄。例如,某个拍摄设备与目标拍摄对象的相对方位为:目标拍摄对象位于该拍摄设备的东北方向处,那么拍摄控制装置可以向该拍摄设备发送拍摄姿态调整指令,该拍摄姿态调整指令用于指示该拍摄设备转动至东北方向,该拍摄设备可以响应拍摄姿态调整指令转动至东北方向,然后进行拍摄操作,以实现对目标拍摄对象的跟踪拍摄。其中,拍摄设备的拍摄姿态可以为拍摄设备中的摄像头的姿态。
在一种实现方式中，在拍摄设备安装在可移动平台上的情况下，拍摄控制装置可以根据目标拍摄对象和某个拍摄设备的相对方位，以及该拍摄设备当前的拍摄方向，控制该拍摄设备的拍摄位置和/或拍摄方向，以对目标拍摄对象进行跟踪拍摄。例如，某个拍摄设备与目标拍摄对象的相对方位为：目标拍摄对象位于该拍摄设备的东北方向处，该拍摄设备当前的拍摄方向为朝向正北方向，那么拍摄控制装置可以向该拍摄设备发送拍摄姿态调整指令，该拍摄姿态调整指令用于指示该拍摄设备向右转动80°，该拍摄设备可以响应拍摄姿态调整指令向右转动80°，使得该拍摄设备朝向目标拍摄对象，然后进行拍摄操作，以实现对目标拍摄对象的跟踪拍摄。
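Assuming, for illustration only, that the posture adjustment is reduced to the horizontal (yaw) axis of a gimbal, the adjustment instruction discussed above can be sketched as the wrapped difference between the target bearing and the current shooting direction; positive values mean "turn right":
```python
def yaw_command(target_bearing_deg, current_heading_deg):
    """Signed yaw adjustment in degrees, wrapped to (-180, 180]."""
    delta = (target_bearing_deg - current_heading_deg) % 360.0
    return delta - 360.0 if delta > 180.0 else delta

print(yaw_command(45.0, 0.0))    # 45.0  -> turn right towards the north-east
print(yaw_command(350.0, 10.0))  # -20.0 -> turn left
```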
在本申请实施例中，在多个拍摄设备拍摄的过程中，可以根据第一无线定位标签与多个无线定位基站之间的无线信号通信，实时确定多个拍摄设备与目标拍摄对象的相对方位，然后根据实时确定得到的相对方位控制多个拍摄设备的拍摄姿态，以对目标拍摄对象进行跟踪拍摄，可高效且精准地实现多个拍摄设备对目标拍摄对象进行跟踪拍摄。
请参见图4,是本申请另一实施例提出的一种拍摄控制方法的流程示意图,如图4所示,该方法可包括:
S401,获取拍摄对象指示信息。
在该实施例中,多个拍摄设备所处环境中可以包括至少两个拍摄对象,各个拍摄对象上分别设置有第一无线定位标签。如果需要对至少两个拍摄对象中的一个或者多个特定的拍摄对象进行跟踪拍摄,拍摄控制装置可以获取拍摄对象指示信息,并根据拍摄对象指示信息,从至少两个拍摄对象中确定目标拍摄对象,目标拍摄对象即上述特定的拍摄对象。
其中,拍摄控制装置获取拍摄对象指示信息的方式可以包括如下任一种:
一、拍摄控制装置检测用户的拍摄对象选择操作,根据检测到的拍摄对象选择操作确定拍摄对象指示信息。
以图5A所示的拍摄控制装置的界面示意图为例,拍摄控制装置可以在显示屏幕中显示用户界面,用户界面包括对象标识列表,对象标识列表可以包括位于环境中的各个拍摄对象的对象标识,用户点击某个对象标识时,拍摄控制装置检测到用户的拍摄对象选择操作,进而根据拍摄对象选择操作确定拍摄对象指示信息。对象标识用于标识拍摄对象,对象标识可以包括拍摄对象的对象名称、画像或者外貌特征等。
以图5B所示的拍摄控制装置的界面示意图为例，拍摄控制装置可以在显示屏幕中显示用户界面，用户界面包括位于环境中的各个拍摄对象的对象标识，其中，各个拍摄对象的对象标识之间的相对方位与位于环境中的各个拍摄对象之间的相对方位匹配，用户点击某个对象标识时，拍摄控制装置检测到用户的拍摄对象选择操作，进而根据拍摄对象选择操作确定拍摄对象指示信息。本申请实施例中的对象标识可以包括预设图像，或者拍摄对象的社交应用头像或者人脸图像等。预设图像可以为水滴图像、星星图像或者音符图像等。
二、拍摄控制装置获取至少两个拍摄对象中的一个拍摄对象上设置的第一无线定位标签发送的拍摄对象指示信息。
例如,位于环境中的每个拍摄对象设置有第一无线定位标签,第一无线定位标签所属设备上存在一个功能按键,用户点击该功能按键,则该功能按键对应的第一无线定位标签可以将拍摄对象指示信息发送给拍摄控制装置。其中,功能按键可以为物理按键或者虚拟按键,具体不受本申请实施例的限定。
又如,位于环境中的每个拍摄对象设置有第一无线定位标签,第一无线定位标签所属设备上存在麦克风,用户通过某个设备的麦克风输入语音信息时,如果该语音信息包含预设关键字,则该设备所属的第一无线定位标签可以将拍摄对象指示信息发送给拍摄控制装置。其中,预设关键字可以为用于指示优先拍摄自身的关键字,例如“快拍我”或者“我准备好了”,等等,具体不受本申请实施例的限定。
S402,根据拍摄对象指示信息,从至少两个拍摄对象中确定目标拍摄对象。
拍摄控制装置获取拍摄对象指示信息之后,可以根据拍摄对象指示信息,从至少两个拍摄对象中确定目标拍摄对象。例如,用户在用户界面点击某个对象标识之后,拍摄控制装置检测到用户的拍摄对象选择操作,根据检测到的拍摄对象选择操作确定拍摄对象指示信息,进而可以根据拍摄对象指示信息,将用户点击的对象标识对应的拍摄对象确定为目标拍摄对象。又如,某个第一无线定位标签向拍摄控制装置发送拍摄对象指示信息之后,拍摄控制装置可以将发送该拍摄对象指示信息的第一无线定位标签对应的拍摄对象确定为目标拍摄对象。
S403,在多个拍摄设备拍摄的过程中,根据目标拍摄对象上设置的第一无线定位标签与多个无线定位基站之间的无线信号通信,确定多个拍摄设备与所述目标拍摄对象的相对方位。
本申请实施例中的步骤S403具体可以参见上述实施例中步骤S201的描述,本申请实施例不再赘述。
S404,根据相对方位控制多个拍摄设备的拍摄姿态,以对目标拍摄对象进行跟踪拍摄。
本申请实施例中的步骤S404具体可以参见上述实施例中步骤S202的描述,本申请实施例不再赘述。
在本申请实施例中，在存在至少两个拍摄对象的场景下，可以根据拍摄对象指示信息，从至少两个拍摄对象中确定目标拍摄对象，然后在多个拍摄设备拍摄的过程中，根据目标拍摄对象上设置的第一无线定位标签与多个无线定位基站之间的无线信号通信，确定多个拍摄设备与所述目标拍摄对象的相对方位，根据相对方位控制多个拍摄设备的拍摄姿态，以对目标拍摄对象进行跟踪拍摄，可高效且精准地实现多个拍摄设备对目标拍摄对象进行跟踪拍摄。
请参见图6,是本发明另一实施例提出的一种拍摄控制方法的流程示意图,如图6所示,该方法可包括:
S601,当拍摄对象的数量为一个时,将拍摄对象确定为目标拍摄对象。
在该实施例中,如果多个拍摄设备所处环境中仅存在一个拍摄对象,那么拍摄控制装置可以直接将该拍摄对象确定为目标拍摄对象。其中,该拍摄对象上设置了第一无线定位标签。
S602,当拍摄对象的数量为多个时,为多个拍摄设备中的每一个拍摄设备从多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象。
在该实施例中,如果多个拍摄设备所处环境中存在多个拍摄对象,那么拍摄控制装置可以为多个拍摄设备中的每一个拍摄设备从多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象。
其中,拍摄控制装置为多个拍摄设备中的每一个拍摄设备从多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象的方式可以包括如下任一种:
一、拍摄控制装置根据第一无线定位标签与多个无线定位基站之间的无线信号通信,确定多个拍摄设备与各个拍摄对象的相对距离,为多个拍摄设备中的每一个拍摄设备从多个拍摄对象中确定一个相对距离最小的待跟踪拍摄的目标拍摄对象。
以图7所示的拍摄控制系统的场景示意图为例,该拍摄控制系统至少包括三个拍摄设备以及三个拍摄对象,三个拍摄对象分别设置有第一无线定位标签。针对任一拍摄设备,拍摄控制装置根据各个拍摄对象上设置的第一无线定位标签与多个无线定位基站之间的无线信号通信,确定该拍摄设备与各个拍摄对象的相对距离,然后将相对距离最小的拍摄对象确定为该拍摄设备的目标拍摄对象,进而根据目标拍摄对象上设置的第一无线定位标签与多个无线定位基站之间的无线信号通信,确定该拍摄设备与目标拍摄对象的相对方位,根据该拍摄设备与目标拍摄对象之间的相对方位控制该拍摄设备的拍摄姿态,以对目标拍摄对象进行跟踪拍摄。
本申请实施例将与拍摄设备的相对距离最小的拍摄对象确定为该拍摄设备的目标拍摄对象,进而控制该拍摄设备对该目标拍摄对象进行跟踪拍摄。通过与目标拍摄对象的相对距离最小的拍摄设备对该目标拍摄对象进行跟踪拍摄,可确保该目标拍摄对象不会被跟丢,还可以提高目标拍摄对象在跟踪拍摄得到的图像中的清晰度。本申请实施例也可以实现在拍摄设备与拍摄对象距离较远时停止对该拍摄对象的跟踪拍摄。
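A compact sketch of the "smallest relative distance" assignment just described (device and tag identifiers, coordinates and the helper name are assumptions, not part of the original disclosure): each shooting device simply picks the tagged object it is closest to.
```python
import math

def assign_nearest_objects(devices, objects):
    """devices / objects: {identifier: (x, y)}; returns {device_id: object_id}
    pairing each shooting device with its closest tagged object."""
    return {
        dev: min(objects, key=lambda obj: math.dist(devices[dev], objects[obj]))
        for dev in devices
    }

devices = {"cam_1": (0, 0), "cam_2": (40, 0), "cam_3": (20, 25)}
objects = {"tag_A": (5, 3), "tag_B": (35, 4), "tag_C": (21, 20)}
print(assign_nearest_objects(devices, objects))
# {'cam_1': 'tag_A', 'cam_2': 'tag_B', 'cam_3': 'tag_C'}
```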
二、拍摄控制装置根据多个拍摄对象上设置的第一无线定位标签与多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与多个拍摄对象的相对方位,根据确定的相对方位,确定每一个拍摄对象在每一个拍摄设备的拍摄画面中的位置,为多个拍摄设备中的每一个拍摄设备从多个拍摄对象中确定一个与拍摄设备的拍摄画面的中心点之间的距离最小的待跟踪拍摄的目标拍摄对象。
在该实施例中，针对任一拍摄设备，拍摄控制装置根据各个拍摄对象上设置的第一无线定位标签与多个无线定位基站之间的无线信号通信，确定该拍摄设备与各个拍摄对象的相对方位，然后根据确定的相对方位，确定每一个拍摄对象在该拍摄设备的拍摄画面中的位置，并将与该拍摄设备的拍摄画面的中心点之间的距离最小的拍摄对象确定为该拍摄设备的目标拍摄对象，进而根据目标拍摄对象上设置的第一无线定位标签与多个无线定位基站之间的无线信号通信，确定该拍摄设备与目标拍摄对象的相对方位，根据该拍摄设备与目标拍摄对象之间的相对方位控制该拍摄设备的拍摄姿态，以对目标拍摄对象进行跟踪拍摄。
本申请实施例将与拍摄设备的拍摄画面的中心点之间的距离最小的拍摄对象确定为该拍摄设备的目标拍摄对象,进而控制该拍摄设备对该目标拍摄对象进行跟踪拍摄,可确保该目标拍摄对象不会被跟丢,还可以提高目标拍摄对象在跟踪拍摄得到的图像中的清晰度和完整度。
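The "closest to the centre of the shooting picture" criterion above can be approximated, purely as a sketch (camera model, identifiers and names are assumptions), by the horizontal angle between each object's bearing and the device's current optical axis; the smaller that angle, the closer the object sits to the picture centre.
```python
import math

def off_axis_angle(camera_xy, camera_heading_deg, object_xy):
    """Horizontal angle (degrees) between the camera's optical axis and the
    direction towards an object; small values mean 'near the picture centre'."""
    east = object_xy[0] - camera_xy[0]
    north = object_xy[1] - camera_xy[1]
    bearing = math.degrees(math.atan2(east, north))
    return abs((bearing - camera_heading_deg + 180.0) % 360.0 - 180.0)

def pick_centre_most(camera_xy, camera_heading_deg, objects):
    """objects: {object_id: (x, y)}; returns the id nearest the picture centre."""
    return min(objects,
               key=lambda o: off_axis_angle(camera_xy, camera_heading_deg, objects[o]))

objects = {"tag_A": (10, 30), "tag_B": (-20, 15)}
print(pick_centre_most((0, 0), 0.0, objects))  # 'tag_A' (about 18 deg off-axis)
```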
三、拍摄控制装置检测用户的拍摄对象选择操作,根据检测到的拍摄对象选择操作为多个拍摄设备中的每一个拍摄设备从多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象。
在该实施例中，针对任一拍摄设备，拍摄控制装置可以检测用户的拍摄对象选择操作，根据检测到的拍摄对象选择操作确定该拍摄设备的目标拍摄对象，进而根据目标拍摄对象上设置的第一无线定位标签与多个无线定位基站之间的无线信号通信，确定该拍摄设备与目标拍摄对象的相对方位，根据该拍摄设备与目标拍摄对象之间的相对方位控制该拍摄设备的拍摄姿态，以对目标拍摄对象进行跟踪拍摄。
例如,拍摄控制装置可以在显示屏幕中显示用户界面,用户界面包括位于环境中的各个拍摄设备的设备标识,用户点击某个设备标识时,拍摄控制装置检测到用户的拍摄设备选择操作,进而根据拍摄设备选择操作确定用户选择的拍摄设备,并在用户界面中显示位于环境中的各个拍摄对象的对象标识,用户点击某个对象标识时,拍摄控制装置检测到用户的拍摄对象选择操作,进而根据拍摄对象选择操作确定该拍摄设备的目标拍摄对象。
又如,拍摄控制装置可以在显示屏幕中显示用户界面,用户界面包括位于环境中的各个拍摄对象的对象标识,用户点击某个对象标识时,拍摄控制装置检测到用户的拍摄对象选择操作,进而根据拍摄对象选择操作确定用户选择的拍摄对象,并在用户界面中显示位于环境中的各个拍摄设备的设备标识,用户点击某个设备标识时,拍摄控制装置检测到用户的拍摄设备选择操作,进而根据拍摄设备选择操作确定用户选择的拍摄设备,并将上述确定的拍摄对象作为该拍摄设备的目标拍摄对象。
又如,拍摄控制装置可以检测是否在预设时间段内接收到任一拍摄对象上设置的第一无线定位标签发送的拍摄对象指示信息,以及任一拍摄设备发送的拍摄设备指示信息,若是,则将发送拍摄对象指示信息的第一无线定位标签对应的拍摄对象确定为发送拍摄设备指示信息的拍摄设备的目标拍摄对象。
本申请实施例中,可以由用户决定由哪个拍摄设备对哪个拍摄对象进行跟踪拍摄,确保跟踪拍摄的拍摄设备和拍摄对象与用户需求匹配。
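The last example above pairs a "shoot me" request from a tag with a device request received inside the same time window. A minimal sketch under assumed event formats, identifiers and window length (none of these details appear in the original text):
```python
def pair_within_window(tag_events, device_events, window_s=5.0):
    """tag_events / device_events: lists of (timestamp_s, identifier).
    Pairs each device request with the most recent tag request that arrived no
    more than `window_s` seconds earlier; returns {device_id: object_id}."""
    pairs = {}
    for dev_time, dev_id in device_events:
        candidates = [(t, obj) for t, obj in tag_events
                      if 0.0 <= dev_time - t <= window_s]
        if candidates:
            pairs[dev_id] = max(candidates)[1]  # latest matching tag event
    return pairs

tag_events = [(100.0, "tag_A"), (130.0, "tag_B")]
device_events = [(103.0, "cam_1"), (160.0, "cam_2")]
print(pair_within_window(tag_events, device_events))  # {'cam_1': 'tag_A'}
```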
S603,根据待跟踪拍摄的目标拍摄对象上设置的第一无线定位标签与多个无线定位基站之间的无线信号通信,确定拍摄设备与待跟踪拍摄的目标拍摄对象的相对方位。
本申请实施例中拍摄控制装置确定拍摄设备与该拍摄设备的目标拍摄对象的相对方位的方式可参见上述实施例中步骤S201的具体描述,本申请实施例不再赘述。
S604，根据拍摄设备与待跟踪拍摄的目标拍摄对象之间的相对方位控制拍摄设备的拍摄姿态，以对待跟踪拍摄的目标拍摄对象进行跟踪拍摄。
本申请实施例中拍摄控制装置根据拍摄设备与该拍摄设备的目标拍摄对象的相对方位控制该拍摄设备的拍摄姿态的方式可参见上述实施例中步骤S202的具体描述,本申请实施例不再赘述。
在本申请实施例中,当拍摄对象的数量为多个时,为多个拍摄设备中的每一个拍摄设备从多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象,根据待跟踪拍摄的目标拍摄对象上设置的第一无线定位标签与多个无线定位基站之间的无线信号通信,确定拍摄设备与待跟踪拍摄的目标拍摄对象的相对方位,根据拍摄设备与待跟踪拍摄的目标拍摄对象之间的相对方位控制拍摄设备的拍摄姿态,以对待跟踪拍摄的目标拍摄对象进行跟踪拍摄,可最大化利用每一个拍摄设备对拍摄对象进行跟踪拍摄。
上述实施例可以相互组合,得到新的拍摄控制方法,例如,拍摄控制装置在多个拍摄设备拍摄的过程中,根据第一无线定位标签与多个无线定位基站之间的无线信号通信,确定多个拍摄设备与目标拍摄对象的相对方位之后,可以从多个拍摄设备中选取与目标拍摄对象的相对方位小于预设阈值的拍摄设备,然后根据选取的拍摄设备与目标拍摄对象的相对方位控制选取的拍摄设备的拍摄姿态,以对目标拍摄对象进行跟踪拍摄。示例性的,与目标拍摄对象的相对方位小于预设阈值的拍摄设备可以为:与目标拍摄对象的距离小于预设距离阈值的拍摄设备,和/或与目标拍摄对象的相对方向小于预设角度阈值的拍摄设备。
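The device-selection step described in this paragraph ("relative orientation below a preset threshold") can be sketched as a simple filter over distance and angular offset; the thresholds, state layout and names below are illustrative assumptions only.
```python
import math

def select_devices(device_states, target_xy, max_dist_m=50.0, max_angle_deg=60.0):
    """device_states: {device_id: ((x, y), heading_deg)}.
    Keeps devices whose distance to the target and whose angular offset from
    their current shooting direction are both below the thresholds."""
    selected = []
    for dev_id, ((x, y), heading) in device_states.items():
        east, north = target_xy[0] - x, target_xy[1] - y
        distance = math.hypot(east, north)
        bearing = math.degrees(math.atan2(east, north))
        offset = abs((bearing - heading + 180.0) % 360.0 - 180.0)
        if distance <= max_dist_m and offset <= max_angle_deg:
            selected.append(dev_id)
    return selected

states = {"cam_1": ((0, 0), 30.0), "cam_2": ((100, 0), 0.0)}
print(select_devices(states, target_xy=(20, 20)))  # ['cam_1']
```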
在上述实施例中,拍摄控制装置可以根据第一无线定位标签与多个无线定位基站之间的无线信号通信,确定多个拍摄设备与目标拍摄对象的相对方位,也可以将目标拍摄对象上设置的定位模块获取到的定位数据,以及第一无线定位标签发送至多个无线定位基站的信号相融合,得到多个拍摄设备与目标拍摄对象的相对方位,以提高定位精度。
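The fusion mentioned above (combining the tag/base-station fix with data from a positioning module on the target) can be sketched, under the assumption that both sources give independent 2D fixes with known standard deviations, as an inverse-variance weighted average; the numbers are invented for illustration.
```python
def fuse_positions(uwb_xy, uwb_std_m, module_xy, module_std_m):
    """Inverse-variance weighted fusion of two independent 2D position fixes,
    e.g. a UWB multilateration result and an on-board positioning module."""
    w_uwb, w_mod = 1.0 / uwb_std_m ** 2, 1.0 / module_std_m ** 2
    return tuple((w_uwb * u + w_mod * m) / (w_uwb + w_mod)
                 for u, m in zip(uwb_xy, module_xy))

# A tight UWB fix dominates a looser module fix.
print(fuse_positions((12.0, 7.0), 0.3, (12.6, 6.4), 1.5))  # ~(12.02, 6.98)
```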
基于上述实施例的描述,请参见图8,是本申请另一实施例提出的一种拍摄控制方法,该方法包括:
S801,在多个拍摄设备拍摄的过程中,根据第一无线定位标签与多个无线定位基站之间的无线信号通信,确定多个拍摄设备与目标拍摄对象的相对方位。
本申请实施例中的步骤S801具体可以参见上述实施例中步骤S201的描述,本申请实施例不再赘述。
S802,根据相对方位控制多个拍摄设备的拍摄姿态,以对目标拍摄对象进行跟踪拍摄。
本申请实施例中的步骤S802具体可以参见上述实施例中步骤S202的描述，本申请实施例不再赘述。
S803,根据相对方位和多个拍摄设备的拍摄姿态,确定目标拍摄对象在拍摄设备的拍摄画面中的位置。
拍摄控制装置控制拍摄设备对目标拍摄对象进行跟踪拍摄之后,可以根据拍摄设备和目标拍摄对象的相对方位,以及拍摄设备拍摄目标拍摄对象时的拍摄姿态,确定目标拍摄对象在拍摄设备的拍摄画面中的位置。
例如，拍摄设备与目标拍摄对象的相对方位为：目标拍摄对象位于该拍摄设备的东北方向30m处，拍摄设备拍摄目标拍摄对象时的拍摄姿态为朝向正北方向，那么拍摄控制装置可以确定目标拍摄对象在拍摄设备的拍摄画面中的位置为拍摄画面的右上方。
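The "upper part / right part of the picture" reasoning above can be made concrete with a rough angle-to-pixel mapping. The field of view, resolution and the angle-proportional projection used below are assumptions made purely for illustration; they are not specified by the original text.
```python
def position_in_picture(rel_bearing_deg, rel_elevation_deg,
                        cam_yaw_deg, cam_pitch_deg,
                        hfov_deg=60.0, vfov_deg=40.0, width=1920, height=1080):
    """Approximate pixel position of the target in the current picture, using an
    angle-proportional mapping; returns None if the target is outside the FOV."""
    d_yaw = (rel_bearing_deg - cam_yaw_deg + 180.0) % 360.0 - 180.0
    d_pitch = rel_elevation_deg - cam_pitch_deg
    if abs(d_yaw) > hfov_deg / 2 or abs(d_pitch) > vfov_deg / 2:
        return None
    u = width / 2 + d_yaw / (hfov_deg / 2) * (width / 2)
    v = height / 2 - d_pitch / (vfov_deg / 2) * (height / 2)
    return u, v

# Target bearing 45 deg (north-east) and slightly above the camera; camera yawed
# 20 deg east of north with level pitch -> the target appears to the upper right.
print(position_in_picture(45.0, 5.0, 20.0, 0.0))  # (~1760, ~405)
```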
S804,根据确定的位置对多个拍摄设备拍摄得到的图像进行剪辑,以获取目标拍摄对象的图像区域。
具体实现中,拍摄控制装置可以通过手动剪辑或者自动剪辑的方式,根据确定的位置对多个拍摄设备拍摄得到的图像进行剪辑,以获取目标拍摄对象的图像区域。
在一种实现方式中,拍摄控制装置可以根据确定的位置在拍摄设备拍摄得到的图像中确定初始剪辑图像区域,然后针对初始剪辑图像区域运行预设的识别模型算法,以确定图像中的目标拍摄对象的图像区域,并将目标拍摄对象的图像区域从图像中剪辑出来。
例如,拍摄控制装置确定目标拍摄对象在拍摄设备的拍摄画面中的位置为拍摄画面的右上方,那么拍摄控制装置确定的初始剪辑图像区域可以为图像中位于右上方的区域。假设目标拍摄对象为人,那么预设的识别模型算法用于识别人体;假设目标拍摄对象为无人飞行器,那么预设的识别模型算法用于识别无人飞行器,拍摄控制装置可以针对图像中位于右上方的区域运行预设的识别模型算法,以确定图像中的目标拍摄对象的图像区域,使得图像中的目标拍摄对象全部位于确定得到的目标拍摄对象的图像区域内。
本申请实施例可实现对目标拍摄对象的自动剪辑。
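A sketch of the automatic clipping flow described above: an initial clipping region is taken around the predicted pixel position, and a recognition model (supplied by the caller; no specific model is implied by the original text) refines it to the target's image area. The array layout, box format and names are assumptions.
```python
import numpy as np

def initial_clip_box(u, v, width, height, clip_w=640, clip_h=480):
    """Axis-aligned initial clipping window centred on (u, v), clamped to the image."""
    left = int(max(0, min(u - clip_w / 2, width - clip_w)))
    top = int(max(0, min(v - clip_h / 2, height - clip_h)))
    return left, top, left + clip_w, top + clip_h

def clip_target_region(image, box, detector):
    """Run `detector` only inside the initial box and return the clipped target
    region; falls back to the coarse box if nothing is detected.
    `detector(roi)` is assumed to return [(x1, y1, x2, y2, score), ...]."""
    left, top, right, bottom = box
    roi = image[top:bottom, left:right]
    detections = detector(roi)
    if not detections:
        return roi
    x1, y1, x2, y2, _ = max(detections, key=lambda d: d[4])
    return roi[int(y1):int(y2), int(x1):int(x2)]

# Toy usage with a dummy frame and a stand-in detector.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
box = initial_clip_box(1760, 405, 1920, 1080)
print(box)  # (1280, 165, 1920, 645)
print(clip_target_region(frame, box, detector=lambda roi: [(10, 20, 110, 220, 0.9)]).shape)
# (200, 100, 3)
```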
在一种实现方式中,当拍摄控制装置处于自动剪辑模式时,可以根据确定的位置对多个拍摄设备拍摄得到的图像进行剪辑,以获取目标拍摄对象的图像区域。
在一种实现方式中,当拍摄控制装置处于手动剪辑模式时,可以根据确定的位置在多个拍摄设备的拍摄画面中显示剪辑提示标识,检测用户响应剪辑提示标识对用户对应拍摄设备的图像区域选择操作,根据检测到的图像区域选择操作确定拍摄设备拍摄得到的图像中的目标拍摄对象的图像区域,将目标拍摄对象的图像区域从图像中剪辑出来。
在该实施例中,针对任一拍摄设备,拍摄控制装置可以根据确定的位置在该拍摄设备的拍摄画面中显示剪辑提示标识,检测用户响应剪辑提示标识对用户对应拍摄设备的图像区域选择操作,根据检测到的图像区域选择操作确定拍摄设备拍摄得到的图像中的目标拍摄对象的图像区域,将目标拍摄对象的图像区域从图像中剪辑出来。
在本申请实施例中,在多个拍摄设备拍摄的过程中,根据第一无线定位标签与多个无线定位基站之间的无线信号通信,确定多个拍摄设备与目标拍摄对象的相对方位,根据相对方位控制多个拍摄设备的拍摄姿态,以对目标拍摄对象进行跟踪拍摄,根据相对方位和多个拍摄设备的拍摄姿态,确定目标拍摄对象在拍摄设备的拍摄画面中的位置,根据确定的位置对多个拍摄设备拍摄得到的图像进行剪辑,以获取目标拍摄对象的图像区域,可提高图像剪辑的准确性和有效性。
本申请实施例提供了一种拍摄控制装置，应用于拍摄控制系统中，该拍摄控制系统包括多个用于对环境进行拍摄的拍摄设备，所述环境中设置多个无线定位基站，所述环境中的目标拍摄对象上设置第一无线定位标签，图9是本申请实施例提供的拍摄控制装置的结构图，如图9所示，所述拍摄控制装置包括存储器901和处理器902，其中，存储器901中存储有程序代码，处理器902调用存储器中的程序代码，当程序代码被执行时，处理器902执行如下操作：
在所述多个拍摄设备拍摄的过程中,根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述目标拍摄对象的相对方位;
根据所述相对方位控制所述多个拍摄设备的拍摄姿态,以对所述目标拍摄对象进行跟踪拍摄。
在一个实施例中,所述处理器902在根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述目标拍摄对象的相对方位时,具体用于执行如下操作:
根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定目标拍摄对象的位置;
确定各个拍摄设备的位置;
根据所述目标拍摄对象的位置和各个所述拍摄设备的位置,确定各个所述拍摄设备与所述目标拍摄对象的相对方位。
在一个实施例中,各个所述拍摄设备上设置第二无线定位标签;
所述处理器902在确定各个拍摄设备的位置时,具体用于执行如下操作:
根据各个所述拍摄设备上设置的第二无线定位标签和所述多个无线定位基站之间的无线信号通信,确定各个所述拍摄设备的位置。
在一个实施例中,所述处理器902在确定各个拍摄设备的位置时,具体用于执行如下操作:
在预设存储器中获取各个所述拍摄设备的位置,其中,各个所述无线定位基站和各个所述拍摄设备的相对位置固定。
在一个实施例中,所述环境中包括至少两个拍摄对象,其中,各个所述拍摄对象上设置第一无线定位标签,所述处理器902还用于执行如下操作:
获取拍摄对象指示信息;
根据所述拍摄对象指示信息,从所述至少两个拍摄对象中确定所述目标拍摄对象。
在一个实施例中,所述处理器902在获取拍摄对象指示信息时,具体用于执行如下操作:
检测用户的拍摄对象选择操作,根据检测到的拍摄对象选择操作确定所述拍摄对象指示信息。
在一个实施例中,所述处理器902在获取拍摄对象指示信息时,具体用于执行如下操作:
获取所述至少两个拍摄对象中的一个拍摄对象上设置的第一无线定位标签发送的拍摄对象指示信息;
所述处理器在根据所述拍摄对象指示信息,从所述至少两个拍摄对象中确定所述目标拍摄对象时,具体用于执行如下操作:
将发送所述拍摄对象指示信息的第一无线定位标签对应的拍摄对象确定为所述目标拍摄对象。
在一个实施例中，所述环境中包括拍摄对象，其中，所述拍摄对象上设置第一无线定位标签；
所述处理器902还用于执行如下操作:
当所述拍摄对象的数量为一个时,将所述拍摄对象确定为所述目标拍摄对象;
当所述拍摄对象的数量为多个时,为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象;
根据所述待跟踪拍摄的目标拍摄对象上设置的第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述待跟踪拍摄的目标拍摄对象的相对方位;
根据所述每一个拍摄设备与所述确定的一个待跟踪拍摄的目标拍摄对象之间的相对方位控制所述拍摄设备的拍摄姿态,以对所述待跟踪拍摄的目标拍摄对象进行跟踪拍摄。
在一个实施例中,所述处理器902在为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象时,具体用于执行如下操作:
根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与各个所述拍摄对象的相对距离;
为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个相对距离最小的待跟踪拍摄的目标拍摄对象。
在一个实施例中,所述处理器902在为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象时,具体用于执行如下操作:
根据所述多个拍摄对象上设置的第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述多个拍摄对象的相对方位;
根据所述确定的相对方位,确定每一个拍摄对象在每一个拍摄设备的拍摄画面中的位置;
为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个与所述拍摄设备的拍摄画面的中心点之间的距离最小的待跟踪拍摄的目标拍摄对象。
在一个实施例中,所述处理器902在为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象时,具体用于执行如下操作:
检测用户的拍摄对象选择操作;
根据检测到的拍摄对象选择操作为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象。
在一个实施例中,所述处理器902还用于执行如下操作:
根据所述相对方位和所述多个拍摄设备的拍摄姿态,确定所述目标拍摄对象在所述拍摄设备的拍摄画面中的位置;
根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域。
在一个实施例中,所述处理器902在根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域时,具体用于执行如下操作:
根据所述确定的位置在所述图像中确定初始剪辑图像区域;
针对所述初始剪辑图像区域运行预设的识别模型算法，以确定所述图像中的目标拍摄对象的图像区域，并将所述目标拍摄对象的图像区域从所述图像中剪辑出来。
在一个实施例中,所述处理器902在根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域时,具体用于执行如下操作:
当处于自动剪辑模式时,根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域。
在一个实施例中,所述处理器902还用于执行如下操作:
当处于手动剪辑模式时,根据所述确定的位置在所述多个拍摄设备的拍摄画面中显示剪辑提示标识;
检测用户响应所述剪辑提示标识对所述用户对应拍摄设备的图像区域选择操作;
根据检测到的图像区域选择操作确定所述拍摄设备拍摄得到的图像中的目标拍摄对象的图像区域;
将所述目标拍摄对象的图像区域从所述图像中剪辑出来。
本实施例提供的拍摄控制装置能执行前述实施例提供的如图2至图8所示的拍摄控制方法,且执行方式和有益效果类似,在这里不再赘述。
本申请实施例提供一种拍摄控制系统,包括多个用于对环境进行拍摄的拍摄设备;设置在所述环境中的多个无线定位基站;设置在所述环境中的目标拍摄对象上的第一无线定位标签,所述第一无线定位标签和所述多个无线定位基站之间无线连接;如前所述的拍摄控制装置,所述拍摄控制装置分别和所述多个拍摄设备以及所述多个无线定位基站之间通信连接。拍摄控制装置工作与前述相同或类似,此处不再赘述。
在一个实施例中,各个所述无线定位基站和各个所述拍摄设备的相对位置可变,各个所述拍摄设备和所述多个无线定位基站之间无线连接。
在一个实施例中,各个所述无线定位基站和各个所述拍摄设备的相对位置固定。
在一个实施例中,所述拍摄控制装置配置有输入设备,所述输入设备用于获取拍摄对象指示信息。
在一个实施例中,所述第一无线定位标签配置有输入设备,所述输入设备用于获取拍摄对象指示信息。
在一个实施例中,所述拍摄设备配置有显示屏幕和输入设备,所述显示屏幕用于显示拍摄画面,所述输入设备用于检测图像区域选择操作。
本申请实施例还提供一种计算机存储介质,所述计算机存储介质中存储有计算机程序指令,所述计算机程序指令被处理器执行时,用于执行如图2至图8所示的拍摄控制方法。
可以理解,以上所揭露的仅为本申请实施例的部分实施例而已,当然不能以此来限定本发明之权利范围,本领域普通技术人员可以理解实现上述实施例的全部或部分流程,并依本发明权利要求所作的等同变化,仍属于发明所涵盖的范围。

Claims (37)

  1. 一种拍摄控制方法,应用于拍摄控制系统,其特征在于,所述拍摄控制系统包括多个用于对环境进行拍摄的拍摄设备,所述环境中设置多个无线定位基站,所述环境中的目标拍摄对象上设置第一无线定位标签,其中,所述方法包括:
    在所述多个拍摄设备拍摄的过程中,根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述目标拍摄对象的相对方位;
    根据所述相对方位控制所述多个拍摄设备的拍摄姿态,以对所述目标拍摄对象进行跟踪拍摄。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述目标拍摄对象的相对方位,包括:
    根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定目标拍摄对象的位置;
    确定各个拍摄设备的位置;
    根据所述目标拍摄对象的位置和各个所述拍摄设备的位置,确定各个所述拍摄设备与所述目标拍摄对象的相对方位。
  3. 根据权利要求2所述的方法,其特征在于,各个所述拍摄设备上设置第二无线定位标签;
    所述确定各个拍摄设备的位置,包括:
    根据各个所述拍摄设备上设置的第二无线定位标签和所述多个无线定位基站之间的无线信号通信,确定各个所述拍摄设备的位置。
  4. 根据权利要求2所述的方法,其特征在于,所述确定各个拍摄设备的位置,包括:
    在预设存储器中获取各个所述拍摄设备的位置,其中,各个所述无线定位基站和各个所述拍摄设备的相对位置固定。
  5. 根据权利要求1-4任一项所述的方法,其特征在于,所述环境中包括至少两个拍摄对象,其中,各个所述拍摄对象上设置第一无线定位标签,所述方法还包括:
    获取拍摄对象指示信息;
    根据所述拍摄对象指示信息,从所述至少两个拍摄对象中确定所述目标拍摄对象。
  6. 根据权利要求5所述的方法,其特征在于,所述获取拍摄对象指示信息,包括:
    检测用户的拍摄对象选择操作,根据检测到的拍摄对象选择操作确定所述拍摄对象指示信息。
  7. 根据权利要求5所述的方法,其特征在于,所述获取拍摄对象指示信息,包括:
    获取所述至少两个拍摄对象中的一个拍摄对象上设置的第一无线定位标签发送的拍摄对象指示信息;
    所述根据所述拍摄对象指示信息,从所述至少两个拍摄对象中确定所述目标拍摄对象,包括:
    将发送所述拍摄对象指示信息的第一无线定位标签对应的拍摄对象确定为所述目标拍摄对象。
  8. 根据权利要求1-4任一项所述的方法,其特征在于,所述环境中包括拍摄对象,其中,所述拍摄对象上设置第一无线定位标签;
    所述方法还包括:
    当所述拍摄对象的数量为一个时,将所述拍摄对象确定为所述目标拍摄对象;
    当所述拍摄对象的数量为多个时,为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象;
    根据所述待跟踪拍摄的目标拍摄对象上设置的第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述待跟踪拍摄的目标拍摄对象的相对方位;
    根据所述每一个拍摄设备与所述确定的一个待跟踪拍摄的目标拍摄对象之间的相对方位控制所述拍摄设备的拍摄姿态,以对所述待跟踪拍摄的目标拍摄对象进行跟踪拍摄。
  9. 根据权利要求8所述的方法,其特征在于,所述为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象,包括:
    根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与各个所述拍摄对象的相对距离;
    为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个相对距离最小的待跟踪拍摄的目标拍摄对象。
  10. 根据权利要求8所述的方法,其特征在于,所述为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象,包括:
    根据所述多个拍摄对象上设置的第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述多个拍摄对象的相对方位;
    根据所述确定的相对方位,确定每一个拍摄对象在每一个拍摄设备的拍摄画面中的位置;
    为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个与所述拍摄设备的拍摄画面的中心点之间的距离最小的待跟踪拍摄的目标拍摄对象。
  11. 根据权利要求8所述的方法,其特征在于,所述为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象,包括:
    检测用户的拍摄对象选择操作;
    根据检测到的拍摄对象选择操作为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象。
  12. 根据权利要求1-11任一项所述的方法,其特征在于,所述方法还包括:
    根据所述相对方位和所述多个拍摄设备的拍摄姿态,确定所述目标拍摄对象在所述拍摄设备的拍摄画面中的位置;
    根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域。
  13. 根据权利要求12所述的方法,其特征在于,所述根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域,包括:
    根据所述确定的位置在所述图像中确定初始剪辑图像区域;
    针对所述初始剪辑图像区域运行预设的识别模型算法,以确定所述图像中的目标拍摄对象的图像区域,并将所述目标拍摄对象的图像区域从所述图像中剪辑出来。
  14. 根据权利要求12所述的方法,其特征在于,所述根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域,包括:
    当处于自动剪辑模式时,根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域。
  15. 根据权利要求12所述的方法,其特征在于,所述方法还包括:
    当处于手动剪辑模式时,根据所述确定的位置在所述多个拍摄设备的拍摄画面中显示剪辑提示标识;
    检测用户响应所述剪辑提示标识对所述用户对应拍摄设备的图像区域选择操作;
    根据检测到的图像区域选择操作确定所述拍摄设备拍摄得到的图像中的目标拍摄对象的图像区域;
    将所述目标拍摄对象的图像区域从所述图像中剪辑出来。
  16. 一种拍摄控制装置,其特征在于,所述拍摄控制装置应用于拍摄控制系统,所述拍摄控制系统包括多个用于对环境进行拍摄的拍摄设备,所述环境中设置多个无线定位基站,所述环境中的目标拍摄对象上设置第一无线定位标签,所述拍摄控制装置包括存储器和处理器,其中,
    所述存储器,用于存储有程序代码;
    所述处理器,调用存储器中的程序代码,当程序代码被执行时,用于执行如下操作:
    在所述多个拍摄设备拍摄的过程中,根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述目标拍摄对象的相对方位;
    根据所述相对方位控制所述多个拍摄设备的拍摄姿态，以对所述目标拍摄对象进行跟踪拍摄。
  17. 根据权利要求16所述的装置,其特征在于,所述处理器在根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述目标拍摄对象的相对方位时,具体用于执行如下操作:
    根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定目标拍摄对象的位置;
    确定各个拍摄设备的位置;
    根据所述目标拍摄对象的位置和各个所述拍摄设备的位置,确定各个所述拍摄设备与所述目标拍摄对象的相对方位。
  18. 根据权利要求17所述的装置,其特征在于,各个所述拍摄设备上设置第二无线定位标签;
    所述处理器在确定各个拍摄设备的位置时,具体用于执行如下操作:
    根据各个所述拍摄设备上设置的第二无线定位标签和所述多个无线定位基站之间的无线信号通信,确定各个所述拍摄设备的位置。
  19. 根据权利要求17所述的装置,其特征在于,所述处理器在确定各个拍摄设备的位置时,具体用于执行如下操作:
    在预设存储器中获取各个所述拍摄设备的位置,其中,各个所述无线定位基站和各个所述拍摄设备的相对位置固定。
  20. 根据权利要求16-19任一项所述的装置,其特征在于,所述环境中包括至少两个拍摄对象,其中,各个所述拍摄对象上设置第一无线定位标签,所述处理器还用于执行如下操作:
    获取拍摄对象指示信息;
    根据所述拍摄对象指示信息,从所述至少两个拍摄对象中确定所述目标拍摄对象。
  21. 根据权利要求20所述的装置,其特征在于,所述处理器在获取拍摄对象指示信息时,具体用于执行如下操作:
    检测用户的拍摄对象选择操作,根据检测到的拍摄对象选择操作确定所述拍摄对象指示信息。
  22. 根据权利要求20所述的装置,其特征在于,所述处理器在获取拍摄对象指示信息时,具体用于执行如下操作:
    获取所述至少两个拍摄对象中的一个拍摄对象上设置的第一无线定位标签发送的拍摄对象指示信息;
    所述处理器在根据所述拍摄对象指示信息，从所述至少两个拍摄对象中确定所述目标拍摄对象时，具体用于执行如下操作：
    将发送所述拍摄对象指示信息的第一无线定位标签对应的拍摄对象确定为所述目标拍摄对象。
  23. 根据权利要求16-19任一项所述的装置,其特征在于,所述环境中包括拍摄对象,其中,所述拍摄对象上设置第一无线定位标签;
    所述处理器还用于执行如下操作:
    当所述拍摄对象的数量为一个时,将所述拍摄对象确定为所述目标拍摄对象;
    当所述拍摄对象的数量为多个时,为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象;
    根据所述待跟踪拍摄的目标拍摄对象上设置的第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述待跟踪拍摄的目标拍摄对象的相对方位;
    根据所述每一个拍摄设备与所述确定的一个待跟踪拍摄的目标拍摄对象之间的相对方位控制所述拍摄设备的拍摄姿态,以对所述待跟踪拍摄的目标拍摄对象进行跟踪拍摄。
  24. 根据权利要求23所述的装置,其特征在于,所述处理器在为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象时,具体用于执行如下操作:
    根据所述第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与各个所述拍摄对象的相对距离;
    为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个相对距离最小的待跟踪拍摄的目标拍摄对象。
  25. 根据权利要求23所述的装置,其特征在于,所述处理器在为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象时,具体用于执行如下操作:
    根据所述多个拍摄对象上设置的第一无线定位标签与所述多个无线定位基站之间的无线信号通信,确定所述多个拍摄设备与所述多个拍摄对象的相对方位;
    根据所述确定的相对方位,确定每一个拍摄对象在每一个拍摄设备的拍摄画面中的位置;
    为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个与所述拍摄设备的拍摄画面的中心点之间的距离最小的待跟踪拍摄的目标拍摄对象。
  26. 根据权利要求23所述的装置,其特征在于,所述处理器在为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象时,具体用于执行如下操作:
    检测用户的拍摄对象选择操作;
    根据检测到的拍摄对象选择操作为所述多个拍摄设备中的每一个拍摄设备从所述多个拍摄对象中确定一个待跟踪拍摄的目标拍摄对象。
  27. 根据权利要求16-26任一项所述的装置,其特征在于,所述处理器还用于执行如下操作:
    根据所述相对方位和所述多个拍摄设备的拍摄姿态,确定所述目标拍摄对象在所述拍摄设备的拍摄画面中的位置;
    根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域。
  28. 根据权利要求27所述的装置,其特征在于,所述处理器在根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域时,具体用于执行如下操作:
    根据所述确定的位置在所述图像中确定初始剪辑图像区域;
    针对所述初始剪辑图像区域运行预设的识别模型算法,以确定所述图像中的目标拍摄对象的图像区域,并将所述目标拍摄对象的图像区域从所述图像中剪辑出来。
  29. 根据权利要求27所述的装置,其特征在于,所述处理器在根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域时,具体用于执行如下操作:
    当处于自动剪辑模式时,根据所述确定的位置对所述多个拍摄设备拍摄得到的图像进行剪辑,以获取所述目标拍摄对象的图像区域。
  30. 根据权利要求27所述的装置,其特征在于,所述处理器还用于执行如下操作:
    当处于手动剪辑模式时,根据所述确定的位置在所述多个拍摄设备的拍摄画面中显示剪辑提示标识;
    检测用户响应所述剪辑提示标识对所述用户对应拍摄设备的图像区域选择操作;
    根据检测到的图像区域选择操作确定所述拍摄设备拍摄得到的图像中的目标拍摄对象的图像区域;
    将所述目标拍摄对象的图像区域从所述图像中剪辑出来。
  31. 一种拍摄控制系统,其特征在于,包括:
    多个用于对环境进行拍摄的拍摄设备;
    设置在所述环境中的多个无线定位基站;
    设置在所述环境中的目标拍摄对象上的第一无线定位标签,所述第一无线定位标签和所述多个无线定位基站之间无线连接;
    如权利要求16-30任一项所述的拍摄控制装置,所述拍摄控制装置分别和所述多个拍摄设备以及所述多个无线定位基站之间通信连接。
  32. 根据权利要求31所述的系统,其特征在于,各个所述无线定位基站和各个所述拍摄设备的相对位置可变,各个所述拍摄设备和所述多个无线定位基站之间无线连接。
  33. 根据权利要求31所述的系统,其特征在于,各个所述无线定位基站和各个所述拍摄设备的相对位置固定。
  34. 根据权利要求31-33任一项所述的系统,其特征在于,所述拍摄控制装置配置有输入设备,所述输入设备用于获取拍摄对象指示信息。
  35. 根据权利要求31-33任一项所述的系统,其特征在于,所述第一无线定位标签配置有输入设备,所述输入设备用于获取拍摄对象指示信息。
  36. 根据权利要求31-33任一项所述的系统,其特征在于,所述拍摄设备配置有显示屏幕和输入设备,所述显示屏幕用于显示拍摄画面,所述输入设备用于检测图像区域选择操作。
  37. 一种计算机存储介质,其特征在于,所述计算机存储介质中存储有计算机程序指令,所述计算机程序指令被处理器执行时,用于执行如权利要求1-15任一项所述的拍摄控制方法。
PCT/CN2020/116793 2020-09-22 2020-09-22 拍摄控制方法、装置、系统及存储介质 WO2022061508A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/116793 WO2022061508A1 (zh) 2020-09-22 2020-09-22 拍摄控制方法、装置、系统及存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/116793 WO2022061508A1 (zh) 2020-09-22 2020-09-22 拍摄控制方法、装置、系统及存储介质

Publications (1)

Publication Number Publication Date
WO2022061508A1 true WO2022061508A1 (zh) 2022-03-31

Family

ID=80844697

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/116793 WO2022061508A1 (zh) 2020-09-22 2020-09-22 拍摄控制方法、装置、系统及存储介质

Country Status (1)

Country Link
WO (1) WO2022061508A1 (zh)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104853104A (zh) * 2015-06-01 2015-08-19 深圳市微队信息技术有限公司 一种自动跟踪拍摄运动目标的方法以及系统
CN206272857U (zh) * 2016-12-16 2017-06-20 珠海太川云社区技术股份有限公司 一种人员精确定位监视系统
CN108521860A (zh) * 2017-06-22 2018-09-11 深圳市大疆创新科技有限公司 拍摄设备的控制方法、拍摄设备及系统
CN207501947U (zh) * 2017-08-31 2018-06-15 郑州联睿电子科技有限公司 一种追踪系统
CN110622089A (zh) * 2018-01-22 2019-12-27 深圳市大疆创新科技有限公司 跟随控制方法、控制终端及无人机
CN108989750A (zh) * 2018-07-17 2018-12-11 上海建工集团股份有限公司 一种重型设备的动态可视化监控系统及方法

Similar Documents

Publication Publication Date Title
US9560274B2 (en) Image generation apparatus and image generation method
US9729788B2 (en) Image generation apparatus and image generation method
US20170323458A1 (en) Camera for Locating Hidden Objects
US10514708B2 (en) Method, apparatus and system for controlling unmanned aerial vehicle
US10284776B2 (en) Image generation apparatus and image generation method
US9894272B2 (en) Image generation apparatus and image generation method
CN110276789B (zh) 目标跟踪方法及装置
JP2021520540A (ja) カメラの位置決め方法および装置、端末並びにコンピュータプログラム
EP3352453B1 (en) Photographing method for intelligent flight device and intelligent flight device
US10832489B2 (en) Presenting location based icons on a device display
JP7400882B2 (ja) 情報処理装置、移動体、遠隔制御システム、情報処理方法およびプログラム
US10338768B1 (en) Graphical user interface for finding and depicting individuals
JP2022531187A (ja) 測位方法及び装置、電子機器並びに記憶媒体
KR20170011194A (ko) 이동 단말기 및 그 제어 방법
WO2014125134A1 (es) Método para la representación de entornos virtuales localizados geográficamente y dispositivo móvil
CN111147744B (zh) 拍摄方法、数据处理方法、装置、电子设备及存储介质
WO2022061508A1 (zh) 拍摄控制方法、装置、系统及存储介质
CN113220928A (zh) 一种图像搜索方法、装置、电子设备及存储介质
KR101358064B1 (ko) 사용자 이미지를 이용한 원격 제어 방법 및 시스템
CN112432636B (zh) 定位方法及装置、电子设备和存储介质
US20230035962A1 (en) Space recognition system, space recognition method and information terminal
KR20190032787A (ko) 360°비디오 생성 시스템 및 방법
JP2004219847A (ja) 小型携帯端末
CN112860827A (zh) 设备间交互控制方法、设备间交互控制装置及存储介质
WO2019200615A1 (en) Apparatus, methods and computer programs for facilitating tuning of an antenna

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20954362

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20954362

Country of ref document: EP

Kind code of ref document: A1