WO2022094808A1 - Photographing control method and apparatus, unmanned aerial vehicle, device, and readable storage medium - Google Patents

Photographing control method and apparatus, unmanned aerial vehicle, device, and readable storage medium

Info

Publication number
WO2022094808A1
WO2022094808A1 (PCT/CN2020/126560)
Authority
WO
WIPO (PCT)
Prior art keywords
drone
terminal device
user
video
shooting
Prior art date
Application number
PCT/CN2020/126560
Other languages
English (en)
Chinese (zh)
Inventor
彭阳 (Peng Yang)
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/126560
Publication of WO2022094808A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules

Definitions

  • the present application relates to the technical field of photographing control, and in particular, to a photographing control method, apparatus, unmanned aerial vehicle, device, and readable storage medium.
  • the present application provides a shooting control method, apparatus, unmanned aerial vehicle, device, and readable storage medium, aiming to shoot multi-view video so as to improve the interest and visual impact of the video.
  • an embodiment of the present application provides a shooting control method, which is applied to a control device, the control device being used to control a drone and a terminal device, the drone including a first photographing device and the terminal device including a second photographing device, and the method includes:
  • the first video is obtained by the control device controlling the first shooting device to shoot the user holding the terminal device according to a shooting control instruction triggered by the user;
  • the second video is obtained by the control device controlling the second shooting device to shoot the drone according to the shooting control instruction;
  • the first video and the second video are synthesized to obtain a target video, where the target video includes at least a part of video frames in the first video and at least a part of video frames in the second video.
  • an embodiment of the present application further provides a control device, the control device is used to control a drone and a terminal device, the drone includes a first photographing device, and the terminal device includes a second photographing device , the control device includes a memory and a processor;
  • the memory is used to store computer programs
  • the processor is configured to execute the computer program and implement the following steps when executing the computer program:
  • the first video is obtained by the control device controlling the first shooting device to shoot the user holding the terminal device according to a shooting control instruction triggered by the user;
  • the second video is obtained by the control device controlling the second shooting device to shoot the drone according to the shooting control instruction;
  • the first video and the second video are synthesized to obtain a target video, where the target video includes at least a part of video frames in the first video and at least a part of video frames in the second video.
  • an embodiment of the present application also provides an unmanned aerial vehicle, including:
  • a power system arranged on the body, for providing flight power for the drone;
  • a first photographing device mounted on the body, for photographing a first video
  • a first wireless communication device provided on the body, for communicating with the second wireless communication device in the terminal device;
  • the above control device, used to control the first photographing device to shoot the first video and/or to control the drone to fly.
  • an embodiment of the present application further provides a terminal device, including:
  • a display device for displaying images or a human-computer interface
  • a second wireless communication device for communicating with the first wireless communication device in the drone
  • the above control device is used to control the second shooting device to shoot a second video.
  • an embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the processor implements the steps of the above-mentioned shooting control method.
  • Embodiments of the present application provide a photographing control method, apparatus, drone, device, and readable storage medium. By acquiring a first video obtained by the first photographing device in the drone photographing the user holding the terminal device and a second video obtained by the second photographing device in the terminal device photographing the drone, and synthesizing the two into a target video that includes at least a part of the video frames of each, the synthesized target video is a multi-view video, which greatly improves the interest and visual impact of the video.
  • FIG. 1 is a schematic diagram of a scene for implementing a shooting control method provided by an embodiment of the present application
  • FIG. 2 is a schematic flowchart of steps of a shooting control method provided by an embodiment of the present application
  • FIG. 3 is a schematic diagram of a scene in which a first video and a second video are synthesized in an embodiment of the present application;
  • FIG. 4 is a schematic diagram of another scene in which the first video and the second video are synthesized in an embodiment of the present application
  • FIG. 5 is a schematic flowchart of steps of another shooting control method provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a scenario in which a drone is controlled to fly around and follow a user holding a terminal device in an embodiment of the present application;
  • FIG. 7 is a schematic diagram of another scenario in which a drone is controlled to fly around and follow a user holding a terminal device in an embodiment of the present application;
  • FIG. 8 is a schematic block diagram of the structure of a control device provided by an embodiment of the present application.
  • FIG. 9 is a schematic block diagram of the structure of an unmanned aerial vehicle provided by an embodiment of the present application.
  • FIG. 10 is a schematic structural block diagram of a terminal device provided by an embodiment of the present application.
  • the embodiments of the present application provide a shooting control method, apparatus, unmanned aerial vehicle, device, and readable storage medium. The method acquires the first video obtained by the first shooting device in the drone shooting the user holding the terminal device and the second video obtained by the second shooting device in the terminal device shooting the drone, and synthesizes the first video and the second video to obtain a target video that includes at least a part of the video frames in the first video and at least a part of the video frames in the second video, so that the synthesized target video is a multi-view video, which greatly improves the interest and visual impact of the video.
  • FIG. 1 is a schematic diagram of a scene for implementing the shooting control method provided by the embodiment of the present application.
  • the scene includes an unmanned aerial vehicle 100 and a terminal device 200.
  • the unmanned aerial vehicle 100 includes a body 110, a power system 120 arranged on the body 110, a first photographing device 130 arranged on the body 110, and a first wireless communication device (not shown in FIG. 1) in the body 110
  • the terminal device 200 includes a display device 210, a second photographing device (not shown in FIG. 1), and a second wireless communication device (not shown in FIG. 1)
  • the first wireless communication device in the drone 100 can communicate with the second wireless communication device in the terminal device 200 , thereby establishing a communication connection between the drone 100 and the terminal device 200 .
  • the power system 120 is used to provide flying power for the drone 100, and the power system 120 can enable the drone to take off from the ground vertically or land on the ground vertically without any horizontal movement of the drone (for example, no taxiing on a runway is required).
  • the power system 120 may allow the drone to hover in the air at a preset position and/or orientation.
  • One or more of the power systems 120 may be controlled independently of the other power systems 120.
  • one or more power systems 120 may be controlled simultaneously.
  • the drone may have multiple horizontally oriented power systems 120 to provide lift and/or thrust.
  • the horizontally oriented power system 120 may be actuated to provide the drone with the ability to take off vertically, land vertically, and hover.
  • one or more of the horizontally oriented power systems 120 may rotate in a clockwise direction, while one or more of the other horizontally oriented power systems may rotate in a counter-clockwise direction.
  • the rotational rate of each horizontally oriented power system 120 can be varied independently to achieve the lift and/or thrust produced by each power system 120, so as to adjust the spatial orientation, speed, and/or acceleration of the drone (e.g., rotation and translation with up to three degrees of freedom).
  • the drone 100 may further include a gimbal 140 disposed on the body 110; the gimbal 140 is used for carrying the first photographing device 130 and includes three axis motors, namely a yaw-axis motor, a pitch-axis motor, and a roll-axis motor, which are used to adjust the balance posture of the first photographing device 130 mounted on the gimbal 140, so as to shoot high-precision and stable images anytime and anywhere.
  • the terminal device may be a headset equipped with a second photographing device, AR or VR glasses, a smartphone, a tablet computer, a handheld gimbal, or another handheld device. When the terminal device is a handheld gimbal equipped with a second photographing device, the handheld gimbal includes a handle and a gimbal for carrying the second photographing device; through the gimbal, the balance posture of the second photographing device mounted on it can be adjusted, so that a high-precision and stable picture can be obtained anytime, anywhere.
  • the drone may also include a sensing system, which may include one or more sensors to sense the spatial orientation, velocity, and/or acceleration of the drone (e.g., rotation and translation with up to three degrees of freedom), angular acceleration, attitude, position (absolute position or relative position), etc.
  • the one or more sensors include GPS sensors, motion sensors, inertial sensors, proximity sensors, or image sensors.
  • the sensing system may also be used to collect data on the environment in which the UAV 100 is located, such as climatic conditions, potential obstacles to be approached, locations of geographic features, locations of man-made structures, and the like.
  • the drone 100 may include landing gear, which is the contact piece between the drone 100 and the ground when the drone lands. The landing gear may be retracted while the drone 100 is in a flying state (for example, while the drone 100 is cruising) and lowered when landing; it may also be fixedly installed on the drone 100 and always remain lowered.
  • the drone 100 or the terminal device 200 further includes a control device (not shown in FIG. 1 ).
  • the control device acquires the shooting control instruction triggered by the user; according to the shooting control instruction, the control device controls the first shooting device 130 in the drone 100 to shoot the user holding the terminal device 200 to obtain a first video, and controls the second shooting device in the terminal device 200 to shoot the drone 100 to obtain a second video; the first video and the second video are then synthesized to obtain a target video including at least a part of the video frames in the first video and at least a part of the video frames in the second video.
  • the control device acquires a user-triggered shooting control instruction sent by the terminal device 200, and controls the first shooting device 130 in the drone 100 to shoot according to the shooting control instruction.
  • the control device controls the terminal device 200 to send the shooting control command to the drone 100 when acquiring the shooting control command triggered by the user
  • the drone 100 controls the first shooting device to shoot the user holding the terminal device 200 according to the shooting control instruction to obtain the first video, and simultaneously controls the second shooting device to shoot the drone 100 to obtain the second video;
  • the first video sent by the drone 100 is acquired, and the first video and the second video are synthesized to obtain a target video including at least a part of video frames in the first video and at least a part of video frames in the second video.
  • the central control device is communicatively connected to the terminal device 200, and the central control device is also communicatively connected to the drone 100.
  • when the terminal device 200 obtains the shooting control instruction triggered by the user, the terminal device 200 sends the shooting control instruction to the drone 100;
  • the terminal device 200 controls the second shooting device to shoot the drone according to the shooting control instruction, obtains a second video, and sends the second video to the central control device;
  • when the drone 100 receives the shooting control instruction, it controls the first shooting device 130 according to the shooting control instruction to shoot the user holding the terminal device 200 to obtain the first video, and sends the first video to the central control device;
  • the central control device receives the first video sent by the drone 100 and the second video sent by the terminal device 200, and synthesizes the first video and the second video to obtain a target video including at least a part of the video frames in the first video and at least a part of the video frames in the second video.
  • the shooting control method provided by the embodiments of the present application will be described in detail with reference to the scene in FIG. 1 .
  • the scene in FIG. 1 is only used to explain the shooting control method provided by the embodiment of the present application, but does not constitute a limitation on the application scene of the shooting control method provided by the embodiment of the present application.
  • FIG. 2 is a schematic flowchart of steps of a shooting control method provided by an embodiment of the present application.
  • the photographing control method can be applied to a control device, and the control device is used to control a drone and a terminal device, the drone includes a first photographing device, and the terminal device includes a second photographing device.
  • the photographing control method includes steps S101 to S103.
  • Step S101 acquiring a first video, the first video is obtained by the control device controlling the first shooting device to shoot the user holding the terminal device according to a shooting control instruction triggered by the user;
  • Step S102 acquiring a second video, where the second video is obtained by the control device controlling the second shooting device to shoot the drone according to the shooting control instruction;
  • Step S103 Synthesize the first video and the second video to obtain a target video, where the target video includes at least a part of video frames in the first video and at least a part of video frames in the second video.
  • the control device may be deployed in the drone, may also be deployed in the terminal device, or may be deployed in the central control device that is communicatively connected with the terminal device and the drone, which is not specifically limited in this embodiment of the present application.
  • the first video may be a current video obtained by the control device controlling the first shooting device to shoot the user holding the terminal device in real time according to the shooting control instruction triggered by the user, or it may be a historical video previously obtained by controlling the first shooting device to shoot the user holding the terminal device according to the shooting control instruction triggered by the user.
  • similarly, the second video may be a current video obtained by the control device controlling the second shooting device to shoot the drone in real time according to the shooting control instruction triggered by the user, or it may be a historical video previously obtained by controlling the second photographing device to photograph the drone according to the photographing control instruction triggered by the user.
  • the start and end shooting moments of the first video and the second video are the same, which can ensure the time synchronization of the first video and the second video, and facilitate subsequent synthesis of the first video and the second video.
  • during shooting, the user holding the terminal device is in the shooting area of the first shooting device of the drone, and the drone is in the shooting area of the second shooting device of the terminal device.
  • this ensures that each video frame in the first video shot by the first shooting device includes the user holding the terminal device, and that each video frame in the second video shot by the second shooting device includes the drone, which facilitates subsequent synthesis of a multi-view target video.
  • the first video includes a plurality of first video frames, each of the first video frames includes the user holding the terminal device, and the user holding the terminal device is located at a preset position in the first video frame.
  • the second video includes a plurality of second video frames, and each second video frame includes a drone.
  • the preset position may be the center position of the first video frame, or may be the remaining positions of the first video frame, which is not specifically limited in this embodiment of the present application.
  • the target video includes a plurality of first video frame groups and a plurality of second video frame groups, each first video frame group in the target video is adjacent to a second video frame group, the first video frame group includes one or more first video frames, and the second video frame group includes one or more second video frames.
  • in one implementation, a preset number of first video frames are extracted from the first video, and each extracted first video frame is randomly inserted into the second video to obtain the target video; or a preset number of second video frames are extracted from the second video, and each extracted second video frame is randomly inserted into the first video to obtain the target video.
  • the first video frame in the target video is adjacent to the second video frame, and the preset number may be determined based on the total number of video frames in the first video, which is not specifically limited in this embodiment of the present application.
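The random-insertion scheme described above can be sketched in a few lines. This is an illustrative sketch only: the frame labels, the `interleave_random` helper, and the choice of insertion points are hypothetical, and a real implementation would operate on decoded video frames.

```python
import random

def interleave_random(first_frames, second_frames, num_to_insert):
    """Extract `num_to_insert` frames from the first video and insert
    each at a random position in a copy of the second video, as in the
    scheme described above. Frames are modeled as opaque labels."""
    extracted = first_frames[:num_to_insert]  # preset number of frames, in order
    target = list(second_frames)
    for frame in extracted:
        pos = random.randint(0, len(target))  # random insertion point
        target.insert(pos, frame)
    return target

first = [f"A{i}" for i in range(1, 11)]   # 10 frames of the first video
second = [f"B{i}" for i in range(1, 11)]  # 10 frames of the second video
target = interleave_random(first, second, 5)
print(len(target))  # 15, matching the 15-frame target video of FIG. 3
```

Note that random insertion preserves the relative order of the second video's frames, so the synthesized result still plays the second video's content in sequence with first-video frames interleaved.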
  • the first video 11 includes 10 first video frames
  • the second video 12 includes 10 second video frames
  • the first video frames corresponding to 00:01-00:02, 00:03-00:04, 00:05-00:06, 00:07-00:08, and 00:09-00:10 are extracted from the first video 11 and inserted into the second video 12, so that the target video 13 includes 15 video frames and the video duration is 15 seconds.
  • in another implementation, a preset number of first video frames are extracted from the first video, and according to the shooting moment of each first video frame, the matching second video frames are extracted from the second video, wherein a matching first video frame and second video frame have the same shooting moment; then, according to the shooting moment of each second video frame, each second video frame is inserted into the first video from which the first video frames have been extracted to obtain the target video, or, according to the shooting moment of each first video frame, each first video frame is inserted into the second video from which the second video frames have been extracted to obtain the target video.
  • as shown in FIG. 4, the first video frames corresponding to 00:01-00:02, 00:03-00:04, 00:05-00:06, 00:07-00:08, and 00:09-00:10 are extracted from the first video 21, and the second video frames corresponding to 00:01-00:02, 00:03-00:04, 00:05-00:06, 00:07-00:08, and 00:09-00:10 are extracted from the second video to obtain the to-be-inserted video 22; then the first video frames whose video times are 00:01-00:02, 00:03-00:04, 00:05-00:06, 00:07-00:08, and 00:09-00:10 are inserted into the to-be-inserted video 22 to obtain the target video 23.
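The time-matched variant above amounts to swapping frames between the two videos at chosen shooting moments, so that every position in the target video is filled by exactly one frame with that timestamp. A minimal sketch, assuming frames are keyed by (hypothetical) integer shooting times:

```python
def interleave_by_time(first, second, swap_times):
    """Sketch of the time-matched synthesis described above.
    `first` and `second` map shooting times to frames; frames whose
    times appear in `swap_times` are taken from the first video, the
    rest from the second, so matching frames share a shooting time."""
    target = {}
    for t, frame in second.items():
        if t not in swap_times:
            target[t] = frame       # keep the second video's frame
    for t in swap_times:
        target[t] = first[t]        # insert the first video's frame
    return [target[t] for t in sorted(target)]

first = {t: f"A{t}" for t in range(1, 11)}
second = {t: f"B{t}" for t in range(1, 11)}
# Swap in the first video's frames at odd seconds (hypothetical choice)
print(interleave_by_time(first, second, {1, 3, 5, 7, 9}))
# ['A1', 'B2', 'A3', 'B4', 'A5', 'B6', 'A7', 'B8', 'A9', 'B10']
```

Unlike the random-insertion variant, the target video here keeps the original duration, since each inserted frame replaces the extracted frame with the same shooting moment.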
  • the first video, the second video, and a third video may also be acquired, and the viewing angles for shooting the first video, the second video, and the third video are different; the three videos are synthesized to obtain a target video, where the target video includes part or all of the video frames of the first video, part or all of the video frames of the second video, and part or all of the video frames of the third video. It can be understood that even more videos with different viewing angles can be acquired and synthesized to obtain a video with multiple viewing angles.
  • the first video is obtained by the control device controlling the first shooting device in the first drone to shoot the user holding the terminal device according to the shooting control instruction triggered by the user;
  • the second video is obtained by the control device controlling the second shooting device in the terminal device to shoot the drone according to the shooting control instruction;
  • the third video is obtained by the control device controlling the third shooting device in the second drone to shoot the user holding the terminal device according to the shooting control instruction triggered by the user;
  • the user holding the terminal device is located in the shooting areas of the first shooting device and the third shooting device, and the first drone and/or the second drone is located in the shooting area of the second shooting device.
  • the terminal device is controlled to output shooting prompt information, wherein the shooting prompt information is used to prompt the user to shoot a video and to prompt the user that the drone is already in the shooting area of the second shooting device and that the user holding the terminal device is also in the shooting area of the first shooting device; the shooting control instruction triggered by the user is acquired, and according to the shooting control instruction, the first shooting device is controlled to shoot the user holding the terminal device to obtain the first video, and the second shooting device is controlled to shoot the drone to obtain the second video; the first video and the second video are synthesized to obtain a target video, wherein the target video includes at least a part of the video frames in the first video and at least a part of the video frames in the second video.
  • since the user is prompted to shoot video only when the drone is already in the shooting area of the second shooting device and the user holding the terminal device is also in the shooting area of the first shooting device, it is convenient for the user to control the first shooting device and the second shooting device to shoot video synchronously, ensuring that the captured videos are strictly synchronized in time.
  • before the terminal device is controlled to output the shooting prompt information, the terminal device is controlled to display a shooting control page, wherein the shooting control page includes a viewing-angle switching icon and the first image collected by the first shooting device; in response to the user's touch operation on the first image, the user in the first image is taken as the first photographing subject of the first shooting device; in response to the user's triggering operation on the viewing-angle switching icon, the first image in the shooting control page is switched to the second image collected by the second shooting device; when the drone is in the second image, the terminal device is controlled to output the shooting prompt information to remind the user that the drone is within the shooting area of the second shooting device and that the user holding the terminal device is also within the shooting area of the first shooting device.
  • when there is no user holding the terminal device in the first image, the terminal device is controlled to output first prompt information, and the first prompt information is used to prompt the user to adjust the first shooting area of the first shooting device; the posture of the first photographing device is adjusted according to the first posture adjustment instruction triggered by the user, and the first shooting area and the displayed first image change as the posture of the first photographing device changes.
  • the posture of the first photographing device may be adjusted by adjusting the posture of the drone, or the posture of the first photographing device may be adjusted by adjusting the posture of the pan/tilt on which the first photographing device is mounted.
  • the manner of controlling the terminal device to output the first prompt information may be: acquiring first orientation information of the drone relative to the terminal device; and controlling the terminal device to output the first prompt information according to the first orientation information.
  • the first prompt information includes first orientation information of the drone relative to the terminal device.
  • the manner of acquiring the first orientation information of the drone relative to the terminal device may be: acquiring the first position information of the drone and the second position information of the terminal device, and determining the first orientation information of the drone relative to the terminal device according to the first position information and the second position information; or acquiring the first position information of the drone, the second position information of the terminal device, and the direction information output by the compass sensor of the terminal device, and determining the first orientation information of the drone relative to the terminal device according to the first position information, the second position information, and the direction information.
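One way to realize the position-based variant above is the standard initial-bearing formula between two coordinates. This is a sketch under the assumption that both positions are given as latitude/longitude pairs; the patent does not specify how the orientation is computed.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 (e.g. the terminal
    device) to point 2 (e.g. the drone), in degrees clockwise from
    true north. Standard formula, shown here only as an illustration
    of deriving orientation information from two positions."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

# Drone due east of the terminal device -> bearing of 90 degrees
print(round(bearing_deg(0.0, 0.0, 0.0, 0.001)))  # 90
```

The compass-sensor variant would additionally subtract the terminal device's heading from this bearing, yielding the drone's direction relative to where the user is facing rather than relative to north.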
  • when the user holding the terminal device appears in the first image, second prompt information is output, wherein the second prompt information is used to prompt the user to stop adjusting the first shooting area and to prompt the user that the user holding the terminal device is located within the shooting area of the first shooting device.
  • when there is no drone in the second image, the terminal device is controlled to output third prompt information, wherein the third prompt information is used to prompt the user to adjust the second shooting area of the second shooting device; the posture of the second photographing device is adjusted according to the second posture adjustment instruction triggered by the user, wherein the second shooting area and the displayed second image change as the posture of the second photographing device changes.
  • controlling the terminal device to output the third prompt information includes: acquiring second orientation information of the terminal device relative to the drone; and controlling the terminal device to output the third prompt information according to the second orientation information.
  • the third prompt information includes second orientation information of the terminal device relative to the drone.
  • the manner of acquiring the second orientation information of the terminal device relative to the drone may be: acquiring the first position information of the drone and the second position information of the terminal device, and determining the second orientation information of the terminal device relative to the drone according to the first position information and the second position information; or acquiring the first position information of the drone, the second position information of the terminal device, and the direction information output by the compass sensor of the terminal device, and determining the second orientation information of the terminal device relative to the drone according to the first position information, the second position information, and the direction information.
  • when the drone appears in the second image, fourth prompt information is output, wherein the fourth prompt information is used to prompt the user to stop adjusting the second shooting area and to prompt the user that the drone is located in the shooting area of the second shooting device.
  • the shooting control page also displays flight shooting mode options of the drone, and the user can select the flight shooting mode of the drone through the flight shooting mode options; the flight shooting modes of the drone include a return shooting mode, a remote shooting mode, a normal follow mode, and a surround follow mode. When the drone is in the return shooting mode, the user is prompted to keep the attitude of the second shooting device unchanged, and the control device can control the drone to fly from a set position toward the position of the user holding the terminal device, approaching the user, and to hover when the height of the drone is lower than a set height. When the drone is in the remote shooting mode, the user is prompted to keep the attitude of the second shooting device unchanged, and the control device can control the drone to fly from the position of the user holding the terminal device to the set position. When the drone is in the normal follow mode, the control device can control the drone to follow the user holding the terminal device according to the movement speed and/or movement trajectory of the user. When the drone is in the surround follow mode, the control device can control the drone to fly around the user holding the terminal device according to the rotation speed of the user.
  • according to the shooting control method provided by the above-mentioned embodiment, by acquiring the first video obtained by the first shooting device in the drone shooting the user holding the terminal device and the second video obtained by the second shooting device in the terminal device shooting the drone, and synthesizing the first video and the second video to obtain a target video comprising at least a part of the video frames in the first video and at least a part of the video frames in the second video, the synthesized target video is a multi-view video, which greatly improves the interest and visual impact of the video.
  • FIG. 5 is a schematic flowchart of steps of another shooting control method provided by an embodiment of the present application.
  • The shooting control method includes steps S201 to S202.
  • S201: Acquire the flight trajectory in the shooting control instruction. S202: Control the drone to fly according to the flight trajectory, so that the drone moves away from or approaches the user holding the terminal device.
  • The mode identifier in the photographing control instruction is acquired, and the mode identifier is the first mode identifier corresponding to the return shooting mode or the second mode identifier corresponding to the remote shooting mode.
  • The control device obtains the flight trajectory in the shooting control instruction and controls the drone to fly according to it, so that the drone moves away from or approaches the user holding the terminal device; the first photographing device can thus shoot a video whose viewing angle recedes from or approaches the user.
  • The flight trajectory includes either a first flight trajectory or a second flight trajectory. The drone can therefore be controlled to fly according to the first flight trajectory, so that the drone moves away from the user holding the terminal device, or according to the second flight trajectory, so that the drone approaches the user holding the terminal device.
  • the mode identification in the shooting control instruction is used to indicate the flight shooting mode of the drone, and the mode identifications of different flight shooting modes are different.
  • The mode identifications include a first mode identification, a second mode identification, a third mode identification and a fourth mode identification, where the first mode identification corresponds to the return shooting mode, the second mode identification corresponds to the remote shooting mode, the third mode identification corresponds to the normal follow mode, and the fourth mode identification corresponds to the surround follow mode.
  • The first flight trajectory includes a first starting waypoint and a first ending waypoint. When the drone is at the first ending waypoint, the hovering height of the drone is within a preset height range and the horizontal distance between the drone and the user holding the terminal device is within a preset distance range, so that the drone and the user keep a certain separation and the flight safety of the drone is ensured.
  • The second flight trajectory includes a second starting waypoint and a second ending waypoint. When the drone is at the second ending waypoint, the hovering height of the drone is within the preset height range and the horizontal distance between the drone and the user holding the terminal device is within the preset distance range, likewise keeping a certain separation and ensuring the flight safety of the drone.
  • the preset height range and the preset distance range may be set based on actual conditions, which are not specifically limited in this embodiment of the present application.
  • For example, the preset height range is 2 meters to 3 meters, and the preset distance range is 1 meter to 2 meters.
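As a concrete illustration of the end-waypoint constraints above, here is a minimal sketch (not from the patent; the coordinate frame and all names are assumptions, and the example ranges of 2 to 3 meters and 1 to 2 meters are taken from the text):

```python
import math

# Example ranges from the text; the local coordinate frame and the
# function/variable names are assumptions for illustration.
PRESET_HEIGHT_RANGE = (2.0, 3.0)    # meters
PRESET_DISTANCE_RANGE = (1.0, 2.0)  # meters

def hover_position_is_safe(drone_pos, user_pos):
    """drone_pos and user_pos are (x, y, z) tuples; z is height above ground."""
    height = drone_pos[2]
    horizontal_dist = math.hypot(drone_pos[0] - user_pos[0],
                                 drone_pos[1] - user_pos[1])
    return (PRESET_HEIGHT_RANGE[0] <= height <= PRESET_HEIGHT_RANGE[1]
            and PRESET_DISTANCE_RANGE[0] <= horizontal_dist <= PRESET_DISTANCE_RANGE[1])
```

A planner could run this check on a candidate ending waypoint before committing the trajectory.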
  • In the process of controlling the first photographing device to photograph the user holding the terminal device, the drone is controlled to follow the user holding the terminal device in flight. Further, during this process the mode identification in the shooting control instruction is acquired, and when the mode identification is the third mode identification corresponding to the normal follow mode or the fourth mode identification corresponding to the surround follow mode, the drone is controlled to follow the user. By controlling the drone to follow the user holding the terminal device, the first photographing device can shoot a video from the viewing angle of following that user.
  • the method of controlling the drone to follow the user holding the terminal device to fly may be: acquiring the current image collected by the first photographing device; controlling the drone to follow the user holding the terminal device to fly according to the current image.
  • The method of controlling the drone to follow the user according to the current image collected by the first photographing device may be: inputting the current image into a preset target detection model to perform target detection on the user in the image and obtain the user's target detection information, where the target detection information includes the size information and position information of the user in the world coordinate system and the angle information of the user relative to the drone; predicting, according to the target detection information and a preset target tracking algorithm, the target position information of the user holding the terminal device at the next moment; and controlling the drone to follow the user according to the target position information and the current position information of the drone.
  • The size information of the user holding the terminal device in the world coordinate system includes the length, width and/or height of the user, and the angle information of the user relative to the drone includes the yaw angle, pitch angle and roll angle of the user relative to the drone.
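The detect-predict-steer cycle described above can be sketched as follows; the constant-velocity predictor and the proportional gain are simple stand-ins for the patent's target tracking algorithm and flight controller, and all names are illustrative assumptions:

```python
# Constant-velocity prediction and a proportional velocity command stand in
# for the patent's target tracking algorithm and flight controller.

def predict_next_position(prev_pos, curr_pos):
    # next ~= current + (current - previous), i.e. assume constant velocity.
    return tuple(c + (c - p) for p, c in zip(prev_pos, curr_pos))

def follow_velocity_command(drone_pos, target_pos, gain=0.5):
    # Steer the drone toward the predicted user position.
    return tuple(gain * (t - d) for d, t in zip(drone_pos, target_pos))
```

Each frame, the detector supplies `curr_pos`, the predictor extrapolates one step ahead, and the command nudges the drone toward that point.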
  • the preset target detection model includes a 2D target detection model or a 3D target detection model.
  • The 3D target detection model is a pre-trained first neural network model, and the 2D target detection model is a pre-trained second neural network model; the first neural network model is different from the second neural network model.
  • The first neural network model includes any one of the convolutional neural network models CNN, RCNN, Fast RCNN and Faster RCNN, and the second neural network model likewise includes any one of CNN, RCNN, Fast RCNN and Faster RCNN.
  • The method of training the first neural network model to obtain the 3D target detection model may be: acquiring first training sample data, where the first training sample data includes a plurality of first images and the first target detection information of the target to be tracked in each first image; and iteratively training the first neural network model according to the first training sample data until the iteratively trained model converges, obtaining the 3D target detection model.
  • Training the first neural network model with target detection information and the corresponding images can solve the problem that existing 3D target detection algorithms cannot be reused on the drone, so that the drone can detect the target to be tracked based on the 3D target detection model, which facilitates subsequently and accurately controlling the drone to follow the user holding the terminal device.
  • Similarly, the method of training the second neural network model to obtain the 2D target detection model may be: acquiring second training sample data, where the second training sample data includes a plurality of second images and the second target detection information of the target to be tracked in each second image; and iteratively training the second neural network model according to the second training sample data until it converges, obtaining the 2D target detection model. The drone can then perform target detection on the target to be tracked based on the 2D target detection model, which facilitates subsequently and accurately controlling the drone to follow the user holding the terminal device.
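The "iterate until convergence" procedure used for both detection models has the following generic shape; the step function standing in for one training epoch, the tolerance, and the iteration cap are all illustrative assumptions, not details from the patent:

```python
# Generic "iterate until convergence" loop; step_fn stands in for one
# training epoch and returns the new loss value.

def train_until_converged(step_fn, init_loss, tol=1e-3, max_iters=1000):
    loss = init_loss
    for i in range(max_iters):
        new_loss = step_fn(loss)
        if abs(loss - new_loss) < tol:  # convergence: the loss stops moving
            return new_loss, i + 1
        loss = new_loss
    return loss, max_iters
```

In practice `step_fn` would run a forward/backward pass over the sample data and return the epoch loss.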
  • the preset target tracking algorithm includes any one of a mean shift algorithm, a Kalman filter algorithm, a particle filter algorithm, and a moving target modeling algorithm. In some other embodiments, other target tracking algorithms may also be used, which are not limited herein.
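Of the tracking algorithms listed, the Kalman filter is the simplest to illustrate. This is a minimal scalar (position-only) sketch, not the patent's implementation; the process noise `q` and measurement noise `r` defaults are assumptions:

```python
def kalman_1d(x_est, p_est, z, q=1e-4, r=0.04):
    """One predict/update cycle of a scalar Kalman filter.

    x_est, p_est: previous state estimate and its variance.
    z: new position measurement; q, r: process/measurement noise (assumed).
    """
    # Predict: the estimate carries over, its uncertainty grows by q.
    p_pred = p_est + q
    # Update: blend prediction and measurement with the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_est + k * (z - x_est)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new
```

Called once per detection, the filter smooths noisy positions; a real tracker would extend the state with velocity to support the next-moment prediction described above.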
  • The method of controlling the drone to follow the user holding the terminal device may also be: acquiring the target rotation speed of the user holding the terminal device, and, according to the target rotation speed, controlling the drone to fly around and follow the user holding the terminal device.
  • During the surround follow, the user holding the terminal device is always in the shooting area of the first shooting device, and the drone is always in the shooting area of the second shooting device. By controlling the drone to fly around and follow the user holding the terminal device, the first shooting device can shoot a more spectacular video.
  • As shown in the figure, the user 31 holding the terminal device rotates on the spot along the rotation track 32 in the counterclockwise direction. The shooting area of the second shooting device on the terminal device is the area between line 33 and line 34, and the drone 35 is located in that area. The drone 35 flies around and follows the user 31 along its surround follow track, the surround direction also being counterclockwise. The shooting area of the first shooting device on the drone 35 is the area between lines 37 and 38, and the user 31 holding the terminal device is located in that area.
  • The method of controlling the drone to fly around and follow the user holding the terminal device may be: acquiring the surround starting waypoint in the shooting control instruction and controlling the drone to fly toward it; and, after the drone reaches the surround starting waypoint, controlling the drone to fly around and follow the user holding the terminal device from that waypoint according to the user's target rotation speed.
  • The surround starting waypoint in the shooting control instruction may be input by the user through the shooting control page of the terminal device.
  • In some embodiments, the terminal device includes an inertial measurement unit and a compass sensor, and the method of acquiring the target rotation speed of the user holding the terminal device may be: acquiring first sensing data collected by the inertial measurement unit and determining the user's target rotation speed according to the first sensing data; or acquiring the first sensing data collected by the inertial measurement unit and second sensing data collected by the compass sensor, and determining the user's target rotation speed according to the first sensing data and the second sensing data.
  • The target rotation speed of the user holding the terminal device can thus be determined accurately from the inertial measurement unit, so that the drone can accurately fly around and follow the user according to the target rotation speed.
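A minimal sketch of the IMU-based estimate, assuming the first sensing data is a short window of gyroscope yaw-rate samples (the axis choice, units, and averaging are assumptions, not stated in the text):

```python
def rotation_speed_from_imu(gyro_yaw_samples):
    """gyro_yaw_samples: yaw-rate readings in rad/s; returns their mean,
    taken as the user's rotation speed over the window."""
    if not gyro_yaw_samples:
        return 0.0
    return sum(gyro_yaw_samples) / len(gyro_yaw_samples)
```

Averaging over a window damps sensor noise; a compass-assisted variant could replace the plain mean with heading differences over time.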
  • In other embodiments, the method of acquiring the target rotation speed of the user holding the terminal device may be: acquiring images collected by the first photographing device at different times, determining the pixel coordinates of an identification target set on the terminal device in each image, and determining the user's target rotation speed according to those pixel coordinates.
  • The identification target includes an anti-reflection film set on the terminal device, and the anti-reflection film is more easily recognized by the first photographing device.
  • The rotation speed of the identification target set on the terminal device can be determined from the images collected by the first photographing device at different times; since the identification target is set on the terminal device, its rotation speed can be taken as the target rotation speed of the user holding the terminal device.
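One way the pixel coordinates could be turned into a rotation speed is via the camera's horizontal field of view. This sketch assumes a simple linear pixel-to-angle mapping; the image width and field of view are illustrative values, not from the patent:

```python
import math

def bearing_from_pixel(px, image_width=1920, hfov_deg=84.0):
    # Map the pixel column to an angle off the optical axis (linear model).
    return math.radians(hfov_deg) * (px / image_width - 0.5)

def rotation_speed_from_pixels(px_t0, px_t1, dt):
    # Rotation speed = change in bearing between two frames over elapsed time.
    return (bearing_from_pixel(px_t1) - bearing_from_pixel(px_t0)) / dt
```

The sign of the result distinguishes clockwise from counterclockwise rotation of the identification target.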
  • In still other embodiments, the terminal device includes an inertial measurement unit and a compass sensor, and the method of acquiring the target rotation speed of the user holding the terminal device may be: acquiring the first sensing data collected by the inertial measurement unit and the rotation speed of the identification target set on the terminal device, and determining the user's target rotation speed according to the first sensing data and the rotation speed of the identification target.
  • Specifically, the method of determining the target rotation speed of the user holding the terminal device may be: determining a first rotation speed of the user according to the first sensing data, and then determining the target rotation speed according to the first rotation speed, a first preset weight coefficient corresponding to the first rotation speed, the rotation speed of the identification target, and a second preset weight coefficient corresponding to the rotation speed of the identification target.
  • The first preset weight coefficient and the second preset weight coefficient may be set based on actual conditions, which are not specifically limited in this embodiment of the present application. For example, the first preset weight coefficient is 0.7 and the second preset weight coefficient is 0.3.
  • For example, if the first rotation speed is k1, the first preset weight coefficient is w1, the rotation speed of the identification target is k2, and the second preset weight coefficient is w2, the target rotation speed may be w1*k1 + w2*k2.
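Written as code, the fusion is a plain weighted sum, with the example weights 0.7 and 0.3 from the text as defaults:

```python
def fuse_rotation_speeds(k1, k2, w1=0.7, w2=0.3):
    # k1: first rotation speed (from the IMU data); k2: rotation speed of the
    # identification target; w1, w2: the example weights from the text.
    return w1 * k1 + w2 * k2
```

Weighting the IMU estimate more heavily reflects trusting the inertial data over the vision-based measurement; the balance can be tuned to actual conditions.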
  • In yet other embodiments, the terminal device includes an inertial measurement unit and a compass sensor, and the method of acquiring the target rotation speed of the user holding the terminal device may be: acquiring the first sensing data collected by the inertial measurement unit, the second sensing data collected by the compass sensor, and the rotation speed of the identification target set on the terminal device; and determining the user's target rotation speed according to the first sensing data, the second sensing data, and the rotation speed of the identification target. Specifically, a second rotation speed of the user may be determined according to the first sensing data and the second sensing data, and the target rotation speed is then determined according to the second rotation speed, a third preset weight coefficient corresponding to the second rotation speed, the rotation speed of the identification target, and the second preset weight coefficient corresponding to the rotation speed of the identification target.
  • In this case, if the second rotation speed is k3 and the third preset weight coefficient is w3, the target rotation speed may be w3*k3 + w2*k2.
  • The method of controlling the drone to fly around and follow the user holding the terminal device may further be: acquiring the surround flight angle of the drone, and controlling the drone, according to the target rotation speed, to fly around and follow the user holding the terminal device until the angle through which the drone has flown around the user reaches the surround flight angle.
  • The surround flight angle can be set by the user through the shooting control page of the terminal device, or can be preset, which is not specifically limited in this embodiment of the present application.
  • the surrounding flight angle includes 90°, 120°, 180°, 240°, 270°, 300°, 360°, etc.
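A hedged sketch of the surround follow loop described above: step the drone around the user at the target rotation speed and stop once the swept angle reaches the surround flight angle. The circular geometry, fixed radius, and all names are simplifying assumptions:

```python
import math

def orbit_positions(center, radius, rate, dt, surround_angle, start_angle=0.0):
    """Waypoints on a circle around `center` until the swept angle (rad)
    reaches `surround_angle`. Fixed radius and planar geometry are assumed."""
    swept, angle, points = 0.0, start_angle, []
    while swept < surround_angle:
        angle += rate * dt          # advance at the user's rotation speed
        swept += abs(rate) * dt
        points.append((center[0] + radius * math.cos(angle),
                       center[1] + radius * math.sin(angle)))
    return points
```

For a 180° surround at pi/4 rad/s and a 1 s step, this yields four waypoints ending opposite the start angle.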
  • In some embodiments, if a throw control instruction sent by the terminal device is detected, the throwing direction of the drone is determined and the drone is controlled to fly along the throwing direction.
  • The method of determining the throwing direction of the drone may be: if the throw control instruction sent by the terminal device is detected, determining the flight speed of the drone at its current position, determining the direction of the horizontal component of the flight speed, and taking the direction of the horizontal component of the flight speed as the throwing direction of the drone.
  • The terminal device includes a throw trigger control key; when the user's triggering operation on the throw trigger control key is detected, the throw control instruction is generated. The throw trigger control key can be a physical button or a virtual button, which is not specifically limited in this embodiment of the present application. By controlling the drone to fly along the throwing direction, the first photographing device can shoot a video with a throwing effect.
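The throwing-direction rule above reduces to dropping the vertical velocity component and normalising the remainder; this sketch assumes a `(vx, vy, vz)` velocity layout, which is an illustrative choice:

```python
import math

def throwing_direction(velocity):
    """velocity = (vx, vy, vz); returns the unit horizontal direction."""
    vx, vy, _ = velocity
    norm = math.hypot(vx, vy)
    if norm == 0.0:
        return (0.0, 0.0)  # no horizontal motion, direction undefined
    return (vx / norm, vy / norm)
```

The zero case guards against a throw instruction arriving while the drone hovers with no horizontal speed.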
  • As shown in the figure, the user 41 holding the terminal device rotates on the spot along the rotation trajectory 42 in the counterclockwise direction. The drone starts from the surround starting waypoint 43 and flies around to the waypoint 44; the drone then flies from the waypoint 44 along the direction of the throwing trajectory 45, and hovers when it reaches the waypoint 46.
  • The manner of controlling the drone to fly along the throwing direction may be: controlling the drone to fly along the throwing direction until its flight distance along that direction reaches a preset distance; or controlling the drone to fly along the throwing direction until its flight distance along that direction reaches the throwing distance in the throw control instruction; or controlling the drone to decelerate along the throwing direction until its horizontal flight speed becomes zero.
  • the posture of the first photographing device is adjusted so that the user holding the terminal device is always within the photographing area of the first photographing device.
  • Specifically, the posture of the first photographing device is adjusted so that, in the image collected after the adjustment, the user holding the terminal device is at a preset position in the image. The preset position may be the central position of the image, or may be another position in the image, which is not specifically limited in this application.
  • In some embodiments, the image captured by the first photographing device is acquired, and it is determined whether the position of the user holding the terminal device in the image is the preset position; if it is not, and the deflection angle of the gimbal carrying the first photographing device is greater than a preset angle, the terminal device is controlled to output rotation speed adjustment prompt information to prompt the user to adjust their own rotation speed, ensuring that the drone can better fly around and follow the user and improving the video shooting effect and the user experience.
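A hedged sketch of this keep-in-frame logic: nudge the gimbal yaw in proportion to the user's pixel offset from the preset (here, centre) position, and flag a rotation-speed prompt once the deflection exceeds the preset angle. The gain, threshold, and names are illustrative assumptions:

```python
def gimbal_step(user_px, image_width, gimbal_yaw_deg,
                gain_deg_per_px=0.05, max_deflection_deg=30.0):
    """Return the new gimbal yaw and whether to prompt the user to adjust
    their rotation speed."""
    offset_px = user_px - image_width / 2   # offset from the preset position
    new_yaw = gimbal_yaw_deg + gain_deg_per_px * offset_px
    prompt_user = abs(new_yaw) > max_deflection_deg
    return new_yaw, prompt_user
```

Small offsets are absorbed by the gimbal; only when the gimbal nears its limit does the system fall back to asking the user to slow down.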
  • In the shooting control method described above, the flight trajectory in the shooting control instruction is acquired and the drone is controlled to fly according to it, so that the drone moves away from or approaches the user holding the terminal device; the first photographing device can thus shoot a video from a viewing angle receding from or approaching the user.
  • FIG. 8 is a schematic structural block diagram of a control apparatus provided by an embodiment of the present application.
  • the control device 300 includes a processor 301 and a memory 302, and the processor 301 and the memory 302 are connected through a bus 303, such as an I2C (Inter-integrated Circuit) bus.
  • the control device 300 is used to control the drone and the terminal device, the drone includes a first photographing device, the terminal device includes a second photographing device, and the drone is communicatively connected to the terminal device.
  • the processor 301 may be a micro-controller unit (Micro-controller Unit, MCU), a central processing unit (Central Processing Unit, CPU) or a digital signal processor (Digital Signal Processor, DSP) or the like.
  • The memory 302 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
  • the processor 301 is used for running the computer program stored in the memory 302, and implements the following steps when executing the computer program:
  • the first video is obtained by the control device controlling the first shooting device to shoot the user holding the terminal device according to a shooting control instruction triggered by the user;
  • the second video is obtained by the control device controlling the second shooting device to shoot the drone according to the shooting control instruction;
  • the first video and the second video are synthesized to obtain a target video, where the target video includes at least a part of video frames in the first video and at least a part of video frames in the second video.
  • the start and end shooting moments of the first video and the second video are the same.
  • The user is in the photographing area of the first photographing device while the drone is in the photographing area of the second photographing device.
  • The first video includes a plurality of first video frames, each first video frame includes the user, and the user in each first video frame is located at a preset position of that frame; the second video includes a plurality of second video frames, and each second video frame includes the drone.
  • the target video includes a plurality of first video frame groups and a plurality of second video frame groups, and the first video frame group in the target video is adjacent to the second video frame group , the first video frame group includes one or more first video frames, and the second video frame group includes one or more second video frames.
  • the processor is further configured to implement the following steps:
  • Acquiring the first video includes: controlling, according to the shooting control instruction, the first shooting device to shoot the user holding the terminal device to obtain the first video.
  • Acquiring the second video includes: controlling, according to the shooting control instruction, the second shooting device to shoot the drone to obtain the second video.
  • Before controlling the terminal device to output the shooting prompt information, the processor is further configured to:
  • controlling the terminal device to display a shooting control page, wherein the shooting control page includes a viewing angle switching icon and a first image collected by the first shooting device;
  • the user in the first image is used as the first photographing subject of the first photographing device
  • the drone in the second image is used as the second photographing subject of the second photographing device;
  • the terminal device is controlled to output photographing prompt information.
  • the processor is further configured to implement the following steps:
  • controlling the terminal device to output first prompt information, where the first prompt information is used to prompt the user to adjust the first shooting area of the first shooting device;
  • the posture of the first photographing device is adjusted according to a first posture adjustment instruction triggered by the user, and the first photographing area and the displayed first image change with the posture of the first photographing device.
  • controlling the terminal device to output the first prompt information includes:
  • the terminal device is controlled to output first prompt information.
  • the processor is further configured to implement the following steps:
  • second prompt information is output, wherein the second prompt information is used to prompt the user to stop adjusting the first shooting area.
  • the processor is further configured to implement the following steps:
  • controlling the terminal device to output third prompt information, where the third prompt information is used to prompt the user to adjust the second shooting area of the second shooting device;
  • the posture of the second photographing device is adjusted according to the second posture adjustment instruction triggered by the user, and the second photographing area and the displayed second image change with the change of the posture of the second photographing device.
  • controlling the terminal device to output the third prompt information includes:
  • the terminal device is controlled to output third prompt information.
  • the processor is further configured to implement the following steps:
  • fourth prompt information is output, wherein the fourth prompt information is used to prompt the user to stop adjusting the second shooting area.
  • the processor is further configured to implement the following steps:
  • the drone is controlled to fly according to the flight trajectory, so that the drone is far away from the user holding the terminal device or close to the user holding the terminal device.
  • the flight trajectory includes any one of a first flight trajectory and a second flight trajectory
  • the controlling the UAV to fly according to the flight trajectory includes:
  • the drone is controlled to fly according to the second flight trajectory, so that the drone is close to the user holding the terminal device.
  • the processor is further configured to implement the following steps:
  • In the process of controlling the first photographing apparatus to photograph the user holding the terminal device, the drone is controlled to follow the user holding the terminal device to fly.
  • controlling the drone to follow the user holding the terminal device to fly includes:
  • the drone is controlled to follow the user holding the terminal device to fly according to the current image.
  • controlling the drone to follow the user holding the terminal device to fly according to the current image includes:
  • the target detection information includes the size information and position information of the user in the world coordinate system and the angle information of the user relative to the drone; the target position information of the user at the next moment is predicted according to the position information, the size information, the angle information and a preset target tracking algorithm; and according to the target position information, the drone is controlled to follow the user holding the terminal device to fly.
  • controlling the drone to follow the user holding the terminal device to fly includes:
  • the drone is controlled to fly around and follow the user holding the terminal device.
  • the terminal device includes an inertial measurement unit and a compass sensor
  • the acquiring the target rotation speed of the user holding the terminal device includes:
  • a target rotation speed of the user holding the terminal device is determined.
  • the acquiring the target rotation speed of the user holding the terminal device includes:
  • the target rotation speed of the user holding the terminal device is determined.
  • the identification target includes an antireflection film provided on the terminal device.
  • the terminal device includes an inertial measurement unit and a compass sensor
  • the acquiring the target rotation speed of the user holding the terminal device includes:
  • the target rotation speed of the user holding the terminal device is determined.
  • controlling the drone to fly around and following the user holding the terminal device according to the target rotation speed includes:
  • the drone is controlled to fly around and follow the user holding the terminal device according to the target rotation speed, until the flying angle of the drone around the user reaches the around flying angle.
  • the processor is further configured to implement the following steps:
  • the drone is controlled to fly along the throwing direction.
  • controlling the drone to fly along the throwing direction includes:
  • the UAV is controlled to decelerate along the throwing direction until the horizontal flight speed of the UAV becomes zero.
  • the processor is further configured to implement the following steps:
  • the posture of the first photographing device is adjusted so that the user in the image captured by the first photographing device after the adjustment is located at the preset position in the image.
  • FIG. 9 is a schematic structural block diagram of an unmanned aerial vehicle provided by an embodiment of the present application.
  • The drone 400 includes a body 410, a power system 420, a first photographing device 430, a first wireless communication device 440 and a control device 450. The power system 420, the first photographing device 430, the first wireless communication device 440 and the control device 450 are arranged on the body 410; the power system 420 is used to provide flight power for the drone 400, the first photographing device 430 is used to shoot the first video, the first wireless communication device 440 is used to communicate with the second wireless communication device in the terminal device, and the control device 450 is configured to control the first photographing device 430 to shoot the first video and/or the drone 400 to fly.
  • FIG. 10 is a schematic structural block diagram of a terminal device provided by an embodiment of the present application.
  • The terminal device 500 includes a second photographing apparatus 510, a display apparatus 520, a second wireless communication apparatus 530 and a control apparatus 540. The second photographing apparatus 510 is used for shooting the second video, the display apparatus 520 is used for displaying images or a human-computer interaction interface, the second wireless communication apparatus 530 is used to communicate with the first wireless communication device in the drone, and the control apparatus 540 is used to control the second photographing apparatus to shoot the second video.
  • Embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, the computer program includes program instructions, and a processor executes the program instructions to implement the steps of the shooting control method provided by the above embodiments.
  • the computer-readable storage medium may be an internal storage unit of the terminal device or the drone described in any of the foregoing embodiments, such as a hard disk or memory of the terminal device or the drone.
  • the computer-readable storage medium can also be an external storage device of the terminal device or the drone, such as a plug-in hard disk equipped on the terminal device or the drone, a smart memory card (Smart Media Card, SMC), Secure Digital (SD) card, flash memory card (Flash Card), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to a shooting control method and apparatus, an unmanned aerial vehicle, a device, and a readable storage medium. The method comprises: obtaining a first video (S101); obtaining a second video (S102); and synthesizing the first video and the second video to obtain a target video, the target video comprising at least a part of the video frames in the first video and at least a part of the video frames in the second video (S103). The present application improves the interest and impact of a video.
PCT/CN2020/126560 2020-11-04 2020-11-04 Procédé et appareil de commande de prise de photographies, véhicule aérien sans pilote, dispositif et support de stockage lisible WO2022094808A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/126560 WO2022094808A1 (fr) 2020-11-04 2020-11-04 Procédé et appareil de commande de prise de photographies, véhicule aérien sans pilote, dispositif et support de stockage lisible


Publications (1)

Publication Number Publication Date
WO2022094808A1 true WO2022094808A1 (fr) 2022-05-12

Family

ID=81456847

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/126560 WO2022094808A1 (fr) 2020-11-04 2020-11-04 Procédé et appareil de commande de prise de photographies, véhicule aérien sans pilote, dispositif et support de stockage lisible

Country Status (1)

Country Link
WO (1) WO2022094808A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106973261A (zh) * 2017-03-03 2017-07-21 湖北天专科技有限公司 以第三方视角观察无人机的设备、系统和方法
CN107087427A (zh) * 2016-11-30 2017-08-22 深圳市大疆创新科技有限公司 飞行器的控制方法、装置和设备以及飞行器
CN109561282A (zh) * 2018-11-22 2019-04-02 亮风台(上海)信息科技有限公司 一种用于呈现地面行动辅助信息的方法与设备
US20190387153A1 (en) * 2018-06-14 2019-12-19 Honeywell International Inc. Imaging resolution and transmission system


Similar Documents

Publication Publication Date Title
US11644832B2 (en) User interaction paradigms for a flying digital assistant
US11573562B2 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
CN108351653B (zh) 用于uav飞行控制的系统和方法
JP6735821B2 (ja) Uav経路を計画し制御するシステム及び方法
WO2018209702A1 (fr) Procédé de commande de véhicule aérien sans pilote, véhicule aérien sans pilote et support d'informations lisible par machine
WO2018098704A1 (fr) Procédé, appareil et système de commande, véhicule aérien sans pilote, et plateforme mobile
WO2017071143A1 (fr) Système et procédés de planification et de commande de trajectoire d'uav
WO2018053877A1 (fr) Procédé de commande, dispositif de commande, et système de distribution
JP2019507924A (ja) Uav軌道を調整するシステム及び方法
US20180024557A1 (en) Autonomous system for taking moving images, comprising a drone and a ground station, and associated method
WO2022036500A1 (fr) Procédé d'aide au vol pour véhicule aérien sans pilote, dispositif, puce, système et support
WO2020048365A1 (fr) Procédé et dispositif de commande de vol pour aéronef, et dispositif terminal et système de commande de vol
CN108450032B (zh) 飞行控制方法和装置
CN115238018A (zh) 用于管理3d飞行路径的方法和相关系统
JP7435599B2 (ja) 情報処理装置、情報処理方法、及びプログラム
US12007763B2 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
WO2022094808A1 (fr) Procédé et appareil de commande de prise de photographies, véhicule aérien sans pilote, dispositif et support de stockage lisible
WO2022021028A1 (fr) Procédé de détection de cible, dispositif, aéronef sans pilote et support de stockage lisible par ordinateur
WO2022188151A1 (fr) Procédé de photographie d'image, appareil de commande, plateforme mobile et support de stockage informatique
WO2019134148A1 (fr) Procédé et dispositif de commande de véhicule aérien sans pilote, et plate-forme mobile
CN117693663A (zh) 控制方法、头戴式显示设备、控制系统及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20960270

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20960270

Country of ref document: EP

Kind code of ref document: A1