WO2023143460A1 - Method and apparatus for controlling a flight component, terminal, and readable storage medium - Google Patents

Method and apparatus for controlling a flight component, terminal, and readable storage medium

Info

Publication number
WO2023143460A1
Authority
WO
WIPO (PCT)
Prior art keywords
component
flight
flying
target
axis
Prior art date
Application number
PCT/CN2023/073405
Other languages
English (en)
French (fr)
Inventor
李磊
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2023143460A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Definitions

  • the present application belongs to the field of electronic technology, and in particular relates to a method and device for controlling flight components, a terminal and a readable storage medium.
  • the purpose of the embodiments of the present application is to provide a method and device, a terminal, and a readable storage medium for controlling a flight component, so as to perform flexible and convenient control on the flight component.
  • the embodiment of the present application provides a method for controlling a flight component.
  • the flying component is movably arranged inside the terminal, and the terminal is provided with an exit through which the flying component can move out of the terminal. The method includes: in response to a first input of the touch body on the terminal, displaying a control on the terminal; in response to a second input of the touch body on the control, controlling the flight component to move toward the exit; and after the flight component has moved out of the terminal, controlling the flying action of the flying component according to the action of the touch body.
  • the embodiment of the present application provides a device for controlling a flight component.
  • the flying component is movably arranged inside the terminal, and the terminal is provided with an exit through which the flying component can move out of the terminal. The device includes: a first response module, configured to display a control on the terminal in response to a first input of the touch body on the terminal; a second response module, configured to control the flying component to move toward the exit in response to a second input of the touch body on the control; and a flight control module, configured to control the flight action of the flight component according to the movement of the touch body after the flight component is removed from the terminal.
  • the embodiment of the present application provides a terminal.
  • the terminal includes a processor and a memory, and the memory stores programs or instructions that can run on the processor.
  • when the programs or instructions are executed by the processor, the steps of the method for controlling the flight component as described in the first aspect are implemented.
  • the embodiment of the present application provides a readable storage medium.
  • the readable storage medium stores programs or instructions, and when the programs or instructions are executed by the processor, the steps of the method for controlling the flight component according to the first aspect are realized.
  • the embodiment of the present application provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is used to run programs or instructions, so as to implement the method described in the first aspect.
  • an embodiment of the present application provides a computer program product, the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the method described in the first aspect.
  • in the embodiments of the present application, the flight component is controlled based on the touch body: a control is displayed on the terminal in response to the first input of the touch body on the terminal, the flight component is controlled to move toward the exit in response to the second input of the touch body on the control, and after the flight component is removed from the terminal, the flight component is controlled to perform corresponding flying actions according to the movement of the touch body.
  • FIG. 1 is a schematic flowchart of a method for controlling a flight component provided by an embodiment of the present application
  • Fig. 2 is a schematic diagram of the flight assembly provided by the embodiment of the present application.
  • Fig. 3 is a schematic diagram of a window interface provided by an embodiment of the present application.
  • Fig. 4 is a schematic diagram of the coordinate system of the stylus provided by the embodiment of the present application.
  • Fig. 5 is a schematic diagram of the coordinate system of the flight assembly provided by the embodiment of the present application.
  • Fig. 6 is a schematic diagram of setting the target axis of the flight assembly provided by the embodiment of the present application.
  • Fig. 7 is a block diagram of a device for controlling flight components provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a terminal provided in an embodiment of the present application.
  • Fig. 9 is a schematic diagram of an electronic device provided by an embodiment of the present application.
  • the terminal, the touch body and the flying component involved therein will be described with examples.
  • the flying component includes a bracket, a battery mounted on the bracket, a driving mechanism, a camera, a first wireless communication module, a processor, and a memory.
  • the driving mechanism can drive the overall translation, lifting and rotation of the flight components.
  • the driving mechanism includes a propeller A1, a propeller A2, a propeller A3, and a propeller A4, and a plurality of motors driving the propeller A1, propeller A2, propeller A3, and propeller A4.
  • Camera B is installed in the middle of the four propellers.
  • the flight component is also equipped with sensors such as acceleration sensors and gyroscopes, through which the movement and rotation of the flight component can be detected, and then the attitude of the flight component can be determined.
  • a terminal in an embodiment of the present application, includes a casing, a touch screen, a second wireless communication module, a processor, and a memory.
  • the casing is provided with an exit through which the flying component can move in and out of the terminal.
  • the flying assembly may be movably disposed inside the terminal, the flying assembly may be moved from the inside of the terminal to the outside of the terminal through the exit, and physically detached from the terminal.
  • the terminal can be, for example, a mobile phone, a tablet computer, a notebook computer, and the like.
  • the stylus is provided with sensors such as an acceleration sensor and a gyroscope, through which the movement and rotation of the stylus can be detected to determine the posture of the stylus.
  • the stylus is also provided with a third wireless communication module, a processor and a memory.
  • through the first wireless communication module, the second wireless communication module, and the third wireless communication module, the terminal, the stylus, and the flight component are able to exchange information with one another.
  • the stylus can interact with the flight components through the terminal.
  • the touch body may be a user.
  • a wearable device may be worn on the user, and based on a sensing signal output by a sensor in the wearable device, an action of the user may be detected.
  • a wearable device is provided with sensors such as an acceleration sensor and a gyroscope, through which a user's translational movement or rotational movement can be detected.
  • the wearable device is also provided with a fourth wireless communication module, a processor and a memory.
  • the wearable device may be, for example, a smart watch, a smart bracelet, and the like.
  • through the first wireless communication module, the second wireless communication module, and the fourth wireless communication module, the terminal, the wearable device, and the flight component are able to exchange information with one another; alternatively, the wearable device can exchange information with the flight component through the terminal.
  • FIG. 1 is a schematic flowchart of a method for controlling flight components provided by an embodiment of the present application. As shown in Fig. 1, the method may include steps S102-S106, which will be described in detail below.
  • Step S102 displaying controls on the terminal in response to the first input of the touch body on the terminal.
  • the touch body may be a stylus or a user.
  • the user may use a stylus or use a finger to directly operate the terminal to generate a first input, and in response to the first input, the terminal displays controls.
  • in one embodiment, the control is a floating icon.
  • the floating icon can be moved by dragging.
  • the control is a window interface corresponding to the flight component.
  • the window interface may include a virtual flying component frame corresponding to the flying component, for example as shown in FIG. 3
  • the window interface D is a virtual flying component frame corresponding to the flying component shown in FIG. 2 .
  • the window interface includes a virtual flight component frame corresponding to the flight component, which can facilitate the user to intuitively manipulate the flight component.
  • Step S104 in response to the second input of the touch body to the control, control the flight component to move towards the exit.
  • in the case where the control is a floating icon that can be dragged and moved, the user can use the stylus or a finger to drag the floating icon toward the exit to generate a second input, so as to control the flight component to move synchronously toward the exit.
  • the sliding direction of the touch body, the direction in which the floating icon is dragged, and the direction in which the flying component moves out of the terminal are consistent, which provides the user with an intuitive operation experience while remaining simple and flexible.
  • the control is a window interface corresponding to the flight component.
  • the second input is an input generated by the stylus sliding from a first frame of the window interface to a second frame; the first frame is the border of the window interface far from the exit, and the second frame is the border of the window interface close to the exit.
  • for example, the exit C of the terminal is located at the top of the terminal, the window interface D is the frame of the virtual flight component and is located at the top of the screen interface close to the exit C, and the second input is the input generated by the stylus sliding from the first frame of the window interface D to the second frame.
  • the second input may be an input generated by long pressing the first frame of the window interface with the stylus and then sliding to the second frame. Long-pressing the first border of the window interface with the stylus means that the stylus touches the first border and lasts longer than a preset time.
  • in the case where the control is a window interface corresponding to the flying component and the window interface includes a virtual flying component frame corresponding to the flying component, the flying component moves synchronously with the sliding of the stylus, so that when the stylus slides from the first frame to the second frame, the flying component just moves out of the terminal.
  • the flying component can be controlled flexibly and conveniently, and it is very intuitive and easy for users to understand and operate.
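As an illustrative, non-authoritative sketch of this synchronous movement (the application does not specify the actuation details; the function names, the linear ejection model, and the coordinate convention below are assumptions):

```python
def slide_fraction(stylus_pos, first_frame_pos, second_frame_pos):
    """Fraction of the slide completed between the first frame (far from the
    exit) and the second frame (close to the exit), clamped to [0, 1]."""
    span = second_frame_pos - first_frame_pos
    return max(0.0, min(1.0, (stylus_pos - first_frame_pos) / span))

def update_ejection(stylus_pos, first_frame_pos, second_frame_pos,
                    total_travel_mm, set_eject_position):
    """The flying component moves synchronously with the stylus: when the
    stylus reaches the second frame, the component has just moved fully out
    of the terminal (total_travel_mm is the full travel of the mechanism)."""
    fraction = slide_fraction(stylus_pos, first_frame_pos, second_frame_pos)
    set_eject_position(fraction * total_travel_mm)

# Example: the stylus is halfway between the two frames, so the component is
# driven to half of its total travel toward the exit.
update_ejection(stylus_pos=50, first_frame_pos=0, second_frame_pos=100,
                total_travel_mm=30, set_eject_position=print)  # prints 15.0
```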
  • the touch body is a stylus.
  • in response to a fourth input generated by a long press of the stylus on the window interface, the flight component is controlled to establish a communication connection with the stylus; for example, the stylus presses the center of the window interface and the press lasts longer than a preset duration.
  • alternatively, in response to a fifth input generated by consecutive taps of the stylus on the window interface, the flight component is controlled to establish a communication connection with the stylus; for example, the stylus clicks the center of the window interface several times in succession.
  • the flight component and the stylus can directly perform information interaction, such as sending and receiving instructions, sending and receiving sensor data, sending and receiving attitude data, and so on.
  • Step S106 after the flying component is removed from the terminal, the flying action of the flying component is controlled according to the movement of the touch body.
  • the flying component is controlled to fly to a preset height and hover. That is to say, after the flight component is removed from the terminal, it first flies to a preset altitude and hovers, waiting for the user's flight control command.
  • the preset height may be a height set by the user, for example, the preset height may be 1 meter above the terminal or 2 meters above the ground.
  • a preset program is stored in the flight component, and under the control of the preset program, the flight component automatically flies to a preset altitude and hovers after being moved out of the terminal.
  • the action of the touch body can be translation and rotation.
  • in one embodiment, the touch body is a user, and the user wears a wearable device. After the flight component is removed from the terminal and before the flight action of the flight component is controlled according to the action of the touch body, the sensing signal of the target sensor in the wearable device worn by the user is obtained, and the action of the touch body is determined according to the sensing signal.
  • the touch body is a stylus, and before the flight action of the flying component is controlled according to the movement of the touch body, the motion of the stylus can be determined through the sensing signal output by the sensor installed in the stylus.
  • the following describes an embodiment of controlling the translation of the flying component after the flying component is pushed out of the terminal.
  • the process of controlling the translation of the flying component according to the movement of the touch body includes steps S202-S204.
  • Step S202 detecting the translation direction and translation distance of the touch body, and determining the target translation direction and target translation distance of the flying component according to the translation direction and translation distance of the touch body.
  • the translation direction of the touch body refers to the direction in which the touch body moves in space.
  • the translation distance of the touch body is the distance that the touch body moves in space. For example, if the touch body moves 1 meter to the east, the translation direction of the touch body is due east, and the translation distance is 1 meter.
  • the target translation direction of the flight component is consistent with the translation direction of the touch body.
  • if the user wants to translate the flying component to the east, the user can move the touch body to the east.
  • if the user wants to translate the flight component to the south, the user can move the touch body to the south.
  • the target translation distance of the flying component may be a preset multiple of the translation distance of the touch body.
  • the preset multiple is, for example, n (with n greater than or equal to 1); when the translation distance of the touch body controlled by the user is P, the target translation distance of the flying component is the product of P and n.
  • Step S204 controlling the flight component to translate according to the target translation direction and the target translation distance.
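As an illustrative sketch of the translation mapping in steps S202-S204 (the data type, function names, and the example value of the preset multiple are assumptions; the application does not define a concrete API):

```python
from dataclasses import dataclass

@dataclass
class Translation:
    direction: tuple   # unit vector in a shared world frame, e.g. (1, 0, 0) for due east
    distance_m: float  # metres

def map_touch_translation(touch: Translation, n: float = 2.0) -> Translation:
    """Steps S202-S204: the target translation direction of the flight component
    follows the translation direction of the touch body, and the target translation
    distance is a preset multiple n of the touch body's translation distance."""
    return Translation(direction=touch.direction, distance_m=touch.distance_m * n)

# Example: the touch body moves 1 m due east and n = 2, so the flight component
# is commanded to translate 2 m due east.
target = map_touch_translation(Translation(direction=(1, 0, 0), distance_m=1.0))
print(target.direction, target.distance_m)  # (1, 0, 0) 2.0
```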
  • the following describes an embodiment of controlling the rotation of the flying component after the flying component is pushed out of the terminal.
  • the coordinate system of the stylus itself is a three-dimensional orthogonal Cartesian coordinate system, in which the Z-axis of the stylus is the coordinate axis along the length direction of the pen body, and the positive direction of the Z-axis of the stylus points from the tip of the stylus to the tail of the stylus.
  • the body coordinate system refers to a three-dimensional orthogonal rectangular coordinate system fixed on the aircraft, and its origin is located at the center of mass of the aircraft.
  • the process of controlling the rotation of the flying component according to the motion of the touch body includes steps S302-S304.
  • Step S302 detecting the rotation direction and rotation angle of the touch body, and determining the target rotation direction and target rotation angle of the flying component according to the rotation direction and rotation angle of the touch body.
  • the rotation angle of the touch body is the rotation angle of the touch body in space.
  • the rotation direction of the touch body refers to the direction in which the touch body rotates in space.
  • the target rotation angle of the flying component may be consistent with the rotation angle of the touch body.
  • the target rotation angle of the flying component may be a preset multiple of the rotation angle of the touch body.
  • the target rotation direction of the flight component is consistent with the rotation direction of the touch body.
  • in the case where the touch body is a stylus, the Z-axis of the stylus (the coordinate axis along the length of the pen body) is its default rotation axis, and the rotation direction of the stylus refers to whether the stylus rotates clockwise or counterclockwise when viewed from the negative direction of the Z-axis of the stylus toward the positive direction of the Z-axis of the stylus.
  • in the case where the touch body is a user, the user's height direction is the rotation axis by default, and the rotation direction of the touch body refers to whether it is clockwise or counterclockwise when viewed from above the user.
  • the rotation direction of the flight component refers to whether the flight component rotates clockwise or counterclockwise when viewed from the negative direction of the target axis of the flight component to the positive direction of the target axis of the flight component.
  • Step S304 controlling the flying component to rotate with the target axis as the rotation axis according to the target rotation direction and the target rotation angle.
  • in the case where the touch body is a stylus, the process of controlling the rotation of the flying component according to the movement of the touch body includes steps S402-S404.
  • Step S402 detecting the rotation angle of the stylus with the length direction of the pen body as the rotation axis, and determining the target rotation angle of the flying component according to the rotation angle. Detect the rotation direction of the stylus with the length direction of the pen body as the rotation axis, and determine the target rotation direction of the flying component according to the rotation direction.
  • the sensor installed in the stylus can detect the rotation angle and direction of rotation with the length of the pen body as the rotation axis.
  • the rotation angle of the stylus with the length direction of the stylus as the rotation axis is the rotation angle with the Z axis of the stylus as the rotation axis.
  • the rotation direction of the stylus refers to whether the stylus rotates clockwise or counterclockwise when viewed from the negative direction of the Z-axis of the stylus toward the positive direction of the Z-axis of the stylus.
  • the target rotation angle of the flight component can be consistent with the rotation angle of the stylus. For example, if the rotation angle of the stylus is 180 degrees with the length direction of the pen body as the rotation axis, the target rotation angle of the flight component is also 180 degrees.
  • the target rotation angle of the flying component may be a preset multiple of the rotation angle of the stylus. The preset multiple is, for example, m (with m greater than or equal to 1); when the rotation angle of the stylus with the length direction of the pen body as the rotation axis is Q, the target rotation angle of the flying component is the product of Q and m.
  • Step S404 controlling the flying component to rotate according to the target rotation direction and target rotation angle with the target axis as the rotation axis.
  • the target rotation direction in which the flying component rotates around the target axis corresponds to the rotation direction in which the stylus takes the length direction of the pen body as the rotation axis.
  • the rotation direction of the stylus refers to whether the stylus rotates clockwise or counterclockwise when viewed from the negative direction of the Z-axis of the stylus toward the positive direction of the Z-axis of the stylus.
  • when the target axis is the X axis of the flight component, the rotation direction of the flight component refers to whether the flight component rotates clockwise or counterclockwise when viewed from the negative direction of the X axis of the flight component toward the positive direction of the X axis of the flight component.
  • when the target axis is the Y axis of the flight component, the rotation direction of the flight component refers to whether the flight component rotates clockwise or counterclockwise when viewed from the negative direction of the Y axis of the flight component toward the positive direction of the Y axis of the flight component.
  • when the target axis is the Z axis of the flight component, the rotation direction of the flight component refers to whether the flight component rotates clockwise or counterclockwise when viewed from the negative direction of the Z axis of the flight component toward the positive direction of the Z axis of the flight component.
  • if it is detected that the rotation direction of the stylus with the length direction of the pen body as the rotation axis is clockwise, the target rotation direction of the flight component with the target axis as the rotation axis is also clockwise; if it is detected that the rotation direction of the stylus with the length direction of the pen body as the rotation axis is counterclockwise, the target rotation direction of the flight component with the target axis as the rotation axis is also counterclockwise.
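A minimal sketch of the rotation mapping in steps S402-S404 (the data type and function names are assumptions, and the preset multiple m is shown with an arbitrary example value):

```python
from dataclasses import dataclass

@dataclass
class Rotation:
    axis: str         # 'X', 'Y' or 'Z' in the relevant body coordinate system
    angle_deg: float  # rotation angle in degrees
    clockwise: bool   # viewed from the negative toward the positive direction of the axis

def map_stylus_rotation(stylus_rot: Rotation, target_axis: str, m: float = 1.0) -> Rotation:
    """Steps S402-S404: the flight component rotates about the target axis in the
    same direction as the stylus rotates about its Z-axis, and the target rotation
    angle is a preset multiple m of the stylus rotation angle."""
    return Rotation(axis=target_axis,
                    angle_deg=stylus_rot.angle_deg * m,
                    clockwise=stylus_rot.clockwise)

# Example: the stylus rotates 180 degrees clockwise about its Z-axis and the target
# axis is the flight component's Y axis; with m = 1 the component is commanded to
# rotate 180 degrees clockwise about its Y axis.
cmd = map_stylus_rotation(Rotation('Z', 180.0, True), target_axis='Y')
```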
  • the target axis of the flying component is the Z axis of the flying component by default, and the user can manually change the target axis of the flying component.
  • the terminal provides a setting interface of the target axis of the flying component, and the user can set any coordinate axis of the flying component as the target axis in the setting interface.
  • the target axis of the flight component is determined through steps S502-S506.
  • Step S502 acquiring attitude data of the flight component.
  • the attitude data of the flight components can be obtained to determine the attitude of the flight components.
  • Step S504 according to the attitude data of the flight component, the attitude image of the flight component is displayed on the terminal, and the attitude image includes the body coordinate system of the flight component.
  • the attitude image of the flight component is the current attitude image of the flight component.
  • the terminal can receive and display the images captured by the flight component in real time, and display the attitude image of the flight component at the same time, so that the user can more intuitively understand the relationship between the current attitude of the flight component and the real-time image it captures, and understand which rotation should be applied to achieve the desired result.
  • the terminal displays the real-time image captured by the current flight component and the attitude image of the flight component, and the attitude image shows the body coordinate system of the flight component.
  • Step S506 in response to the third input of the touch body on the gesture image, selecting a coordinate axis of the flying component as the target axis.
  • the third input is an input generated by the user selecting a coordinate axis of the flight component in the attitude image of the flight component displayed in the terminal, and the selected coordinate axis is determined as the target axis.
  • the third input is, for example, an input generated by clicking a coordinate axis of the body coordinate system in the attitude image of the flight component with the stylus. For example, if the user uses the stylus to click on the Y coordinate axis of the body coordinate system in the attitude image, then the Y coordinate axis of the flight component is determined as the target axis.
  • the attitude image of the flight component is displayed on the terminal for the user to select, which is more intuitive and convenient for the user to operate.
  • the target axis of the flight component is determined through steps S602-S606.
  • Step S602 acquiring the attitude data of the stylus and the attitude data of the flying component.
  • the attitude data of the flight components can be obtained to determine the attitude of the flight components.
  • Step S604 according to the attitude data of the stylus and the attitude data of the flight component, determining the first coordinate axis of the flight component; the first coordinate axis of the flight component is the coordinate axis in the body coordinate system of the flight component whose acute angle with the length direction of the stylus pen body is less than 45 degrees.
  • such a coordinate axis is the one that is most nearly parallel to the Z-axis of the stylus in space, so this coordinate axis is used as the first coordinate axis of the flight component.
  • Step S606 determining the first coordinate axis of the flight component as the target axis.
  • in this way, the coordinate axis of the flight component that is most nearly parallel to the stylus is set as the first coordinate axis.
  • the user only needs to adjust the posture of the stylus to set the target axis of the flight component.
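A minimal sketch of the axis selection in steps S602-S606, assuming the stylus Z-axis direction and the body axes of the flight component are available as unit vectors in a common reference frame (the vector representation and function names are assumptions):

```python
import math

def acute_angle_deg(u, v):
    """Acute angle between two unit vectors, in degrees (abs() folds it into [0, 90])."""
    dot = abs(sum(a * b for a, b in zip(u, v)))
    return math.degrees(math.acos(min(1.0, dot)))

def select_target_axis(stylus_z, body_axes, default='Z'):
    """Steps S602-S606: pick the body axis of the flight component whose acute angle
    with the stylus Z-axis (pen body length direction) is less than 45 degrees,
    i.e. the axis most nearly parallel to the stylus in space."""
    name, axis = min(body_axes.items(), key=lambda kv: acute_angle_deg(stylus_z, kv[1]))
    return name if acute_angle_deg(stylus_z, axis) < 45.0 else default

# Example: the stylus is held almost vertically, so the flight component's Z axis
# is the one most nearly parallel to it and is selected as the target axis.
axes = {'X': (1, 0, 0), 'Y': (0, 1, 0), 'Z': (0, 0, 1)}
print(select_target_axis((0.1, 0.0, 0.995), axes))  # -> Z
```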
  • controlling the flying action of the flying component according to the action of the touch body includes the following steps:
  • Step S702 generating flight instructions according to the motion of the touch body.
  • Step S704 acquiring the average movement speed of the touch body within a preset time before generating the flight instruction.
  • Step S706 when the average motion speed is less than or equal to the preset speed threshold, sending the flight instruction to the flight component, so that the flight component flies according to the flight instruction.
  • the flight instructions generated by the frequent and rapid movements of the touch body are filtered out, so as to avoid the flight components receiving frequent and conflicting flight instructions in a short period of time, and prevent adverse effects on the safety of the flight components and the safety of the surrounding environment.
  • in one example, the flight command is a translation command. The average movement speed of the touch body within the preset time before the translation command is generated is obtained, and the translation command is sent to the flight component only when the average movement speed is less than or equal to the preset threshold; when the average movement speed is greater than the preset threshold, the translation command is filtered out and not sent to the flight component.
  • the preset time and the preset threshold can be set according to actual conditions, the preset time can be, for example, 3 seconds, and the preset threshold can be, for example, 1 meter per second.
  • the flight command is a rotation command.
  • the preset time and the preset threshold can be set according to actual conditions, the preset time can be, for example, 3 seconds, and the preset threshold can be, for example, 1 meter per second.
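The filtering in steps S702-S706 can be sketched as follows; the sliding-window bookkeeping and the class and method names are assumptions, and the concrete window and threshold values are only the examples given above (3 seconds, 1 metre per second):

```python
import time
from collections import deque

class FlightCommandFilter:
    """Steps S702-S706: before a flight instruction is forwarded, check the average
    movement speed of the touch body over the preceding preset time window, so that
    instructions generated during frequent, rapid movements are dropped."""

    def __init__(self, window_s=3.0, speed_threshold_mps=1.0):
        self.window_s = window_s
        self.speed_threshold_mps = speed_threshold_mps
        self.samples = deque()  # (timestamp, distance moved since the previous sample)

    def record_motion(self, distance_m, t=None):
        t = time.monotonic() if t is None else t
        self.samples.append((t, distance_m))
        while self.samples and t - self.samples[0][0] > self.window_s:
            self.samples.popleft()

    def average_speed(self):
        return sum(d for _, d in self.samples) / self.window_s

    def maybe_send(self, instruction, send):
        """Send the instruction to the flight component only if the average speed
        within the window is at or below the threshold; otherwise filter it out."""
        if self.average_speed() <= self.speed_threshold_mps:
            send(instruction)
            return True
        return False
```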
  • a flight control interface is displayed on the terminal, and the flight action of the flight component is controlled in response to the input of the touch body on the flight control interface.
  • for example, the user clicks the "translation" control on the flight control interface with the touch body and inputs the target translation direction and target translation distance to control the translation of the flight component.
  • similarly, the user clicks the "rotation" control on the flight control interface with the touch body and inputs the target axis, target rotation direction, and target rotation angle to control the rotation of the flight component.
  • all or part of the steps in the aforementioned method embodiments for controlling flight components may be applied to and executed by the aforementioned terminal, for example, the aforementioned steps S102-S104 may be executed by the terminal.
  • all or part of the steps related to controlling the translation or rotation of the flight component may be executed by a stylus or a wearable device.
  • in the embodiment of the present application, the flight component is controlled based on the touch body: a control is displayed on the terminal in response to the first input of the touch body on the terminal, the flight component is controlled to move toward the exit in response to the second input of the touch body on the control, and after the flight component is removed from the terminal, the flight component is controlled to perform corresponding flying actions according to the movement of the touch body.
  • the method for controlling the flying component of the embodiment of the present application can be used in combination with the terminal and the stylus to control the flying component.
  • a complete flight component control scheme is proposed, including pushing the flying component out of the terminal, translating the flying component after it is separated from the terminal, and rotating the flying component; this scheme enables flexible and simple control of the flight component, and is very intuitive and easy for users to understand and operate.
  • the method for controlling the flight component provided in the embodiment of the present application may be executed by a device for controlling the flight component.
  • the device for controlling the flying component provided by the embodiment of the present application is described by taking the method for controlling the flying component executed by the device for controlling the flying component as an example.
  • the embodiment of the present application provides a device for controlling flight components, as shown in FIG. 7 , the device for controlling flight components includes a first response module W1 , a second response module W2 and a flight control module W3 .
  • the first response module W1 is configured to display controls on the terminal in response to the first input of the touch body on the terminal.
  • the second response module W2 is configured to control the flight component to move toward the exit in response to the second input of the touch body to the control.
  • the flight control module W3 is configured to control the flight action of the flight component according to the movement of the touch body after the flight component is removed from the terminal.
  • the device further includes a hovering control module.
  • the hovering control module is used to control the flying component to fly to a preset height and hover before controlling the flying action of the flying component according to the action of the touch body after the flying component is removed from the terminal.
  • the touch body is a user
  • the device further includes a touch body action determination module.
  • the touch body action determination module is used to, after the flight component is removed from the terminal and before the flight action of the flight component is controlled according to the action of the touch body, obtain the sensing signal of the target sensor in the wearable device worn by the user and determine the action of the touch body according to the sensing signal.
  • controlling the flying action of the flying component according to the movement of the touch body includes: detecting the translation direction and translation distance of the touch body, and determining the target translation direction and target translation distance of the flying component according to the translation direction and translation distance of the touch body; and controlling the flight component to translate according to the target translation direction and target translation distance.
  • controlling the flying action of the flying component according to the movement of the touch body includes: detecting the rotation direction and rotation angle of the touch body, and determining the target rotation direction and target rotation angle of the flight component according to the rotation direction and rotation angle of the touch body; and controlling the flight component to rotate with the target axis as the rotation axis according to the target rotation direction and target rotation angle.
  • in the case where the touch body is a stylus, controlling the flying action of the flying component according to the action of the touch body includes: detecting the rotation angle of the stylus with the length direction of the pen body as the rotation axis, and determining the target rotation angle of the flight component according to the rotation angle; detecting the rotation direction of the stylus with the length direction of the pen body as the rotation axis, and determining the target rotation direction of the flying component according to the rotation direction; and controlling the flight component to rotate with the target axis as the rotation axis according to the target rotation direction and target rotation angle.
  • the device further includes a first target axis determination module.
  • the first target axis determination module is used to acquire attitude data of the flight component before controlling the flight component to rotate with the target axis as the rotation axis according to the target rotation direction and target rotation angle.
  • the attitude image of the flight component is displayed on the terminal according to the attitude data of the flight component, the attitude image includes the body coordinate system of the flight component, and the body coordinate system is a three-dimensional rectangular coordinate system.
  • in response to the third input of the touch body on the attitude image, one coordinate axis of the flight component is selected as the target axis.
  • the touch body is a stylus
  • the device further includes a second target axis determining module.
  • the second target axis determination module is used to obtain the attitude data of the stylus and the attitude data of the flight component before controlling the flight component to rotate with the target axis as the rotation axis according to the target rotation direction and target rotation angle.
  • the second target axis determination module then determines, according to the attitude data of the stylus and the attitude data of the flight component, the first coordinate axis of the flight component; the first coordinate axis of the flight component is the coordinate axis in the body coordinate system of the flight component whose acute angle with the length direction of the stylus pen body is less than 45 degrees, and the body coordinate system is a three-dimensional Cartesian coordinate system. The first coordinate axis of the flight component is then determined as the target axis.
  • controlling the flight action of the flying component according to the motion of the touch body includes: generating a flight instruction according to the motion of the touch body; obtaining the average movement speed of the touch body within the preset time before the flight instruction is generated; and, when the average motion speed is less than or equal to the preset speed threshold, sending the flight instruction to the flight component, so that the flight component flies according to the flight instruction.
  • the device for controlling the flying component controls the flying component based on the touch body: it displays a control on the terminal in response to the first input of the touch body on the terminal, controls the flight component to move toward the exit in response to the second input of the touch body on the control, and, after the flight component is removed from the terminal, controls the flight component to perform corresponding flight actions according to the movement of the touch body.
  • the device for controlling the flying component provided by the embodiment of the present application can be used in combination with a terminal and a stylus to control the flying component. It provides a complete flight component control scheme, including pushing the flying component out of the terminal, translating the flying component after it is separated from the terminal, and rotating the flying component; this scheme enables flexible and simple control of the flight component, and is very intuitive and easy for users to understand and operate.
  • the device for controlling the flight component in the embodiment of the present application may be an electronic device, or a component in the electronic device, such as an integrated circuit or a chip.
  • the electronic device may be a terminal, or other devices other than the terminal.
  • the electronic device may be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle electronic device, a mobile Internet device (Mobile Internet Device, MID), an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR ) devices, robots, wearable devices, ultra-mobile personal computers (ultra-mobile personal computer, UMPC), netbooks or personal digital assistants (personal digital assistant, PDA), etc.
  • the device for controlling the flight component in the embodiment of the present application may be a device with an operating system.
  • the operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, which are not specifically limited in this embodiment of the present application.
  • the device for controlling the flight component provided by the embodiment of the present application can realize various processes realized by the method embodiments in Fig. 1 to Fig. 6 , and details are not repeated here to avoid repetition.
  • the embodiment of the present application also provides a terminal M00, including a processor M01 and a memory M02.
  • the memory M02 stores programs or instructions that can run on the processor M01.
  • when the programs or instructions are executed by the processor M01, the steps of the above-mentioned method embodiment for controlling the flight component are implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • terminals in the embodiments of the present application include the above-mentioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 9 is a schematic diagram of a hardware structure of an electronic device 1000 implementing an embodiment of the present application.
  • the electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and other components.
  • the electronic device 1000 can also include a power supply (such as a battery) for supplying power to various components, and the power supply can be logically connected to the processor 1010 through a power management system, so that functions such as charging management, discharging management, and power consumption management can be realized through the power management system.
  • the structure of the electronic device shown in FIG. 9 does not constitute a limitation to the electronic device.
  • the electronic device may include more or fewer components than shown in the figure, combine some components, or have a different arrangement of components, which will not be repeated here.
  • the processor 1010 is configured to: respond to a first input of the touch body on the electronic device, display controls on the electronic device; respond to a second input of the touch body to the control, control the The flying component moves toward the exit; after the flying component is removed from the electronic device, the flying action of the flying component is controlled according to the action of the touch body.
  • the processor 1010 is further configured to: after the flying component is removed from the electronic device, before controlling the flying action of the flying component according to the action of the touch body, control The flying component flies to a preset height and hovers.
  • the touch body is a user
  • the processor 1010 is further configured to: after the flight component is removed from the electronic device, control the flight of the flight component according to the movement of the touch body Before the action, the sensing signal of the target sensor in the wearable device worn on the user is acquired, and the action of the touch body is determined according to the sensing signal.
  • controlling the flying action of the flying component according to the movement of the touch body includes: detecting the translation direction and translation distance of the touch body, and determining the The target translation direction and target translation distance of the flight component; controlling the flight component to translate according to the target translation direction and the target translation distance.
  • controlling the flying action of the flying component according to the movement of the touch body includes: detecting the rotation direction and the rotation angle of the touch body, and determining according to the rotation direction and the rotation angle of the touch body A target rotation direction and a target rotation angle of the flight component; controlling the flight component to rotate according to the target rotation direction and the target rotation angle with the target axis as the rotation axis.
  • the touch body is a stylus
  • controlling the flying motion of the flying component according to the movement of the touch body includes: detecting the rotation angle of the stylus with the length direction of the pen body as the rotation axis, and determining the target rotation angle of the flying component according to the rotation angle; detecting the rotation direction of the stylus with the length direction of the pen body as the rotation axis, and determining the target rotation direction of the flying component according to the rotation direction;
  • the flying component is controlled to rotate according to the target rotation direction and the target rotation angle with the target axis as the rotation axis.
  • the processor 1010 is further configured to: before controlling the flight component to rotate according to the target rotation direction and the target rotation angle with the target axis as the rotation axis, acquire the attitude data of the flight component; display the attitude image of the flight component on the electronic device according to the attitude data of the flight component, the attitude image including the body coordinate system of the flight component, the body coordinate system being a three-dimensional Cartesian coordinate system; and, in response to the third input of the touch body on the attitude image, select a coordinate axis of the flying component as the target axis.
  • the touch body is a stylus
  • the processor 1010 is further configured to: before controlling the flying component to rotate according to the target rotation direction and the target rotation angle with the target axis as the rotation axis, acquire the attitude data of the stylus and the attitude data of the flight component; determine, according to the attitude data of the stylus and the attitude data of the flight component, the first coordinate axis of the flight component, the first coordinate axis of the flight component being the coordinate axis in the body coordinate system of the flight component that has an acute angle with the pen body length direction of the stylus of less than 45 degrees, the body coordinate system being a three-dimensional Cartesian coordinate system; and determine the first coordinate axis of the flight component as the target axis.
  • controlling the flight action of the flying component according to the action of the touch body includes: generating a flight instruction according to the action of the touch body; acquiring the average motion speed of the touch body within a preset time before the flight instruction is generated; and, when the average motion speed is less than or equal to a preset speed threshold, sending the flight instruction to the flight component, so that the flight component flies according to the flight instruction.
  • the electronic device controls the flight component based on the touch body: it displays a control on the terminal in response to the first input of the touch body on the terminal, controls the flight component to move toward the exit in response to the second input of the touch body on the control, and, after the flight component is removed from the terminal, controls the flight component to perform corresponding flight actions according to the movement of the touch body.
  • the electronic device provided by the embodiment of the present application can be used in combination with a terminal and a stylus to control the flying component, and provides a complete control scheme including pushing the flying component out of the terminal, translating the flying component after it is separated from the terminal, and rotating the flying component; this scheme enables flexible and simple control of the flight component, and is very intuitive and easy for users to understand and operate.
  • the input unit 1004 may include a graphics processing unit (Graphics Processing Unit, GPU) 10041 and a microphone 10042, and the graphics processor 10041 processes image data of still pictures or video obtained by an image capture device (such as a camera).
  • the display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like.
  • the user input unit 1007 includes at least one of a touch panel 10071 and other input devices 10072 .
  • the touch panel 10071 is also called a touch screen.
  • the touch panel 10071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 10072 may include, but are not limited to, physical keyboards, function keys (such as volume control buttons, switch buttons, etc.), trackballs, mice, and joysticks, which will not be repeated here.
  • the memory 1009 can be used to store software programs as well as various data.
  • the memory 1009 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, wherein the first storage area may store an operating system, an application program or instructions required by at least one function (such as a sound playing function, image playback function, etc.), etc.
  • memory 1009 may include volatile memory or nonvolatile memory, or, memory 1009 may include both volatile and nonvolatile memory.
  • the non-volatile memory can be read-only memory (Read-Only Memory, ROM), programmable read-only memory (Programmable ROM, PROM), erasable programmable read-only memory (Erasable PROM, EPROM), electrically erasable programmable read-only memory (Electrically EPROM, EEPROM) or Flash memory.
  • Volatile memory can be random access memory (Random Access Memory, RAM), static random access memory (Static RAM, SRAM), dynamic random access memory (Dynamic RAM, DRAM), synchronous dynamic random access memory (Synchronous DRAM, SDRAM), double data rate synchronous dynamic random access memory (Double Data Rate SDRAM, DDRSDRAM), enhanced synchronous dynamic random access memory (Enhanced SDRAM, ESDRAM), synchlink dynamic random access memory (Synchlink DRAM, SLDRAM) and direct rambus random access memory (Direct Rambus RAM, DRRAM).
  • the processor 1010 may include one or more processing units; optionally, the processor 1010 integrates an application processor and a modem processor, where the application processor mainly handles operations related to the operating system, the user interface, and application programs, and the modem processor mainly handles wireless communication signals, such as a baseband processor. It can be understood that the above-mentioned modem processor may also not be integrated into the processor 1010.
  • the flight component can be controlled flexibly and conveniently, and it is very intuitive and easy for users to understand and operate.
  • the embodiment of the present application also provides a readable storage medium.
  • the readable storage medium stores programs or instructions.
  • when the programs or instructions are executed by the processor, the various processes of the above-mentioned method embodiments for controlling the flight component are implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the processor is the processor in the electronic device described in the above embodiments.
  • the readable storage medium includes a computer-readable storage medium, such as a computer read-only memory ROM, a random access memory RAM, a magnetic disk or an optical disk, and the like.
  • the embodiment of the present application further provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is used to run programs or instructions to implement the method for controlling the flight component according to any of the aforementioned embodiments.
  • the chip mentioned in the embodiments of the present application may also be referred to as a system-on-chip, a chip system, or a system-on-a-chip.
  • the embodiment of the present application provides a computer program product, the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the various processes in the above-mentioned embodiments of the method for controlling the flight component, and can achieve the same technical effect; to avoid repetition, details are not repeated here.
  • the terms "comprise", "comprising" or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus comprising a set of elements includes not only those elements but also other elements not expressly listed, or elements inherent in the process, method, article or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not preclude the presence of additional identical elements in the process, method, article or apparatus comprising that element.
  • the scope of the methods and devices in the embodiments of the present application is not limited to performing functions in the order shown or discussed; functions may also be performed in a substantially simultaneous manner or in the reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

一种控制飞行组件的方法和装置、终端和可读存储介质,飞行组件可移动地设置在终端的内部,终端上设有供飞行组件移出终端的出口,该方法包括:响应于触控体在终端上的第一输入,在终端上显示控件(S102);响应于触控体对控件的第二输入,控制飞行组件向终端的出口的方向移动(S104);在飞行组件从终端中移出后,根据触控体的动作控制飞行组件的飞行动作(S106)。

Description

控制飞行组件的方法和装置、终端和可读存储介质
相关申请的交叉引用
本申请要求于2022年1月26日提交的申请号为202210093579.5,发明名称为“控制飞行组件的方法和装置、终端和可读存储介质”的中国专利申请的优先权,其通过引用方式全部并入本申请。
技术领域
本申请属于电子技术领域,具体涉及控制飞行组件的方法和装置、终端和可读存储介质。
背景技术
随着手机、平板电脑等终端设备的快速普及,终端上摄像头布置方式多种多样,出现了许多新奇的布置思路,现有布置摄像头思路有可将升降摄像头与无人机结合,当摄像头从手机内弹出,其上的螺旋桨旋转,可将摄像头作为小型无人机飞离机体,实现更灵活的拍摄。但在现有的飞行组件实现方式中,用户对飞行组件的控制不够灵活简便。
发明内容
本申请实施例的目的是提供控制飞行组件的方法和装置、终端和可读存储介质,以对飞行组件进行灵活简便的控制。
第一方面,本申请实施例提供了一种控制飞行组件的方法。所述飞行组件可移动地设置在终端的内部,所述终端上设有供所述飞行组件移出所述终端的出口,所述方法包括:响应于触控体在所述终端上的第一输入,在所述终端上显示控件;响应于所述触控体对所述控件的第二输入,控制所述飞行组件向所述出口的方向移动;在所述飞行组件从所述终端中移出后,根据所 述触控体的动作控制所述飞行组件的飞行动作。
第二方面,本申请实施例提供了一种控制飞行组件的装置。所述飞行组件可移动地设置在终端的内部,所述终端上设有供所述飞行组件移出所述终端的出口,所述装置包括:第一响应模块,用于响应于触控体在所述终端上的第一输入,在所述终端上显示控件;第二响应模块,用于响应于所述触控体对所述控件的第二输入,控制所述飞行组件向所述出口的方向移动;飞行控制模块,用于在所述飞行组件从所述终端中移出后,根据所述触控体的动作控制所述飞行组件的飞行动作。
第三方面,本申请实施例提供了一种终端。所述终端包括处理器和存储器,所述存储器存储可在所述处理器上运行的程序或指令,所述程序或指令被所述处理器执行时实现如第一方面所述的控制飞行组件的方法的步骤。
第四方面,本申请实施例提供了一种可读存储介质。所述可读存储介质上存储程序或指令,所述程序或指令被处理器执行时实现如第一方面的控制飞行组件的方法的步骤。
第五方面,本申请实施例提供了一种芯片,所述芯片包括处理器和通信接口,所述通信接口和所述处理器耦合,所述处理器用于运行程序或指令,实现如第一方面所述的方法。
第六方面,本申请实施例提供一种计算机程序产品,该程序产品被存储在存储介质中,该程序产品被至少一个处理器执行以实现如第一方面所述的方法。
在本申请实施例中,基于触控体对飞行组件进行控制,响应于触控体在终端上的第一输入,在终端上显示控件,响应于触控体对控件的第二输入,控制飞行组件向出口的方向移动,在飞行组件从终端中移出后,根据触控体的动作控制飞行组件进行相应的飞行动作。通过本公开实施例的控制方式,能够对飞行组件进行灵活简便的控制,并且非常直观,用户容易理解和操作。
附图说明
图1是本申请实施例提供的控制飞行组件的方法的流程示意图;
图2是本申请实施例提供的飞行组件的示意图;
图3是本申请实施例提供的窗口界面的示意图;
图4是本申请实施例提供的触控笔的坐标系的示意图;
图5是本申请实施例提供的飞行组件的坐标系的示意图;
图6是本申请实施例提供的设置飞行组件的目标轴的示意图;
图7是本申请实施例提供的控制飞行组件的装置的框图;
图8是本申请实施例提供的终端的示意图;
图9是本申请实施例提供的电子设备的示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员获得的所有其他实施例,都属于本申请保护的范围。
本申请的说明书和权利要求书中的术语“第一”、“第二”等是用于区别类似的对象,而不用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,以便本申请的实施例能够以除了在这里图示或描述的那些以外的顺序实施,且“第一”、“第二”等所区分的对象通常为一类,并不限定对象的个数,例如第一对象可以是一个,也可以是多个。此外,说明书以及权利要求中“和/或”表示所连接对象的至少其中之一,字符“/”,一般表示前后关联对象是一种“或”的关系。
下面结合附图,通过具体的实施例及其应用场景对本申请实施例提供的控制飞行组件的方法进行详细地说明。
在说明本公开实施例的控制飞行组件的方法之前,先对其中涉及的终端、触控体和飞行组件进行举例说明。
在本申请的一个实施例中,飞行组件包括支架、安装在支架上的电池、驱动机构、摄像头、第一无线通信模块、处理器以及存储器。驱动机构可以带动飞行组件整体平移、升降、旋转。在一个例子中,参见图2所示,驱动机构包括螺旋桨A1、螺旋桨A2、螺旋桨A3、螺旋桨A4以及驱动螺旋桨A1、螺旋桨A2、螺旋桨A3、螺旋桨A4的多个电机。摄像头B安装在四个螺旋桨的中间。飞行组件中还设有例如加速度传感器、陀螺仪等传感器,通过这些传感器可以检测出飞行组件的移动和旋转,进而确定飞行组件的姿态。
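For illustration of how readings from sensors such as the accelerometer and gyroscope can be turned into an attitude estimate, the following Python sketch implements a minimal complementary filter for roll and pitch. It is only a sketch under assumed conventions (right-handed body axes, gyro rates in rad/s, accelerometer in m/s²); the sample stream and function names are hypothetical and are not an interface of the flight component described here.

import math

def complementary_filter(roll, pitch, gyro, accel, dt, alpha=0.98):
    """One update step of a minimal roll/pitch complementary filter.

    roll, pitch : current estimates in radians
    gyro        : (gx, gy, gz) angular rates in rad/s (body frame)
    accel       : (ax, ay, az) accelerations in m/s^2 (body frame)
    dt          : time step in seconds
    alpha       : weight of the integrated gyro estimate
    """
    gx, gy, _ = gyro
    ax, ay, az = accel

    # Integrate gyro rates for a responsive short-term estimate.
    roll_gyro = roll + gx * dt
    pitch_gyro = pitch + gy * dt

    # Long-term reference from the gravity direction seen by the accelerometer.
    roll_acc = math.atan2(ay, az)
    pitch_acc = math.atan2(-ax, math.hypot(ay, az))

    # Blend the two: gyro for responsiveness, accelerometer to cancel drift.
    roll = alpha * roll_gyro + (1.0 - alpha) * roll_acc
    pitch = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
    return roll, pitch

if __name__ == "__main__":
    roll = pitch = 0.0
    # Hypothetical stream of (gyro, accel) samples at 100 Hz.
    samples = [((0.0, 0.1, 0.0), (0.0, 0.0, 9.81))] * 100
    for gyro, accel in samples:
        roll, pitch = complementary_filter(roll, pitch, gyro, accel, dt=0.01)
    print(round(math.degrees(pitch), 1))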
在本申请的一个实施例中,终端包括壳体、触控屏、第二无线通信模块、处理器和存储器。壳体上设有可供飞行组件移入移出的出口。飞行组件可以可移动地设置在终端的内部,飞行组件可以从终端的内部通过出口移动到终端的外部,并且在实体上脱离终端。该终端例如可以为手机、平板电脑、笔记本电脑等。
在本申请的一个实施例中,触控笔中设有例如加速度传感器、陀螺仪等传感器,通过这些传感器可以检测出触控笔的移动和旋转,进而确定触控笔的姿态。触控笔中还设有第三无线通信模块、处理器和存储器。
通过第一无线通信模块、第二无线通信模块、第三无线通信模块,支持终端、触控笔、飞行组件两两之间具有信息交互的能力。或者,触控笔可以通过终端和飞行组件进行信息交互。
在本申请的一个实施例中,触控体可以是用户。在该用户身上可以穿戴有可穿戴设备,基于该可穿戴设备中的传感器输出的传感信号,可以检测到用户的动作。例如,可穿戴设备中设有加速度传感器、陀螺仪等传感器,通过这些传感器可以检测出用户的平移动作或者旋转动作。可穿戴设备中还设有第四无线通信模块、处理器和存储器。该可穿戴设备例如可以为智能手表、智能手环等。
通过第一无线通信模块、第二无线通信模块、第四无线通信模块，支持终端、可穿戴设备、飞行组件两两之间具有信息交互的能力。或者，可穿戴设备可以通过终端和飞行组件进行信息交互。
请参看图1,其是本申请实施例提供的一种控制飞行组件的方法的流程示意图。如图1所示,该方法可以包括步骤S102-S106,以下予以详细说明。
步骤S102,响应于触控体在终端上的第一输入,在终端上显示控件。
本公开实施例中,触控体可以是触控笔或者是用户。用户可以使用触控笔或者通过手指直接对终端进行操作以产生第一输入,响应于第一输入,终端显示控件。
在一个例子中,控件为悬浮图标。在一个例子中,该悬浮图标可以被拖拽移动。
在一个例子中,控件为对应于飞行组件的窗口界面。该窗口界面可以包括与飞行组件对应的虚拟飞行组件边框,例如图3所示,窗口界面D是与图2所示的飞行组件对应的虚拟飞行组件边框。窗口界面包括与飞行组件对应的虚拟飞行组件边框,可以方便用户直观的对飞行组件进行操控。
步骤S104,响应于触控体对控件的第二输入,控制飞行组件向所述出口的方向移动。
在一个例子中,控件为悬浮图标并且该悬浮图标可以被拖拽移动,用户可以使用触控笔或者通过手指拖拽移动该悬浮图标向所述出口的方向移动以产生第二输入,以控制飞行组件同步向所述出口的方向移动。也就是说,触控体的滑动方向、悬浮图标被拖拽移动的方向、飞行组件移出终端的方向一致,给用户提供了一种较为直观的操作体验,同时非常简便灵活。
在一个例子中,控件为对应于飞行组件的窗口界面。第二输入为触控笔从窗口界面的第一边框滑动至第二边框产生的输入,第一边框为窗口界面的远离出口的边框,第二边框为窗口界面的靠近出口的边框。参见图3所示,终端的出口C位于终端的顶部,窗口界面D为虚拟飞行组件边框,窗口界面D位于屏幕界面的顶部并且临近出口C,第二输入为触控笔从窗口界面D的第一边框D11滑动至第二边框D12产生的输入,其中,第一边框D11为窗口 界面D的远离出口的边框,第二边框D12为窗口界面的靠近出口的边框。参见图3中的箭头所示,触控笔从第一边框D11向第二边框D12滑动,也就是朝向出口C的方向滑动,响应于该操作,飞行组件从出口C处被推出到终端的外部。也就是说,触控体的滑动方向和飞行组件移出终端的方向一致,给用户提供了一种较为直观的操作体验,同时非常简便灵活。
在一个例子中,第二输入可以是触控笔长按窗口界面的第一边框然后再滑动至第二边框产生的输入。触控笔长按窗口界面的第一边框是指触控笔点击第一边框并且持续时间超过预设时长。
在一个例子中,控件为对应于飞行组件的窗口界面,该窗口界面包括与飞行组件对应的虚拟飞行组件边框,飞行组件跟随触控笔的滑动同步进行移动,以使得触控笔从第一边框滑动至第二边框时,飞行组件恰好从终端中脱出,这种方式可以使得用户更容易感知到自己的滑动操作和飞行组件的移动之间的关联,给予用户更好的操作体验。
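A minimal sketch of the synchronised push-out described above: the stylus position between the far border D11 and the near border D12 is mapped linearly onto the travel of the ejection mechanism, so the component is fully out exactly when the swipe reaches D12. The screen coordinates, the 30 mm travel, and the set_eject_position call are illustrative assumptions rather than an actual device interface.

def swipe_to_eject_fraction(touch_y, border_far_y, border_near_y):
    """Map the stylus position between the two window borders to a 0..1 ejection fraction."""
    span = border_near_y - border_far_y
    if span == 0:
        return 0.0
    fraction = (touch_y - border_far_y) / span
    return max(0.0, min(1.0, fraction))  # clamp positions outside the window

def set_eject_position(fraction, travel_mm=30.0):
    """Hypothetical actuator call: drive the component 'fraction' of its full travel."""
    print(f"eject mechanism at {fraction * travel_mm:.1f} mm")

if __name__ == "__main__":
    border_far_y, border_near_y = 400.0, 100.0   # screen y grows downwards; the exit is at the top
    for touch_y in (400.0, 250.0, 100.0):        # swipe from D11 towards D12
        set_eject_position(swipe_to_eject_fraction(touch_y, border_far_y, border_near_y))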
通过本公开实施例的控制飞行组件的方法,能够对飞行组件进行灵活简便的控制,并且非常直观,用户容易理解和操作。
在本申请的一些实施例中,触控体为触控笔。响应于触控笔在窗口界面上的长按产生的第四输入,控制飞行组件与触控笔建立通信连接。例如,触控笔点击窗口界面的中心位置且点击持续时长超过预设时长。或者,响应于触控笔在窗口界面上的连续点击产生的第五输入,控制飞行组件与触控笔建立通信连接。例如,触控笔连续点击窗口界面的中心位置。在飞行组件与触控笔建立通信连接后,飞行组件与触控笔可以直接进行信息交互,例如指令的发送和接收、传感数据的发送和接收、姿态数据的发送和接收等等。
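The fourth and fifth inputs above are simple touch gestures. The sketch below shows one possible way for the terminal to classify them from raw touch events before triggering pairing between the stylus and the flight component; the event format, the 0.8-second long-press threshold, and the connect callback are all assumptions made for illustration.

LONG_PRESS_S = 0.8        # assumed long-press threshold
DOUBLE_TAP_GAP_S = 0.4    # assumed maximum gap between two taps

def classify_gesture(events):
    """events: list of (kind, timestamp) with kind in {'down', 'up'} on the window centre.

    Returns 'long_press', 'double_tap' or None.
    """
    presses = []            # (down_time, up_time) pairs
    down_time = None
    for kind, t in events:
        if kind == "down":
            down_time = t
        elif kind == "up" and down_time is not None:
            presses.append((down_time, t))
            down_time = None

    for down, up in presses:
        if up - down >= LONG_PRESS_S:
            return "long_press"
    for (d1, u1), (d2, _) in zip(presses, presses[1:]):
        if d2 - u1 <= DOUBLE_TAP_GAP_S:
            return "double_tap"
    return None

def maybe_connect(events, connect):
    """Call the (hypothetical) pairing callback when either gesture is recognised."""
    if classify_gesture(events) in ("long_press", "double_tap"):
        connect()

if __name__ == "__main__":
    maybe_connect([("down", 0.0), ("up", 1.0)], lambda: print("pairing stylus with flight component"))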
步骤S106,在飞行组件从终端中移出后,根据触控体的动作控制飞行组件的飞行动作。
在本申请的一些实施例中,在飞行组件从终端中移出后并且在根据触控体的动作控制飞行组件的飞行动作之前,控制飞行组件飞行至预设高度悬停。 也就是说,飞行组件从终端中移出后,先飞行至预设高度悬停,等待用户的飞行控制命令。预设高度可以为用户设定的高度,例如:预设高度可以为高出终端1米或者距离地面2米。在一个例子中,飞行组件中存储有预设程序,在该预设程序的控制下,飞行组件移出终端后自动飞行至预设高度悬停。
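The "climb to a preset height and hover" behaviour can be realised, for example, with a simple proportional controller on altitude, as sketched below. The 2 m target, the gain, and the read_altitude / set_vertical_speed calls are hypothetical placeholders for whatever interface the flight component's firmware actually provides.

import time

PRESET_HEIGHT_M = 2.0        # e.g. hover 2 m above the ground
KP = 1.2                     # proportional gain, 1/s
MAX_CLIMB_RATE = 0.5         # m/s

def hover_at_preset_height(read_altitude, set_vertical_speed, tolerance=0.05):
    """Climb until the measured altitude is within tolerance of the preset height, then hold."""
    while True:
        error = PRESET_HEIGHT_M - read_altitude()
        if abs(error) <= tolerance:
            set_vertical_speed(0.0)          # hover
            return
        rate = max(-MAX_CLIMB_RATE, min(MAX_CLIMB_RATE, KP * error))
        set_vertical_speed(rate)
        time.sleep(0.01)                     # pace the control loop (shortened for the demo)

if __name__ == "__main__":
    altitude = [0.0]

    def read_altitude():
        return altitude[0]

    def set_vertical_speed(rate):            # toy plant: integrate the commanded rate over 50 ms
        altitude[0] += rate * 0.05

    hover_at_preset_height(read_altitude, set_vertical_speed)
    print(f"hovering at {altitude[0]:.2f} m")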
触控体的动作可以是平移、旋转。在一个例子中,触控体为用户,用户身上穿戴有可穿戴设备。在飞行组件从终端中移出后,在根据触控体的动作控制飞行组件的飞行动作之前,获取穿戴在用户身上的可穿戴设备中的目标传感器的传感信号,根据传感信号确定触控体的动作。在一个例子中,触控体为触控笔,在根据触控体的动作控制飞行组件的飞行动作之前,通过触控笔中安装的传感器输出的传感信号,可以确定触控笔的动作。
下面介绍飞行组件被推出终端后,控制飞行组件进行平移的实施例。
在本申请的一个实施例中,根据触控体的动作控制飞行组件进行平移的过程包括步骤S202-S204。
步骤S202,检测触控体的平移方向和平移距离,根据触控体的平移方向和平移距离确定飞行组件的目标平移方向和目标平移距离。
触控体的平移方向是指触控体在空间中运动的方向。触控体的平移距离为触控体在空间中移动的距离。例如触控体向正东移动1米,则触控体的平移方向为正东方向,平移距离为1米。
飞行组件的目标平移方向和触控体的平移方向一致。当用户想要飞行组件向东进行平移时,可以通过对触控体进行向东平移的操作。当用户想要飞行组件向南进行平移时,可以通过对触控体进行向南平移的操作。
飞行组件的目标平移距离可以是触控体的平移距离的预设倍数。该预设倍数例如为n倍,n≥1,当用户控制触控体的平移距离为P时,飞行组件的目标平移距离为P和n的乘积。
步骤S204,控制飞行组件按照目标平移方向和目标平移距离进行平移。
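Steps S202–S204 reduce to a small mapping: the touch body's displacement vector gives the target direction directly, and its length is scaled by a preset factor n (n ≥ 1) to obtain the target translation distance. The following sketch assumes displacements expressed in metres in a shared east/north/up frame and uses n = 3 purely as an example.

import math

def translation_command(touch_displacement, n=3.0):
    """Turn a touch-body displacement (metres, east/north/up) into a translation command.

    The target direction equals the touch body's direction of motion and the
    target distance is n times the touch body's translation distance (n >= 1).
    """
    dx, dy, dz = touch_displacement
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    if distance == 0.0:
        return None
    direction = (dx / distance, dy / distance, dz / distance)
    return {"direction": direction, "distance": n * distance}

if __name__ == "__main__":
    # The touch body moved 1 m due east, so the component should move 3 m due east.
    print(translation_command((1.0, 0.0, 0.0)))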
下面介绍飞行组件被推出终端后,控制飞行组件进行旋转的实施例。
首先,介绍触控笔自身的坐标系和飞行组件的机体坐标系。参照图4所示,触控笔自身的坐标系为三维正交直角坐标系,其中,触控笔的Z轴是沿笔身长度方向的坐标轴,触控笔的Z轴的正向为触控笔的笔尖指向触控笔的尾部。参照图5所示,机体坐标系是指固定在飞行器上的三维正交直角坐标系,其原点位于飞行器的质心。
在本申请的一个实施例中,根据触控体的动作控制飞行组件进行旋转的过程包括步骤S302-S304。
步骤S302,检测触控体的旋转方向和旋转角度,根据触控体的旋转方向和旋转角度确定飞行组件的目标旋转方向和目标旋转角度。
触控体的旋转角度为触控体在空间中旋转的角度。触控体的旋转方向是指触控体在空间中旋转的方向。
飞行组件的目标旋转角度可以和触控体的旋转角度一致。或者，飞行组件的目标旋转角度可以是触控体的旋转角度的预设倍数。
飞行组件的目标旋转方向和触控体的旋转方向是对应一致的。例如,对于触控体是触控笔的情况,触控笔的Z轴是沿笔身长度方向的坐标轴,触控笔的Z轴为其默认的旋转轴,触控笔的旋转方向是指从触控笔的Z轴的负向看向触控笔的Z轴的正向,触控笔是顺时针旋转还是逆时针旋转。例如,对于触控体是用户的情况,默认用户身高方向是旋转轴的方向,触控体的旋转方向是指从用户的上方看下去,是顺时针旋转还是逆时针旋转。飞行组件的旋转方向,是指从飞行组件的目标轴的负向看向飞行组件的目标轴的正向,该飞行组件是顺时针旋转还是逆时针旋转。通过这种设置方式,使得飞行组件的目标旋转方向和触控体的旋转方向保持一致性,以通过触控体的旋转控制飞行组件进行对应的旋转。
步骤S304,控制飞行组件按照目标旋转方向和目标旋转角度以目标轴为旋转轴进行旋转。
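Steps S302–S304 can likewise be summarised by a small mapping: the component's target rotation direction copies the touch body's rotation direction, and the target angle equals the measured angle or a preset multiple of it. The command dictionary layout in the sketch below is an assumption for illustration only.

def rotation_command(touch_direction, touch_angle_deg, target_axis="Z", multiple=1.0):
    """Map a touch-body rotation onto a component rotation about the target axis.

    touch_direction : 'cw' or 'ccw', as seen looking from the negative end of the
                      touch body's rotation axis towards its positive end
    touch_angle_deg : rotation angle measured on the touch body, in degrees
    multiple        : preset scale factor (>= 1) applied to the angle
    """
    if touch_direction not in ("cw", "ccw"):
        raise ValueError("direction must be 'cw' or 'ccw'")
    return {
        "axis": target_axis,                       # which body axis to rotate about
        "direction": touch_direction,              # same sense as the touch body
        "angle_deg": multiple * touch_angle_deg,   # equal or a preset multiple
    }

if __name__ == "__main__":
    # Rotating the touch body 90 degrees clockwise yields a 90 degree clockwise
    # rotation of the component about its (default) Z axis.
    print(rotation_command("cw", 90.0))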
在本申请的一个实施例中，触控体为触控笔，根据触控体的动作控制飞行组件进行旋转的过程包括步骤S402-S404。
步骤S402,检测触控笔以笔身长度方向为旋转轴的旋转角度,根据旋转角度确定飞行组件的目标旋转角度。检测触控笔以笔身长度方向为旋转轴的旋转方向,根据旋转方向确定飞行组件的目标旋转方向。
通过触控笔中安装的传感器,可以检测以笔身长度方向为旋转轴的旋转角度和旋转方向。触控笔以笔身长度方向为旋转轴的旋转角度也就是以触控笔的Z轴为旋转轴的旋转角度。触控笔以笔身长度方向为旋转轴进行旋转时,触控笔的旋转方向是指从触控笔的Z轴的负向看向触控笔的Z轴的正向,触控笔是顺时针旋转还是逆时针旋转。
飞行组件的目标旋转角度可以和触控笔的旋转角度一致，例如，触控笔以笔身长度方向为旋转轴的旋转角度为180度，则飞行组件的目标旋转角度也为180度。或者，飞行组件的目标旋转角度可以是触控笔的旋转角度的预设倍数。该预设倍数例如为m倍，m≥1，当用户控制触控笔以笔身长度方向为旋转轴的旋转角度为Q时，飞行组件的目标旋转角度为Q和m的乘积。
步骤S404,控制飞行组件照目标旋转方向和目标旋转角度以目标轴为旋转轴进行旋转。
在一个例子中,飞行组件以目标轴进行旋转的目标旋转方向和触控笔以笔身长度方向为旋转轴的旋转方向是对应一致的。
触控笔以笔身长度方向为旋转轴进行旋转时,触控笔的旋转方向是指从触控笔的Z轴的负向看向触控笔的Z轴的正向,触控笔是顺时针旋转还是逆时针旋转。飞行组件以飞行组件的X轴为旋转轴进行旋转时,飞行组件的旋转方向是指从飞行组件的X轴的负向看向飞行组件的X轴的正向,飞行组件是顺时针旋转还是逆时针旋转。飞行组件以飞行组件的Y轴为旋转轴进行旋转时,飞行组件的旋转方向是指从飞行组件的Y轴的负向看向飞行组件的Y轴的正向,飞行组件是顺时针旋转还是逆时针旋转。飞行组件以飞行组件的Z轴为旋转轴进行旋转时,飞行组件的旋转方向是指从飞行组件的Z轴的负 向看向飞行组件的Z轴的正向,飞行组件是顺时针旋转还是逆时针旋转。
如果检测到触控笔以笔身长度方向为旋转轴的旋转方向为顺时针旋转，飞行组件以目标轴为旋转轴的旋转的目标旋转方向同样为顺时针旋转。如果检测到触控笔以笔身长度方向为旋转轴的旋转方向为逆时针旋转，飞行组件以目标轴为旋转轴的旋转的目标旋转方向同样为逆时针旋转。
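The viewing convention above (clockwise or counter-clockwise when looking from the negative end of the rotation axis towards its positive end) can be recovered from the stylus gyroscope by integrating its Z-axis rate over the gesture, as in the following sketch. It assumes a right-handed stylus sensor frame, under which a positive Z rate appears clockwise in that view; if the actual hardware uses the opposite sign convention, the mapping simply flips.

import math

def signed_roll_angle(z_rates, dt):
    """Integrate stylus Z-axis angular rates (rad/s) sampled every dt seconds."""
    return sum(z_rates) * dt

def roll_to_rotation_command(z_rates, dt, target_axis="Z", multiple=1.0):
    """Convert a stylus roll gesture into a component rotation command.

    Assuming a right-handed stylus frame, a positive integrated angle corresponds to a
    clockwise rotation when viewed from the negative end of the pen's Z axis (the tip)
    towards its positive end (the tail); the component reuses the same direction.
    """
    angle = signed_roll_angle(z_rates, dt)
    direction = "cw" if angle >= 0 else "ccw"
    return {
        "axis": target_axis,
        "direction": direction,
        "angle_deg": multiple * abs(math.degrees(angle)),
    }

if __name__ == "__main__":
    # 1 s of samples at 100 Hz while rolling the pen at about pi rad/s: roughly 180 degrees.
    rates = [math.pi] * 100
    print(roll_to_rotation_command(rates, dt=0.01))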
在本申请的一个实施例中,飞行组件的目标轴默认是飞行组件的Z轴,用户可以人为更改飞行组件的目标轴。
在本申请的一个实施例中,终端提供飞行组件的目标轴的设置界面,用户可以在设置界面中设置飞行组件的任一坐标轴为目标轴。
在本申请的一个实施例中,飞行组件的目标轴是通过步骤S502-S506确定的。
步骤S502,获取飞行组件的姿态数据。
通过飞行组件中安装的传感器，可以获取到飞行组件的姿态数据，进而确定飞行组件的姿态。
步骤S504,根据飞行组件的姿态数据在终端上显示飞行组件的姿态图像,姿态图像包括飞行组件的机体坐标系。
飞行组件的姿态图像为飞行组件当前所处的姿态的图像。终端可以接收飞行组件实时拍摄的图像并且显示出来，并且同时显示飞行组件的姿态图像，以便于用户更直观地理解飞行组件的当前姿态和飞行组件拍摄的实时图像的关系，明白其想要的旋转结果应该通过哪种旋转方式才能够实现。参见图6所示，终端显示当前飞行组件拍摄的实时图像和飞行组件的姿态图像，姿态图像中显示有飞行组件的机体坐标系。
步骤S506,响应于触控体在姿态图像上的第三输入,选取飞行组件的一个坐标轴作为目标轴。
参见图6所示,第三输入为用户在终端内显示的飞行组件的姿态图像内选取飞行组件的一个坐标轴产生的输入,被选取的坐标轴即被确定为目标轴。 第三输入例如为触控笔点击飞行组件的姿态图像内的机体坐标系的一个坐标轴产生的输入。例如,用户使用触控笔点击姿态图像内的机体坐标系的Y坐标轴,则飞行组件的Y坐标轴被确定为目标轴。
在该例子中,将飞行组件的姿态图像显示在终端上让用户进行选取,更为直观便于用户操作。
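Hit-testing the third input can be as simple as measuring the distance from the tap point to each axis segment drawn in the attitude image and selecting the nearest one, as sketched below. The projected 2-D endpoints of the axes are made-up values; how the terminal actually projects the body frame onto the screen is not specified here.

import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b (all 2-D screen coordinates)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    abx, aby = bx - ax, by - ay
    length_sq = abx * abx + aby * aby
    if length_sq == 0.0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / length_sq))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def pick_target_axis(tap, projected_axes, max_distance=40.0):
    """projected_axes: {'X': (origin, end), ...} in screen pixels; returns the axis name or None."""
    best_axis, best_dist = None, max_distance
    for name, (origin, end) in projected_axes.items():
        d = point_segment_distance(tap, origin, end)
        if d < best_dist:
            best_axis, best_dist = name, d
    return best_axis

if __name__ == "__main__":
    axes = {
        "X": ((200, 200), (320, 200)),
        "Y": ((200, 200), (200, 90)),
        "Z": ((200, 200), (120, 280)),
    }
    print(pick_target_axis((198, 130), axes))   # a tap close to the drawn Y axis -> 'Y'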
在本申请的一个实施例中,飞行组件的目标轴是通过步骤S602-S606确定的。
步骤S602,获取触控笔的姿态数据和飞行组件的姿态数据。
通过飞行组件中安装的传感器，可以获取到飞行组件的姿态数据，进而确定飞行组件的姿态。
步骤S604,根据触控笔的姿态数据和飞行组件的姿态数据,确定飞行组件的第一坐标轴,飞行组件的第一坐标轴为飞行组件的机体坐标系中与触控笔的笔身长度方向的锐角夹角小于45度的坐标轴。
飞行组件的机体坐标系中的三轴中,如果某一轴与触控笔的笔身长度方向的锐角夹角小于45度,说明该坐标轴与触控笔的Z轴在空间中更为近似平行,将该坐标轴作为飞行组件的第一坐标轴。
步骤S606,将飞行组件的第一坐标轴确定为目标轴。
在该例子中,将飞行组件的与触控笔更为近似平行的坐标轴设置为第一坐标轴,用户只需要调整触控笔的姿态就可以设置飞行组件的目标轴,这种方式方便直观,便于用户操作。
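Steps S602–S606 amount to comparing the direction of the stylus Z axis with the three body axes and keeping the axis whose acute angle is below 45 degrees. A minimal numpy sketch follows; the two rotation matrices stand in for whatever attitude representation the stylus and the flight component actually report, so their values are purely illustrative.

import numpy as np

def first_coordinate_axis(r_stylus, r_body, threshold_deg=45.0):
    """Return the body axis ('X', 'Y' or 'Z') closest in direction to the stylus Z axis.

    r_stylus, r_body : 3x3 rotation matrices from the stylus / body frame to a common
                       world frame (their columns are the frame axes in world coordinates).
    Only axes whose acute angle with the pen's length direction is below the threshold qualify.
    """
    pen_dir = r_stylus[:, 2]                      # stylus Z axis (along the pen body) in the world frame
    best_name, best_angle = None, threshold_deg
    for name, axis in zip("XYZ", r_body.T):       # rows of R.T are the columns of R
        cos_acute = abs(np.dot(pen_dir, axis))    # |cos| gives the acute angle
        angle = np.degrees(np.arccos(np.clip(cos_acute, -1.0, 1.0)))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name

if __name__ == "__main__":
    r_stylus = np.eye(3)                          # pen pointing along world Z (illustrative)
    # Body frame rotated 30 degrees about the world X axis: its Y and Z axes tilt by 30 degrees.
    c, s = np.cos(np.radians(30)), np.sin(np.radians(30))
    r_body = np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    print(first_coordinate_axis(r_stylus, r_body))   # -> 'Z' (30 degrees < 45)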
在一个例子中,所述根据触控体的动作控制飞行组件的飞行动作,包括以下步骤:
步骤S702,根据触控体的动作生成飞行指令。
步骤S704,获取触控体在生成飞行指令之前的预设时间内的平均运动速度。
步骤S706，在平均运动速度小于等于预设速度阈值的情况下，将飞行指令发送给飞行组件，以使得飞行组件根据飞行指令进行飞行。
通过这种方式,过滤掉触控体频繁快速动作生成的飞行指令,以避免飞行组件在短时间内收到频繁、冲突的飞行指令,防止对飞行组件的安全和周围环境的安全造成不良影响。
在一个例子中,飞行指令为平移指令。获取触控体在生成平移指令之前的预设时间内的平均运动速度,只在平均运动速度小于等于预设阈值的情况下,才将平移指令发送给飞行组件,在平均运动速度大于预设阈值的情况下,过滤掉这条平移指令不发送给飞行组件。该预设时间和预设阈值可以根据实际情况进行设定,该预设时间例如可以为3秒,该预设阈值例如可以为1米每秒。
在一个例子中,飞行指令为旋转指令。获取触控体在生成旋转指令之前的预设时间内的平均运动速度,只在平均运动速度小于等于预设阈值的情况下,才将旋转指令发送给飞行组件,在平均运动速度大于预设阈值的情况下,过滤掉这条旋转指令,不发送给飞行组件。该预设时间和预设阈值可以根据实际情况进行设定,该预设时间例如可以为3秒,该预设阈值例如可以为1米每秒。
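Steps S702–S706 describe a rate filter: a command is forwarded only when the touch body's average speed over the preceding window is at or below a threshold. The sketch below keeps a short history of timestamped positions to compute that average, using the 3-second window and 1 m/s threshold from the example above; send_to_flight_component is a hypothetical transport call.

import math
from collections import deque

PRESET_WINDOW_S = 3.0       # averaging window from the example above
PRESET_SPEED_M_S = 1.0      # speed threshold from the example above

class CommandFilter:
    """Forward a flight command only when the touch body has been moving slowly enough."""

    def __init__(self):
        self.history = deque()               # (timestamp, (x, y, z)) samples

    def record_position(self, t, position):
        self.history.append((t, position))
        while self.history and t - self.history[0][0] > PRESET_WINDOW_S:
            self.history.popleft()

    def average_speed(self):
        if len(self.history) < 2:
            return 0.0
        t0, t1 = self.history[0][0], self.history[-1][0]
        path = 0.0
        prev = self.history[0][1]
        for _, p in list(self.history)[1:]:
            path += math.dist(prev, p)       # path length travelled inside the window
            prev = p
        return path / (t1 - t0) if t1 > t0 else 0.0

    def maybe_send(self, command, send_to_flight_component):
        if self.average_speed() <= PRESET_SPEED_M_S:
            send_to_flight_component(command)
            return True
        return False                          # filtered out: the touch body was moving too fast

if __name__ == "__main__":
    f = CommandFilter()
    for i in range(7):                        # 3 s of samples at roughly 0.1 m/s
        f.record_position(i * 0.5, (i * 0.05, 0.0, 0.0))
    print(f.maybe_send({"type": "translate"}, lambda c: print("sent", c)))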
在一个例子中，在飞行组件从终端中移出后，在终端上显示飞行控制界面，响应于触控体在飞行控制界面的输入，控制飞行组件的飞行动作。例如，飞行控制界面上设有“平移”控件和“旋转”控件，触控体在飞行控制界面上点击“平移”控件，输入目标平移方向和目标平移距离就可以控制飞行组件平移；触控体点击“旋转”控件，输入目标轴、目标旋转方向、目标旋转角度就可以控制飞行组件旋转。
在本申请的一些实施例子中,前述控制飞行组件的方法实施例中的全部或部分步骤可以应用于前述的终端中,由前述的终端执行,例如,前述步骤S102-S104可以由终端执行。在本申请的一些实施例子中,在飞行组件被推出终端后,控制飞行组件进行平移/旋转的相关步骤中的全部或者部分步骤可以 由触控笔或者可穿戴设备执行。
在本申请实施例中,基于触控体对飞行组件进行控制,响应于触控体在终端上的第一输入,在终端上显示控件,响应于触控体对控件的第二输入,控制飞行组件向出口的方向移动,在飞行组件从终端中移出后,根据触控体的动作控制飞行组件进行相应的飞行动作。通过本公开实施例的控制方式,能够对飞行组件进行灵活简便的控制,并且非常直观,用户容易理解和操作。
本申请实施例的控制飞行组件的方法,可以结合终端和触控笔一起使用对飞行组件进行控制,提出了一整套包括从终端中推出飞行组件、在飞行组件脱离终端后平移飞行组件、旋转飞行组件的控制方案,能够对飞行组件进行灵活简便的控制,并且非常直观,用户容易理解和操作。
需要说明的是,本申请实施例提供的控制飞行组件的方法,执行主体可以为控制飞行组件的装置。本申请实施例中以控制飞行组件的装置执行控制飞行组件的方法为例,说明本申请实施例提供的控制飞行组件的装置。
本申请实施例提供了控制飞行组件的装置,参见图7所示,该控制飞行组件的装置包括第一响应模块W1、第二响应模块W2以及飞行控制模块W3。
第一响应模块W1,用于响应于触控体在终端上的第一输入,在终端上显示控件。
第二响应模块W2,用于响应于触控体对控件的第二输入,控制飞行组件向出口的方向移动。
飞行控制模块W3,用于在飞行组件从终端中移出后,根据触控体的动作控制飞行组件的飞行动作。
在一个例子中,所述装置还包括悬停控制模块。悬停控制模块,用于在飞行组件从终端中移出后,在根据触控体的动作控制飞行组件的飞行动作之前,控制飞行组件飞行至预设高度悬停。
在一个例子中,触控体为用户,所述装置还包括触控体动作确定模块。触控体动作确定模块,用于在飞行组件从终端中移出后,在根据触控体的动 作控制飞行组件的飞行动作之前,获取穿戴在用户身上的可穿戴设备中的目标传感器的传感信号,根据传感信号确定触控体的动作。
在一个例子中,所述根据触控体的动作控制飞行组件的飞行动作,包括:检测触控体的平移方向和平移距离,根据触控体的平移方向和平移距离确定飞行组件的目标平移方向和目标平移距离。控制飞行组件按照目标平移方向和目标平移距离进行平移。
在一个例子中,所述根据触控体的动作控制飞行组件的飞行动作,包括:检测触控体的旋转方向和旋转角度,根据触控体的旋转方向和旋转角度确定飞行组件的目标旋转方向和目标旋转角度。控制飞行组件按照目标旋转方向和目标旋转角度以目标轴为旋转轴进行旋转。
在一个例子中,触控体为触控笔,所述根据触控体的动作控制飞行组件的飞行动作,包括:检测触控笔以笔身长度方向为旋转轴的旋转角度,根据旋转角度确定飞行组件的目标旋转角度。检测触控笔以笔身长度方向为旋转轴的旋转方向,根据旋转方向确定飞行组件的目标旋转方向。控制飞行组件照目标旋转方向和目标旋转角度以目标轴为旋转轴进行旋转。
在一个例子中,所述装置还包括第一目标轴确定模块。第一目标轴确定模块,用于在控制飞行组件按照目标旋转方向和目标旋转角度以目标轴为旋转轴进行旋转之前,获取飞行组件的姿态数据。根据飞行组件的姿态数据在终端上显示飞行组件的姿态图像,所述姿态图像包括飞行组件的机体坐标系,所述机体坐标系为三维直角坐标系。响应于触控体在姿态图像上的第三输入,选取飞行组件的一个坐标轴作为目标轴。
在一个例子中,触控体为触控笔,所述装置还包括第二目标轴确定模块。第二目标轴确定模块,用于在控制飞行组件按照目标旋转方向和目标旋转角度以目标轴为旋转轴进行旋转之前,获取触控笔的姿态数据和飞行组件的姿态数据。根据触控笔的姿态数据和飞行组件的姿态数据,确定飞行组件的第一坐标轴,飞行组件的第一坐标轴为飞行组件的机体坐标系中与触控笔的笔 身长度方向的锐角夹角小于45度的坐标轴,所述机体坐标系为三维直角坐标系。将飞行组件的第一坐标轴确定为目标轴。
在一个例子中,所述根据触控体的动作控制飞行组件的飞行动作,包括:根据触控体的动作生成飞行指令。获取触控体在生成飞行指令之前的预设时间内的平均运动速度。在平均运动速度小于等于预设速度阈值的情况下,将飞行指令发送给飞行组件,以使得飞行组件根据飞行指令进行飞行。
本申请实施例提供的控制飞行组件的装置,基于触控体对飞行组件进行控制,响应于触控体在终端上的第一输入,在终端上显示控件,响应于触控体对控件的第二输入,控制飞行组件向出口的方向移动,在飞行组件从终端中移出后,根据触控体的动作控制飞行组件进行相应的飞行动作。通过本公开实施例的控制方式,能够对飞行组件进行灵活简便的控制,并且非常直观,用户容易理解和操作。
本申请实施例提供的控制飞行组件的装置,可以结合终端和触控笔一起使用对飞行组件进行控制,提出了一整套包括从终端中推出飞行组件、在飞行组件脱离终端后平移飞行组件、旋转飞行组件的控制方案,能够对飞行组件进行灵活简便的控制,并且非常直观,用户容易理解和操作。
本申请实施例中的控制飞行组件的装置可以是电子设备,也可以是电子设备中的部件,例如集成电路或芯片。该电子设备可以是终端,也可以为除终端之外的其他设备。示例性的,电子设备可以为手机、平板电脑、笔记本电脑、掌上电脑、车载电子设备、移动上网装置(Mobile Internet Device,MID)、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、机器人、可穿戴设备、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本或者个人数字助理(personal digital assistant,PDA)等,还可以为服务器、网络附属存储器(Network Attached Storage,NAS)、个人计算机(personal computer,PC)、电视机(television,TV)、柜员机或者自助机等,本申请实施例不作具体限定。
本申请实施例中的控制飞行组件的装置可以为具有操作系统的装置。该操作系统可以为安卓(Android)操作系统,可以为ios操作系统,还可以为其他可能的操作系统,本申请实施例不作具体限定。
本申请实施例提供的控制飞行组件的装置能够实现图1至图6的方法实施例实现的各个过程,为避免重复,这里不再赘述。
可选地,如图8所示,本申请实施例还提供一种终端M00,包括处理器M01和存储器M02,存储器M02上存储有可在所述处理器M01上运行的程序或指令,该程序或指令被处理器M01执行时实现上述控制飞行组件的方法实施例的各个步骤,且能达到相同的技术效果,为避免重复,这里不再赘述。
需要说明的是,本申请实施例中的终端包括上述所述的移动电子设备和非移动电子设备。
图9为实现本申请实施例的一种电子设备1000的硬件结构示意图。
该电子设备1000包括但不限于:射频单元1001、网络模块1002、音频输出单元1003、输入单元1004、传感器1005、显示单元1006、用户输入单元1007、接口单元1008、存储器1009、以及处理器1010等部件。
本领域技术人员可以理解,电子设备1000还可以包括给各个部件供电的电源(比如电池),电源可以通过电源管理系统与处理器1010逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。图9中示出的电子设备结构并不构成对电子设备的限定,电子设备可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置,在此不再赘述。
处理器1010用于:响应于触控体在所述电子设备上的第一输入,在所述电子设备上显示控件;响应于所述触控体对所述控件的第二输入,控制所述飞行组件向所述出口的方向移动;在所述飞行组件从所述电子设备中移出后,根据所述触控体的动作控制所述飞行组件的飞行动作。
在一个例子中,处理器1010还用于:在所述飞行组件从所述电子设备中移出后,在根据所述触控体的动作控制所述飞行组件的飞行动作之前,控制 所述飞行组件飞行至预设高度悬停。
在一个例子中,所述触控体为用户,处理器1010还用于:在所述飞行组件从所述电子设备中移出后,在根据所述触控体的动作控制所述飞行组件的飞行动作之前,获取穿戴在所述用户身上的可穿戴设备中的目标传感器的传感信号,根据所述传感信号确定所述触控体的动作。
在一个例子中,所述根据触控体的动作控制所述飞行组件的飞行动作,包括:检测所述触控体的平移方向和平移距离,根据所述触控体的平移方向和平移距离确定所述飞行组件的目标平移方向和目标平移距离;控制所述飞行组件按照所述目标平移方向和所述目标平移距离进行平移。
在一个例子中,所述根据触控体的动作控制所述飞行组件的飞行动作,包括:检测所述触控体的旋转方向和旋转角度,根据所述触控体的旋转方向和旋转角度确定所述飞行组件的目标旋转方向和目标旋转角度;控制所述飞行组件按照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转。
在一个例子中,所述触控体为触控笔,所述根据触控体的动作控制所述飞行组件的飞行动作,包括:检测所述触控笔以笔身长度方向为旋转轴的旋转角度,根据所述旋转角度确定所述飞行组件的目标旋转角度;检测所述触控笔以笔身长度方向为旋转轴的旋转方向,根据所述旋转方向确定所述飞行组件的目标旋转方向;控制所述飞行组件照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转。
在一个例子中,处理器1010还用于:在所述控制所述飞行组件按照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转之前,获取所述飞行组件的姿态数据;根据所述飞行组件的姿态数据在所述电子设备上显示所述飞行组件的姿态图像,所述姿态图像包括所述飞行组件的机体坐标系,所述机体坐标系为三维直角坐标系;响应于所述触控体在所述姿态图像上的第三输入,选取所述飞行组件的一个坐标轴作为目标轴。
在一个例子中,所述触控体为触控笔,处理器1010还用于:在所述控制所述飞行组件按照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转之前,获取所述触控笔的姿态数据和所述飞行组件的姿态数据;根据所述触控笔的姿态数据和所述飞行组件的姿态数据,确定所述飞行组件的第一坐标轴,所述飞行组件的第一坐标轴为所述飞行组件的机体坐标系中与所述触控笔的笔身长度方向的锐角夹角小于45度的坐标轴,所述机体坐标系为三维直角坐标系;将所述飞行组件的第一坐标轴确定为目标轴。
在一个例子中,所述根据触控体的动作控制所述飞行组件的飞行动作,包括:根据所述触控体的动作生成飞行指令;获取所述触控体在生成所述飞行指令之前的预设时间内的平均运动速度;在所述平均运动速度小于等于预设速度阈值的情况下,将所述飞行指令发送给所述飞行组件,以使得所述飞行组件根据所述飞行指令进行飞行。
本申请实施例提供的电子设备,基于触控体对飞行组件进行控制,响应于触控体在终端上的第一输入,在终端上显示控件,响应于触控体对控件的第二输入,控制飞行组件向出口的方向移动,在飞行组件从终端中移出后,根据触控体的动作控制飞行组件进行相应的飞行动作。通过本公开实施例的控制方式,能够对飞行组件进行灵活简便的控制,并且非常直观,用户容易理解和操作。
本申请实施例提供的电子设备,可以结合终端和触控笔一起使用对飞行组件进行控制,提出了一整套包括从终端中推出飞行组件、在飞行组件脱离终端后平移飞行组件、旋转飞行组件的控制方案,能够对飞行组件进行灵活简便的控制,并且非常直观,用户容易理解和操作。
应理解的是,本申请实施例中,输入单元1004可以包括图形处理器(Graphics Processing Unit,GPU)10041和麦克风10042,图形处理器10041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。显示单元1006可包括显示面板10061, 可以采用液晶显示器、有机发光二极管等形式来配置显示面板10061。用户输入单元1007包括触控面板10071以及其他输入设备10072中的至少一种。触控面板10071,也称为触摸屏。触控面板10071可包括触摸检测装置和触摸控制器两个部分。其他输入设备10072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
存储器1009可用于存储软件程序以及各种数据。存储器1009可主要包括存储程序或指令的第一存储区和存储数据的第二存储区,其中,第一存储区可存储操作系统、至少一个功能所需的应用程序或指令(比如声音播放功能、图像播放功能等)等。此外,存储器1009可以包括易失性存储器或非易失性存储器,或者,存储器1009可以包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(Read-Only Memory,ROM)、可编程只读存储器(Programmable ROM,PROM)、可擦除可编程只读存储器(Erasable PROM,EPROM)、电可擦除可编程只读存储器(Electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(Random Access Memory,RAM),静态随机存取存储器(Static RAM,SRAM)、动态随机存取存储器(Dynamic RAM,DRAM)、同步动态随机存取存储器(Synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(Double Data Rate SDRAM,DDRSDRAM)、增强型同步动态随机存取存储器(Enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(Synch link DRAM,SLDRAM)和直接内存总线随机存取存储器(Direct Rambus RAM,DRRAM)。本申请实施例中的存储器1009包括但不限于这些和任意其它适合类型的存储器。
处理器1010可包括一个或多个处理单元;可选的,处理器1010集成应用处理器和调制解调处理器,其中,应用处理器主要处理涉及操作系统、用户界面和应用程序等的操作,调制解调处理器主要处理无线通信信号,如基带处理器。可以理解的是,上述调制解调处理器也可以不集成到处理器1010 中。
通过本公开实施例的电子设备,能够对飞行组件进行灵活简便的控制,并且非常直观,用户容易理解和操作。
本申请实施例还提供一种可读存储介质,所述可读存储介质上存储有程序或指令,该程序或指令被处理器执行时实现上述控制飞行组件的方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
其中,所述处理器为上述实施例中所述的电子设备中的处理器。所述可读存储介质,包括计算机可读存储介质,如计算机只读存储器ROM、随机存取存储器RAM、磁碟或者光盘等。
本申请实施例另提供了一种芯片,所述芯片包括处理器和通信接口,所述通信接口和所述处理器耦合,所述处理器用于运行程序或指令,实现前述控制飞行组件的任一方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
应理解,本申请实施例提到的芯片还可以称为系统级芯片、系统芯片、芯片系统或片上系统芯片等。
本申请实施例提供一种计算机程序产品,该程序产品被存储在存储介质中,该程序产品被至少一个处理器执行以实现如上述控制飞行组件的方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。此外,需要指出的是,本申请实施方式中的方法和装置的范围不限按示出或讨论的顺序来执行功能,还可包括根据所涉及的功能按基本同时的方式或按相反的顺序来执行功能,例 如,可以按不同于所描述的次序来执行所描述的方法,并且还可以添加、省去、或组合各种步骤。另外,参照某些示例所描述的特征可在其他示例中被组合。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以计算机软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端(可以是手机,计算机,服务器,或者网络设备等)执行本申请各个实施例所述的方法。
上面结合附图对本申请的实施例进行了描述,但是本申请并不局限于上述的具体实施方式,上述的具体实施方式仅仅是示意性的,而不是限制性的,本领域的普通技术人员在本申请的启示下,在不脱离本申请宗旨和权利要求所保护的范围情况下,还可做出很多形式,均属于本申请的保护之内。

Claims (22)

  1. 一种控制飞行组件的方法,所述飞行组件可移动地设置在终端的内部,所述终端上设有供所述飞行组件移出所述终端的出口,所述方法包括:
    响应于触控体在所述终端上的第一输入,在所述终端上显示控件;
    响应于所述触控体对所述控件的第二输入,控制所述飞行组件向所述出口的方向移动;
    在所述飞行组件从所述终端中移出后,根据所述触控体的动作控制所述飞行组件的飞行动作。
  2. 根据权利要求1所述的方法,其中,在所述飞行组件从所述终端中移出后,在所述根据所述触控体的动作控制所述飞行组件的飞行动作之前,所述方法还包括:
    控制所述飞行组件飞行至预设高度悬停。
  3. 根据权利要求1所述的方法,其中,所述触控体为用户,在所述飞行组件从所述终端中移出后,在所述根据所述触控体的动作控制所述飞行组件的飞行动作之前,所述方法还包括:
    获取穿戴在所述用户身上的可穿戴设备中的目标传感器的传感信号;
    根据所述传感信号确定所述触控体的动作。
  4. 根据权利要求1所述的方法,其中,所述根据所述触控体的动作控制所述飞行组件的飞行动作,包括:
    检测所述触控体的平移方向和平移距离,根据所述触控体的平移方向和平移距离确定所述飞行组件的目标平移方向和目标平移距离;
    控制所述飞行组件按照所述目标平移方向和所述目标平移距离进行平移。
  5. 根据权利要求1所述的方法,其中,所述根据所述触控体的动作控制所述飞行组件的飞行动作,包括:
    检测所述触控体的旋转方向和旋转角度,根据所述触控体的旋转方向和旋转角度确定所述飞行组件的目标旋转方向和目标旋转角度;
    控制所述飞行组件按照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转。
  6. 根据权利要求1所述的方法,其中,所述触控体为触控笔,所述根据所述触控体的动作控制所述飞行组件的飞行动作,包括:
    检测所述触控笔以笔身长度方向为旋转轴的旋转角度,根据所述旋转角度确定所述飞行组件的目标旋转角度;
    检测所述触控笔以笔身长度方向为旋转轴的旋转方向,根据所述旋转方向确定所述飞行组件的目标旋转方向;
    控制所述飞行组件照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转。
  7. 根据权利要求5或6所述的方法,其中,在所述控制所述飞行组件按照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转之前,所述方法还包括:
    获取所述飞行组件的姿态数据;
    根据所述飞行组件的姿态数据在所述终端上显示所述飞行组件的姿态图像,所述姿态图像包括所述飞行组件的机体坐标系,所述机体坐标系为三维直角坐标系;
    响应于所述触控体在所述姿态图像上的第三输入,选取所述飞行组件的一个坐标轴作为目标轴。
  8. 根据权利要求5或6所述的方法,其中,所述触控体为触控笔,在所述控制所述飞行组件按照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转之前,所述方法还包括:
    获取所述触控笔的姿态数据和所述飞行组件的姿态数据;
    根据所述触控笔的姿态数据和所述飞行组件的姿态数据,确定所述飞行组件的第一坐标轴,所述飞行组件的第一坐标轴为所述飞行组件的机体坐标系中与所述触控笔的笔身长度方向的锐角夹角小于45度的坐标轴,所述机体 坐标系为三维直角坐标系;
    将所述飞行组件的第一坐标轴确定为目标轴。
  9. 根据权利要求4-8任一项所述的方法,其中,所述根据所述触控体的动作控制所述飞行组件的飞行动作,包括:
    根据所述触控体的动作生成飞行指令;
    获取所述触控体在生成所述飞行指令之前的预设时间内的平均运动速度;
    在所述平均运动速度小于等于预设速度阈值的情况下,将所述飞行指令发送给所述飞行组件,以使得所述飞行组件根据所述飞行指令进行飞行。
  10. 一种控制飞行组件的装置,所述飞行组件可移动地设置在终端的内部,所述终端上设有供所述飞行组件移出所述终端的出口,所述装置包括:
    第一响应模块,用于响应于触控体在所述终端上的第一输入,在所述终端上显示控件;
    第二响应模块,用于响应于所述触控体对所述控件的第二输入,控制所述飞行组件向所述出口的方向移动;
    飞行控制模块,用于在所述飞行组件从所述终端中移出后,根据所述触控体的动作控制所述飞行组件的飞行动作。
  11. 根据权利要求10所述的装置,其中,所述飞行控制模块还用于:
    在所述根据所述触控体的动作控制所述飞行组件的飞行动作之前,控制所述飞行组件飞行至预设高度悬停。
  12. 根据权利要求10所述的装置,其中,所述触控体为用户,所述装置还包括触控体动作确定模块;
    所述触控体动作确定模块,用于在所述飞行组件从所述终端中移出后,在所述根据所述触控体的动作控制所述飞行组件的飞行动作之前,获取穿戴在所述用户身上的可穿戴设备中的目标传感器的传感信号,根据所述传感信号确定所述触控体的动作。
  13. 根据权利要求10所述的装置,其中,所述根据所述触控体的动作控 制所述飞行组件的飞行动作,包括:
    检测所述触控体的平移方向和平移距离,根据所述触控体的平移方向和平移距离确定所述飞行组件的目标平移方向和目标平移距离;
    控制所述飞行组件按照所述目标平移方向和所述目标平移距离进行平移。
  14. 根据权利要求10所述的装置,其中,所述根据所述触控体的动作控制所述飞行组件的飞行动作,包括:
    检测所述触控体的旋转方向和旋转角度,根据所述触控体的旋转方向和旋转角度确定所述飞行组件的目标旋转方向和目标旋转角度;
    控制所述飞行组件按照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转。
  15. 根据权利要求10所述的装置,其中,所述触控体为触控笔,所述根据所述触控体的动作控制所述飞行组件的飞行动作,包括:
    检测所述触控笔以笔身长度方向为旋转轴的旋转角度,根据所述旋转角度确定所述飞行组件的目标旋转角度;
    检测所述触控笔以笔身长度方向为旋转轴的旋转方向,根据所述旋转方向确定所述飞行组件的目标旋转方向;
    控制所述飞行组件照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转。
  16. 根据权利要求14或15所述的装置,其中,所述装置还包括目标轴选取模块;
    所述目标轴选取模块,用于在所述控制所述飞行组件按照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转之前,获取所述飞行组件的姿态数据,根据所述飞行组件的姿态数据在所述终端上显示所述飞行组件的姿态图像,所述姿态图像包括所述飞行组件的机体坐标系,所述机体坐标系为三维直角坐标系,响应于所述触控体在所述姿态图像上的第三输入,选取所述飞行组件的一个坐标轴作为目标轴。
  17. 根据权利要求14或15所述的装置,其中,所述触控体为触控笔,所述装置还包括目标轴确定模块;
    所述目标轴确定模块,用于在所述控制所述飞行组件按照所述目标旋转方向和所述目标旋转角度以目标轴为旋转轴进行旋转之前,获取所述触控笔的姿态数据和所述飞行组件的姿态数据,根据所述触控笔的姿态数据和所述飞行组件的姿态数据,确定所述飞行组件的第一坐标轴,所述飞行组件的第一坐标轴为所述飞行组件的机体坐标系中与所述触控笔的笔身长度方向的锐角夹角小于45度的坐标轴,所述机体坐标系为三维直角坐标系,将所述飞行组件的第一坐标轴确定为目标轴。
  18. 根据权利要求13-17任一项所述的装置,其中,所述根据所述触控体的动作控制所述飞行组件的飞行动作,包括:
    根据所述触控体的动作生成飞行指令;
    获取所述触控体在生成所述飞行指令之前的预设时间内的平均运动速度;
    在所述平均运动速度小于等于预设速度阈值的情况下,将所述飞行指令发送给所述飞行组件,以使得所述飞行组件根据所述飞行指令进行飞行。
  19. 一种终端,包括处理器和存储器,所述存储器存储可在所述处理器上运行的程序或指令,所述程序或指令被所述处理器执行时实现如权利要求1-9任一项所述的控制飞行组件的方法的步骤。
  20. 一种可读存储介质,所述可读存储介质上存储程序或指令,所述程序或指令被处理器执行时实现如权利要求1-9任一项所述的控制飞行组件的方法的步骤。
  21. 一种芯片,所述芯片包括处理器和通信接口,所述通信接口和所述处理器耦合,所述处理器用于运行程序或指令,实现如权利要求1-9任一项所述的方法。
  22. 一种计算机程序产品,所述程序产品被存储在非瞬态存储介质中,所述程序产品被至少一个处理器执行以实现如权利要求1-9任一项所述的方法。
PCT/CN2023/073405 2022-01-26 2023-01-20 控制飞行组件的方法和装置、终端和可读存储介质 WO2023143460A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210093579.5 2022-01-26
CN202210093579.5A CN114510075A (zh) 2022-01-26 2022-01-26 控制飞行组件的方法和装置、终端和可读存储介质

Publications (1)

Publication Number Publication Date
WO2023143460A1 true WO2023143460A1 (zh) 2023-08-03

Family

ID=81548975

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/073405 WO2023143460A1 (zh) 2022-01-26 2023-01-20 控制飞行组件的方法和装置、终端和可读存储介质

Country Status (2)

Country Link
CN (1) CN114510075A (zh)
WO (1) WO2023143460A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114510075A (zh) * 2022-01-26 2022-05-17 维沃移动通信有限公司 控制飞行组件的方法和装置、终端和可读存储介质

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101795142A (zh) * 2009-12-31 2010-08-04 上海杰远环保科技有限公司 一种具有飞行器组件的系统
CN106231173A (zh) * 2015-06-02 2016-12-14 Lg电子株式会社 移动终端及其控制方法
CN105955292A (zh) * 2016-05-20 2016-09-21 腾讯科技(深圳)有限公司 一种控制飞行器飞行的方法、移动终端、飞行器及系统
WO2017208199A1 (en) * 2016-06-03 2017-12-07 ZHOU, Tiger Amphibious vtol super drone camera in mobile case (phone case) with multiple aerial and aquatic flight modes for capturing panoramic virtual reality views, selfie and interactwe video
CN106303182A (zh) * 2016-09-18 2017-01-04 珠海格力电器股份有限公司 一种飞行摄像头装置、方法及终端
CN107087042A (zh) * 2017-04-21 2017-08-22 天津卉茗共创科技有限公司 摄像机构、终端以及摄像系统
CN108227726A (zh) * 2018-01-11 2018-06-29 深圳电航空技术有限公司 无人机飞行控制方法、装置、终端及存储介质
CN208675340U (zh) * 2018-05-25 2019-03-29 南昌华勤电子科技有限公司 移动设备
CN110709797A (zh) * 2018-06-29 2020-01-17 深圳市大疆创新科技有限公司 可移动平台的操控方法、装置及可移动平台
CN112134976A (zh) * 2019-06-25 2020-12-25 北京小米移动软件有限公司 终端设备
CN112954092A (zh) * 2021-02-03 2021-06-11 维沃移动通信有限公司 电子设备
CN114510075A (zh) * 2022-01-26 2022-05-17 维沃移动通信有限公司 控制飞行组件的方法和装置、终端和可读存储介质

Also Published As

Publication number Publication date
CN114510075A (zh) 2022-05-17

Similar Documents

Publication Publication Date Title
US10021319B2 (en) Electronic device and method for controlling image display
CN111061574B (zh) 一种对象分享方法及电子设备
CN103513894B (zh) 显示设备、远程控制设备及其控制方法
EP3920523A1 (en) Photographing method and terminal device
CN110737374B (zh) 操作方法及电子设备
CN105335001A (zh) 具有弯曲显示器的电子设备以及用于控制其的方法
CN111596817B (zh) 图标移动方法及电子设备
EP3211515B1 (en) Display device and method for controlling display device
CN110989881B (zh) 一种图标整理方法及电子设备
WO2014163333A1 (ko) 사용자 인터페이스 표시 방법 및 장치
US11599322B1 (en) Systems with overlapped displays
CN110908554B (zh) 长截图的方法及终端设备
CN113546419B (zh) 游戏地图显示方法、装置、终端及存储介质
WO2023143460A1 (zh) 控制飞行组件的方法和装置、终端和可读存储介质
US20140340336A1 (en) Portable terminal and method for controlling touch screen and system thereof
US10409478B2 (en) Method, apparatus, and recording medium for scrapping content
CN110989896A (zh) 一种控制方法及电子设备
CN110971970A (zh) 视频处理方法及电子设备
CN108881742B (zh) 一种视频生成方法及终端设备
WO2021197260A1 (zh) 便签创建方法及电子设备
CN111610909B (zh) 一种截图方法、装置及电子设备
CN109857292B (zh) 一种对象显示方法及终端设备
CN111338521A (zh) 一种图标显示控制方法及电子设备
CN110888581A (zh) 元素传递方法、装置、设备及存储介质
CN111142726B (zh) 图像显示方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23746338

Country of ref document: EP

Kind code of ref document: A1