CN113759948A - Smart glasses, method for controlling a gimbal with smart glasses, gimbal, gimbal control method, and unmanned aerial vehicle - Google Patents

Smart glasses, method for controlling a gimbal with smart glasses, gimbal, gimbal control method, and unmanned aerial vehicle

Info

Publication number
CN113759948A
CN113759948A (application CN202111061018.9A)
Authority
CN
China
Prior art keywords
smart glasses
glasses
data
intelligent glasses
posture data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202111061018.9A
Other languages
Chinese (zh)
Inventor
魏亮辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to CN202111061018.9A
Publication of CN113759948A
Legal status: Withdrawn

Classifications

    • G05D 1/0808: Control of attitude (roll, pitch, or yaw), specially adapted for aircraft (G: Physics; G05: Controlling, regulating; G05D: Systems for controlling or regulating non-electric variables; G05D 1/00: Control of position, course, or altitude of land, water, air, or space vehicles, e.g. automatic pilot)
    • G02B 27/0172: Head-mounted head-up displays characterised by optical features (G02: Optics; G02B: Optical elements, systems or apparatus; G02B 27/01: Head-up displays)
    • G05D 1/101: Simultaneous control of position or course in three dimensions, specially adapted for aircraft
    • G05D 3/12: Control of position or direction using feedback
    • G02B 2027/014: Head-up displays characterised by optical features, comprising information/image processing systems

Abstract

A method for controlling a gimbal (20) with smart glasses (10), comprising: determining, from the attitude data of the smart glasses (10), whether the smart glasses (10) are shaking or rolling back (S10); when shake or rollback is present, processing the attitude data of the smart glasses (10) to determine target attitude data (S20); and sending the target attitude data to the gimbal (20) to control the gimbal (20) (S30). A control method for the gimbal (20), smart glasses (10), a gimbal (20), and an unmanned aerial vehicle (30) are also provided.

Description

Smart glasses, method for controlling a gimbal with smart glasses, gimbal, gimbal control method, and unmanned aerial vehicle
Technical Field
The invention relates to the field of unmanned aerial vehicles, and in particular to a method for controlling a gimbal with smart glasses, a gimbal control method, smart glasses, a gimbal, and an unmanned aerial vehicle.
Background
When a user wears smart glasses and remotely controls a camera-equipped gimbal via the inertial measurement unit of the glasses, the glasses move together with the user's head. On the one hand, the head can hardly be held perfectly still, so the gimbal shakes accordingly under the remote control of the inertial measurement unit and the video captured by the camera shakes as well. On the other hand, when the head turns from one orientation to another and stops abruptly, it actually rolls back slightly, so the gimbal rolls back accordingly and the captured video shows a slight rollback.
Disclosure of Invention
Embodiments of the invention provide a method for controlling a gimbal with smart glasses, a gimbal control method, smart glasses, a gimbal, and an unmanned aerial vehicle.
The method for controlling a gimbal with smart glasses according to an embodiment of the invention comprises:
determining, from the attitude data of the smart glasses, whether the smart glasses are shaking or rolling back;
when the smart glasses are shaking or rolling back, processing the attitude data of the smart glasses to determine target attitude data; and
sending the target attitude data to the gimbal to control the gimbal.
The gimbal control method according to an embodiment of the invention comprises:
determining, from the received attitude data sent by the smart glasses and the attitude data of the gimbal, whether the smart glasses are shaking or rolling back;
when the smart glasses are shaking or rolling back, processing the attitude data of the smart glasses to determine target attitude data; and
controlling the motion of the gimbal according to the target attitude data.
The smart glasses according to an embodiment of the invention are used to control a gimbal and comprise a processor configured to:
determine, from the attitude data of the smart glasses, whether the smart glasses are shaking or rolling back;
when the smart glasses are shaking or rolling back, process the attitude data of the smart glasses to determine target attitude data; and
send the target attitude data to the gimbal to control the gimbal.
The gimbal according to an embodiment of the invention comprises a processor configured to:
determine, from the received attitude data sent by the smart glasses and the attitude data of the gimbal, whether the smart glasses are shaking or rolling back;
when the smart glasses are shaking or rolling back, process the attitude data of the smart glasses to determine target attitude data; and
control the motion of the gimbal according to the target attitude data.
The unmanned aerial vehicle according to an embodiment of the invention comprises:
a body; and
the gimbal according to an embodiment of the invention, mounted on the body.
In the method for controlling a gimbal with smart glasses, the gimbal control method, the smart glasses, the gimbal, and the unmanned aerial vehicle described above, whether the smart glasses are shaking or rolling back is determined from the attitude data of the smart glasses; when shake or rollback is present, the attitude data are processed to remove it, and the result is used as the target attitude data for controlling the gimbal. The gimbal remotely controlled by the smart glasses is thereby prevented from shaking or rolling back.
Additional aspects and advantages of embodiments of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flowchart of a method for controlling a gimbal with smart glasses according to an embodiment of the present invention;
fig. 2 is a schematic view of an application scenario of the smart glasses according to an embodiment of the present invention;
fig. 3 is a schematic view of a scenario in which the smart glasses control a gimbal according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of a method for controlling a gimbal with smart glasses according to an embodiment of the present invention;
fig. 5 is a schematic flowchart of a method for controlling a gimbal with smart glasses according to an embodiment of the present invention;
fig. 6 is a schematic flowchart of a method for controlling a gimbal with smart glasses according to an embodiment of the present invention;
fig. 7 is a schematic flowchart of a gimbal control method according to an embodiment of the present invention;
fig. 8 is a schematic flowchart of a gimbal control method according to an embodiment of the present invention;
fig. 9 is a schematic flowchart of a gimbal control method according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the embodiments of the present invention, it should be understood that terms such as "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", and "counterclockwise" indicate orientations or positional relationships based on those shown in the drawings. They are used only to simplify the description of the embodiments and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be construed as limiting the embodiments of the present invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
In the description of the embodiments of the present invention, it should also be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be construed broadly: a connection may be fixed, detachable, or integral; it may be mechanical, electrical, or communicative; and it may be direct, indirect through an intermediate medium, or internal to two elements. The specific meanings of these terms in the embodiments of the present invention can be understood by those of ordinary skill in the art according to the specific situation.
In embodiments of the invention, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the two features are in direct contact, or that they are in contact indirectly through another feature between them. Moreover, a first feature being "on", "above", or "over" a second feature includes the first feature being directly above or obliquely above the second feature, or simply means that the first feature is at a higher level than the second feature; a first feature being "under", "below", or "beneath" a second feature includes the first feature being directly below or obliquely below the second feature, or simply means that the first feature is at a lower level than the second feature.
The following disclosure provides many different embodiments or examples for implementing different configurations of embodiments of the invention. In order to simplify the disclosure of embodiments of the invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, embodiments of the invention may repeat reference numerals and/or reference letters in the various examples, which have been repeated for purposes of simplicity and clarity and do not in themselves dictate a relationship between the various embodiments and/or arrangements discussed. In addition, embodiments of the present invention provide examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
Referring to fig. 1, a method for controlling a gimbal with smart glasses according to an embodiment of the present invention includes:
S10: determining, from the attitude data of the smart glasses, whether the smart glasses are shaking or rolling back;
S20: when the smart glasses are shaking or rolling back, processing the attitude data of the smart glasses to determine target attitude data; and
S30: sending the target attitude data to the gimbal to control the gimbal.
Referring to fig. 2 and 3 together, the smart glasses 10 according to the embodiment of the present invention are used to control a gimbal 20. The smart glasses 10 include a processor 12. The method for controlling a gimbal with smart glasses according to the embodiment of the present invention may be applied to the smart glasses 10. For example, the processor 12 may be configured to execute the methods in S10, S20, and S30.
That is, the processor 12 may be configured to: determine, from the attitude data of the smart glasses 10, whether the smart glasses 10 are shaking or rolling back; when the smart glasses 10 are shaking or rolling back, process the attitude data of the smart glasses 10 to determine target attitude data; and send the target attitude data to the gimbal 20 to control the gimbal 20.
It can be understood that when a user wears the smart glasses and remotely controls a camera-equipped gimbal through them, the glasses move together with the user's head. On the one hand, the head can hardly be held perfectly still, so the gimbal shakes accordingly under the remote control of the glasses and the captured video shakes as well; on the other hand, when the head turns from one orientation to another and stops abruptly, it actually rolls back slightly, so the gimbal rolls back accordingly and the captured video shows a slight rollback. In the method for controlling a gimbal with smart glasses and the smart glasses 10 of the embodiments of the present invention, the raw attitude data collected by the smart glasses 10 are not sent to the gimbal 20 immediately. Instead, whether the smart glasses 10 are shaking or rolling back is first determined from their attitude data (the attitude data at the current time together with the attitude data before the current time). When shake or rollback is present, it is removed from the raw attitude data, the resulting new attitude data are used as the target attitude data, and the target attitude data are then sent to the gimbal 20 to control its motion. In this way, the gimbal 20 remotely controlled by the smart glasses 10 is prevented from shaking or rolling back.
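For concreteness, the following Python sketch outlines the glasses-side loop just described. It is only an illustrative reconstruction under stated assumptions, not the patented implementation: the function arguments (read_attitude, send_to_gimbal, has_shake_or_rollback, smooth) are hypothetical hooks, and the 50 Hz sampling period is taken from the example frequency mentioned later in the description.

```python
import time


def glasses_control_loop(read_attitude, send_to_gimbal,
                         has_shake_or_rollback, smooth,
                         sample_period=0.02):
    """Illustrative glasses-side loop (assumed 50 Hz): sample the glasses'
    attitude, suppress shake/rollback, and send target attitude data."""
    raw_history = []   # Y(1), Y(2), ... raw attitude samples
    last_sent = None   # S(N), the attitude data most recently sent to the gimbal

    while True:
        y = read_attitude()                # current raw attitude Y(N+1)
        raw_history.append(y)

        if last_sent is None:
            target = y                     # first sample becomes the reference attitude
        elif has_shake_or_rollback(raw_history, last_sent):
            target = last_sent             # hold S(N): shake/rollback is not forwarded
        else:
            target = smooth(y, last_sent)  # follow Y(N+1), smoothed against S(N)

        send_to_gimbal(target)             # the gimbal follows the target attitude
        last_sent = target
        time.sleep(sample_period)
```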
Referring to fig. 4, in some embodiments, the step of determining, from the attitude data of the smart glasses, whether the smart glasses are shaking or rolling back (i.e., S10) includes:
S11: acquiring the current attitude data Y(N+1) of the smart glasses at the (N+1)-th time; and
S12: determining whether the smart glasses are shaking or rolling back according to the current attitude data Y(N+1), the previously acquired attitude data Y(1), Y(2), ..., Y(N) of the smart glasses at the first N times, and the attitude data S(N) of the smart glasses sent to the gimbal at the N-th time.
In certain embodiments, the processor 12 may be configured to perform the methods of S11 and S12.
That is, the processor 12 is further configured to: acquire the current attitude data Y(N+1) of the smart glasses 10 at the (N+1)-th time; and determine whether the smart glasses 10 are shaking or rolling back according to the current attitude data Y(N+1), the attitude data Y(1), Y(2), ..., Y(N) of the smart glasses 10 at the first N times, and the attitude data S(N) of the smart glasses 10 sent to the gimbal 20 at the N-th time.
Specifically, the processor 12 may acquire the current attitude data Y(N+1) of the smart glasses 10 at a predetermined frequency, for example 50 Hz. In the embodiments of the present invention, N >= 1 and N is an integer. When the smart glasses 10 start controlling the gimbal 20 through target attitude data, the smart glasses 10 may first take an initial attitude data value Y(1) as the reference attitude and send it to the gimbal 20 to set the initial attitude of the gimbal 20. Subsequently, the smart glasses 10 acquire their own current attitude data Y(N+1) at the predetermined frequency and determine whether the smart glasses 10 are shaking or rolling back according to the current attitude data Y(N+1), the acquired attitude data Y(1), Y(2), ..., Y(N) at the first N times, and the attitude data S(N) sent to the gimbal 20 at the N-th time.
In the following description, the current time is taken to be the fourth time, that is, N = 3. The smart glasses 10 acquire the current attitude data Y(4) at the fourth time, and then determine whether they are shaking or rolling back according to Y(4), the previously acquired attitude data Y(1), Y(2), and Y(3) at the first, second, and third times, and the attitude data S(3) sent to the gimbal 20 at the third time. Note that the acquired attitude data of the smart glasses 10 are raw, unprocessed data, whereas the attitude data sent to the gimbal 20 may be processed data (for example, data from which shake and rollback have been removed, or smoothed data); that is, S(N) and Y(N) may be the same or different.
Referring to fig. 5, in some embodiments, the smart glasses include an inertial measurement unit. The step of acquiring the current attitude data Y(N+1) of the smart glasses at the (N+1)-th time (i.e., S11) includes:
S111: acquiring the attitude data of the inertial measurement unit at a predetermined frequency; and
S112: converting the attitude data of the inertial measurement unit into attitude data of the smart glasses.
Referring to fig. 2 and 3, in some embodiments, the smart glasses 10 include an inertial measurement unit 14. The processor 12 may be configured to execute the methods in S111 and S112.
That is, the processor 12 is further configured to: acquire the attitude data of the inertial measurement unit 14 at a predetermined frequency; and convert the attitude data of the inertial measurement unit 14 into attitude data of the smart glasses 10.
Specifically, a 6-axis or 9-axis inertial measurement unit 14 may be mounted on the smart glasses 10. There is a predetermined correspondence between the attitude data of the inertial measurement unit 14 and the attitude data of the smart glasses 10. The smart glasses 10 acquire the attitude data of the inertial measurement unit 14 at a predetermined frequency (for example, 50 Hz) and then convert them into their own attitude data according to this correspondence.
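A minimal sketch of S111 and S112 follows, assuming the IMU driver returns yaw/roll/pitch in the IMU's own frame and that the "predetermined correspondence" is a fixed mounting offset between the IMU and the glasses; both the offset values and the function names are illustrative assumptions, not part of the patent.

```python
import time

# Assumed fixed mounting offset between the IMU frame and the glasses frame (degrees).
MOUNT_OFFSET_DEG = {"yaw": 0.0, "roll": 0.0, "pitch": -5.0}


def imu_to_glasses(imu_attitude):
    """Convert IMU attitude data into glasses attitude data (S112)."""
    return {axis: value + MOUNT_OFFSET_DEG[axis] for axis, value in imu_attitude.items()}


def sample_glasses_attitude(read_imu_attitude, frequency_hz=50.0):
    """Yield the glasses' attitude at the predetermined frequency (S111, assumed 50 Hz)."""
    period = 1.0 / frequency_hz
    while True:
        yield imu_to_glasses(read_imu_attitude())
        time.sleep(period)
```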
In some embodiments, the attitude data include at least one of a yaw angle, a roll angle, a pitch angle, a yaw rate, a roll rate, and a pitch rate.
For example, the attitude data may include the yaw angle; or the roll angle; or the yaw rate and the pitch rate; or the yaw angle, the pitch angle, and the roll rate; or the yaw angle, roll angle, pitch angle, yaw rate, roll rate, and pitch rate. The pitch angle, yaw angle, and roll angle correspond to the rotation angles about the X, Y, and Z axes of a three-dimensional rectangular coordinate system, respectively. The yaw angle, roll angle, and pitch angle all lie in the range (-180°, 180°).
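As a data-structure sketch (not part of the patent text), an attitude sample with the components listed above can be represented as follows; the wrapping helper keeps angles inside the stated +/-180° range and is an assumed convention.

```python
from dataclasses import dataclass


def wrap_angle(deg: float) -> float:
    """Wrap an angle in degrees into (-180, 180], consistent with the stated range."""
    wrapped = (deg + 180.0) % 360.0 - 180.0
    return 180.0 if wrapped == -180.0 else wrapped


@dataclass
class Attitude:
    """One attitude sample; any subset of these components may be used."""
    pitch: float = 0.0       # degrees, rotation about the X axis (per the convention above)
    yaw: float = 0.0         # degrees, rotation about the Y axis
    roll: float = 0.0        # degrees, rotation about the Z axis
    pitch_rate: float = 0.0  # degrees per second
    yaw_rate: float = 0.0    # degrees per second
    roll_rate: float = 0.0   # degrees per second
```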
Specifically, the smart glasses 10 control the motion of the gimbal 20 through the target attitude data sent to it, so that the gimbal 20 follows the attitude of the glasses. The gimbal 20 includes drive motors and is configured to control the drive motors to move the gimbal 20 according to the target attitude data sent by the smart glasses 10, so that the attitude of the gimbal 20 follows the attitude of the smart glasses 10. The gimbal 20 may be a two-axis gimbal, a three-axis gimbal, or the like, which is not limited here. Taking a three-axis gimbal as an example, the drive motors include a first motor, a second motor, and a third motor: the first motor drives the pitch-axis support or the shooting device 24 to rotate about the pitch axis, the second motor drives the roll-axis support or the shooting device 24 to rotate about the roll axis, and the third motor drives the yaw-axis support or the shooting device 24 to rotate about the yaw axis.
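A hedged sketch of this follow behaviour is shown below: each axis motor is commanded in proportion to the error between the target attitude from the glasses and the gimbal's current attitude. The set_axis_rate driver call and the proportional gain are hypothetical; the patent only states that the drive motors move the gimbal so that its attitude follows the glasses.

```python
def follow_target_attitude(gimbal, target, current, gain=1.0):
    """Drive each axis toward the target attitude sent by the glasses.

    target, current: dicts of angles in degrees, e.g. {"pitch": ..., "roll": ..., "yaw": ...}.
    gimbal.set_axis_rate(axis, rate) is a hypothetical motor-driver call
    (first/second/third motor = pitch/roll/yaw axis).
    """
    for axis in ("pitch", "roll", "yaw"):
        error = target[axis] - current[axis]      # follow error in degrees
        gimbal.set_axis_rate(axis, gain * error)  # simple proportional follow, assumed
```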
Referring to fig. 6, in some embodiments, the step of determining whether the smart glasses are shaking or rolling back according to the current attitude data Y(N+1), the acquired attitude data Y(1), Y(2), ..., Y(N) of the smart glasses at the first N times, and the attitude data S(N) of the smart glasses sent to the gimbal at the N-th time (i.e., S12) includes:
S121: calculating whether the difference between the current attitude data Y(N+1) and the attitude data S(N) of the smart glasses sent to the gimbal at the N-th time is greater than a predetermined threshold; and
S122: when the difference is greater than the predetermined threshold, determining that the smart glasses are not shaking or rolling back.
The method for controlling a gimbal with smart glasses further includes:
S40: when the smart glasses are not shaking or rolling back, determining the target attitude data according to the current attitude data Y(N+1) and the attitude data S(N) of the smart glasses sent to the gimbal at the N-th time.
In some embodiments, the processor 12 may be configured to perform the methods of S121, S122, and S40.
That is, the processor 12 is further configured to: calculate whether the difference between the current attitude data Y(N+1) and the attitude data S(N) of the smart glasses 10 sent to the gimbal 20 at the N-th time is greater than a predetermined threshold; and determine that the smart glasses 10 are not shaking or rolling back when the difference is greater than the predetermined threshold. The processor 12 may also be configured to: when the smart glasses 10 are not shaking or rolling back, determine the target attitude data according to the current attitude data Y(N+1) and the attitude data S(N) of the smart glasses 10 sent to the gimbal 20 at the N-th time.
Specifically, take the current time to be the fourth time. The smart glasses 10 calculate whether the difference between the current attitude data Y(4) and the attitude data S(3) sent to the gimbal 20 at the third time is greater than the predetermined threshold. When the difference is greater than the predetermined threshold, the smart glasses 10 are not shaking or rolling back, and the smart glasses 10 determine the target attitude data according to the current attitude data Y(4) and the attitude data S(3) sent to the gimbal 20 at the third time. In some embodiments, the smart glasses 10 smooth the current attitude data Y(4) against the attitude data S(3) sent to the gimbal 20 at the third time to determine the target attitude data. In this way, the motion of the gimbal 20 is smooth, and when a shooting device 24 is mounted on the gimbal 20, the video it captures looks softer and clearer.
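The smoothing operation itself is not specified in the patent; one simple possibility, shown purely as an illustration, is to blend the new raw sample Y(4) toward the previously sent value S(3) with a low-pass (exponential) filter.

```python
def smooth(current_raw, last_sent, alpha=0.35):
    """Blend the current raw attitude toward the last sent attitude.

    alpha = 1.0 follows the raw data exactly; smaller values change more gently.
    Both the filter form and the alpha value are assumptions; the text only
    requires that Y(N+1) be smoothed against S(N) to obtain the target data.
    """
    return {axis: last_sent[axis] + alpha * (current_raw[axis] - last_sent[axis])
            for axis in current_raw}
```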
Referring to fig. 6, in some embodiments, the step of determining whether the smart glasses are shaking or rolling back according to the current attitude data Y(N+1), the acquired attitude data Y(1), Y(2), ..., Y(N) of the smart glasses at the first N times, and the attitude data S(N) of the smart glasses sent to the gimbal at the N-th time (i.e., S12) includes:
S121: calculating whether the difference between the current attitude data Y(N+1) and the attitude data S(N) of the smart glasses sent to the gimbal at the N-th time is greater than a predetermined threshold;
S123: when the difference is less than or equal to the predetermined threshold, determining whether the movement direction of the smart glasses at the (N+1)-th time is the same as the movement direction of the smart glasses at the first N times, according to the acquired attitude data Y(1), Y(2), ..., Y(N) of the smart glasses at the first N times and the current attitude data Y(N+1);
S122: when the movement direction of the smart glasses at the (N+1)-th time is the same as that at the first N times, determining that the smart glasses are not shaking or rolling back; and
S124: when the movement direction of the smart glasses at the (N+1)-th time differs from that at the first N times, determining that the smart glasses are shaking or rolling back.
The step of processing the attitude data of the smart glasses to determine the target attitude data when the smart glasses are shaking or rolling back (i.e., S20) includes:
S21: determining the attitude data S(N) of the smart glasses sent to the gimbal at the N-th time as the target attitude data.
The method for controlling a gimbal with smart glasses further includes:
S40: when the smart glasses are not shaking or rolling back, determining the target attitude data according to the current attitude data Y(N+1) and the attitude data S(N) of the smart glasses sent to the gimbal at the N-th time.
In certain embodiments, the processor 12 may be configured to perform the methods of S121, S122, S123, S124, S21, and S40.
That is, the processor 12 is further configured to: calculate whether the difference between the current attitude data Y(N+1) and the attitude data S(N) of the smart glasses 10 sent to the gimbal 20 at the N-th time is greater than a predetermined threshold; when the difference is less than or equal to the predetermined threshold, determine whether the movement direction of the smart glasses 10 at the (N+1)-th time is the same as the movement direction of the smart glasses 10 at the first N times, according to the acquired attitude data Y(1), Y(2), ..., Y(N) at the first N times and the current attitude data Y(N+1); when the movement direction at the (N+1)-th time is the same as that at the first N times, determine that the smart glasses 10 are not shaking or rolling back; when the movement direction at the (N+1)-th time differs from that at the first N times, determine that the smart glasses 10 are shaking or rolling back; and when the smart glasses 10 are shaking or rolling back, determine the attitude data S(N) sent to the gimbal 20 at the N-th time as the target attitude data. The processor 12 may also be configured to: when the smart glasses 10 are not shaking or rolling back, determine the target attitude data according to the current attitude data Y(N+1) and the attitude data S(N) sent to the gimbal 20 at the N-th time.
Specifically, take the current time to be the fourth time. The smart glasses 10 calculate whether the difference between the current attitude data Y(4) and the attitude data S(3) sent to the gimbal 20 at the third time is greater than the predetermined threshold. When the difference is less than or equal to the predetermined threshold, the smart glasses 10 determine whether their movement direction at the fourth time is the same as their movement direction at the first three times, according to the acquired attitude data Y(1), Y(2), Y(3) at the first, second, and third times and the current attitude data Y(4). The movement direction at the fourth time being the same as that at the first three times means that the movement direction from the third time to the fourth time is the same as the movement direction from the second time to the third time and the movement direction from the first time to the second time. The movement direction at the fourth time being different means that the movement direction from the third time to the fourth time differs from at least one of the movement direction from the second time to the third time and the movement direction from the first time to the second time. For example, if the movement direction from the third time to the fourth time differs from the movement direction from the second time to the third time, while the movement direction from the second time to the third time is the same as that from the first time to the second time, the movement direction at the fourth time is still judged to differ from that at the first three times. When the movement direction at the fourth time is the same as that at the first three times, the smart glasses 10 are not shaking or rolling back. When it differs, the smart glasses 10 are shaking and/or rolling back. When the smart glasses 10 are shaking or rolling back, the smart glasses 10 take the attitude data S(3) sent to the gimbal 20 at the third time as the target attitude data and send S(3) to the gimbal 20 again at the fourth time. When the smart glasses 10 are not shaking or rolling back, the smart glasses 10 determine the target attitude data according to the current attitude data Y(4) and the attitude data S(3) sent to the gimbal 20 at the third time.
In some embodiments, in the step of determining whether the movement direction of the smart glasses at the (N+1)-th time is the same as the movement direction of the smart glasses at the first N times according to the acquired attitude data Y(1), Y(2), ..., Y(N) and the current attitude data Y(N+1) (i.e., S123), the movement direction at the (N+1)-th time is determined to be the same as that at the first N times if the difference between the attitude data at any two adjacent times among the first (N+1) times is greater than or equal to zero, or if the difference between the attitude data at any two adjacent times among the first (N+1) times is less than or equal to zero.
In some embodiments, the processor 12 is further configured to: determine that the movement direction of the smart glasses 10 at the (N+1)-th time is the same as that at the first N times by judging that the difference between the attitude data of the smart glasses 10 at any two adjacent times among the first (N+1) times is greater than or equal to zero, or by judging that the difference between the attitude data of the smart glasses 10 at any two adjacent times among the first (N+1) times is less than or equal to zero.
Specifically, take the current time to be the fourth time. When Y(4)-Y(3), Y(3)-Y(2), and Y(2)-Y(1) are all greater than or equal to zero, the movement direction of the smart glasses 10 from the third to the fourth time is the same as that from the second to the third time and from the first to the second time. Likewise, when Y(4)-Y(3), Y(3)-Y(2), and Y(2)-Y(1) are all less than or equal to zero, the movement directions are also the same.
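Putting S121 to S124 together for a single attitude component (the yaw angle, as in the examples below), a sketch of the decision could look as follows; the history length and the threshold value are assumptions, since the patent only calls the threshold "predetermined".

```python
def has_shake_or_rollback(raw_history, last_sent, threshold_deg=1.0):
    """Decide whether the glasses are shaking or rolling back (S121-S124).

    raw_history: [Y(1), ..., Y(N+1)] raw samples of one attitude component (degrees).
    last_sent:   S(N), the value sent to the gimbal at the N-th time.
    threshold_deg is an assumed value for the predetermined threshold.
    """
    current = raw_history[-1]  # Y(N+1)

    # S121/S122: a large step relative to S(N) is treated as an intentional move.
    if abs(current - last_sent) > threshold_deg:
        return False

    # S123: small step -- check whether the motion direction stays the same.
    # All consecutive differences >= 0, or all <= 0, means one consistent direction.
    diffs = [b - a for a, b in zip(raw_history, raw_history[1:])]
    same_direction = all(d >= 0 for d in diffs) or all(d <= 0 for d in diffs)

    # S122/S124: unchanged direction -> no shake or rollback; otherwise shake/rollback.
    return not same_direction
```

Applied to the "still head" yaw samples quoted below (0.1, 0.2, 0.3, 0.4, 0.2 degrees, with the previously sent value within the threshold of the new sample), the final difference changes sign while the step stays small, so the function reports shake/rollback and S(N) would be re-sent.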
The following description takes the yaw angle as the attitude data. When the user's head is nominally "still" (in practice it is difficult to keep it completely still), the attitude data acquired by the smart glasses at several consecutive times are, for example: 0.1, 0.2, 0.3, 0.4, 0.2 (in degrees). The head clearly has a slight shake. If these attitude data were sent directly to the gimbal, the gimbal would shake correspondingly and the video captured by the shooting device on the gimbal would shake as well. When the user's head, wearing the smart glasses, turns from one orientation to another and stops abruptly, for example turning from a 10° position and stopping at a 20° position, the attitude data acquired at several consecutive times are, for example: 10.0, 10.5, 10.8, 11.2, ..., 19.8, 20.2, 20.0, 19.6, 19.5, 19.6 (in degrees). The head clearly rolls back slightly. If these attitude data were sent directly to the gimbal, the gimbal would roll back correspondingly and the captured video would roll back as well.
The effectiveness of the method for controlling a gimbal with smart glasses according to the embodiments of the present invention is illustrated below for shake and for rollback. With the user's head nominally "still", the target attitude data sent to the gimbal 20 without the method are shown in Table 1, and with the method in Table 2. When the user's head, wearing the smart glasses 10, turns from one orientation to another and stops abruptly, the target attitude data sent to the gimbal 20 without the method are shown in Table 3, and with the method in Table 4. By removing shake and rollback from the acquired attitude data of the smart glasses 10, the method keeps the gimbal 20 from shaking or rolling back and thereby improves the video captured by the shooting device 24.
TABLE 1
Time (hh:mm:ss.SSS)    Yaw angle sent (degrees)
19:11:20.348 15.3
19:11:20.368 15.5
19:11:20.387 15.6
19:11:20.408 15.4
19:11:20.428 15.2
19:11:20.448 15.3
TABLE 2
Time (hh:mm:ss.SSS)    Yaw angle sent (degrees)
19:35:11.004 15.5
19:35:11.024 15.5
19:35:11.044 15.5
19:35:11.064 15.5
19:35:11.084 15.5
19:35:11.104 15.5
TABLE 3
Time (hh:mm:ss.SSS)    Yaw angle sent (degrees)
20:05:38.475 10.0
20:05:38.495 10.5
20:05:38.515 10.8
20:05:38.535 11.2
... ...
20:05:38.955 19.8
20:05:38.975 20.2
20:05:38.995 20.0
20:05:39.015 19.6
20:05:39.035 19.5
TABLE 4
[Table 4: target yaw angles sent to the gimbal after processing; reproduced only as images in the original publication and not recoverable as text.]
Referring to fig. 7, a gimbal control method according to an embodiment of the present invention includes:
S50: determining, from the received attitude data sent by the smart glasses and the attitude data of the gimbal, whether the smart glasses are shaking or rolling back;
S60: when the smart glasses are shaking or rolling back, processing the attitude data of the smart glasses to determine target attitude data; and
S70: controlling the motion of the gimbal according to the target attitude data.
Referring to fig. 2 and 3, a gimbal 20 according to an embodiment of the present invention includes a processor 22. The gimbal control method according to the embodiment of the present invention may be applied to the gimbal 20. For example, the processor 22 may be configured to execute the methods in S50, S60, and S70.
That is, the processor 22 may be configured to: determine, from the received attitude data sent by the smart glasses 10 and the attitude data of the gimbal 20, whether the smart glasses 10 are shaking or rolling back; when the smart glasses 10 are shaking or rolling back, process the attitude data of the smart glasses 10 to determine target attitude data; and control the motion of the gimbal 20 according to the target attitude data.
In the gimbal 20 and the gimbal control method according to the embodiments of the present invention, after receiving the raw attitude data of the smart glasses 10, the gimbal 20 does not immediately move according to them. Instead, whether the smart glasses 10 are shaking or rolling back is determined from the attitude data sent by the smart glasses 10 and the attitude data of the gimbal 20. When shake or rollback is present, it is removed from the raw attitude data, the resulting new attitude data are used as the target attitude data, and the motion of the gimbal 20 is then controlled according to the target attitude data. In this way, the gimbal 20 remotely controlled by the smart glasses 10 is prevented from shaking or rolling back when the smart glasses 10 shake or roll back.
In some embodiments, when the gimbal control method controls the gimbal 20 to move according to the target attitude data, the amount of motion required to bring the gimbal from its current attitude to the target attitude may be determined by calculating the difference between the target attitude data and the current attitude data of the gimbal, and the gimbal 20 is then controlled to perform the corresponding motion according to that difference.
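As a small sketch of that difference computation, the per-axis motion can be obtained by subtracting the gimbal's current attitude from the target attitude and wrapping the result to the shortest rotation; the wrapping convention is an assumption consistent with the (-180°, 180°) angle range stated earlier.

```python
def required_motion(target, current):
    """Per-axis motion (degrees) needed to move the gimbal from its current
    attitude to the target attitude, wrapped to the shortest rotation."""
    def wrap(deg):
        wrapped = (deg + 180.0) % 360.0 - 180.0
        return 180.0 if wrapped == -180.0 else wrapped

    return {axis: wrap(target[axis] - current[axis]) for axis in target}


# Example: a target yaw of -179 degrees with a current yaw of 179 degrees
# yields a 2-degree motion rather than a 358-degree one:
# required_motion({"yaw": -179.0}, {"yaw": 179.0}) == {"yaw": 2.0}
```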
Referring to fig. 8, in some embodiments, the step of determining whether the smart glasses are shaking or rolling back according to the received attitude data sent by the smart glasses and the attitude data of the gimbal (i.e., S50) includes:
S51: receiving the current attitude data X(N+1) of the smart glasses at the (N+1)-th time; and
S52: determining whether the smart glasses are shaking or rolling back according to the current attitude data X(N+1), the received attitude data X(1), X(2), ..., X(N) of the smart glasses at the first N times, and the attitude data P(N) of the gimbal at the N-th time.
In certain embodiments, the processor 22 may be configured to perform the methods of S51 and S52.
That is, the processor 22 is further configured to: receive the current attitude data X(N+1) of the smart glasses 10 at the (N+1)-th time; and determine whether the smart glasses 10 are shaking or rolling back according to the current attitude data X(N+1), the received attitude data X(1), X(2), ..., X(N) of the smart glasses 10 at the first N times, and the attitude data P(N) of the gimbal 20 at the N-th time.
Specifically, the processor 22 may receive the current attitude data X(N+1) of the smart glasses 10 at a predetermined frequency, for example 50 Hz. In the embodiments of the present invention, N >= 1 and N is an integer. When the smart glasses 10 start controlling the gimbal 20 through target attitude data, the smart glasses 10 may first take an initial attitude data value X(1) as the reference attitude and send it to the gimbal 20 to set the initial attitude of the gimbal 20. Subsequently, the gimbal 20 receives the current attitude data X(N+1) of the smart glasses 10 at the predetermined frequency and determines whether the smart glasses 10 are shaking or rolling back according to the current attitude data X(N+1), the received attitude data X(1), X(2), ..., X(N) at the first N times, and the attitude data P(N) of the gimbal 20 at the N-th time.
In the following description, the current time is taken to be the fourth time, that is, N = 3. The gimbal 20 receives the current attitude data X(4) of the smart glasses 10 at the fourth time, and then determines whether the smart glasses 10 are shaking or rolling back according to X(4), the previously received attitude data X(1), X(2), and X(3) at the first, second, and third times, and the attitude data P(3) of the gimbal 20 at the third time. Note that the received attitude data of the smart glasses 10 are raw, unprocessed data, whereas the attitude data of the gimbal 20 may be processed data (for example, data from which shake and rollback have been removed, or smoothed data); that is, P(N) and X(N) may be the same or different.
In some embodiments, the attitude data include at least one of a yaw angle, a roll angle, a pitch angle, a yaw rate, a roll rate, and a pitch rate.
For example, the attitude data may include the yaw angle; or the roll angle; or the yaw rate and the pitch rate; or the yaw angle, the pitch angle, and the roll rate; or the yaw angle, roll angle, pitch angle, yaw rate, roll rate, and pitch rate. The pitch angle, yaw angle, and roll angle correspond to the rotation angles about the X, Y, and Z axes of a three-dimensional rectangular coordinate system, respectively. The yaw angle, roll angle, and pitch angle all lie in the range (-180°, 180°).
Specifically, the gimbal control method controls the gimbal 20 to move according to the target attitude data so that it follows the motion of the smart glasses 10. The gimbal 20 may be a two-axis gimbal, a three-axis gimbal, or the like, which is not described again here.
Referring to fig. 3, in some embodiments, a shooting device 24 is mounted on the gimbal 20. The shooting device 24 is used to record video, take pictures, and the like.
Referring to fig. 9, in some embodiments, the step of determining whether the smart glasses are shaking or rolling back according to the current attitude data X(N+1), the received attitude data X(1), X(2), ..., X(N) of the smart glasses at the first N times, and the attitude data P(N) of the gimbal at the N-th time (i.e., S52) includes:
S521: calculating whether the difference between the current attitude data X(N+1) and the attitude data P(N) of the gimbal at the N-th time is greater than a predetermined threshold; and
S522: when the difference is greater than the predetermined threshold, determining that the smart glasses are not shaking or rolling back.
The control method further includes:
S80: when the smart glasses are not shaking or rolling back, determining the target attitude data according to the current attitude data X(N+1) and the attitude data P(N) of the gimbal at the N-th time.
In some embodiments, the processor 22 may be configured to perform the methods of S521, S522, and S80.
That is, the processor 22 is further configured to: calculate whether the difference between the current attitude data X(N+1) and the attitude data P(N) of the gimbal 20 at the N-th time is greater than a predetermined threshold; and determine that the smart glasses 10 are not shaking or rolling back when the difference is greater than the predetermined threshold. The processor 22 may also be configured to: when the smart glasses 10 are not shaking or rolling back, determine the target attitude data according to the current attitude data X(N+1) and the attitude data P(N) of the gimbal 20 at the N-th time.
Specifically, take the current time to be the fourth time. The gimbal 20 calculates whether the difference between the current attitude data X(4) and the attitude data P(3) of the gimbal 20 at the third time is greater than the predetermined threshold. If the difference is greater than the predetermined threshold, the smart glasses 10 are not shaking or rolling back, and the gimbal 20 determines the target attitude data according to the current attitude data X(4) and the attitude data P(3) of the gimbal 20 at the third time. In some embodiments, the gimbal 20 smooths the current attitude data X(4) against the attitude data P(3) of the gimbal 20 at the third time to determine the target attitude data. In this way, the motion of the gimbal 20 is smooth, and when a shooting device 24 is mounted on the gimbal 20, the video it captures looks softer and clearer.
Referring to fig. 9, in some embodiments, the step of determining whether the smart glasses are shaking or rolling back according to the current attitude data X(N+1), the received attitude data X(1), X(2), ..., X(N) of the smart glasses at the first N times, and the attitude data P(N) of the gimbal at the N-th time (i.e., S52) includes:
S521: calculating whether the difference between the current attitude data X(N+1) and the attitude data P(N) of the gimbal at the N-th time is greater than a predetermined threshold;
S523: when the difference is less than or equal to the predetermined threshold, determining whether the movement direction of the smart glasses at the (N+1)-th time is the same as the movement direction of the smart glasses at the first N times, according to the received attitude data X(1), X(2), ..., X(N) of the smart glasses at the first N times and the current attitude data X(N+1);
S522: when the movement direction of the smart glasses at the (N+1)-th time is the same as that at the first N times, determining that the smart glasses are not shaking or rolling back; and
S524: when the movement direction of the smart glasses at the (N+1)-th time differs from that at the first N times, determining that the smart glasses are shaking or rolling back.
The step of processing the attitude data of the smart glasses to determine the target attitude data when the smart glasses are shaking or rolling back (i.e., S60) includes:
S61: determining the attitude data P(N) of the gimbal at the N-th time as the target attitude data.
The control method further includes:
S80: when the smart glasses are not shaking or rolling back, determining the target attitude data according to the current attitude data X(N+1) and the attitude data P(N) of the gimbal at the N-th time.
In certain embodiments, the processor 22 may be configured to perform the methods in S521, S522, S523, S524, S61 and S80.
That is, the processor 22 is further configured to: calculate whether the difference between the current attitude data X(N+1) and the attitude data P(N) of the gimbal 20 at the N-th time is greater than a predetermined threshold; when the difference is less than or equal to the predetermined threshold, determine whether the movement direction of the smart glasses 10 at the (N+1)-th time is the same as the movement direction of the smart glasses 10 at the first N times, according to the received attitude data X(1), X(2), ..., X(N) at the first N times and the current attitude data X(N+1); when the movement direction at the (N+1)-th time is the same as that at the first N times, determine that the smart glasses 10 are not shaking or rolling back; when the movement direction at the (N+1)-th time differs from that at the first N times, determine that the smart glasses 10 are shaking or rolling back; and when the smart glasses 10 are shaking or rolling back, determine the attitude data P(N) of the gimbal 20 at the N-th time as the target attitude data. The processor 22 may also be configured to: when the smart glasses 10 are not shaking or rolling back, determine the target attitude data according to the current attitude data X(N+1) and the attitude data P(N) of the gimbal 20 at the N-th time.
Specifically, take the current time to be the fourth time. The gimbal 20 calculates whether the difference between the current attitude data X(4) and the attitude data P(3) of the gimbal 20 at the third time is greater than the predetermined threshold. When the difference is less than or equal to the predetermined threshold, the gimbal 20 determines whether the movement direction of the smart glasses 10 at the fourth time is the same as that at the first three times, according to the received attitude data X(1), X(2), X(3) at the first, second, and third times and the current attitude data X(4). The movement direction at the fourth time being the same as that at the first three times means that the movement direction from the third time to the fourth time is the same as the movement direction from the second time to the third time and the movement direction from the first time to the second time. The movement direction at the fourth time being different means that the movement direction from the third time to the fourth time differs from at least one of the movement direction from the second time to the third time and the movement direction from the first time to the second time. For example, if the movement direction from the third time to the fourth time differs from the movement direction from the second time to the third time, while the movement direction from the second time to the third time is the same as that from the first time to the second time, the movement direction at the fourth time is still judged to differ from that at the first three times. When the movement direction at the fourth time is the same as that at the first three times, the smart glasses 10 are not shaking or rolling back. When it differs, the smart glasses 10 are shaking and/or rolling back. When the smart glasses 10 are shaking or rolling back, the gimbal 20 takes the attitude data P(3) at the third time as the target attitude data at the fourth time. When the smart glasses 10 are not shaking or rolling back, the gimbal 20 determines the target attitude data according to the current attitude data X(4) and the attitude data P(3) at the third time.
In some embodiments, when the difference is less than or equal to the predetermined threshold, determining whether the moving direction of the smart glasses at the (N +1) th time is the same as the moving direction of the smart glasses at the previous N times according to the received posture data X (1), X (2), … …, X (N) of the smart glasses at the previous N times and the current posture data X (N +1) (i.e., S624), determining that the moving direction of the smart glasses at the (N +1) th time is the same as the moving direction of the smart glasses at the previous N times by determining that the difference between the posture data of the smart glasses at any two adjacent times in the previous (N +1) times is greater than or equal to zero; or determining that the movement direction of the intelligent glasses at the (N +1) th moment is the same as the movement direction of the intelligent glasses at the previous N moments by judging that the difference value between the posture data of the intelligent glasses at any two adjacent moments in the previous (N +1) moments is less than or equal to zero.
In certain embodiments, the processor 22 is further configured to: determine that the movement direction of the smart glasses 10 at the (N +1)th time is the same as the movement direction of the smart glasses 10 at the previous N times when the differences between the attitude data of the smart glasses 10 at every two adjacent times among the previous (N +1) times are all greater than or equal to zero; or determine that the movement direction at the (N +1)th time is the same as that at the previous N times when those differences are all less than or equal to zero.
Specifically, the following description again takes the current time as the fourth time as an example. When X (4) - X (3), X (3) - X (2), and X (2) - X (1) are all greater than or equal to zero, the movement direction of the smart glasses 10 at the fourth time relative to the third time is the same as the movement direction at the third time relative to the second time and the movement direction at the second time relative to the first time. Similarly, when X (4) - X (3), X (3) - X (2), and X (2) - X (1) are all less than or equal to zero, the movement direction at the fourth time relative to the third time is the same as the movement directions at the third time relative to the second time and at the second time relative to the first time.
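As a worked check of this sign rule, with hypothetical yaw samples in degrees:

```python
# Worked check of the sign rule above with hypothetical samples X(1)..X(4).
x = [10.0, 10.5, 10.8, 11.2]
d43, d32, d21 = x[3] - x[2], x[2] - x[1], x[1] - x[0]
print(round(d43, 1), round(d32, 1), round(d21, 1))    # 0.4 0.3 0.5
print(all(d >= 0 for d in (d43, d32, d21)))           # True: same movement direction
print(all(d >= 0 for d in (11.0 - 11.4, 11.4 - 10.8, 10.8 - 10.0)))  # False: reversal at the last step
```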
The following description takes the yaw angle as an example of the attitude data. In a state where the user's head is "still" (in practice it is difficult to keep the head completely still), the attitude data of the smart glasses received at a plurality of consecutive times are: 0.1, 0.2, 0.3, 0.4, 0.2, in degrees. It can be seen that the head exhibits slight jitter. If the attitude of the pan/tilt head is controlled directly according to these attitude data, the pan/tilt head will shake correspondingly, and the video captured by the shooting device on the pan/tilt head will also shake. When the user's head, wearing the smart glasses, moves from one orientation to another and stops suddenly, for example moving from a position of 10 degrees to a position of 20 degrees and stopping suddenly, the attitude data of the smart glasses received at a plurality of consecutive times are: 10.0, 10.5, 10.8, 11.2, ..., 19.8, 20.2, 20.0, 19.6, 19.5, 19.6, in degrees. It can be seen that the head exhibits slight rollback: it overshoots 20 degrees and then settles back. If the attitude of the pan/tilt head is controlled directly according to these attitude data, the pan/tilt head will roll back correspondingly, and the video captured by the shooting device on the pan/tilt head will also roll back.
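To make the effect concrete, the sketch below runs the two sample sequences above through a simplified jitter/rollback filter and prints the attitude that would be sent to the pan/tilt head. The 2-degree threshold, the use of the full sample history for the direction check, and the hold-last-command rule are assumptions standing in for the method described in the embodiments.

```python
# Illustrative only: a simplified jitter/rollback filter applied to the
# example sequences above. Values are yaw angles in degrees.

def filter_sequence(samples, threshold=2.0):
    commanded = [samples[0]]  # attitude actually sent to the pan/tilt head
    for i in range(1, len(samples)):
        x = samples[i]
        diffs = [b - a for a, b in zip(samples[:i + 1], samples[1:i + 1])]
        same_dir = all(d >= 0 for d in diffs) or all(d <= 0 for d in diffs)
        if abs(x - commanded[-1]) > threshold or same_dir:
            commanded.append(x)              # follow the glasses
        else:
            commanded.append(commanded[-1])  # hold: suppress jitter or rollback
    return commanded

still = [0.1, 0.2, 0.3, 0.4, 0.2]
turn = [10.0, 10.5, 10.8, 11.2, 19.8, 20.2, 20.0, 19.6, 19.5, 19.6]
print(filter_sequence(still))  # [0.1, 0.2, 0.3, 0.4, 0.4]: the final dip is suppressed
print(filter_sequence(turn))   # holds 20.2 after the overshoot instead of rolling back
```

The held output in both runs is the kind of steady command that Tables 6 and 8 below are intended to illustrate.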
The effectiveness of the pan/tilt head control method according to the embodiments of the present invention is described below with respect to two aspects: jitter and rollback. In the state where the user's head is "still", Table 5 shows the target attitude data of the pan/tilt head 20 determined before the pan/tilt head control method of the embodiments of the present invention is used, and Table 6 shows the target attitude data of the pan/tilt head 20 determined after the method is used. When the user's head, wearing the smart glasses 10, moves from one orientation to another and stops suddenly, Table 7 shows the target attitude data of the pan/tilt head 20 determined before the method is used, and Table 8 shows the target attitude data determined after the method is used. With the pan/tilt head control method of the embodiments of the present invention, the target attitude data used to control the attitude of the pan/tilt head 20 is obtained by removing jitter and rollback from the received attitude data of the smart glasses 10, so the pan/tilt head 20 neither jitters nor rolls back, which improves the quality of the video captured by the shooting device 24.
TABLE 5
Time (hour:minute:second.millisecond)    Yaw angle (degrees)
19:11:20.348                             15.3
19:11:20.368                             15.5
19:11:20.387                             15.6
19:11:20.408                             15.4
19:11:20.428                             15.2
19:11:20.448                             15.3
TABLE 6
(Table 6 is published only as an image in the original document; it lists the target attitude data of the pan/tilt head 20, with the control method applied, for the "still" head state.)
TABLE 7
Time (hour:minute:second.millisecond)    Yaw angle (degrees)
20:05:38.475                             10.0
20:05:38.495                             10.5
20:05:38.515                             10.8
20:05:38.535                             11.2
...                                      ...
20:05:38.955                             19.8
20:05:38.975                             20.2
20:05:38.995                             20.0
20:05:39.015                             19.6
20:05:39.035                             19.5
TABLE 8
Time (hour:minute:second.millisecond)    Yaw angle (degrees)
20:18:58.117                             12.3
20:18:58.137                             12.6
20:18:58.157                             13.0
20:18:58.177                             13.5
...                                      ...
20:18:58.617                             21.8
20:18:58.637                             22.0
20:18:58.657                             22.0
20:18:58.677                             22.0
20:18:58.697                             22.0
Referring to Fig. 3, an unmanned aerial vehicle 30 according to an embodiment of the present invention includes a body 32 and the pan/tilt head 20 according to any of the above embodiments. The pan/tilt head 20 is disposed on the body 32.
After receiving the original attitude data of the smart glasses 10, the pan/tilt head 20 of the unmanned aerial vehicle 30 according to the embodiments of the present invention does not immediately control its own motion according to that original attitude data. Instead, it determines whether the smart glasses 10 exhibit jitter or rollback according to the attitude data sent by the smart glasses 10 and the attitude data of the pan/tilt head 20. When jitter or rollback is present, the original attitude data is processed to remove the jitter and rollback, the resulting attitude data is used as the target attitude data, and the pan/tilt head 20 is then controlled to move according to the target attitude data. In this way, when the smart glasses 10 jitter or roll back, the pan/tilt head 20 remotely controlled by the smart glasses 10 is prevented from jittering or rolling back as well.
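How these pieces might be organized on the pan/tilt head side is sketched below. The class name GlassesGimbalLink, the fixed history of four samples, and the send_to_motors placeholder are assumptions for illustration, not part of the disclosure.

```python
# Organizational sketch only: one way to structure the gimbal-side handling
# described above. All names and parameter values are illustrative.

class GlassesGimbalLink:
    def __init__(self, threshold_deg=2.0, history_len=4):
        self.threshold = threshold_deg
        self.history_len = history_len
        self.samples = []       # recent glasses attitude data X(...)
        self.commanded = None   # last target attitude sent to the motors

    def on_glasses_attitude(self, yaw_deg):
        """Handle one attitude packet received from the smart glasses."""
        self.samples = (self.samples + [yaw_deg])[-self.history_len:]
        if self.commanded is None or len(self.samples) < 2:
            target = yaw_deg
        else:
            diffs = [b - a for a, b in zip(self.samples, self.samples[1:])]
            steady = all(d >= 0 for d in diffs) or all(d <= 0 for d in diffs)
            if abs(yaw_deg - self.commanded) > self.threshold or steady:
                target = yaw_deg         # no jitter/rollback: follow the glasses
            else:
                target = self.commanded  # jitter/rollback: keep the previous target
        self.commanded = target
        self.send_to_motors(target)
        return target

    def send_to_motors(self, target):
        print(f"pan/tilt head yaw target: {target:.1f} deg")  # motor control placeholder

link = GlassesGimbalLink()
for yaw in [0.1, 0.2, 0.3, 0.4, 0.2]:  # jittery "still" data; the final dip is held at 0.4
    link.on_glasses_attitude(yaw)
```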
In the description herein, reference to the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," "some examples," or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. Alternative implementations are also included within the scope of the preferred embodiments of the present invention, in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art to which the present invention pertains.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processing module, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection having one or more wires, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium may even be paper or another suitable medium upon which the program is printed, since the program can be captured electronically, for example by optical scanning of the paper or other medium, and then compiled, interpreted, or otherwise processed in a suitable manner if necessary, before being stored in a computer memory.
It should be understood that portions of embodiments of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be implemented by a program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, performs one of the steps of the method embodiments or a combination thereof.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If implemented in the form of a software functional module and sold or used as an independent product, the integrated module may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and not to be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (15)

1. A method for smart glasses to control a holder, characterized by comprising the following steps:
judging whether the intelligent glasses shake or roll back or not according to the posture data of the intelligent glasses;
when the intelligent glasses shake or roll back, determining attitude data S (N) of the intelligent glasses sent to the holder at the Nth moment as target attitude data; and
and sending the target attitude data to the holder to control the holder.
2. A method for smart glasses to control a holder, characterized by comprising the following steps:
judging whether the intelligent glasses shake or roll back or not according to the posture data of the intelligent glasses;
when the intelligent glasses do not shake or roll back, determining target posture data according to the current posture data Y (N +1) of the intelligent glasses at the (N +1) th moment and the posture data S (N) of the intelligent glasses sent to the holder at the N th moment; and
and sending the target attitude data to the holder to control the holder.
3. The method for smart glasses to control a holder according to claim 1 or 2, wherein the step of determining whether the smart glasses have jitter or rollback according to the posture data of the smart glasses comprises:
acquiring current attitude data Y (N +1) of the intelligent glasses at the (N +1) th moment; and
and judging whether the intelligent glasses have jitter or rollback according to the current posture data Y (N +1), the obtained posture data Y (1), Y (2), ..., Y (N) of the intelligent glasses at the previous N moments and the posture data S (N) of the intelligent glasses sent to the holder at the Nth moment.
4. The method for smart glasses to control a holder according to claim 3, wherein the step of determining whether there is shaking or rollback of the smart glasses according to the current posture data Y (N +1), the obtained posture data Y (1), Y (2), ..., Y (N) of the smart glasses at the previous N times, and the posture data S (N) of the smart glasses sent to the holder at the Nth time comprises:
calculating whether the difference value between the current attitude data Y (N +1) and the attitude data S (N) of the intelligent glasses sent to the holder at the Nth moment is greater than a predetermined threshold; and
when the difference is greater than the predetermined threshold, determining that the smart glasses do not shake or roll back.
5. The method for smart glasses to control a holder according to claim 3, wherein the step of determining whether there is shaking or rollback of the smart glasses according to the current posture data Y (N +1), the obtained posture data Y (1), Y (2), ..., Y (N) of the smart glasses at the previous N times, and the posture data S (N) of the smart glasses sent to the holder at the Nth time comprises:
calculating whether the difference value between the current attitude data Y (N +1) and the attitude data S (N) of the intelligent glasses sent to the holder at the Nth moment is greater than a predetermined threshold;
when the difference is smaller than or equal to the predetermined threshold, judging whether the movement direction of the smart glasses at the (N +1) th moment is the same as the movement direction of the smart glasses at the previous N moments according to the acquired posture data Y (1), Y (2), ..., Y (N) of the smart glasses at the previous N moments and the current posture data Y (N + 1);
determining that there is no shaking or rollback of the smart glasses when the moving direction of the smart glasses at the (N +1) th moment is the same as the moving direction of the smart glasses at the previous N moments; and
determining that the smart glasses have jitter or rollback when the moving direction of the smart glasses at the (N +1) th moment is different from the moving direction of the smart glasses at the previous N moments.
6. The method for smart glasses to control a holder according to claim 5, wherein the step of determining whether the moving direction of the smart glasses at the (N +1) th time is the same as the moving direction of the smart glasses at the previous N times according to the acquired posture data Y (1), Y (2), ..., Y (N) of the smart glasses at the previous N times and the current posture data Y (N +1) comprises:
determining that the movement direction of the intelligent glasses at the (N +1) th moment is the same as the movement direction of the intelligent glasses at the previous N moments by judging that the difference value between the attitude data of the intelligent glasses at any two adjacent moments in the previous (N +1) moments is greater than or equal to zero; or
Determining that the movement direction of the intelligent glasses at the (N +1) th moment is the same as the movement direction of the intelligent glasses at the previous N moments by judging that the difference value between the posture data of the intelligent glasses at any two adjacent moments in the previous (N +1) moments is less than or equal to zero.
7. A control method of a pan-tilt head, the control method comprising:
judging whether the intelligent glasses have jitter or rollback according to the received attitude data sent by the intelligent glasses and the attitude data of the pan-tilt head;
when the intelligent glasses shake or roll back, determining the attitude data P (N) of the pan-tilt head at the Nth moment as target attitude data; and
and controlling the motion of the pan-tilt head according to the target attitude data.
8. A control method of a pan-tilt head, characterized in that the control method comprises:
judging whether the intelligent glasses have jitter or rollback according to the received attitude data sent by the intelligent glasses and the attitude data of the pan-tilt head;
when the intelligent glasses do not shake or roll back, determining target posture data according to the current posture data X (N +1) of the intelligent glasses at the (N +1) th moment and the posture data P (N) of the pan-tilt head at the Nth moment; and
and controlling the motion of the pan-tilt head according to the target attitude data.
9. The control method according to claim 7 or 8, wherein the step of determining whether the smart glasses have jitter or rollback according to the received attitude data sent by the smart glasses and the attitude data of the pan-tilt head comprises:
receiving current posture data X (N +1) of the intelligent glasses at the (N +1) th moment; and
and judging whether the intelligent glasses have jitter or rollback according to the current posture data X (N +1), the received posture data X (1), X (2), ..., X (N) of the intelligent glasses at the previous N moments and the posture data P (N) of the pan-tilt head at the Nth moment.
10. The method according to claim 9, wherein the step of determining whether there is shaking or rollback of the smart glasses according to the current posture data X (N +1), the received posture data X (1), X (2), ..., X (N) of the smart glasses at the previous N times, and the posture data P (N) of the pan-tilt head at the Nth time comprises:
calculating whether the difference value between the current attitude data X (N +1) and the attitude data P (N) of the pan-tilt head at the Nth moment is greater than a predetermined threshold; and
when the difference is greater than the predetermined threshold, determining that the smart glasses do not shake or roll back.
11. The method according to claim 9, wherein the step of determining whether there is shaking or rollback of the smart glasses according to the current posture data X (N +1), the received posture data X (1), X (2), ..., X (N) of the smart glasses at the previous N times, and the posture data P (N) of the pan-tilt head at the Nth time comprises:
calculating whether the difference value between the current attitude data X (N +1) and the attitude data P (N) of the pan-tilt head at the Nth moment is greater than a predetermined threshold;
when the difference is less than or equal to the predetermined threshold, judging whether the moving direction of the smart glasses at the (N +1) th time is the same as the moving direction of the smart glasses at the previous N times according to the received posture data X (1), X (2), ..., X (N) of the smart glasses at the previous N times and the current posture data X (N + 1);
determining that there is no shaking or rollback of the smart glasses when the moving direction of the smart glasses at the (N +1) th moment is the same as the moving direction of the smart glasses at the previous N moments; and
determining that the smart glasses have jitter or rollback when the moving direction of the smart glasses at the (N +1) th moment is different from the moving direction of the smart glasses at the previous N moments.
12. The method according to claim 11, wherein, when the difference is less than or equal to the predetermined threshold, the step of judging whether the moving direction of the smart glasses at the (N +1) th time is the same as the moving direction of the smart glasses at the previous N times according to the received posture data X (1), X (2), ..., X (N) of the smart glasses at the previous N times and the current posture data X (N +1) comprises:
determining that the movement direction of the intelligent glasses at the (N +1) th moment is the same as the movement direction of the intelligent glasses at the previous N moments by judging that the difference value between the attitude data of the intelligent glasses at any two adjacent moments in the previous (N +1) moments is greater than or equal to zero; or
Determining that the movement direction of the intelligent glasses at the (N +1) th moment is the same as the movement direction of the intelligent glasses at the previous N moments by judging that the difference value between the posture data of the intelligent glasses at any two adjacent moments in the previous (N +1) moments is less than or equal to zero.
13. Smart glasses for controlling a holder, characterized by comprising:
one or more processors configured to perform the method for smart glasses to control a holder according to any one of claims 1 to 6.
14. A pan-tilt head, characterized by comprising:
one or more processors configured to perform the control method of any one of claims 7 to 12.
15. An unmanned aerial vehicle, characterized in that it comprises:
a body; and
a pan-tilt head according to claim 14, the pan-tilt head being disposed on the body.
CN202111061018.9A 2017-11-16 2017-11-16 Intelligent glasses, method for controlling holder of intelligent glasses, holder, control method and unmanned aerial vehicle Withdrawn CN113759948A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111061018.9A CN113759948A (en) 2017-11-16 2017-11-16 Intelligent glasses, method for controlling holder of intelligent glasses, holder, control method and unmanned aerial vehicle

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202111061018.9A CN113759948A (en) 2017-11-16 2017-11-16 Intelligent glasses, method for controlling holder of intelligent glasses, holder, control method and unmanned aerial vehicle
CN201780035851.1A CN109313455B (en) 2017-11-16 2017-11-16 Intelligent glasses, method for controlling holder of intelligent glasses, holder, control method and unmanned aerial vehicle
PCT/CN2017/111367 WO2019095210A1 (en) 2017-11-16 2017-11-16 Smart glasses, method for controlling gimbal by means of same, gimbal, control method and unmanned aerial vehicle

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201780035851.1A Division CN109313455B (en) 2017-11-16 2017-11-16 Intelligent glasses, method for controlling holder of intelligent glasses, holder, control method and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN113759948A true CN113759948A (en) 2021-12-07

Family

ID=65225742

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201780035851.1A Expired - Fee Related CN109313455B (en) 2017-11-16 2017-11-16 Intelligent glasses, method for controlling holder of intelligent glasses, holder, control method and unmanned aerial vehicle
CN202111061018.9A Withdrawn CN113759948A (en) 2017-11-16 2017-11-16 Intelligent glasses, method for controlling holder of intelligent glasses, holder, control method and unmanned aerial vehicle

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201780035851.1A Expired - Fee Related CN109313455B (en) 2017-11-16 2017-11-16 Intelligent glasses, method for controlling holder of intelligent glasses, holder, control method and unmanned aerial vehicle

Country Status (2)

Country Link
CN (2) CN109313455B (en)
WO (1) WO2019095210A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021016985A1 (en) * 2019-08-01 2021-02-04 深圳市大疆创新科技有限公司 Gimbal control method, controller, gimbal, unmanned mobile platform, and storage medium
CN113168193A (en) * 2020-06-08 2021-07-23 深圳市大疆创新科技有限公司 Holder control method, handheld holder and computer readable storage medium
CN113170051A (en) * 2020-06-08 2021-07-23 深圳市大疆创新科技有限公司 Holder control method, handheld holder and computer readable storage medium
CN113260942A (en) * 2020-09-22 2021-08-13 深圳市大疆创新科技有限公司 Handheld holder control method, handheld holder, system and readable storage medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103426282A (en) * 2013-07-31 2013-12-04 深圳市大疆创新科技有限公司 Remote control method and terminal
CN104834249A (en) * 2015-03-16 2015-08-12 张时勉 Wearable remote controller
KR20170004407A (en) * 2015-07-02 2017-01-11 김성훈 system and method for automated reconnaissance
WO2017008776A1 (en) * 2015-07-14 2017-01-19 Gebäudereinigung Lissowski GmbH Cleaning device and method for cleaning a surface
WO2017020150A1 (en) * 2015-07-31 2017-02-09 深圳市大疆创新科技有限公司 Image processing method, device and camera
CN105323487A (en) * 2015-11-20 2016-02-10 广州亿航智能技术有限公司 Camera apparatus pointing direction control device
CN105487552B (en) * 2016-01-07 2019-02-19 深圳一电航空技术有限公司 The method and device of unmanned plane track up
WO2017185316A1 (en) * 2016-04-29 2017-11-02 深圳市大疆创新科技有限公司 First-person-view flight control method and system for unmanned aerial vehicle, and smart glasses
CN110505458A (en) * 2016-09-18 2019-11-26 深圳市大疆创新科技有限公司 The method and apparatus of image is provided in wearable device and loose impediment
CN106444810A (en) * 2016-10-31 2017-02-22 浙江大学 Unmanned plane mechanical arm aerial operation system with help of virtual reality, and control method for unmanned plane mechanical arm aerial operation system
CN106648068A (en) * 2016-11-11 2017-05-10 哈尔滨工业大学深圳研究生院 Method for recognizing three-dimensional dynamic gesture by two hands
CN106959110B (en) * 2017-04-06 2020-08-11 亿航智能设备(广州)有限公司 Cloud deck attitude detection method and device

Also Published As

Publication number Publication date
CN109313455A (en) 2019-02-05
WO2019095210A1 (en) 2019-05-23
CN109313455B (en) 2021-09-28

Similar Documents

Publication Publication Date Title
CN109313455B (en) Intelligent glasses, method for controlling holder of intelligent glasses, holder, control method and unmanned aerial vehicle
US11480291B2 (en) Camera system using stabilizing gimbal
US11188101B2 (en) Method for controlling aircraft, device, and aircraft
US10394107B2 (en) Gimbal control method, gimbal control apparatus, and gimbal
CN111213002B (en) Cloud deck control method, equipment, cloud deck, system and storage medium
CN108521814B (en) Control method and controller of cloud deck and cloud deck
US10306211B2 (en) Remote control of pivotable stereoscopic camera
CN107431749B (en) Focus following device control method, device and system
US11272105B2 (en) Image stabilization control method, photographing device and mobile platform
CN110771143B (en) Control method of handheld cloud deck, handheld cloud deck and handheld equipment
WO2021098453A1 (en) Target tracking method and unmanned aerial vehicle
WO2018191963A1 (en) Remote control, camera mount, and camera mount control method, device, and system
JP2017072986A (en) Autonomous flying device, control method and program of autonomous flying device
CN105739544B (en) Course following method and device of holder
CN109076101B (en) Holder control method, device and computer readable storage medium
US20160368602A1 (en) Camera drone systems and methods for maintaining captured real-time images vertical
WO2020019212A1 (en) Video playback speed control method and system, control terminal, and mobile platform
CN110831860A (en) Control method of holder, aircraft and computer-readable storage medium
CN106060357B (en) Imaging device, unmanned aerial vehicle and robot
WO2019134155A1 (en) Image data processing method, device, platform, and storage medium
US20210256732A1 (en) Image processing method and unmanned aerial vehicle
CN113853740A (en) Motor, motor control method, computer-readable storage medium, and mechanical apparatus
CN117730543A (en) Image transmission method, device and equipment for movable platform
WO2022061592A1 (en) Method and apparatus for controlling and detecting motor, pan-tilt and movable platform
CN113795806A (en) Movable platform system and control method and device thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication
Application publication date: 20211207