TWI596043B - Unmanned aerial device and orientation method thereof - Google Patents

Unmanned aerial device and orientation method thereof

Info

Publication number
TWI596043B
Authority
TW
Taiwan
Prior art keywords
unmanned aerial
aerial vehicle
sensor
time point
determining
Prior art date
Application number
TW105102138A
Other languages
Chinese (zh)
Other versions
TW201726495A (en)
Inventor
何岡峯
謝光勳
蘇上欽
Original Assignee
和碩聯合科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 和碩聯合科技股份有限公司 filed Critical 和碩聯合科技股份有限公司
Priority to TW105102138A priority Critical patent/TWI596043B/en
Publication of TW201726495A publication Critical patent/TW201726495A/en
Application granted granted Critical
Publication of TWI596043B publication Critical patent/TWI596043B/en

Description

Unmanned aerial device and orientation method thereof

The present invention relates to an unmanned aerial device and an orientation correction method thereof, and more particularly to an unmanned aerial device that can fly after being thrown, and to an orientation correction method for such a device.

Unmanned aerial vehicles have developed rapidly in recent years. Whether quadrotor or helicopter-type aircraft, both the manufacturing technology and the cost control have steadily matured. As unmanned aerial vehicles have developed, their range of applications has also broadened; photography and surveillance, for example, are fields in which they are frequently used.

Taking photography as an example, as the price of unmanned aerial vehicles has fallen, their use in photography is no longer limited to aerial survey shots or filmmaking; the general public can now easily own an unmanned aerial vehicle for taking photos or videos. One common use is self-portrait photography: the unmanned aerial vehicle is thrown into the air, and the camera module on it is then controlled to photograph the user. For this kind of throwable unmanned aerial vehicle, however, ensuring that the camera module still points at the user once the vehicle is airborne usually requires aiming the camera module at the user before the throw and then carefully controlling the throwing force, since an improperly controlled throw can turn the lens of the camera module away from the user; this is inconvenient in practice. Even with very careful use, external factors such as wind can still deflect the lens of the camera module during the throw, so a satisfactory result cannot be obtained.

One object of the present invention is to provide an unmanned aerial device and an orientation correction method thereof that reduce the effect of the deflection caused by throwing.

The unmanned aerial device includes a flight power module, a first sensor, a second sensor, a third sensor, and a processor. The first sensor, the second sensor, and the third sensor are each signal-connected to the processor. The first sensor generates an acceleration change record according to the movement of the unmanned aerial device and transmits it to the processor. The second sensor generates a rotation angle change record according to the movement of the unmanned aerial device and transmits it to the processor. The third sensor provides an absolute orientation and transmits it to the processor. A reference facing is defined on the unmanned aerial device; when the unmanned aerial device rotates, the reference facing turns with it.

After the unmanned aerial device is thrown from a throwing position, the processor determines a first time point at which the device was thrown and a second time point after the device hovers in the air, and determines the relative position change relationship between the first time point and the second time point according to the acceleration change record and the rotation angle change record. The processor then determines a correction rotation amount according to the absolute orientation and the relative position change relationship, and finally controls the flight power module with the correction rotation amount so that the reference facing turns toward the throwing position.

The orientation correction method includes: determining a first time point at which the unmanned aerial device is thrown; determining a second time point after the unmanned aerial device hovers in the air; determining, according to the acceleration change record provided by the first sensor and the rotation angle change record provided by the second sensor, the relative position change relationship of the unmanned aerial device between the first time point and the second time point; determining a correction rotation amount according to the absolute orientation provided by the third sensor and the relative position change relationship; and controlling the unmanned aerial device to rotate according to the correction rotation amount so that the reference facing turns toward the throwing position.
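
As a rough illustration of how these steps could fit together in software, the following Python sketch strings the five steps into one routine. Every name here (detect_throw, detect_hover, estimate_displacement, correction_rotation, and the compass and flight_ctrl objects) is an illustrative placeholder rather than anything specified by the patent; the detailed description below sketches plausible versions of the individual steps.

```python
# Hypothetical outline of the orientation-correction flow described above.
# All helper names and data shapes are assumptions made for illustration only.

def correct_orientation(accel_log, gyro_log, compass, flight_ctrl):
    t1 = detect_throw(accel_log, gyro_log)            # first time point (release)
    t2 = detect_hover(accel_log, gyro_log, t1)        # second time point (stable hover)
    delta_d = estimate_displacement(accel_log, gyro_log, t1, t2)  # relative position change
    heading = compass.read_heading_vector()           # absolute orientation of the reference facing
    theta = correction_rotation(heading, delta_d)     # correction rotation amount
    flight_ctrl.rotate(theta)                         # turn the reference facing toward the throw position
```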

With the unmanned aerial device and orientation correction method described above, the unmanned aerial device can, after being thrown, rotate fairly accurately to the facing the user expects, or adjust its position in the air, reducing the influence of the way the user applied force when throwing.

100‧‧‧unmanned aerial device
101‧‧‧body
110‧‧‧first sensor
120‧‧‧second sensor
130‧‧‧third sensor
150‧‧‧processor
170‧‧‧flight power module
190‧‧‧storage device
160‧‧‧trigger unit
200‧‧‧reference facing
300‧‧‧image capture unit
500‧‧‧user
710‧‧‧feature object
720‧‧‧expected object

FIG. 1 is a schematic view of an embodiment of the unmanned aerial device; FIG. 2 is a flowchart of an embodiment of the orientation correction method; FIGS. 3A to 3C are schematic views of an embodiment of a user operating the unmanned aerial device; FIG. 4 is a schematic top view of an embodiment of the unmanned aerial device hovering stably in the air; FIG. 5 is a schematic view of another embodiment of the unmanned aerial device; FIG. 6 is a flowchart of another embodiment of the orientation correction method; FIG. 7 is a schematic view of an embodiment of an operating state of the unmanned aerial device; FIG. 8A is a flowchart of another embodiment of the orientation correction method; FIG. 8B is a schematic view of an embodiment of image comparison and recognition; FIG. 8C is a schematic view of an embodiment of an operating state of the unmanned aerial device; FIG. 9 is a schematic view of another embodiment of the unmanned aerial device.

The present invention provides an unmanned aerial device and an orientation correction method thereof. In a preferred embodiment the unmanned aerial device is a quadrotor, but it is not limited thereto and may also be a helicopter or another type of unmanned aerial device.

FIG. 1 is a schematic view of an embodiment of the unmanned aerial device. As shown in FIG. 1, the unmanned aerial device 100 includes a body 101 and a flight power module 170. A first sensor 110, a second sensor 120, a third sensor 130, and a processor 150 are disposed in the body 101, and the first sensor 110, the second sensor 120, and the third sensor 130 are each signal-connected to the processor 150. The first sensor 110 generates an acceleration change record according to the movement of the unmanned aerial device 100 and transmits it to the processor 150; in a preferred embodiment, the first sensor 110 is a gravity sensor (G-sensor) that measures acceleration along three axes. The second sensor 120 generates a rotation angle change record according to the movement of the unmanned aerial device 100 and transmits it to the processor 150; in a preferred embodiment, the second sensor 120 is a gyroscope that measures angular velocity so that the amount of rotation about a given axis can be determined, although in other embodiments the second sensor 120 may be composed of two or more other angular-velocity sensors combined to provide the angular velocity values. The third sensor 130 provides an absolute orientation and transmits it to the processor 150; in a preferred embodiment, the third sensor 130 is an electronic compass (e-compass). As shown in FIG. 1, the first sensor 110 or the second sensor 120 is preferably disposed closer to the center of gravity of the unmanned aerial device 100 than the third sensor 130, to improve the accuracy of the measurements.

As shown in FIG. 1, a reference facing 200 is defined on the unmanned aerial device 100, namely the orientation of a particular surface position on the body 101. When the body 101 rotates, the reference facing 200 turns with that surface position. In this preferred embodiment, an image capture unit 300 is disposed at this particular surface position of the body 101, and the orientation of the lens of the image capture unit 300 is equivalent to the reference facing 200. In other embodiments, other electronic units may be disposed at this surface position, and the orientation correction method provided by the present invention can likewise be used to rotate and position them. In a preferred embodiment, the absolute orientation provided by the third sensor 130 uses the reference facing 200 as the basis of measurement; in other words, the third sensor 130 provides the absolute orientation of the reference facing 200 at the moment of measurement.

The flight power module 170 is connected to the body 101 and is controlled by the processor 150 to adjust power and flight direction. Taking a quadrotor as an example, the flight power module 170 has four sets of propellers that provide flight power and control the direction of movement.

FIG. 2 is a flowchart of an embodiment of the orientation correction method. Step 2010 includes throwing the unmanned aerial device from a throwing position into the air for flight. Step 2030 includes determining the time point at which the unmanned aerial device is thrown as a first time point. As shown in FIG. 3A, a user 500 prepares to throw the unmanned aerial device 100 from the throwing position. In a preferred embodiment, the processor 150 determines the first time point, at which the device leaves the user's hand, from the acceleration change record provided by the first sensor 110 and the rotation angle change record provided by the second sensor 120. For example, the user 500 may move the unmanned aerial device 100 back and forth before throwing, to aim in the throwing direction and gauge the throwing force; when the processor 150 jointly analyzes the acceleration change record and the rotation angle change record and finds such a back-and-forth movement path, the instant at which the trajectory of the unmanned aerial device 100 departs from that path is determined as the first time point. In other words, because the processor 150 can combine the acceleration change record and the rotation angle change record into the movement trajectory of the unmanned aerial device 100, features of that trajectory can be used to determine the first time point. In another example, the processor 150 may determine the first time point from the acceleration change record provided by the first sensor 110 or the rotation angle change record provided by the second sensor 120 alone; for instance, if the measured acceleration value reaches a maximum at a certain time point, or the acceleration value along some direction changes abruptly after a certain time point, that point is determined as the first time point.
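
As one possible reading of the heuristics in this paragraph, the sketch below flags the release point when the acceleration magnitude either changes abruptly between consecutive samples or, failing that, reaches its global peak. The (t, ax, ay, az) sample layout and the threshold value are assumptions made only for this example, not details from the patent.

```python
import math

def detect_throw(accel_log, jump_threshold=15.0):
    """Return the index of a candidate first time point (the throw/release).

    accel_log: sequence of (t, ax, ay, az) samples in m/s^2.
    Two simple cues are checked, mirroring the examples in the text:
    an abrupt jump in acceleration magnitude between consecutive samples,
    or, failing that, the sample at which the magnitude is largest.
    The threshold is an illustrative value.
    """
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for _, ax, ay, az in accel_log]
    if not mags:
        return None
    for i in range(1, len(mags)):
        if abs(mags[i] - mags[i - 1]) > jump_threshold:
            return i
    return max(range(len(mags)), key=mags.__getitem__)  # fall back to the global peak
```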

In a preferred embodiment, the acceleration change record contains the acceleration values at successive time points over a continuous period; in other words, the acceleration change record may be presented as a series of acceleration values at different time points. Because the change in angular velocity at each time point is related to the rotation angle, the rotation angle change record is preferably presented in a similar way, for example as angular velocity values at successive time points over a continuous period, although it is not limited thereto.

FIG. 3B shows that after the user 500 releases and throws the unmanned aerial device 100 at the first time point, the processor 150 senses from the acceleration change record, the rotation angle change record, or a combination of both that the device has been thrown, and immediately controls the flight power module 170 to gradually provide flight power, so that the unmanned aerial device 100 prepares for autonomous flight while it moves along the throwing trajectory. As shown in FIG. 3C, when the unmanned aerial device 100 reaches the hover position, the processor 150 controls the flight power module 170 to provide flight power and stabilize the unmanned aerial device 100 at that hover position in the air. Once the unmanned aerial device 100 hovers stably, step 2050 is performed to determine a second time point at which the unmanned aerial device 100 hovers stably in the air. "Hovering" here means a flight state in which the unmanned aerial device 100 keeps its spatial position essentially unchanged at a certain height. In a preferred embodiment, the processor 150 determines this second time point from a combination of the acceleration change record provided by the first sensor 110 and the rotation angle change record provided by the second sensor 120, for example by combining them into the travel trajectory of the unmanned aerial device 100 and identifying, from its features, the second time point at which the device hovers stably in the air. In other embodiments, the second time point may instead be determined from the acceleration change record alone or from the rotation angle change record alone.
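A minimal sketch of the hover test, assuming the acceleration and angular-rate logs are sampled in lockstep: the device is treated as stably hovering once both the variance of the acceleration magnitude and the peak angular rate stay below small limits over a sliding window. The window length and limits are illustrative assumptions rather than values from the patent.

```python
import math
import statistics

def detect_hover(accel_log, gyro_log, start, window=50,
                 accel_var_limit=0.05, rate_limit=0.1):
    """Return the index of a candidate second time point (stable hover).

    accel_log: (t, ax, ay, az) samples in m/s^2; gyro_log: (t, wx, wy, wz)
    samples in rad/s, assumed aligned index-for-index with accel_log.
    """
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for _, ax, ay, az in accel_log]
    for i in range(start + window, len(mags)):
        recent_mags = mags[i - window:i]
        recent_rates = [max(abs(wx), abs(wy), abs(wz))
                        for _, wx, wy, wz in gyro_log[i - window:i]]
        if (statistics.pvariance(recent_mags) < accel_var_limit
                and max(recent_rates) < rate_limit):
            return i
    return None
```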

Step 2060 includes determining, from the acceleration change record and the rotation angle change record, the relative position change relationship of the unmanned aerial device 100 between the first time point and the second time point. In a preferred embodiment, the relative position change relationship may be the direction vector of the hover position relative to the throwing position. When the unmanned aerial device 100 is stably located at the hover position, as shown in FIG. 3C, the direction vector between the hover position and the throwing position where the user stands is Δd, i.e., the relative position change relationship between the two positions. Δd can be decomposed into a horizontal component Δd_xy and a vertical component Δd_z, and both Δd_xy and Δd_z can be obtained by combining the acceleration change record and the rotation angle change record, from which Δd is obtained.
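
This step amounts to dead reckoning between the two time points. The sketch below shows a deliberately simplified version that assumes the body-frame accelerations have already been rotated into a fixed (world) frame using the gyro-derived attitude; that rotation, and any drift compensation, are omitted. The time step and gravity handling are assumptions for illustration.

```python
GRAVITY = 9.81  # m/s^2, assumed to act along the world z axis

def estimate_displacement(accel_world, t1, t2, dt=0.01):
    """Integrate world-frame acceleration twice between the two time points.

    accel_world: sequence of (ax, ay, az) samples already expressed in a
    fixed frame. Returns (delta_xy, delta_z), i.e. the horizontal component
    Δd_xy and vertical component Δd_z of the relative position change Δd.
    Plain double integration drifts quickly; this only sketches the idea.
    """
    vx = vy = vz = 0.0
    dx = dy = dz = 0.0
    for ax, ay, az in accel_world[t1:t2]:
        az -= GRAVITY          # remove gravity from the vertical axis
        vx += ax * dt
        vy += ay * dt
        vz += az * dt
        dx += vx * dt
        dy += vy * dt
        dz += vz * dt
    return (dx, dy), dz        # Δd_xy and Δd_z
```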

Step 2070 includes determining a correction rotation amount from the absolute orientation provided by the third sensor 130 and the relative position change relationship Δd described above. As shown in the top view of FIG. 4, when the unmanned aerial device 100 is at the hover position, the third sensor 130 preferably takes the reference facing 200 as the reference point and measures the absolute orientation of that reference point at each time point; in other embodiments, the third sensor 130 may instead take another reference position on the body 101 as the reference point for measuring the absolute orientation. From the measured absolute orientation, the current direction vector V of the reference facing 200 is obtained. In this embodiment, the processor 150 obtains θ as the correction rotation amount from the vector inner product formula: V · Δd_xy = |V| |Δd_xy| cos θ, i.e., θ = arccos( (V · Δd_xy) / (|V| |Δd_xy|) ).
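
A small sketch of the rotation computation, assuming V and Δd_xy are 2-D horizontal vectors expressed in the same frame. The inner-product formula above yields only the magnitude of θ; using atan2 of the 2-D cross and dot products additionally gives the sign (turn direction), which is an addition made for this example rather than something stated in the patent.

```python
import math

def correction_rotation(heading_xy, delta_xy):
    """Signed angle θ (radians) from the facing vector V to Δd_xy.

    heading_xy: (x, y) direction of the reference facing, from the e-compass.
    delta_xy:   (x, y) horizontal relative position change, same frame.
    |θ| equals arccos(V·Δd_xy / (|V||Δd_xy|)); atan2 also gives the
    direction in which to turn.
    """
    vx, vy = heading_xy
    dx, dy = delta_xy
    return math.atan2(vx * dy - vy * dx, vx * dx + vy * dy)
```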

Step 2090 includes controlling the unmanned aerial device 100 to rotate by the correction rotation amount so that the reference facing 200 faces the throwing position. In the embodiment shown in FIG. 4, the processor 150 controls the flight power module 170 according to the correction rotation amount θ, so that the reference facing 200 rotates toward the user 500 located at the throwing position. If a photo is taken with the image capture unit at this moment, the user is included within the frame.

In the embodiment shown in FIG. 5, the unmanned aerial device 100 further includes a storage device 190 connected to the processor 150. The storage device 190 may be built-in or external memory, a hard disk, or a similar device for storing digital data, and it stores an expected relative position change relationship. The expected relative position change relationship is preferably the relative position difference, in the horizontal or vertical direction, that the unmanned aerial device 100 is set to maintain with respect to the user 500; for example, the unmanned aerial device 100 may be set to fly to 3 meters above the user 500 and 5 meters away horizontally, to obtain a better shooting distance. Therefore, in the embodiment of FIG. 6, the orientation correction method may further include step 6010: comparing the aforementioned relative position change relationship with the expected relative position change relationship to generate a correction displacement amount. Then, in step 6030, the processor 150 controls the flight power module 170 according to the correction displacement amount, moving the unmanned aerial device 100 to the originally set height and distance, as shown by the dashed position in FIG. 7. In addition, to suit different shooting modes, such as a wide-angle mode or a close-up mode, several different expected relative position change relationships may be stored in the storage device 190 for selection, matching the different height and distance requirements of the different shooting modes.
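
A sketch of steps 6010 and 6030 under the assumption that both the measured and the expected relative position change are simple (horizontal x, horizontal y, vertical) vectors in the same frame; the 5 m / 3 m figures are the example values from the text, and flight_ctrl.translate() is a hypothetical flight-power-module call, not an API defined by the patent.

```python
def correction_displacement(measured, expected):
    """Step 6010: difference between the expected and the measured relative
    position change, i.e. how far the vehicle still has to move."""
    mx, my, mz = measured
    ex, ey, ez = expected
    return (ex - mx, ey - my, ez - mz)

# Step 6030, illustrative usage:
# expected = (5.0, 0.0, 3.0)   # 5 m horizontally from the user, 3 m above
# flight_ctrl.translate(correction_displacement(delta_d, expected))
```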

FIG. 8A shows another embodiment of the orientation correction method. This embodiment may further include step 8010: capturing an image toward the throwing position with the image capture unit. As shown in FIG. 8B, the captured image may contain the user 500 or other objects lying in the direction of the throwing position. In step 8030, the processor 150 analyzes the image to recognize a feature object 710 in the image, such as a human face, and further analyzes it to obtain basic information about the feature object 710; in a preferred embodiment, the basic information includes the object size, the number of pixels it occupies, or other information that can be derived from the analysis. Then, in step 8050, the processor 150 compares the object basic information with expected object basic information stored in the storage device 190 to generate a correction displacement amount. The expected object basic information is preferably the size, pixel count, or other basic information that the object should have in the captured photo. As shown in FIG. 8B, the size of the feature object 710 serves as the basic information of the feature object 710, while the basic information of the expected object 720 is the size that the feature object 710 should have in the photo. By comparing the gap between the two, the correction displacement amount is obtained; for example, the device should move a certain distance toward the user 500 so that the captured photo yields the expected basic information for the feature object 710. Then, in step 8070, the processor 150 controls the flight power module according to the correction displacement amount, moving the unmanned aerial device to the appropriate operating position, as shown in FIG. 8C.
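
One way to turn the size comparison of step 8050 into a displacement is a pinhole-camera heuristic in which apparent size is inversely proportional to distance. The sketch below uses that heuristic; the model, the parameter names, and the need for a current distance estimate are all assumptions for illustration, not details taken from the patent.

```python
def size_based_correction(detected_px, expected_px, current_distance_m):
    """Distance (m) to move along the line toward the feature object so that
    it occupies the expected size in the image.

    detected_px: measured size of the feature object 710 (e.g. face width in pixels).
    expected_px: size the expected object 720 should have in the photo.
    Positive return value: move closer to the user; negative: move away.
    """
    if detected_px <= 0:
        return 0.0                                   # nothing detected, do not move
    target_distance = current_distance_m * detected_px / expected_px
    return current_distance_m - target_distance
```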

FIG. 9 shows another embodiment of the unmanned aerial device. In this embodiment, the unmanned aerial device 100 further includes a trigger unit 160, preferably disposed on the body 101. The trigger unit 160 can be triggered by the user's throwing action to generate a trigger signal, and the processor 150 then determines the first time point of the throw from the trigger signal. In a preferred embodiment, the trigger unit 160 is a button: while the user holds the unmanned aerial device 100, the trigger unit 160 is pressed by the hand, and when the unmanned aerial device 100 is thrown, the trigger unit 160 generates the trigger signal as the user's hand releases it. In other embodiments, the trigger unit 160 may instead be an infrared interrupter, a piezoelectric material, or another type of electronic device.
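
A minimal sketch of the button variant of the trigger unit, assuming a GPIO-style button object with an is_pressed() method (a hypothetical API): the first time point is simply the instant the held button is released. Debouncing and the infrared or piezoelectric variants mentioned above are not modeled.

```python
import time

def wait_for_release(button, poll_interval_s=0.005):
    """Block while the user holds the button, then return the release time.

    button.is_pressed() is a hypothetical hardware API; the returned
    timestamp stands in for the trigger signal's first time point.
    """
    while button.is_pressed():
        time.sleep(poll_interval_s)
    return time.monotonic()
```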

With the unmanned aerial device and orientation correction method described above, the unmanned aerial device can, after being thrown, rotate the reference facing fairly accurately toward the direction the user expects, reducing the influence of the way the user applied force when throwing. In addition, in combination with the embodiments described above, the unmanned aerial device can further adjust its own position after being thrown to meet the user's needs.

The present invention has been described by way of the above embodiments; however, these embodiments are for illustration only and are not intended to be limiting. Those skilled in the art will appreciate that other modifications of the embodiments specifically described herein may be made without departing from the spirit of the invention. Accordingly, the scope of the invention also covers such modifications and is limited only by the appended claims.


Claims (13)

1. An orientation correction method for an unmanned aerial device, wherein the unmanned aerial device is provided with a first sensor, a second sensor, and a third sensor, has a reference facing, and is thrown from a throwing position into the air to fly, the method comprising the following steps: determining a first time point at which the unmanned aerial device is thrown; determining a second time point after the unmanned aerial device hovers in the air; determining, according to an acceleration change record provided by the first sensor and a rotation angle change record provided by the second sensor, a relative position change relationship of the unmanned aerial device between the first time point and the second time point; determining a correction rotation amount according to an absolute orientation provided by the third sensor and the relative position change relationship; controlling the unmanned aerial device to rotate according to the correction rotation amount so that the reference facing faces the throwing position; capturing an image toward the throwing position with an image capture unit disposed on the reference facing of the unmanned aerial device; analyzing the image to recognize a feature object in the image and obtain object basic information of the feature object; comparing the object basic information with expected object basic information stored in a storage device of the unmanned aerial device to generate a correction displacement amount; and controlling the unmanned aerial device to move according to the correction displacement amount.

2. The orientation correction method of claim 1, wherein the step of determining the relative position change relationship comprises the third sensor providing the absolute orientation of the reference facing.

3. The orientation correction method of claim 1, wherein the step of determining the first time point comprises determining the first time point according to the acceleration change record provided by the first sensor, the rotation angle change record provided by the second sensor, or a combination of both.

4. The orientation correction method of claim 1, wherein the step of determining the first time point comprises: disposing a trigger unit on the unmanned aerial device; and triggering the trigger unit as the unmanned aerial device is thrown, so as to determine the first time point.

5. The orientation correction method of claim 1, wherein the step of determining the second time point comprises determining the second time point according to the acceleration change record provided by the first sensor, the rotation angle change record provided by the second sensor, or a combination of both.

6. The orientation correction method of claim 1, further comprising: comparing the relative position change relationship with an expected relative position change relationship stored in a storage device of the unmanned aerial device to generate a correction displacement amount; and controlling the unmanned aerial device to move according to the correction displacement amount.

7. An unmanned aerial device, thrown from a throwing position into the air to fly, the unmanned aerial device having a reference facing and comprising: a flight power module providing flight power; a first sensor providing an acceleration change record; a second sensor providing a rotation angle change record; a third sensor providing an absolute orientation; an image capture unit capturing an image toward the throwing position; a storage device storing expected object basic information; and a processor determining a first time point at which the device is thrown and a second time point after the device hovers in the air, determining a relative position change relationship between the first time point and the second time point according to the acceleration change record and the rotation angle change record, determining a correction rotation amount according to the absolute orientation and the relative position change relationship, and controlling the flight power module according to the correction rotation amount so that the reference facing faces the throwing position; wherein the processor analyzes the image to recognize a feature object in the image and obtain object basic information of the feature object, compares the object basic information with the expected object basic information to generate a correction displacement amount, and controls the flight power module to move according to the correction displacement amount.

8. The unmanned aerial device of claim 7, wherein the third sensor provides the absolute orientation of the reference facing.

9. The unmanned aerial device of claim 7, wherein the processor determines the first time point according to the acceleration change record, the rotation angle change record, or a combination of both.

10. The unmanned aerial device of claim 8, wherein the processor determines the second time point according to the acceleration change record, the rotation angle change record, or a combination of both.

11. The unmanned aerial device of claim 7, further comprising a trigger unit that generates a trigger signal in response to a throwing action, wherein the processor determines the first time point according to the trigger signal.

12. The unmanned aerial device of claim 7, further having a center-of-gravity position, wherein the first sensor or the second sensor is disposed closer to the center-of-gravity position than the third sensor.

13. The unmanned aerial device of claim 7, further comprising a storage device storing an expected relative position change relationship, wherein the processor compares the relative position change relationship with the expected relative position change relationship to generate a correction displacement amount and controls the flight power module to move according to the correction displacement amount.
TW105102138A 2016-01-22 2016-01-22 Unmanned aerial device and orientation method thereof TWI596043B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW105102138A TWI596043B (en) 2016-01-22 2016-01-22 Unmanned aerial device and orientation method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW105102138A TWI596043B (en) 2016-01-22 2016-01-22 Unmanned aerial device and orientation method thereof

Publications (2)

Publication Number Publication Date
TW201726495A TW201726495A (en) 2017-08-01
TWI596043B true TWI596043B (en) 2017-08-21

Family

ID=60186571

Family Applications (1)

Application Number Title Priority Date Filing Date
TW105102138A TWI596043B (en) 2016-01-22 2016-01-22 Unmanned aerial device and orientation method thereof

Country Status (1)

Country Link
TW (1) TWI596043B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110013642B (en) * 2019-04-04 2021-11-26 南京敏思软件有限公司 Method and system for determining motion state of intelligent stone lock and corresponding device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI394687B (en) * 2010-05-20 2013-05-01 Nat Defence University Hand-launched unmanned aerial system
CN105159321A (en) * 2015-08-18 2015-12-16 北京奇虎科技有限公司 Unmanned aerial vehicle-based photographing method and unmanned aerial vehicle

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI682877B (en) * 2018-06-04 2020-01-21 宏碁股份有限公司 Movable device control method, server device and movable device control system

Also Published As

Publication number Publication date
TW201726495A (en) 2017-08-01

Similar Documents

Publication Publication Date Title
US11649052B2 (en) System and method for providing autonomous photography and videography
US10218885B2 (en) Throwable cameras and network for operating the same
JP6122591B2 (en) Photogrammetry camera and aerial photography equipment
US9237317B2 (en) Throwable camera and network for operating the same
WO2018072657A1 (en) Image processing method, image processing device, multi-camera photographing device, and aerial vehicle
CN109071034A (en) Switch method, controller and the image stability augmentation equipment of holder operating mode
WO2019227441A1 (en) Video control method and device of movable platform
WO2018098704A1 (en) Control method, apparatus, and system, unmanned aerial vehicle, and mobile platform
CN113794840B (en) Video processing method, video processing equipment, unmanned aerial vehicle and video processing system
WO2021031159A1 (en) Match photographing method, electronic device, unmanned aerial vehicle and storage medium
WO2020172800A1 (en) Patrol control method for movable platform, and movable platform
WO2019227333A1 (en) Group photograph photographing method and apparatus
WO2019195991A1 (en) Trajectory determination and time-lapse photography methods, device, and machine readable storage medium
WO2020227998A1 (en) Image stability augmentation control method, photography device and movable platform
WO2019120082A1 (en) Control device, system, control method, and program
JP5866913B2 (en) Imaging device
WO2019195990A1 (en) Image collection method and device, and machine readable storage medium
TWI596043B (en) Unmanned aerial device and orientation method thereof
WO2019127341A1 (en) Pan-tilt control method and device, and computer-readable storage medium
WO2019206076A1 (en) Control device, camera, moving body, control method and program
WO2019104684A1 (en) Unmanned aerial vehicle control method, device and system
CN105807783A (en) Flight camera
JP2018078433A (en) Mobile imaging device and control method thereof, and imaging apparatus and control method thereof, drone, program, recording medium
WO2020042159A1 (en) Rotation control method and apparatus for gimbal, control device, and mobile platform
WO2021052217A1 (en) Control device for performing image processing and frame body control