TW202124990A - State estimation and sensor fusion methods for autonomous vehicles - Google Patents
- Publication number
- TW202124990A (application number TW108146328A)
- Authority
- TW
- Taiwan
- Prior art keywords
- state
- mobile vehicle
- transportation
- item
- patent application
- Prior art date
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
- G05D1/0223—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0253—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
- G05D1/0259—Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/0278—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Acoustics & Sound (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
The present invention relates to a method for estimating the state of a device, and more particularly to a mobile vehicle and a state estimation and sensor fusion switching method thereof.
An Automated Guided Vehicle (AGV) is a mobile robot that can transport goods within factories and warehouses using technologies such as floor-embedded guide wires, machine vision, or laser navigation. Because an AGV can load, unload, and transport goods automatically, it reduces the labor of loading and unloading, and loading/unloading locations and transport routes can be allocated flexibly to improve delivery efficiency and alleviate problems such as lane occupation.
AGVs rely on technologies such as positioning and object recognition to carry out cargo handling. In recent years a variety of positioning technologies have emerged, including Bluetooth, WiFi, Ultra-Wideband (UWB), visible light positioning systems, and Radio Frequency Identification (RFID). Depending on deployment cost, accuracy, and technical characteristics, each of these positioning technologies has the fields of application to which it is best suited. Because of this diversity, seamless indoor/outdoor positioning is difficult to achieve by simply switching between two systems.
An object of the present invention is to provide a mobile vehicle and a state estimation and sensor fusion switching method thereof that can realize seamless switching among multiple positioning systems.
The present invention provides a state estimation and sensor fusion switching method for a mobile vehicle, where the mobile vehicle includes at least one sensor, at least one actuator, and a processor and is used to transfer and transport objects. The method includes the following steps: receiving a task instruction for moving an object and the data required to execute the task instruction; dividing the task instruction into multiple work phases according to mapped positions, and mapping each work phase to one of a transport state and an execution state to establish a semantic hierarchy; estimating the current position of the mobile vehicle using the sensor; and mapping the current position to one of the work phases in the semantic hierarchy to estimate the current state of the mobile vehicle.
The present invention also provides a mobile vehicle that includes a data capture device, at least one sensor, at least one actuator, a storage device, and a processor. The sensor is used to estimate the current position of the mobile vehicle. The actuator is used to transfer and transport objects. The storage device stores the data captured by the data capture device as well as multiple computer instructions or programs. The processor, coupled to the data capture device, the sensor, the actuator, and the storage device, is configured to execute the computer instructions or programs to: receive, via the data capture device, a task instruction for moving an object and the data required to execute the task instruction; divide the task instruction into multiple work phases according to mapped positions and map each work phase to one of a transport state and an execution state to establish a semantic hierarchy; and map the current position estimated by the sensor to one of the work phases in the semantic hierarchy to estimate the current state of the mobile vehicle.
By dividing a task instruction into multiple work phases and mapping them to different states to establish a semantic hierarchy, the mobile vehicle and its state estimation and sensor fusion switching method can, while the vehicle is transferring and transporting objects, map the estimated position to the current state and determine whether a state transition has occurred. When a state transition occurs, the vehicle can quickly switch to a sensing combination suited to the new state and continue executing the task instruction. In this way, state estimation and sensor fusion switching are performed efficiently, realizing seamless switching between positioning systems.
To make the above features and advantages of the present invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.
Embodiments of the present invention provide a common architecture for Automated Guided Vehicles (AGVs), in which a received task instruction is divided into multiple work phases according to their mapped positions to establish a semantic hierarchy; each work phase is then mapped through the semantic hierarchy to a state layer according to its order and connections, establishing a state transition model. During real-time operation, the AGV estimates its current position and maps that position through the semantic hierarchy to estimate its current state. In addition, the AGV can compare the current state with the previous state to determine whether a state transition has occurred and, when one has, re-prioritize its sensors so as to efficiently switch to the control thread suited to the current state and continue the handling task.
FIG. 1 is a block diagram of a mobile vehicle according to an embodiment of the present invention. Referring to FIG. 1, the mobile vehicle 10 of this embodiment is an electronic device used to transfer and transport objects, such as an autonomous mobile vehicle or a handling robot. The mobile vehicle 10 includes a data capture device 12, at least one sensor 14, at least one actuator 16, a storage device 18, and a processor 20, whose functions are described below.
The data capture device 12 is, for example, an interface device such as a Universal Serial Bus (USB) interface, a FireWire interface, a Thunderbolt interface, or a card reader, which can connect to external devices such as flash drives, portable hard disks, or memory cards to capture data. In another embodiment, the data capture device 12 is an input device such as a keyboard, mouse, touchpad, or touch screen that detects a user's input operations to capture input data. In yet another embodiment, the data capture device 12 is a network card supporting a wired connection such as Ethernet, or a wireless network card supporting wireless communication standards such as Institute of Electrical and Electronics Engineers (IEEE) 802.11n/b/g, which can connect to external devices over a wired or wireless network to capture data.
The sensor 14 is, for example, a wireless communication subsystem, a global positioning system (GPS), Bluetooth Low Energy (BLE), an inertial measurement unit (IMU), a rotary encoder, a camera, a photodetector, a laser, or a combination thereof. It can sense environmental information around the mobile vehicle 10, such as electromagnetic waves, images, and sound waves, as well as the vehicle's own inertia and displacement, and provide the detected information to the processor 20 for estimating the current position and/or state of the mobile vehicle 10. In one embodiment, the sensor 14 can be paired with systems such as a laser mapper or odometry to improve the accuracy of the position estimate.
The actuator 16 is, for example, a fork, an arm, a roller, a motor, or a combination thereof, which can form a fork-arm handling system and perform operations such as loading, unloading, and transporting objects according to control commands or signals issued by the processor 20.
The storage device 18 can be any type of fixed or removable random access memory (RAM), read-only memory (ROM), flash memory, a similar element, or a combination of the above. In this embodiment, the storage device 18 stores the data captured by the data capture device 12 and the computer instructions or programs accessed and executed by the processor 20. The captured data includes task instructions and the map data, identification information, and other data required to execute them; the processor 20 can use the map data for position estimation and use the identification information to identify the transferred object, the loading or unloading locations, and the loading or unloading recipients. The method of identifying loading and unloading recipients includes biometric features, object features, environmental features, or identification codes, without limitation.
The processor 20 is, for example, a Central Processing Unit (CPU) or Graphics Processing Unit (GPU), or another programmable general-purpose or special-purpose microprocessor, Digital Signal Processor (DSP), programmable controller, Application Specific Integrated Circuit (ASIC), Programmable Logic Device (PLD), other similar device, or a combination of these devices. The processor 20 is connected to the data capture device 12, the sensor 14, the actuator 16, and the storage device 18; it loads computer instructions or programs from the storage device 18, for example, and executes the state estimation and sensor fusion switching method of the present invention accordingly. Detailed steps of this method are described in the embodiments below.
FIG. 2 is a flowchart of a state estimation and sensor fusion switching method for a mobile vehicle according to an embodiment of the present application. Referring to FIGS. 1 and 2 together, the method of this embodiment is applicable to the mobile vehicle 10 of FIG. 1; the detailed steps are described below with reference to the elements of the mobile vehicle 10.
In step S202, the processor 20 uses the data capture device 12 to receive a task instruction for moving an object and the data required to execute it. The task instruction is issued, for example, by a site manager to instruct the mobile vehicle 10 to transfer and transport objects within the site. In one embodiment, the processor 20 stores frequently read or soon-to-be-used data, such as map data of nearby areas and identification information for the object, the loading or unloading locations, and the loading or unloading recipients, in the storage device 18 for access.
In step S204, the processor 20 divides the task instruction into multiple work phases according to mapped positions, and maps each work phase to one of a transport state and an execution state to establish a semantic hierarchy. The task instruction consists of at least one of loading, unloading, and transporting; the processor 20, for example, associates each of these tasks with at least one control thread and divides the work phases according to the control threads. Loading and unloading phases are distinguished, for example, by the loading location, the unloading location, the transferred object, and the identification of the loading and unloading recipients; transport phases are distinguished, for example, by the geographic information system of each place the transport passes through.
In one embodiment, the processor 20 classifies the state of the mobile vehicle 10 into two categories: the transport state and the execution state. In the transport state, the processor 20, for example, sets a route using a path planner. The path planner constructs a visibility graph according to the method proposed by Ghosh and Mount, computes an optimal path over the edges of the visibility graph using a shortest-path algorithm such as Dijkstra's algorithm, and generates low-level commands that control the motors of the mobile vehicle 10 to adjust direction and speed so as to track the planned path. During transport, the processor 20 uses the sensor 14 to continuously sense the surroundings and confirm that the mobile vehicle 10 is following the route; when an obstacle is detected, the processor 20 controls the motors to slow down or stop according to the odometry data, uses the laser mapping system to map the obstacle's shape, and outputs it to the path planner to plan an avoidance route. In the execution state, on the other hand, the processor 20, for example, activates the camera to identify the loading/unloading recipient and controls the transfer mechanism to load and unload objects.
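The transport-state planning described above, a shortest-path search such as Dijkstra's algorithm over the edges of a visibility graph, can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the toy `edges` dictionary stands in for a visibility graph built with the Ghosh and Mount method, whose construction is out of scope here.

```python
import heapq

def dijkstra(edges, start, goal):
    """Shortest path over a visibility graph.

    edges: {node: [(neighbor, cost), ...]} -- the (assumed, precomputed)
    visibility-graph adjacency list. Returns (path, total_cost).
    """
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in edges.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    # walk predecessors back from the goal to reconstruct the path
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]
```

The low-level motor commands that track the resulting path (direction and speed adjustments) would be generated downstream of this search.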
Specifically, when performing state analysis, the state estimation and sensor fusion switching method of this embodiment establishes a semantic hierarchy to give the vehicle a cognitive capability. The semantic hierarchy can be established dynamically based on the task instruction and includes three layers: mapped positions, work phases, and states.
For example, FIG. 3 is a schematic diagram of a semantic hierarchy according to an embodiment of the present application. Referring to FIG. 3, the semantic hierarchy 30 includes a mapped position layer 32, a work phase layer 34, and a state layer 36. The mapped position layer 32 contains the areas or positions involved in executing the task instruction, for example coordinates 1~3, map tiles 1~3, and (transfer location/recipient) images 1~3. The work phase layer 34 contains multiple work phases, for example loading P1, transport P2~P3, and unloading P4. Each position in the mapped position layer 32 can be mapped to one of loading P1, transport P2~P3, and unloading P4; for example, coordinate 3 and map tile 3 map to loading P1, while coordinate 2, image 2, and image 3 map to unloading P4, and so on. The state layer 36 contains the execution state and the transport state: loading P1 and unloading P4 map to the execution state, while transport P2~P3 map to the transport state. Each execution state and transport state can correspond to a thread of a feedback control loop, and this thread couples, for example, specific sensors 14 and actuators 16 to control them to perform specific operations.
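The three-layer mapping of FIG. 3 can be sketched as two nested lookup tables: mapped positions resolve to work phases, and work phases resolve to states. All identifiers here (`coord_3`, `load_P1`, and so on) are illustrative stand-ins for the figure's labels, not names from the patent.

```python
# Hypothetical encoding of the semantic hierarchy of FIG. 3.
SEMANTIC_HIERARCHY = {
    # mapped position layer -> work phase layer
    "positions": {
        "coord_3": "load_P1", "tile_3": "load_P1",
        "tile_1": "transport_P2", "tile_2": "transport_P3",
        "coord_2": "unload_P4", "image_2": "unload_P4", "image_3": "unload_P4",
    },
    # work phase layer -> state layer
    "phases": {
        "load_P1": "EXECUTION",
        "transport_P2": "TRANSPORT",
        "transport_P3": "TRANSPORT",
        "unload_P4": "EXECUTION",
    },
}

def estimate_state(position):
    """Map an estimated position up through the hierarchy to (phase, state)."""
    phase = SEMANTIC_HIERARCHY["positions"][position]
    return phase, SEMANTIC_HIERARCHY["phases"][phase]
```

For instance, an estimated position of coordinate 3 resolves first to loading P1 and then to the execution state, which is exactly the lookup step S208 performs.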
In one embodiment, after establishing the semantic hierarchy, the processor 20 further maps each work phase, through the semantic hierarchy, to one of the transport state and the execution state according to the order of and connections between the work phases, forming a state transition model.
For example, FIG. 4 is a schematic diagram of a state transition model according to an embodiment of the present application. Referring to FIG. 4, the state transition model 40 defines, for example, the transitions between the work phases under the transport state and the execution state in the semantic hierarchy. That is, the state transition model 40 maps transitions between work phases to transitions between states. In the example of FIG. 4, the state transition model 40 records the transitions among work phases 1~n mapped to the transport state, the transitions among work phases 1~m mapped to the execution state, and the transitions between work phases 1~n and work phases 1~m. The table at the lower left records the sensors and actuators coupled to work phases 1~n of the transport state, and the table at the lower right records the sensors and actuators coupled to work phases 1~m of the execution state. For example, work phase 1 of the transport state couples the global positioning system and a base station, work phase 2 of the transport state couples the photodetector, the inertial measurement unit, and the rotary encoder, and so on.
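A state transition model of this kind can be sketched as an adjacency table of allowed phase transitions plus a table of the sensors coupled to each phase. The phase names and sensor couplings below are illustrative assumptions loosely following the FIG. 4 examples, and `can_transition`/`sensors_for` are hypothetical helper names.

```python
# Allowed work-phase transitions (assumed linear order for illustration).
TRANSITIONS = {
    "load_P1": ["transport_P2"],
    "transport_P2": ["transport_P3"],
    "transport_P3": ["unload_P4"],
}

# Sensors coupled to each work phase, as in the lower tables of FIG. 4.
SENSOR_COUPLING = {
    "transport_P2": ["gps", "base_station"],
    "transport_P3": ["photodetector", "imu", "rotary_encoder"],
    "load_P1": ["camera"],
    "unload_P4": ["camera"],
}

def can_transition(prev_phase, next_phase):
    """True if the model allows moving from prev_phase to next_phase."""
    return next_phase in TRANSITIONS.get(prev_phase, [])

def sensors_for(phase):
    """Sensor set coupled to a work phase."""
    return SENSOR_COUPLING[phase]
```

When a transition fires, the coupled sensor set of the new phase seeds the ranked sensing combinations tried during fusion switching.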
After the semantic hierarchy and state transition model are established, during real-time operation the mobile vehicle 10 can estimate its current position and map that position through the semantic hierarchy to estimate its current state.
Specifically, in step S206, the processor 20 uses the sensor 14 to estimate the current position of the mobile vehicle 10. The processor 20 can, for example, use a global positioning system or base-station positioning system to estimate an outdoor position, or use positioning devices such as a photodetector or laser to estimate an indoor position, without limitation.
Finally, in step S208, the processor 20 maps the current position to one of the work phases in the semantic hierarchy to estimate the current state of the mobile vehicle 10. Taking FIG. 3 as an example, when the processor 20 estimates the current position of the mobile vehicle 10 and obtains coordinate 3, it maps coordinate 3 through the semantic hierarchy 30 to loading P1 among the work phases, and then maps loading P1 to the execution state. Accordingly, the processor 20 can couple the corresponding sensors and actuators to perform primitive behaviors or skills according to the estimated current state.
After estimating the current state of the mobile vehicle 10, the processor 20, for example, compares the current state with the previous state estimated at the preceding time point to determine whether a state transition has occurred. When it determines that a state transition has occurred, the processor 20 sequentially switches among the multiple sensing combinations under that transition according to the previously established state transition model, selecting an available sensing combination to continue executing the task instruction. A sensing combination includes at least one sensor and/or actuator. By re-ranking the combinations of sensing signal sources at a state transition, the vehicle can efficiently switch to the control thread suited to the current state and continue its work.
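The transition check and sequential combination switching can be sketched as follows. `switch_sensing` and its arguments are hypothetical names, and the `is_available` probe is a stand-in for testing whether a sensing combination matches the on-site positioning system.

```python
def switch_sensing(prev_state, curr_state, ranked_combos, is_available):
    """Pick a sensing combination after a state estimate update.

    ranked_combos: tuples of sensor names, most likely match first
                   (the re-ranked list for this particular transition).
    is_available:  predicate probing whether a combination matches
                   the on-site positioning system.
    Returns the chosen combination, or None when no transition occurred.
    """
    if curr_state == prev_state:
        return None  # no state transition: keep the current combination
    for combo in ranked_combos:  # try combinations in ranked order
        if is_available(combo):
            return combo
    raise RuntimeError("no sensing combination matches the site")
```

This mirrors the Table 1 scenario below, where combination 1 fails to match and combination 2 is selected, as well as the Table 2 scenario, where the top-ranked combination matches immediately.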
For example, FIGS. 5A to 5D show an example of the sensor fusion switching method according to an embodiment of the present application. The autonomous mobile vehicle V of this embodiment is, for example, an automatic pickup-and-delivery vehicle equipped with a transfer mechanism, used to deliver goods from a warehouse to outdoor customers.
Referring to FIG. 5A, the autonomous mobile vehicle V receives a task instruction to transport an object O and the data required to execute it, including the position of the object O on a shelf S and the identification code I of the object O (the QR code shown in the figure). It then performs state analysis, determines that it is beside the shelf S inside the warehouse, and enters the execution state to pick up the goods. The autonomous mobile vehicle V uses a camera C to photograph the identification code I of the object O on the shelf S in order to identify the object O, and when it confirms that the object O is the cargo indicated by the task instruction, it uses the transfer mechanism A to pick up the object O.
Referring to FIG. 5B, after picking up the goods, the autonomous mobile vehicle V switches from the execution state to the transport state and starts the path planner to plan a delivery route. Because switching from the execution state to the transport state triggers a state transition, the autonomous mobile vehicle V sequentially switches sensing combinations until the selected combination matches the on-site positioning system.
For example, Table 1 below shows the sensing combinations for this state transition. The autonomous mobile vehicle V switches among these sensing combinations in order to select an available one and continue executing the task instruction. Here, the autonomous mobile vehicle V finds after trying sensing combination 1 that it cannot match the on-site positioning system, switches to sensing combination 2, and finds that sensing combination 2 matches the on-site positioning system, so it directly selects sensing combination 2 to continue executing the task instruction.
Referring to FIG. 5C, when the autonomous mobile vehicle V moves along the planned route and is about to move from inside the warehouse to outdoors, the state mapped from the currently estimated position differs from the state estimated at the preceding time point (that is, the work phase changes from warehouse to outdoor), so a state transition is triggered again and the sensing combinations are re-ranked.
For example, Table 2 below shows the sensing combinations for this state transition. The autonomous mobile vehicle V finds after trying sensing combination 1 that it matches the on-site positioning system, so it directly selects sensing combination 1 to continue executing the task instruction. Because the autonomous mobile vehicle V switches in the order of the sensing combinations most likely to match under this state transition (that is, the work phase changing from warehouse to outdoor), it can switch positioning systems efficiently and seamlessly.
Referring to FIG. 5D, after the autonomous mobile vehicle V arrives at the unloading site, it estimates its current position, maps the estimated position through the semantic hierarchy, and thereby estimates the current state to be the execution state. Switching from the transport state to the execution state triggers a state transition, so the autonomous mobile vehicle V switches sensing combinations to perform the identification operations required for unloading.
For example, Table 3 below shows the sensing combinations for this state transition. When the autonomous mobile vehicle V switches to sensing combination 1, it activates the camera; because the camera supports the identification operation required for unloading, namely identifying the unloading recipient T (for example, face recognition), the autonomous mobile vehicle V directly selects sensing combination 1 to continue executing the task instruction. When the identity of the unloading recipient T is confirmed to match, the autonomous mobile vehicle V activates the transfer mechanism A to deliver the object O to the unloading recipient T.
In summary, the mobile vehicle and state estimation and sensor fusion switching method of the present invention establish a semantic hierarchy by dividing a task instruction into multiple work phases and mapping them to different states. While executing the task of transferring and transporting objects, the mobile vehicle can map its estimated position to the current state and determine whether a state transition has occurred; when one occurs, it can also quickly switch to a sensing combination suited to the new state and continue executing the task instruction. In this way, state estimation and sensor fusion switching of the mobile vehicle are performed efficiently, realizing seamless switching between positioning systems.
Although the present invention has been disclosed through the above embodiments, they are not intended to limit it. Anyone with ordinary knowledge in the relevant technical field may make some changes and modifications without departing from the spirit and scope of the present invention. The protection scope of the present invention shall therefore be defined by the appended claims.
10: mobile vehicle
12: data capture device
14: sensor
16: actuator
18: storage device
20: processor
30: semantic hierarchy
32: mapped position layer
34: work phase layer
36: state layer
40: state transition model
A: transfer mechanism
C: camera
I: identification code
O: object
P1~P4: work phases
S: shelf
T: unloading recipient
V: autonomous mobile vehicle
W: warehouse
S202~S208: steps
FIG. 1 is a block diagram of a mobile vehicle according to an embodiment of the present invention.
FIG. 2 is a flowchart of a state estimation and sensor fusion switching method for a mobile vehicle according to an embodiment of the present application.
FIG. 3 is a schematic diagram of a semantic hierarchy according to an embodiment of the present application.
FIG. 4 is a schematic diagram of a state transition model according to an embodiment of the present application.
FIGS. 5A to 5D show an example of the sensor fusion switching method according to an embodiment of the present application.
S202~S208: steps
Claims (16)
Priority Applications (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW108146328A | 2019-12-18 | 2019-12-18 | State estimation and sensor fusion methods for autonomous vehicles |
| CN202010086218.9A | 2019-12-18 | 2020-02-11 | Mobile carrier and state estimation and sensing fusion switching method thereof |
Applications Claiming Priority (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW108146328A | 2019-12-18 | 2019-12-18 | State estimation and sensor fusion methods for autonomous vehicles |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| TWI715358B | 2021-01-01 |
| TW202124990A | 2021-07-01 |
Family
ID=75237391
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| TW108146328A | State estimation and sensor fusion methods for autonomous vehicles | 2019-12-18 | 2019-12-18 |
Country Status (2)

| Country | Link |
|---|---|
| CN | CN113075923B |
| TW | TWI715358B |
Family Cites Families (11)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2002023297A1 | 2000-09-11 | 2002-03-21 | Kunikatsu Takase | Mobile body movement control system |
| TW201321292A | 2011-11-16 | 2013-06-01 | Ind Tech Res Inst | Transportation method, storage device, container, support plate, and trailer thereof |
| EP3074832A4 | 2013-11-27 | 2017-08-30 | The Trustees Of The University Of Pennsylvania | Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV) |
| KR101644270B1 | 2015-05-15 | 2016-08-01 | 한경대학교 산학협력단 | Unmanned freight transportation system using automatic positioning and moving route correcting |
| CN111792034B | 2015-05-23 | 2022-06-24 | 深圳市大疆创新科技有限公司 | Method and system for estimating state information of movable object using sensor fusion |
| KR101822103B1 | 2015-10-26 | 2018-01-25 | 주식회사 가치소프트 | System for sorting product using sorting apparatus and method thereof |
| KR101793932B1 | 2016-06-13 | 2017-11-07 | 주식회사 가치소프트 | System for arranging product |
| US10295365B2 | 2016-07-29 | 2019-05-21 | Carnegie Mellon University | State estimation for aerial vehicles using multi-sensor fusion |
| US10866102B2 | 2016-12-23 | 2020-12-15 | X Development Llc | Localization of robotic vehicles |
| US10038979B1 | 2017-01-31 | 2018-07-31 | Qualcomm Incorporated | System and method for ranging-assisted positioning of vehicles in vehicle-to-vehicle communications |
| CN110223212B | 2019-06-20 | 2021-05-18 | 上海智蕙林医疗科技有限公司 | Dispatching control method and system for transport robot |
- 2019-12-18: TW application TW108146328A filed (patent TWI715358B, active)
- 2020-02-11: CN application CN202010086218.9A filed (patent CN113075923B, active)
Also Published As

| Publication number | Publication date |
|---|---|
| CN113075923A | 2021-07-06 |
| TWI715358B | 2021-01-01 |
| CN113075923B | 2024-04-12 |
Similar Documents

| Publication | Title |
|---|---|
| JP5047709B2 | Moving device, system, moving method, and moving program |
| JP6731423B2 | Apparatus and method for navigation control |
| JP7161040B2 | Zone engine for providing contextual enhanced map layers |
| JP6802137B2 | Transport vehicle system, transport vehicle control system and transport vehicle control method |
| JP5982729B2 | Transport management device, transport system, and transport management program |
| US10507578B1 | Optimization of observer robot locations |
| JP2019529277A | Collaborative inventory monitoring |
| CN103635779A | Method and apparatus for facilitating map data processing for industrial vehicle navigation |
| JP2005320074A | Device and program for retrieving/collecting articles |
| US11797906B2 | State estimation and sensor fusion switching methods for autonomous vehicles |
| CN113654558A | Navigation method and device, server, equipment, system and storage medium |
| US11468770B2 | Travel control apparatus, travel control method, and computer program |
| KR101955628B1 | System and method for managing position of material |
| KR102580082B1 | Proximity robot object detection and avoidance |
| TWI715358B | State estimation and sensor fusion methods for autonomous vehicles |
| JP2021039450A | System and method for design assist, and program |
| US9501755B1 | Continuous navigation for unmanned drive units |
| US20220083062A1 | Robot navigation management between zones in an environment |
| US20220291696A1 | Transport system, control apparatus, transport method, and program |
| US20220317704A1 | Transport system, control apparatus, transport method, and program |
| WO2020032157A1 | Article position estimation system and article position estimation method |
| CN114833823A | Intelligent logistics robot based on SLAM navigation, control method and application |
| US11802948B2 | Industrial vehicle distance and range measurement device calibration |
| CA3184958A1 | Industrial vehicle distance and range measurement device calibration |