WO2020261973A1 - Drive control system (運転制御システム) - Google Patents
- Publication number
- WO2020261973A1 (PCT/JP2020/022659)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control system
- transport vehicle
- unit
- transported
- posture
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/12—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to parameters of the vehicle itself, e.g. tyre models
- B60W40/13—Load or weight
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60P—VEHICLES ADAPTED FOR LOAD TRANSPORTATION OR TO TRANSPORT, TO CARRY, OR TO COMPRISE SPECIAL LOADS OR OBJECTS
- B60P9/00—Other vehicles predominantly for carrying loads, e.g. load carrying vehicles convertible for an intended purpose
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/026—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/50—Systems of measurement based on relative movement of target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4808—Evaluating distance, position or velocity data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60P—VEHICLES ADAPTED FOR LOAD TRANSPORTATION OR TO TRANSPORT, TO CARRY, OR TO COMPRISE SPECIAL LOADS OR OBJECTS
- B60P7/00—Securing or covering of load on vehicles
- B60P7/06—Securing of load
- B60P7/08—Securing to the vehicle floor or sides
- B60P7/0823—Straps; Tighteners
- B60P7/0861—Measuring or identifying the tension in the securing element
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2200/00—Type of vehicle
- B60Y2200/10—Road Vehicles
- B60Y2200/14—Trucks; Load vehicles, Busses
Definitions
- The present disclosure relates to an operation control system.
- The present application claims priority based on Japanese Patent Application No. 2019-118572 filed in Japan on June 26, 2019, the contents of which are incorporated herein by reference.
- The configuration of the transport vehicle shown in Patent Document 1 is complicated because a load collapse detection sensor must be newly provided in addition to the obstacle sensor. The same can be said of Patent Document 2.
- The load collapse determination unit obtains a set of a plurality of continuous, linearly arranged measurement points in the outer shape of the object to be transported, and uses the set as a reference for the posture of the object to be transported.
- The load collapse determination unit determines a load collapse of the object to be transported with a simple configuration, based on the measurement data of the measurement unit that detects an obstacle.
- The operation control system S according to the embodiment of the present disclosure is a system for transporting a transportation object (for example, a transportation object C) to a target place, or for supporting such transportation.
- The operation control system S of the present disclosure is a system that causes the transport vehicle T carrying the transportation object C to travel autonomously.
- The transport vehicle T has a structure including a trailer head H and a loading platform N connected to the trailer head H.
- The transport vehicle T may be unmanned or manned, but in the following description it is assumed to be an unmanned transport vehicle.
- The object to be transported C is loaded on the loading platform N.
- The operation control system S has a distance sensor 1 (measurement means or measurement unit), a shape recognition unit 2, a posture storage unit 3, a load collapse determination unit 4 (load collapse determination means), and a display unit 5.
- The shape recognition unit 2, the posture storage unit 3, and the load collapse determination unit 4 are configured as functions of the arithmetic unit P.
- This arithmetic unit P is a computer composed of a CPU (Central Processing Unit), memories such as a RAM (Random Access Memory) and a ROM (Read Only Memory), and an input/output device for exchanging signals with an external device.
- An example of the arithmetic unit P is an ECU (Electronic Control Unit) mounted on the transport vehicle T that controls an internal combustion engine, an electric motor, a transmission, or a combination thereof. The functions of the shape recognition unit 2, the posture storage unit 3, and the load collapse determination unit 4 may therefore be realized as algorithms executed by such a computer or ECU.
- Alternatively, each of the shape recognition unit 2, the posture storage unit 3, and the load collapse determination unit 4 may itself be a computer composed of a CPU, memories such as a RAM and a ROM, an input/output device that exchanges signals with an external device, and the like.
- In that case, the functions of the shape recognition unit 2, the posture storage unit 3, and the load collapse determination unit 4 are realized as algorithms executed by the respective computers provided in those units.
- The distance sensor 1 is provided at the upper part of the trailer head H of the transport vehicle T (vehicle), for example, as shown in FIGS. 1 and 2.
- The distance sensor 1 is a device that can detect the distance to an object by irradiating the object with pulsed laser light and measuring the time until the reflected light from the object reaches the distance sensor 1.
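The pulse time-of-flight principle described here can be sketched as follows. This is an illustrative sketch, not part of the patent; the function name and the example timing value are assumptions.

```python
# Minimal sketch of pulse time-of-flight ranging as used by a distance
# sensor of this kind: the laser pulse travels to the object and back,
# so the one-way distance is half the round-trip time multiplied by the
# speed of light. Names and values are illustrative, not from the patent.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the object, given the measured round-trip time."""
    return C * round_trip_seconds / 2.0

# A reflected pulse arriving after roughly 66.7 ns corresponds to about 10 m.
print(tof_distance(66.7e-9))
```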
- The distance sensor 1, attached to the front side in the traveling direction, can scan a region including the front and the rear of the transport vehicle T, and can detect an obstacle R present around the transport vehicle T (including in front of it) by measuring the distance to the obstacle R, which may obstruct the operation of the transport vehicle T.
- The obstacle R is assumed to be, for example, a utility pole, a tunnel wall, a crossing gate at a railroad crossing or a parking lot, a median strip, another parked or running vehicle, or a guard temporarily installed on a road surface under construction.
- The distance sensor 1 is, for example, a two-dimensional or three-dimensional LRF (Laser Range Finder), or a two-dimensional or three-dimensional LiDAR (Light Detection and Ranging, also written Laser Imaging Detection and Ranging). The distance sensor 1 may also use the data of a sensor employed for SLAM (Simultaneous Localization and Mapping).
- The objects whose distances the distance sensor 1 measures may be collectively referred to as surrounding objects. That is, the surrounding objects include the obstacle R, the transportation object C, the loading platform N, the ground G (road surface), and the like.
- The shape recognition unit 2 identifies the outer shape of an object from the data point cloud based on the measurement data acquired from the distance sensor 1.
- The measurement data are, for example, data indicating the distances to objects measured by the distance sensor 1.
- A measurement point is a point on the surface of an object whose distance from the distance sensor 1 is measured by irradiating the surface with a pulsed laser beam. A plurality of such points appear on the object, and these points are called a data point cloud.
- The shape recognition unit 2 groups (aggregates) consecutive measurement points having similar heights based on the acquired data point cloud, and identifies them as the outer shape of an object.
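The grouping step can be illustrated with a minimal sketch. The data layout (scan-ordered (distance, height) pairs) and the tolerance values are assumptions for illustration, not values from the patent.

```python
# Hypothetical sketch of the grouping step performed by shape
# recognition unit 2: consecutive measurement points whose distance
# and height differ by no more than a tolerance are aggregated into
# one group, treated as the outer shape of a single object.
# Data layout and thresholds are illustrative, not from the patent.

def group_points(points, dist_tol=0.3, height_tol=0.2):
    """points: list of (distance, height) tuples in scan order.
    Returns a list of groups (lists of consecutive, similar points)."""
    groups = []
    for p in points:
        if groups:
            prev = groups[-1][-1]
            if (abs(p[0] - prev[0]) <= dist_tol
                    and abs(p[1] - prev[1]) <= height_tol):
                groups[-1].append(p)  # similar to the previous point: same group
                continue
        groups.append([p])  # start a new group
    return groups

scan = [(5.0, 2.0), (5.1, 2.1), (5.2, 2.0),   # one continuous surface
        (9.0, 0.5), (9.1, 0.6)]               # a second, farther surface
print([len(g) for g in group_points(scan)])
```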
- The shape recognition unit 2 can detect not only the shape of the object to be transported C but also the outer shape of an object that may become an obstacle R in the traveling direction. The shape recognition unit 2 therefore also serves as an obstacle detection sensor.
- The distance sensor 1 and the arithmetic unit P are electrically and/or electronically connected so that signals can be exchanged between them.
- The posture storage unit 3 stores the groups of measurement points identified by the shape recognition unit 2 for each time (for each frame). The posture storage unit 3 likewise stores the outer shapes of objects in the traveling direction.
- The load collapse determination unit 4 determines whether the shape of a group of measurement points stored in the posture storage unit 3 is linear, and uses a linear group of measurement points as the reference for the posture of the object to be transported C when determining a load collapse. That is, the load collapse determination unit 4 compares the posture of a linear group in the current frame with the posture of the group at a close height position in the previous (or a past) frame, and determines that the load has collapsed when the difference in posture is equal to or greater than the threshold value.
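The frame-to-frame posture comparison can be illustrated as follows. Representing the posture of a linear group by the inclination of a least-squares line is one plausible realization, not necessarily the patent's; all names and the threshold value are assumptions.

```python
# Hypothetical sketch of the determination by load collapse
# determination unit 4: the posture of a linear group of measurement
# points is represented by the inclination of a least-squares line,
# and a collapse is flagged when the posture changes by at least a
# threshold between frames. Names and the threshold are illustrative.
import math

def posture_angle(points):
    """Least-squares slope of (x, z) points, as an angle in degrees."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    mz = sum(p[1] for p in points) / n
    num = sum((p[0] - mx) * (p[1] - mz) for p in points)
    den = sum((p[0] - mx) ** 2 for p in points)
    return math.degrees(math.atan2(num, den))

def collapsed(prev_group, curr_group, threshold_deg=5.0):
    """True when the change in posture meets or exceeds the threshold."""
    diff = abs(posture_angle(curr_group) - posture_angle(prev_group))
    return diff >= threshold_deg

level = [(0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]   # previous frame: level edge
tilted = [(0.0, 1.0), (1.0, 1.2), (2.0, 1.4)]  # current frame: tilted edge
print(collapsed(level, tilted))
```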
- The display unit 5 is, for example, a monitor provided in a central control facility or the like that manages the running of unmanned transportation vehicles, and displays a warning or the like when it is determined that the load has collapsed.
- A warning display may also be shown on the windshield of the driver's seat, on an in-vehicle monitor, or the like.
- The central control facility or the like refers to equipment that allows an observer to constantly monitor the operating status of an unmanned transportation vehicle, such as the presence or absence of a load collapse. If necessary, instructions such as a stop instruction may be transmitted from the central control facility or the like to the unmanned transportation vehicle. Further, a plurality of unmanned transportation vehicles may be monitored simultaneously at the central control facility or the like.
- The distance sensor 1 measures the distance to the point cloud of measurement points on the object to be transported C (step S1). The shape recognition unit 2 then compares the vertical heights of the acquired point cloud and groups measurement points that are close in distance and close in height (that is, whose differences in distance and height are equal to or less than predetermined values) (step S2).
- When the determination in step S4 is YES, that is, when the group is linear, the load collapse determination unit 4 determines whether the difference between the posture of the measurement point group (G1) in the current frame and the posture of the measurement point group (G2) at a close position in the previous frame, stored in the posture storage unit 3, is equal to or greater than the threshold value (step S5). When the determination in step S5 is YES, that is, when the difference in posture is equal to or greater than the threshold value, it is determined that a load collapse has occurred, and a warning is displayed on the display unit 5 (step S6). When the determination in step S5 is NO, that is, when the difference in posture is less than the threshold value, it is determined that the load has not collapsed.
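The linearity determination of steps S3-S4 can be sketched as a check that every point in a group lies close to the line through the group's endpoints. The tolerance and data layout are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the linearity check (steps S3-S4): a group is
# treated as linear when every interior point lies within a tolerance
# of the straight line through the group's endpoints.
import math

def is_linear(points, tol=0.05):
    """points: list of (x, z) tuples in order along the outline.
    True if all interior points lie within tol of the endpoint line."""
    if len(points) < 3:
        return True
    (x0, z0), (x1, z1) = points[0], points[-1]
    dx, dz = x1 - x0, z1 - z0
    length = math.hypot(dx, dz)
    if length == 0.0:
        return False  # degenerate group: endpoints coincide
    for x, z in points[1:-1]:
        # perpendicular distance from (x, z) to the endpoint line
        d = abs(dz * (x - x0) - dx * (z - z0)) / length
        if d > tol:
            return False
    return True

print(is_linear([(0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]))  # straight edge
print(is_linear([(0.0, 1.0), (1.0, 2.0), (2.0, 1.0)]))  # bent outline
```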
- The connection angle between the trailer head H and the object to be transported C may change when the transport vehicle T turns.
- The value of the connection angle between the trailer head H and the object to be transported C is acquired from a sensor (for example, an angle sensor) provided on the transport vehicle T for measuring the connection angle, and the distance sensor 1 acquires this value.
- Because the distance sensor 1 also serves as an obstacle detection sensor and the load collapse determination unit 4 determines the load collapse, the load collapse of the object to be transported C can be determined with a simple configuration, without providing a load collapse detection sensor separately from the obstacle detection sensor. Since the load collapse of the object to be transported C can be determined with a simple configuration, the required processing capacity of the arithmetic unit P of the operation control system S can be kept low, and the processing speed of the arithmetic unit P can be increased. This contributes to realizing autonomous traveling of the transport vehicle T carrying the transportation object C, which is an improvement in computer-related technology.
- A set of a plurality of continuous, linearly arranged measurement points in the outer shape of the object to be transported C is used as the reference for the posture of the object to be transported C.
- Since the distance sensor 1 is attached to the upper part of the transport vehicle T, blind spots can be reduced both ahead of and behind the vehicle in the traveling direction. It is therefore possible to measure the obstacle R ahead in the traveling direction and the transported object C behind in the traveling direction.
- The present disclosure is not limited to the above embodiment; for example, the following modifications can be considered.
- In the above embodiment, when a load collapse occurs, a warning is displayed on the display unit 5, but the present disclosure is not limited to this.
- For example, when a load collapse occurs, a process of stopping the traveling of the transport vehicle T may be performed from the central control facility or the like. This process may be performed manually by an observer at the central control facility or the like, or automatically by the central control facility or the like.
- In the above embodiment, the distance is measured using LiDAR as the measurement unit, but the present disclosure is not limited to this.
- For example, an omnidirectional camera, a wide-angle radar, or the like may be used as the measurement unit.
- A plurality of distance sensors 1 may also be provided.
- For example, the distance sensors 1 may be provided at two positions, at the left and right ends of the trailer head H' in the direction orthogonal to the traveling direction of the transport vehicle T' (the horizontal direction).
- By reducing the blind spots of the distance sensors 1 in this way, not only is the measurement range widened, but the posture of the object to be transported C can also be detected by comparing the measurement points of the two distance sensors 1, so the posture of the object to be transported C can be detected more widely and more accurately.
- In the above embodiment, the operation control system S is a system that supports autonomous driving of an unmanned transportation vehicle, but the present disclosure is not limited to this.
- For example, it may be a system that provides driving support information to the driver of a manned transportation vehicle.
- In that case, the load collapse warning information is displayed on a monitor visible to the driver of the transport vehicle T, or on the windshield.
- In the above embodiment, the load collapse warning information is displayed on the display unit, but the present disclosure is not limited to this.
- For example, the load collapse warning information may be notified by voice to the manager of the unmanned transportation vehicle or to the driver of the manned transportation vehicle.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Mechanical Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Transportation (AREA)
- Automation & Control Theory (AREA)
- Mathematical Physics (AREA)
- Human Computer Interaction (AREA)
- Traffic Control Systems (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
2 Shape recognition unit
3 Posture storage unit
4 Determination unit
5 Display unit
C Object to be transported
S Operation control system
Claims (5)
- An operation control system comprising: a measurement unit that detects an obstacle to a transport vehicle; and a load collapse determination unit that acquires a posture of an object to be transported loaded on the transport vehicle based on measurement data of the measurement unit, and determines a load collapse of the object to be transported based on a change in the posture over time.
- The operation control system according to claim 1, wherein the measurement unit is mounted on the transport vehicle and measures distances between the transport vehicle and surrounding objects by scanning laser light.
- The operation control system according to claim 2, wherein the load collapse determination unit uses a set of a plurality of continuous, linearly arranged measurement points in the outer shape of the object to be transported as a reference for the posture of the object to be transported.
- The operation control system according to any one of claims 1 to 3, wherein the measurement unit is installed at an upper part of the transport vehicle.
- The operation control system according to any one of claims 1 to 3, wherein the measurement units are installed at left and right ends of the transport vehicle, respectively.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020217040521A KR102587696B1 (ko) | 2019-06-26 | 2020-06-09 | 운전 제어 시스템 |
JP2021528106A JP7180777B2 (ja) | 2019-06-26 | 2020-06-09 | 運転制御システム |
CN202080045695.9A CN114072314B (zh) | 2019-06-26 | 2020-06-09 | 驾驶控制系统 |
US17/619,748 US20220355806A1 (en) | 2019-06-26 | 2020-06-09 | Drive control system |
EP20833690.9A EP3992031A4 (en) | 2019-06-26 | 2020-06-09 | DRIVE CONTROL SYSTEM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019118572 | 2019-06-26 | ||
JP2019-118572 | 2019-06-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020261973A1 true WO2020261973A1 (ja) | 2020-12-30 |
Family
ID=74060964
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/022659 WO2020261973A1 (ja) | 2019-06-26 | 2020-06-09 | 運転制御システム |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220355806A1 (ja) |
EP (1) | EP3992031A4 (ja) |
JP (1) | JP7180777B2 (ja) |
KR (1) | KR102587696B1 (ja) |
CN (1) | CN114072314B (ja) |
WO (1) | WO2020261973A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20240083353A1 (en) * | 2022-09-14 | 2024-03-14 | Venti Technologies | Container Misalignment Detection System and Methods for Autonomous Vehicles |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012158445A (ja) | 2011-02-01 | 2012-08-23 | Murata Machinery Ltd | 搬送車 |
JP2013230903A (ja) * | 2012-04-27 | 2013-11-14 | Hitachi Ltd | フォークリフト |
JP2014186693A (ja) | 2013-03-25 | 2014-10-02 | Murata Mach Ltd | 自律移動式無人搬送車 |
JP2017019596A (ja) * | 2015-07-08 | 2017-01-26 | 株式会社豊田自動織機 | 無人フォークリフトにおける荷取り時の走行制御方法及び荷取り時の走行制御装置 |
JP2019118572A (ja) | 2017-12-28 | 2019-07-22 | ユニ・チャーム株式会社 | 吸収性物品 |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3299125B2 (ja) * | 1996-09-30 | 2002-07-08 | 日野自動車株式会社 | 荷崩れ警報装置 |
JP4330050B2 (ja) * | 1999-08-27 | 2009-09-09 | 東急車輛製造株式会社 | パレット搬送車 |
JP2005018472A (ja) * | 2003-06-26 | 2005-01-20 | Nec Mobiling Ltd | 運送車両の荷崩れ監視システムおよびその方法 |
JP5165503B2 (ja) * | 2008-08-29 | 2013-03-21 | 全国農業協同組合連合会 | 積荷監視方法及びそのための積荷監視システム |
JP2013035399A (ja) * | 2011-08-08 | 2013-02-21 | Toshiba Alpine Automotive Technology Corp | 荷室監視装置 |
CN103273880A (zh) * | 2013-05-23 | 2013-09-04 | 无锡伊佩克科技有限公司 | 一种货车货物防掉落系统 |
US9902397B2 (en) * | 2014-07-30 | 2018-02-27 | Komatsu Ltd. | Transporter vehicle and transporter vehicle control method |
JP2016081159A (ja) * | 2014-10-14 | 2016-05-16 | トヨタ自動車株式会社 | 移動体 |
KR102326062B1 (ko) * | 2014-11-12 | 2021-11-12 | 현대모비스 주식회사 | 자율주행차량의 장애물 회피 시스템 및 방법 |
US9944213B2 (en) * | 2015-05-13 | 2018-04-17 | Stratom, Inc. | Robotic cargo system |
JP6741402B2 (ja) * | 2015-06-18 | 2020-08-19 | 日野自動車株式会社 | 情報提供装置、車両、および情報提供方法 |
US9958872B2 (en) * | 2016-06-06 | 2018-05-01 | International Business Machines Corporation | Cargo-based vehicle control |
US10346797B2 (en) * | 2016-09-26 | 2019-07-09 | Cybernet Systems, Inc. | Path and load localization and operations supporting automated warehousing using robotic forklifts or other material handling vehicles |
US10720070B2 (en) * | 2018-01-03 | 2020-07-21 | Qualcomm Incorporated | Adjustable object avoidance proximity threshold of a robotic vehicle based on presence of detected payload(s) |
US11041956B2 (en) * | 2018-05-24 | 2021-06-22 | Toyota Motor Engineering & Manufacturing North America, Inc. | Lidar module and portable lidar scanner unit |
US20210354924A1 (en) * | 2018-10-29 | 2021-11-18 | Fq Ip Ab | Navigator for Intralogistics |
US11770677B1 (en) * | 2019-01-08 | 2023-09-26 | Spirited Eagle Enterprises, LLC | Enhanced safety systems and method for transportation vehicles |
WO2020203253A1 (ja) * | 2019-04-03 | 2020-10-08 | 株式会社Ihi | 重量推定システム |
US11429113B2 (en) * | 2019-08-08 | 2022-08-30 | Lg Electronics Inc. | Serving system using robot and operation method thereof |
US11062582B1 (en) * | 2020-02-07 | 2021-07-13 | Ford Global Technologies, Llc | Pick-up cargo bed capacitive sensor systems and methods |
US20220035684A1 (en) * | 2020-08-03 | 2022-02-03 | Nvidia Corporation | Dynamic load balancing of operations for real-time deep learning analytics |
US20220379792A1 (en) * | 2021-05-25 | 2022-12-01 | Stratom, Inc. | Cargo transport system |
WO2023064520A1 (en) * | 2021-10-14 | 2023-04-20 | Tusimple, Inc. | Systems and methods for operating an autonomous vehicle |
WO2024102846A1 (en) * | 2022-11-08 | 2024-05-16 | Seegrid Corporation | System and method for definition of a zone of dynamic behavior with a continuum of possible actions and locations within the same |
-
2020
- 2020-06-09 KR KR1020217040521A patent/KR102587696B1/ko active IP Right Grant
- 2020-06-09 WO PCT/JP2020/022659 patent/WO2020261973A1/ja unknown
- 2020-06-09 JP JP2021528106A patent/JP7180777B2/ja active Active
- 2020-06-09 EP EP20833690.9A patent/EP3992031A4/en active Pending
- 2020-06-09 CN CN202080045695.9A patent/CN114072314B/zh active Active
- 2020-06-09 US US17/619,748 patent/US20220355806A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012158445A (ja) | 2011-02-01 | 2012-08-23 | Murata Machinery Ltd | 搬送車 |
JP2013230903A (ja) * | 2012-04-27 | 2013-11-14 | Hitachi Ltd | フォークリフト |
JP2014186693A (ja) | 2013-03-25 | 2014-10-02 | Murata Mach Ltd | 自律移動式無人搬送車 |
JP2017019596A (ja) * | 2015-07-08 | 2017-01-26 | 株式会社豊田自動織機 | 無人フォークリフトにおける荷取り時の走行制御方法及び荷取り時の走行制御装置 |
JP2019118572A (ja) | 2017-12-28 | 2019-07-22 | ユニ・チャーム株式会社 | 吸収性物品 |
Also Published As
Publication number | Publication date |
---|---|
CN114072314B (zh) | 2024-06-11 |
EP3992031A4 (en) | 2023-07-19 |
KR20220006622A (ko) | 2022-01-17 |
EP3992031A1 (en) | 2022-05-04 |
US20220355806A1 (en) | 2022-11-10 |
CN114072314A (zh) | 2022-02-18 |
JPWO2020261973A1 (ja) | 2020-12-30 |
KR102587696B1 (ko) | 2023-10-10 |
JP7180777B2 (ja) | 2022-11-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210325897A1 (en) | Sensor array for an autonomously operated utility vehicle and method for surround-view image acquisition | |
CN113329927B (zh) | 基于激光雷达的挂车追踪 | |
US10260889B2 (en) | Position estimation device and position estimation method | |
JP7154362B2 (ja) | 作業車 | |
US20170320437A1 (en) | Measurement of a Dimension on a Surface | |
WO2020203253A1 (ja) | 重量推定システム | |
US11280897B2 (en) | Radar field of view extensions | |
US9616886B2 (en) | Size adjustment of forward objects for autonomous vehicles | |
US11709260B2 (en) | Data driven resolution function derivation | |
US20220035029A1 (en) | Method for evaluating an effect of an object in the surroundings of a transport device on a driving maneuver of the transport device | |
JP2009075638A (ja) | 車種判別装置 | |
US11999384B2 (en) | Travel route generation device and control device | |
WO2020261973A1 (ja) | 運転制御システム | |
JP2020067702A (ja) | 姿勢検出装置及び運搬システム | |
KR20220064112A (ko) | 다중 정보 기반의 증강 현실 영상을 이용한 선박블록 운송 장치 및 그것을 이용한 가시화 방법 | |
CN218949346U (zh) | 一种无人驾驶牵引车 | |
US20230182730A1 (en) | Apparatus for calculating driving path area of vehicle and method thereof | |
US11993274B2 (en) | Monitoring of on-board vehicle image capturing device functionality compliance | |
US20220163675A1 (en) | Methods of Using Background Images from a Light Detection and Ranging (LIDAR) Device | |
US20240094384A1 (en) | Object detection using reflective surfaces | |
JP6886237B2 (ja) | 移動体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20833690 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021528106 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20217040521 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020833690 Country of ref document: EP Effective date: 20220126 |