WO2023119566A1 - Transport system - Google Patents

Transport system

Info

Publication number
WO2023119566A1
Authority
WO
WIPO (PCT)
Prior art keywords
marker
position information
camera device
transport vehicle
image
Prior art date
Application number
PCT/JP2021/047923
Other languages
English (en)
Japanese (ja)
Inventor
Andrei Pidin (アンドレイ ピディン)
Original Assignee
Honda Motor Co., Ltd. (本田技研工業株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co., Ltd. (本田技研工業株式会社)
Priority to PCT/JP2021/047923
Publication of WO2023119566A1

Links

Images

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions

Definitions

  • the present invention relates to a transport system.
  • Patent Document 1 discloses a work system that detects the position of a transport vehicle based on GPS signals.
  • Positioning signals such as GPS signals are used for highly accurate positioning, but when the construction site is indoors or surrounded by buildings, there may be cases where the positioning signals cannot be stably received.
  • An object of the present invention is to provide a transport technology that can move a transport vehicle to a predetermined position in a work area based on image information captured by a camera, without relying on positioning signals such as GPS signals.
  • a transport system includes a camera device provided in a work area, a transport vehicle capable of autonomous movement in the work area, and a server capable of communicating with the camera device and the transport vehicle.
  • The camera device includes image processing means for acquiring relative position information of the camera device with respect to a first marker and a second marker, based on a captured image of the first marker and the second marker in which information indicating a three-dimensional direction in the work area is set.
  • The server includes processing means for generating relative position information by coordinate-converting the relative position information acquired from the camera device into position information in the work area.
  • The transport vehicle includes image acquisition means for acquiring an image of its surroundings; computing means for computing the position information of the second marker based on the image of the first marker acquired by the image acquisition means and the relative position information acquired from the server; and control means for setting the position information as a target value and controlling the movement of the transport vehicle.
  • According to the present invention, it is possible to provide a transport technology capable of moving a transport vehicle to a predetermined position in a work area based on image information captured by a camera, without using a positioning signal such as a GPS signal.
  • FIG. 5 is a diagram showing an example of an image in front of the transport vehicle captured by a camera of the transport vehicle.
  • FIG. 6 is a diagram illustrating an image captured by a camera of the transport vehicle; ST61 shows a running state of the transport vehicle, and ST62 shows an example of an image of the second marker.
  • FIG. 7 is a diagram for explaining the flow of processing in the camera device.
  • FIG. 1 is a diagram showing a configuration example of the transport system STM according to the embodiment.
  • The transport system STM includes at least a camera device 10 provided in a work area, a transport vehicle 20 (also referred to as an "autonomous guided vehicle") capable of autonomous movement, and a server 30 (information processing device); the camera device 10, the transport vehicle 20, and the server 30 can communicate with one another via a network NW.
  • a terminal 40 of an operator who manages the transport system STM can communicate with the server 30 via the network NW.
  • The server 30 transmits a confirmation request to the operator's terminal 40 when transmitting information to the transport vehicle 20.
  • The operator approves the request via the terminal 40.
  • the operator can monitor the status of the transport vehicle 20 in the transport system STM through the terminal 40 .
  • FIG. 2 is a diagram schematically showing the camera device 10 and the carrier 20 in the work area WA, and the camera device 10 is arranged at a predetermined position in the work area WA.
  • the camera device 10 is arranged at a position P (x, y, z) within the work area WA.
  • At least one camera device 10 may be arranged in the work area WA, and a plurality of camera devices 10 may be arranged.
  • Identification information may be assigned to each of the plurality of camera devices 10 and added to the information (relative position information) transmitted from each camera device 10 to the server 30.
  • the server 30 can identify each camera device arranged within the work area WA based on the identification information.
  • The work area WA in which the transport vehicle 20 moves may be either outdoors or indoors; in this embodiment, as shown in FIG. 2, an example in which the transport vehicle 20 moves indoors will be described.
  • the camera device 10 has an imaging unit 11, an image processing unit 12, and a communication unit 13 as functional configurations.
  • The imaging unit 11 of the camera device 10 is capable of capturing still images or moving images; for example, a digital camera with an image sensor is used.
  • the image processing unit 12 acquires the relative distance (relative position information) between the object and the camera device 10 based on the image of the object captured by the imaging unit 11 .
  • the image processing unit 12 acquires relative position information between the object and the camera device 10 by image processing the image captured by the imaging unit 11 .
  • the object to be imaged by the imaging unit 11 is a two-dimensional marker (ArUco marker) in which a predetermined pattern is formed in a rectangular area.
  • The objects to be imaged by the imaging unit 11 include a first marker MK1 provided on a first plane of the work area WA and a second marker MK2 provided on a second plane.
  • The first marker MK1 is provided on a plane WA10 (for example, an indoor plane: first plane) that intersects the road surface WA20 on which the transport vehicle 20 moves, and the second marker MK2 is provided on the road surface WA20 (for example, an indoor road surface: second plane).
  • Acquisition of relative position information using an ArUco marker is a known technique, and detailed description thereof will be omitted.
  • FIG. 3 is a diagram illustrating an image of the first marker MK1 and the second marker MK2 captured by the imaging unit 11 of the camera device 10.
  • the first marker MK1 and the second marker MK2 (ArUco marker) each have coordinate information indicating a three-dimensional direction.
  • the image processing unit 12 performs various image processing such as smoothing processing and sharpening processing on the captured image, and recognizes the first marker MK1 and the second marker MK2 from the image.
  • The image processing unit 12 uses the first marker MK1 and the second marker MK2 recognized by the image processing to determine the distance (relative position information d1) between the camera device 10 and the first marker MK1 and the distance (relative position information d2) between the camera device 10 and the second marker MK2.
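The patent treats ArUco pose estimation as a known technique, but the core geometric idea behind recovering distances d1 and d2 from a single image can be sketched with a simplified pinhole-camera model. The function name and every numeric value below are illustrative assumptions, not values from the patent:

```python
# Simplified sketch: estimate the distance to a square marker of known physical
# size from its apparent size in the image, using the pinhole relation
# d = f * S / s (f: focal length in pixels, S: real side length, s: pixel side length).
# This ignores the full 6-DoF ArUco pose estimation (rotation, off-axis offsets).

def marker_distance(focal_length_px: float,
                    marker_side_m: float,
                    marker_side_px: float) -> float:
    """Distance along the optical axis implied by the marker's apparent size."""
    return focal_length_px * marker_side_m / marker_side_px

# Example: an 800 px focal length and a 0.20 m marker.
d1 = marker_distance(800.0, 0.20, 40.0)  # first marker MK1, 40 px wide -> 4.0 m
d2 = marker_distance(800.0, 0.20, 16.0)  # second marker MK2, 16 px wide -> 10.0 m
print(d1, d2)  # -> 4.0 10.0
```

In practice the ArUco technique recovers a full rotation and translation per marker; this one-line relation only illustrates why a marker of known physical size is enough to obtain a metric distance from an image.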
  • the communication unit 13 can bi-directionally communicate with the server 30 via the network NW.
  • the communication unit 13 transmits the relative position information d1 and d2 acquired by the image processing unit 12 to the server 30 .
  • the communication unit 13 transmits identification information specifying the camera device 10 to the server 30 together with the relative position information d1 and d2.
  • the server 30 can manage the identification information of the camera device 10 and the relative position information (d1, d2) in association with each other.
  • the server 30 includes a processing unit 31, a communication unit 32, and a storage unit 33, which are connected by a bus (not shown).
  • The processing unit 31 is a processor represented by a CPU, and implements various functions of the server 30 by executing programs stored in the storage unit 33. That is, information processing by the software stored in the storage unit 33 is realized by the processing unit 31, an example of hardware, and is executed as each functional unit included in the processing unit 31.
  • The storage unit 33 is, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), an HDD (Hard Disk Drive), or an SSD (Solid State Drive), and stores programs and various data.
  • the communication unit 32 is a communication interface with an external device.
  • The processing unit 31 acquires the relative position information (relative position information d1 and relative position information d2) transmitted from the camera device 10 via the communication unit 32. Then, based on the acquired relative position information, the processing unit 31 generates relative position information D1 and D2 in the system coordinate system, which indicates overall positions in the work area WA, and transmits the generated relative position information D1 and D2 to the transport vehicle 20.
  • the relative position information transmitted from the camera device 10 is information acquired in the coordinate system of the camera device 10 (camera coordinate system).
  • The processing unit 31 identifies the position P of the camera device 10 within the work area WA based on the identification information transmitted from the camera device 10. Then, using the identified position P, the processing unit 31 converts the relative position information d1 and d2 in the camera coordinate system into relative position information D1 and D2 in the system coordinate system (x, y, z), and transmits the generated relative position information D1 and D2 to the transport vehicle 20.
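The coordinate conversion performed by the processing unit 31 can be sketched as a rigid transform from the camera coordinate system into the system coordinate system (x, y, z). A yaw-only rotation is assumed for brevity, and the camera pose values are invented for illustration:

```python
import math

# Sketch of the server-side conversion: an offset d measured in the camera
# coordinate system is rotated by the camera's known yaw and translated by its
# known position P in the work area, yielding system-coordinate information D.

def camera_to_system(d, cam_pos, cam_yaw_rad):
    """Apply a yaw rotation and a translation to a camera-frame offset (x, y, z)."""
    dx, dy, dz = d
    c, s = math.cos(cam_yaw_rad), math.sin(cam_yaw_rad)
    return (cam_pos[0] + c * dx - s * dy,
            cam_pos[1] + s * dx + c * dy,
            cam_pos[2] + dz)

# Camera at P = (10, 5, 3) m facing the +y axis (yaw = 90 degrees);
# a marker seen 4 m ahead of and 3 m below the camera.
D1 = camera_to_system((4.0, 0.0, -3.0), (10.0, 5.0, 3.0), math.pi / 2)
print(D1)  # -> approximately (10.0, 9.0, 0.0)
```

A full implementation would use the camera's complete rotation matrix rather than yaw alone; the structure (rotate, then translate by the known camera position P) stays the same.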
  • FIG. 4 is a block diagram of the transport vehicle 20 according to one embodiment of the present invention.
  • ST41 is a plan view of the transport vehicle 20
  • ST42 is a side view of the transport vehicle 20.
  • In the drawing, Fr, Rr, L, and R indicate front, rear, left, and right when the transport vehicle 20 travels forward. Up and Dn indicate the upper side and the lower side of the transport vehicle 20.
  • the transport vehicle 20 is a vehicle that can move autonomously in the work area WA, is a vehicle that does not have a driver's seat or a driving mechanism for a passenger, and is unmanned during travel.
  • The transport vehicle 20 of the present embodiment is a four-wheeled vehicle having two front wheels 40f and two rear wheels 40r, and transports a load placed on a carrier 400.
  • the transport vehicle 20 is an electric vehicle that uses a battery 41 as a main power source.
  • the battery 41 is, for example, a secondary battery such as a lithium ion battery.
  • the transport vehicle 20 includes an electric travel mechanism 42 .
  • the electric travel mechanism 42 includes a travel mechanism 43 , a steering mechanism 45 and a braking mechanism 46 .
  • the traveling mechanism 43 is a mechanism that advances or reverses the transport vehicle 20 by using the traveling motor 44 as a drive source, and in the case of this embodiment, the rear wheels 40r are used as driving wheels.
  • Braking mechanisms 46 such as disc brakes are provided for the front wheels 40f and the rear wheels 40r, respectively.
  • the steering mechanism 45 is a mechanism that uses a steering motor 47 as a drive source to give a steering angle to the front wheels 40f.
  • The electric travel mechanism 42 of this embodiment may include a two-wheel steering mechanism that steers only the front wheels 40f, or a four-wheel steering mechanism that steers both the front wheels 40f and the rear wheels 40r.
  • the transport vehicle 20 includes a detection unit 480 that detects surrounding conditions.
  • the detection unit 480 is a group of external sensors that monitor the surroundings of the carrier 20 .
  • the external sensor is, for example, a millimeter wave radar, and detects obstacles around the transport vehicle 20 using radio waves.
  • the external sensor is, for example, Light Detection and Ranging (LIDAR), and detects obstacles around the transport vehicle 20 by light.
  • a control unit (ECU) 49 can measure the distance to an obstacle by analyzing information detected by the detection unit 480 .
  • External sensors can be provided at the front, rear, left and right sides of the transport vehicle 20, respectively, so that the four directions of the transport vehicle 20 can be monitored.
  • the transport vehicle 20 includes a positioning sensor 410 and a communication device 420.
  • the positioning sensor 410 receives positioning signals from artificial satellites forming a GNSS (Global Navigation Satellite System).
  • GNSS Global Navigation Satellite System
  • An example of GNSS is GPS (Global Positioning System).
  • the positioning sensor 410 receives a positioning signal (GNSS signal, eg, GPS signal) to detect the current position of the transport vehicle 20 .
  • the communication device 420 performs communication (wireless communication) with the server 30 and acquires information.
  • the transport vehicle 20 includes a control unit (ECU) 49.
  • the control unit 49 includes a processor represented by a CPU, a storage device such as a semiconductor memory and a hard disk, and an interface with an external device.
  • the storage device stores programs executed by the processor, data (map information) used by the processor for processing, and the like.
  • a plurality of sets of processors, storage devices, and interfaces may be provided for each function of the transport vehicle 20 and configured to communicate with each other.
  • The control unit (ECU) 49 controls the electric travel mechanism 42 and performs information processing on the detection results of the positioning sensor 410 and the communication results of the communication device 420.
  • the control unit 49 searches for a route from the current location to the destination based on the map information.
  • the communication device 420 can access the database of the server 30 on the network NW and acquire map information.
  • the transport vehicle 20 includes an image acquisition unit (camera) 48 that acquires images of the surroundings.
  • the image acquisition unit (camera) 48 is provided, for example, in front of the transport vehicle 20, and acquires an image in front when the transport vehicle 20 travels.
  • The control unit 49 can control automatic operation of the transport vehicle 20 based on the image information acquired by the image acquisition unit (camera) 48, the information acquired by the communication device 420 (relative position information D1 and D2), and the map information.
  • FIG. 5 is a diagram showing an example of an image in front of the transport vehicle 20 captured by the image acquisition unit (camera) 48 of the transport vehicle 20.
  • The control unit (ECU) 49 can perform image processing similar to that of the image processing unit 12 of the camera device 10.
  • The objects captured by the image acquisition unit (camera) 48 are the first marker MK1 and the second marker MK2 (ArUco markers); by image processing of the captured image, the control unit (ECU) 49 acquires relative position information between each object and the transport vehicle 20.
  • In the state shown in FIG. 5, the control unit (ECU) 49 can acquire the coordinate information of the first marker MK1 and the relative position information with respect to the first marker MK1.
  • On the other hand, the coordinate information of the second marker MK2 and the relative position information with respect to the second marker MK2 cannot yet be acquired.
  • In a work area WA where GPS signals cannot be stably detected, assuming that the position of the second marker MK2 is the transport target position of the transport vehicle 20, there may be cases where the relative position information of the second marker MK2 cannot be acquired from the information of the image acquisition unit (camera) 48 alone.
  • Therefore, the control unit (ECU) 49 obtains the relative position information of the second marker MK2 (the transport target position) based on the relative position information acquired by the camera device 10 (the relative position information D1 and D2 generated by the server 30) and the relative position information with respect to the first marker MK1, and controls the travel of the transport vehicle 20.
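The inference performed here is essentially vector arithmetic: the vehicle sees MK1 directly, and the server-supplied information fixes the offset between MK1 and MK2 in the system coordinate system. The sketch below assumes D1 and D2 can be read as the two marker positions in that frame; all names and values are illustrative, not from the patent:

```python
# Sketch: estimate the position of the unseen second marker MK2 from
# (a) the observed relative position of the first marker MK1 and
# (b) the MK1 -> MK2 offset implied by the server's information D1, D2.

def vadd(a, b):
    return tuple(x + y for x, y in zip(a, b))

def vsub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def estimate_mk2(vehicle_pos, rel_to_mk1, D1, D2):
    """Shift the observed MK1 position by the marker-to-marker offset D2 - D1."""
    mk1_pos = vadd(vehicle_pos, rel_to_mk1)  # where the vehicle sees MK1
    return vadd(mk1_pos, vsub(D2, D1))       # estimated MK2 position (target value)

mk2 = estimate_mk2(vehicle_pos=(0.0, 0.0, 0.0),
                   rel_to_mk1=(6.0, 1.0, 0.0),
                   D1=(10.0, 9.0, 0.0),
                   D2=(10.0, 14.0, 0.0))
print(mk2)  # -> (6.0, 6.0, 0.0)
```

The point of the design is visible in the arithmetic: MK2 never has to be in the vehicle camera's field of view, because the fixed marker-to-marker offset was measured once by the stationary camera device 10.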
  • FIG. 7 is a diagram explaining the flow of processing in the camera device 10
  • FIG. 8 is a diagram explaining the flow of processing in the server 30
  • FIG. 9 is a diagram explaining the flow of processing in the transport vehicle 20.
  • In step S710, the imaging unit 11 captures an image of the objects (the first marker MK1 and the second marker MK2).
  • In step S720, the image processing unit 12 performs image processing on the image captured by the imaging unit 11.
  • In step S730, the image processing unit 12 acquires the distance (relative position information) between each object and the camera device 10.
  • the first marker MK1 and the second marker MK2 (ArUco marker) are set with coordinate information indicating three-dimensional directions.
  • Specifically, the image processing unit 12 acquires the distance (relative position information d1) between the camera device 10 and the first marker MK1 and the distance (relative position information d2) between the camera device 10 and the second marker MK2.
  • step S740 the communication unit 13 transmits the relative position information d1 and d2 acquired by the image processing unit 12 to the server 30.
  • the communication unit 13 may transmit identification information for specifying the camera device 10 to the server 30 together with the relative position information d1 and d2.
  • If the server 30 has identified the position of the camera device 10 in advance, it is not always necessary to transmit the identification information.
  • In step S810, the processing unit 31 acquires, via the communication unit 32, the relative position information (relative position information d1 and relative position information d2) and the identification information transmitted from the camera device 10.
  • In step S820, the processing unit 31 identifies the position P of the camera device 10 within the work area WA based on the identification information transmitted from the camera device 10, and converts the relative position information d1 and d2 in the camera coordinate system into the system coordinate system (x, y, z) to generate relative position information D1 and D2.
  • In step S830, the processing unit 31 transmits the relative position information D1 and D2 generated in step S820 to the transport vehicle 20 via the communication unit 32.
  • In step S900, the transport vehicle 20 travels using the GPS information detected by the positioning sensor 410.
  • In step S910, the control unit 49 determines whether the first marker MK1 has been recognized from the image captured by the image acquisition unit (camera) 48. If the first marker MK1 is not recognized (S910-No), the process returns to step S900 and repeats. If the first marker MK1 is recognized (S910-Yes), the control unit 49 advances to step S920.
  • In step S920, the control unit 49 acquires the coordinate information of the first marker MK1 and acquires relative position information with respect to the first marker MK1 based on the current position of the transport vehicle 20.
  • In step S930, the control unit 49 acquires, via the communication device 420, the relative position information D1 and D2 generated by the server 30.
  • In step S940, the control unit 49 calculates the position (position information) of the second marker MK2 in the work area WA and sets it as the new transport target position.
  • The control unit 49 sets the position (position information) of the second marker MK2 as a target value and controls the movement of the transport vehicle 20.
  • FIG. 6 is a diagram illustrating an image captured by the image acquisition unit (camera) 48 of the transport vehicle 20; ST61 is a side view showing the running state of the transport vehicle 20, and ST62 shows an example of an image of the second marker MK2.
  • The second marker MK2 is provided on the road surface WA20, and when the second marker MK2 enters the imaging range 600 of the image acquisition unit (camera) 48, an image as shown in ST62 can be acquired.
  • In this state, the control unit (ECU) 49 can obtain the coordinate information of the second marker MK2 and the relative position information with respect to the second marker MK2.
  • In step S950, the control unit 49 determines whether the second marker MK2 has been recognized from the image captured by the image acquisition unit (camera) 48. If the second marker MK2 is not recognized (S950-No), the process returns to step S920 and repeats. If the second marker MK2 is recognized (S950-Yes), the control unit 49 advances to step S960.
  • In step S960, the control unit 49 acquires the coordinate information of the second marker MK2 and sets the acquired coordinate information as the final target position (target value).
  • That is, when the second marker MK2 is recognized, the control unit 49 updates the target value based on the coordinate information acquired from the second marker MK2 and controls the movement of the transport vehicle 20 based on the updated target value.
  • In step S970, the control unit 49 sets a route to the second marker MK2 using the current position of the transport vehicle 20 and the coordinate information of the second marker MK2, and controls the movement (navigation) of the transport vehicle 20.
  • Conventionally, the target point of a transport vehicle is designated based on map information created in advance from a 3D point cloud; however, it is not realistic to create such map information at a place such as a construction site where the layout changes frequently.
  • According to the transport system STM of the present embodiment, the transport vehicle can be moved to a predetermined position in the work area based on the information of the image captured by the camera, without using a positioning signal such as a GPS signal. Further, the transport vehicle 20 can be positioned at a predetermined position (the second marker MK2) in the work area WA with a simpler system configuration, without frequently creating map information.
  • The transport system of the above embodiment is a transport system (STM) comprising a camera device (10) provided in a work area, a transport vehicle (20) capable of autonomous movement in the work area, and a server (30) capable of communicating with the camera device and the transport vehicle.
  • The camera device (10) comprises image processing means (12) for acquiring relative position information of the camera device with respect to a first marker and a second marker, based on a captured image of the first marker and the second marker in which information indicating a three-dimensional direction in the work area is set.
  • The server (30) comprises processing means (31) for generating relative position information by coordinate-converting the relative position information acquired from the camera device into position information in the work area.
  • The transport vehicle (20) comprises image acquisition means (48) for acquiring an image of its surroundings; computing means (49) for computing the position information of the second marker based on the image of the first marker acquired by the image acquisition means and the relative position information acquired from the server; and control means (49) for setting the position information as a target value and controlling the movement of the transport vehicle.
  • The control means (49) updates the target value based on the coordinate information acquired from the second marker, and controls the movement of the transport vehicle based on the updated target value.
  • the first marker is provided on a plane (WA10) that intersects the road surface (WA20) on which the transport vehicle moves, and the second marker is provided on the road surface.
  • the first marker and the second marker are ArUco markers.
  • the transport vehicle further comprises detection means (480) for receiving a positioning signal and detecting the current position of the transport vehicle,
  • The control means (49) controls the movement of the transport vehicle based on at least one of the positioning signal detected by the detection means and the position information of the second marker calculated by the computing means.
  • According to this configuration, it is possible to provide a transport technology capable of moving a transport vehicle to a predetermined position in a work area based on image information captured by a camera, without relying on positioning signals such as GPS signals.
  • The transport vehicle 20 can be positioned at a predetermined position (the second marker MK2) in the work area WA with a simpler system configuration, without frequently creating map information.
  • The present invention can also be realized by supplying a program that implements the functions of the above-described embodiments to a system, or to a camera device, transport vehicle, or server constituting the system, via a network or a storage medium, and having one or more processors read and execute the program.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A transport system comprises a camera device arranged in a work area, a transport vehicle capable of moving autonomously in the work area, and a server capable of communicating with the camera device and the transport vehicle. The camera device comprises an image processing unit that acquires information on the positions of the camera device relative to a first marker and a second marker, in which information indicating a three-dimensional direction in the work area is set, on the basis of images obtained by capturing the first marker and the second marker. The server comprises a processing unit that generates relative position information by performing a coordinate transformation of the relative position information acquired from the camera device into position information of the work area. The transport vehicle comprises: a computation unit that computes position information of the second marker on the basis of the image of the first marker, acquired by an image acquisition unit that acquires a surrounding image, and the relative position information acquired from the server; and a control unit that sets said position information as a target value to control the movement of the transport vehicle.
PCT/JP2021/047923 2021-12-23 2021-12-23 Système de transport WO2023119566A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/047923 WO2023119566A1 (fr) 2021-12-23 2021-12-23 Système de transport

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/047923 WO2023119566A1 (fr) 2021-12-23 2021-12-23 Système de transport

Publications (1)

Publication Number Publication Date
WO2023119566A1 true WO2023119566A1 (fr) 2023-06-29

Family

ID=86901793

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/047923 WO2023119566A1 (fr) 2021-12-23 2021-12-23 Système de transport

Country Status (1)

Country Link
WO (1) WO2023119566A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019240208A1 (fr) * 2018-06-13 2019-12-19 Groove X株式会社 Robot, robot control method, and program
JP2020149463A (ja) * 2019-03-14 2020-09-17 株式会社東芝 Mobile body action registration device, mobile body action registration system, and mobile body action determination device


Similar Documents

Publication Publication Date Title
RU2720138C2 Method of automatic approach to a loading/unloading platform for use in heavy-duty trucks
CN109278672B Method, device, and system for a wireless vehicle trailer for assisting various maneuvers
Chong et al. Autonomy for mobility on demand
JP2021520000 Trailer detection and autonomous hitching
US20200110410A1 Device and method for processing map data used for self-position estimation, mobile body, and control system for mobile body
JP2020107324A Collection and processing of data distributed among vehicles in a vehicle platoon
CN110998472A Mobile body and computer program
CN109624973B Vehicle control device, vehicle control method, and storage medium
JP6876863B2 Vehicle control system, vehicle control method, and program
US20230288935A1 Methods and systems for performing inter-trajectory re-linearization about an evolving reference path for an autonomous vehicle
US20180267547A1 Distributed computing among vehicles
US11807224B2 Automated valet parking system
US20220198810A1 Information processing device and information processing method
US20200292332A1 Map information distribution system and vehicle
US20210284139A1 Parking information management server, parking assist device, and parking assist system
CN113459852A Path planning method, device, and mobile tool
WO2023119566A1 Transport system
CN110427034B Target tracking system and method based on vehicle-road cooperation
Wang et al. Intelligent distribution framework and algorithms for connected logistics vehicles
CN115237113B Robot navigation method, robot, robot system, and storage medium
CN114954511A Vehicle control device, vehicle control method, and storage medium
CN113734179A Driving trajectory application method, apparatus, device, storage medium, and vehicle
CN114562997A Vehicle positioning system and closed-area navigation system including the same
JP6934760B2 Traveling device, traveling control system, traveling control method, and program
WO2023132080A1 Work vehicle, control device, control method, and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21969001

Country of ref document: EP

Kind code of ref document: A1