WO2023243444A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2023243444A1
Authority
WO
WIPO (PCT)
Prior art keywords
control unit
information
information processing
processing device
situation
Prior art date
Application number
PCT/JP2023/020662
Other languages
French (fr)
Japanese (ja)
Inventor
新 平野
誠 本城
Original Assignee
京セラ株式会社
Application filed by Kyocera Corporation (京セラ株式会社)
Publication of WO2023243444A1

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y - INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y10/00 - Economic sectors
    • G16Y10/40 - Transportation
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y - INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y20/00 - Information sensed or collected by the things
    • G16Y20/20 - Information sensed or collected by the things relating to the thing itself
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y - INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y40/00 - IoT characterised by the purpose of the information processing
    • G16Y40/30 - Control

Definitions

  • the present invention relates to an information processing device.
  • The information processing device includes an acquisition unit that acquires, from a mobile object requesting an operation command, situation information regarding the situation of that mobile object, and a control unit capable of assigning a remote control task for the mobile object to a remote control unit capable of remotely controlling it. When a plurality of mobile objects are waiting for assignment of the remote control task, the control unit determines the order of assignment based on the situation information of each of the plurality of mobile objects.
  • FIG. 1 is a diagram illustrating a configuration example of a remote support system including an information processing device according to an embodiment.
  • FIG. 2 is a block diagram showing a schematic configuration of the mobile object in FIG. 1.
  • FIG. 3 is a block diagram showing a schematic configuration of the remote control unit in FIG. 1.
  • FIG. 4 is a block diagram showing a schematic configuration of the information processing device in FIG. 1.
  • FIG. 5 is a table of the first evaluation value and the second evaluation value for each priority factor.
  • FIG. 6 is an evaluation map for calculating a priority from the first total evaluation value and the second total evaluation value.
  • FIG. 7 is a flowchart for explaining the order determination processing executed by the control unit in FIG. 4.
  • A remote support system 11 including an information processing device 10 according to an embodiment may be configured to include a remote operation unit 12 and the information processing device 10.
  • The remote support system 11 may be able to communicate with a mobile object 14 via a network 13.
  • The remote support system 11 may communicate with the mobile object 14 via wireless communication.
  • the mobile object 14 is capable of autonomous operation.
  • the mobile object 14 can perform remote operation based on operation commands obtained from the remote support system 11.
  • the mobile body 14 may be capable of being manually driven by a passenger of the mobile body 14 .
  • the moving object 14 may include, for example, a vehicle, a ship, an aircraft, and the like.
  • Vehicles may include, for example, automobiles, industrial vehicles, railroad vehicles, residential vehicles, fixed-wing aircraft that travel on runways, and the like.
  • Motor vehicles may include, for example, cars, trucks, buses, motorcycles, trolleybuses, and the like.
  • Industrial vehicles may include, for example, agricultural and construction industrial vehicles.
  • Industrial vehicles may include, for example, forklifts, golf carts, and the like.
  • Agricultural industrial vehicles may include, for example, tractors, tillers, transplanters, binders, combines, lawn mowers, and the like.
  • Industrial vehicles for construction may include, for example, bulldozers, scrapers, shovels, crane trucks, dump trucks, road rollers, and the like.
  • the moving body 14 may include a sensor section 15, a communication section 16, an operation control system 17, and a control section 18.
  • the sensor unit 15 may include, for example, an indoor camera that images the interior of the moving body 14, a microphone, and a vital sensor as sensors that detect the occupant status.
  • the indoor camera may generate images as occupant information indicating the occupant status.
  • the microphone may generate the occupant's voice as occupant information indicating the occupant status.
  • the vital sensor may generate the state of sickness of the passenger as passenger information indicating the passenger condition.
  • the sensor unit 15 may include a position detection sensor that detects the position of the mobile body 14 itself.
  • the operation control system 17 may operate the moving body 14 by controlling a drive unit such as an engine or a motor, a direction adjustment unit such as a steering wheel, and a brake of the moving body 14.
  • the driving control system 17 may determine whether autonomous driving is possible based on information indicating the surrounding situation detected by the sensor unit 15.
  • the driving control system 17 may determine whether autonomous driving is possible, for example, by comparing an evaluation value such as safety calculated for autonomous driving based only on information indicating the surrounding situation with a threshold value.
  • the control unit 18 includes one or more processors and memory.
  • the processor may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing.
  • the dedicated processor may include an application specific integrated circuit (ASIC).
  • the processor may include a programmable logic device (PLD).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the control unit 18 may be either an SoC (System-on-a-Chip) or an SiP (System In-a-Package) in which one or more processors cooperate.
  • the control unit 18 may control the communication unit 16 to transmit surrounding information to the information processing device 10 from after the remote control task is assigned in response to a request for an operation command until autonomous driving is resumed.
  • the control unit 18 may transfer the operation command to the operation control system 17 to perform remote operation.
  • the remote control unit 12 may be configured to include an output section 19 and an input section 20, as shown in FIG.
  • The remote control unit 12 allows the operator of the remote control unit 12 to recognize the surrounding situation of the moving body 14 through the output unit 19, while the input unit 20 detects the operator's operation input as input information; the moving body 14 can thereby be controlled remotely.
  • the output unit 19 may output surrounding information of the moving body 14 that is the operation target of the assigned remote control task, which will be described later.
  • the output unit 19 is, for example, a display, and may display an image or distance distribution of objects as surrounding information provided by assignment of a remote control task.
  • the output unit 19 may display a map showing the order of assignment and the position of the mobile object 14 waiting to be assigned a remote control task, which will be described later.
  • the input unit 20 may detect an operation input for moving the moving body 14 that is the operation target of the assigned remote control task, which will be described later.
  • the input unit 20 may be, for example, a combination of a steering wheel, an accelerator pedal, and a brake pedal, or a combination of a directional lever and a button.
  • the input unit 20 may generate input information corresponding to the detected operation input.
  • the input unit 20 may directly or indirectly provide an operation command based on the input information to the mobile object 14 via the information processing device 10 .
  • the information processing device 10 is configured to include a communication section (acquisition section) 21 and a control section 22.
  • the information processing device 10 may further include a storage unit 23.
  • the communication unit 21 may be controlled by the control unit 22 to communicate with the mobile body 14 via the network 13.
  • the communication unit 21 may obtain a request for an operation command from the mobile object 14.
  • the communication unit 21 acquires situation information from the mobile object 14 requesting an operation command.
  • the communication unit 21 may acquire surrounding information for the moving object 14 requesting the operation command from observation devices around the moving object 14 and other moving objects 14 around the moving object 14 .
  • The surrounding observation device is, for example, a roadside unit.
  • the observation device may be a device capable of transmitting surrounding information detected by a built-in outdoor camera or distance measuring sensor via communication.
  • the communication unit 21 may acquire surrounding information from the mobile object 14 to which the remote control task is assigned to the remote control unit 12.
  • the communication unit 21 may provide the surrounding information acquired from the mobile object 14 to the remote control unit 12 to which the remote control task of the mobile object 14 is assigned.
  • the communication unit 21 may provide the mobile object 14 to which the remote control task is assigned to the remote control unit 12 with the operation command acquired from the remote control unit 12 .
  • the storage unit 23 includes any storage device such as a RAM (Random Access Memory) and a ROM (Read Only Memory).
  • the storage unit 23 may store various programs that cause the control unit 22 to function and various information that the control unit 22 uses.
  • the control unit 22 may store in the storage unit 23 the situation information acquired from the mobile object 14 requesting the operation command.
  • the control unit 22 may add information indicating the acquisition time of the situation information to the situation information.
  • the control unit 22 may store a plurality of pieces of situation information acquired from the same mobile object 14 at different times in the storage unit 23 in association with each other.
  • The control unit 22 may add, to the situation information acquired from the moving object 14, information indicating that an operation command has been given.
  • the control section 22 may delete the situation information acquired from the moving object 14 from the storage section 23.
  • The control unit 22 can assign to the remote control unit 12 a remote control task for the mobile object 14 that requests an operation command.
  • The remote control task is a task in which the remote control unit 12 remotely controls the moving object 14 that requests an operation command.
  • When a plurality of moving objects 14 are waiting for assignment of a remote control task, the control unit 22 determines the order of assignment based on the situation information of each moving object 14. The assignment of remote control tasks is explained below.
  • the control unit 22 may calculate the priority of each of the plurality of moving bodies 14 waiting to be assigned a remote control task based on the situation information.
  • the control unit 22 may calculate the priority at the same time for each of the plurality of moving objects 14.
  • The "same point in time" need not be exactly identical; time points that fall within a predetermined, relatively short interval of one another may also be treated as the same point in time.
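The tolerance just described can be sketched in code. The concrete interval below is a hypothetical example (the disclosure only calls it a predetermined, relatively short interval), and the function name is an illustrative choice:

```python
from datetime import datetime, timedelta

# Hypothetical tolerance: the disclosure does not fix a concrete value;
# 500 ms is an illustrative choice.
SAME_TIME_INTERVAL = timedelta(milliseconds=500)

def same_point_in_time(t1: datetime, t2: datetime) -> bool:
    """Treat two priority-calculation time points as 'the same point in
    time' when they fall within the predetermined interval of each other."""
    return abs(t1 - t2) <= SAME_TIME_INTERVAL
```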
  • the control unit 22 may identify at least one priority factor for each mobile object 14 based on the situation information in order to calculate the priority.
  • A priority factor is an event, assumed in advance as a factor for determining the priority of remote control, occurring around or inside the moving object 14 that requests the operation command, that is, the moving object 14 that has stopped autonomous operation and is stationary.
  • the priority factor may be predetermined for the situation information.
  • the control unit 22 may identify the priority factor using any method.
  • the control unit 22 may identify the priority factor using, for example, a previously learned determination model.
  • Priority factors relating to the surrounding situation include, for example, a traffic jam, a person lying outside the vehicle, a lane departure, running onto a median strip, obstruction in exiting, obstruction in entering, contact with a railroad, stopping on a railroad, stopping at a taxi stand, stopping at a bus stop, stopping in front of a public utility vehicle, stopping in front of a hospital, and the like.
  • Lying outside the vehicle is a state in which a person located in the direction of movement of the moving object is lying on the road. "The moving object" here means the moving object 14 that requested the operation command.
  • A lane departure is a state in which another vehicle located in the moving direction of the moving object has departed from its lane.
  • Running onto a median strip is a state in which another vehicle located in the traveling direction of the moving object is riding on the median strip.
  • Stopping in front of a public utility vehicle is a state in which the moving object is stopped in front of a vehicle of high public utility, such as an ambulance, a fire engine, a mail truck, or a courier vehicle; in other words, the public utility vehicle is behind the stopped moving object in its traveling direction.
  • Stopping in front of a hospital is a state in which the moving object is stopped near the entrance of a hospital.
  • Priority factors relating to the occupant situation include, for example, lying down in the vehicle, injury, loss of consciousness, bleeding, intoxication, crying, screaming, disappearance, and the like.
  • Lying down in the vehicle is a state in which the occupant is lying sideways inside the moving object 14.
  • An injury is a condition in which the occupant is injured.
  • Each priority factor may have at least one of a first evaluation value and a second evaluation value.
  • the first evaluation value is a fixed value that does not depend on the passage of time.
  • the second evaluation value is a value that changes depending on the elapsed time, and is, for example, a coefficient multiplied by a power of the elapsed time.
  • the first evaluation value and the second evaluation value may be set to increase as the urgency of the remote control task increases.
  • The second evaluation value may be corrected according to a plurality of pieces of situation information acquired from the same moving object 14 at different times. For example, the second evaluation value determined for a traffic jam may be corrected based on the number of moving objects 14 included in the traffic jam in each of the plurality of pieces of situation information acquired at different times and on the time interval between those times. In other words, the second evaluation value determined for the traffic jam may be corrected based on the rate at which other moving objects 14 join the traffic jam that includes the moving object 14 transmitting the situation information.
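The two evaluation values described above lend themselves to a short sketch. The names and all numeric values are illustrative assumptions; the functional form of the second evaluation value (a coefficient multiplied by a power of the elapsed time) follows the description:

```python
from dataclasses import dataclass

@dataclass
class PriorityFactor:
    """One predetermined priority factor. The numeric fields are
    hypothetical; the disclosure does not give concrete values."""
    name: str
    first_value: float   # fixed first evaluation value (time-independent)
    second_coeff: float  # coefficient of the second evaluation value
    second_power: float  # power applied to the elapsed time

    def second_value(self, elapsed_s: float) -> float:
        # The second evaluation value is a coefficient multiplied by a
        # power of the time elapsed since the mobile object stopped.
        return self.second_coeff * elapsed_s ** self.second_power

def total_evaluation_values(factors, elapsed_s):
    """Sum the first and second evaluation values over all identified
    priority factors (corresponds to step S103 of the flowchart)."""
    first_total = sum(f.first_value for f in factors)
    second_total = sum(f.second_value(elapsed_s) for f in factors)
    return first_total, second_total
```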
  • the control unit 22 may create a target map showing the determined assignment order and the positions of the plurality of moving bodies 14 waiting to be assigned a remote control task.
  • the control unit 22 may control the communication unit 21 to transmit the created target map to the remote control unit 12 in operation.
  • In step S103, the control unit 22 calculates a first total evaluation value and a second total evaluation value based on all the priority factors identified in step S102. After the calculation, the process proceeds to step S104.
  • In step S104, the control unit 22 calculates the priority of the mobile object 14 selected in step S100 based on the first total evaluation value and the second total evaluation value calculated in step S103. After the calculation, the process proceeds to step S105.
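The order determination steps above can be summarized in a hedged sketch. Combining the two totals with a plain sum is a hypothetical stand-in for the evaluation map of FIG. 6, and the data shapes are illustrative assumptions:

```python
def determine_assignment_order(waiting, now_s):
    """Decide the assignment order for mobile objects awaiting a remote
    control task (a simplified sketch of steps S100-S105).

    `waiting` maps a mobile-object id to (factors, stop_time_s); each
    factor is a (first_value, second_coeff, second_power) triple from a
    predetermined priority-factor table. All priorities are computed at
    the same time `now_s`. Adding the two totals together stands in for
    the evaluation map of FIG. 6 and is a hypothetical simplification.
    """
    priorities = {}
    for mobile_id, (factors, stop_time_s) in waiting.items():
        elapsed = now_s - stop_time_s
        first_total = sum(fv for fv, _, _ in factors)                # step S103
        second_total = sum(c * elapsed ** p for _, c, p in factors)
        priorities[mobile_id] = first_total + second_total           # step S104
    # Higher priority is assigned earlier.
    return sorted(priorities, key=priorities.get, reverse=True)
```

Note how a mild factor can overtake a severe one once enough time has elapsed, which is exactly the behavior the second evaluation value is meant to capture.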
  • The information processing device 10 of this embodiment calculates the priorities of the plurality of moving objects 14 at the same point in time and determines the assignment order based on those priorities.
  • The urgency of performing remote control may change as time elapses, so priorities calculated at different times for different moving objects 14 may deviate from the true urgency.
  • By reducing this deviation from the true urgency, the information processing device 10 configured as described above can further improve the smoothness of traffic and the safety of people around and inside the moving objects 14 during remote support.
  • The information processing device includes an acquisition unit that acquires, from a mobile object requesting an operation command, situation information regarding the situation of that mobile object, and a control unit capable of assigning a remote control task for the mobile object to a remote control unit capable of remotely controlling the mobile object.
  • When a plurality of the mobile objects are waiting for assignment of the remote control task, the control unit determines the order of assignment based on the situation information of each of the plurality of mobile objects.
  • Priority factors for calculating the priority are determined in advance for the situation information, each having at least one of a fixed first evaluation value and a second evaluation value that changes according to the elapsed time.
  • The control unit identifies the priority factors based on the situation information acquired from the mobile object requesting the operation command, and calculates the priority based on the first evaluation value and the second evaluation value of each identified priority factor.
  • the situation information includes surrounding information indicating the surrounding situation of the mobile object.
  • the acquisition unit acquires the surrounding information from an observation device around the moving object that requests the operation command and from other moving objects around the moving object.
  • the situation information includes occupant information indicating the occupant situation of the mobile object.
  • Embodiments according to the present disclosure are not limited to the specific configurations of the embodiments described above. Embodiments of the present disclosure can extend to any novel features, or combinations thereof, described in this disclosure, or to any novel methods, process steps, or combinations thereof described herein.
  • Descriptions such as "first" and "second" in this disclosure are identifiers for distinguishing between configurations. For configurations distinguished by such descriptions, the numbers can be exchanged; for example, a first mobile object can exchange the identifiers "first" and "second" with a second mobile object. The exchange of identifiers takes place simultaneously, and the configurations remain distinguished after the exchange.
  • Identifiers may be removed, and configurations from which identifiers have been removed are distinguished by reference signs. The descriptions of identifiers such as "first" and "second" in this disclosure should not be used to interpret the order of the configurations or as grounds for assuming that an identifier with a lower number exists.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Operations Research (AREA)
  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

This information processing device includes an acquisition unit and a control unit. The acquisition unit acquires, from a moving body requesting an operation instruction, status information about that moving body. The control unit can assign a remote control task for the moving body to a remote control unit. When a plurality of moving bodies are waiting to be assigned a remote control task, the control unit determines the order of assignment on the basis of the status information of each of the plurality of moving bodies.

Description

Information processing device

Cross-reference to related applications
This application claims priority from Japanese Patent Application No. 2022-95171, filed in Japan on June 13, 2022, the entire disclosure of which is incorporated herein by reference.
The present invention relates to an information processing device.
The development of autonomous driving technology for moving objects such as vehicles is progressing. In the operation of autonomous driving, remote support by an operator at a remote location is desired from a safety standpoint. An operator may be required to remotely support multiple moving objects. An information processing device has therefore been proposed that reduces the operator's burden of remote support (see Patent Document 1).
Patent Document 1: JP 2020-21453 A
The information processing device according to a first aspect includes:
an acquisition unit that acquires, from a mobile object requesting an operation command, situation information regarding the situation of that mobile object; and
a control unit capable of assigning a remote control task for the mobile object to a remote control unit capable of remotely controlling the mobile object,
wherein, when a plurality of the mobile objects are waiting for assignment of the remote control task, the control unit determines the order of assignment based on the situation information of each of the plurality of mobile objects.
FIG. 1 is a diagram illustrating a configuration example of a remote support system including an information processing device according to an embodiment. FIG. 2 is a block diagram showing a schematic configuration of the mobile object in FIG. 1. FIG. 3 is a block diagram showing a schematic configuration of the remote control unit in FIG. 1. FIG. 4 is a block diagram showing a schematic configuration of the information processing device in FIG. 1. FIG. 5 is a table of the first evaluation value and the second evaluation value for each priority factor. FIG. 6 is an evaluation map for calculating a priority from the first total evaluation value and the second total evaluation value. FIG. 7 is a flowchart for explaining the order determination processing executed by the control unit in FIG. 4.
Hereinafter, embodiments of an information processing device to which the present disclosure is applied will be described with reference to the drawings.
As shown in FIG. 1, a remote support system 11 including an information processing device 10 according to an embodiment may be configured to include a remote operation unit 12 and the information processing device 10. The remote support system 11 may be able to communicate with a mobile object 14 via a network 13. The remote support system 11 may communicate with the mobile object 14 via wireless communication.
The mobile object 14 is capable of autonomous driving. The mobile object 14 can perform remote driving based on operation commands acquired from the remote support system 11. Furthermore, the mobile object 14 may be capable of being driven manually by an occupant of the mobile object 14.
The moving object 14 may include, for example, a vehicle, a ship, an aircraft, and the like. Vehicles may include, for example, automobiles, industrial vehicles, railroad vehicles, vehicles for daily living, fixed-wing aircraft that travel on runways, and the like. Automobiles may include, for example, passenger cars, trucks, buses, motorcycles, trolleybuses, and the like. Industrial vehicles may include, for example, industrial vehicles for agriculture and construction. Industrial vehicles may include, for example, forklifts, golf carts, and the like. Industrial vehicles for agriculture may include, for example, tractors, cultivators, transplanters, binders, combines, lawn mowers, and the like. Industrial vehicles for construction may include, for example, bulldozers, scrapers, excavators, crane trucks, dump trucks, road rollers, and the like. Vehicles may include those powered by human power. The classification of vehicles is not limited to the examples described above. For example, automobiles may include industrial vehicles capable of traveling on roads. The same vehicle may be included in multiple classifications. Ships may include, for example, personal watercraft, boats, tankers, and the like. Aircraft may include, for example, fixed-wing aircraft, rotary-wing aircraft, and the like.
As shown in FIG. 2, the moving object 14 may be configured to include a sensor unit 15, a communication unit 16, an operation control system 17, and a control unit 18.
The sensor unit 15 may periodically detect the situation of the moving object 14. The detected situation of the moving object 14 may include the surrounding situation of the moving object 14. As sensors for detecting the surrounding situation, the sensor unit 15 may include, for example, an outdoor camera that captures images of the surroundings of the moving object 14 and a distance measuring sensor that measures the distance to objects around the moving object 14. The outdoor camera may generate images as surrounding information indicating the surrounding situation. The distance measuring sensor may detect, as surrounding information, the distance to objects existing around the moving object 14. The distance measuring sensor may generate, as surrounding information, a distance distribution including distance values of objects in each direction from the sensor. The detected situation of the moving object 14 may include the occupant situation of the moving object 14. As sensors for detecting the occupant situation, the sensor unit 15 may include, for example, an indoor camera that images the interior of the moving object 14, a microphone, and a vital sensor. The indoor camera may generate images as occupant information indicating the occupant situation. The microphone may generate the occupant's voice as occupant information indicating the occupant situation. The vital sensor may generate the occupant's sickness state as occupant information indicating the occupant situation. The sensor unit 15 may include a position detection sensor that detects the position of the moving object 14 itself.
The communication unit 16 may be controlled by the control unit 18 to perform wireless communication with the information processing device 10 via the network 13. The communication unit 16 may perform wireless communication with the information processing device 10 further via another moving object 14, another device such as a roadside unit, or another information processing device. The communication unit 16 may transmit a request for an operation command to the information processing device 10, as described later. The communication unit 16 may transmit to the information processing device 10 situation information regarding the situation of the moving object 14 detected by the sensor unit 15, as described later. The communication unit 16 may transmit to the information processing device 10 surrounding information indicating the surrounding situation detected by the sensor unit 15, as described later. The communication unit 16 may receive an operation command from the information processing device 10. The communication unit 16 may transmit information indicating the position of its own moving object 14 to the information processing device 10.
The operation control system 17 may drive the moving object 14 by controlling a drive unit such as an engine or a motor, a direction adjustment unit such as the steered wheels, and the brakes of the moving object 14. The operation control system 17 may determine whether autonomous driving is possible based on the information indicating the surrounding situation detected by the sensor unit 15. The operation control system 17 may determine whether autonomous driving is possible, for example, by comparing an evaluation value, such as a safety value calculated for autonomous driving based only on the information indicating the surrounding situation, with a threshold value.
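The threshold comparison described above can be illustrated with a minimal sketch; the scoring scale, threshold value, and return strings are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical threshold on a 0.0-1.0 safety scale; the disclosure does
# not specify a concrete scale or value.
SAFETY_THRESHOLD = 0.7

def autonomy_decision(safety_score: float) -> str:
    """Compare a safety evaluation value, computed from the surrounding
    information alone, against a threshold, as the operation control
    system 17 is described as doing."""
    if safety_score >= SAFETY_THRESHOLD:
        return "continue autonomous driving"
    # Below the threshold, autonomous driving is judged impossible and an
    # operation command is requested from the information processing device.
    return "request operation command"
```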
 If the driving control system 17 determines that autonomous driving is possible, it may perform autonomous driving based on the surrounding situation. If the driving control system 17 determines that autonomous driving is impossible due to the surrounding situation of the mobile object 14, it may notify the control unit 18 that autonomous driving is not possible. The driving control system 17 may perform remote driving based on an operation command described later. The driving control system 17 may perform manual driving based on an occupant's operations on the accelerator pedal, brake pedal, steering wheel, and the like of the mobile object 14.
 The control unit 18 includes one or more processors and a memory. The processors may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing. The dedicated processor may include an application-specific integrated circuit (ASIC). The processor may include a programmable logic device (PLD). The PLD may include a field-programmable gate array (FPGA). The control unit 18 may be either an SoC (System on a Chip) or an SiP (System in a Package) in which one or more processors cooperate.
 The control unit 18 may control the operation of the mobile object 14 based on remote control. For example, when receiving a notification from the driving control system 17 that autonomous driving is not possible, the control unit 18 may control the communication unit 16 to transmit a request for an operation command to the information processing device 10. From the time the operation command is requested until a remote control task corresponding to that request is assigned to a remote control unit 12, the control unit 18 may generate situation information including at least one of surrounding information and occupant information. The control unit 18 may add to the situation information, as a stop time point, information indicating the time at which the notification that autonomous driving is not possible was acquired from the driving control system 17. The control unit 18 may control the communication unit 16 to transmit the generated situation information to the information processing device 10. From the time the remote control task is assigned in response to the request until autonomous driving resumes, the control unit 18 may control the communication unit 16 to transmit surrounding information to the information processing device 10. When receiving an operation command from the information processing device 10, the control unit 18 may transfer the operation command to the driving control system 17 to execute remote driving.
 As shown in FIG. 3, the remote control unit 12 may include an output unit 19 and an input unit 20. The remote control unit 12 enables its operator to recognize the surrounding situation of the mobile object 14 through the output unit 19, and the input unit 20 detects the operator's operation input as input information, whereby the mobile object 14 can be operated remotely.
 The output unit 19 may output surrounding information of the mobile object 14 that is the operation target of an assigned remote control task, described later. The output unit 19 is, for example, a display, and may display images or object distance distributions provided as surrounding information upon assignment of a remote control task. The output unit 19 may display a map showing the assignment order, described later, and the positions of the mobile objects 14 waiting for assignment of remote control tasks.
 The input unit 20 may detect operation inputs for moving the mobile object 14 that is the operation target of an assigned remote control task, described later. The input unit 20 may be, for example, a combination of a steering wheel, an accelerator pedal, and a brake pedal, or a combination of a direction lever and buttons. The input unit 20 may generate input information corresponding to the detected operation input. The input unit 20 may provide an operation command based on the input information to the mobile object 14, either indirectly via the information processing device 10 or directly.
 As shown in FIG. 4, the information processing device 10 includes a communication unit (acquisition unit) 21 and a control unit 22. The information processing device 10 may further include a storage unit 23.
 The communication unit 21 may be controlled by the control unit 22 to communicate with the mobile objects 14 via the network 13. The communication unit 21 may acquire a request for an operation command from a mobile object 14. The communication unit 21 acquires situation information from the mobile object 14 requesting the operation command. The communication unit 21 may acquire surrounding information for the mobile object 14 requesting the operation command from observation devices around that mobile object 14 and from other mobile objects 14 in its vicinity. A surrounding observation device is, for example, a roadside unit. An observation device may be any device capable of transmitting, via communication, surrounding information detected by a built-in outdoor camera or ranging sensor. The communication unit 21 may acquire surrounding information from a mobile object 14 whose remote control task has been assigned to a remote control unit 12. The communication unit 21 may provide the surrounding information acquired from that mobile object 14 to the remote control unit 12 to which its remote control task is assigned. The communication unit 21 may provide a mobile object 14 whose remote control task is assigned to a remote control unit 12 with the operation command acquired from that remote control unit 12.
 The storage unit 23 includes any storage device, such as a RAM (Random Access Memory) and a ROM (Read Only Memory). The storage unit 23 may store various programs that cause the control unit 22 to function and various information used by the control unit 22.
 The control unit 22 includes one or more processors and a memory. The processors may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing. The dedicated processor may include an ASIC. The processor may include a PLD. The PLD may include an FPGA. The control unit 22 may be either an SoC or an SiP in which one or more processors cooperate.
 The control unit 22 may store, in the storage unit 23, the situation information acquired from a mobile object 14 requesting an operation command. The control unit 22 may add information indicating the acquisition time to the situation information. The control unit 22 may store multiple pieces of situation information acquired from the same mobile object 14 at different times in the storage unit 23 in association with one another. When providing an operation command to a mobile object 14, the control unit 22 may add, to the situation information acquired from that mobile object 14, information indicating that the operation command has been provided. Alternatively, when providing an operation command to a mobile object 14, the control unit 22 may delete the situation information acquired from that mobile object 14 from the storage unit 23.
 The control unit 22 can assign, to a remote control unit 12, a remote control task for a mobile object 14 requesting an operation command. A remote control task is a task in which a remote control unit 12 remotely operates the mobile object 14 requesting the operation command. When multiple mobile objects 14 are waiting for assignment of remote control tasks, the control unit 22 determines the assignment order based on the situation information of each of those mobile objects 14. Assignment of remote control tasks is described below.
 The control unit 22 may calculate, based on the situation information, a priority for each of the multiple mobile objects 14 waiting for assignment of a remote control task. The control unit 22 may calculate the priorities of the multiple mobile objects 14 as of the same time point. The same time point need not be strictly identical; it may be a set of time points that fall within a relatively short predetermined time interval.
 To calculate the priority, the control unit 22 may identify, based on the situation information, at least one priority factor for each mobile object 14. A priority factor may be an event assumed in advance as a factor for judging the priority of remote control around and inside a mobile object 14 requesting an operation command, that is, a mobile object 14 that has stopped autonomous driving and is stationary. The priority factors may be predetermined with respect to the situation information.
 The control unit 22 may identify the priority factors by any method. For example, the control unit 22 may identify the priority factors using a previously trained determination model.
 Priority factors relating to the surrounding situation include, for example: traffic congestion, a person lying outside the vehicle, lane departure, riding onto a median strip, obstruction of exit from parking, obstruction of entry into parking, railway contact, railway stoppage, stopping at a taxi stand, stopping at a transit stop, stopping in front of a public-service vehicle, and stopping in front of a hospital. A person lying outside the vehicle is a state in which a person located in the traveling direction of the mobile object is lying on the road; here, the mobile object means the mobile object 14 that requested the operation command. Lane departure is a state in which another vehicle located in the traveling direction of the mobile object has departed from its lane. Riding onto a median strip is a state in which another vehicle located in the traveling direction of the mobile object has run onto the median strip. Obstruction of exit from parking is a state in which the stop position of the mobile object overlaps the exit path of another parked vehicle, preventing that vehicle from leaving. Obstruction of entry into parking is a state in which the stop position of the mobile object overlaps the entry path to a parking space, preventing entry. Railway contact is a state in which another vehicle located in the traveling direction of the mobile object is in contact with a train. Railway stoppage is a state in which a train is stopped in the traveling direction of the mobile object. Stopping at a taxi stand is a state in which the mobile object is stopped at a taxi stand. Stopping at a transit stop is a state in which the mobile object is stopped at a bus or streetcar stop. Stopping in front of a public-service vehicle is a state in which the mobile object is stopped in front of a vehicle of high public utility, such as an ambulance, fire engine, mail vehicle, or delivery vehicle; in other words, a state in which a public-service vehicle is present behind the stopped mobile object in its traveling direction. Stopping in front of a hospital is a state in which the mobile object is stopped near the entrance of a hospital. Priority factors relating to the occupant situation include, for example: lying down in the vehicle, injury, loss of consciousness, bleeding, intoxication, crying, screaming, and ejection. Lying down in the vehicle is a state in which an occupant is lying sideways inside the mobile object 14. Injury is a state in which an occupant is injured. Loss of consciousness is a state in which an occupant appears to have lost consciousness. Bleeding is a state in which an occupant is bleeding. Intoxication is a state in which an occupant is intoxicated. Crying is a state in which an occupant is crying. Screaming is a state in which an occupant is screaming. Ejection is a state in which an occupant has been thrown from inside the mobile object 14 to the outside.
 Each priority factor may have at least one of a first evaluation value and a second evaluation value. The first evaluation value is a fixed value that does not depend on the passage of time. The second evaluation value is a value that changes with elapsed time, for example a coefficient multiplied by a power of the elapsed time. The first and second evaluation values may be set so as to increase as the urgency of the remote control task increases. The second evaluation value may be corrected according to multiple pieces of situation information acquired from the same mobile object 14 at different times. For example, the second evaluation value defined for traffic congestion may be corrected based on the number of mobile objects 14 involved in the congestion in each of multiple pieces of situation information acquired at different times and on the time interval between those times. In other words, the second evaluation value defined for traffic congestion may be corrected based on the rate at which other mobile objects 14 are joining the congestion that includes the mobile object 14 transmitting the situation information.
 As shown in FIG. 5, a first evaluation value and a second evaluation value may be defined for each priority factor. Each priority factor may also be subdivided. For example, priority factors such as injury, loss of consciousness, and bleeding may be subdivided according to severity, with the first and second evaluation values defined so as to increase with severity.
 The control unit 22 may calculate the priority of each mobile object 14 based on the first and second evaluation values of the priority factors identified from the situation information. A specific example of priority calculation by the control unit 22 is described below.
 To calculate the priority, the control unit 22 may calculate a first total evaluation value based on the priority factors of the surrounding information and a second total evaluation value based on the priority factors of the occupant information. The first total evaluation value is the sum of the first evaluation values of all priority factors identified from the surrounding information. The second total evaluation value may be the sum of the second evaluation values of all identified priority factors, each evaluated according to the time elapsed from the stop time point to an arbitrarily determined time point. For example, if the second evaluation value of a given priority factor is a slope representing the increase per unit time, then elapsed time multiplied by that slope is evaluated as the second evaluation value at the arbitrarily determined time point. The arbitrarily determined time point is the same time point, described above, at which the priorities are calculated. The arbitrarily determined time point may be, for example, the time point at which the priority factors are identified for each mobile object 14, or a time point after the priority factors have been identified for all mobile objects 14 waiting for assignment of remote control tasks, in other words, a future time point after identification.
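The computation above can be sketched as follows. The factor names, fixed values, and per-minute slopes are illustrative assumptions; only the structure (fixed first values summed directly, second values evaluated as slope × elapsed time and then summed) follows the text.

```python
# Hypothetical sketch of the first/second total evaluation values described
# above. Factor names, fixed values, and slopes are illustrative assumptions.

# Each priority factor maps to (first evaluation value, second evaluation
# value slope per minute elapsed since the stop time point).
PRIORITY_FACTORS = {
    "traffic_congestion": (3.0, 0.5),
    "lying_outside_vehicle": (8.0, 1.0),
    "loss_of_consciousness": (9.0, 1.5),
}

def total_evaluation_values(identified_factors, elapsed_minutes):
    """Sum fixed first values; evaluate second values as slope * elapsed time."""
    first_total = sum(PRIORITY_FACTORS[f][0] for f in identified_factors)
    second_total = sum(PRIORITY_FACTORS[f][1] * elapsed_minutes
                       for f in identified_factors)
    return first_total, second_total

e1, e2 = total_evaluation_values(["traffic_congestion", "lying_outside_vehicle"], 10)
print(e1, e2)  # 3.0 + 8.0 = 11.0 and (0.5 + 1.0) * 10 = 15.0
```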
 The control unit 22 may calculate the priority based on the first and second total evaluation values. For example, as shown in FIG. 6, an evaluation map with the first total evaluation value on the horizontal axis and the second total evaluation value on the vertical axis may be used to calculate the priority. In the evaluation map, the position where both total evaluation values are zero is defined as the origin. The map is divided into regions along diagonal lines passing through the origin, and a priority is defined for each divided region. The priorities are defined so that the region farthest from the origin has the highest priority and the priority decreases toward the origin. For example, priorities p1 to pn are defined for n regions ranging from the region farthest from the origin to the region nearest it, decreasing in order from the maximum value p1 to the minimum value pn.
 The control unit 22 determines, for each mobile object 14, the region containing the position given by the calculated first and second total evaluation values. The control unit 22 may take the priority defined for the determined region as the priority of that mobile object 14.
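A sketch of this region lookup follows. Since FIG. 6 divides the plane along diagonal lines through the origin, bands of constant E1 + E2 are used here as a proxy for distance from the origin; this proxy, and the band boundaries, are illustrative assumptions rather than the disclosed map.

```python
# Hypothetical sketch of the FIG. 6 evaluation map lookup: the (first total,
# second total) plane is divided into bands, with priority p1 (highest)
# farthest from the origin and the lowest priority nearest it.

BAND_EDGES = [40.0, 25.0, 10.0]  # assumed band boundaries, farthest first

def priority_from_map(first_total: float, second_total: float) -> str:
    """Return 'p1'..'p4' for the band containing the point (E1, E2)."""
    distance = first_total + second_total  # proxy for distance from origin
    for i, edge in enumerate(BAND_EDGES):
        if distance >= edge:
            return f"p{i + 1}"
    return f"p{len(BAND_EDGES) + 1}"  # nearest the origin: lowest priority

print(priority_from_map(30.0, 20.0))  # far from the origin -> highest band
print(priority_from_map(2.0, 1.0))    # near the origin -> lowest band
```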
 Based on the calculated priorities, the control unit 22 determines the assignment order of the mobile objects 14 waiting for assignment of remote control tasks. Specifically, the control unit 22 may determine the assignment order in descending order of calculated priority. When multiple mobile objects 14 have the same priority, the control unit 22 may move up, in the assignment order, the mobile object 14 for which more time has elapsed since its situation information was first acquired.
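The ordering rule above can be sketched directly; the vehicle identifiers and timestamps are illustrative.

```python
# Hypothetical sketch of the assignment-order rule: sort waiting mobile
# objects by descending priority, breaking ties in favor of the object whose
# situation information was first acquired longest ago.

waiting = [
    # (vehicle id, priority: larger = more urgent, first acquisition time in s)
    ("car_a", 2, 100.0),
    ("car_b", 3, 400.0),
    ("car_c", 2, 50.0),   # same priority as car_a, but waiting longer
]

def assignment_order(vehicles):
    # Higher priority first; among equals, earlier first-acquisition first.
    return [v[0] for v in sorted(vehicles, key=lambda v: (-v[1], v[2]))]

print(assignment_order(waiting))  # ['car_b', 'car_c', 'car_a']
```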
 The control unit 22 may assign remote control tasks to the remote control units 12 in the determined assignment order. The control unit 22 may determine the assignment order periodically, or each time any remote control unit 12 becomes available to execute a remote control task. In a configuration in which the assignment order is determined periodically, the control unit 22 may, until the next determination, assign remote control tasks to available remote control units 12 according to the most recently determined assignment order.
 The control unit 22 may create a target map showing the determined assignment order and the positions of the multiple mobile objects 14 waiting for assignment of remote control tasks. The control unit 22 may control the communication unit 21 to transmit the created target map to the remote control units 12 in operation.
 For certain of the priority factors described above, an operation to be executed by the control unit 22 may be predetermined. For example, contacting the fire department is defined for a person lying outside the vehicle; contacting the police or a similar authority is defined for riding onto a median strip; contacting the fire department is defined for loss of consciousness; and contacting the police or a similar authority is defined for ejection of an occupant. When an operation to be executed is defined for an identified priority factor, the control unit 22 may provide the situation information that led to identifying that factor, together with the priority factor and the position information of the mobile object 14 that issued the situation information, to a specific contact. The specific contact is the fire department, police, or other party defined as the contact for that priority factor. For example, if a mobile object 14 has transmitted situation information including a person lying outside the vehicle, the control unit 22 may generate contact information including the position information and situation information of that mobile object and transmit it to the fire department.
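A sketch of this predefined notification rule follows. The mapping keys and contact names mirror the examples in the text but are otherwise illustrative assumptions.

```python
# Hypothetical sketch of the predefined notification rule: certain priority
# factors map to a contact, and identifying such a factor triggers generation
# of a notification containing the factor, position, and situation information.

NOTIFICATION_RULES = {
    "lying_outside_vehicle": "fire_department",
    "median_strip_riding": "police",
    "loss_of_consciousness": "fire_department",
    "occupant_ejection": "police",
}

def build_notifications(identified_factors, position, situation_info):
    """Return (contact, message) pairs for factors with a predefined action."""
    notifications = []
    for factor in identified_factors:
        contact = NOTIFICATION_RULES.get(factor)
        if contact is not None:  # factors without a rule trigger nothing
            notifications.append((contact, {
                "factor": factor,
                "position": position,
                "situation": situation_info,
            }))
    return notifications

msgs = build_notifications(["traffic_congestion", "lying_outside_vehicle"],
                           position=(35.68, 139.76), situation_info="...")
print([contact for contact, _ in msgs])  # ['fire_department']
```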
 Next, the order determination process executed by the control unit 22 of the information processing device 10 in this embodiment is described with reference to the flowchart of FIG. 7. The order determination process starts, for example, periodically.
 In step S100, the control unit 22 selects a mobile object 14 that is waiting for assignment of a remote control task and whose priority has not yet been calculated. After the selection, the process proceeds to step S101.
 In step S101, the control unit 22 reads, from the storage unit 23, the situation information acquired from the mobile object 14 selected in step S100. After the reading, the process proceeds to step S102.
 In step S102, the control unit 22 identifies the priority factors based on the situation information read in step S101. After the identification, the process proceeds to step S103.
 In step S103, the control unit 22 calculates the first and second total evaluation values based on all priority factors identified in step S102. After the calculation, the process proceeds to step S104.
 In step S104, the control unit 22 calculates the priority of the mobile object 14 selected in step S100 based on the first and second total evaluation values calculated in step S103. After the calculation, the process proceeds to step S105.
 In step S105, the control unit 22 determines whether the priorities of all mobile objects 14 waiting for assignment of remote control tasks have been calculated. If the priorities of all mobile objects 14 have not been calculated, the process returns to step S100. If the priorities of all mobile objects 14 have been calculated, the process proceeds to step S106.
 In step S106, the control unit 22 determines the assignment order of remote control tasks according to the descending order of the calculated priorities of all mobile objects 14. After the determination, the order determination process ends.
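The flow of steps S100 to S106 can be sketched as a single loop followed by a sort. The `priority_of` helper stands in for steps S101 to S104 and is an illustrative assumption, not the disclosed calculation.

```python
# Hypothetical sketch of the FIG. 7 order determination process: compute a
# priority for every waiting mobile object (S100-S105), then determine the
# assignment order by descending priority (S106).

def priority_of(situation_info: dict) -> float:
    """Placeholder for S101-S104: read situation information, identify
    priority factors, compute total evaluation values, derive a priority."""
    return float(len(situation_info.get("factors", [])))

def order_determination(waiting: dict) -> list:
    """waiting maps vehicle id -> situation info; returns the assignment order."""
    priorities = {}
    for vehicle_id, info in waiting.items():        # S100 / S105 loop
        priorities[vehicle_id] = priority_of(info)  # S101-S104
    # S106: descending priority determines the assignment order.
    return sorted(priorities, key=priorities.get, reverse=True)

order = order_determination({
    "car_a": {"factors": ["traffic_congestion"]},
    "car_b": {"factors": ["lying_outside_vehicle", "loss_of_consciousness"]},
})
print(order)  # ['car_b', 'car_a']
```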
 When multiple mobile objects 14 are waiting for assignment of remote control tasks, the information processing device 10 of this embodiment, configured as described above, determines the assignment order based on the situation information of each mobile object 14. When autonomous driving becomes impossible for multiple mobile objects 14 capable of autonomous driving, the urgency of remote operation generally differs depending on the surrounding traffic conditions, the cause of the stop, and the situation of the occupants inside each mobile object 14. For such cases, the information processing device 10 with the above configuration can judge the urgency of remote operation from the situation information and determine the assignment order of remote control tasks accordingly, and can therefore improve the smoothness of traffic, as well as the safety of people around and inside the mobile objects 14, during remote support.
 Furthermore, in the information processing device 10 of this embodiment, priority factors for calculating the priority are predetermined with respect to the situation information, each having at least one of a fixed first evaluation value and a second evaluation value that changes with elapsed time; the control unit 22 identifies the priority factors based on the situation information acquired from the mobile object 14 requesting the operation command, and calculates the priority based on the first and second evaluation values of those factors. The urgency of remote operation for a mobile object 14 waiting for assignment of a remote control task can change with the time elapsed since autonomous driving stopped. For such cases, the information processing device 10 with the above configuration can determine the assignment order of remote control tasks in a way that tracks this change in urgency over time. The information processing device 10 can therefore further improve the smoothness of traffic, as well as the safety of people around and inside the mobile objects 14, during remote support.
 Furthermore, the information processing device 10 of this embodiment calculates the priorities of the multiple mobile objects 14 as of the same time point and determines the assignment order based on those priorities. As described above, the urgency of remote operation can change with elapsed time; priorities calculated at different time points for different mobile objects 14 can therefore deviate from the true urgency. For such cases, the information processing device 10 with the above configuration reduces this deviation from the true urgency, and can therefore still further improve the smoothness of traffic, as well as the safety of people around and inside the mobile objects 14, during remote support.
 Further, the information processing device 10 of the present embodiment acquires the surrounding information included in the situation information from observation devices around the mobile object 14 requesting an operation command and from other mobile objects 14 around that mobile object. With this configuration, the information processing device 10 can grasp the surroundings of the requesting mobile object 14 based on surrounding information acquired from multiple directions. The information processing device 10 can therefore grasp the surroundings more accurately than a configuration that acquires surrounding information only from the mobile object 14 itself, and can further improve the smoothness of traffic and the safety of people around and inside the mobile objects 14 during remote support.
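Combining surrounding information from multiple viewpoints, as described above, can be sketched as a simple union of obstacle reports. The source names and obstacle identifiers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    source: str           # e.g. "own", "roadside_camera", "nearby_vehicle"
    obstacle_ids: set     # obstacles detected by this source

def merge_surroundings(observations):
    """Union the obstacle reports from every available viewpoint.

    A single vehicle's sensors may miss occluded obstacles; combining
    reports from roadside observation devices and other nearby vehicles
    yields a more complete picture of the requesting vehicle's surroundings.
    """
    merged = set()
    for obs in observations:
        merged |= obs.obstacle_ids
    return merged

reports = [
    Observation("own", {"pedestrian_1"}),
    Observation("roadside_camera", {"pedestrian_1", "cyclist_2"}),
    Observation("nearby_vehicle", {"truck_3"}),
]
merged = merge_surroundings(reports)  # includes obstacles the vehicle itself missed
```

Here the requesting vehicle alone sees only one obstacle, while the merged view also contains a cyclist and a truck reported by the other sources — the multi-direction advantage the embodiment describes.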
 In one embodiment, (1) an information processing device comprises:
 an acquisition unit that acquires, from a mobile object requesting an operation command, situation information regarding the situation of that mobile object; and
 a control unit capable of assigning a remote operation task for the mobile object to a remote operation unit capable of remotely operating the mobile object,
 wherein, when a plurality of the mobile objects are waiting for assignment of the remote operation task, the control unit determines an assignment order based on the situation information of each of the plurality of mobile objects.
 (2) In the information processing device of (1) above, the control unit calculates a priority of the mobile object based on the situation information, and determines the assignment order based on the priority.
 (3) In the information processing device of (2) above, priority factors for calculating the priority are defined in advance for the situation information, each having at least one of a fixed first evaluation value and a second evaluation value that changes with elapsed time, and the control unit identifies the priority factors based on the situation information acquired from the mobile object requesting the operation command, and calculates the priority based on the first evaluation value and the second evaluation value of the priority factors.
 (4) In the information processing device of (2) or (3) above, the control unit calculates the priorities at the same point in time.
 (5) In the information processing device of any one of (1) to (4) above, the situation information includes surrounding information indicating the surroundings of the mobile object.
 (6) In the information processing device of (5) above, the acquisition unit acquires the surrounding information from an observation device around the mobile object requesting the operation command and from other mobile objects around that mobile object.
 (7) In the information processing device of any one of (1) to (6) above, the situation information includes occupant information indicating an occupant situation of the mobile object.
 Although embodiments of the information processing device 10 have been described above, embodiments of the present disclosure may also take the form of a method or program for implementing the device, or of a storage medium on which the program is recorded (for example, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a CD-RW, a magnetic tape, a hard disk, or a memory card).
 The implementation of the program is not limited to an application program such as object code compiled by a compiler or program code executed by an interpreter; it may also take the form of, for example, a program module incorporated into an operating system. Further, the program may or may not be configured such that all processing is performed solely by the CPU on a control board. Part or all of the program may be executed by another processing unit mounted on an expansion board or expansion unit attached to the board, as necessary.
 The figures illustrating the embodiments of the present disclosure are schematic. The dimensional ratios and the like in the figures do not necessarily match reality.
 Although the embodiments of the present disclosure have been described with reference to the drawings and examples, it should be noted that a person skilled in the art can make various variations or modifications based on the present disclosure, and such variations or modifications are therefore included within the scope of the present disclosure. For example, the functions included in each component can be rearranged in any logically consistent way, and a plurality of components can be combined into one or divided.
 All of the features described in the present disclosure, and/or all of the steps of any method or process disclosed, may be combined in any combination, except combinations in which these features are mutually exclusive. Unless explicitly contradicted, each feature described in the present disclosure can be replaced by an alternative feature serving the same, an equivalent, or a similar purpose. Therefore, unless explicitly contradicted, each disclosed feature is merely one example of a generic series of identical or equivalent features.
 Furthermore, the embodiments according to the present disclosure are not limited to any of the specific configurations of the embodiments described above. The embodiments according to the present disclosure extend to all novel features described in the present disclosure, or combinations thereof, and to all novel methods or process steps described, or combinations thereof.
 In the present disclosure, descriptions such as "first" and "second" are identifiers for distinguishing the corresponding configurations. Configurations distinguished by "first", "second", and the like in the present disclosure may have their numbers exchanged; for example, the first mobile object can exchange the identifiers "first" and "second" with the second mobile object. The exchange of identifiers is performed simultaneously, and the configurations remain distinguished after the exchange. Identifiers may be deleted; configurations whose identifiers have been deleted are distinguished by reference signs. The mere use of identifiers such as "first" and "second" in the present disclosure shall not be used to interpret the order of the configurations or as grounds for the existence of an identifier with a smaller number.
 10 information processing device
 11 remote support system
 12 remote operation unit
 13 network
 14 mobile object
 15 sensor unit
 16 communication unit
 17 driving control system
 18 control unit
 19 output unit
 20 input unit
 21 communication unit (acquisition unit)
 22 control unit
 23 storage unit

Claims (7)

  1.  An information processing device comprising:
     an acquisition unit configured to acquire, from a mobile object requesting an operation command, situation information regarding the situation of the mobile object; and
     a control unit capable of assigning a remote operation task for the mobile object to a remote operation unit capable of remotely operating the mobile object,
     wherein, when a plurality of the mobile objects are waiting for assignment of the remote operation task, the control unit determines an assignment order based on the situation information of each of the plurality of mobile objects.
  2.  The information processing device according to claim 1, wherein
     the control unit calculates a priority of the mobile object based on the situation information, and determines the assignment order based on the priority.
  3.  The information processing device according to claim 2, wherein
     priority factors for calculating the priority are defined in advance for the situation information, each having at least one of a fixed first evaluation value and a second evaluation value that changes with elapsed time, and
     the control unit identifies the priority factors based on the situation information acquired from the mobile object requesting the operation command, and calculates the priority based on the first evaluation value and the second evaluation value of the priority factors.
  4.  The information processing device according to claim 2 or 3, wherein
     the control unit calculates the priorities at the same point in time.
  5.  The information processing device according to any one of claims 1 to 4, wherein
     the situation information includes surrounding information indicating the surroundings of the mobile object.
  6.  The information processing device according to claim 5, wherein
     the acquisition unit acquires the surrounding information from an observation device around the mobile object requesting the operation command and from other mobile objects around the mobile object.
  7.  The information processing device according to any one of claims 1 to 6, wherein
     the situation information includes occupant information indicating an occupant situation of the mobile object.

PCT/JP2023/020662 2022-06-13 2023-06-02 Information processing device WO2023243444A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022095171A JP2023181821A (en) 2022-06-13 2022-06-13 Information processing device
JP2022-095171 2022-06-13

Publications (1)

Publication Number Publication Date
WO2023243444A1 true WO2023243444A1 (en) 2023-12-21

Family

ID=89191045

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/020662 WO2023243444A1 (en) 2022-06-13 2023-06-02 Information processing device

Country Status (2)

Country Link
JP (1) JP2023181821A (en)
WO (1) WO2023243444A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019160146A (en) * 2018-03-16 2019-09-19 株式会社デンソー Vehicle remote support system and method
JP2020005123A (en) * 2018-06-28 2020-01-09 株式会社Soken Vehicle remote control system, vehicle control device, vehicle, and method for notifying start timing of remote control
WO2020071222A1 (en) * 2018-10-05 2020-04-09 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Information processing method, and information processing system
JP2020102159A (en) * 2018-12-25 2020-07-02 トヨタ自動車株式会社 Vehicle control device and vehicle control method
WO2021059715A1 (en) * 2019-09-27 2021-04-01 株式会社デンソー Monitoring center, monitoring system, and method
US20220137615A1 (en) * 2020-11-04 2022-05-05 Uber Technologies, Inc. Systems and Methods for Dynamic Data Buffering for Autonomous Vehicle Remote Assistance

Also Published As

Publication number Publication date
JP2023181821A (en) 2023-12-25


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23823737

Country of ref document: EP

Kind code of ref document: A1