WO2023149370A1 - Control system, control method, and storage medium


Info

Publication number
WO2023149370A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
spatial information
reliability
unique identifier
route
Prior art date
Application number
PCT/JP2023/002623
Other languages
English (en)
Japanese (ja)
Inventor
浩一朗 猪
洋平 佐藤
Original Assignee
キヤノン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2023000580A external-priority patent/JP2023112666A/ja
Application filed by キヤノン株式会社 (Canon Inc.)
Publication of WO2023149370A1


Classifications

    • G01C21/26 Navigation; navigational instruments specially adapted for navigation in a road network
    • G01C21/34 Route searching; route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G06F16/909 Retrieval characterised by using metadata, e.g. using geographical or spatial information such as location
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/16 Anti-collision systems
    • G16Y10/40 IoT economic sectors: Transportation
    • G16Y20/20 Information sensed or collected by the things, relating to the thing itself
    • G16Y40/30 IoT characterised by the purpose of the information processing: Control

Definitions

  • The present invention relates to control systems, control methods, storage media, and the like.
  • A digital architecture, that is, an overall picture that connects data and systems among members of different organizations and societies, has been promoted around the world in line with technological innovations such as autonomous driving mobility and spatial recognition systems.
  • Patent Document 1 describes a configuration in which a single processor divides a spatio-temporal region in time and space according to spatio-temporal management data provided by a user, generating a plurality of spatio-temporal divided areas. In consideration of the temporal and spatial proximity of these divided areas, an identifier expressed as a one-dimensional integer value is assigned to uniquely identify each of them.
  • The spatio-temporal data management system then determines the arrangement of time-series data so that data in spatio-temporal divided areas with similar identifiers are placed close together on the storage device.
  • In Patent Document 1, however, the reliability of the data related to a generated area can be grasped only within the system that generated the data. It has therefore been difficult for users of other systems to utilize the information on the spatially divided areas.
  • The present invention provides a system that makes it easier for more users to utilize three-dimensional spatial information.
  • A control system according to the present invention has a formatting means that assigns a unique identifier to a three-dimensional space defined by latitude/longitude/height, and formats and stores spatial information about the state and time of objects existing in the space in association with the unique identifier. The formatting means also formats and stores information about the reliability of the spatial information in association with the unique identifier.
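The formatting means described above can be illustrated with a small sketch. The patent does not define a concrete identifier encoding, so the grid cell sizes, the `unique_identifier` function, and the `SpatialRecord` fields below are all hypothetical assumptions, not the claimed format:

```python
from dataclasses import dataclass

# Assumed cell sizes for quantizing a position into a 3-D grid cell.
CELL_DEG = 0.0001   # horizontal cell size in degrees (assumption)
CELL_M = 10.0       # vertical cell size in metres (assumption)

def unique_identifier(lat: float, lon: float, height_m: float) -> str:
    """Quantize a latitude/longitude/height position into a grid cell
    and return that cell's identifier string."""
    i = int(lat // CELL_DEG)
    j = int(lon // CELL_DEG)
    k = int(height_m // CELL_M)
    return f"{i}:{j}:{k}"

@dataclass
class SpatialRecord:
    state: str          # state of an object existing in the space
    time: float         # observation time (epoch seconds)
    reliability: float  # reliability of the spatial information, 0.0-1.0

# Formatted storage: unique identifier -> spatial records for that space.
store: dict[str, list[SpatialRecord]] = {}

def save(lat, lon, height_m, state, time, reliability):
    """Format and store spatial information, including its reliability,
    in association with the unique identifier of the space."""
    uid = unique_identifier(lat, lon, height_m)
    store.setdefault(uid, []).append(SpatialRecord(state, time, reliability))
```

Because any system that reproduces the same quantization derives the same identifier from a position alone, external systems can look up the spatial information and its reliability without access to the system that generated it.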
  • FIG. 1 is a diagram showing an overall configuration example of the autonomous mobile body control system according to Embodiment 1 of the present invention;
  • FIG. 2(A) is a diagram showing an example of an input screen when a user inputs position information, and
  • FIG. 2(B) is a diagram showing an example of a selection screen for selecting an autonomous mobile body to be used;
  • FIG. 3(A) is a diagram showing an example of a screen for confirming the current position of an autonomous mobile body, and
  • FIG. 3(B) is a diagram showing an example of a map display screen when confirming the current position of an autonomous mobile body;
  • FIG. 4 is a functional block diagram showing an internal configuration example of devices 10 to 15 in FIG. 1;
  • FIG. 5 is a diagram showing the spatial positional relationship between the autonomous mobile body 12 and a pillar 99 in the real world, and a state in which they are mapped in an XYZ coordinate system space;
  • FIG. 6 is a perspective view showing a mechanical configuration example of the autonomous mobile body 12 according to Embodiment 1;
  • FIG. 7 is a block diagram showing a specific hardware configuration example of the control unit 10-2, the control unit 11-2, the control unit 12-2, the control unit 13-2, the control unit 14-3, and the control unit 15-2;
  • FIG. 8 is a sequence diagram illustrating processing executed by the autonomous mobile body control system according to the first embodiment;
  • FIG. 9 is a sequence diagram continued from FIG. 8;
  • FIG. 10 is a sequence diagram continued from FIG. 9;
  • (A) is a diagram showing latitude/longitude information of the earth, and
  • (B) is a perspective view showing the predetermined space 100 of (A).
  • It is a diagram schematically showing spatial information in the space 100;
  • (A) is a diagram showing route information using map information,
  • (B) is a diagram showing route information using position point cloud data on map information, and
  • (C) is a diagram in which route information using unique identifiers is displayed on a map;
  • FIG. 11 is a sequence diagram illustrating an example of processing for storing detection time/detection method information;
  • FIG. 11 is a flowchart illustrating an example of processing for calculating reliability and determining whether or not movement is permitted;
  • FIG. 11 is a sequence diagram illustrating an example of processing for calculating reliability and correcting route information;
  • It is a flowchart illustrating an example of processing for storing reliability;
  • FIG. 11 is a flow chart illustrating an example of processing for acquiring reliability and determining whether or not movement is permitted;
  • FIG. 11 is a sequence diagram illustrating an example of processing for acquiring reliability and correcting route information;
  • The mobile body may be one in which the user can operate at least part of its movement. For example, various displays related to the moving route may be presented to the user, and the user may perform part of the driving operation of the mobile body with reference to them.
  • FIG. 1 is a diagram showing an overall configuration example of an autonomous mobile body control system according to Embodiment 1 of the present invention.
  • The autonomous mobile body control system is hereinafter also abbreviated as the control system.
  • The user interface 11 is a user terminal device.
  • Each device shown in FIG. 1 is connected via the Internet 16 by respective network connection units, which will be described later.
  • However, network connections such as a LAN (Local Area Network) may be used instead.
  • Also, some of the devices, such as the system control device 10, the user interface 11, the route determination device 13, and the conversion information holding device 14, may be configured as the same device.
  • The system control device 10, the user interface 11, the autonomous mobile body 12, the route determination device 13, the conversion information holding device 14, and the sensor node 15 each include an information processing device consisting of a CPU (or ECU) as a computer and ROM, RAM, HDD, etc. as storage media. Details of the function and internal configuration of each device will be described later.
  • Screen images displayed on the user interface 11 when the user browses the current position of the autonomous mobile body 12 will be described with reference to FIGS. 3(A) and 3(B). Based on these explanations, an example of how the application is operated in the autonomous mobile body control system will be given.
  • The map display will be described on a two-dimensional plane for the sake of convenience, but height information can also be entered. That is, according to this embodiment, a three-dimensional map can be generated.
  • FIG. 2(A) is a diagram showing an example of an input screen when a user inputs position information, and
  • FIG. 2(B) is a diagram showing an example of a selection screen for selecting an autonomous mobile body to be used.
  • On the display of the user interface 11, the WEB page of the system control device 10 is displayed.
  • The input screen 40 has a list display button 48 for displaying a list of autonomous mobile bodies (mobilities) to be used.
  • When the list display button 48 is pressed, a list display screen 47 showing a list of mobilities is displayed as shown in FIG. 2(B).
  • The user first selects the autonomous mobile body (mobility) to be used on the list display screen 47.
  • On the list display screen 47, for example, mobilities M1 to M3 are displayed in a selectable manner, but the number is not limited to this.
  • When a mobility is selected, the screen automatically returns to the input screen 40 of FIG. 2(A), and the selected mobility name is displayed on the list display button 48. After that, the user inputs the location to be set as the departure point in the input field 41 of "departure point".
  • Next, the user inputs the location to be set as a transit point in the input field 42 of "transit point 1". Waypoints can be added: when the add waypoint button 44 is pressed once, an input field 46 for "waypoint 2" is additionally displayed, and the waypoint to be added can be input.
  • Each time the add waypoint button 44 is pressed, additional input fields 46 such as "waypoint 3" and "waypoint 4" are displayed, so multiple additional waypoints can be entered. The user also inputs the place to be set as the arrival point in the input field 43 of "arrival point". Although not shown in the figure, when the input fields 41 to 43, 46, etc. are clicked, a keyboard or the like for inputting characters is temporarily displayed so that the desired characters can be input.
  • The user can then set the movement route of the autonomous mobile body 12 by pressing the decision button 45.
  • "AAA” is set as the departure point
  • "BBB” is set as the transit point 1
  • "CCC” is set as the arrival point.
  • The text entered in each input field may be, for example, an address, or other location information indicating a specific place, such as latitude/longitude information, a store name, or a telephone number.
  • FIG. 3A is a diagram showing an example of a screen for confirming the current position of an autonomous mobile body
  • FIG. 3B is a diagram showing an example of a map display screen when confirming the current position of an autonomous mobile body.
  • Reference numeral 50 in FIG. 3(A) denotes a confirmation screen, which is displayed by operating an operation button (not shown) after setting the movement route of the autonomous mobile body 12 on the screen as shown in FIG. 2(A).
  • the current position of the autonomous mobile body 12 is displayed on the WEB page of the user interface 11, like the current position 56, for example. Therefore, the user can easily grasp the current position.
  • the user can update the screen display information to display the latest state. Further, the user can change the place of departure, the waypoint, and the place of arrival by pressing the change waypoint/arrival place button 54 . That is, it is possible to change by inputting the places to be reset in the input field 51 of "departure point", the input field 52 of "route point 1", and the input field 53 of "arrival point".
  • FIG. 3(B) shows an example of a map display screen 60 that switches from the confirmation screen 50 when the map display button 55 of FIG. 3(A) is pressed.
  • the current location of the autonomous mobile body 12 can be confirmed more easily by displaying the current location 62 on the map.
  • By pressing the return button 61, the display screen can be returned to the confirmation screen 50 of FIG. 3(A).
  • the user can easily set a movement route for moving the autonomous mobile body 12 from a predetermined location to a predetermined location.
  • a route setting application can also be applied to, for example, a taxi dispatch service, a drone home delivery service, and the like.
  • FIG. 4 is a functional block diagram showing an internal configuration example of devices 10 to 15 in FIG. 1. Some of the functional blocks shown in FIG. 4 are realized by causing a computer (not shown) included in each device to execute a computer program stored in a memory (not shown) serving as a storage medium.
  • However, some or all of them may be realized by hardware such as an ASIC (application-specific integrated circuit) or a DSP (digital signal processor).
  • Also, the functional blocks shown in FIG. 4 need not be built into the same housing, and may be configured as separate devices connected to one another via signal paths.
  • the user interface 11 includes an operation section 11-1, a control section 11-2, a display section 11-3, an information storage section (memory/HDD) 11-4, and a network connection section 11-5.
  • the operation unit 11-1 is composed of a touch panel, key buttons, etc., and is used for data input.
  • the display unit 11-3 is, for example, a liquid crystal screen, and is used to display route information and other data.
  • the display screen of the user interface 11 shown in FIGS. 2 and 3 is displayed on the display unit 11-3.
  • the user can use the menu displayed on the display unit 11-3 to select a route, input information, confirm information, and the like. That is, the operation unit 11-1 and the display unit 11-3 provide an operation interface for the user to actually operate.
  • a touch panel may be used as both the operation section and the display section.
  • The control unit 11-2 incorporates a CPU as a computer, manages various applications in the user interface 11, manages modes such as information input and information confirmation, and controls communication processing. It also controls the processing in each part of the user interface 11.
  • the information storage unit (memory/HDD) 11-4 is a database for holding necessary information such as computer programs to be executed by the CPU.
  • a network connection unit 11-5 controls communication performed via the Internet, LAN, wireless LAN, or the like.
  • the user interface 11 may be, for example, a device such as a smart phone, or may be in the form of a tablet terminal.
  • As described above, the user interface 11 of the present embodiment displays the input screen 40 on the browser screen of the system control device 10, allowing the user to enter position information such as the departure point, waypoints, and arrival point. Furthermore, by displaying the confirmation screen 50 and the map display screen 60 on the browser screen, the current position of the autonomous mobile body 12 can be displayed.
  • The route determination device 13 includes a map information management unit 13-1, a control unit 13-2, a position/route information management unit 13-3, an information storage unit (memory/HDD) 13-4, a network connection unit 13-5, and a unique identifier management unit 13-6.
  • The map information management unit 13-1 holds wide-area map information, searches for route information indicating a route on the map based on designated predetermined position information, and transmits the route information of the search result to the position/route information management unit 13-3.
  • the map information is three-dimensional map information that includes information such as terrain and latitude/longitude/altitude, and also includes roadway, sidewalk, direction of travel, and traffic regulation information related to the Road Traffic Law.
  • The control unit 13-2 incorporates a CPU as a computer and controls the processing in each unit within the route determination device 13.
  • The position/route information management unit 13-3 manages the position information of the autonomous mobile body acquired via the network connection unit 13-5, transmits the position information to the map information management unit 13-1, and manages the route information obtained as the search result from the map information management unit 13-1.
  • the unique identifier management unit 13-6 manages the location information of the map managed by the map information management unit 13-1 and the unique identifier corresponding to the location information in association with each other.
  • control unit 13-2 converts the route information managed by the position/route information management unit 13-3 into a predetermined data format and transmits it to the external system.
  • In this manner, the route determination device 13 is configured to search for a route that complies with the Road Traffic Law and the like based on designated position information, and to output the route information in a predetermined data format. The route determination device 13 is also connected to the conversion information holding device 14 so that information can be exchanged in a predetermined data format.
  • The conversion information holding device 14 of FIG. 4 includes a position/route information management unit 14-1, a unique identifier management unit 14-2, a control unit 14-3, a format database 14-4, an information storage unit (memory/HDD) 14-5, and a network connection unit 14-6. It also includes a real-time clock (hereinafter, RTC) 14-7.
  • The conversion information holding device 14 functions as a formatting means that assigns a unique identifier to a three-dimensional space defined by latitude/longitude/height, and formats and stores spatial information about the state and time of objects existing in the space in association with the unique identifier.
  • the position/route information management unit 14-1 manages predetermined position information acquired through the network connection unit 14-6, and transmits the position information to the control unit 14-3 according to a request from the control unit 14-3.
  • The control unit 14-3 incorporates a CPU as a computer and controls the processing in each unit within the conversion information holding device 14. Based on the position information acquired from the position/route information management unit 14-1 and the format information managed by the format database 14-4, the control unit 14-3 converts the position information into the unique identifier defined by the format, and transmits it to the unique identifier management unit 14-2.
  • An identifier (hereinafter referred to as a unique identifier) is assigned to a space starting from a predetermined position, and the space is managed by the unique identifier.
  • the unique identifier management unit 14-2 manages the unique identifier converted by the control unit 14-3 and transmits it through the network connection unit 14-6.
  • the format database 14-4 manages the format information and transmits the format information to the control unit 14-3 in accordance with a request from the control unit 14-3.
  • The conversion information holding device 14 manages information related to the space acquired from external equipment, devices, and networks in association with unique identifiers, and provides information on the unique identifiers and the spaces associated with them to external equipment, devices, and networks. Also, the control unit 14-3 can acquire the current time from the RTC 14-7.
  • As described above, the conversion information holding device 14 acquires the unique identifier and the information in the space based on predetermined position information, and manages and provides that information so that it can be shared with external equipment, devices, and networks connected to it. Further, the conversion information holding device 14 converts the location information specified by the system control device 10 into the unique identifier and provides the unique identifier to the system control device 10.
  • The system control device 10 includes a unique identifier management unit 10-1, a control unit 10-2, a position/route information management unit 10-3, an information storage unit (memory/HDD) 10-4, a network connection unit 10-5, and an RTC 10-6.
  • The position/route information management unit 10-3 holds simple map information that associates terrain information with latitude/longitude information, and manages predetermined position information and route information obtained through the network connection unit 10-5.
  • The position/route information management unit 10-3 can also divide the route information at predetermined intervals and generate position information such as the latitude/longitude of the divided locations.
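As a rough illustration of this division step, the sketch below splits a route polyline into positions at a fixed interval. The equirectangular distance approximation and the function names are assumptions for illustration; the patent only states that the route is divided at predetermined intervals:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius (assumed constant)

def dist_m(p, q):
    """Approximate ground distance in metres between two (lat, lon)
    points, using an equirectangular approximation (fine for short
    segments of a route)."""
    mid_lat = math.radians((p[0] + q[0]) / 2)
    dx = math.radians(q[1] - p[1]) * math.cos(mid_lat) * EARTH_RADIUS_M
    dy = math.radians(q[0] - p[0]) * EARTH_RADIUS_M
    return math.hypot(dx, dy)

def divide_route(points, interval_m):
    """Divide a route polyline of (lat, lon) points at a fixed interval,
    returning the generated positions (including the starting point)."""
    out = [points[0]]
    carried = 0.0  # distance walked since the last emitted position
    for p, q in zip(points, points[1:]):
        seg = dist_m(p, q)
        if seg == 0.0:
            continue
        d = interval_m - carried
        while d <= seg:
            t = d / seg  # linear interpolation along the segment
            out.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
            d += interval_m
        carried = seg - (d - interval_m)
    return out
```

Each generated latitude/longitude position could then be handed to the conversion information holding device 14 to obtain the unique identifier of the space it falls in.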
  • the unique identifier management unit 10-1 manages information obtained by converting the position information and the route information into the unique identifier.
  • The control unit 10-2 incorporates a CPU as a computer, controls the communication functions for the position information, the route information, and the unique identifier of the system control device 10, and controls the processing in each unit in the system control device 10.
  • The control unit 10-2 provides the WEB page to the user interface 11 and transmits predetermined position information acquired from the WEB page to the route determination device 13. Further, it acquires predetermined route information from the route determination device 13 and transmits each piece of position information of the route information to the conversion information holding device 14. Then, the route information converted into unique identifiers, acquired from the conversion information holding device 14, is transmitted to the autonomous mobile body 12. The control unit 10-2 can also acquire the current time from the RTC 10-6.
  • the system control device 10 is configured to acquire predetermined position information designated by the user, transmit and receive position information and route information, generate position information, and transmit and receive route information using unique identifiers.
  • In other words, the system control device 10 collects the route information necessary for the autonomous mobile body 12 to move autonomously, and provides the autonomous mobile body 12 with route information using unique identifiers. In this embodiment, the system control device 10, the route determination device 13, and the conversion information holding device 14 function as servers, for example.
  • The autonomous mobile body 12 includes a detection unit 12-1, a control unit 12-2, a direction control unit 12-3, an information storage unit (memory/HDD) 12-4, a network connection unit 12-5, a drive unit 12-6, and an RTC 12-7.
  • The detection unit 12-1 has, for example, a plurality of imaging elements, and has a function of performing distance measurement based on phase differences between the imaging signals obtained from them.
  • It thereby acquires detection information on obstacles such as the surrounding terrain and building walls.
  • The detection unit 12-1 also has a self-position detection function such as GPS (Global Positioning System) and a direction detection function such as a geomagnetic sensor. Based on the acquired detection information, self-position estimation information, and direction detection information, the control unit 12-2 can generate a three-dimensional map of cyberspace.
  • A three-dimensional map of cyberspace is one that can express, as digital data, spatial information equivalent to the positions of features in the real world.
  • That is, the autonomous mobile body 12 existing in the real world and information on the features around it are held as spatially equivalent information in digital data, so efficient movement is possible by using this digital data.
  • FIG. 5A is a diagram showing the spatial positional relationship between the autonomous mobile body 12 in the real world and a pillar 99 that exists as feature information around it.
  • FIG. 5B is a diagram showing a state in which the autonomous mobile body 12 and the pillar 99 are mapped in an arbitrary XYZ coordinate system space with the position P0 as the origin.
  • In FIG. 5A, the position of the autonomous mobile body 12 is identified as ⁄0 from the latitude and longitude position information acquired by GPS or the like (not shown) mounted on the autonomous mobile body 12. The orientation of the autonomous mobile body 12 is specified by the difference between the orientation ⁄Y acquired by an electronic compass (not shown) or the like and the moving direction 12Y of the autonomous mobile body 12.
  • the position of the pillar 99 is specified as the position of the vertex 99-1 from position information measured in advance.
  • the distance measurement function of the autonomous mobile body 12 makes it possible to acquire the distance from ⁇ 0 of the autonomous mobile body 12 to the vertex 99-1.
  • FIG. 5A shows the coordinates (Wx, Wy, Wz) of the vertex 99-1 when the moving direction 12Y is taken as an axis of the XYZ coordinate system and ⁄0 as the origin.
  • FIG. 5B shows a state in which the autonomous mobile body 12 and the pillar 99 are mapped in an arbitrary XYZ coordinate system space with P0 as the origin.
  • The autonomous mobile body 12 can be expressed as P1 and the pillar 99 as P2 in this arbitrary XYZ coordinate system space.
  • The position P1 of ⁄0 in this space can be calculated from the latitude and longitude of ⁄0 and the latitude and longitude of P0. Likewise, using the coordinates (Wx, Wy, Wz) of the vertex 99-1 measured from ⁄0, the pillar 99 can be calculated as P2.
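The calculation above can be sketched as follows. The local east/north conversion and the heading convention are illustrative assumptions: `to_local` places a latitude/longitude position in the P0-origin frame (giving P1 for the mobile body), and `body_to_map` maps a measured body-frame offset such as (Wx, Wy, Wz) of the vertex 99-1 into the same frame (giving P2):

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius (assumed constant)

def to_local(origin, lat, lon, height_m=0.0):
    """Convert (lat, lon, height) to (x, y, z) metres east/north/up of the
    origin (lat, lon), using an equirectangular approximation."""
    x = math.radians(lon - origin[1]) * math.cos(math.radians(origin[0])) * EARTH_RADIUS_M
    y = math.radians(lat - origin[0]) * EARTH_RADIUS_M
    return (x, y, height_m)

def body_to_map(p1, offset, heading_deg):
    """Map a body-frame offset (forward, left, up) measured at P1 into the
    P0-origin frame, given the body's heading in degrees clockwise from
    north (convention assumed for illustration)."""
    th = math.radians(heading_deg)
    wx, wy, wz = offset
    east = wx * math.sin(th) - wy * math.cos(th)
    north = wx * math.cos(th) + wy * math.sin(th)
    return (p1[0] + east, p1[1] + north, p1[2] + wz)
```

For example, a body 0.0009 degrees of latitude north of P0 maps to P1 roughly 100 m up the y axis, and a vertex measured 10 m straight ahead while heading due east maps to P2 about 10 m east of P1.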
  • Here, only the two objects, the autonomous mobile body 12 and the pillar 99, are represented in the three-dimensional map of cyberspace, but of course more objects can be treated in the same way.
  • a three-dimensional map is a mapping of the self-position and objects in the real world in a three-dimensional space.
  • The autonomous mobile body 12 stores machine-learned object detection learning result data in, for example, the information storage unit (memory/HDD) 12-4, and can detect objects using this data.
  • the detection information can be obtained from an external system via the network connection unit 12-5 and reflected on the three-dimensional map.
  • The control unit 12-2 incorporates a CPU (ECU) as a computer, controls the movement, direction change, and autonomous traveling functions of the autonomous mobile body 12, and controls the processing in each part of the autonomous mobile body 12.
  • the direction control unit 12-3 changes the moving direction of the autonomous moving body 12 by changing the driving direction of the moving body by the driving unit 12-6.
  • the driving unit 12-6 is composed of a driving device such as a motor, and generates a propulsion force for the autonomous mobile body 12.
  • In this way, the autonomous mobile body 12 reflects its self-position, detection information, and object detection information in the three-dimensional map, generates a route that keeps a certain distance from the surrounding terrain, buildings, obstacles, and objects, and can travel autonomously.
  • the route determination device 13 mainly generates routes in consideration of regulatory information related to the Road Traffic Law.
  • the autonomous mobile body 12 more accurately detects the positions of surrounding obstacles on the route determined by the route determination device 13, and generates a route based on its own size so as to move without touching them.
  • the control unit 12-2 can acquire the current time from the RTC 12-7.
  • the information storage unit (memory/HDD) 12-4 of the autonomous mobile body 12 can store the mobility type of the autonomous mobile body itself.
  • The mobility type is, for example, a legally defined category of moving object, such as a car, a bicycle, or a drone. Formatted route information, which will be described later, can be generated based on this mobility type.
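As a hypothetical illustration of such formatted route information, the sketch below serializes a route expressed as unique spatial identifiers together with the mobility type it was generated for. The field names and the JSON encoding are assumptions, not the patent's format:

```python
import json

def format_route(mobility_type: str, identifiers: list[str]) -> str:
    """Serialize formatted route information: the mobility type the route
    was generated for, plus the ordered unique spatial identifiers."""
    return json.dumps({
        "mobility_type": mobility_type,  # e.g. "car", "bicycle", "drone"
        "route": identifiers,            # ordered unique identifiers
    })

# Example message (identifier strings are illustrative placeholders).
msg = format_route("drone", ["350000:1390000:1", "350000:1390001:1"])
```

Keeping the mobility type alongside the identifier list would let a receiving system check whether each space on the route is traversable by that category of moving object.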
  • FIG. 6 is a perspective view showing a mechanical configuration example of the autonomous mobile body 12 according to the first embodiment.
  • the autonomous mobile body 12 will be described as an example of a traveling body having wheels, but is not limited to this, and may be a flying body such as a drone.
  • the autonomous moving body 12 is equipped with a detection unit 12-1, a control unit 12-2, a direction control unit 12-3, an information storage unit (memory/HDD) 12-4, a network connection unit 12-5, and a drive unit 12-6, and these parts are electrically connected to one another. At least two drive units 12-6 and direction control units 12-3 are provided in the autonomous mobile body 12.
  • the direction control unit 12-3 changes the moving direction of the autonomous mobile body 12 by rotating a shaft to change the direction of the drive unit 12-6, and the drive unit 12-6 rotates to move the autonomous mobile body 12 forward and backward.
  • the configuration described with reference to FIG. 6 is an example, and the present invention is not limited to this.
  • an omniwheel or the like may be used to change the movement direction.
  • the autonomous mobile body 12 is, for example, a mobile body using SLAM (Simultaneous Localization and Mapping) technology. Further, based on the detection information detected by the detection unit 12-1 or the like and the detection information of the external system acquired via the Internet 16, it is configured so that it can autonomously move along a designated predetermined route.
  • the autonomous mobile body 12 can perform trace movement by tracing finely specified points, and can also move through roughly set points while generating route information by itself for the spaces between them. As described above, the autonomous moving body 12 of this embodiment can autonomously move based on the route information using the unique identifiers provided by the system control device 10.
  • the sensor node 15 is an external system such as a video surveillance system (for example, a roadside camera unit), and includes a detection unit 15-1, a control unit 15-2, an information storage unit (memory/HDD) 15-3, a network connection unit 15-4, and an RTC 15-5.
  • the detection unit 15-1 is, for example, a camera or the like, acquires detection information of the area it can observe, and has an object detection function and a distance measurement function.
  • the control unit 15-2 incorporates a CPU as a computer, controls the detection of the sensor node 15, data storage, and data transmission functions, and controls processing in each unit within the sensor node 15. Further, the detection information acquired by the detection unit 15-1 is stored in the information storage unit (memory/HDD) 15-3 and transmitted to the conversion information holding device 14 through the network connection unit 15-4. Also, the control unit 15-2 can acquire the current time from the RTC 15-5.
  • the sensor node 15 is configured so that detection information such as image information detected by the detection unit 15-1, feature point information of detected objects, and position information can be stored in the information storage unit 15-3 and communicated. Further, the sensor node 15 provides the conversion information holding device 14 with the detection information of the area it can detect.
  • FIG. 7 is a block diagram showing a specific hardware configuration example of the control unit 10-2, the control unit 11-2, the control unit 12-2, the control unit 13-2, the control unit 14-3, and the control unit 15-2. Note that the hardware configuration is not limited to that shown in FIG. 7. Moreover, it is not necessary to have all the blocks shown in FIG. 7.
  • 21 is a CPU as a computer that manages the calculation and control of the information processing device.
  • a RAM 22 functions as the main memory of the CPU 21, providing a load area for execution programs, an execution area for the programs, and a data area.
  • a ROM 23 stores an operation processing procedure of the CPU 21 .
  • the ROM 23 includes a program ROM that records basic software (OS), which is a system program for controlling the information processing device, and a data ROM that records information necessary for operating the system. Note that an HDD 29, which will be described later, may be used instead of the ROM 23.
  • a network interface (NETIF) 24 controls data transfer between information processing devices via the Internet 16 and diagnoses the connection status.
  • a video RAM (VRAM) 25 develops an image to be displayed on the screen of the LCD 26 and controls the display.
  • Reference numeral 26 denotes a display device such as a liquid crystal display (hereinafter referred to as LCD).
  • Reference numeral 27 denotes a keyboard controller (hereinafter referred to as KBC) that controls input from the external input device 28.
  • Reference numeral 28 denotes an external input device (hereinafter abbreviated as KB) for receiving operations performed by the user; for example, a keyboard or a pointing device such as a mouse is used.
  • HDD 29 is a hard disk drive (hereinafter referred to as HDD), which is used for storing application programs and various data.
  • the application program in this embodiment is a software program or the like that executes various processing functions in this embodiment.
  • Reference numeral 30 denotes an external input/output device (hereinafter referred to as CDD), such as a CD-ROM drive, DVD drive, or Blu-Ray (registered trademark) disc drive, which reads from and writes to a removable medium 31 serving as a removable data recording medium.
  • the CDD 30 is used, for example, when reading the above application program from removable media.
  • 31 is a removable medium such as a CDROM disk, DVD, Blu-Ray disk, etc., which is read by the CDD 30 .
  • the removable medium may be a magneto-optical recording medium (eg, MO), a semiconductor recording medium (eg, memory card), or the like. It is also possible to store the application programs and data stored in the HDD 29 in the removable medium 31 and use them.
  • Reference numeral 20 denotes a transmission bus (address bus, data bus, input/output bus, and control bus) for connecting the units described above.
  • FIG. 8 is a sequence diagram illustrating processing executed by the autonomous mobile body control system according to the first embodiment, FIG. 9 is a sequence diagram continuing from FIG. 8, and FIG. 10 is a sequence diagram continuing from FIG. 9.
  • FIGS. 8 to 10 show the processing executed by each device from when the user inputs the location information to the user interface 11 until the current location information of the autonomous mobile body 12 is received. The operations in FIGS. 8 to 10 are performed by the computers in the control units of the devices 10 to 15 executing the computer programs stored in the memory.
  • step S201 the user uses the user interface 11 to access the WEB page provided by the system control device 10.
  • step S202 the system control device 10 displays the position input screen as described with reference to FIG. 2 on the display screen of the WEB page.
  • step S203 as described with reference to FIG. 2, the user selects an autonomous mobile object (mobility) and inputs location information (hereinafter referred to as location information) indicating departure/via/arrival points.
  • the position information may be a word (hereinafter referred to as a position word) specifying a specific place such as a building name, a station name, or an address, or a point (hereinafter referred to as a point) indicating a specific position on the map displayed on the WEB page.
  • step S204 the system control device 10 saves the type information of the selected autonomous mobile body 12 and input information such as the input position information.
  • When the position information is the position word, the position word is stored as it is. When the position information is the point, the latitude/longitude corresponding to the point is found based on the simple map information stored in the position/route information management unit 10-3, and that latitude/longitude is saved.
  • step S205 the system control device 10 designates the type of route that can be traveled (hereinafter referred to as route type) from the mobility type (type) of the autonomous mobile body 12 designated by the user. Then, in step S206, it is transmitted to the route determination device 13 together with the position information.
  • the mobility type is, as described above, a legally distinguished type of moving object, such as a car, bicycle, or drone.
  • the type of route is, for example, a general road, a highway, or an exclusive road for automobiles in the case of a car, and a sidewalk on which riding is permitted, the road shoulder of an ordinary road, or a bicycle lane in the case of a bicycle.
  • step S207 the route determination device 13 inputs the received position information to the owned map information as departure/via/arrival points. If the location information is the location word, search the map information by the location word and use the corresponding latitude/longitude information. When the position information is latitude/longitude information, it is used as it is input to the map information. Furthermore, a pre-search for the route may be performed.
  • step S208 the route determination device 13 searches for a route from the departure point to the arrival point via the intermediate points.
  • the route to be searched is searched according to the route type.
  • step S209 the route determination device 13 outputs, as a result of the search, a route from the departure point to the arrival point via the waypoints (hereinafter referred to as route information) in GPX format (GPS eXchange Format), and transmits it to the system control device 10.
  • GPX format files are mainly composed of three types of data: waypoints (point information without order), routes (ordered point information with time information added), and tracks (collections of multiple pieces of point information, i.e., trajectories).
  • Latitude/longitude are described as attribute values of each piece of point information, and altitude, geoid height, GPS reception status/accuracy, and the like are described as child elements.
  • the minimum element required for a GPX file is latitude/longitude information for a single point, and any other information is optional.
  • the route information is the route, which is a set of point information consisting of latitude/longitude having an order relationship. Note that the route information may be in another format as long as it satisfies the above requirements.
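  • As a non-authoritative sketch of this data format, the following Python example builds a minimal GPX document containing one route (an ordered set of points whose latitude/longitude are written as attributes of each route point). The element names follow the GPX 1.1 schema; the sample coordinates and the creator string are illustrative.

```python
import xml.etree.ElementTree as ET

def build_gpx_route(points):
    """Build a minimal GPX document containing one <rte> (ordered route).

    `points` is an ordered list of (lat, lon) tuples; latitude/longitude
    are emitted as attributes of each <rtept>, as the GPX format requires.
    """
    gpx = ET.Element("gpx", version="1.1", creator="example")
    rte = ET.SubElement(gpx, "rte")
    for lat, lon in points:
        ET.SubElement(rte, "rtept", lat=f"{lat:.6f}", lon=f"{lon:.6f}")
    return ET.tostring(gpx, encoding="unicode")

# Illustrative departure -> waypoint -> arrival route
route_xml = build_gpx_route([(35.6586, 139.7454),
                             (35.6604, 139.7292),
                             (35.6812, 139.7671)])
```

The minimum required element, as noted above, is the latitude/longitude of each point; altitude and other data would be added as child elements of each `<rtept>`.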
  • FIG. 11(A) is a diagram showing latitude/longitude information of the earth
  • FIG. 11(B) is a perspective view showing the predetermined space 100 in FIG. 11(A).
  • the center of the predetermined space 100 is defined as the center 101.
  • FIG. 12 is a diagram schematically showing spatial information in the space 100.
  • the format divides the earth's space into three-dimensional spaces determined by ranges starting from latitude/longitude/height, and a unique identifier is added to each space so that it can be managed.
  • the space 100 is displayed as a predetermined three-dimensional space.
  • the space 100 is a partitioned space whose center 101 is at 20 degrees north latitude, 140 degrees east longitude, and height (altitude) H, with width D in the latitudinal direction, width W in the longitudinal direction, and width T in the height direction. It is one space obtained by dividing the space of the earth into spaces determined by ranges starting from the latitude/longitude/height.
  • each of the arranged divided spaces has its horizontal position defined by latitude/longitude, overlaps in the height direction, and the position in the height direction is defined by height.
  • Although the center 101 of the divided space is set as the starting point of the latitude/longitude/height in FIG. 11(B), the starting point is not limited to this; for example, a vertex of the divided space may be used as the starting point.
  • the shape may be a substantially rectangular parallelepiped, and when considering the case of laying on the surface of a sphere such as the earth, it is better to set the top surface of the rectangular parallelepiped slightly wider than the bottom surface, so that it can be arranged without gaps.
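  • The division scheme described above can be sketched as follows. The grid pitches and the string encoding of the identifier are illustrative assumptions; the format only requires that each divided space determined by a latitude/longitude/height range receive a unique identifier.

```python
def space_identifier(lat, lon, height, d_deg=0.0001, w_deg=0.0001, t_m=5.0):
    """Map a latitude/longitude/height to the identifier of its divided space.

    The earth's space is split into boxes with latitude pitch D, longitude
    pitch W, and height pitch T, and the integer grid indices are joined
    into a string. The pitches and the encoding are illustrative
    assumptions, not values from the text.
    """
    i = int(lat // d_deg)   # index along the latitude (D) axis
    j = int(lon // w_deg)   # index along the longitude (W) axis
    k = int(height // t_m)  # index along the height (T) axis
    return f"{i}_{j}_{k}"

# Two points inside the same box share one identifier; moving roughly one
# pitch along any axis yields a different identifier.
a = space_identifier(20.00001, 140.00002, 7.0)
b = space_identifier(20.00003, 140.00006, 9.0)
c = space_identifier(20.00012, 140.00002, 7.0)
```

With this kind of mapping, spatial information can be stored and looked up per divided space using the identifier as a key.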
  • information on the types of objects that exist or can enter the range of the space 100 and time limits are associated with unique identifiers.
  • the formatted spatial information is stored in chronological order from the past to the future.
  • the conversion information holding device 14 associates with the unique identifier the spatial information regarding the types of objects that exist or can enter a three-dimensional space defined by latitude/longitude/height and the time limit, formats it, and saves it in the format database 14-4.
  • the spatial information is updated at predetermined update intervals based on information supplied by information supply means such as an external system (for example, the sensor node 15) communicatively connected to the conversion information holding device 14. Then, the information is shared with other external systems communicably connected to the conversion information holding device 14 . For applications that do not require time-related information, it is possible to use spatial information that does not contain time-related information. Also, non-unique identifiers may be used instead of unique identifiers.
  • information about the type of an object that can exist or enter a three-dimensional space defined by latitude/longitude/height and the time limit (hereinafter referred to as spatial information) is associated with a unique identifier, formatted, and stored in the database. Space-time can be managed by the formatted spatial information.
  • the conversion information holding device 14 of the first embodiment executes a formatting step of formatting and saving information about update intervals of spatial information in association with unique identifiers.
  • the update interval information formatted in association with the unique identifier may be expressed as an update frequency; in other words, the update interval information includes the update frequency.
  • step S210 the system control device 10 confirms the interval between each piece of point information in the received route information. Then, position point cloud data is created by matching the interval of the point information with the interval between the starting point positions of the divided spaces defined by the format.
  • when the interval of the point information is smaller than the interval between the starting point positions of the divided spaces, the system control device 10 thins out the point information in the route information according to the interval of the starting point positions to obtain position point group data. Further, when the interval of the point information is larger than the interval between the starting point positions of the divided spaces, the system control device 10 interpolates the point information within a range that does not deviate from the route information to obtain position point group data.
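  • The thinning and interpolation of step S210 can be sketched as follows. The half-pitch thinning threshold and the planar distance metric are illustrative assumptions; a real implementation would measure distances on the earth's surface.

```python
import math

def resample_route(points, pitch):
    """Thin out or interpolate route points to roughly one grid pitch apart.

    Points closer than half a pitch to the previously kept point are thinned
    out; gaps wider than one pitch are filled by linear interpolation along
    the original segment, so the result does not deviate from the route.
    Distances are planar for simplicity.
    """
    out = [points[0]]
    for p in points[1:]:
        q = out[-1]
        dist = math.hypot(p[0] - q[0], p[1] - q[1])
        if dist <= pitch / 2:              # too close to the last point: thin out
            continue
        n = max(1, math.ceil(dist / pitch))
        for s in range(1, n + 1):          # n == 1 simply keeps p itself
            t = s / n
            out.append((q[0] + t * (p[0] - q[0]), q[1] + t * (p[1] - q[1])))
    return out

# A sparse stretch is interpolated; the point at y=0.36 is thinned out.
pts = resample_route([(0.0, 0.0), (0.0, 0.35), (0.0, 0.36), (0.0, 1.0)],
                     pitch=0.1)
```

The output spacing then matches the interval between the starting point positions of the divided spaces, so each point can be converted to exactly one unique identifier.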
  • step S211 in Fig. 9 the system control device 10 transmits the latitude/longitude information of each point information of the position point cloud data to the conversion information holding device 14 in the order of the route.
  • step S212 the conversion information holding device 14 searches the format database 14-4 for a unique identifier corresponding to the received latitude/longitude information, and transmits it to the system control device 10 in step S213.
  • step S214 the system control device 10 arranges the received unique identifiers in the same order as the original position point cloud data, and stores them as route information using the unique identifiers (hereinafter referred to as format route information).
  • in this way, the system control device 10 as a control means acquires the spatial information from the database of the conversion information holding device 14, and generates route information about the movement route of the mobile object based on the acquired spatial information and the type information of the mobile object.
  • FIG. 13(A) is an image diagram of route information displayed as map information
  • FIG. 13(B) is an image diagram of route information using position point cloud data displayed as map information
  • FIG. 13(C) is an image diagram showing route information using unique identifiers displayed as map information.
  • 120 is route information
  • 121 is a non-movable area through which the autonomous mobile body 12 cannot pass
  • 122 is a movable area where the autonomous mobile body 12 can move.
  • the route information 120, generated by the route determination device 13 based on the positional information of the departure point, waypoint, and arrival point specified by the user, is generated as a route that passes through the departure point, waypoint, and arrival point and runs over the movable area 122 on the map information.
  • 123 is a plurality of pieces of position information on the route information.
  • the system control device 10 that has acquired the route information 120 generates the position information 123 arranged at predetermined intervals on the route information 120 .
  • the position information 123 can be represented by latitude/longitude/height, respectively, and this position information 123 is called position point cloud data in the first embodiment. Then, the system control device 10 transmits the position information 123 (latitude/longitude/height of each point) one by one to the conversion information holding device 14 and converts it into a unique identifier.
  • 124 is positional space information in which the positional information 123 is converted into unique identifiers one by one, and the spatial range defined by the unique identifiers is represented by a rectangular frame.
  • the location space information 124 is obtained by converting the location information into a unique identifier.
  • the route represented by the route information 120 is converted into continuous position space information 124 and represented.
  • Each piece of position space information 124 is associated with information regarding the types of objects that can exist or enter the range of the space and time limits. This continuous position space information 124 is called format route information in the first embodiment.
  • step S215 the system control device 10 downloads the spatial information associated with each unique identifier of the format path information from the conversion information holding device 14.
  • step S216 the system control device 10 converts the spatial information into a format that can be reflected in the three-dimensional map of the cyberspace of the autonomous mobile body 12, and creates information indicating the positions of multiple objects (obstacles) in a predetermined space (hereinafter, a cost map).
  • the cost map may be created for all route spaces in the format route information at first, or may be created in a form divided by fixed areas and updated sequentially.
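  • The cost map can be sketched as a small two-dimensional occupancy grid. The cost values and the inflation radius used to keep a certain distance from obstacles are illustrative assumptions, not values from the disclosure.

```python
def build_cost_map(width, height, obstacles, inflate=1):
    """Build a simple 2-D occupancy cost map for a predetermined space.

    Obstacle cells receive cost 100, and cells within `inflate` cells of an
    obstacle (Chebyshev distance) receive cost 50 so that a planner keeps a
    certain distance from obstacles; free cells are 0. The grid resolution,
    cost values, and inflation radius are illustrative assumptions.
    """
    grid = [[0] * width for _ in range(height)]
    for ox, oy in obstacles:
        for dy in range(-inflate, inflate + 1):
            for dx in range(-inflate, inflate + 1):
                x, y = ox + dx, oy + dy
                if 0 <= x < width and 0 <= y < height:
                    cost = 100 if (dx, dy) == (0, 0) else 50
                    grid[y][x] = max(grid[y][x], cost)
    return grid

# One obstacle at cell (2, 2) in a 5 x 5 map
cmap = build_cost_map(5, 5, [(2, 2)])
```

A planner moving along the route would then prefer cells with low cost, which corresponds to moving while avoiding the objects entered in the cost map.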
  • step S217 the system control device 10 associates the format route information and the cost map with the unique identification number (unique identifier) assigned to the autonomous mobile body 12 and stores them.
  • the autonomous mobile body 12 monitors (hereinafter, polls) its own unique identification number via the network at predetermined time intervals, and downloads the linked cost map in step S218.
  • the autonomous mobile body 12 reflects the latitude/longitude information of each unique identifier of the format route information as route information on the three-dimensional map of the cyberspace created by itself.
  • step S220 the autonomous mobile body 12 reflects the cost map on the three-dimensional map of cyberspace as obstacle information on the route.
  • when the cost map is created in a form divided into fixed areas, after the autonomous mobile body moves out of the area for which the cost map was created, the cost map of the next area is downloaded and the cost map is updated.
  • step S221 the autonomous mobile body 12 moves along the route information while avoiding the objects (obstacles) input in the cost map. That is, movement control is performed based on the cost map.
  • step S222 the autonomous mobile body 12 moves while performing object detection, and moves while updating the cost map using the object detection information if there is a difference from the cost map. Also, in step S223, the autonomous mobile body 12 transmits difference information from the cost map to the system control device 10 together with the corresponding unique identifier.
  • the system control device 10 that has acquired the difference information between the unique identifier and the cost map transmits the spatial information to the conversion information holding device 14 in step S224, and the conversion information holding device 14 updates the spatial information of the corresponding unique identifier.
  • the content of the spatial information updated here does not directly reflect the difference information from the cost map, but is abstracted by the system control device 10 and then sent to the conversion information holding device 14 . Details of the abstraction will be described later.
  • step S226 each time the autonomous mobile body 12 moving based on the format route information passes through the divided space linked to each unique identifier, it transmits to the system control device 10 the unique identifier associated with the space it is currently passing through.
  • the system control device 10 grasps the current position of the autonomous mobile body 12 on the format route information.
  • the system control device 10 can grasp where the autonomous mobile body 12 is currently located in the format route information. Note that the system control device 10 may stop holding the unique identifier of the space through which the autonomous mobile body 12 has passed, thereby reducing the holding data capacity of the format route information.
  • step S227 the system control device 10 creates the confirmation screen 50 and the map display screen 60 described earlier and displays them on the user interface 11.
  • the system control device 10 updates the confirmation screen 50 and the map display screen 60 each time the autonomous mobile body 12 transmits the unique identifier indicating the current position to the system control device 10 .
  • the sensor node 15 saves the detection information of the detection range, abstracts the detection information in step S229, and transmits it to the conversion information holding device 14 as the spatial information in step S230.
  • the abstraction is, for example, information such as whether or not an object exists, or whether or not the existence state of the object has changed, and is not detailed information about the object.
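  • The abstraction described above can be sketched as follows; the field names are illustrative assumptions, and only the two facts named in the text (whether an object exists, and whether its existence state changed) are retained.

```python
def abstract_detection(detections, previous_present):
    """Reduce detailed detection results to abstracted spatial information.

    Following the text, only whether an object exists and whether that
    existence state changed since the previous update are kept; detailed
    attributes such as class labels or bounding boxes are dropped. The
    field names are illustrative assumptions.
    """
    present = len(detections) > 0
    return {"object_present": present,
            "state_changed": present != previous_present}

# A detailed detection (class label, bounding box) becomes two booleans.
info = abstract_detection([{"class": "car", "bbox": (10, 20, 40, 60)}],
                          previous_present=False)
no_change = abstract_detection([], previous_present=False)
```

Transmitting only this abstracted form keeps the data volume per unique identifier small, consistent with the device connecting many systems with a relatively small amount of data.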
  • the conversion information holding device 14 stores the spatial information, which is the abstracted detection information, in association with the unique identifier of the position corresponding to the spatial information.
  • the spatial information is stored in the format database in association with a single unique identifier.
  • the external system acquires and utilizes the detection information of the sensor node 15 via the conversion information holding device 14 by using the spatial information held in the conversion information holding device 14.
  • the conversion information holding device 14 also has a function of connecting the communication standards of the external system and the sensor node 15 .
  • the conversion information holding device 14 has a function of connecting data of multiple devices with a relatively small amount of data.
  • in steps S215 and S216 of FIG. 9, when the system control device 10 needs detailed object information to create a cost map, it may download and use detailed information from an external system that stores the detailed detection information corresponding to the spatial information.
  • the sensor node 15 updates the spatial information on the route of the format route information of the autonomous mobile body 12 .
  • the sensor node 15 acquires the detection information in step S232 of FIG. 10, generates abstracted spatial information in step S233, and transmits it to the conversion information holding device 14 in step S234.
  • the conversion information holding device 14 stores the spatial information in the format database 14-4 in step S235.
  • the system control device 10 checks changes in the spatial information in the managed format path information at predetermined time intervals, and if there is a change, downloads the spatial information in step S236. Then, in step S237, the cost map associated with the unique identification number assigned to the autonomous mobile body 12 is updated.
  • step S238 the autonomous mobile body 12 recognizes the update of the cost map by polling and reflects it in the three-dimensional map of cyberspace created by itself.
  • thereby, the autonomous mobile body 12 can recognize in advance a change on the route that it cannot detect by itself, and can respond to the change.
  • when the autonomous mobile body 12 arrives at the arrival point, the corresponding unique identifier is transmitted in step S240.
  • the system control device 10, which has thus recognized the unique identifier, displays an arrival indication on the user interface 11 in step S241 and terminates the application.
  • as described above, according to Embodiment 1, it is possible to provide a digital architecture format and an autonomous mobile body control system using the same.
  • the format database 14-4 stores information (spatial information) about the types of objects that can exist or enter the space 100 and time limits, in chronological order from the past to the future. The spatial information is updated based on information input from an external sensor or the like communicatively connected to the conversion information holding device 14, and is shared with other external systems that can be connected to the conversion information holding device 14.
  • the type information of objects in the space is information that can be obtained from map information, such as roadways, sidewalks, and bicycle lanes on roads.
  • information such as the traveling direction of mobility on a roadway, traffic regulations, etc. can also be defined as type information.
  • it is also possible to define type information for the space itself.
  • the conversion information holding device 14 can be connected to a system control device that manages information on roads and a system control device that manages information on sections other than roads.
  • the system control device 10 can transmit, to the conversion information holding device 14, position point cloud data collectively representing the position information 123 described above. Similarly, a system control device that manages information on roads and a system control device that manages information on sections other than roads can also transmit corresponding data to the conversion information holding device 14.
  • the corresponding data is the position point cloud data information managed by the system control device that manages road information and the system control device that manages information on sections other than roads.
  • Each point of the position point cloud data is hereinafter referred to as a position point.
  • in the first embodiment, the spatial information update interval differs according to the type of object existing in the space. That is, when the type of object existing in the space is a moving object, the update interval is set shorter than when it is not a moving object. Also, when the type of object existing in the space is a road, the update interval is set shorter than when it is a section.
  • the update interval of the space information about each object should be different according to the type of each object (eg moving body, road, section, etc.). Spatial information about the state and time of each of a plurality of objects existing in the space is associated with the unique identifier, formatted and stored. Therefore, the load for updating spatial information can be reduced.
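  • A per-type update interval as described above can be sketched as follows; the concrete interval values are illustrative assumptions, and only their ordering (moving body shortest, road next, section longest) follows the text.

```python
def update_interval_seconds(object_type):
    """Return the update interval for spatial information about an object.

    The ordering follows the text: spatial information about moving objects
    is refreshed most frequently, roads less frequently, and sections
    (non-road partitions) least frequently. The concrete second values are
    illustrative assumptions, not values given in the text.
    """
    intervals = {"moving_body": 1, "road": 60, "section": 3600}
    return intervals.get(object_type, 3600)  # default to the slowest rate

moving = update_interval_seconds("moving_body")
road = update_interval_seconds("road")
section = update_interval_seconds("section")
```

Choosing the interval per object type keeps frequently changing spaces fresh while reducing the overall load for updating spatial information.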
  • the format database 14-4 stores information (spatial information) regarding the state and time of objects existing within the space 100 in chronological order from the past to the future. Further, the spatial information is updated based on information input by an external system or the like communicatively connected to the conversion information holding device 14, and information is shared with other external systems communicably connected to the conversion information holding device 14. be done.
  • These external systems are systems such as the route determination device 13 and the sensor node 15, for example, and the spatial information is updated by map information held by the route determination device 13 and information on objects detected by the sensor node 15. .
  • the spatial information stored in the format database 14-4 is information at the time it is input by an external system or the like, and the information itself is not necessarily real-time information.
  • the route determining device 13 as an example of an external system has a map information management section 13-1, and this map information management section 13-1 has map information on the earth.
  • This map information is appropriately updated in accordance with changes in the real world, but it is not updated in real time, and it may take several days to several months at most to be updated.
  • the information on the object detected by the sensor node 15 is also the information at the time when the object was detected, and there may be a time difference from the time when the system control device 10 attempts to control the movement of the autonomous mobile body 12.
  • the spatial information stored in the format database 14-4 does not necessarily match the state in the real world.
  • since the autonomous mobile body control system controls the movement of the autonomous mobile body 12 using the spatial information stored in the format database 14-4, a control error will occur if the spatial information stored in the format database 14-4 does not match the state of the corresponding space in the real world. That is, there is a possibility that correct movement control of the autonomous mobile body 12 is not performed in the real world.
  • therefore, information (hereinafter referred to as reliability information) is provided that indicates how closely the spatial information stored in the format database 14-4 is expected to match the state of the real world.
  • the reliability of the spatial information is calculated from the update time of the spatial information and the detection method, and the movement control of the autonomous mobile body 12 is performed using the spatial information.
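  • The reliability calculation can be sketched as follows. The exponential decay, half-life, and method weights are illustrative assumptions; the text states only that reliability is calculated from the update time of the spatial information and the detection method.

```python
def reliability(update_time, now, method_weight, half_life=3600.0):
    """Estimate how well stored spatial information matches the real world.

    Reliability decays with the elapsed time since the information was
    updated, scaled by a weight for the detection method (for example, a
    surveyed digital map versus a roadside camera). The decay model and the
    weights are illustrative assumptions.
    """
    age = max(0.0, now - update_time)          # seconds since the update
    return method_weight * 0.5 ** (age / half_life)

# Freshly detected camera data vs. the same data one day later
fresh = reliability(update_time=1000.0, now=1000.0, method_weight=0.9)
stale = reliability(update_time=1000.0, now=1000.0 + 86400.0,
                    method_weight=0.9)
```

Movement control can then, for example, treat low-reliability spatial information as requiring re-detection before the autonomous mobile body relies on it.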
  • the route determination device 13 is configured to search for a route that complies with the Road Traffic Act based on specified position information and output it in a predetermined data format. Furthermore, map update time information is output and stored as the update time of the spatial information in the format database 14-4. This will be described below with reference to FIG.
  • the update time information of the map is information of the time when the map information was updated.
  • The route determination device 13 has a map information management section 13-1.
  • The map information of the map information management unit 13-1 is created based on, for example, a digital map provided by the Geospatial Information Authority of Japan, and is stored together with the creation time information of the digital map.
  • The control unit 13-2 of the route determination device 13 checks for updates to the digital map information at a predetermined timing, such as once a day, and, if the digital map information has been updated, updates the map information in the map information management section 13-1 together with the update time information.
  • Note that only the updated range of the map information in the map information management unit 13-1 may be updated.
  • The map information in the map information management unit 13-1 also stores the update time of the map at each latitude and longitude (hereinafter referred to as map update time information).
  • FIG. 14 is a sequence diagram illustrating an example of processing for storing detection time/detection method information. The operations of each step in the sequence of FIG. 14 are performed by the computers in the control units of devices 13 to 15 executing the computer program stored in memory.
  • In step S401, the route determination device 13 converts a predetermined unique identifier into latitude/longitude/altitude information.
  • Note that the predetermined unique identifier is not limited to the range indicated by the format path information; it may be any unique identifier within the range managed by the unique identifier management section 14-2 of the conversion information holding device 14.
  • In step S402, the route determination device 13 extracts the map update time information for that position from its own map information, based on the converted position information such as latitude/longitude/altitude. Then, in step S403, the route determination device 13 transmits the extracted map update time information, together with the position information, to the conversion information holding device 14.
  • In step S404, the conversion information holding device 14 stores the transmitted map update time information in the format database 14-4 as the update time of the spatial information, in association with the unique identifier corresponding to the position information.
  • In the above, position information such as latitude/longitude/altitude information and the map update time information are exchanged between the route determination device 13 and the conversion information holding device 14.
  • Alternatively, the map update time information may be passed together with the unique identifier.
  • In this case, the route determination device 13 extracts the map update time information from its own map information, based on the position information indicated by a unique identifier within a predetermined range.
  • Note that the predetermined unique identifier is not limited to the range indicated by the format path information; it may be any unique identifier within the range managed by the unique identifier management section 14-2 of the conversion information holding device 14.
  • In step S408, the route determination device 13 transmits the map update time information based on the position information indicated by the unique identifier to the conversion information holding device 14. Then, in step S409, the conversion information holding device 14 stores the transmitted map update time information in the format database 14-4 as the update time of the spatial information, in association with the unique identifier.
  • In this way, the update time of the spatial information corresponding to each unique identifier can be stored in advance in the format database 14-4 from the map information.
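The storage flow above (steps S401 to S409: resolve a unique identifier to a position, look up the map update time, and store it keyed by the identifier) can be sketched as a minimal in-memory stand-in for the format database 14-4. All class, field, and method names here are illustrative assumptions; the patent does not specify a concrete schema or API.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, Optional

@dataclass
class SpatialRecord:
    """One entry of the format database, keyed by a unique identifier.
    Field names are illustrative; the patent does not fix a schema."""
    update_time: datetime                   # map update time / detection time
    detection_method: Optional[str] = None  # e.g. "map", "camera", "lidar"
    payload: dict = field(default_factory=dict)  # detection info, object type, etc.

class FormatDatabase:
    """Toy in-memory stand-in for the format database 14-4."""

    def __init__(self) -> None:
        self._records: Dict[str, SpatialRecord] = {}

    def store_update_time(self, uid: str, update_time: datetime,
                          detection_method: Optional[str] = None) -> None:
        # Steps S404/S409: store the map update time as the update time
        # of the spatial information for that unique identifier.
        self._records[uid] = SpatialRecord(update_time, detection_method)

    def get(self, uid: str) -> Optional[SpatialRecord]:
        return self._records.get(uid)
```

A sensor-node detection (steps S411 to S419) could reuse the same store by passing the detection time as `update_time` and, for example, `"camera"` or `"lidar"` as `detection_method`.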
  • As described earlier, the sensor node 15 is an external system such as a video monitoring system (for example, a roadside camera unit), and stores position information indicating where it is installed.
  • In step S411, the sensor node 15 detects information such as image information, feature point information, and position information of an object existing in its detectable area, using the object detection function and distance measurement function of the detection unit 15-1.
  • In step S412, the detection information and the time information at the time of object detection (hereinafter referred to as detection time information), acquired using the RTC 15-5, are stored in the information storage unit (memory/HDD) 15-3. In association with these, information on the detection method (camera, LiDAR, etc.) used to detect the object is also stored in the information storage unit 15-3 of the sensor node 15.
  • Then, in step S413, the sensor node 15 transmits the linked detection information, detection time, detection method, and position information of the object to the conversion information holding device 14.
  • In step S414, the conversion information holding device 14 determines the unique identifier corresponding to the transmitted position information, and stores the detection information, detection time, and detection method of the object in the format database 14-4 in association with that unique identifier. In this manner, the sensor node 15 can store the update time and detection method of spatial information.
  • In the above, information such as the detection time and the position information of the object detected by the sensor node 15 is passed between the sensor node 15 and the conversion information holding device 14.
  • Alternatively, information such as the detection time can be passed together with the unique identifier. This method is described below.
  • In step S415, the sensor node 15 detects information such as image information, feature point information, and position information of an object existing in its detectable area, using the object detection function and distance measurement function of the detection unit 15-1.
  • In step S416, this detection information and the time information at the time of object detection (detection time information) acquired using the RTC 15-5 are stored in the information storage unit (memory/HDD) 15-3. In association with these, information on the detection method (camera, LiDAR, etc.) used to detect the object is also stored in the information storage unit 15-3 of the sensor node 15.
  • In step S417, the unique identifier of the position where the object exists is calculated from the detected position information and stored in the information storage unit (memory/HDD) 15-3, linked to the detection information of the object. Then, in step S418, the sensor node 15 transmits the linked detection information, detection time, detection method, and unique identifier of the object to the conversion information holding device 14.
  • In step S419, the conversion information holding device 14 stores the detection information, detection time, and detection method of the object in the format database 14-4 in association with the transmitted unique identifier. In this manner, the sensor node 15 can store the update time and detection method of spatial information.
  • In this way, the format database 14-4 stores the update time of the spatial information and the method by which it was detected, in association with the spatial information.
  • Various types of spatial information are stored in the format database 14-4, and the reliability of that spatial information can vary depending on its content.
  • Factors that affect the reliability of spatial information include the detection method of the spatial information, the elapsed time since the spatial information was updated or detected, and the type of object stored as spatial information. That is, the reliability in this embodiment differs according to at least one of the type of object, the elapsed time since the spatial information was updated, and the spatial information detection method. Furthermore, it can differ depending on both the elapsed time since the update and the detection method.
  • The reasons why reliability differs depending on the detection method are considered to be as follows. For example, when the sensor node 15 detects an object at night, the reliability of the detection result is considered higher when the object is detected using LiDAR as a sensor than when it is detected using a camera.
  • This is because a camera is a passive detector, whose detection sensitivity in the dark is generally low, whereas LiDAR is an active (self-emitting) detector, whose detection sensitivity generally does not decrease even in the dark. That is, when detecting a pedestrian at night, for example, LiDAR is more likely than a camera to detect the pedestrian correctly. In this way, the reliability of a detected object may differ depending on the detection method.
  • The reliability of spatial information may also vary depending on the reliability of the measuring instrument itself. For example, the shape of a building may be measured using a surveying instrument conforming to the JIS standard as the sensor node 15, or calculated from image data using an image detector such as a surveillance camera or an in-vehicle camera.
  • In such cases, the reliability of the detected object may differ depending on whether or not the detection method conforms to a measurement standard.
  • The degree of reliability may also differ depending on the creator (creating organization) of the spatial information and the creation procedure. For example, spatial information measured and detected by a public institution may be highly reliable, as may spatial information measured and detected in accordance with officially approved procedures.
  • The reason why the reliability of spatial information differs depending on the elapsed time since its detection is considered to be as follows. For example, suppose the sensor node 15 detects that a traffic cone for lane control has been placed on the roadway due to road construction. At the time of detection by the sensor node 15, the cone very likely also exists in the real world, but as time passes, it does not necessarily still exist there.
  • That is, the reliability of spatial information such as the traffic cone described above changes not only with the elapsed time but also depending on the type of object stored as spatial information.
  • As described above, the reliability of spatial information varies depending on the detection method, the elapsed time since the update time, the type of object stored as spatial information, and so on.
  • A method of calculating the reliability of spatial information is described below.
  • As described above, the format database 14-4 stores the object information of the spatial information together with its update time and detection method.
  • The spatial information reliability decrease rate P defines the degree to which the reliability of the space changes as time elapses from the time the spatial information was updated.
  • P can be a constant.
  • Formula 2, R = 100 − (T × P), where R is the reliability and T is the elapsed time since the update, is based on the idea that the reliability of spatial information is 100% at the moment the spatial information is updated and decreases in proportion to the elapsed time.
  • Alternatively, the spatial information reliability decrease rate P can be treated as a function P(T) with time T as a parameter.
  • It is also desirable to vary the spatial information reliability decrease rate P depending on the object.
  • For example, when road information is stored as spatial information, the spatial information is unlikely to change over time.
  • On the other hand, when information about an easily movable object such as a traffic cone is stored as spatial information, the spatial information is likely to change over time.
  • Note that although formulas 2 and 2' are used here to calculate the reliability of spatial information, the formulas are not limited to these; any other formula that can calculate the reliability of spatial information over time may be used.
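As a rough sketch of Formula 2 with an object-dependent decrease rate P: the decay rates below are invented for illustration (percent of reliability lost per hour), since the patent fixes neither units nor concrete values.

```python
# Illustrative, assumed decrease rates P per object type (percent per hour).
# Roads rarely change, so their reliability decays slowly; an easily moved
# object such as a traffic cone decays quickly.
OBJECT_DECAY_RATE = {
    "road": 0.01,
    "building": 0.05,
    "traffic_cone": 5.0,
}

def spatial_reliability(elapsed_hours: float, object_type: str) -> float:
    """Formula 2 sketch: R = 100 - T * P, clamped to [0, 100].

    P is the spatial information reliability decrease rate, chosen per
    object type; the default rate for unknown objects is an assumption.
    """
    p = OBJECT_DECAY_RATE.get(object_type, 1.0)
    return max(0.0, 100.0 - elapsed_hours * p)
```

Treating P as a function P(T), as Formula 2' suggests, would amount to replacing the constant lookup with a time-dependent call.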
  • In addition, as described above, the reliability of spatial information may vary depending on the method used to detect it.
  • Here, a detection method reliability M is defined as the reliability of the spatial information detection method.
  • The detection method reliability M can be a constant determined for each device according to, for example, the device type of the detecting sensor node 15, its manufacturer, and its date of manufacture, in addition to the detection method itself.
  • Let the detection method reliability M be expressed in percent or the like, similar to the reliability R of the spatial information.
  • It is desirable that the detection method reliability M be varied according to the detection method, device type, manufacturer, and date of manufacture, as described above.
  • For example, the reliability of a detection result is considered higher when an object is detected by a recently manufactured device than when it is detected by a device with an older manufacturing date. This is because a device manufactured long ago may have degraded object detection accuracy due to aging or the like.
  • R = M − (T3 × P) (Formula 5)
  • In this way, the reliability in this embodiment differs depending on both the elapsed time since the spatial information was updated and the detection method of the spatial information.
  • Examples of reliability calculation methods, such as formulas (2), (4), and (5), are shown above, but the method is not limited to these.
  • For example, the initial condition of formula (2) is not limited to 100, and the spatial information reliability decrease rate P need not be a constant value; additional parameters and the like may also be introduced.
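A sketch of Formula 5, in which the initial reliability M depends on the detection method and then decays with elapsed time, consistent with the camera/LiDAR and measurement-standard discussion above. The M values and the default for unknown methods are assumptions for illustration.

```python
# Assumed initial reliability M (percent) per detection method. A
# JIS-standard surveying instrument is taken as most reliable, LiDAR
# (active, works in the dark) above a camera (passive). Values invented.
DETECTION_METHOD_RELIABILITY = {
    "survey_instrument": 100.0,
    "lidar": 90.0,
    "camera": 80.0,
}

def reliability_with_method(elapsed_hours: float, method: str,
                            decay_rate: float = 1.0) -> float:
    """Formula 5 sketch: R = M - (T3 * P), clamped at 0.

    M is the detection method reliability, T3 the elapsed time since the
    update, and P the decrease rate; the default M of 70 for unknown
    methods is an assumption.
    """
    m = DETECTION_METHOD_RELIABILITY.get(method, 70.0)
    return max(0.0, m - elapsed_hours * decay_rate)
```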
  • Next, assume that format route information representing the movement route of the autonomous mobile body in the autonomous mobile body control system has been created as described above in steps S211 to S217 and the like.
  • FIG. 15 is a flowchart illustrating an example of processing for calculating the reliability and determining whether or not movement is permitted. The operation of each step in the flowchart of FIG. 15 is performed by the computer in the control unit of the system control device 10 executing the computer program stored in memory.
  • In step S421, the system control device 10 acquires, from the format database 14-4 of the conversion information holding device 14, the update time and detection method of the spatial information for each unique identifier of the created format route information.
  • In step S422, the system control device 10 calculates the reliability of the spatial information for each unique identifier, based on the time obtained from the RTC 10-6, the update time of the spatial information for each unique identifier, and the detection method, using the formulas described above.
  • In step S423, the system control device 10 determines, based on the calculated reliability of the spatial information, whether or not the autonomous mobile body 12 can move through each space indicated by each unique identifier.
  • The threshold for this determination can be determined according to the type and moving speed of the autonomous mobile body 12. An example of a method for setting the threshold is described below.
  • The autonomous mobile body 12 moves based on the format route information, but emergency control may become necessary during the movement.
  • A mismatch between the spatial information and the state of the real world is one factor that can make such emergency control necessary. For example, when passing through an area where road construction is frequently performed, lane restrictions are likely to change frequently, and a discrepancy between the real world and the spatial information is likely to arise.
  • Since control performance in an emergency, such as turning performance, is lower when the autonomous mobile body 12 moves at high speed, the threshold for the determination is set higher for high-speed movement than for low-speed movement.
  • In step S423, the reliability calculated in step S422 is compared with a threshold value, and it is desirable to change this threshold for determining whether movement can continue according to the speed of the mobile body or its type (turning performance, etc.).
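One hypothetical way to realize a speed- and type-dependent threshold for step S423: demand more reliable spatial information when the mobile body is fast or turns poorly. The base values, the per-speed increment, and the turning-performance classes are all assumptions, not values from the patent.

```python
def movement_threshold(speed_mps: float, turning_class: str = "standard") -> float:
    """Illustrative threshold policy for step S423.

    A mobile body with worse turning performance, or moving faster, needs
    a higher reliability to be allowed through a space. All constants are
    invented for illustration.
    """
    base = {"agile": 40.0, "standard": 50.0, "sluggish": 60.0}[turning_class]
    # Faster movement -> higher required reliability, capped at 100 %.
    return min(100.0, base + 2.0 * speed_mps)

def may_enter(reliability: float, speed_mps: float,
              turning_class: str = "standard") -> bool:
    """Step S423 sketch: compare the calculated reliability with the threshold."""
    return reliability >= movement_threshold(speed_mps, turning_class)
```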
  • In step S424, if movement is not possible, the system control device 10 regenerates and acquires route information again in the sequence described above in steps S205 to S209. Thereafter, in step S423, the system control device 10 determines again, based on the regenerated format route information, whether the autonomous mobile body 12 can move through each space indicated by each unique identifier.
  • When it is determined in step S423 that movement is possible for all unique identifiers, the system control device 10 stores the format route information in step S425, linking it to the unique identification number assigned to the autonomous mobile body 12.
  • The autonomous mobile body 12 monitors (hereinafter, polls) its own unique identification number via the network at predetermined time intervals, and the linked format route information is downloaded to the autonomous mobile body 12.
  • The system control device 10 then reflects the latitude/longitude information of each unique identifier of the format route information in the route information of the three-dimensional map of the autonomous mobile body 12.
  • FIG. 16 is a sequence diagram illustrating an example of processing for calculating reliability and correcting route information. Each step of the sequence shown in FIG. 16 is performed by the computers in the control units of devices 10 and 12 to 15 executing the computer program stored in memory.
  • In step S431, the control unit 12-2 of the autonomous mobile body 12 calculates its current position and traveling direction using a self-position detection function such as GPS and a direction detection function such as a geomagnetic sensor mounted on the detection unit 12-1. Subsequently, in step S432, the unique identifier of the planned destination is calculated from the calculated self-position and traveling direction. In step S433, the autonomous mobile body 12 transmits this unique identifier to the system control device 10 at predetermined intervals.
  • In step S434, the system control device 10 forwards the transmitted unique identifier to the conversion information holding device 14. In step S435, the conversion information holding device 14 acquires the update time and detection method of the spatial information corresponding to the transmitted unique identifier from the format database 14-4. Then, in step S436, it sends the update time and detection method of the spatial information of the object to the system control device 10.
  • In step S437, the system control device transmits the received update time and detection method of each piece of spatial information to the autonomous mobile body 12.
  • In step S438, the autonomous mobile body 12 calculates the reliability of the cost map at the planned movement position based on the time acquired from the RTC 12-7, the update time of the spatial information for the acquired unique identifier, and the detection method.
  • This calculation may be performed using the formulas described above.
  • In step S439, the route information is corrected according to the calculated reliability of the cost map.
  • Alternatively, the moving speed of the autonomous mobile body 12 may be changed according to the calculated reliability of the cost map.
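A hedged sketch of the speed adjustment just described: scale the planned speed with the cost-map reliability and stop below a floor. The linear scaling and the floor value are illustrative choices, not prescribed by the patent.

```python
def adjust_speed(nominal_speed: float, reliability: float,
                 min_reliability: float = 30.0) -> float:
    """Step S439 sketch: change the moving speed according to the
    calculated cost-map reliability (percent).

    Below an assumed floor the body stops and waits for fresher spatial
    information; otherwise the speed scales linearly with reliability.
    """
    if reliability < min_reliability:
        return 0.0  # too unreliable: stop rather than proceed blindly
    return nominal_speed * (reliability / 100.0)
```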
  • In the above, the system control device 10 or the autonomous mobile body 12 calculates the reliability of the spatial information.
  • Alternatively, the conversion information holding device 14 may calculate the reliability information of the corresponding space at the timing when the spatial information update time and detection method are stored in the format database 14-4, and store it in the format database 14-4.
  • Movement control of the autonomous mobile body 12 may then be performed based on the reliability information stored in the format database 14-4. This is described in detail below.
  • FIG. 17 is a flowchart illustrating an example of processing for storing reliability. The operation of each step in the flowchart of FIG. 17 is performed by the computer in the control section of the conversion information holding device 14 executing the computer program stored in memory.
  • The method for storing the spatial information update time and detection method in the format database 14-4 may be as described in steps S401 to S407 and steps S411 to S414.
  • In step S441, the conversion information holding device 14 calculates the reliability of the spatial information for a unique identifier based on the time acquired from the RTC 14-7, the update time of that spatial information, and the detection method.
  • This calculation may be performed using the formulas described above.
  • In step S442, the conversion information holding device 14, as formatting means, formats the calculated reliability of the spatial information in association with the unique identifier corresponding to the space, and saves (stores) it in the format database 14-4.
  • Note that the reliability of the spatial information may be updated as needed as time passes. In that case, as described above, the reliability of the spatial information for the unique identifier is calculated based on the time obtained from the RTC 14-7, the update time of that spatial information, and the detection method.
  • In step S443, the conversion information holding device 14 determines whether the spatial information in the format database 14-4 has been updated by information input from an external system or the like communicatively connected to the conversion information holding device 14. If so, the process proceeds to step S444, and the conversion information holding device 14 calculates the reliability of the spatial information in the same manner as described above. Let Rc be the reliability calculated here.
  • In step S445, the conversion information holding device 14 compares the reliability Rb of the already stored spatial information with the newly calculated reliability Rc, and if Rc ≥ Rb, the process proceeds to step S446.
  • In step S446, the conversion information holding device 14 associates (links) the newly calculated reliability Rc of the spatial information with the unique identifier corresponding to the space, together with the spatial information, and stores it in the format database 14-4.
  • The reliabilities compared here are those of spatial information of the same element, such as map information with map information or vehicle information with vehicle information. If Rc < Rb in step S445, the conversion information holding device 14 does not update the spatial information, and accordingly does not update its reliability. In this way, highly reliable spatial information can be kept in the format database 14-4.
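The compare-and-store logic of steps S443 to S446 (overwrite only when the new reliability Rc is at least the stored Rb) can be sketched as follows; the dictionary layout is an assumption.

```python
def maybe_update(stored: dict, uid: str, new_info: dict, rc: float) -> bool:
    """Steps S443-S446 sketch: overwrite the spatial information for a
    unique identifier only when the newly calculated reliability Rc is at
    least the stored reliability Rb. Returns True if the store changed."""
    rb = stored.get(uid, {}).get("reliability", -1.0)  # Rb; -1 if absent
    if rc >= rb:
        stored[uid] = {"info": new_info, "reliability": rc}
        return True
    return False  # keep the more reliable existing record
```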
  • FIG. 18 is a flowchart for explaining an example of processing for acquiring reliability and determining whether or not to move.
  • The operation of each step in the flowchart of FIG. 18 is performed by the computer in the control section of the system control device 10 executing the computer program stored in memory.
  • In step S451, the system control device 10 acquires the reliability of the spatial information for each unique identifier of the created format route information from the format database 14-4 of the conversion information holding device 14. Then, in step S452, the system control device 10 determines, based on the acquired reliability, whether the autonomous mobile body 12 can move through each space indicated by each unique identifier.
  • The threshold for this determination can be determined according to the type and moving speed of the autonomous mobile body 12.
  • In step S453, if movement is not possible, the system control device 10 acquires route information again in the sequence described above in steps S205 to S209. Then, in step S452, the system control device 10 determines again, based on the regenerated format route information, whether the autonomous mobile body 12 can move through each space indicated by each unique identifier.
  • When the system control device 10 determines in step S452 that movement is possible for all unique identifiers, it stores the format route information in step S454, linking it to the unique identification number assigned to the autonomous mobile body 12.
  • The autonomous mobile body 12 monitors (polls) its own unique identification number via the network at predetermined intervals, and in step S455 the system control device 10 downloads the linked data to the autonomous mobile body 12. Then, in step S456, the system control device 10 reflects the latitude/longitude information of each unique identifier of the format route information in the route information of the three-dimensional map of the cyberspace created by the autonomous mobile body 12.
  • FIG. 19 is a sequence diagram illustrating an example of processing for acquiring reliability and correcting route information.
  • Each step of the sequence in FIG. 19 is performed by the computers in the control units of devices 10 and 12 to 15 executing the computer program stored in memory. Note that this method is partially similar to the content described with reference to FIG. 16.
  • In step S461, the control unit 12-2 of the autonomous mobile body 12 calculates its current position and traveling direction using a self-position detection function such as GPS and a direction detection function such as a geomagnetic sensor mounted on the detection unit 12-1.
  • In step S462, the autonomous mobile body 12 calculates the unique identifier of the planned destination. Furthermore, in step S463, the autonomous mobile body 12 transmits this unique identifier to the system control device 10 at predetermined intervals.
  • The system control device 10 forwards the received unique identifier to the conversion information holding device 14 in step S464.
  • The conversion information holding device 14 then acquires the reliability of the spatial information corresponding to the transmitted unique identifier from the format database 14-4 and sends that reliability to the system control device 10.
  • In step S467, the system control device transmits the received reliability of each piece of spatial information to the autonomous mobile body 12.
  • In this way, highly reliable information can be used to stably control the movement of the autonomous mobile body 12 in accordance with the state of the real-world space.
  • Note that the mobile body of this embodiment is not limited to an autonomous mobile body such as an AGV (Automatic Guided Vehicle) or an AMR (Autonomous Mobile Robot).
  • It can be any mobile device that moves, such as an automobile, train, ship, airplane, robot, or drone.
  • Part of the control system of the present embodiment may or may not be mounted on those mobile bodies.
  • This embodiment can also be applied to remote control of a mobile body.
  • The present invention may also be realized by supplying a storage medium recording software program code (a control program) that implements the functions of the above-described embodiments to a system or apparatus, and having the computer (or CPU or MPU) of that system or apparatus read and execute the computer-readable program code stored in the storage medium. In that case, the program code itself read from the storage medium implements the functions of the above-described embodiments, and the storage medium storing the program code constitutes the present invention.

Abstract

A control system can make three-dimensional spatial information usable by more users and comprises formatting means that assigns a unique identifier to a three-dimensional space defined by latitude, longitude, and height, and formats and stores spatial information relating to the state of an object existing in the space and to a time, in association with the unique identifier. The formatting means formats and stores information relating to the reliability of the spatial information in association with the unique identifier.
PCT/JP2023/002623 2022-02-01 2023-01-27 Système de commande, procédé de commande et support de stockage WO2023149370A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2022014166 2022-02-01
JP2022-014166 2022-02-01
JP2022101880 2022-06-24
JP2022-101880 2022-06-24
JP2023-000580 2023-01-05
JP2023000580A JP2023112666A (ja) 2022-02-01 2023-01-05 制御システム、制御方法、及びコンピュータプログラム

Publications (1)

Publication Number Publication Date
WO2023149370A1 true WO2023149370A1 (fr) 2023-08-10

Family

ID=87552395

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/002623 WO2023149370A1 (fr) 2022-02-01 2023-01-27 Système de commande, procédé de commande et support de stockage

Country Status (1)

Country Link
WO (1) WO2023149370A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003302898A (ja) * 2002-04-11 2003-10-24 Hitachi Ltd Map analysis device and program for implementing it
JP2018106504A (ja) * 2016-12-27 2018-07-05 Toyota Central R&D Labs., Inc. Information management control device and information management control program
JP2019502214A (ja) * 2015-11-04 2019-01-24 Zoox Inc. Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes
JP2020038360A (ja) * 2018-08-31 2020-03-12 Denso Corp. Vehicle-side device, method, and storage medium
JP2020095336A (ja) * 2018-12-10 2020-06-18 Subaru Corp. Automated driving support device
WO2020188671A1 (fr) * 2019-03-18 2020-09-24 Mitsubishi Electric Corp. Map information correction device, mobile body, map information creation device, map information correction system, map information correction method, and map information correction program
JP2021026693A (ja) * 2019-08-08 2021-02-22 Denso Corp. Display control device, display control method, and display control program
JP2021103091A (ja) * 2019-12-24 2021-07-15 Toyota Motor Corp. Route search system

Similar Documents

Publication Publication Date Title
CN110914777B (zh) High-definition map and route storage management system for autonomous vehicles
EP3696511A1 (fr) Method, apparatus, and system for providing a campaign management platform to discover map data for updating map data
EP3696510A2 (fr) Method, apparatus, and system for providing a campaign management platform to discover map data
EP3628085B1 (fr) Map uncertainty and observation modeling
EP3696509B1 (fr) Method, apparatus, and system for providing a campaign management platform to validate map data
EP3848728B1 (fr) Method, apparatus, and system for selecting a base station for differential positioning
EP3696767B1 (fr) Method, apparatus, and system for providing an interface for publishing sensor data requests in a campaign management platform
WO2023149370A1 (fr) Control system, control method, and storage medium
JP2023112666A (ja) Control system, control method, and computer program
WO2023149353A1 (fr) Control system, control method, and recording medium
WO2023149264A1 (fr) Control system, control method, and storage medium
WO2023149358A1 (fr) Control system, control method, and storage medium
WO2023149376A1 (fr) Control system, control method, and storage medium
WO2023149288A1 (fr) Information processing device, information processing method, and storage medium
WO2023149373A1 (fr) Control system, control method, and storage medium
WO2023149349A1 (fr) Control system, control method, and storage medium
WO2023149292A1 (fr) Information processing system, control method, and storage medium
WO2023149281A1 (fr) Control system, control method, and storage medium
JP2023112669A (ja) Control system, control method, and computer program
WO2023149346A1 (fr) Information processing device, control system, control method, and storage medium
JP2023112658A (ja) Control system, control method, and computer program
WO2023149308A1 (fr) Control system, control method, and recording medium
JP2023112670A (ja) Control system, control method, and computer program
JP2023112665A (ja) Control system, control method, and computer program
JP2023112672A (ja) Information processing device, information processing method, and computer program

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23749686

Country of ref document: EP

Kind code of ref document: A1