WO2017057053A1 - Information processing device, information processing method - Google Patents

Information processing device, information processing method

Info

Publication number
WO2017057053A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
vehicle
route
projection
information
Prior art date
Application number
PCT/JP2016/077426
Other languages
French (fr)
Japanese (ja)
Inventor
貝野 彰彦
辰吾 鶴見
江島 公志
嵩明 加藤
福地 正樹
卓 青木
琢人 元山
章 中村
Original Assignee
Sony Corporation
Priority date
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2017057053A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/14 Traffic control systems for road vehicles indicating individual free spaces in parking areas

Definitions

  • This technology relates to an information processing apparatus and an information processing method, and more particularly to an information processing apparatus and an information processing method that provide support when guiding a moving object to a destination.
  • A parking support device that supports a parking operation by calculating a parking route for a vehicle has been proposed.
  • The parking support system described in Patent Document 1 below sets an index that serves as a landmark in the environment where parking is to be performed, and performs parking support by guiding the vehicle based on the distance and direction from the index.
  • In Patent Document 2, it has been proposed to detect the surroundings of the host vehicle in order to find a marker registered in map information, and to estimate the current position of the host vehicle based on the relative positional relationship between the detected marker and the host vehicle.
  • JP 2008-087676 A; JP 2015-083430 A
  • In Patent Document 2, a fixed marker is set in advance, and processing is performed by detecting that marker.
  • A marker installed in advance may, however, be hidden by a parked vehicle and become undetectable.
  • In Patent Document 2, a shift due to environmental conditions may occur, so the position of the vehicle cannot be detected accurately and parking support cannot be performed appropriately.
  • The present technology has been made in view of such a situation, and makes it possible to detect the position of a vehicle accurately and to guide it appropriately to a predetermined place.
  • A first information processing apparatus includes a projection control unit that controls a plurality of projection apparatuses that project a predetermined pattern, and a path creation unit that creates a path along which the moving body moves to a destination.
  • the projection control unit controls the projection of the projection device located on the path created by the path creation unit.
  • The projection control unit can set the pattern projected by a projection apparatus installed at a predetermined position to a marker pattern, and the pattern projected by a projection apparatus installed at any other position to a random pattern.
  • the predetermined position may be at least one of a corner or the vicinity of the destination.
  • The route creation unit can create a route along which the moving body can move to the destination, using at least one of the width, length, height, and minimum turning radius of the moving body.
  • the destination may be a parking space
  • the route creation unit may create the route up to a position where the moving body starts an approach for parking in the parking space.
  • A holding unit that holds a global map indicating the positions in three-dimensional space of feature points in a predetermined area, a local map generation unit that generates a local map indicating the positions in three-dimensional space of feature points in an image captured from the moving body, and a global map update unit that updates the global map based on the local map may be further provided.
  • the route creation unit can create a route with reference to the global map held in the holding unit.
  • The path created by the path creation unit, marker information of the marker pattern, and the global map indicating the positions of feature points in a predetermined area can be supplied to the moving body.
  • A first information processing method includes steps of controlling a plurality of projection devices that project a predetermined pattern and creating a path along which the moving body moves to a destination, and the projection is controlled by controlling the projection of the projection devices located on the created path.
  • A second information processing apparatus includes an imaging unit that captures a predetermined pattern projected by a projection device, an estimation unit that estimates the self-position using the pattern captured by the imaging unit, and a control signal generation unit that, based on a route supplied from another device and the estimated self-position, generates a control signal for controlling each unit so as to move along the route.
  • A local map generation unit that generates a local map indicating the positions in three-dimensional space of feature points in the image captured by the imaging unit may be further provided.
  • A holding unit that holds a global map indicating the positions in three-dimensional space of feature points in a predetermined area supplied from the other device, and a determination unit that determines whether or not the position corresponding to the local map has changed in the global map, may be further provided; the local map is supplied to the other device when the determination unit determines that there has been a change.
  • the route can be corrected with the local map.
  • the self-position estimated by the estimation unit can be corrected.
  • the control signal may be a signal that notifies the user of the route.
  • A second information processing method includes capturing a predetermined pattern projected by a projection device, estimating the self-position using the captured pattern, and generating, based on a route supplied from another device and the estimated self-position, a control signal for controlling each part so as to move along the route.
  • In the first information processing apparatus and information processing method, a plurality of projection apparatuses that project a predetermined pattern are controlled, and a route along which the moving body moves to the destination is created. The projection is controlled by controlling the projection of the projection devices located on the created route.
  • In the second information processing apparatus and information processing method, an image of a predetermined pattern projected by the projection device is captured, the self-position is estimated using the captured pattern, and a control signal for controlling each part so as to move along the route is generated based on a route supplied from another device and the estimated self-position.
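The vehicle-side flow just described (capture the pattern, estimate the self-position, generate a control signal to move along the supplied route) can be sketched roughly as below. This is an illustrative assumption, not the patent's implementation: the function and parameter names are invented, and the self-position estimation step is reduced to a given (x, y) input.

```python
import math

def next_control_signal(self_position, route, reach_radius=0.5):
    """Given an estimated self-position (x, y) and a route as a list of
    (x, y) waypoints, return the heading (radians) toward the first
    waypoint not yet reached, or None when the route is complete."""
    x, y = self_position
    for wx, wy in route:
        if math.hypot(wx - x, wy - y) > reach_radius:
            # Steer toward the first waypoint that is still ahead.
            return math.atan2(wy - y, wx - x)
    return None  # destination (e.g. the guided parking space) reached

# Example: vehicle at the origin, route heading straight up the aisle.
heading = next_control_signal((0.0, 0.0), [(0.0, 5.0), (5.0, 5.0)])
```

A real control signal generation unit would translate this heading into steering and speed commands for each part of the vehicle, or into a notification to the driver.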
  • 1. Parking assistance; 2. Configuration of parking support system; 3. Operation of parking support system; 4. Configuration of parking assist device; 5. Operation of parking assist device; 6. Configuration of parking assist vehicle side device; 7. Operation of parking assist vehicle side device; 8. About recording media
  • the present technology can be applied to a system that guides a vehicle to an empty space and assists in parking in a parking lot.
  • For example, in a parking lot 10 as shown in FIG. 1, the present technology can be applied to a system in which, as shown in FIG. 2, a vehicle 20 entering the parking lot 10 is guided to an empty space along the route indicated by the arrow.
  • an entrance gate 11 is installed in the parking lot 10, and a parking space is indicated by a white line (black line in the figure).
  • The description will be continued assuming that the entrance gate 11 is installed and the vehicle 20 stops temporarily at the entrance gate 11; however, the technology described below can also be applied to a parking lot where no entrance gate 11 or temporary stopping place is provided.
  • a parking space that is considered appropriate is identified for the vehicle 20 in consideration of the environment at that time, and a route to the parking space is generated.
  • the vehicle 20 travels based on the generated route, moves to the parking space, and parks.
  • The description will be continued taking the vehicle 20 as an example, but the present technology is not limited to the vehicle 20 and can be applied to other moving bodies (moving objects).
  • FIG. 3 is a diagram illustrating a configuration of an embodiment of the parking assistance system.
  • the parking assistance system shown in FIG. 3 includes a parking assistance device 51, an imaging device 52, a projection device 53, a network 61, a database 71, and a parking assistance vehicle side device 81.
  • The parking support device 51 is a device that manages the parking lot 10 shown in FIG. 1. Although details will be described later, the parking assistance device 51 creates a route, provides the created route to the vehicle 20, and holds and updates a global map.
  • the imaging device 52 is installed near the entrance gate 11 (FIG. 1) and images the vehicle 20 that has entered the parking lot 10.
  • the parking assist device 51 acquires information about the vehicle such as the vehicle width, the vehicle length, the vehicle height, and the minimum turning radius of the vehicle 20 imaged by the imaging device 52 (hereinafter referred to as vehicle information as appropriate).
  • the database 71 connected to the network 61 can be referred to.
  • The network 61 is composed of a WAN (Wide Area Network), a LAN (Local Area Network), or the like, and the parking support device 51 and the database 71 are configured to exchange information (data) with each other via the network 61.
  • the database 71 is a database that manages information related to the vehicle 20 as described above.
  • Although a configuration in which the database 71 is connected to the network 61 is shown, the database 71 may instead be provided inside the parking assistance device 51.
  • information on the vehicle 20 is acquired with reference to the database 71, but information on the vehicle 20 may be acquired by other methods.
  • information such as the vehicle width and the vehicle height may be acquired by analyzing an image captured by the imaging device 52.
  • Projection device 53 is also connected to parking support device 51.
  • the projection device 53 projects a predetermined pattern onto an object in the parking lot 10 such as a floor or a parked vehicle based on an instruction from the parking support device 51.
  • a plurality of projection devices 53 are installed in the parking lot 10. Further, when the parking lot 10 is an indoor parking lot, the projection device 53 is installed on the ceiling and projects a predetermined pattern on the floor or the vehicle.
  • the vehicle 20 includes an imaging device (camera) that images the pattern projected by the projection device 53.
  • the parking assistance vehicle side device 81 analyzes the pattern imaged by the imaging device provided in the vehicle 20, and estimates the self position and creates a local map (details will be described later).
  • a plurality of projection devices 53 are installed in the parking lot 10 as described above.
  • an ID is assigned to each of the projection devices 53 installed in the parking lot 10, and each projection device 53 can be identified by the ID.
  • the description will be continued assuming that 21 projection devices 53 are installed in the parking lot 10 and are assigned IDs.
  • projectors 53 are assigned ID1 to ID21, respectively.
  • The projection device 53 assigned ID1 is referred to as the ID1 projection device 53, the projection device 53 assigned ID2 as the ID2 projection device 53, and the other projection devices 53 likewise, to distinguish the individual projection devices 53 in the parking lot 10.
  • the ID1 projection device 53 is installed in the vicinity of the entrance gate 11, and the ID2 projection device 53 is installed on the right side of the ID1 projection device 53.
  • an IP address or the like can also be used as information for identifying each projection device 53.
  • a number is assigned to the parking space for one vehicle.
  • numbers 1 to 48 are assigned. That is, 48 parking spaces are provided in the parking lot 10.
  • the parking space 1 is a parking space located near the entrance gate 11 in the parking lot 10 of FIG. 4.
  • Other parking spaces are listed as well.
  • Here, the parking space 37 is set as the destination parking space, and a route like the arrow is set as the route from the entrance gate 11 to the parking space 37.
  • On the path of the arrow, the ID1, ID2, ID3, ID7, and ID11 projection devices 53 are installed, and the ID12 projection device 53 is installed in the vicinity of the parking space 37.
  • the parking assistance device 51 gives instructions for projecting predetermined patterns to the projection devices 53 of ID1, ID2, ID3, ID7, ID11, and ID12.
  • the projection device 53 is installed on the ceiling 91 and projects a predetermined pattern onto the floor 92.
  • the projection device 53 of ID1 and the projection device 53 of ID2 are shown.
  • The projection devices 53 may be installed so that the projection ranges of the adjacent ID1 and ID2 projection devices 53 overlap, or so that they do not overlap.
  • the projected pattern may be visible light, or may be other than visible light, such as infrared rays.
  • Basically, the pattern projected from a projection device 53 is a random pattern, but the pattern projected from a projection device 53 installed at a specific position is a marker pattern.
  • the projection device 53 that projects the marker pattern can be a projection device 53 installed at a corner.
  • The projected pattern is captured by the imaging device of the vehicle 20, and the parking assist vehicle-side device 81 detects feature points from the captured image and performs analysis such as matching, thereby estimating the self-position or creating a local map.
  • the parking assist vehicle-side device 81 can easily estimate the self position.
  • If the pattern projected from the projection devices 53 were a pattern with a predetermined shape such as a lattice, and the same lattice pattern were projected from both the ID1 and ID2 projection devices 53, it would be difficult for the parking assist vehicle-side device 81 to determine whether a captured pattern was projected by the ID1 projection device 53 or by the ID2 projection device 53, and an error could occur in self-position estimation or local map creation.
  • By making the pattern from each projection device 53 different, the occurrence of such errors when the parking assist vehicle-side device 81 estimates its own position or creates the local map can be suppressed.
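One simple way to make the pattern from each projection device 53 different, as described above, is to seed a random generator with the projector's ID, so that a captured patch can be matched back to exactly one projector. The sketch below is an illustrative assumption, not the patent's actual pattern generator; all names are invented.

```python
import random

def random_pattern(projector_id, n_dots=50, grid=32):
    """Generate a reproducible random dot pattern for one projector.
    Seeding with the projector ID makes the pattern deterministic per
    device and different between devices, so matching a captured patch
    identifies which projector produced it."""
    rng = random.Random(projector_id)  # deterministic per ID
    return {(rng.randrange(grid), rng.randrange(grid)) for _ in range(n_dots)}

# One distinct pattern for each of the 21 projectors, ID1 to ID21.
patterns = {pid: random_pattern(pid) for pid in range(1, 22)}
```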
  • a random pattern and a marker pattern are provided on the pattern projected from the projection device 53.
  • the marker pattern is used for correction of self-position estimation.
  • the marker pattern is a pattern projected by the projection device 53 installed at a corner.
  • The projection devices 53 of ID1, ID3, ID5, ID9, ID11, ID13, ID17, ID19, and ID21 are installed at the corners, and these projection devices 53 project the marker pattern. The projection devices 53 other than these project a random pattern.
  • When the parking assist vehicle-side device 81 images the marker pattern, it corrects the estimated self-position using the information obtained from the marker pattern. The projection device 53 installed near (in the vicinity of) the parking space to be guided to may also project a marker pattern; for example, when the parking space 37 is the guided parking space, the ID12 projection device 53 may project a marker pattern instead of a random pattern. When configured in this way, the ID12 projection device 53 switches between projecting a random pattern and a marker pattern.
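The correction using the marker pattern can be sketched as follows, under the simplifying assumptions that the vehicle's orientation is known and that the marker's world position is registered in the map. All names are illustrative, not taken from the patent.

```python
def correct_self_position(estimated_pos, marker_world_pos, marker_observed_offset):
    """Correct a drifting self-position estimate using a marker pattern.
    marker_world_pos: known (x, y) of the marker in the parking-lot map.
    marker_observed_offset: (dx, dy) of the marker as measured from the
    vehicle by imaging the projected marker pattern.
    The vehicle must actually be at the marker position minus the observed
    offset; the accumulated drift is the gap to the current estimate."""
    mx, my = marker_world_pos
    dx, dy = marker_observed_offset
    corrected = (mx - dx, my - dy)
    drift = (corrected[0] - estimated_pos[0], corrected[1] - estimated_pos[1])
    return corrected, drift

# Example: estimate has drifted half a metre from the true position.
corrected, drift = correct_self_position((10.0, 4.0), (12.0, 6.0), (2.5, 2.5))
```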
  • the projection device 53 can be configured to project only one of the random pattern and the marker pattern, or can be configured to switch and project the random pattern or the marker pattern as necessary. You can also.
  • The projection device 53 is installed on the ceiling 91 so as to project a predetermined pattern onto the floor 92.
  • Because the pattern is projected from above, it can reach the road surface on which the vehicle 20 travels without being blocked by, or affected by, shielding objects.
  • the parking assist vehicle-side device 81 can capture and analyze a pattern projected on an object such as a parked vehicle.
  • Therefore, the parking assist vehicle-side device 81 can create a local map that also takes objects in the parking lot 10 into consideration, estimate its own position, and estimate the route to be taken in consideration of those objects. For example, it can estimate a route that travels around objects that are not always present in the parking lot 10, such as a vehicle protruding into the traveling area rather than a parking space, or luggage or a person.
  • If a predetermined pattern were drawn on the floor or a pillar instead of being projected by the projection device 53, and the self-position were estimated by imaging that pattern, the drawn pattern could be blocked by a parked vehicle, an object, or a person and fail to be imaged. A drawn pattern may also disappear due to aging. For these reasons, when the self-position is estimated using a drawn pattern, there is a high possibility that it cannot be estimated correctly.
  • By projecting the pattern with the projection device 53, the pattern is prevented from disappearing due to deterioration over time. Furthermore, by projecting the pattern from the ceiling with the projection device 53, in other words projecting from above, it is not only possible to prevent the pattern from being blocked by a parked vehicle, an object, or a person, but also possible to detect such vehicles, objects, and people, so the self-position and the route can be estimated with higher accuracy.
  • Unlike the case of using GPS (Global Positioning System) communication or an IMU (Inertial Measurement Unit), the position can be estimated accurately even indoors. Moreover, since the parking assistance vehicle side device 81 can perform the processing described later, for example the process of correcting the estimated position using the marker pattern, the position can be estimated with high accuracy.
  • As described above, the parking assist vehicle-side device 81 captures the projected pattern with the imaging device of the vehicle 20, detects feature points from the captured pattern, and performs analysis such as matching to estimate the self-position. It is also conceivable to estimate the self-position by imaging a structure in the parking lot 10 rather than the pattern projected from the projection device 53, detecting feature points from the image of the structure, and performing analysis such as matching.
  • However, the structures in the parking lot 10, such as floors and pillars, have poor texture, and it is difficult to estimate the self-position accurately and stably by acquiring and analyzing images of such structures.
  • In step S11, the parking assistance device 51 acquires the vehicle information of the vehicle 20 that has entered.
  • In step S12, the parking assistance device 51 generates a route.
  • The route is also generated using the vehicle information of the vehicle 20 acquired in step S11. For example, a parking space in which the vehicle can park is searched for in consideration of the vehicle width, and a route to the found parking space that a vehicle of that width can pass through is searched for.
  • In step S13, control of the projection of a predetermined pattern by the projection devices 53 is started.
  • the projection device 53 on the generated path is instructed to start projection.
  • In step S14, the generated route is provided to the parking assist vehicle-side device 81.
  • Information provided to the parking assist vehicle-side device 81 is information on a generated route (hereinafter referred to as route information), a global map, and the like.
  • In step S21, the parking assistance vehicle side device 81 acquires the route information provided from the parking assistance device 51. Then, the parking assistance vehicle side device 81 controls each part of the vehicle 20 and starts estimating the self-position so that the vehicle can travel along the route.
  • the parking assistance vehicle side device 81 creates a local map.
  • The local map is a map of the environment around the parking assist vehicle-side device 81 (vehicle 20). For example, it is information on the environment in which the parking assist vehicle-side device 81 (vehicle 20) is traveling, such as at what position and at what angle a parked vehicle is stopped.
  • the local map is a map that is used when the self-position is estimated or when the route is corrected.
  • In step S24, the parking assistance vehicle side device 81 provides the created local map to the parking assistance device 51 as necessary.
  • In step S15, the parking assistance device 51 receives the local map provided from the parking assistance vehicle side device 81. Then, in step S16, the parking assistance device 51 updates the global map that it manages, as needed.
  • FIG. 7 is a diagram illustrating an internal configuration of the parking assist device 51.
  • The parking assist device 51 is configured to include an entrance vehicle detection unit 101, a vehicle information acquisition unit 102, a projection control unit 103, a pattern holding unit 104, a route creation unit 105, a global map holding unit 106, a global map update unit 107, and a communication unit 108.
  • the entrance vehicle detection unit 101 detects the vehicle 20 that has entered the parking lot 10.
  • the description is continued on the assumption that the image captured by the imaging device 52 is supplied to the entrance vehicle detection unit 101 and the vehicle 20 entering the parking lot 10 is detected based on the supplied image.
  • the vehicle information acquisition unit 102 acquires information on the vehicle 20 that has entered the parking lot 10, such as information on the vehicle width, vehicle height, minimum turning radius, and the like (hereinafter referred to as vehicle information).
  • Projection control unit 103 controls projection device 53.
  • The projection control unit 103 controls the start of projection by the projection devices 53 based on a control signal from the entrance vehicle detection unit 101, and controls projection by the projection devices 53 based on information such as the route from the route creation unit 105.
  • the pattern holding unit 104 holds, for example, the ID of the projection device 53 and the pattern (image) to be projected in association with each other.
  • The projection control unit 103 reads from the pattern holding unit 104 the patterns associated with the IDs of the projection devices 53 located on the path created by the route creation unit 105, and supplies each pattern to the corresponding projection device 53 on the path.
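The pattern selection performed by the projection control unit 103 — marker patterns for corner projectors (and, optionally, for the one near the destination parking space), random patterns for the rest — can be sketched as below. This is an illustrative sketch, not the patent's implementation; the names are invented, and the corner IDs are those given for FIG. 4.

```python
MARKER_IDS = {1, 3, 5, 9, 11, 13, 17, 19, 21}  # corner projectors per FIG. 4

def patterns_for_route(route_projector_ids, destination_projector_id=None):
    """Decide, for each projector on the created route, which pattern it
    should project: marker for corner projectors and for the projector
    near the destination parking space, random for all others."""
    commands = {}
    for pid in route_projector_ids:
        if pid in MARKER_IDS or pid == destination_projector_id:
            commands[pid] = "marker"
        else:
            commands[pid] = "random"
    return commands

# Route of FIG. 4: ID1, ID2, ID3, ID7, ID11, plus ID12 near parking space 37.
cmds = patterns_for_route([1, 2, 3, 7, 11, 12], destination_projector_id=12)
```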
  • The route creation unit 105 uses the vehicle information acquired by the vehicle information acquisition unit 102 to determine a parking space suitable for the entering vehicle 20 to park in and a route from the entrance gate 11 to that parking space, referring to the global map held in the global map holding unit 106.
  • the global map holding unit 106 holds a global map.
  • the global map update unit 107 updates the global map stored in the global map storage unit 106 with reference to the local map supplied from the parking assistance vehicle side device 81 via the communication unit 108 as necessary.
  • the communication unit 108 communicates with the parking assistance vehicle side device 81.
  • the communication unit 108 performs communication when the route information created by the route creation unit 105, the global map held by the global map holding unit 106, and the like are provided to the parking assistance vehicle side device 81.
  • the communication unit 108 performs communication when supplying a local map or the like supplied from the parking assistance vehicle side device 81 to the global map update unit 107 or the route creation unit 105.
  • the global map is a map showing the position in a three-dimensional space of a stationary object within a predetermined wide area.
  • the global map includes information indicating the position and feature amount of a feature point of a stationary object in a predetermined region on a three-dimensional spatial coordinate system.
  • the spatial coordinate system is represented by, for example, latitude, longitude, and height from the ground.
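A minimal sketch of one global map entry, following the description above (position of a stationary object's feature point on a three-dimensional spatial coordinate system, plus its feature amount). The field names and sample values are illustrative assumptions, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class FeaturePoint:
    """One global map entry: the 3-D position of a feature point of a
    stationary object, expressed as latitude, longitude, and height
    from the ground, together with its feature amount (descriptor)."""
    latitude: float
    longitude: float
    height: float     # height from the ground, in metres
    descriptor: tuple  # feature amount used for matching

# A tiny global map with two hypothetical feature points.
global_map = [
    FeaturePoint(35.6000, 139.7000, 0.0, (0.1, 0.9)),  # e.g. a white-line corner
    FeaturePoint(35.6001, 139.7001, 2.5, (0.8, 0.2)),  # e.g. a pillar edge
]
```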
  • The global map has a base map (referred to as the base global map), and the global map is updated as appropriate according to the environment.
  • the global map is a map of the parking lot 10
  • The base global map is a planar map as shown in FIG. That is, the base global map is a map of the parking lot 10 when no vehicles are parked.
  • the base global map can be, for example, a map created using a design drawing at the time of designing the parking lot 10.
  • the base global map is updated by the global map update unit 107.
  • the update is performed when a change occurs in the environment (situation) in the parking lot 10 such as an update reflecting a newly parked vehicle or an update reflecting a vehicle leaving the parking lot 10.
  • the base global map is updated to a map as shown in FIG. 2, for example.
  • the map shown in FIG. 2 is a state in which a plurality of vehicles are parked in the parking lot 10.
  • Referring to the parking space numbers in FIG. 4, the map shown in FIG. 2 is a map in which vehicles are parked in the parking space 1, the parking space 3, the parking space 13, the parking space 25, the parking space 30, and the parking space 33.
  • the parking assist vehicle side device 81 creates a local map while traveling along the designated route in the parking lot 10.
  • the local map is a map indicating the position of a stationary object around the vehicle 20 in the three-dimensional space, and is generated by the parking assistance vehicle side device 81.
  • the local map includes information indicating the position and the feature amount on the three-dimensional spatial coordinate system of the feature point of the stationary object around each moving object, as in the global map.
  • The local map is a map reflecting the surrounding environment in which the vehicle is traveling. For example, when traveling around the parking space 1 and the parking space 3, a map indicating that vehicles are parked in the parking space 1 and the parking space 3 is created.
  • For example, when a vehicle parked in the parking space 13 protrudes outside the parking space, a map reflecting that state is created.
  • When passing the left side of the vehicle parked in the parking space 13, the parking assist vehicle side device 81 creates a route that, for example, runs through the parking space 6 side so as not to collide with the protruding parked vehicle, and controls each part of the vehicle 20 so that traveling based on the created route is performed.
  • the local map created by the parking assist vehicle-side device 81 is a map reflecting the environment (situation) in the parking lot 10.
  • a local map is supplied from the parking assistance vehicle side device 81 to the global map updating unit 107 of the parking assistance device 51 at a predetermined timing.
  • the global map update unit 107 refers to the supplied local map and updates a portion where a change has occurred in the global map.
  • For a parking space in which a vehicle is newly parked, the global map update unit 107 performs an update reflecting how the vehicle is parked (at what position with respect to the parking space) and at what angle the vehicle is parked.
  • For a parking space that has been vacated, the global map update unit 107 performs an update returning that space to its previous, empty state.
  • Since the local map is created by the parking assist vehicle side device 81 of a vehicle 20 actually traveling in the parking lot 10, information such as the vehicle parked in the parking space 13 being parked protruding outside the space, or the vehicle parked in the parking space 3 being parked in the space but diagonally, can be used for the update.
  • the global map can be made a near real-time map according to the environment in the parking lot 10. Therefore, with reference to such a global map, it is possible to appropriately set a parking space in which the entering vehicle 20 should park and a route to the parking space.
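The merge performed by the global map update unit 107 can be sketched as below, with both maps reduced to per-space states for illustration. This is an assumption-laden sketch, not the patent's implementation: the state labels and function names are invented, and only the spaces actually observed by the traveling vehicle are touched, which is what keeps the global map near real time.

```python
def update_global_map(global_map, local_map):
    """Merge a vehicle's local map into the managed global map.
    Both maps are dicts: parking-space number -> observed state
    (e.g. "empty", "parked", "parked-protruding", "parked-diagonal").
    Only spaces present in the local map (i.e. actually observed while
    traveling) are updated, and only where a change is detected."""
    updated = dict(global_map)  # leave the managed map untouched
    for space, observed_state in local_map.items():
        if updated.get(space) != observed_state:  # change detected
            updated[space] = observed_state
    return updated

g = {1: "parked", 3: "parked", 13: "parked"}
l = {13: "parked-protruding", 25: "parked"}  # observed while traveling
g2 = update_global_map(g, l)
```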
  • In step S101, the entrance vehicle detection unit 101 determines whether or not a vehicle 20 that has entered the parking lot 10 has been detected. For example, when the vehicle 20 appears in the image captured by the imaging device 52 (FIG. 3) installed near the entrance gate 11 in the parking lot 10, the entrance vehicle detection unit 101 detects that the vehicle 20 has entered.
  • Alternatively, the imaging device 52 (FIG. 3) installed near the entrance gate 11 in the parking lot 10 may be configured to capture an image of the vehicle 20 when it enters, and the entrance vehicle detection unit 101 may detect that the vehicle 20 has entered the parking lot 10 when the image is provided from the imaging device 52.
  • The process in step S101 is repeated, and the standby state is maintained, until it is determined that a vehicle 20 entering the parking lot 10 has been detected. When it is determined in step S101 that the vehicle 20 has been detected, the process proceeds to step S102.
  • In step S102, the vehicle information acquisition unit 102 acquires vehicle information.
  • the vehicle information acquisition unit 102 acquires the image captured by the imaging device 52 and acquires the vehicle information of the vehicle 20 shown in the image.
  • the vehicle information includes the vehicle width, vehicle height, minimum turning radius, and the like, and is information used for searching for a parking space to which the vehicle is to be guided and for determining a route to that parking space.
  • the vehicle information is acquired with reference to the database 71 as described with reference to FIG.
  • information may be provided from the vehicle 20 side.
  • vehicle information itself may be provided, or information that can identify a vehicle (vehicle type) may be provided.
  • the database 71 can be referred to using the provided information to obtain the vehicle information.
  • the database 71 may be provided as a part of the parking assistance device 51.
  • In that case, the vehicle information acquisition unit 102 acquires the vehicle information with reference to the provided database 71.
  • vehicle information may be provided from the vehicle 20 side via the communication unit 108.
  • That is, the vehicle information may be transmitted from the vehicle 20 side, and the transmitted vehicle information may be acquired by the vehicle information acquisition unit 102.
  • vehicle information such as the vehicle width and the vehicle length may be acquired by analyzing the image captured by the imaging device 52.
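The acquisition paths described above (information provided by the vehicle, or a lookup in a database such as the database 71) can be sketched as follows; the table contents, field names, and the `acquire_vehicle_info` helper are invented for illustration and are not the actual interface of the vehicle information acquisition unit 102.

```python
# Hypothetical contents of a database playing the role of database 71,
# keyed by an identifier such as the vehicle type.
VEHICLE_DB = {
    "compact": {"width_m": 1.7, "length_m": 4.2, "height_m": 1.5,
                "min_turning_radius_m": 4.8},
    "suv":     {"width_m": 1.9, "length_m": 4.8, "height_m": 1.8,
                "min_turning_radius_m": 5.5},
}

def acquire_vehicle_info(vehicle_type, provided_info=None):
    """Prefer vehicle information provided from the vehicle 20 side;
    otherwise fall back to a database lookup using identifying
    information such as the vehicle type."""
    if provided_info is not None:
        return provided_info
    return VEHICLE_DB[vehicle_type]

info = acquire_vehicle_info("suv")
direct = acquire_vehicle_info("compact", provided_info={"width_m": 2.0})
```

In the sketch, image-analysis-derived dimensions would simply be passed in as `provided_info`.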
  • the route creation unit 105 creates a route.
  • the route creation unit 105 determines a parking space where the entering vehicle 20 is parked using information such as the vehicle width, the vehicle length, the vehicle height, and the minimum turning radius of the vehicle 20. For example, in the case of the vehicle 20 having a large vehicle width, a parking space in which a width capable of parking the vehicle 20 having such a vehicle width can be secured is searched with reference to the global map.
  • For example, even if the parking space 2 (FIG. 4) is vacant, it can be determined that a vehicle 20 with a large vehicle width cannot park there because vehicles are parked in the parking space 1 and the parking space 3.
  • Further, a parking space that is vacant but in front of which the space required for turning back and the like cannot be secured is excluded from the search, and a search is performed for a parking space in a place where the space required for turning back can be sufficiently secured.
  • a search such as searching for a parking space in the rooftop or outdoor parking lot is also performed.
  • Such a search can be performed by referring to the global map, because the global map describes in which parking spaces vehicles are parked and in what state they are parked.
  • Such a global map is updated by the global map update unit 107 so that the latest state is maintained as much as possible.
  • the route creation unit 105 refers to the global map and determines a parking space where the entering vehicle 20 parks in consideration of vehicle information.
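A search of the kind described above (vacant, wide enough for the vehicle, taking adjacent parked vehicles into account) can be sketched as follows; the data layout, the 0.2 m narrowing per occupied neighbour, and the 0.3 m margin are arbitrary illustrative figures, not parameters of the route creation unit 105.

```python
def find_parking_space(spaces, vehicle, margin_m=0.3):
    """Return the ID of the first vacant space whose usable width fits the
    vehicle. `spaces` maps space ID -> {'vacant', 'width_m',
    'adjacent_occupied'} where 'adjacent_occupied' counts occupied
    neighbouring spaces (as with spaces 1 and 3 beside space 2)."""
    for space_id, s in spaces.items():
        if not s["vacant"]:
            continue
        # Assume each occupied neighbour effectively narrows the space.
        usable = s["width_m"] - 0.2 * s["adjacent_occupied"]
        if usable >= vehicle["width_m"] + margin_m:
            return space_id
    return None

spaces = {
    2: {"vacant": True, "width_m": 2.3, "adjacent_occupied": 2},  # hemmed in
    8: {"vacant": True, "width_m": 2.5, "adjacent_occupied": 0},
}
chosen = find_parking_space(spaces, {"width_m": 1.9})
```

With these figures, space 2 is rejected despite being vacant, and space 8 is chosen, mirroring the reasoning about the parking space 2 above.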
  • The parking space determination method described above is an example and does not indicate a limitation.
  • In addition to the structural conditions of the vehicle 20 described above, the parking space may be determined in consideration of the following conditions.
  • When an image in which a predetermined mark, such as a mark indicating that an elderly person is driving, a mark indicating that a disabled person is driving, or a mark indicating that a beginner is driving, is captured by the imaging device 52, a parking space suitable for the driver may be searched for.
  • For example, when a mark indicating that a disabled person is driving is detected, a parking space where a disabled person can preferentially park may be determined as the parking space of the entering vehicle 20.
  • In this way, a parking space suitable for the entering vehicle 20 is determined.
  • Then, a route from the entrance gate 11 to the determined parking space is searched for.
  • the route is also set by referring to the global map.
  • For example, near the parking space 13 and the parking space 30 (FIG. 4), there are vehicles parked protruding outside the parking spaces, so the road width that can be traveled there is narrow. In consideration of such a situation near the parking space 13 or the parking space 30, the route to reach the determined parking space is searched for and set.
  • The parking start position is, for example, a position at which parking is started, defined by a distance and an inclination with respect to the parking space in which the vehicle is to park. It is a position from which, if the approach to the parking space is started, the vehicle 20 can be parked with its center coming to the center of the parking space.
  • the route creation unit 105 searches and sets a parking space suitable for the vehicle 20 that has entered, a route to the parking space, and a parking start position.
  • Hereinafter, the set parking space, route, and parking start position are collectively described as route information.
  • The route information may include information other than these pieces of information. Conversely, the route information may include only the route; it is sufficient that at least one of the parking space, the route, and the parking start position is included.
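A container like the route information just described, in which each of the three pieces is optional, might look as follows in Python; the class name, field names, and coordinate conventions are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class RouteInformation:
    """Illustrative container for the route information: at least one of
    the three fields is expected to be present."""
    parking_space: Optional[int] = None
    # Waypoints expressed on the global map (x, y).
    route: Optional[List[Tuple[float, float]]] = None
    # Parking start position as (x, y, heading).
    parking_start_position: Optional[Tuple[float, float, float]] = None

# Route information carrying only a parking space and a route.
ri = RouteInformation(parking_space=8,
                      route=[(0.0, 0.0), (12.0, 0.0), (12.0, 6.0)])
```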
  • In step S104, information such as the route information and marker information is provided to the vehicle 20 via the communication unit 108.
  • the route information is information generated by the route creation unit 105.
  • The marker information is information related to the marker of the marker pattern, in which information on the marker itself is associated with a position on the global map.
  • a global map held by the global map holding unit 106 is also provided to the vehicle 20.
  • The vehicle 20 captures an image of the marker pattern, determines whether or not it matches the supplied marker information, and, by confirming a match, identifies the projection device 53 that is projecting the marker pattern; in other words, it can specify its position on the global map.
  • the information about the marker and the global map are supplied to the vehicle 20 so that such processing can be performed on the vehicle 20 side.
  • For example, information about the marker pattern projected by the ID3 projection device 53 and the marker pattern projected by the ID11 projection device 53 is provided as the marker-related information.
  • The pattern projected from the projection device 53 located in the vicinity of the determined parking space may also be a marker pattern, so that the parking assist vehicle-side device 81 can recognize that the parking space is the destination parking space.
  • Provision of route information and the like from the parking assistance device 51 to the parking assistance vehicle side device 81 is performed by the communication unit 108.
  • For example, when the entrance gate 11 is provided and there is a mechanism for temporarily stopping the vehicle 20 at the entrance gate 11, the route information and the like are provided from the communication unit 108 to the vehicle 20 there.
  • an information providing place may be provided, and route information may be provided at the information providing place.
  • the information providing place may be a predetermined place in the parking lot 10, for example, an area between the entrance gate 11 and the position where the ID1 projection device 53 is installed.
  • the information provision location may be different for each vehicle 20 that has entered. For example, it is assumed that a plurality of vehicles 20 may enter the parking lot 10 continuously. Assume a case where a plurality of vehicles 20, for example, three vehicles 20 (referred to as vehicles 20-1, 20-2, and 20-3) enter.
  • the information providing locations of the vehicle 20-1, the vehicle 20-2, and the vehicle 20-3 may be changed.
  • For the vehicle 20-1, the vicinity of the ID3 projection device 53 is set as the information providing place, and when the vehicle 20-1 travels (or temporarily stops) near the ID3 projection device 53, the route information and the like are provided to the vehicle 20-1.
  • Similarly, for the vehicle 20-2, the vicinity of the ID2 projection device 53 is set as the information providing place, and when the vehicle 20-2 travels (or temporarily stops) near the ID2 projection device 53, the route information and the like are provided to the vehicle 20-2. Further, for the vehicle 20-3, the vicinity of the ID1 projection device 53 is set as the information providing place, and when the vehicle 20-3 travels (or temporarily stops) near the ID1 projection device 53, the route information and the like are provided to the vehicle 20-3.
  • If the information providing place were the same for every vehicle, the route information intended for the vehicle 20-1 might be provided to the vehicle 20-2. By making the information providing location different for each vehicle 20, information can be provided to each vehicle 20 individually, so it is possible to reduce the possibility that the same route information is erroneously provided to a plurality of vehicles 20.
  • an ID for identifying the vehicle 20 that has entered may be assigned to each vehicle 20.
  • The above-described mechanism in which the information providing location varies for each vehicle 20 may be combined with a mechanism that allows the vehicle 20 to determine, based on an ID, whether or not the information is addressed to itself.
  • For example, the parking support device 51 allocates an ID to the entering vehicle 20 at the entrance gate 11 or a place corresponding to the entrance gate 11, and provides the allocated ID to the vehicle 20 (the parking assist vehicle-side device 81).
  • When providing the route information or the like to the vehicle 20, the parking support device 51 controls the communication unit 108 so as to provide the information at the information providing place with the allocated ID associated with the provided information.
  • When the parking assist vehicle-side device 81 of the vehicle 20 acquires information from the parking support device 51, it determines whether or not the ID it manages matches the ID associated with the acquired information. If they match, the information is stored as information addressed to itself; if they do not match, the information is regarded as not addressed to itself and is discarded.
  • Alternatively, the information providing place may be set to one place, and information may be provided at that place, with the vehicle 20 side determining based on the ID whether the information is addressed to itself.
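The keep-or-discard decision on the vehicle side can be sketched in a few lines; the message layout (`vehicle_id`, `payload` keys) and the helper name are illustrative assumptions, not the actual communication format.

```python
def handle_received(message, my_id):
    """Keep a message only when the ID attached by the parking support
    device matches the ID this vehicle was assigned; otherwise treat it
    as not addressed to this vehicle and discard it."""
    if message.get("vehicle_id") == my_id:
        return message["payload"]  # information addressed to this vehicle
    return None                    # not addressed to us: discard

msg = {"vehicle_id": 7, "payload": {"parking_space": 8}}
kept = handle_received(msg, my_id=7)      # stored as addressed to itself
dropped = handle_received(msg, my_id=9)   # discarded
```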
  • In step S105, the projection control unit 103 starts controlling the projection of the projection devices 53.
  • the projection control unit 103 controls the projection device 53 installed in the parking lot 10.
  • the projection control unit 103 controls each of the projection devices 53 of ID1 to ID21 (FIG. 4).
  • the projection control unit 103 starts projection control of the projection device 53.
  • The projection by the projection device 53 is not performed at all times (that is, not continuously during the hours when the parking lot 10 is in use); rather, projection is started when a vehicle 20 enters the parking lot 10, and is terminated when a predetermined condition is satisfied, such as the vehicle 20 having passed, the vehicle having stopped in the parking space, or a predetermined time having elapsed since projection was started.
  • Since the projection is performed only as necessary rather than at all times, the power consumed by the projection device 53 can be reduced, and thus the power consumed by the parking assistance system can be reduced. Note that in a situation where it is not necessary to reduce power consumption, the projection by the projection device 53 can be configured to be performed at all times, and the present technology can also be applied to such a configuration.
  • Projection is controlled by the projection control unit 103 in the projection device 53 located on the path. For example, when the route as shown in FIG. 2 is set, the projection of the projection device 53 of ID1, ID2, ID3, ID7, ID11, and ID12 is controlled.
  • In other words, the projection devices 53 of ID1, ID2, ID3, ID7, ID11, and ID12 start projecting a predetermined pattern, while the other projection devices 53 maintain a standby state in which projection is not performed.
  • For example, the projection devices 53 of ID1, ID3, ID11, and ID12 are controlled to project a marker pattern, and the projection devices 53 of ID2 and ID7 are controlled to project a random pattern.
  • The projection control unit 103 identifies the projection devices 53 that are to start projection from the route information supplied from the route creation unit 105, and reads the pattern (random pattern or marker pattern) associated with the ID of each identified projection device 53 from the pattern holding unit 104. The pattern holding unit 104 holds the ID, pattern, and marker information of each projection device 53 in association with each other.
  • the projection control unit 103 reads the pattern held by the pattern holding unit 104 and supplies the pattern to the projection device 53 identified by the corresponding ID.
  • the projection device 53 projects a predetermined pattern under the control of the projection control unit 103.
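The control flow just described, in which projectors on the route project their assigned pattern and all others remain in standby, can be sketched as follows. The pattern table plays the role of the pattern holding unit 104, but which IDs carry a marker pattern versus a random pattern is an illustrative assumption.

```python
# Hypothetical pattern table (role of the pattern holding unit 104):
# projector ID -> kind of pattern held for that projector.
PATTERN_TABLE = {1: "marker", 2: "random", 3: "marker", 5: "marker",
                 7: "random", 11: "marker", 12: "marker"}

def start_projection_for_route(route_projector_ids, pattern_table):
    """Return a command per projector: projectors lying on the set route
    project their associated pattern; all others stay in standby."""
    return {pid: (kind if pid in route_projector_ids else "standby")
            for pid, kind in pattern_table.items()}

# Route as in FIG. 2: projectors ID1, ID2, ID3, ID7, ID11, ID12.
cmds = start_projection_for_route({1, 2, 3, 7, 11, 12}, PATTERN_TABLE)
```

Here the projector of ID5, which is not on the route, remains in standby, matching the behavior described above.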
  • The projection control may be started at another timing, for example when an entering vehicle 20 is detected, rather than after the route information is generated.
  • In this case, when the entrance vehicle detection unit 101 detects an entering vehicle 20, the entrance vehicle detection unit 101 instructs the projection control unit 103 to start projection, and the projection control unit 103 starts projection of the projection device 53 installed near the entrance gate 11.
  • In step S106, the global map update unit 107 determines whether a local map has been acquired.
  • the local map is supplied from the parking assistance vehicle side device 81 at a predetermined timing.
  • The predetermined timing may be any timing: for example, when the parking assist vehicle-side device 81 determines that there is a difference between the held global map and the local map, when the vehicle travels under a projection device 53 projecting a marker pattern, when the vehicle arrives at the parking space, or when parking is completed.
  • The process of step S106 is repeated until it is determined that a local map has been acquired. On the other hand, if it is determined in step S106 that a local map has been acquired, the process proceeds to step S107.
  • In step S107, the global map update unit 107 updates the global map held in the global map holding unit 106 using the acquired local map.
  • the local map is a map created by the parking assist vehicle side device 81 reflecting the environment around the vehicle 20 when the vehicle 20 actually travels in the parking lot 10.
  • The global map update unit 107 refers to such a local map, determines whether or not there is a change in the area of the global map corresponding to the acquired local map, and, if there is a change, updates the global map to reflect it.
  • For example, if a parking space has no parked vehicle in the global map but has a parked vehicle in the local map, it is determined that there is a change in that a vehicle has parked in an empty parking space, and the global map is updated to reflect that change.
  • the global map is updated in response to environmental (situation) changes in the parking lot 10.
  • the updated global map may be provided to the vehicle 20 traveling in the parking lot 10 at the time of the update.
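The comparison and update in steps S106 and S107 can be illustrated as a region-by-region diff; the `update_global_map` function and the string-valued space states below are illustrative assumptions, not the actual map representation.

```python
def update_global_map(global_map, local_map):
    """Overwrite only the regions of the global map where the acquired
    local map reports a different state, and return the IDs of the
    regions that changed."""
    changed = []
    for space_id, observed in local_map.items():
        if global_map.get(space_id) != observed:
            global_map[space_id] = observed
            changed.append(space_id)
    return changed

gmap = {2: "vacant", 3: "occupied"}
lmap = {2: "occupied", 3: "occupied"}  # a vehicle has parked in space 2
changed = update_global_map(gmap, lmap)
```

Only space 2 is updated, since the local map agrees with the global map about space 3.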
  • the support for parking the vehicle 20 is performed by repeatedly performing the processing described with reference to the flowchart of FIG.
  • the parking assist vehicle side device 81 includes a self-position estimation processing unit 201, an information reception instruction unit 202, a communication unit 203, a global map update determination unit 204, an information holding unit 205, a local route generation unit 206, a vehicle control signal generation unit 207, And a notification generation unit 208.
  • the self-position estimation processing unit 201 has a configuration as described later with reference to FIG. 10, and performs self-position estimation and creation of a local map.
  • the information reception instruction unit 202 issues an instruction to receive information supplied from the parking support device 51 to the communication unit 203.
  • Information received by the communication unit 203 includes the above-described route information, global map, information on markers, and the like.
  • The information reception instruction unit 202 issues a reception instruction to the communication unit 203 when, for example, reception of information supplied from the parking support device 51 is instructed by an input from the user (driver). Further, as described above, when the entrance gate 11 or an information providing place is designated, the information reception instruction unit 202 determines whether or not the vehicle has arrived at such a position, and when it is determined that it has arrived, issues a reception instruction to the communication unit 203.
  • The position where the vehicle 20 is located at that time can be used as the start position of the self-position estimation, so that the subsequent self-position estimation is performed from that position.
  • Alternatively, the system can be configured so that detection of the first marker, which determines a specific position (a part of the global map), is started by a start signal issued when the user performs an operation, and the position where the first marker is detected is used as the self-position estimation start position from which the subsequent self-position estimation is performed.
  • the communication unit 203 communicates with the parking assistance device 51.
  • the communication unit 203 receives a global map, route information, and the like supplied from the parking assistance device 51 and supplies the local map to the parking assistance device 51.
  • the global map update determination unit 204 determines whether or not to update the global map held in the information holding unit 205, and updates the global map as necessary.
  • When the global map update determination unit 204 determines that there is a difference between the local map generated by the self-position estimation processing unit 201 and the portion of the global map held in the information holding unit 205 corresponding to the local map, it determines that the global map is to be updated.
  • When the global map update determination unit 204 determines to update the global map, it updates the part of the global map held in the information holding unit 205 that differs from the local map so as to match the local map.
  • the global map update determination unit 204 transmits the local map to the parking assistance device 51 via the communication unit 203.
  • Alternatively, when the global map update determination unit 204 determines that the global map needs to be updated, only the processing of transmitting the local map to the parking support device 51, without updating the global map held in the information holding unit 205, may be executed. That is, when the global map held on the parking support device 51 side needs to be updated in order to match the environment (situation) in the parking lot 10, the global map update determination unit 204 can be configured to notify the parking support device 51 that the update is necessary by supplying a local map indicating the location to be updated.
  • the information holding unit 205 holds the global map, route information, and marker information received by the communication unit 203.
  • The local route generation unit 206 modifies the route specified by the route information held in the information holding unit 205 with reference to the local map. For example, it is assumed that the route information designates passing the parking space 13 (FIGS. 2 and 4) and designates that the vehicle travel in the center of the road. Then, as shown in FIG. 2, it is assumed that a local map is created in which a vehicle parked in the parking space 13 protrudes from the parking space 13 into the traveling area.
  • In such a case, when the vehicle travels beside the parking space 13, the local route generation unit 206 changes the route so that the vehicle does not travel in the center of the road but travels around the vehicle parked in the parking space 13, for example, along the parking space 6 side.
  • When the vehicle arrives at the parking start position, the local route generation unit 206 also generates a route from the parking start position to parking in the parking space.
  • When parking in a parking space, depending on the environment (situation) around the parking space, in particular on the vehicles parked on both sides and the vehicles parked in front, there is a high possibility that the route on which the vehicle 20 should travel differs. Therefore, the local route generation unit 206 generates a route that takes into account the environment around the parking space.
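A minimal sketch of the route modification described above, in which waypoints are shifted away from a vehicle protruding into the travel area, might look as follows; the corridor geometry, the `adjust_local_route` helper, and the 0.5 m clearance are invented for illustration.

```python
def adjust_local_route(waypoints, obstacles, clearance_m=0.5):
    """Shift route points sideways (in y) away from parked vehicles that
    protrude into the travel area. Each obstacle is (x_min, x_max,
    y_edge): a parked vehicle protrudes up to y_edge from the y=0 side
    of the road between x_min and x_max."""
    adjusted = []
    for x, y in waypoints:
        for x_min, x_max, y_edge in obstacles:
            if x_min <= x <= x_max and y < y_edge + clearance_m:
                y = y_edge + clearance_m  # swerve toward the far side
        adjusted.append((x, y))
    return adjusted

# Route information says: travel along the center of the road (y = 1.5 m).
center_line = [(0.0, 1.5), (10.0, 1.5), (20.0, 1.5)]
# A vehicle in a space like parking space 13 protrudes 2.0 m into the
# road between x = 8 m and x = 12 m.
route = adjust_local_route(center_line, obstacles=[(8.0, 12.0, 2.0)])
```

Only the waypoint beside the protruding vehicle is shifted; the rest of the designated route is kept.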
  • The vehicle control signal generation unit 207 generates, based on the local route generated by the local route generation unit 206, a signal for controlling the control system that controls the speed and traveling direction of each unit in the vehicle 20, such as the engine, the brakes, and the steering wheel. Each unit in the vehicle 20 is controlled based on the control signal generated by the vehicle control signal generation unit 207.
  • Note that the vehicle 20 need not be controlled based only on the control signal generated by the vehicle control signal generation unit 207; the control signal may instead be used to assist the driver's driving.
  • the notification generation unit 208 performs control for notifying the user (driver) of the local route generated by the local route generation unit 206.
  • a local route is displayed on the display of the car navigation system, or a message is displayed when approaching a corner.
  • The notification may also be made by voice. It is also possible to configure the system so that a warning is given when the vehicle deviates from the route, or a notification for returning to the correct route is given.
  • FIG. 10 is a diagram illustrating an internal configuration of the self-position estimation processing unit 201.
  • the self-position estimation processing unit 201 can partially apply a technique called SLAM (Simultaneous Localization and Mapping).
  • the self-position estimation processing unit 201 is configured to include an estimation unit 301, a position information generation unit 302, an object detection unit 303, and a local map generation unit 304.
  • The estimation unit 301 calculates the movement amount, position, posture, and speed of the own vehicle (the moving object) based on the relative positions of the feature points in the left image and the right image captured by the imaging devices 401L and 401R (FIG. 11).
  • The imaging device 401L and the imaging device 401R are installed at the front of the vehicle 20; the imaging device 401L is installed at a position for imaging the left front of the vehicle 20, and the imaging device 401R at a position for imaging the right front, respectively.
  • the estimation unit 301 includes image correction units 321L and 321R, a feature point detection unit 322, a parallax matching unit 323, a distance estimation unit 324, a feature amount calculation unit 325, a map information storage unit 326, a motion matching unit 327, a movement amount estimation unit 328, an object dictionary storage unit 329, an object recognition unit 330, a position / orientation information storage unit 331, a position / orientation estimation unit 332, and a speed estimation unit 333.
  • the image correction unit 321L and the image correction unit 321R correct the left image supplied from the imaging device 401L and the right image supplied from the imaging device 401R, respectively, so that the images are directed in the same direction.
  • the image correction unit 321L supplies the corrected left image to the feature point detection unit 322 and the motion matching unit 327.
  • the image correction unit 321R supplies the corrected right image to the parallax matching unit 323.
  • the feature point detection unit 322 detects a feature point of the left image.
  • the feature point detection unit 322 supplies two-dimensional position information indicating the position of each detected feature point on the two-dimensional image coordinate system to the parallax matching unit 323 and the feature amount calculation unit 325.
  • the image coordinate system is represented by, for example, an x coordinate and a y coordinate in the image.
  • the parallax matching unit 323 detects the feature point of the right image corresponding to the feature point detected in the left image. Thereby, the parallax which is the difference between the position on the left image of each feature point and the position on the right image is obtained.
  • the parallax matching unit 323 supplies two-dimensional position information indicating the positions of the feature points on the image coordinate system in the left image and the right image to the distance estimation unit 324.
  • the distance estimation unit 324 estimates the distance to each feature point based on the parallax between the left image and the right image of each feature point, and further calculates the position of each feature point in the three-dimensional spatial coordinate system.
  • the distance estimation unit 324 supplies three-dimensional position information indicating the position of each feature point on the spatial coordinate system to the feature amount calculation unit 325.
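The distance and 3-D position recovery performed by the distance estimation unit 324 can be illustrated with standard stereo triangulation under a pinhole camera model; the focal length, baseline, and pixel coordinates below are invented example values, not calibration data of the imaging devices 401L and 401R.

```python
def triangulate(f_px, baseline_m, xl_px, xr_px, y_px, cx_px, cy_px):
    """Recover the 3-D position of a feature point from the parallax
    (disparity) between its left-image and right-image x coordinates,
    assuming rectified images and a pinhole camera model."""
    disparity = xl_px - xr_px           # pixels; larger disparity = closer
    Z = f_px * baseline_m / disparity   # distance along the optical axis
    X = (xl_px - cx_px) * Z / f_px      # lateral offset
    Y = (y_px - cy_px) * Z / f_px       # vertical offset
    return X, Y, Z

# Example: focal length 700 px, 0.12 m baseline, a feature point seen at
# x = 400 px in the left image and x = 380 px in the right image.
X, Y, Z = triangulate(700.0, 0.12, 400.0, 380.0, 260.0, 320.0, 240.0)
```

With these numbers the 20-pixel disparity places the point about 4.2 m ahead, which is the kind of three-dimensional position information supplied to the feature amount calculation unit 325.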
  • the feature amount calculation unit 325 calculates the feature amount of each feature point of the left image.
  • the feature amount calculation unit 325 causes the map information storage unit 326 to store feature point information including the three-dimensional position information of each feature point and the feature amount.
  • the map information storage unit 326 stores a global map supplied from the parking assist device 51 in addition to the feature point information used for the local map.
  • the motion matching unit 327 acquires, from the map information storage unit 326, the three-dimensional position information of each feature point detected in the previous frame. Next, the motion matching unit 327 detects a feature point corresponding to each feature point detected in the previous frame in the left image of the current frame. Then, the motion matching unit 327 supplies the movement amount estimation unit 328 with the three-dimensional position information in the previous frame of each feature point and the two-dimensional position information indicating the position on the image coordinate system in the current frame. .
  • Here, the description is continued assuming that the previous frame is compared with the current frame; however, a frame N frames (a plurality of frames) before may be compared with the current frame, and the present technology is not limited to the comparison with the immediately preceding frame.
  • Based on this information, the movement amount estimation unit 328 estimates the amount of movement of the position and orientation of the own vehicle (more precisely, of the imaging device 401L) between frames.
  • the movement amount estimation unit 328 supplies movement amount information indicating the estimated movement amount of the position and posture of the host vehicle to the object detection unit 303, the position / orientation estimation unit 332, and the speed estimation unit 333.
  • The object recognition unit 330 recognizes an object in the left image based on the object dictionary stored in the object dictionary storage unit 329. Based on the recognition result of the object, the object recognition unit 330 sets initial values (hereinafter referred to as the initial position and the initial posture) of the position and posture of the own vehicle (more precisely, the imaging device 401L) in the spatial coordinate system.
  • the object recognition unit 330 causes the position / orientation information storage unit 331 to store initial position / orientation information indicating the set initial position and initial posture.
  • The position / orientation estimation unit 332 estimates the position and orientation of the own vehicle based on the initial position / orientation information stored in the position / orientation information storage unit 331, the position / orientation information of the previous frame, and the estimation result of the movement amount of the own vehicle. Further, the position / orientation estimation unit 332 corrects the estimated position and orientation of the host vehicle based on the global map stored in the map information storage unit 326 as necessary.
  • the position / orientation estimation unit 332 supplies position / orientation information indicating the estimated position and orientation of the host vehicle to the dangerous area determination unit 112, the risk prediction unit 113, the position information generation unit 302, and the speed estimation unit 333, and stores the information in the position / orientation information storage unit 331.
  • the speed estimation unit 333 estimates the speed of the host vehicle by dividing the estimated movement amount of the host vehicle by the elapsed time.
  • the speed estimation unit 333 supplies speed information indicating the estimated speed to the danger prediction unit 113 and the position information generation unit 302.
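The computation of the speed estimation unit 333 is a simple division of the estimated movement amount by the elapsed time; the following one-liner and its example figures (0.5 m of movement over one 1/30 s frame interval) are purely illustrative.

```python
def estimate_speed(movement_m, elapsed_s):
    """Speed of the host vehicle: estimated movement amount divided by
    the elapsed time over which that movement was accumulated."""
    return movement_m / elapsed_s

# 0.5 m of estimated movement over a 1/30 s frame interval.
speed = estimate_speed(0.5, 1.0 / 30.0)
```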
  • the location information generation unit 302 generates location information including the location and speed of the vehicle when notified from the danger region determination unit 112 that the vehicle is in the danger region.
  • the position information generation unit 302 supplies the generated position information to the transmission unit 104.
  • Based on the movement amount information and the feature point information of the previous frame and the current frame stored in the map information storage unit 326, the object detection unit 303 detects stationary objects and moving bodies around the host vehicle. The object detection unit 303 notifies the local map generation unit 304 of the detection results of the stationary objects and moving objects around the host vehicle.
  • the local map generation unit 304 generates a local map based on the detection results of stationary objects and moving objects around the host vehicle and the feature point information of the current frame stored in the map information storage unit 326.
  • the generated local map is transmitted from the communication unit 203 (FIG. 9) to the parking assist device 51 as necessary.
  • In step S201, the imaging devices 401R and 401L provided in the vehicle 20 start imaging.
  • The image captured by the imaging device 401 includes the random pattern or the marker pattern projected by the projection device 53.
  • When imaging is started by the imaging devices 401, the self-position estimation processing unit 201 starts self-position estimation in step S202.
  • the self-position estimation processing unit 201 generates a local map and estimates its own position within the parking lot 10. Specifically, the estimation proceeds as follows.
  • the image correction unit 321L and the image correction unit 321R correct the left image supplied from the imaging device 401L and the right image supplied from the imaging device 401R, respectively, so that the images are directed in the same direction.
  • the image correction unit 321L supplies the corrected left image to the feature point detection unit 322 and the motion matching unit 327.
  • the image correction unit 321R supplies the corrected right image to the parallax matching unit 323.
  • the feature point detection unit 322 detects a feature point of the left image.
  • As the feature point detection method, an arbitrary method such as Harris corner detection can be used.
  • the feature point detection unit 322 supplies two-dimensional position information indicating the position of each detected feature point on the image coordinate system to the parallax matching unit 323.
  • the parallax matching unit 323 detects the feature point of the right image corresponding to the feature point detected in the left image.
  • the parallax matching unit 323 supplies two-dimensional position information indicating the positions of the feature points on the image coordinate system in the left image and the right image to the distance estimation unit 324.
  • the distance estimation unit 324 estimates the distance to each feature point based on the parallax between the left image and the right image, and further calculates the position of each feature point in the three-dimensional spatial coordinate system.
  • the distance estimation unit 324 supplies three-dimensional position information indicating the position of each feature point on the spatial coordinate system to the feature amount calculation unit 325.
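The distance estimation described above follows the standard rectified-stereo relation (depth = focal length × baseline / disparity). Below is a minimal sketch of this triangulation step; the focal length, baseline, and principal-point values are arbitrary assumptions, since the patent does not give concrete camera parameters.

```python
def triangulate(u_left, v_left, disparity, f=700.0, baseline=0.12, cx=320.0, cy=240.0):
    """Recover a 3-D point (in the left camera's coordinate system) from
    a pixel position in the left image and the left/right disparity.

    Depth follows the rectified-stereo relation Z = f * B / d, where
    f is the focal length in pixels and B the baseline in meters."""
    if disparity <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    z = f * baseline / disparity
    x = (u_left - cx) * z / f
    y = (v_left - cy) * z / f
    return (x, y, z)
```

For example, a feature at the image center with a disparity of 7 pixels is placed 12 m in front of the camera under the assumed parameters.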
  • the feature amount calculation unit 325 calculates the feature amount of each feature point of the left image.
  • As the feature amount, an arbitrary descriptor such as SURF (Speeded-Up Robust Features) can be used.
  • the feature amount calculation unit 325 causes the map information storage unit 326 to store feature point information including the three-dimensional position information of each feature point and the feature amount.
  • the motion matching unit 327 acquires, from the map information storage unit 326, the three-dimensional position information of each feature point detected in the previous frame. Next, the motion matching unit 327 detects, in the left image of the current frame, the feature point corresponding to each feature point detected in the previous frame. Then, the motion matching unit 327 supplies the movement amount estimation unit 328 with the three-dimensional position information of each feature point in the previous frame and the two-dimensional position information indicating its position on the image coordinate system in the current frame.
  • the movement amount estimation unit 328 estimates the movement amount of the host vehicle (more precisely, the imaging device 401L) between the previous frame and the current frame. For example, the movement amount estimation unit 328 calculates the movement amount dX that minimizes the value of the cost function f in the following equation (1).
  • the moving amount dX indicates the moving amount of the position and posture of the own vehicle (more precisely, the imaging device 401L) from the previous frame to the current frame.
  • the movement amount dX indicates the movement amount of the position in the three-axis direction (three degrees of freedom) and the posture around each axis (three degrees of freedom) in the spatial coordinate system.
  • M t-1 and Z t indicate the positions of the corresponding feature point in the previous frame and the current frame, respectively. More specifically, M t-1 indicates the position of the feature point in the spatial coordinate system of the previous frame, and Z t indicates the position of the feature point on the image coordinate system of the current frame.
  • proj (dX, M t-1) indicates the position obtained by projecting the position M t-1 of the feature point in the previous frame in the spatial coordinate system onto the image coordinate system of the left image of the current frame, using the movement amount dX. That is, proj (dX, M t-1) estimates the position of the feature point on the left image of the current frame from the position M t-1 of the feature point in the previous frame and the movement amount dX.
  • the movement amount estimation unit 328 obtains, for example by the least squares method, the movement amount dX that minimizes the sum of squares of Z t - proj (dX, M t-1) over the feature points, as shown in Expression (1). That is, the movement amount estimation unit 328 obtains the movement amount dX that minimizes the error between the observed positions of the feature points in the left image of the current frame and the positions estimated on the image coordinate system from the positions M t-1 of the feature points in the spatial coordinate system of the previous frame and the movement amount dX.
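The body of Expression (1) does not survive in this text. From the definitions of Z t, M t-1, and proj given in the surrounding description, the cost function can be reconstructed as a sum of squared reprojection errors over the matched feature points (a reconstruction consistent with the surrounding text, not a verbatim copy of the patent's formula):

```latex
f(dX) = \sum_{i} \left\| Z_t^{(i)} - \mathrm{proj}\!\left(dX,\; M_{t-1}^{(i)}\right) \right\|^2 \qquad (1)
```

Minimizing f over the six degrees of freedom of dX, for example by least squares, yields the frame-to-frame movement amount.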
  • the movement amount estimation unit 328 supplies movement amount information indicating the obtained movement amount dX to the object detection unit 303, the position / orientation estimation unit 332, and the speed estimation unit 333.
  • the position / orientation estimation unit 332 acquires the position / orientation information of the previous frame from the position / orientation information storage unit 331. Then, the position / orientation estimation unit 332 estimates the current position and orientation of the own vehicle by adding the movement amount dX estimated by the movement amount estimation unit 328 to the position and orientation of the own vehicle in the previous frame.
  • the position / orientation estimation unit 332 acquires initial position / orientation information from the position / orientation information storage unit 331 when estimating the position and orientation of the vehicle in the first frame. Then, the position / orientation estimation unit 332 estimates the position and orientation of the host vehicle by adding the movement amount dX estimated by the movement amount estimation unit 328 to the initial position and initial posture of the host vehicle.
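The accumulation of movement amounts onto the previous pose can be sketched as simple dead reckoning. For brevity this illustration uses a planar pose (x, y, heading) rather than the full six degrees of freedom described above; the function name and the sample values are illustrative only.

```python
import math

def integrate_pose(pose, dX):
    """Accumulate a frame-to-frame movement amount dX = (dx, dy, dtheta),
    expressed in the vehicle's own frame, onto pose = (x, y, theta)."""
    x, y, theta = pose
    dx, dy, dtheta = dX
    # Rotate the body-frame translation into the world frame, then add.
    x += dx * math.cos(theta) - dy * math.sin(theta)
    y += dx * math.sin(theta) + dy * math.cos(theta)
    return (x, y, theta + dtheta)
```

Starting from the initial pose held in the storage unit, repeatedly applying this step to each estimated dX yields the current pose (which drifts over time, hence the marker-based correction described later).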
  • the position / orientation estimation unit 332 corrects the estimated position and orientation of the own vehicle based on the global map stored in the map information storage unit 326 as necessary.
  • the position / orientation estimation unit 332 supplies position / orientation information indicating the estimated position and orientation of the host vehicle to the position information generation unit 302 and the speed estimation unit 333 and causes the position / orientation information storage unit 331 to store the position / orientation information.
  • Speed estimation unit 333 estimates the speed of the vehicle. Specifically, the speed estimation unit 333 estimates the speed of the host vehicle by dividing the movement amount dX estimated by the movement amount estimation unit 328 by the elapsed time. The speed estimation unit 333 supplies speed information indicating the estimated speed to the position information generation unit 302.
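The speed estimate is simply the magnitude of the translational part of dX divided by the elapsed time between frames. A minimal sketch; the default frame interval of 1/30 s is an assumption, not a value given in the text.

```python
import math

def estimate_speed(dX_translation, dt=1.0 / 30.0):
    """Speed = distance moved between frames / elapsed time.

    dX_translation: (dx, dy, dz) translational part of the estimated
    movement amount dX, in meters; dt: frame interval in seconds."""
    distance = math.sqrt(sum(c * c for c in dX_translation))
    return distance / dt
```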
  • the object detection unit 303 detects a surrounding object, for example, a parked vehicle, a structure such as a pillar, or the like. Specifically, the object detection unit 303 acquires the feature point information of the previous frame and the current frame from the map information storage unit 326. Next, the object detection unit 303 performs matching between the feature point of the previous frame and the feature point of the current frame, and detects the movement of each feature point between frames.
  • Based on the movement amount dX estimated by the movement amount estimation unit 328, the object detection unit 303 distinguishes feature points whose movement corresponds to the movement of the own vehicle from feature points whose movement does not correspond to it.
  • the object detection unit 303 detects a stationary object around the own vehicle based on the feature point that moves corresponding to the movement of the own vehicle.
  • the object detection unit 303 detects moving bodies around the own vehicle based on the feature points that do not move in correspondence with the movement of the own vehicle. Then, the object detection unit 303 notifies the local map generation unit 304 of the detection results of stationary objects and moving bodies around the own vehicle.
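The separation of stationary and moving feature points can be sketched as follows: each previous-frame point is predicted forward using the estimated ego-motion, and points whose observed position disagrees with the prediction beyond a threshold are attributed to moving objects. This is an illustrative simplification (translation-only motion, a hand-picked threshold), not the patent's actual implementation.

```python
def classify_feature_points(prev_points, curr_points, dX, threshold=0.1):
    """Split matched 3-D feature points into static / moving sets.

    prev_points, curr_points: lists of (x, y, z) points matched across
    frames, in the camera coordinate system.
    dX: expected displacement (tx, ty, tz) of a *static* point in the
    camera frame between the two frames (from the ego-motion estimate)."""
    static, moving = [], []
    for prev, curr in zip(prev_points, curr_points):
        predicted = tuple(p + d for p, d in zip(prev, dX))
        error = sum((c - p) ** 2 for c, p in zip(curr, predicted)) ** 0.5
        (static if error < threshold else moving).append(curr)
    return static, moving
```

The static set then feeds the local map, while the moving set is excluded from it, as described next.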
  • the local map generation unit 304 acquires feature point information of the current frame from the map information storage unit 326. Next, the local map generation unit 304 deletes information on the feature points of the surrounding moving objects detected by the object detection unit 303 from the acquired feature point information. Then, the local map generation unit 304 generates a local map based on the remaining feature point information.
  • In step S203, the communication unit 203 communicates with the parking support device 51, receives the route information, the global map, and the marker information, and supplies them to the information holding unit 205, which holds them. As described above, this communication is performed at the entrance gate 11, an information providing place, or the like.
  • In step S204, the local route generation unit 206 generates a local route.
  • The local route is the route specified in the route information, finely adjusted using the local map. For example, as described above, where a vehicle parked outside its parking space protrudes onto the designated route, the route is finely adjusted to avoid that vehicle. The vehicle may also deviate from the route; in such a case, the route is corrected.
  • a route from the parking start position to parking in the parking space is generated.
  • Parking requires more delicate maneuvering than ordinary traveling, such as turning back, so a more detailed route is generated, taking into account the environment (situation) around the parking space.
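The fine adjustment of the designated route around an obstruction (such as a vehicle protruding from its parking space) can be sketched as below. This is an illustrative simplification that shifts blocked waypoints laterally until they clear the obstacle; a real implementation would respect the vehicle's kinematics and minimum turning radius.

```python
def adjust_route(route, obstacles, clearance=1.0, step=0.5):
    """Nudge waypoints sideways until they clear detected obstacles.

    route: list of (x, y) waypoints from the designated route.
    obstacles: list of (x, y) obstacle points from the local map.
    Each blocked waypoint is shifted along +y in increments of `step`
    until it is at least `clearance` away from every obstacle."""
    def blocked(p):
        return any((p[0] - o[0]) ** 2 + (p[1] - o[1]) ** 2 < clearance ** 2
                   for o in obstacles)
    adjusted = []
    for p in route:
        q = p
        while blocked(q):
            q = (q[0], q[1] + step)
        adjusted.append(q)
    return adjusted
```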
  • Guidance into the parking space may be performed using the marker pattern projected from the projection device 53 near the parking start position and captured by the imaging devices.
  • As the vehicle 20 moves, the appearance of the imaged marker pattern changes even though it is the same marker. From this change, it is determined whether or not the vehicle is traveling accurately on the designated route; when it is determined that it is not, control for correcting the deviation may be repeatedly performed to guide the vehicle into the parking space.
  • In step S205, the vehicle control signal generation unit 207 generates a control signal for controlling the speed and direction of the vehicle 20 based on the local route generated by the local route generation unit 206.
  • In step S206, the notification generation unit 208 notifies the user (driver) of the route.
  • The route is presented to the user, for example, by displaying it on the display of the navigation system or by announcing upcoming turns and the like by voice.
  • Only one of the generation of the control signal for controlling the vehicle 20 and the notification of the route to the user may be performed.
  • That is, only the generation of a control signal for controlling the vehicle 20 may be performed, and the vehicle 20 may be controlled based on the local route to reduce the burden on the driver.
  • Alternatively, only the route notification to the user may be performed, and the user (driver) may drive and park in the designated parking space using the notification.
  • In step S207, the global map update determination unit 204 determines whether or not the global map held in the information holding unit 205 needs to be updated.
  • The global map update determination unit 204 determines that the global map is to be updated when there is a difference between the local map generated by the self-position estimation processing unit 201 and the corresponding portion of the global map held in the information holding unit 205.
  • If the global map update determination unit 204 determines in step S207 to update the global map, the process proceeds to step S208, and the portion of the global map held in the information holding unit 205 that differs from the local map (global map update information) is updated. That is, a process may be performed in which the part of the global map held in the information holding unit 205 that differs from the local map is updated so as to match the local map.
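The update decision can be sketched as comparing the local map against the corresponding portion of the global map and flagging the region when enough feature points disagree. The matching tolerance and the unmatched-point ratio below are assumptions chosen for illustration.

```python
def needs_update(global_points, local_points, tol=0.2, max_unmatched_ratio=0.1):
    """Return True when the local map differs enough from the global map.

    A local feature point is 'matched' when some global point lies
    within `tol` of it; if more than `max_unmatched_ratio` of the local
    points are unmatched, the global map is considered stale here."""
    if not local_points:
        return False
    def matched(p):
        return any(sum((a - b) ** 2 for a, b in zip(p, g)) ** 0.5 <= tol
                   for g in global_points)
    unmatched = sum(1 for p in local_points if not matched(p))
    return unmatched / len(local_points) > max_unmatched_ratio
```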
  • Alternatively, the local map itself may be supplied to the parking assistance device 51 via the communication unit 203. As described above, when the parking assistance device 51 receives the local map, it updates the global map that it holds.
  • When the global map update determination unit 204 determines that the global map needs to be updated, only the process of transmitting the local map to the parking support device 51, without updating the global map held in the information holding unit 205, may be executed. That is, when the global map held on the parking support device 51 side needs to be updated to match the environment (situation) in the parking lot 10, the parking support device 51 is notified that an update is necessary.
  • In this case, the global map update determination unit 204 can be configured to supply a local map indicating the location to be updated.
  • If it is determined in step S207 that the global map is not to be updated, or after the global map has been updated in step S208 (or the local map has been provided to the parking assist device 51 side), the process proceeds to step S209.
  • In step S209, it is determined whether or not a marker has been detected.
  • the self-position estimation processing unit 201 determines whether or not the marker held in the information holding unit 205 has been detected from the image captured by the imaging device 401. If it is determined in step S209 that a marker has been detected, the process proceeds to step S210.
  • In step S210, self-position correction is performed.
  • the projection device 53 installed at a predetermined position on the global map projects a marker pattern, and the marker is detected when the marker pattern projected by the projection device 53 is imaged.
  • the self-position estimation processing unit 201 corrects the position estimated by itself.
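The self-position correction can be sketched as blending the drifting dead-reckoned estimate toward the position derived from the known marker location. The blending weight is an assumption; the patent only states that the estimate is corrected, not how strongly.

```python
def correct_position(estimated, marker_derived, weight=0.8):
    """Blend the dead-reckoned position toward the marker-derived one.

    weight: trust placed in the marker observation
    (1.0 = replace the estimate entirely, 0.0 = ignore the marker)."""
    return tuple(e + weight * (m - e) for e, m in zip(estimated, marker_derived))
```

Because the projection device that projects the marker pattern sits at a known position on the global map, the marker-derived position anchors the estimate and removes accumulated drift.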
  • If it is determined in step S209 that no marker has been detected, or after the self-position correction in step S210, the process proceeds to step S211.
  • In step S211, it is determined whether or not the destination has been reached.
  • The parking start position may be set as the destination, or the destination may be reached only when parking in the parking space is complete. Further, the projection device 53 installed at the position set as the destination may project the marker pattern, and this determination can be made by determining whether or not the marker corresponding to that marker pattern has been detected.
  • The process returns to step S204 until it is determined in step S211 that the destination has been reached, and the subsequent processing is repeated. That is, the process of generating a local route and moving toward the destination based on it continues to be executed.
  • If it is determined in step S211 that the destination has been reached, the processing of the parking assistance vehicle side device 81 ends.
  • As described above, the parking assistance vehicle side device 81 travels along the route set by the parking assistance device 51 while creating a local map and a local route, so assistance can be provided when the driver parks in the parking lot 10 and the burden on the driver can be reduced.
  • Assistance until the vehicle leaves the parking space can be performed in the same way; the present technology can also be applied when leaving. That is, a route from the parking space to the exit gate is generated by the parking assist device 51, and the vehicle 20 travels based on that route, providing assistance until the vehicle exits.
  • the present technology can be applied to the parking lot 10 both indoors and outdoors.
  • In the case of an indoor parking lot, the projector 53 is installed on the ceiling; outdoors, a pole or the like is erected and the projector 53 is installed on it.
  • the present technology can be applied to support for a vehicle traveling on a highway.
  • The present technology is not applied only to the vehicle 20; it can also be applied to other objects (moving bodies). For example, the present technology can be applied to a vehicle (robot) traveling in a factory, an airplane (including a drone), and the like.
  • In the embodiment described above, the position, posture, speed, and the like of the vehicle 20 are estimated by a stereo camera system using two imaging devices 401 mounted on the vehicle 20.
  • the position, posture, speed, and the like of the moving body may be estimated using three or more imaging devices.
  • The present technology can be applied not only when the moving body is a vehicle moved by a prime mover, but also to a vehicle running on rails or under overhead lines, a vehicle moved by human power, and the like. Furthermore, the present technology can be applied regardless of differences in driving method (for example, automatic driving, manual driving, remote control, etc.).
  • The present technology can also be applied to places where many moving bodies come and go and blind spots from a moving body are likely to occur.
  • For example, where a person wearing a head-mounted display uses an AR (augmented reality) or VR (virtual reality) application, the present technology can be applied to avoid accidents such as collisions, rear-end collisions, and contact between people on a road or at an event venue where many people come and go, and to guide people to a predetermined position.
  • the series of processes described above can be executed by hardware or can be executed by software.
  • a program constituting the software is installed in the computer.
  • Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
  • FIG. 13 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
  • In the computer, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are connected to one another via a bus 1004.
  • An input / output interface 1005 is further connected to the bus 1004.
  • An input unit 1006, an output unit 1007, a storage unit 1008, a communication unit 1009, and a drive 1010 are connected to the input / output interface 1005.
  • the input unit 1006 includes a keyboard, a mouse, a microphone, and the like.
  • the output unit 1007 includes a display, a speaker, and the like.
  • the storage unit 1008 includes a hard disk, a nonvolatile memory, and the like.
  • the communication unit 1009 includes a network interface.
  • the drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • In the computer configured as described above, the CPU 1001 loads, for example, a program stored in the storage unit 1008 into the RAM 1003 via the input / output interface 1005 and the bus 1004 and executes it, whereby the series of processes described above is performed.
  • the program executed by the computer (CPU 1001) can be provided by being recorded on the removable medium 1011 as a package medium, for example.
  • the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the storage unit 1008 via the input / output interface 1005 by attaching the removable medium 1011 to the drive 1010. Further, the program can be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. In addition, the program can be installed in the ROM 1002 or the storage unit 1008 in advance.
  • The program executed by the computer may be a program that is processed in time series in the order described in this specification, or a program that is processed in parallel or at necessary timing, such as when a call is made.
  • In this specification, the term "system" represents an entire apparatus composed of a plurality of apparatuses.
  • Note that the present technology can also take the following configurations.
  • (1) An information processing apparatus including: a projection control unit that controls a plurality of projection devices that project a predetermined pattern; and a route creation unit that creates a route when a moving body moves to a destination, wherein the projection control unit controls the projection of the projection device located on the route created by the route creation unit.
  • (2) The information processing apparatus according to (1), wherein the projection control unit sets the pattern projected by the projection device installed at a predetermined position as a marker pattern, and sets the pattern projected by the projection devices installed at positions other than the predetermined position as a random pattern.
  • (4) The information processing apparatus according to any one of (1) to (3), wherein the route creation unit creates a route to the destination along which the moving body can move, using at least one of the width, length, height, and minimum turning radius of the moving body.
  • (5) The information processing apparatus according to any one of (1) to (4), wherein the destination is a parking space, and the route creation unit creates the route up to a position where the moving body starts an approach for parking in the parking space.
  • (6) The information processing apparatus according to any one of (1) to (5), further including a global map update unit that acquires a local map indicating the positions in three-dimensional space of feature points in an image photographed from the moving body, and updates the global map based on the local map.
  • the route created by the route creation unit, marker information of the marker pattern, and a global map indicating the position of a feature point in a predetermined region in a three-dimensional space are supplied to the moving body.
  • An information processing method including: controlling a plurality of projection devices that project a predetermined pattern; and creating a route when the moving body moves to the destination, wherein the projection control is performed by controlling the projection of the projection device located on the created route.
  • the information processing apparatus further including a local map generation unit that generates a local map indicating a position in a three-dimensional space of the feature point in the image captured by the imaging unit.
  • a determination unit that determines whether or not the position corresponding to the local map has changed in the global map; and
  • the information processing apparatus according to (11), wherein the route is corrected with the local map.
  • the information processing apparatus according to any one of (10) to (13), wherein when the marker is detected from the predetermined pattern, the self-position estimated by the estimation unit is corrected.
  • the control signal is a signal for notifying a user of the route.

Abstract

The present technique relates to an information processing device and an information processing method which make it easier for a driver to park. The information processing device is provided with a projection control unit which controls multiple projection devices that project a prescribed pattern, and a path creation unit which creates a path when a mobile body is moving to a target position; the projection control unit controls the projection by a projection device positioned on the path created by the path creation unit. A counterpart information processing device is provided with an imaging unit which images the prescribed pattern projected by the projection device, an estimation unit which estimates the self-position using the pattern imaged by the imaging unit, and a control signal generation unit which, on the basis of a path supplied from another device and the estimated self-position, generates a control signal for controlling each part so as to move along the path. This technique can be applied to systems for managing a parking lot.

Description

情報処理装置、情報処理方法Information processing apparatus and information processing method
 本技術は、情報処理装置、情報処理方法に関する。詳しくは、移動物体を目的地まで誘導する際の支援を行う情報処理装置、情報処理方法に関する。 This technology relates to an information processing apparatus and an information processing method. Specifically, the present invention relates to an information processing apparatus and an information processing method that provide support when guiding a moving object to a destination.
 従来、車両の駐車経路を算出して駐車動作を支援する駐車支援装置が提案されている。例えば、下記の特許文献1に記載の駐車支援システムは、駐車の対象となる環境において目印となる指標を設定し、その指標との距離や方向に基づいて車両を誘導して駐車支援を行う。 Conventionally, a parking support device that supports a parking operation by calculating a parking route of a vehicle has been proposed. For example, the parking support system described in Patent Document 1 below sets an index that serves as a mark in an environment that is to be parked, and performs parking support by guiding the vehicle based on the distance and direction from the index.
 特許文献1に記載の駐車支援システムでは、小石の存在、横風の発生、路面形状の経年劣化等の環境条件を原因として誘導中の車両の位置にズレが生じる可能性があり、ズレの生じ具合によっては指標を見失う可能性がある。その場合に、車両の位置を的確に検出できなくなることから駐車支援を適切に行うことができなくなってしまう可能性がある。 In the parking assistance system described in Patent Document 1, there is a possibility that the position of the vehicle being guided may be displaced due to environmental conditions such as the presence of pebbles, the occurrence of crosswinds, and aging of the road surface shape. Some indicators may be missed. In that case, since it becomes impossible to detect the position of the vehicle accurately, there is a possibility that parking assistance cannot be performed appropriately.
 そこで特許文献2では、自車両の周辺状況を検出してマップ情報において設定されたマーカーを検出するとともに、検出されたマーカーと自車両との相対的な位置関係に基づいて自車両の現在位置を推定することが提案されている。 Therefore, in Patent Document 2, the surrounding state of the host vehicle is detected to detect the marker set in the map information, and the current position of the host vehicle is determined based on the relative positional relationship between the detected marker and the host vehicle. It has been proposed to estimate.
特開2008-087676号公報JP 2008-087676 A 特開2015-083430号公報Japanese Patent Laying-Open No. 2015-083430
 特許文献2では、予め固定のマーカーが設置され、そのマーカーを検出することで処理が行われる。しかしながら、予め設置されているマーカーが駐車車両などにより隠されてしまい、検出できない可能性がある。特許文献2においても、環境条件を原因とするずれが発生する可能性があり、車両の位置を的確に検出できなくなることから駐車支援を適切に行うことができなくなってしまう可能性がある。 In Patent Document 2, a fixed marker is set in advance, and processing is performed by detecting the marker. However, there is a possibility that a marker installed in advance is hidden by a parked vehicle and cannot be detected. Also in Patent Document 2, there is a possibility that a shift due to environmental conditions may occur, and it becomes impossible to appropriately perform parking support because the position of the vehicle cannot be accurately detected.
 本技術は、このような状況に鑑みてなされたものであり、車両の位置を的確に検出し、所定の場所までの誘導を適切に行えるようにすることができるようにするものである。 The present technology has been made in view of such a situation, and is capable of accurately detecting the position of a vehicle and appropriately guiding to a predetermined place.
 本技術の一側面の第1の情報処理装置は、所定のパターンを投影する複数の投影装置を制御する投影制御部と、移動体が目的地まで移動するときの経路を作成する経路作成部とを備え、前記投影制御部は、前記経路作成部で作成された前記経路上に位置する前記投影装置の投影を制御する。 A first information processing apparatus according to an aspect of the present technology includes a projection control unit that controls a plurality of projection apparatuses that project a predetermined pattern, and a path creation unit that creates a path when the moving body moves to a destination. The projection control unit controls the projection of the projection device located on the path created by the path creation unit.
 前記投影制御部は、所定の位置に設置されている前記投影装置が投影するパターンを、マーカーパターンとし、前記所定の位置以外に設置されている前記投影装置が投影するパターンを、ランダムパターンとするようにすることができる。 The projection control unit sets a pattern projected by the projection apparatus installed at a predetermined position as a marker pattern, and sets a pattern projected by the projection apparatus installed at a position other than the predetermined position as a random pattern. Can be.
 前記所定の位置は、曲がり角、または前記目的地付近の少なくとも一方であるようにすることができる。 The predetermined position may be at least one of a corner or the vicinity of the destination.
 前記経路作成部は、前記移動体の幅、長さ、高さ、最小回転半径のうちの少なくとも1つの情報を用いて、前記移動体が移動できる前記目的地までの経路を作成するようにすることができる。 The route creation unit creates a route to the destination to which the moving body can move using at least one information of the width, length, height, and minimum turning radius of the moving body. be able to.
 前記目的地は、駐車スペースであり、前記経路作成部は、前記移動体が、前記駐車スペースに駐車するためのアプローチを開始する位置まで前記経路として作成するようにすることができる。 The destination may be a parking space, and the route creation unit may create the route up to a position where the moving body starts an approach for parking in the parking space.
 所定の領域内の特徴点の3次元空間の位置を示すグローバルマップを保持する保持部と、前記移動体から撮影された画像内の特徴点の3次元空間内の位置を示すローカルマップを取得し、前記ローカルマップに基づいて、前記グローバルマップを更新するグローバルマップ更新部をさらに備えるようにすることができる。 A holding unit that holds a global map indicating the position of the feature point in the predetermined area in the three-dimensional space, and a local map that indicates the position of the feature point in the image captured from the moving body in the three-dimensional space. A global map update unit that updates the global map based on the local map may be further provided.
 前記経路作成部は、前記保持部に保持されている前記グローバルマップを参照して経路を作成するようにすることができる。 The route creation unit can create a route with reference to the global map held in the holding unit.
 前記移動体に、前記経路作成部で作成された経路、前記マーカーパターンのマーカー情報、および所定の領域内の特徴点の3次元空間の位置を示すグローバルマップを供給するようにすることができる。 The global map indicating the path created by the path creating unit, the marker information of the marker pattern, and the position of the feature point in a predetermined area can be supplied to the moving body.
 本技術の一側面の第1の情報処理方法は、所定のパターンを投影する複数の投影装置を制御し、移動体が目的地まで移動するときの経路を作成するステップを含み、前記投影の制御は、作成された前記経路上に位置する前記投影装置の投影を制御することで行われる。 A first information processing method according to an aspect of the present technology includes a step of controlling a plurality of projection devices that project a predetermined pattern, and creating a path when the moving body moves to a destination, and controls the projection Is performed by controlling the projection of the projection device located on the created path.
 本技術の一側面の第2の情報処理装置は、投影装置で投影された所定のパターンを撮像する撮像部と、前記撮像部で撮像された前記パターンを用いて自己位置の推定を行う推定部と、他の装置から供給される経路と、推定された前記自己位置に基づき、前記経路上を移動するために各部を制御するための制御信号を生成する制御信号生成部とを備える。 A second information processing apparatus according to an aspect of the present technology includes an imaging unit that captures a predetermined pattern projected by a projection device, and an estimation unit that estimates a self-position using the pattern captured by the imaging unit And a path supplied from another device, and a control signal generation unit that generates a control signal for controlling each unit to move on the path based on the estimated self-position.
 前記撮像部で撮影された画像内の特徴点の3次元空間内の位置を示すローカルマップを生成するローカルマップ生成部をさらに備えるようにすることができる。 A local map generation unit that generates a local map indicating the positions in three-dimensional space of feature points in the image captured by the imaging unit may further be provided.
 前記他の装置から供給される所定の領域内の特徴点の3次元空間の位置を示すグローバルマップを保持する保持部と、前記グローバルマップ内で、前記ローカルマップに対応する位置に変更があったか否かを判定する判定部とをさらに備え、前記判定部により、変更があったと判定された場合、前記ローカルマップを前記他の装置に供給するようにすることができる。 A holding unit that holds a global map, supplied from the other apparatus, indicating the positions in three-dimensional space of feature points within a predetermined area, and a determination unit that determines whether the position corresponding to the local map has changed within the global map, may further be provided; when the determination unit determines that a change has occurred, the local map can be supplied to the other apparatus.
 前記経路を、前記ローカルマップで修正するようにすることができる。 The route can be corrected with the local map.
 前記所定のパターンからマーカーが検出された場合、前記推定部で推定されている前記自己位置を補正するようにすることができる。 When the marker is detected from the predetermined pattern, the self-position estimated by the estimation unit can be corrected.
 前記制御信号は、ユーザに前記経路を通知する信号であるようにすることができる。 The control signal may be a signal that notifies the user of the route.
 本技術の一側面の第2の情報処理方法は、投影装置で投影された所定のパターンを撮像し、撮像された前記パターンを用いて自己位置の推定を行い、他の装置から供給される経路と、推定された前記自己位置に基づき、前記経路上を移動するために各部を制御するための制御信号を生成するステップを含む。 A second information processing method according to one aspect of the present technology includes the steps of capturing a predetermined pattern projected by a projection device, estimating a self-position using the captured pattern, and generating, based on a route supplied from another apparatus and the estimated self-position, a control signal for controlling each unit so as to move along the route.
 本技術の一側面の第1の情報処理装置、情報処理方法においては、所定のパターンを投影する複数の投影装置が制御され、移動体が目的地まで移動するときの経路が作成される。また投影の制御は、作成された経路上に位置する投影装置の投影を制御することで行われる。 In the first information processing apparatus and information processing method according to one aspect of the present technology, a plurality of projection apparatuses that project a predetermined pattern are controlled, and a route when the moving body moves to the destination is created. Further, the projection is controlled by controlling the projection of the projection device located on the created path.
 本技術の一側面の第2の情報処理装置、情報処理方法においては、投影装置で投影された所定のパターンが撮像され、撮像部で撮像されたパターンが用いられて自己位置の推定が行われ、他の装置から供給される経路と、推定された自己位置に基づき、経路上を移動するために各部を制御するための制御信号が生成される。 In the second information processing apparatus and information processing method according to one aspect of the present technology, a predetermined pattern projected by a projection device is captured, a self-position is estimated using the pattern captured by the imaging unit, and a control signal for controlling each unit so as to move along a route is generated based on the route supplied from another apparatus and the estimated self-position.
 本技術の一側面によれば、車両の位置を的確に検出し、所定の場所までの誘導を適切に行えるようにすることができる。 According to one aspect of the present technology, the position of a vehicle can be detected accurately and the vehicle can be guided appropriately to a predetermined place.
 なお、ここに記載された効果は必ずしも限定されるものではなく、本開示中に記載されたいずれかの効果であってもよい。 It should be noted that the effects described here are not necessarily limited, and may be any of the effects described in the present disclosure.
駐車場の構成について説明するための図である。 A diagram illustrating the configuration of a parking lot.
経路について説明するための図である。 A diagram illustrating a route.
駐車支援システムの構成について説明するための図である。 A diagram illustrating the configuration of a parking assistance system.
投影装置に割り振られたIDについて説明するための図である。 A diagram illustrating the IDs assigned to projection devices.
投影装置の投影範囲について説明するための図である。 A diagram illustrating the projection range of a projection device.
駐車支援システムの動作について説明するためのフローチャートである。 A flowchart illustrating the operation of the parking assistance system.
駐車支援装置の構成を示す図である。 A diagram showing the configuration of a parking assistance device.
駐車支援装置の動作について説明するためのフローチャートである。 A flowchart illustrating the operation of the parking assistance device.
駐車支援車両側装置の構成を示す図である。 A diagram showing the configuration of a parking assistance vehicle-side device.
自己位置推定処理部の構成を示す図である。 A diagram showing the configuration of a self-position estimation processing unit.
車両側の撮像装置の位置について説明するための図である。 A diagram illustrating the position of the vehicle-side imaging device.
駐車支援車両側装置の動作について説明するためのフローチャートである。 A flowchart illustrating the operation of the parking assistance vehicle-side device.
記録媒体について説明するための図である。 A diagram illustrating a recording medium.
 以下に、本技術を実施するための形態(以下、実施の形態という)について説明する。なお、説明は、以下の順序で行う。
 1.駐車支援について
 2.駐車支援システムの構成
 3.駐車支援システムの動作
 4.駐車支援装置の構成
 5.駐車支援装置の動作
 6.駐車支援車両側装置の構成
 7.駐車支援車両側装置の動作
 8.記録媒体について
Hereinafter, modes for carrying out the present technology (hereinafter referred to as embodiments) will be described. The description will be given in the following order.
1. Parking assistance
2. Configuration of the parking assistance system
3. Operation of the parking assistance system
4. Configuration of the parking assistance device
5. Operation of the parking assistance device
6. Configuration of the parking assistance vehicle-side device
7. Operation of the parking assistance vehicle-side device
8. About the recording medium
 <駐車支援について>
 本技術は、駐車場内で、空きスペースに車両を誘導し、駐車するときの支援を行うシステムに適用できる。例えば、図1に示したような駐車場10があり、図2に示すように、駐車場10に入場してきた車両20を、矢印に示した経路で、空きスペースに誘導するときのシステムに適用できる。
<About parking assistance>
The present technology can be applied to a system that guides a vehicle to an empty space in a parking lot and assists the parking. For example, given a parking lot 10 as shown in FIG. 1, it can be applied to a system that guides a vehicle 20 entering the parking lot 10 to an empty space along the route indicated by the arrow in FIG. 2.
 図1に示したように、駐車場10には、入場ゲート11が設置されており、白線(図では黒線)などで駐車スペースが示されている。なお、ここでは入場ゲート11が設置されており、入場ゲート11で車両20が一時停止するとして説明を続けるが、入場ゲート11が設置されていない駐車場や、車両20が一時停止する場所が設けられていない駐車場に対しても、以下に説明する技術を適用できる。 As shown in FIG. 1, an entrance gate 11 is installed in the parking lot 10, and parking spaces are indicated by white lines (black lines in the figure). Here, the description continues assuming that the entrance gate 11 is installed and that the vehicle 20 stops temporarily at the entrance gate 11; however, the technology described below can also be applied to a parking lot in which no entrance gate 11 is installed, or in which no place is provided for the vehicle 20 to stop temporarily.
 入場してきた車両20がある場合、その車両20に対して、その時点における環境を考慮し、適切と思われる駐車スペースの特定が行われ、その駐車スペースまでの経路が生成される。ここでは、図2に矢印で示した経路が生成されたとして説明を続ける。車両20は、生成された経路に基づいて、走行し、駐車スペースまで移動し、駐車する。 When there is a vehicle 20 that has entered, a parking space that is considered appropriate is identified for the vehicle 20 in consideration of the environment at that time, and a route to the parking space is generated. Here, the description will be continued assuming that the route indicated by the arrow in FIG. 2 has been generated. The vehicle 20 travels based on the generated route, moves to the parking space, and parks.
 このような経路を生成することで、駐車スペースまでの移動の支援や、駐車スペースに到着し、駐車スペースに車両20を停めるまでの処理を行う駐車支援システムについて説明を加える。 The following describes a parking assistance system that generates such a route, assists the movement to the parking space, and performs the processing from arrival at the parking space until the vehicle 20 is parked in it.
 なおここでは、車両20を例に挙げて説明を続けるが、車両20ではなく、移動体(移動物体)に対して、本技術を適用できる。 Although the description here continues using the vehicle 20 as an example, the present technology is applicable not only to the vehicle 20 but to moving bodies (moving objects) in general.
 <駐車支援システムの構成>
 図3は、駐車支援システムの一実施の形態の構成を示す図である。図3に示した駐車支援システムは、駐車支援装置51、撮像装置52、投影装置53、ネットワーク61、データベース71、および駐車支援車両側装置81を含む構成とされている。
<Configuration of parking support system>
FIG. 3 is a diagram illustrating a configuration of an embodiment of the parking assistance system. The parking assistance system shown in FIG. 3 includes a parking assistance device 51, an imaging device 52, a projection device 53, a network 61, a database 71, and a parking assistance vehicle side device 81.
 駐車支援装置51は、例えば図1に示した駐車場10を管理する装置である。詳細は後述するが、駐車支援装置51は、経路を作成し、作成した経路を車両20に対して提供し、グローバルマップ(詳細は後述する)の保持や更新を行う。 The parking support device 51 is a device that manages the parking lot 10 shown in FIG. Although details will be described later, the parking assistance device 51 creates a route, provides the created route to the vehicle 20, and holds and updates a global map (details will be described later).
 撮像装置52は、入場ゲート11(図1)付近に設置され、駐車場10に入場してきた車両20を撮像する。駐車支援装置51は、撮像装置52で撮像された車両20の車幅、車長、車高、最小回転半径などの車両に関する情報(以下、車両情報と適宜記述する)を取得する。取得する際、ネットワーク61に接続されているデータベース71が参照されるように構成することができる。 The imaging device 52 is installed near the entrance gate 11 (FIG. 1) and images the vehicle 20 that has entered the parking lot 10. The parking assist device 51 acquires information about the vehicle such as the vehicle width, the vehicle length, the vehicle height, and the minimum turning radius of the vehicle 20 imaged by the imaging device 52 (hereinafter referred to as vehicle information as appropriate). When acquiring, the database 71 connected to the network 61 can be referred to.
 ネットワーク61は、WAN(Wide Area Network)やLAN(Local Area Network)等から構成されるネットワークであり、駐車支援装置51とデータベース71は、ネットワーク61を介して、互いに情報(データ)の授受を行うことができる構成とされている。 The network 61 is a network composed of a WAN (Wide Area Network), a LAN (Local Area Network), or the like, and the parking assistance device 51 and the database 71 are configured to be able to exchange information (data) with each other via the network 61.
 データベース71は、上記したように、車両20に関する情報を管理するデータベースである。ここではデータベース71は、ネットワーク61に接続されている例を示したが、駐車支援装置51内に設けられている構成とすることもできる。 As described above, the database 71 is a database that manages information on vehicles such as the vehicle 20. Although an example in which the database 71 is connected to the network 61 is shown here, the database 71 may instead be provided within the parking assistance device 51.
 また、ここでは、データベース71が参照されて車両20に関する情報が取得されるとして説明を続けるが、車両20に関する情報が他の方法で取得されるようにしても良い。例えば、撮像装置52で撮像された画像を解析し、車幅や車高といった情報が取得されるようにしても良い。 In addition, here, the description is continued on the assumption that information on the vehicle 20 is acquired with reference to the database 71, but information on the vehicle 20 may be acquired by other methods. For example, information such as the vehicle width and the vehicle height may be acquired by analyzing an image captured by the imaging device 52.
 駐車支援装置51には、投影装置53も接続されている。投影装置53は、駐車支援装置51からの指示に基づき、所定のパターンを床や駐車されている車両などの駐車場10内の物体に投影する。投影装置53は、駐車場10に複数設置されている。また、駐車場10が、屋内駐車場である場合、投影装置53は、天井に設置され、床や車両に所定のパターンを投影する。 A projection device 53 is also connected to the parking assistance device 51. The projection device 53 projects a predetermined pattern onto objects in the parking lot 10, such as the floor or parked vehicles, based on instructions from the parking assistance device 51. A plurality of projection devices 53 are installed in the parking lot 10. When the parking lot 10 is an indoor parking lot, the projection devices 53 are installed on the ceiling and project predetermined patterns onto the floor and the vehicles.
 車両20は、投影装置53で投影されたパターンを撮像する撮像装置(カメラ)を備えている。駐車支援車両側装置81は、車両20に備えられている撮像装置で撮像されたパターンを解析し、自己位置の推定やローカルマップ(詳細は、後述する)の作成を行う。 The vehicle 20 includes an imaging device (camera) that images the pattern projected by the projection device 53. The parking assistance vehicle side device 81 analyzes the pattern imaged by the imaging device provided in the vehicle 20, and estimates the self position and creates a local map (details will be described later).
 このような構成を有する駐車支援システムにおいて、上記したように、投影装置53は、駐車場10内に複数設置されている。ここでは、駐車場10内に設置されている投影装置53には、それぞれIDが割り振られており、IDで個々の投影装置53が識別できるようにされている。ここでは図4に示すように、駐車場10内には、21台の投影装置53が設置され、それぞれIDが割り振られているとして説明を続ける。 In the parking support system having such a configuration, a plurality of projection devices 53 are installed in the parking lot 10 as described above. Here, an ID is assigned to each of the projection devices 53 installed in the parking lot 10, and each projection device 53 can be identified by the ID. Here, as shown in FIG. 4, the description will be continued assuming that 21 projection devices 53 are installed in the parking lot 10 and are assigned IDs.
 21台の投影装置53には、それぞれID1乃至ID21が割り振られている。以下、例えば、ID1が割り振られた投影装置53を、ID1の投影装置53と記述し、ID2が割り振られた投影装置53を、ID2の投影装置53と記述する。他の投影装置53も同様に記述することで駐車場10内の個々の投影装置53を区別する。 The 21 projection devices 53 are assigned ID1 through ID21, respectively. Hereinafter, for example, the projection device 53 to which ID1 is assigned is referred to as the ID1 projection device 53, and the projection device 53 to which ID2 is assigned as the ID2 projection device 53. The other projection devices 53 are referred to in the same way, so that the individual projection devices 53 in the parking lot 10 can be distinguished.
 例えば、ID1の投影装置53は、入場ゲート11の近傍に設置され、ID2の投影装置53は、ID1の投影装置53の右側に設置されている。なお個々の投影装置53を識別する情報としては、IPアドレスなどを用いることもできる。 For example, the ID1 projection device 53 is installed in the vicinity of the entrance gate 11, and the ID2 projection device 53 is installed on the right side of the ID1 projection device 53. Note that an IP address or the like can also be used as information for identifying each projection device 53.
 図4に示したように、説明のため、1台の車両用の駐車スペースに番号を付す。図4に示した例では、1乃至48の番号を付した。すなわち、駐車場10内には、48台分の駐車スペースが設けられている。以下の説明においては、例えば、駐車スペース1との記載をし、駐車スペース1は、図4の駐車場10内において、入場ゲート11に近い位置にある駐車スペースを示すとする。他の駐車スペースも同様に記載する。 As shown in FIG. 4, for the sake of explanation, each single-vehicle parking space is given a number; in the example shown in FIG. 4, the numbers 1 through 48 are assigned. That is, 48 parking spaces are provided in the parking lot 10. In the following description, for example, "parking space 1" denotes the parking space near the entrance gate 11 in the parking lot 10 of FIG. 4; the other parking spaces are referred to in the same way.
 ここで、図2と図4を参照する。図2に矢印で示した経路が、車両20の駐車スペースまでの経路として決定されたとする。この場合、まず駐車スペースとして、駐車スペース37が設定され、入場ゲート11から駐車スペース37までの経路として、矢印のような経路が設定された場合である。 Here, refer to FIGS. 2 and 4, and assume that the route indicated by the arrow in FIG. 2 has been determined as the route to the parking space for the vehicle 20. This is the case in which parking space 37 is first set as the target parking space, and the route indicated by the arrow is set as the route from the entrance gate 11 to parking space 37.
 矢印の経路上には、ID1の投影装置53、ID2の投影装置53、ID3の投影装置53、ID7の投影装置53、ID11の投影装置53があり、駐車スペース37の近傍にID12の投影装置53が設置されている。 On the route indicated by the arrow are the ID1, ID2, ID3, ID7, and ID11 projection devices 53, and the ID12 projection device 53 is installed in the vicinity of parking space 37.
 このような場合、駐車支援装置51から、ID1、ID2、ID3、ID7、ID11、ID12の投影装置53に、それぞれ所定のパターンの投影の指示が出される。投影装置53は、図5に示すように天井91に設置され、床92に所定のパターンを投影する。図5では、ID1の投影装置53とID2の投影装置53を示した。隣接するID1の投影装置53とID2の投影装置53の、それぞれの投影範囲に重なりがあるように、投影装置53が設置されていても良いし、投影範囲に重なりがないように、投影装置53が設置されていても良い。 In such a case, the parking assistance device 51 instructs the ID1, ID2, ID3, ID7, ID11, and ID12 projection devices 53 to project their predetermined patterns. As shown in FIG. 5, each projection device 53 is installed on the ceiling 91 and projects a predetermined pattern onto the floor 92. FIG. 5 shows the ID1 projection device 53 and the ID2 projection device 53. The projection devices 53 may be installed so that the projection ranges of the adjacent ID1 and ID2 projection devices 53 overlap, or so that the projection ranges do not overlap.
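For illustration only (not part of the embodiment), the selection of the projection devices 53 to activate along a created route, as in the ID1/ID2/ID3/ID7/ID11/ID12 example above, could be sketched as follows in Python. The projector coordinates, route waypoints, and coverage radius are assumed values.

```python
# Illustrative sketch: choosing which projection devices to activate for a
# created route. All positions and the coverage radius are assumptions.

PROJECTOR_POSITIONS = {
    1: (0.0, 0.0), 2: (10.0, 0.0), 3: (20.0, 0.0),
    7: (20.0, 10.0), 11: (20.0, 20.0), 12: (30.0, 20.0),
}
COVERAGE_RADIUS = 6.0  # metres illuminated around each projector (assumed)

def projectors_on_route(route, positions=PROJECTOR_POSITIONS,
                        radius=COVERAGE_RADIUS):
    """Return the sorted IDs whose coverage circle contains any waypoint."""
    ids = set()
    for x, y in route:
        for pid, (px, py) in positions.items():
            if (x - px) ** 2 + (y - py) ** 2 <= radius ** 2:
                ids.add(pid)
    return sorted(ids)

route = [(0, 0), (10, 0), (20, 0), (20, 10), (20, 20), (30, 20)]
print(projectors_on_route(route))  # → [1, 2, 3, 7, 11, 12]
```

Only the devices returned by such a selection would then receive the projection instruction from the parking assistance device 51.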
 投影されるパターンは、可視光によるものでも良いし、可視光以外の、例えば赤外線などによるものでも良い。 The projected pattern may be visible light, or may be other than visible light, such as infrared rays.
 投影装置53から投影されるパターンは、ランダムパターンとされるが、特定の位置に設置されている投影装置53から投影されるパターンは、マーカーパターンとされる。マーカーパターンを投影する投影装置53は、曲がり角に設置されている投影装置53とすることができる。 The pattern projected from the projection device 53 is a random pattern, but the pattern projected from the projection device 53 installed at a specific position is a marker pattern. The projection device 53 that projects the marker pattern can be a projection device 53 installed at a corner.
 後述するように、投影されたパターンは、車両20の撮像装置に撮像され、駐車支援車両側装置81は、撮像された画像から、特徴点を検出し、マッチングなどの解析を行うことで、自己位置を推定したり、ローカルマップを作成したりする。投影装置53から投影されるパターンをランダムパターンとすることで、駐車支援車両側装置81では、自己位置の推定を行いやすくなる。 As will be described later, the projected pattern is captured by the imaging device of the vehicle 20, and the parking assistance vehicle-side device 81 detects feature points from the captured image and performs analysis such as matching, thereby estimating the self-position and creating a local map. Making the patterns projected from the projection devices 53 random patterns makes it easier for the parking assistance vehicle-side device 81 to estimate the self-position.
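As a minimal, hypothetical sketch of the matching-based estimation described above (not the method of the embodiment itself): assume the world coordinates of the projected feature points are known to the vehicle from the supplied pattern data, the camera reports each matched point's position relative to the vehicle, and the pose is reduced to a 2-D translation obtained by averaging the differences.

```python
# Hypothetical sketch of self-position estimation from matched pattern dots.
# All coordinates below are illustrative assumptions.

def estimate_position(world_dots, observed_dots):
    """world_dots[i] and observed_dots[i] are the same matched feature point."""
    n = len(world_dots)
    ex = sum(wx - ox for (wx, _), (ox, _) in zip(world_dots, observed_dots)) / n
    ey = sum(wy - oy for (_, wy), (_, oy) in zip(world_dots, observed_dots)) / n
    return (ex, ey)

world = [(12.0, 4.0), (13.5, 4.5), (12.5, 5.0)]    # dot positions in the map
relative = [(2.0, -1.0), (3.5, -0.5), (2.5, 0.0)]  # as seen from the vehicle
print(estimate_position(world, relative))  # → (10.0, 5.0)
```

A real implementation would estimate a full pose (position and heading) robustly, for example with a RANSAC-style fit, but the averaging above captures the core idea of recovering position from matched projected features.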
 仮に、投影装置53から投影されるパターンを、格子形状などの所定の形状のパターンとし、ID1の投影装置53とID2の投影装置53から同一の格子形状のパターンが投影されるとした場合、駐車支援車両側装置81が撮像したパターンは、ID1の投影装置53で投影されたパターンであるのか、ID2の投影装置53で投影されたパターンであるのか判定しづらく、自己位置の推定やローカルマップの作成に誤差が発生する可能性がある。 If the pattern projected from the projection devices 53 were a pattern of a predetermined shape such as a lattice, and the same lattice pattern were projected from both the ID1 projection device 53 and the ID2 projection device 53, it would be difficult for the parking assistance vehicle-side device 81 to determine whether a captured pattern had been projected by the ID1 projection device 53 or by the ID2 projection device 53, and errors could occur in the self-position estimation and in the creation of the local map.
 投影装置53から投影されるパターンを、ランダムパターンとすることで、個々の投影装置53からのパターンを異なるようにすることができ、上記したような駐車支援車両側装置81が自己位置の推定などに誤差が発生するようなことを抑制することが可能となる。 By making the patterns projected from the projection devices 53 random patterns, the pattern from each projection device 53 can be made different, and the occurrence of errors in the self-position estimation and the like by the parking assistance vehicle-side device 81, as described above, can be suppressed.
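One simple way to guarantee that each projection device 53 projects a distinct yet reproducible random pattern is to seed a pseudo-random generator with the projector ID. This is an assumed scheme for illustration; the text above only requires that the patterns differ between devices.

```python
import random

# Assumed scheme: seed the generator with the projector ID so that every
# projection device gets a distinct but reproducible random dot pattern.

def random_pattern(projector_id, n_dots=100, size=1.0):
    rng = random.Random(projector_id)  # same ID -> identical pattern every time
    return [(rng.uniform(0, size), rng.uniform(0, size)) for _ in range(n_dots)]

p1, p2 = random_pattern(1), random_pattern(2)
assert p1 == random_pattern(1)  # reproducible for a given projector
assert p1 != p2                 # distinguishable between projectors
```

Because each ID yields a unique dot layout, a captured pattern can be attributed to exactly one projector, which is the disambiguation property the paragraph above relies on.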
 また、投影装置53から投影されるパターンに、ランダムパターンとマーカーパターンとを設ける。マーカーパターンは、自己位置推定の補正などに用いられ、例えば、曲がり角に設置されている投影装置53が投影するパターンとされる。 The patterns projected from the projection devices 53 include random patterns and marker patterns. A marker pattern is used for correcting the self-position estimate and the like, and is, for example, the pattern projected by a projection device 53 installed at a corner.
 例えば、図4を再度参照するに、ID1、ID3、ID5、ID9、ID11、ID13、ID17、ID19、ID21の投影装置53は、曲がり角に設置されており、これらの投影装置53は、マーカーパターンを投影する。これらの投影装置53以外の投影装置53は、ランダムパターンを投影する。 For example, referring again to FIG. 4, the ID1, ID3, ID5, ID9, ID11, ID13, ID17, ID19, and ID21 projection devices 53 are installed at corners, and these projection devices 53 project marker patterns. The projection devices 53 other than these project random patterns.
 駐車支援車両側装置81は、マーカーパターンを撮像したとき、そのマーカーパターンで得られる情報から、推定している自己位置を補正する。また、誘導する駐車スペースのところ(近傍)に設置されている投影装置53は、マーカーパターンを投影するようにしても良い。例えば、駐車スペース37が誘導する駐車スペースとされているときには、ID12の投影装置53は、ランダムパターンではなく、マーカーパターンを投影するようにしても良い。このように構成した場合、ID12の投影装置53は、ランダムパターンとマーカーパターンを切り換えて投影する構成とされている。 When the parking assistance vehicle-side device 81 captures a marker pattern, it corrects its estimated self-position using the information obtained from that marker pattern. The projection device 53 installed at (in the vicinity of) the target parking space may also project a marker pattern. For example, when parking space 37 is the target parking space, the ID12 projection device 53 may project a marker pattern instead of a random pattern; in such a configuration, the ID12 projection device 53 switches between projecting a random pattern and a marker pattern.
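The marker-based correction can be sketched as follows, under assumed mechanics: dead reckoning drifts over time, and when a marker pattern with a known world position is recognized, the estimate is replaced by the marker position plus the measured offset. The marker table and all coordinates are illustrative, not values from the disclosure.

```python
# Assumed mechanics of the marker correction: snap the drifting estimate back
# to the marker-derived position. Marker names and positions are illustrative.

MARKER_POSITIONS = {"corner_ID11": (20.0, 20.0)}

def correct_estimate(marker_id, offset_from_marker, markers=MARKER_POSITIONS):
    """Return the corrected position: marker position + offset to the vehicle."""
    mx, my = markers[marker_id]
    dx, dy = offset_from_marker  # vehicle position relative to the marker
    return (mx + dx, my + dy)

drifted = (19.3, 20.9)  # dead-reckoned estimate with accumulated error
corrected = correct_estimate("corner_ID11", (0.5, -0.2))
print(corrected)  # → (20.5, 19.8)
```

Placing such markers at corners and near the target parking space, as described above, bounds the accumulated error at exactly the points where precise positioning matters most.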
 このように、投影装置53は、ランダムパターンまたはマーカーパターンのどちらか一方のみを投影するように構成することもできるし、必要に応じ、ランダムパターンまたはマーカーパターンを切り換えて投影するように構成することもできる。 In this way, each projection device 53 can be configured to project only one of a random pattern or a marker pattern, or to switch between the random pattern and the marker pattern as necessary.
 投影装置53は、図5に示したように、天井91に設置され、床92に所定のパターンを投影するように設置されている。天井91から床92に対して投影されるようにすることで、遮蔽物などの影響を受けず、車両20が走行する路面上で、投影されているパターンが遮断されるようなことを防ぐことが可能となる。 As shown in FIG. 5, the projection devices 53 are installed on the ceiling 91 so as to project predetermined patterns onto the floor 92. Projecting from the ceiling 91 onto the floor 92 prevents the projected pattern from being interrupted on the road surface on which the vehicle 20 travels, free from the influence of obstructions.
 また、駐車場10内では、駐車スペースに車両が駐車しているところと、駐車していないところがある。また駐車している車両が、きちんと駐車スペース内に収まって駐車していない場合もある。すなわち、駐車場10内の環境は、常に変化している。投影装置53からの投影は、駐車している車両や、置かれている荷物など(例えば、ショッピングカート、車両に積み卸ししている物、人などの物体)にも行われる。よって、駐車支援車両側装置81は、駐車している車両などの物体に投影されたパターンも撮像し、解析することができる。 In the parking lot 10, some parking spaces are occupied by vehicles and others are not, and a parked vehicle may not be neatly contained within its parking space. That is, the environment in the parking lot 10 is constantly changing. The projection from the projection devices 53 also falls on parked vehicles, placed luggage, and the like (for example, shopping carts, objects being loaded into or unloaded from vehicles, and people). The parking assistance vehicle-side device 81 can therefore also capture and analyze patterns projected onto objects such as parked vehicles.
 駐車支援車両側装置81は、駐車場10内の物体も考慮したローカルマップの作成や、自己位置の推定を行うことができ、駐車場10内の物体も考慮して、走行すべき経路を推定することができる。例えば、駐車スペースではなく、走行するエリアまではみ出している車両や、荷物や人など、駐車場10内に常にある物体ではない物体を避けて走行する経路を推定することができる。 The parking assistance vehicle-side device 81 can create a local map and estimate its self-position while taking objects in the parking lot 10 into account, and can estimate the route to travel in consideration of those objects. For example, it can estimate a route that avoids objects that are not permanent features of the parking lot 10, such as a vehicle protruding out of its parking space into the driving area, luggage, or people.
 仮に、投影装置53でパターンを投影するのではなく、床や柱などに所定のパターンが描かれ、そのパターンを撮像して自己位置の推定などが行われる場合、床や柱などに所定のパターンが、駐車している車両、物、人といった物体により遮蔽され、撮像できない可能性がある。また、描かれたパターンだと、経年劣化により消えてしまう可能性もある。このようなことから、描かれたパターンを利用し、自己位置を推定するようにした場合、正しく推定できない可能性が高くなる。 Suppose that, instead of projecting a pattern with the projection devices 53, a predetermined pattern were drawn on the floor, pillars, or the like, and the self-position were estimated by imaging that pattern. The drawn pattern could be occluded by objects such as parked vehicles, goods, or people and thus could not be imaged. A drawn pattern could also disappear through aging. For these reasons, if a drawn pattern were used to estimate the self-position, the likelihood of incorrect estimation would be high.
 しかしながら、投影装置53でパターンを投影することで、経年劣化によりパターンが消えてしまうようなことを防ぐことができる。また、投影装置53でパターンを天井から投影、換言すれば、上方から投影することで、駐車している車両、物、人といった物体によりパターンが遮蔽されてしまうようなことを防ぐことができるだけでなく、車両、物、人といった物体までも検知できるようになり、より精度高く自己位置を推定したり、経路を推定したりすることが可能となる。 However, projecting the pattern with the projection devices 53 prevents the pattern from disappearing through aging. Moreover, projecting the pattern from the ceiling, in other words from above, not only prevents the pattern from being occluded by objects such as parked vehicles, goods, or people, but also makes it possible to detect such objects, so that the self-position and the route can be estimated with higher accuracy.
 ところで、自己位置を推定するのには、一般的にGPS(Global Positioning System)通信により行う方法が用いられている。しかしながら、GPS通信は、屋内では、通信が不安定になり、精度良く位置を推定するのが困難である。また、ジャイロセンサや加速度センサなどの慣性計測装置(IMU:Inertial Measurement Unit)を用いた自己位置の推定も行われているが、高精度で位置を推定するのは難しい。 Incidentally, self-position estimation is generally performed using GPS (Global Positioning System) communication. Indoors, however, GPS communication becomes unstable, making accurate position estimation difficult. Self-position estimation is also performed using an inertial measurement unit (IMU) such as a gyro sensor or an acceleration sensor, but estimating the position with high accuracy is likewise difficult.
 しかしながら、本技術によれば、投影装置53で投影されたパターンを用いて自己位置を推定するため、GPS通信を用いた場合と異なり、屋内であっても、精度良く位置を推定することが可能となる。また、駐車支援車両側装置81で後述するような処理、例えば、マーカーパターンで推定されている位置を補正するといった処理を行うことが可能となるため、高精度で位置を推定することが可能となる。 According to the present technology, however, the self-position is estimated using the patterns projected by the projection devices 53, so that, unlike the case of GPS communication, the position can be estimated accurately even indoors. In addition, since the parking assistance vehicle-side device 81 can perform processing described later, for example correcting the estimated position using a marker pattern, the position can be estimated with high accuracy.
 また、詳細は後述するが、駐車支援車両側装置81は、投影されたパターンを車両20の撮像装置が撮像し、撮像されたパターンから特徴点を検出し、マッチングなどの解析を行うことで、自己位置を推定する。仮に、投影装置53から投影されたパターンではなく、駐車場10内の構造物を撮像し、その構造物の画像から特徴点を検出し、マッチングなどの解析を行うことで、自己位置を推定するようにすることもできる。 Although details will be described later, in the parking assistance vehicle-side device 81, the imaging device of the vehicle 20 captures the projected pattern, and feature points are detected from the captured pattern and analyzed by matching and the like to estimate the self-position. In principle, the self-position could instead be estimated by imaging structures in the parking lot 10, rather than patterns projected from the projection devices 53, detecting feature points from images of those structures, and performing analysis such as matching.
 駐車場10には、床や柱などが構造物としてあるが、テクスチャに乏しい構造物であり、そのような構造物の画像を取得し、解析して、自己位置を推定するのは、困難であり、精度良い推定や、安定性を保つのは難しい。 However, although the parking lot 10 contains structures such as floors and pillars, these structures are poor in texture; estimating the self-position by acquiring and analyzing images of such structures is difficult, and accurate, stable estimation is hard to achieve.
 しかしながら、本技術によれば、投影装置53でパターンを投影することで、テクスチャに乏しい構造物であっても、テクスチャを有する構造物にすることができるため、自己位置の推定の精度を向上させ、安定性を保った推定を行えるようにすることが可能となる。 According to the present technology, by contrast, projecting a pattern with the projection devices 53 gives texture even to structures that lack it, improving the accuracy of self-position estimation and enabling stable estimation.
 <駐車支援システムの動作>
 図3に示した駐車支援システムの動作について図6のフローチャートを参照して説明する。駐車支援装置51と駐車支援車両側装置81の詳細な動作については、後述するとし、ここでは、システムにおける処理の概略を説明する。
<Operation of parking support system>
The operation of the parking support system shown in FIG. 3 will be described with reference to the flowchart of FIG. Detailed operations of the parking assist device 51 and the parking assist vehicle side device 81 will be described later, and here, an outline of processing in the system will be described.
 ステップS11において、駐車支援装置51は、入場してきた車両20の車両情報を取得する。ステップS12において、駐車支援装置51は、経路を生成する。経路は、ステップS11において取得された車両20の車両情報も利用して生成される。例えば、車両の車幅などを考慮し、駐車可能な駐車スペースを検索し、検索された駐車スペースまで、車両の車幅で通れる経路を検索することで、経路が生成される。 In step S11, the parking assistance device 51 acquires the vehicle information of the vehicle 20 that has entered. In step S12, the parking assistance device 51 generates a route. The route is generated also using the vehicle information of the vehicle 20 acquired in step S11. For example, a parking space in which the vehicle can park is searched for in consideration of its vehicle width and the like, and a route to that parking space wide enough for the vehicle to pass through is searched for, thereby generating the route.
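The route search in step S12 could, for example, be realized as a breadth-first search over an occupancy grid derived from the global map, where cells too narrow for the vehicle's width are marked as blocked. This is an illustrative sketch, not the embodiment's algorithm; the grid, coordinates, and the omission of constraints such as the minimum turning radius are simplifications.

```python
from collections import deque

# Route-generation sketch: BFS over an assumed occupancy grid.

def plan_route(grid, start, goal):
    """grid[r][c] == 0 means the cell is wide enough for this vehicle."""
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:  # walk back through predecessors to build the path
            path, cur = [], cell
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and nxt not in came_from):
                came_from[nxt] = cell
                queue.append(nxt)
    return None  # no drivable route to the space

grid = [[0, 0, 0],
        [1, 1, 0],   # 1 = blocked (parked vehicle, pillar, or too narrow)
        [0, 0, 0]]
print(plan_route(grid, (0, 0), (2, 0)))
# → [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

Returning `None` when no route exists corresponds to the case where no reachable parking space can be offered to the entering vehicle.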
 ステップS13において、投影装置53における所定のパターンの投影の制御が開始される。生成された経路上にある投影装置53に対して、投影の開始が指示される。 In step S13, control of projection of a predetermined pattern in the projection device 53 is started. The projection device 53 on the generated path is instructed to start projection.
 ステップS14において、駐車支援車両側装置81に生成された経路が提供される。駐車支援車両側装置81に提供される情報は、生成された経路の情報(以下、経路情報と記述する)や、グローバルマップなどである。 In step S14, the generated route is provided to the parking assist vehicle-side device 81. Information provided to the parking assist vehicle-side device 81 is information on a generated route (hereinafter referred to as route information), a global map, and the like.
 ステップS21において、駐車支援車両側装置81は、駐車支援装置51から提供された経路情報などを取得する。そして、駐車支援車両側装置81は、ステップS22において、提供された経路情報に基づき走行できるように、車両20の各部を制御したり、自己位置の推定を開始したりする。 In step S21, the parking assistance vehicle-side device 81 acquires the route information and other data provided from the parking assistance device 51. Then, in step S22, the parking assistance vehicle-side device 81 controls each part of the vehicle 20 so that it can travel based on the provided route information, and starts estimating the self-position.
 ステップS23において、駐車支援車両側装置81は、ローカルマップを作成する。ローカルマップは、駐車支援車両側装置81(車両20)の周りの環境に関する地図である。例えば、駐車されている車両が、どのような位置に、どのような入庫角度で停止しているかといったような、駐車支援車両側装置81(車両20)が走行している環境に関する情報である。ローカルマップは、自己位置の推定時や経路の補正時などに利用されるとともに、作成される地図である。 In step S23, the parking assistance vehicle-side device 81 creates a local map. The local map is a map of the environment around the parking assistance vehicle-side device 81 (vehicle 20); that is, it is information about the environment in which the parking assistance vehicle-side device 81 (vehicle 20) is traveling, such as where and at what angle parked vehicles are stopped. The local map is a map that is both created and used during processing such as self-position estimation and route correction.
 ステップS24において、駐車支援車両側装置81は、必要に応じて、駐車支援装置51に作成したローカルマップを提供する。 In step S24, the parking assistance vehicle side device 81 provides the created local map to the parking assistance device 51 as necessary.
 ステップS15において、駐車支援装置51は、駐車支援車両側装置81から提供されたローカルマップを受信する。そして、ステップS16において、駐車支援装置51は、必要に応じ、自己が管理しているグローバルマップを更新する。 In step S15, the parking assistance device 51 receives the local map provided from the parking assistance vehicle-side device 81. Then, in step S16, the parking assistance device 51 updates the global map it manages, as necessary.
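The update in step S16 can be sketched as follows, under an assumed data model in which both maps associate a feature-point ID with a 3-D position; entries whose locally observed position deviates beyond a tolerance are overwritten. The IDs, positions, and tolerance are illustrative.

```python
# Sketch of the step-S16 update under an assumed feature-point data model.

TOLERANCE = 0.05  # metres; assumed threshold for treating a point as changed

def update_global_map(global_map, local_map, tol=TOLERANCE):
    updated = []
    for fid, pos in local_map.items():
        old = global_map.get(fid)
        if old is None or any(abs(a - b) > tol for a, b in zip(old, pos)):
            global_map[fid] = pos
            updated.append(fid)
    return updated  # IDs that were created or changed

gmap = {"fp1": (1.0, 2.0, 0.0), "fp2": (4.0, 5.0, 0.0)}
lmap = {"fp2": (4.5, 5.0, 0.0), "fp3": (7.0, 1.0, 0.0)}
print(update_global_map(gmap, lmap))  # → ['fp2', 'fp3']
print(gmap["fp2"])                    # → (4.5, 5.0, 0.0)
```

Updating only the changed entries keeps the global map current without each vehicle having to resend its entire observation.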
 <駐車支援装置の構成>
 駐車支援装置51の構成について説明する。図7は、駐車支援装置51の内部構成を示す図である。駐車支援装置51は、入場車両検出部101、車両情報取得部102、投影制御部103、パターン保持部104、経路作成部105、グローバルマップ保持部106、グローバルマップ更新部107、および通信部108を含む構成とされている。
<Configuration of parking assistance device>
The configuration of the parking assistance device 51 will be described. FIG. 7 is a diagram showing the internal configuration of the parking assistance device 51. The parking assistance device 51 includes an entrance vehicle detection unit 101, a vehicle information acquisition unit 102, a projection control unit 103, a pattern holding unit 104, a route creation unit 105, a global map holding unit 106, a global map update unit 107, and a communication unit 108.
 入場車両検出部101は、駐車場10に入場してきた車両20を検出する。ここでは、撮像装置52で撮像された画像が入場車両検出部101に供給され、その供給された画像を基に、駐車場10に入場してきた車両20を検出するとして説明を続ける。 The entrance vehicle detection unit 101 detects the vehicle 20 that has entered the parking lot 10. Here, the description is continued on the assumption that the image captured by the imaging device 52 is supplied to the entrance vehicle detection unit 101 and the vehicle 20 entering the parking lot 10 is detected based on the supplied image.
 車両情報取得部102は、駐車場10に入場してきた車両20の情報、例えば、車幅、車高、最小回転半径等の情報(以下、車両情報と記述する)を取得する。 The vehicle information acquisition unit 102 acquires information on the vehicle 20 that has entered the parking lot 10, such as information on the vehicle width, vehicle height, minimum turning radius, and the like (hereinafter referred to as vehicle information).
 投影制御部103は、投影装置53を制御する。投影制御部103は、入場車両検出部101からの制御信号に基づき、投影装置53による投影の開始を制御し、経路作成部105からの経路など情報に基づく制御信号に基づき、投影装置53による投影を制御する。 The projection control unit 103 controls the projection devices 53. It controls the start of projection by the projection devices 53 based on a control signal from the entrance vehicle detection unit 101, and controls projection by the projection devices 53 based on a control signal derived from information such as the route from the route creation unit 105.
The pattern holding unit 104 holds, for example, the ID of each projection device 53 in association with the pattern (image) to be projected. The projection control unit 103 reads from the pattern holding unit 104 the pattern associated with the ID of a projection device 53 located on the route created by the route creation unit 105, and supplies that pattern to the projection device 53 on the route.
The route creation unit 105 uses the vehicle information acquired by the vehicle information acquisition unit 102 to determine a parking space suitable for the entering vehicle 20 and a route from the entrance gate 11 to that parking space, with reference to the global map held in the global map holding unit 106.
The global map holding unit 106 holds the global map. The global map update unit 107 updates the global map held in the global map holding unit 106 as necessary, with reference to a local map supplied from the parking assist vehicle-side device 81 via the communication unit 108.
The communication unit 108 communicates with the parking assist vehicle-side device 81. It performs the communication for providing the route information created by the route creation unit 105, the global map held in the global map holding unit 106, and the like to the parking assist vehicle-side device 81. The communication unit 108 also performs the communication for supplying the local map and other data received from the parking assist vehicle-side device 81 to the global map update unit 107 and the route creation unit 105.
Here, the global map held in the global map holding unit 106 will be described in more detail.
The global map is a map indicating the positions, in three-dimensional space, of stationary objects within a predetermined wide area. For example, the global map includes information indicating the positions and feature amounts, in a three-dimensional spatial coordinate system, of feature points of stationary objects in the predetermined area. The spatial coordinate system is expressed, for example, by latitude, longitude, and height above the ground.
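As one way to picture the feature-point map described above, the following is a minimal Python sketch; the class and field names are illustrative assumptions, not identifiers from this patent, and a plain tuple of floats stands in for a real feature descriptor.

```python
from dataclasses import dataclass, field

@dataclass
class FeaturePoint:
    # Position in a 3-D spatial coordinate system:
    # latitude, longitude, and height above the ground.
    lat: float
    lon: float
    height: float
    # Feature amount (descriptor) of the point; a plain
    # tuple of floats stands in for a real descriptor here.
    descriptor: tuple

@dataclass
class GlobalMap:
    # Feature points of stationary objects in the mapped area.
    points: list = field(default_factory=list)

    def add_point(self, p: FeaturePoint) -> None:
        self.points.append(p)

# Example: register one feature point of a stationary object.
gmap = GlobalMap()
gmap.add_point(FeaturePoint(lat=35.68, lon=139.76, height=1.2,
                            descriptor=(0.1, 0.9, 0.4)))
print(len(gmap.points))  # 1
```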
The global map is derived from a base global map (hereinafter referred to as the base global map), which is updated as appropriate according to the environment. Here, the global map is a map of the parking lot 10, and the base global map is a planar map such as the one shown in FIG. 1; that is, the base global map is a map of the parking lot 10 when it is empty.
The base global map can be, for example, a map created from the design drawings used when the parking lot 10 was designed.
The base global map is updated by the global map update unit 107. An update is performed whenever a change occurs in the environment (situation) within the parking lot 10, for example, to reflect a newly parked vehicle or a vehicle that has left the parking lot 10.
Through such updates, the base global map is updated to, for example, the map shown in FIG. 2. The map in FIG. 2 represents a state in which a plurality of vehicles are parked in the parking lot 10. Referring also to FIG. 4, the map in FIG. 2 shows vehicles parked in parking space 1, parking space 3, parking space 13, parking space 25, parking space 30, and parking space 33.
The parking assist vehicle-side device 81 creates a local map while the vehicle travels along the designated route in the parking lot 10. The local map is a map indicating the positions, in three-dimensional space, of stationary objects around the vehicle 20, and is generated by the parking assist vehicle-side device 81. For example, like the global map, the local map includes information indicating the positions and feature amounts, in a three-dimensional spatial coordinate system, of the feature points of stationary objects around the moving body.
The local map reflects the environment around the traveling vehicle. For example, while the vehicle is traveling near parking space 1 and parking space 3, a map is created indicating that vehicles are parked in parking space 1 and parking space 3.
Likewise, while the vehicle is traveling near parking space 13, a map is created indicating that the vehicle parked in parking space 13 protrudes beyond its parking space. With such a local map, when the parking assist vehicle-side device 81 passes to the left of the vehicle parked in parking space 13, it creates a route that keeps toward the parking space 6 side so as not to collide with the protruding vehicle, and controls each part of the vehicle 20 so that the vehicle travels along the created route.
In this way, the local map created by the parking assist vehicle-side device 81 reflects the environment (situation) in the parking lot 10. At a predetermined timing, this local map is supplied from the parking assist vehicle-side device 81 to the global map update unit 107 of the parking assist device 51. The global map update unit 107 refers to the supplied local map and updates the portions of the global map where changes have occurred.
That is, when there is a newly parked vehicle, the global map update unit 107 performs an update that reflects, in the corresponding parking space, the state in which the vehicle is parked (its position within the parking space, the angle at which it is parked, and so on).
Also, when a parked vehicle leaves the parking lot 10, in other words, when a parking space becomes vacant, the global map update unit 107 performs an update that returns the vacated parking space to an empty state.
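The update behavior described above can be sketched as follows. This is an illustrative Python sketch, not the patented implementation: the per-space occupancy representation and all field names are assumptions.

```python
# Illustrative sketch of the global-map update described above.
# The global map records per-space occupancy; a local map observed
# by a vehicle overrides the entries for the spaces it covers.

def update_global_map(global_map: dict, local_map: dict) -> dict:
    """Merge a vehicle's local observations into the global map.

    Both maps are dicts keyed by parking-space number; values are
    None for an empty space, or a dict describing how the parked
    vehicle occupies the space (e.g. angle, protrusion).
    """
    for space, observed in local_map.items():
        if global_map.get(space) != observed:
            global_map[space] = observed  # newly parked, left, or moved
    return global_map

# Example: space 13 is observed with a protruding vehicle,
# and space 3 is observed to have become vacant.
gmap = {3: {"angle_deg": 0.0, "protrudes": False}, 13: None}
lmap = {3: None, 13: {"angle_deg": 2.0, "protrudes": True}}
update_global_map(gmap, lmap)
print(gmap[3], gmap[13]["protrudes"])  # None True
```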
As described above, the local map is created by the parking assist vehicle-side device 81 of a vehicle 20 actually traveling in the parking lot 10. It can therefore reflect information such as that the vehicle parked in parking space 13 protrudes beyond parking space 13, or that the vehicle parked in parking space 3 is within parking space 3 but parked at an angle.
By updating the global map based on such local maps, the global map can be kept as a near real-time map that follows the environment in the parking lot 10. Accordingly, by referring to such a global map, it becomes possible to appropriately set the parking space in which an entering vehicle 20 should park and the route leading to it.
<Operation of the parking assist device>
Next, the operation of the parking assist device 51 will be described with reference to the flowchart shown in FIG. 8.
In step S101, the entrance vehicle detection unit 101 determines whether a vehicle 20 entering the parking lot 10 has been detected. For example, the entrance vehicle detection unit 101 detects that a vehicle 20 has entered when the vehicle 20 appears in an image captured by the imaging device 52 (FIG. 3) installed near the entrance gate 11 of the parking lot 10.
Alternatively, a sensor (not shown) installed near the entrance gate 11 of the parking lot 10 may detect a vehicle 20 entering through the entrance gate 11, and the entrance vehicle detection unit 101 may detect that the vehicle 20 has entered the parking lot 10 by acquiring information from such a sensor.
Alternatively, the imaging device 52 (FIG. 3) installed near the entrance gate 11 of the parking lot 10 may be configured to capture an image of a vehicle 20 when it enters, and the entrance vehicle detection unit 101 may determine that the vehicle 20 has entered the parking lot 10 when an image is provided by the imaging device 52.
The process of step S101 is repeated, and the standby state is maintained, until it is determined in step S101 that a vehicle 20 entering the parking lot 10 has been detected. When it is determined in step S101 that an entering vehicle 20 has been detected, the process proceeds to step S102.
In step S102, the vehicle information acquisition unit 102 acquires the vehicle information. The vehicle information acquisition unit 102 obtains the image captured by the imaging device 52 and acquires the vehicle information of the vehicle 20 appearing in the image. The vehicle information includes the vehicle width, vehicle height, minimum turning radius, and the like, and is used to search for a parking space to which the vehicle should be guided and to determine the route to that parking space.
The vehicle information is acquired, for example, by referring to the database 71, as described with reference to FIG. 3.
Alternatively, the information may be provided from the vehicle 20 side. The provided information may be the vehicle information itself, or information from which the vehicle (vehicle model) can be identified. In the latter case, the system can be configured so that the database 71 is consulted using the provided information to acquire the vehicle information.
The database 71 may also be provided as part of the parking assist device 51; in that case, the vehicle information acquisition unit 102 acquires the vehicle information by referring to the built-in database 71.
Vehicle information may also be provided from the vehicle 20 side via the communication unit 108. For example, when the driver operates a predetermined button, or when the parking assist device 51 issues a request for vehicle information and the vehicle responds to it, the vehicle information is transmitted from the vehicle 20 side, and the transmitted vehicle information is acquired by the vehicle information acquisition unit 102.
Vehicle information such as the vehicle width and vehicle length may also be acquired by analyzing the image captured by the imaging device 52.
In step S103, the route creation unit 105 creates a route. The route creation unit 105 determines the parking space in which the entering vehicle 20 is to park, using information such as the vehicle width, vehicle length, vehicle height, and minimum turning radius of the vehicle 20. For example, for a wide vehicle 20, the global map is searched for a parking space that can secure enough width for such a vehicle to park.
Because the state of the parked vehicles is written in the global map, it can be determined, for example, that parking space 2 (FIG. 4) is vacant but that a wide vehicle 20 cannot park there because of the vehicles in parking space 1 and parking space 3.
Also, since the vehicle parked in parking space 30 (FIG. 4) protrudes beyond its parking space, a vehicle with a large minimum turning radius or a long vehicle may have difficulty parking in the spaces near parking space 30 because of that vehicle; it can therefore be determined that it is better to search for a parking space at a position away from parking space 30.
Similarly, for a vehicle with a large minimum turning radius, the search skips parking spaces that are vacant but in front of which the space needed for maneuvers such as turning back cannot be secured, and instead looks for parking spaces in locations where sufficient space for such maneuvers is available.
Also, when a tall vehicle 20 enters, the search is performed among the rooftop or outdoor parking spaces.
Such a search can be performed by referring to the global map, because the global map records in which parking space each vehicle is parked and in what state. The global map is updated by the global map update unit 107 so that it is kept as up to date as possible.
The route creation unit 105 refers to the global map and, taking the vehicle information into account, determines the parking space in which the entering vehicle 20 is to park.
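The constraint-based search described above can be sketched as follows. This is an illustrative Python sketch, not the patented algorithm: the per-space fields (usable width, height limit, maneuvering space) and the numeric values are assumptions for illustration.

```python
# Illustrative sketch of the parking-space search described above.
# Each candidate space records its usable width (reduced by
# protruding neighbours), its height limit, and the maneuvering
# space available in front of it.

def find_parking_space(spaces, vehicle):
    """Return the ID of the first vacant space that satisfies the
    vehicle's structural constraints, or None."""
    for space_id, s in spaces.items():
        if s["occupied"]:
            continue
        if s["usable_width"] < vehicle["width"]:
            continue  # neighbours leave too little width
        if s["height_limit"] < vehicle["height"]:
            continue  # tall vehicles need rooftop/outdoor spaces
        if s["approach_space"] < vehicle["min_turning_radius"]:
            continue  # not enough room in front to turn back
        return space_id
    return None

spaces = {
    2:  {"occupied": False, "usable_width": 1.7,
         "height_limit": 2.1, "approach_space": 6.0},
    37: {"occupied": False, "usable_width": 2.3,
         "height_limit": 2.1, "approach_space": 6.0},
}
vehicle = {"width": 1.9, "height": 1.6, "min_turning_radius": 5.5}
print(find_parking_space(spaces, vehicle))  # 37
```

Space 2 is vacant but its usable width is reduced by the neighbouring vehicles, so the wide vehicle is sent to space 37, mirroring the reasoning in the text.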
Note that the above method of determining a parking space is an example and is not limiting. For example, in addition to the structural conditions of the vehicle 20 described above, conditions such as the following may also be taken into account when determining a parking space.
For example, when a predetermined mark, such as a mark indicating an elderly driver, a mark indicating a driver with a disability, or a mark indicating a beginner driver, is detected by analyzing the image captured by the imaging device 52, a parking space suited to that driver may be searched for. For example, some parking lots 10 provide parking spaces in which people with disabilities can park preferentially; in such a parking lot 10, when a mark indicating a driver with a disability is detected, such a priority parking space may be determined as the parking space for the entering vehicle 20.
In this way, a parking space suited to the entering vehicle 20 is determined. Once the parking space is determined, a route from the entrance gate 11 to the parking space is searched for.
The route is also set with reference to the global map. For example, near parking space 13 and parking space 30 (FIG. 4), there are vehicles parked protruding beyond their parking spaces, so the drivable road width is narrowed. If, for example, it is determined that a wide vehicle would have difficulty passing through such places, and the entering vehicle 20 is a wide vehicle, a route that avoids parking space 13 and parking space 30 and still reaches the determined parking space is searched for and set.
When the vehicle 20 arrives at the set parking space, it then starts the movement for parking in that space, and the parking start position is also set. The parking start position is, for example, the position at which parking begins, defined by a distance and an angle relative to the target parking space; it is a position from which, if the approach to the parking space is started there, the vehicle can be parked with the center of the vehicle 20 aligned with the center of the parking space.
In this way, the route creation unit 105 searches for and sets a parking space suited to the entering vehicle 20, the route to that parking space, and the parking start position. Here, the set parking space, route, and parking start position are collectively referred to as route information. The route information may also include information other than these. Conversely, the route information may contain only the route; it is sufficient that it includes at least one of the parking space, the route, and the parking start position.
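As one way to picture the route information described above, the following is a minimal Python sketch; the field names, the waypoint representation, and the (x, y, heading) form of the parking start position are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class RouteInfo:
    # At least one of these three fields is expected to be set.
    parking_space: Optional[int] = None                 # e.g. space 37
    route: Optional[List[Tuple[float, float]]] = None   # waypoints
    parking_start: Optional[Tuple[float, float, float]] = None
    # parking_start: (x, y, heading) relative to the target space

    def is_valid(self) -> bool:
        """True if at least one of the three fields is present."""
        return any(v is not None for v in
                   (self.parking_space, self.route, self.parking_start))

info = RouteInfo(parking_space=37,
                 route=[(0.0, 0.0), (12.5, 0.0), (12.5, 8.0)])
print(info.is_valid())  # True
```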
In step S104, information such as the route information and the markers is provided to the vehicle 20 via the communication unit 108. The route information is the information generated by the route creation unit 105. The marker information is information on the markers of the marker patterns, in which the information of each marker itself is associated with its position on the global map. The global map held in the global map holding unit 106 is also provided to the vehicle 20.
The vehicle 20 captures an image of a marker pattern, determines whether it matches one of the supplied markers, and, by confirming the match, identifies the projection device 53 projecting that marker pattern; in other words, it can identify its position on the global map. By identifying its position in the global map and comparing that position with its estimated self-position, the amount of self-position error can be determined, and corrections such as adjusting the positions of surrounding objects and the vehicle's own self-position can be performed.
The marker information and the global map are supplied to the vehicle 20 so that such processing can be performed on the vehicle 20 side.
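The self-position correction step described above can be sketched as follows; a minimal Python sketch that simplifies the problem to a 2-D translation offset, which is an assumption made here for illustration.

```python
# Illustrative sketch of the self-position correction described
# above: a recognized marker gives a known position on the global
# map, and the offset from where the marker was expected to be is
# applied as a correction to the estimated self-position.

def correct_self_position(estimated, marker_observed, marker_global):
    """estimated: estimated self-position (x, y).
    marker_observed: where the marker appears to be according to
    the current estimate. marker_global: the marker's true position
    on the global map. Returns the corrected self-position."""
    dx = marker_global[0] - marker_observed[0]
    dy = marker_global[1] - marker_observed[1]
    return (estimated[0] + dx, estimated[1] + dy)

corrected = correct_self_position(estimated=(10.0, 4.0),
                                  marker_observed=(12.3, 6.1),
                                  marker_global=(12.0, 6.0))
print(corrected)  # approximately (9.7, 3.9)
```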
When a route such as the one shown in FIG. 2 is set, the marker information provided can be the information of the marker pattern projected by the projection device 53 of ID3 and the information of the marker pattern projected by the projection device 53 of ID11.
In addition, the pattern projected from the projection device 53 located near the determined parking space (parking space 37 in FIGS. 2 and 4) may also be a marker pattern, so that the parking assist vehicle-side device 81 can recognize it as the destination parking space.
The route information and other data are provided from the parking assist device 51 to the parking assist vehicle-side device 81 by the communication unit 108. For example, an entrance gate 11 is provided, together with a mechanism that makes the vehicle 20 stop temporarily at the entrance gate 11. While the vehicle 20 is stopped at the entrance gate 11, the route information and other data are provided from the communication unit 108 to the vehicle 20.
In addition to the entrance gate 11, an information provision location may be set up, and the route information and other data may be provided there. The information provision location can be a predetermined place in the parking lot 10, for example, the area between the entrance gate 11 and the position where the projection device 53 of ID1 is installed.
The information provision location may also be a different place for each entering vehicle 20. For example, it is assumed that a plurality of vehicles 20 may enter the parking lot 10 in succession. Consider the case where a plurality of vehicles 20, for example three vehicles 20 (vehicle 20-1, vehicle 20-2, and vehicle 20-3), have entered.
In such a case, the information provision locations for vehicle 20-1, vehicle 20-2, and vehicle 20-3 may each be different. For example, for vehicle 20-1, the vicinity of the projection device 53 of ID3 is set as the information provision location, and when vehicle 20-1 travels (or stops temporarily) near the projection device 53 of ID3, the route information and other data are provided to vehicle 20-1.
Similarly, for vehicle 20-2, the vicinity of the projection device 53 of ID2 is set as the information provision location, and when vehicle 20-2 travels (or stops temporarily) near the projection device 53 of ID2, the route information and other data are provided to vehicle 20-2. Further, for vehicle 20-3, the vicinity of the projection device 53 of ID1 is set as the information provision location, and when vehicle 20-3 travels (or stops temporarily) near the projection device 53 of ID1, the route information and other data are provided to vehicle 20-3.
If information were provided to vehicle 20-1, vehicle 20-2, and vehicle 20-3 at the same information provision location, then depending on the timing, the same route information could be provided to vehicle 20-1, vehicle 20-2, and vehicle 20-3; that is, information could be provided erroneously.
As described above, by setting a different information provision location for each vehicle 20, even when a plurality of vehicles 20 enter in succession, the information intended for each vehicle 20 can be provided to that vehicle at its own information provision location. This makes it possible to reduce the possibility that the same route information is erroneously provided to a plurality of vehicles 20.
An ID identifying each entering vehicle 20 may also be assigned to each vehicle 20. Combined with the above mechanism of varying the information provision location per vehicle 20, a mechanism may be provided by which the vehicle 20 side can determine from the ID whether the information is addressed to itself. For example, at the entrance gate 11, or at a place corresponding to the entrance gate 11, the parking assist device 51 assigns an ID to the entering vehicle 20 and provides the assigned ID to the vehicle 20 (the parking assist vehicle-side device 81).
When providing the route information and other data to the vehicle 20, the parking assist device 51 controls the communication unit 108 so that the assigned ID is associated with the information to be provided and the information is provided at the information provision location. When the parking assist vehicle-side device 81 of the vehicle 20 receives information from the parking assist device 51, it determines whether the ID it manages matches the ID associated with the received information; if they match, it keeps the information as addressed to itself, and if they do not match, it regards the information as not addressed to itself and discards it.
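The ID check described above can be sketched as follows; a minimal Python sketch in which the message shape (an ID paired with a payload) is an assumption for illustration.

```python
# Illustrative sketch of the ID-based filtering described above.
# A message pairs an assigned vehicle ID with its payload (route
# information etc.); the vehicle keeps only messages whose ID
# matches its own and discards the rest.

def receive(own_id: str, message: dict):
    """Return the payload if the message is addressed to this
    vehicle, otherwise None (the message is discarded)."""
    if message.get("vehicle_id") == own_id:
        return message["payload"]
    return None

msg_for_me    = {"vehicle_id": "20-1", "payload": {"space": 37}}
msg_for_other = {"vehicle_id": "20-2", "payload": {"space": 12}}

print(receive("20-1", msg_for_me))     # {'space': 37}
print(receive("20-1", msg_for_other))  # None
```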
In this way, by using both the information provision location and the ID, information can be provided to the vehicle 20 more reliably.
It is also possible to provide information using only the ID: a single information provision location may be used, information may be provided there, and the vehicle 20 side may determine from the ID whether the information is addressed to itself.
In step S105, the projection control unit 103 starts controlling the projection of the projection devices 53.
The projection control unit 103 controls the projection devices 53 installed in the parking lot 10. Here, the projection control unit 103 controls the projection of each of the projection devices 53 of ID1 to ID21 (FIG. 4). When the projection control unit 103 receives a projection start instruction as a control signal from the entrance vehicle detection unit 101, it starts controlling the projection of the projection devices 53.
That is, in this case, projection by the projection devices 53 is not performed at all times (i.e., not continuously throughout the hours the parking lot 10 is in use); instead, projection starts when a vehicle 20 enters the parking lot 10, and ends when a predetermined condition is satisfied, such as the vehicle 20 having passed, the vehicle having stopped in its parking space, or a predetermined time having elapsed since the projection started.
By performing projection only as needed in this way, rather than at all times, the power consumed by the projection devices 53 can be reduced, and thus the power consumed by the parking assist system can be reduced. In situations where reducing power consumption is not required, the projection devices 53 can also be configured to project at all times, and the present technology is applicable to such a configuration as well.
The projection devices 53 whose projection is controlled by the projection control unit 103 are those located on the route. For example, when the route shown in FIG. 2 is set, the projection of the projection devices 53 of ID1, ID2, ID3, ID7, ID11, and ID12 is controlled.
Specifically, the projection devices 53 of ID1, ID2, ID3, ID7, ID11, and ID12 start projecting predetermined patterns, while the other projection devices 53 are kept in a standby state in which no projection is performed. Among the projection devices 53 of ID1, ID2, ID3, ID7, ID11, and ID12, those of ID1, ID3, ID11, and ID12 are controlled to project marker patterns, and those of ID2 and ID7 are controlled to project random patterns.
 投影制御部103は、経路作成部105から供給された経路情報から、投影を開始する投影装置53を特定し、特定した投影装置53のIDと関連付けられているパターン(ランダムパターンまたはマーカーパターン)を、パターン保持部104から読み出す。パターン保持部104は、投影装置53のID、パターン、およびマーカー情報を関連付けて保持している。 The projection control unit 103 identifies the projection device 53 that starts projection from the route information supplied from the route creation unit 105, and selects a pattern (random pattern or marker pattern) associated with the ID of the identified projection device 53. Read from the pattern holding unit 104. The pattern holding unit 104 holds the ID, pattern, and marker information of the projection device 53 in association with each other.
 パターン保持部104で保持されているパターンを、投影制御部103は、読み出し、該当するIDで識別される投影装置53に対して供給する。投影装置53は、投影制御部103の制御の基、所定のパターンの投影を行う。 The projection control unit 103 reads the pattern held by the pattern holding unit 104 and supplies the pattern to the projection device 53 identified by the corresponding ID. The projection device 53 projects a predetermined pattern under the control of the projection control unit 103.
 なお投影制御の開始は、経路情報が生成された後ではなく、入場してきた車両20を検知した時点など、他のタイミングで開始されても良い。例えば、入場車両検出部101が、入場してきた車両20を検出すると、入場車両検出部101は、投影制御部103に投影開始の指示を出す。投影制御部103は、入場ゲート11付近に設置されている投影装置53の投影を開始させる。 It should be noted that the projection control may be started at other timings, such as when the vehicle 20 that has entered the vehicle is detected, not after the route information is generated. For example, when the entrance vehicle detection unit 101 detects an entering vehicle 20, the entrance vehicle detection unit 101 instructs the projection control unit 103 to start projection. The projection control unit 103 starts projection of the projection device 53 installed near the entrance gate 11.
 その後、経路作成部105により経路が生成されると、その経路上に位置する投影装置53の投影が開始されるようにしても良い。 After that, when a route is generated by the route creation unit 105, projection of the projection device 53 located on the route may be started.
 In step S106, the global map update unit 107 determines whether a local map has been acquired. The local map is supplied from the parking assistance vehicle-side device 81 at a predetermined timing. The predetermined timing may be any timing; examples include when the parking assistance vehicle-side device 81 determines that the global map it holds and the local map differ, when the vehicle travels under a projection device 53 that is projecting a marker pattern, when the vehicle arrives at the parking space, and when parking is completed.
 The processing of step S106 is repeated until it is determined that a local map has been acquired. When it is determined in step S106 that a local map has been acquired, the processing proceeds to step S107.
 In step S107, the global map update unit 107 updates the global map held in the global map holding unit 106 using the acquired local map. As described above, the local map is a map that the parking assistance vehicle-side device 81 creates to reflect the environment around the vehicle 20 as the vehicle 20 actually travels through the parking lot 10.
 The global map update unit 107 refers to the local map, determines whether there has been any change in the area of the global map corresponding to the acquired local map, and, when it determines that there has been a change, updates the global map to reflect that change.
 For example, suppose that the global map shows a certain parking space as having no parked vehicle, whereas the local map shows that space as containing a parked vehicle. It can then be determined that a change has occurred, namely that a vehicle has parked in a previously empty parking space, and the global map is updated to reflect that change.
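As a minimal sketch of this update step, the global map can be modeled as a mapping from parking-space IDs to occupancy states, with local observations overwriting any entries that differ. The data layout and names here are illustrative assumptions, not the embodiment's actual map format:

```python
def update_global_map(global_map, local_map):
    """Reflect changes observed in a local map into the global map.

    Both maps are dicts from parking-space ID to occupancy state
    ('free' or 'occupied'). Returns the IDs whose state changed,
    mirroring the change detection performed by the global map
    update unit 107. The dict representation is a simplification.
    """
    changed = set()
    for space_id, observed_state in local_map.items():
        if global_map.get(space_id) != observed_state:
            global_map[space_id] = observed_state  # apply the change
            changed.add(space_id)
    return changed
```

For instance, if the global map records a space as free while the local map observes it as occupied, the update marks that space as occupied and reports it as changed, so the updated map can then be provided to other vehicles.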
 In this way, the global map is updated in response to changes in the environment (conditions) within the parking lot 10.
 When the global map has been updated in this way, the updated global map may be provided, at the time of the update, to vehicles 20 traveling in the parking lot 10.
 By repeatedly performing the processing described with reference to the flowchart of FIG. 8 in the parking assistance device 51, assistance for parking the vehicle 20 is provided.
 <Configuration of parking assistance vehicle-side device>
 Next, the parking assistance vehicle-side device 81 will be described. First, its configuration will be described with reference to the block diagram of FIG. 9.
 The parking assistance vehicle-side device 81 is configured to include a self-position estimation processing unit 201, an information reception instruction unit 202, a communication unit 203, a global map update determination unit 204, an information holding unit 205, a local route generation unit 206, a vehicle control signal generation unit 207, and a notification generation unit 208.
 The self-position estimation processing unit 201 has the configuration described later with reference to FIG. 10, and performs self-position estimation and the creation of a local map.
 The information reception instruction unit 202 instructs the communication unit 203 to receive information supplied from the parking assistance device 51. The information received by the communication unit 203 includes the route information, global map, and marker-related information described above.
 For example, when reception of information supplied from the parking assistance device 51 is requested by an input from the user (driver), the information reception instruction unit 202 issues a reception instruction to the communication unit 203. In addition, as described above, when the entrance gate 11 or another information-providing location is designated, the information reception instruction unit 202 determines whether the vehicle has arrived at such a location and, when it determines that the vehicle has arrived, issues a reception instruction to the communication unit 203.
 The device can be configured so that the position of the vehicle 20 at the time the information reception instruction unit 202 issues its instruction is used as the starting position for self-position estimation, with subsequent self-position estimation performed from that position. Alternatively, the first marker that determines a specific position (a part of the global map) can be received in response to a start signal issued when the user performs an operation, and the position at which that first marker is detected can be used as the starting position for subsequent self-position estimation.
 The communication unit 203 communicates with the parking assistance device 51. For example, the communication unit 203 receives the global map, route information, and the like supplied from the parking assistance device 51, and supplies the local map to the parking assistance device 51.
 The global map update determination unit 204 determines whether to update the global map held in the information holding unit 205, and updates the global map as necessary. When the global map update determination unit 204 determines that there is a difference between the local map generated by the self-position estimation processing unit 201 and the corresponding portion of the global map held in the information holding unit 205, it determines that the global map is to be updated.
 When it determines that the global map is to be updated, the global map update determination unit 204 updates the portions of the global map held in the information holding unit 205 that differ from the local map so that they match the local map.
 When it has updated the global map, the global map update determination unit 204 transmits the local map to the parking assistance device 51 via the communication unit 203.
 Note that the global map update determination unit 204 may instead be configured so that, when it determines that the global map needs to be updated, it only transmits the local map to the parking assistance device 51 without updating the global map held in the information holding unit 205. That is, the global map update determination unit 204 can be configured to notify the parking assistance device 51 that an update is needed, and to supply a local map indicating the portions to be updated, so that the global map held on the parking assistance device 51 side can be kept in agreement with the environment (conditions) in the parking lot 10.
 The information holding unit 205 holds the global map, route information, and marker information received by the communication unit 203.
 The local route generation unit 206 modifies the route specified by the route information held in the information holding unit 205, with reference to the local map. For example, suppose that the route information specifies passing alongside parking space 13 (FIGS. 2 and 4) while traveling along the center of the road, and that a local map has been created showing that, as illustrated in FIG. 2, a vehicle parked in parking space 13 protrudes beyond that space into the driving area.
 In such a case, when traveling alongside parking space 13, the local route generation unit 206 changes the route so that it avoids the vehicle parked in parking space 13 rather than following the center of the road, for example, to a route closer to the parking space 6 side.
 When the vehicle arrives at the parking start position, the local route generation unit 206 also generates a route from the parking start position into the parking space. When parking in a parking space, the route that the vehicle 20 should follow from the parking start position until it is parked is highly likely to differ depending on the environment (conditions) around the parking space, in particular on vehicles parked on either side or in front. The local route generation unit 206 therefore generates a route that takes the environment around the parking space into account.
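The kind of local correction described above can be illustrated with a toy planar model in which route waypoints are shifted sideways wherever a parked vehicle protrudes into the lane. The waypoint format, the protrusion intervals, and the shift amount are all hypothetical, not the embodiment's actual planner:

```python
def adjust_local_route(waypoints, protrusion_ranges, lateral_shift):
    """Shift waypoints laterally where an obstacle intrudes on the lane.

    waypoints: list of (x, y) points along the planned route.
    protrusion_ranges: list of (x_start, x_end) intervals along the
        route where a parked vehicle protrudes into the driving area.
    lateral_shift: sideways offset applied inside those intervals,
        e.g. toward the opposite row of parking spaces.
    A toy 2D sketch of the local route generation unit's correction.
    """
    adjusted = []
    for x, y in waypoints:
        if any(lo <= x <= hi for lo, hi in protrusion_ranges):
            adjusted.append((x, y + lateral_shift))  # steer around obstacle
        else:
            adjusted.append((x, y))  # keep the original centerline point
    return adjusted
```

In the FIG. 2 example, the waypoint passing the protruding vehicle would be offset toward the opposite side of the road while the rest of the route stays on the centerline.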
 Based on the local route generated by the local route generation unit 206, the vehicle control signal generation unit 207 generates signals for controlling the units in the vehicle 20, that is, the control systems, such as the engine, brakes, and steering, that govern the speed and traveling direction of the vehicle 20. Each unit in the vehicle 20 is controlled based on the control signals generated by the vehicle control signal generation unit 207.
 Note that the vehicle 20 need not be controlled solely on the basis of the control signals generated by the vehicle control signal generation unit 207; the signals may instead be control signals for assisting the driver's own driving.
 The notification generation unit 208 performs control for notifying the user (driver) of the local route generated by the local route generation unit 206. For example, it displays the local route on the display of a car navigation system, or displays a message when the vehicle approaches a turn. Notification may also be given by voice in addition to the display. The notification generation unit 208 can also be configured to issue a warning when the vehicle deviates from the route, or to provide a notification for returning to the correct route.
 FIG. 10 is a diagram illustrating the internal configuration of the self-position estimation processing unit 201.
 The self-position estimation processing unit 201 can partially apply a technique known as SLAM (Simultaneous Localization and Mapping). The self-position estimation processing unit 201 is configured to include an estimation unit 301, a position information generation unit 302, an object detection unit 303, and a local map generation unit 304.
 The estimation unit 301 estimates the movement amount, position, orientation, and speed of the host vehicle based on the relative positions between the moving body and the feature points in the left and right images captured by the imaging devices 401L and 401R (FIG. 11). As shown in FIG. 11, the imaging devices 401L and 401R are installed at the front of the vehicle 20, with the imaging device 401L positioned to capture the left front of the vehicle 20 and the imaging device 401R positioned to capture the right front.
 The estimation unit 301 is configured to include image correction units 321L and 321R, a feature point detection unit 322, a parallax matching unit 323, a distance estimation unit 324, a feature amount calculation unit 325, a map information storage unit 326, a motion matching unit 327, a movement amount estimation unit 328, an object dictionary storage unit 329, an object recognition unit 330, a position/orientation information storage unit 331, a position/orientation estimation unit 332, and a speed estimation unit 333.
 The image correction unit 321L and the image correction unit 321R correct the left image supplied from the imaging device 401L and the right image supplied from the imaging device 401R, respectively, so that the two images face the same direction. The image correction unit 321L supplies the corrected left image to the feature point detection unit 322 and the motion matching unit 327. The image correction unit 321R supplies the corrected right image to the parallax matching unit 323.
 The feature point detection unit 322 detects feature points in the left image, and supplies two-dimensional position information indicating the position of each detected feature point in the two-dimensional image coordinate system to the parallax matching unit 323 and the feature amount calculation unit 325. The image coordinate system is represented by, for example, x and y coordinates within the image.
 The parallax matching unit 323 detects the feature points in the right image that correspond to the feature points detected in the left image. This yields the parallax of each feature point, that is, the difference between its position in the left image and its position in the right image. The parallax matching unit 323 supplies two-dimensional position information indicating the position of each feature point in the image coordinate system of the left image and of the right image to the distance estimation unit 324.
 The distance estimation unit 324 estimates the distance to each feature point based on the parallax of the feature point between the left and right images, and further calculates the position of each feature point in the three-dimensional spatial coordinate system. The distance estimation unit 324 supplies three-dimensional position information indicating the position of each feature point in the spatial coordinate system to the feature amount calculation unit 325.
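For a rectified stereo pair, the distance estimation in this step follows the standard pinhole relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity. A minimal sketch, with parameter names and values that are illustrative assumptions since the patent gives no concrete camera parameters:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Distance to a feature point from its stereo disparity.

    Standard rectified-stereo relation Z = f * B / d. Parameter
    names are hypothetical; the embodiment specifies no values.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px


def point_from_pixel(x_px, y_px, cx, cy, focal_px, depth_m):
    """Back-project an image point with known depth into the
    three-dimensional camera coordinate system (cx, cy: principal
    point), as done when computing each feature point's 3D position."""
    return ((x_px - cx) * depth_m / focal_px,
            (y_px - cy) * depth_m / focal_px,
            depth_m)
```

With a 700 px focal length and a 12 cm baseline, a 42 px disparity corresponds to a point about 2 m ahead, and its 3D position follows from back-projecting the pixel at that depth.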
 The feature amount calculation unit 325 calculates a feature amount for each feature point of the left image, and stores feature point information including the three-dimensional position information and feature amount of each feature point in the map information storage unit 326.
 In addition to the feature point information used for the local map, the map information storage unit 326 also stores the global map supplied from the parking assistance device 51.
 The motion matching unit 327 acquires from the map information storage unit 326 the three-dimensional position information of each feature point detected in the previous frame. Next, the motion matching unit 327 detects, in the left image of the current frame, the feature points corresponding to the feature points detected in the previous frame. The motion matching unit 327 then supplies to the movement amount estimation unit 328 the three-dimensional position information of each feature point in the previous frame and the two-dimensional position information indicating its position in the image coordinate system of the current frame.
 Note that, although the description here proceeds on the assumption that the previous frame is compared with the current frame, the current frame may instead be compared with a frame N frames (a plurality of frames) earlier. The following description likewise refers to the previous frame, but it may be a frame N frames earlier; the present technology is not limited to comparison with the immediately preceding frame.
 Based on the three-dimensional position information of each feature point in the previous frame and its two-dimensional position information in the current frame, the movement amount estimation unit 328 estimates the amount by which the position and orientation of the host vehicle (more precisely, the imaging device 401L) have moved between the frames. The movement amount estimation unit 328 supplies movement amount information indicating the estimated movement amount of the position and orientation of the host vehicle to the object detection unit 303, the position/orientation estimation unit 332, and the speed estimation unit 333.
 The object recognition unit 330 recognizes objects in the left image based on the object dictionary stored in the object dictionary storage unit 329. Based on the recognition result, the object recognition unit 330 sets initial values of the position and orientation of the host vehicle (more precisely, the imaging device 401L) in the spatial coordinate system (hereinafter referred to as the initial position and initial orientation), and stores initial position/orientation information indicating the set initial position and initial orientation in the position/orientation information storage unit 331.
 The position/orientation estimation unit 332 estimates the position and orientation of the host vehicle based on the initial position/orientation information stored in the position/orientation information storage unit 331, or on the position/orientation information of the previous frame, together with the estimation result for the movement amount of the host vehicle. The position/orientation estimation unit 332 also corrects the estimated position and orientation of the host vehicle as necessary, based on the global map stored in the map information storage unit 326.
 The position/orientation estimation unit 332 supplies position/orientation information indicating the estimated position and orientation of the host vehicle to the dangerous area determination unit 112, the danger prediction unit 113, the position information generation unit 302, and the speed estimation unit 333, and also stores it in the position/orientation information storage unit 331.
 The speed estimation unit 333 estimates the speed of the host vehicle by dividing the estimated movement amount of the host vehicle by the elapsed time, and supplies speed information indicating the estimated speed to the danger prediction unit 113 and the position information generation unit 302.
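The speed estimate in this step is simply the estimated displacement divided by the elapsed time; a one-line sketch (the frame interval used in practice is not specified by the embodiment):

```python
def estimate_speed(movement_m, elapsed_s):
    """Speed = estimated movement amount / elapsed time, as performed
    by the speed estimation unit 333. elapsed_s would typically be the
    inter-frame interval; no concrete value is given in the patent."""
    if elapsed_s <= 0:
        raise ValueError("elapsed time must be positive")
    return movement_m / elapsed_s
```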
 When notified by the dangerous area determination unit 112 that the host vehicle is within a dangerous area, the position information generation unit 302 generates position information including the position and speed of the host vehicle, and supplies the generated position information to the transmission unit 104.
 The object detection unit 303 detects stationary objects and moving bodies around the host vehicle based on the movement amount information and on the feature point information of the previous frame and the current frame stored in the map information storage unit 326, and notifies the local map generation unit 304 of the detection results.
 The local map generation unit 304 generates a local map based on the detection results for stationary objects and moving bodies around the host vehicle and on the feature point information of the current frame stored in the map information storage unit 326. The generated local map is transmitted from the communication unit 203 (FIG. 9) to the parking assistance device 51 as necessary.
 <Operation of parking assistance vehicle-side device>
 Next, the operation of the parking assistance vehicle-side device 81 will be described with reference to the flowchart of FIG. 12.
 In step S201, the imaging devices 401R and 401L provided in the vehicle 20 start imaging. The images captured by the imaging devices 401 include the random pattern or marker pattern projected by the projection devices 53.
 When imaging by the imaging devices 401 has started, the self-position estimation processing unit 201 starts self-position estimation in step S202. The self-position estimation processing unit 201 generates a local map and estimates the position of the host vehicle within the parking lot 10. Specifically, the estimation proceeds as follows.
 The image correction unit 321L and the image correction unit 321R correct the left image supplied from the imaging device 401L and the right image supplied from the imaging device 401R, respectively, so that the two images face the same direction. The image correction unit 321L supplies the corrected left image to the feature point detection unit 322 and the motion matching unit 327. The image correction unit 321R supplies the corrected right image to the parallax matching unit 323.
 The feature point detection unit 322 detects feature points in the left image. Any feature point detection method, such as Harris corner detection, can be used. The feature point detection unit 322 supplies two-dimensional position information indicating the position of each detected feature point in the image coordinate system to the parallax matching unit 323.
 The parallax matching unit 323 detects the feature points in the right image that correspond to the feature points detected in the left image, and supplies two-dimensional position information indicating the position of each feature point in the image coordinate system of the left image and of the right image to the distance estimation unit 324.
 The distance estimation unit 324 estimates the distance to each feature point based on the parallax of the feature point between the left and right images, and further calculates the position of each feature point in the three-dimensional spatial coordinate system. The distance estimation unit 324 supplies three-dimensional position information indicating the position of each feature point in the spatial coordinate system to the feature amount calculation unit 325.
 The feature amount calculation unit 325 calculates a feature amount for each feature point of the left image. Any feature amount, such as SURF (Speeded Up Robust Features), can be used. The feature amount calculation unit 325 stores feature point information including the three-dimensional position information and feature amount of each feature point in the map information storage unit 326.
 The motion matching unit 327 acquires from the map information storage unit 326 the three-dimensional position information of each feature point detected in the previous frame. Next, the motion matching unit 327 detects, in the left image of the current frame, the feature points corresponding to the feature points detected in the previous frame. The motion matching unit 327 then supplies to the movement amount estimation unit 328 the three-dimensional position information of each feature point in the previous frame and the two-dimensional position information indicating its position in the image coordinate system of the current frame.
 移動量推定部328は、1つ前のフレームと現在のフレームとの間における自車(より正確には、撮像装置401L)の移動量を推定する。例えば、移動量推定部328は、次式(1)のコスト関数fの値が最小となる移動量dXを算出する。 The movement amount estimation unit 328 estimates the movement amount of the host vehicle (more precisely, the imaging device 401L) between the previous frame and the current frame. For example, the movement amount estimation unit 328 calculates the movement amount dX that minimizes the value of the cost function f in the following equation (1).
f=Σ||Zt-proj(dX,Mt-1)||2 ・・・(1) f = Σ || Z t -proj (dX, M t-1 ) || 2 (1)
 The movement amount dX indicates the change in the position and attitude of the host vehicle (more precisely, of the imaging device 401L) from the previous frame to the current frame. For example, dX represents translation along the three axes of the spatial coordinate system (three degrees of freedom) and rotation about each of those axes (three degrees of freedom).
 Mt-1 and Zt indicate the positions of the corresponding feature point in the previous frame and in the current frame, respectively. More specifically, Mt-1 is the position of the feature point in the spatial coordinate system in the previous frame, and Zt is its position in the image coordinate system of the current frame.
 Further, proj(dX, Mt-1) is the position obtained by projecting Mt-1, the feature point's position in the spatial coordinate system in the previous frame, onto the image coordinate system of the left image of the current frame using the movement amount dX. That is, proj(dX, Mt-1) is an estimate of the feature point's position in the left image of the current frame, based on its position Mt-1 in the previous frame and the movement amount dX.
 The movement amount estimation unit 328 obtains the movement amount dX that minimizes the sum of squares of Zt - proj(dX, Mt-1) over the feature points in equation (1), for example by the least-squares method. That is, it obtains the movement amount dX that minimizes the error made when the positions of the feature points in the image coordinate system of the current frame's left image are estimated from their positions Mt-1 in the spatial coordinate system in the previous frame and the movement amount dX. The movement amount estimation unit 328 supplies movement amount information indicating the obtained movement amount dX to the object detection unit 303, the position/orientation estimation unit 332, and the speed estimation unit 333.
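As an illustration, the least-squares estimation above can be sketched as follows. This is a simplified, translation-only (3-DoF) version of the 6-DoF problem, with an assumed pinhole camera model and a Gauss-Newton solver using a finite-difference Jacobian; the intrinsic parameters and function names are illustrative assumptions, not taken from the embodiment.

```python
import numpy as np

# Assumed pinhole intrinsics for illustration; the embodiment does not specify them.
FX = FY = 500.0
CX, CY = 320.0, 240.0

def proj(dX, M):
    """Project 3-D points M (N x 3), given in the previous camera frame, into the
    current image, assuming the camera translated by dX (no rotation -- a
    simplification of the 6-DoF case described in the text)."""
    P = M - dX                               # points in the current camera frame
    u = FX * P[:, 0] / P[:, 2] + CX
    v = FY * P[:, 1] / P[:, 2] + CY
    return np.stack([u, v], axis=1)

def estimate_motion(Z, M, iters=20):
    """Find dX minimizing f = sum ||Zt - proj(dX, Mt-1)||^2 (equation (1))
    by Gauss-Newton with a finite-difference Jacobian."""
    dX = np.zeros(3)
    eps = 1e-6
    for _ in range(iters):
        r = (Z - proj(dX, M)).ravel()        # stacked residuals
        J = np.zeros((r.size, 3))
        for k in range(3):
            d = np.zeros(3)
            d[k] = eps
            J[:, k] = ((Z - proj(dX + d, M)).ravel() - r) / eps
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        dX += step
        if np.linalg.norm(step) < 1e-10:
            break
    return dX
```

Given noise-free correspondences generated by the same model, the solver recovers the true translation; with real feature tracks the residual at the minimum would be nonzero.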
 The position/orientation estimation unit 332 acquires the position and orientation information for the previous frame from the position/orientation information storage unit 331. The position/orientation estimation unit 332 then estimates the current position and orientation of the host vehicle by adding the movement amount dX estimated by the movement amount estimation unit 328 to the position and orientation of the host vehicle in the previous frame.
 When estimating the position and orientation of the host vehicle for the first frame, the position/orientation estimation unit 332 acquires the initial position and orientation information from the position/orientation information storage unit 331, and estimates the position and orientation of the host vehicle by adding the movement amount dX estimated by the movement amount estimation unit 328 to the initial position and orientation of the host vehicle.
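This dead-reckoning accumulation can be sketched as follows, reduced to a 2-D position plus a yaw angle for brevity (the embodiment uses three translational and three rotational degrees of freedom); the function name and the convention that the translation increment is expressed in the vehicle frame are assumptions.

```python
import numpy as np

def update_pose(prev_pos, prev_yaw, d_pos, d_yaw):
    """Dead-reckoning update: compose the per-frame movement (d_pos, d_yaw)
    with the previous pose. The translation increment is assumed to be
    expressed in the vehicle frame, so it is rotated into the world frame
    before being added."""
    c, s = np.cos(prev_yaw), np.sin(prev_yaw)
    R = np.array([[c, -s],
                  [s,  c]])
    return prev_pos + R @ d_pos, prev_yaw + d_yaw
```

For example, driving one unit forward, turning 90 degrees, and driving one unit forward again ends at world position (1, 1).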
 In addition, the position/orientation estimation unit 332 corrects the estimated position and orientation of the host vehicle as necessary, based on the global map stored in the map information storage unit 326.
 The position/orientation estimation unit 332 supplies position/orientation information indicating the estimated position and orientation of the host vehicle to the position information generation unit 302 and the speed estimation unit 333, and also stores it in the position/orientation information storage unit 331.
 The speed estimation unit 333 estimates the speed of the host vehicle. Specifically, the speed estimation unit 333 estimates the speed of the host vehicle by dividing the movement amount dX estimated by the movement amount estimation unit 328 by the elapsed time. The speed estimation unit 333 supplies speed information indicating the estimated speed to the position information generation unit 302.
 The object detection unit 303 detects surrounding objects, for example parked vehicles and structures such as pillars. Specifically, the object detection unit 303 acquires the feature point information of the previous frame and the current frame from the map information storage unit 326. Next, the object detection unit 303 matches the feature points of the previous frame against those of the current frame and detects the movement of each feature point between the frames.
 Next, the object detection unit 303 separates the feature points whose motion is consistent with the motion of the host vehicle, based on the movement amount dX estimated by the movement amount estimation unit 328, from the feature points whose motion is not. The object detection unit 303 then detects stationary objects around the host vehicle from the feature points whose motion is consistent with the host vehicle's motion, and detects moving bodies around the host vehicle from the feature points whose motion is not. Finally, the object detection unit 303 notifies the local map generation unit 304 of the detection results for the stationary objects and moving bodies around the host vehicle.
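The consistency check that separates static from moving feature points can be sketched as follows. For simplicity this version works with point coordinates in a common 2-D vehicle-frame plane rather than in the image plane, and the residual threshold is an illustrative assumption: when the vehicle translates by dX, a stationary point's vehicle-frame coordinates shift by -dX, so points whose observed shift deviates from that prediction are flagged as moving.

```python
import numpy as np

def classify_feature_points(prev_pts, curr_pts, dX, thresh=1.0):
    """Split matched feature points into 'static' and 'moving' by checking
    whether each point's frame-to-frame displacement is explained by the
    host vehicle's own motion dX (all quantities in the vehicle frame)."""
    predicted = prev_pts - dX                      # where static points should appear
    err = np.linalg.norm(curr_pts - predicted, axis=1)
    static = err <= thresh
    return static, ~static
```

Points classified as static would feed the stationary-object detection, and the rest the moving-body detection.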
 The local map generation unit 304 acquires the feature point information of the current frame from the map information storage unit 326. Next, the local map generation unit 304 deletes from the acquired feature point information the information on the feature points of the surrounding moving bodies detected by the object detection unit 303. The local map generation unit 304 then generates a local map from the remaining feature point information.
 In this way, the self-position is estimated and the local map is generated.
 In step S203, the communication unit 203 communicates with the parking support device 51, receives the route information, the global map, and the information on the markers, and supplies them to the information holding unit 205 to be held. As described above, this communication takes place at the entrance gate 11, at an information providing location, or the like, where the information is received.
 In step S204, the local route generation unit 206 generates a local route. The local route can be the route specified in the route information, finely adjusted using the local map. For example, as described above, at a place on the specified route where a vehicle is parked protruding beyond its parking space, the route is finely adjusted so as to avoid that vehicle. The host vehicle may also deviate from the route; in such a case, the route is corrected.
 When the host vehicle arrives at the parking start position, a route from the parking start position to the parking space is generated. Parking requires finer maneuvering than ordinary driving, such as turning back, so the generated route may be set in finer detail, or may take into account the environment (situation) around the parking space.
 The guidance to the parking space may be performed by making the pattern projected from the projection device 53 near the parking start position a marker pattern, and using the changes of the markers in the captured marker pattern.
 The markers of the captured marker pattern change as the vehicle 20 moves, even though the markers themselves are the same. From those changes it is determined whether or not the vehicle is traveling accurately along the specified route, and when it is determined that it is not, a correction is made; by repeating this control, the vehicle may be guided to the parking space.
 In step S205, the vehicle control signal generation unit 207 generates a control signal for controlling the speed and direction of the vehicle 20, based on the local route generated by the local route generation unit 206.
 In step S206, the notification generation unit 208 notifies the user (driver) of the route. For example, the route is notified to the user by means of a navigation system, displaying the route on the navigation system's display or announcing turns and the like by voice.
 Note that only one of the two — generation of the control signal for controlling the vehicle 20, or notification of the route to the user — may be performed. For example, only the control signal for controlling the vehicle 20 may be generated, so that the vehicle 20 is controlled based on the local route and the burden on the driver is reduced. Alternatively, only the notification of the route to the user may be performed, and the user (driver) may use that notification to drive to and park in the designated parking space.
 In step S207, the global map update determination unit 204 determines whether or not the global map held in the information holding unit 205 needs to be updated. When the global map update determination unit 204 determines that there is a difference between the local map generated by the self-position estimation processing unit 201 and the corresponding portion of the global map held in the information holding unit 205, it determines that the global map is to be updated.
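One possible form of this difference check is sketched below. The nearest-neighbor comparison and both threshold values are assumptions for illustration; the embodiment does not specify how the difference between the local map and the global map is measured.

```python
import numpy as np

def needs_update(global_pts, local_pts, tol=0.5, max_unmatched=0.2):
    """Decide whether the global map should be updated: for each feature point
    in the local map, check whether some global-map point lies within tol of
    it; if the fraction of unmatched local points exceeds max_unmatched, the
    maps are considered to differ at this location."""
    unmatched = 0
    for p in local_pts:
        d = np.linalg.norm(global_pts - p, axis=1)
        if d.min() > tol:
            unmatched += 1
    return unmatched / len(local_pts) > max_unmatched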
 When the global map update determination unit 204 determines in step S207 that the global map is to be updated, the process proceeds to step S208, where information on the portion of the global map held in the information holding unit 205 that differs from the local map (global map update information) is generated and provided to the parking support device 51.
 A process may also be performed in which the portion of the global map held in the information holding unit 205 that differs from the local map is updated to match the local map.
 As the global map update information, the local map itself may be supplied to the parking support device 51 via the communication unit 203. As described above, when the parking support device 51 receives the local map, it updates the global map that it holds.
 Note that, when the global map update determination unit 204 determines that the global map needs to be updated, it may execute only the process of transmitting the local map to the parking support device 51, without updating the global map held in the information holding unit 205. That is, the global map update determination unit 204 can also be configured so that, when an update is needed to bring the global map held on the parking support device 51 side into line with the environment (situation) in the parking lot 10, it notifies the parking support device 51 that an update is necessary and supplies a local map indicating the portion to be updated.
 On the other hand, when it is determined in step S207 that the global map is not to be updated, or when the global map has been updated in step S208 (when the local map has been provided to the parking support device 51 side), the process proceeds to step S209.
 In step S209, it is determined whether or not a marker has been detected. The self-position estimation processing unit 201 determines whether or not a marker held in the information holding unit 205 has been detected in an image captured by the imaging device 401. When it is determined in step S209 that a marker has been detected, the process proceeds to step S210.
 In step S210, the self-position is corrected. A projection device 53 installed at a predetermined position on the global map projects a marker pattern, and a marker is detected when the marker pattern projected by that projection device 53 is imaged. When a marker pattern projected at such a specific position is imaged and a marker is detected, the self-position estimation processing unit 201 corrects the position that it has estimated.
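The correction upon marker detection can be sketched as a blend between the dead-reckoned estimate and the position derived from the marker's known global-map location. The blending gain is an illustrative choice; the embodiment only states that the estimated position is corrected.

```python
import numpy as np

def correct_self_position(estimated_pos, marker_derived_pos, gain=1.0):
    """Pull the drift-prone dead-reckoning estimate toward the position
    derived from an observed marker at a known global-map location.
    gain=1.0 snaps fully to the marker-derived position; smaller values
    blend the two sources."""
    estimated_pos = np.asarray(estimated_pos, dtype=float)
    marker_derived_pos = np.asarray(marker_derived_pos, dtype=float)
    return estimated_pos + gain * (marker_derived_pos - estimated_pos)
```

With the default gain the accumulated drift is discarded entirely at each marker observation; a smaller gain would instead filter out measurement noise in the marker observation.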
 On the other hand, when it is determined in step S209 that no marker has been detected, the process of step S210 is skipped and the process proceeds to step S211.
 In step S211, it is determined whether or not the destination has been reached. The parking start position may be set as the destination, or the destination may be the completion of parking in the parking space. The determination of whether the destination has been reached can also be made by having the projection device 53 installed at the position set as the destination project a marker pattern, and determining whether or not a marker from that marker pattern has been detected.
 Until it is determined in step S211 that the destination has been reached, the process returns to step S204 and the subsequent processing is repeated. That is, the process of generating a local route and moving toward the destination based on that local route continues to be executed.
 On the other hand, when it is determined in step S211 that the destination has been reached, the processing of the parking support vehicle-side device 81 ends.
 In this way, the parking support vehicle-side device 81 travels along the route set by the parking support device 51 while creating a local map and a local route, and can therefore assist the driver when parking in the parking lot 10 and reduce the burden on the driver.
 Although the case where assistance is provided from entering the parking lot 10 until parking in a parking space has been described here as an example, assistance from the parking space until exiting can be provided in the same way, and the present technology can also be applied to such exiting. That is, the parking support device 51 generates a route from the parking space to the exit gate, and the vehicle 20 travels based on that route, so that assistance is provided until the vehicle exits.
 In the embodiment described above, the case of providing assistance when parking in the parking lot 10 has been taken as an example; the present technology can be applied whether the parking lot 10 is indoors or outdoors. Indoors, the projection devices 53 are installed on the ceiling; outdoors, poles or the like are erected and the projection devices 53 are installed on them. The installation locations are chosen as appropriate for the parking lot 10.
 Furthermore, although the embodiment described above takes assistance when parking in the parking lot 10 as an example, the present technology is not limited to assistance within a parking lot and can also be applied to assistance in other places. For example, the present technology can be applied to assistance for a vehicle traveling on an expressway.
 In addition, although the case of assisting the vehicle 20 has been described as an example, the present technology is not applied only to the vehicle 20; it can also be applied to other objects (moving bodies). For example, the present technology can also be applied to vehicles (robots) traveling inside a factory, aircraft (including drones), and the like.
 In the embodiment described above, the position, attitude, speed, and so on of the vehicle 20 are estimated by a stereo camera system using two imaging devices 401 mounted on the vehicle 20; however, the position, attitude, speed, and so on of the moving body may also be estimated using one imaging device, or three or more imaging devices.
 It is also possible to estimate the position, attitude, speed, and so on of the moving body from images captured from the vehicle (moving body) by a method other than the one described above.
 The present technology can be applied not only when the moving body is a vehicle driven by a motor, but also when it is a vehicle running on rails or overhead wires, a vehicle moved by human power, and so on. Furthermore, the present technology can be applied regardless of the vehicle's driving method (for example, automated driving, manual driving, remote operation, and so on).
 The present technology can also be applied, for example, where many moving bodies come and go, or where blind spots from a moving body are likely to occur. For example, when people wearing head-mounted displays use AR (augmented reality) or VR (virtual reality) applications, it can be applied to guiding them to predetermined positions while avoiding accidents such as collisions, rear-end collisions, and contact between people on streets or at event venues where many people come and go.
<About recording media>
 The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed on a computer. Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions when various programs are installed on it.
 FIG. 13 is a block diagram showing an example of the hardware configuration of a computer that executes the series of processes described above by means of a program. In the computer, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are connected to one another by a bus 1004. An input/output interface 1005 is further connected to the bus 1004. An input unit 1006, an output unit 1007, a storage unit 1008, a communication unit 1009, and a drive 1010 are connected to the input/output interface 1005.
 The input unit 1006 includes a keyboard, a mouse, a microphone, and the like. The output unit 1007 includes a display, a speaker, and the like. The storage unit 1008 includes a hard disk, a nonvolatile memory, and the like. The communication unit 1009 includes a network interface and the like. The drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory.
 In a computer configured as described above, the series of processes described above is performed by the CPU 1001 loading, for example, a program stored in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004, and executing it.
 The program executed by the computer (CPU 1001) can be provided recorded on the removable medium 1011, for example as packaged media. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
 On the computer, the program can be installed in the storage unit 1008 via the input/output interface 1005 by mounting the removable medium 1011 in the drive 1010. The program can also be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. Alternatively, the program can be installed in advance in the ROM 1002 or the storage unit 1008.
 The program executed by the computer may be a program whose processing is performed in time series in the order described in this specification, or a program whose processing is performed in parallel or at necessary timing, such as when a call is made.
 In this specification, a system means an entire apparatus composed of a plurality of devices.
 The effects described in this specification are merely examples and are not limiting; other effects may also exist.
 Embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible without departing from the gist of the present technology.
 なお、本技術は以下のような構成も取ることができる。
(1)
 所定のパターンを投影する複数の投影装置を制御する投影制御部と、
 移動体が目的地まで移動するときの経路を作成する経路作成部と
 を備え、
 前記投影制御部は、前記経路作成部で作成された前記経路上に位置する前記投影装置の投影を制御する
 情報処理装置。
(2)
 前記投影制御部は、所定の位置に設置されている前記投影装置が投影するパターンを、マーカーパターンとし、前記所定の位置以外に設置されている前記投影装置が投影するパターンを、ランダムパターンとする
 前記(1)に記載の情報処理装置。
(3)
 前記所定の位置は、曲がり角、または前記目的地付近の少なくとも一方である
 前記(2)に記載の情報処理装置。
(4)
 前記経路作成部は、前記移動体の幅、長さ、高さ、最小回転半径のうちの少なくとも1つの情報を用いて、前記移動体が移動できる前記目的地までの経路を作成する
 前記(1)乃至(3)のいずれかに記載の情報処理装置。
(5)
 前記目的地は、駐車スペースであり、
 前記経路作成部は、前記移動体が、前記駐車スペースに駐車するためのアプローチを開始する位置まで前記経路として作成する
 前記(1)乃至(4)のいずれかに記載の情報処理装置。
(6)
 所定の領域内の特徴点の3次元空間の位置を示すグローバルマップを保持する保持部と、
 前記移動体から撮影された画像内の特徴点の3次元空間内の位置を示すローカルマップを取得し、前記ローカルマップに基づいて、前記グローバルマップを更新するグローバルマップ更新部を
 さらに備える
 前記(1)乃至(5)のいずれかに記載の情報処理装置。
(7)
 前記経路作成部は、前記保持部に保持されている前記グローバルマップを参照して経路を作成する
 前記(6)に記載の情報処理装置。
(8)
 前記移動体に、前記経路作成部で作成された経路、前記マーカーパターンのマーカー情報、および所定の領域内の特徴点の3次元空間の位置を示すグローバルマップを供給する
 前記(2)に記載の情報処理装置。
(9)
 所定のパターンを投影する複数の投影装置を制御し、
 移動体が目的地まで移動するときの経路を作成する
 ステップを含み、
 前記投影の制御は、作成された前記経路上に位置する前記投影装置の投影を制御することで行われる
 情報処理方法。
(10)
 投影装置で投影された所定のパターンを撮像する撮像部と、
 前記撮像部で撮像された前記パターンを用いて自己位置の推定を行う推定部と、
 他の装置から供給される経路と、推定された前記自己位置に基づき、前記経路上を移動するために各部を制御するための制御信号を生成する制御信号生成部と
 を備える情報処理装置。
(11)
 前記撮像部で撮影された画像内の特徴点の3次元空間内の位置を示すローカルマップを生成するローカルマップ生成部を
 さらに備える前記(10)に記載の情報処理装置。
(12)
 前記他の装置から供給される所定の領域内の特徴点の3次元空間の位置を示すグローバルマップを保持する保持部と、
 前記グローバルマップ内で、前記ローカルマップに対応する位置に変更があったか否かを判定する判定部と
 をさらに備え、
 前記判定部により、変更があったと判定された場合、前記ローカルマップを前記他の装置に供給する
 前記(11)に記載の情報処理装置。
(13)
 前記経路を、前記ローカルマップで修正する
 前記(11)に記載の情報処理装置。
(14)
 前記所定のパターンからマーカーが検出された場合、前記推定部で推定されている前記自己位置を補正する
 前記(10)乃至(13)のいずれかに記載の情報処理装置。
(15)
 前記制御信号は、ユーザに前記経路を通知する信号である
 前記(10)乃至(14)のいずれかに記載の情報処理装置。
(16)
 投影装置で投影された所定のパターンを撮像し、
 撮像された前記パターンを用いて自己位置の推定を行い、
 他の装置から供給される経路と、推定された前記自己位置に基づき、前記経路上を移動するために各部を制御するための制御信号を生成する
 ステップを含む情報処理方法。
In addition, this technique can also take the following structures.
(1)
A projection control unit that controls a plurality of projection devices that project a predetermined pattern;
A route creation unit that creates a route when the moving body moves to the destination, and
The projection control unit controls projection of the projection device located on the path created by the path creation unit.
(2)
The projection control unit sets a pattern projected by the projection apparatus installed at a predetermined position as a marker pattern, and sets a pattern projected by the projection apparatus installed at a position other than the predetermined position as a random pattern. The information processing apparatus according to (1).
(3)
The information processing apparatus according to (2), wherein the predetermined position is at least one of a corner and a vicinity of the destination.
(4)
The route creation unit creates a route to the destination to which the moving body can move using at least one of the width, length, height, and minimum rotation radius of the moving body. The information processing apparatus according to any one of (3) to (3).
(5)
The destination is a parking space;
The information processing apparatus according to any one of (1) to (4), wherein the route creation unit creates the route up to a position where the moving body starts an approach for parking in the parking space.
(6)
A holding unit for holding a global map indicating the position of a feature point in a three-dimensional space within a predetermined area;
The system further includes a global map update unit that acquires a local map indicating a position in a three-dimensional space of a feature point in an image photographed from the moving body, and updates the global map based on the local map. The information processing apparatus according to any one of (5) to (5).
(7)
The information processing apparatus according to (6), wherein the route creation unit creates a route with reference to the global map held in the holding unit.
(8)
The information processing device according to (2), wherein the route created by the route creation unit, marker information of the marker pattern, and a global map indicating the positions of feature points in a three-dimensional space within a predetermined region are supplied to the moving body.
(9)
An information processing method including the steps of:
controlling a plurality of projection devices that each project a predetermined pattern; and
creating a route for a moving body to travel to a destination,
wherein the projection control is performed by controlling the projection of the projection devices located on the created route.
(10)
An information processing device including:
an imaging unit that captures a predetermined pattern projected by a projection device;
an estimation unit that estimates a self-position using the pattern captured by the imaging unit; and
a control signal generation unit that generates, based on a route supplied from another device and the estimated self-position, a control signal for controlling each unit so as to move along the route.
(11)
The information processing device according to (10), further including a local map generation unit that generates a local map indicating the positions, in a three-dimensional space, of feature points in the image captured by the imaging unit.
(12)
The information processing device according to (11), further including:
a holding unit that holds a global map, supplied from the other device, indicating the positions of feature points in a three-dimensional space within a predetermined region; and
a determination unit that determines whether the position in the global map corresponding to the local map has changed,
wherein the local map is supplied to the other device when the determination unit determines that there has been a change.
(13)
The information processing device according to (11), wherein the route is corrected using the local map.
(14)
The information processing device according to any one of (10) to (13), wherein the self-position estimated by the estimation unit is corrected when a marker is detected from the predetermined pattern.
(15)
The information processing device according to any one of (10) to (14), wherein the control signal is a signal for notifying a user of the route.
(16)
An information processing method including the steps of:
capturing a predetermined pattern projected by a projection device;
estimating a self-position using the captured pattern; and
generating, based on a route supplied from another device and the estimated self-position, a control signal for controlling each unit so as to move along the route.
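As a rough, non-authoritative illustration of the vehicle-side processing described in (10) through (16), the sketch below combines a pose estimate derived from the captured projected pattern with the route supplied by the parking-support device to produce a control signal. The function names, the feature-matching step (a simple average standing in for a real SLAM/PnP solver), and the waypoint-following rule are all hypothetical simplifications, not taken from the publication.

```python
# Hypothetical sketch of the vehicle-side loop in (10)-(16): estimate a
# self-position from feature points of the captured projected pattern,
# then generate a control signal that steers toward the next waypoint of
# the route supplied by the parking-support device.

def estimate_self_position(observed_ids, global_map, previous_pose):
    """Average the global-map positions of matched feature points
    (a stand-in for a real pose estimator)."""
    matches = [global_map[i] for i in observed_ids if i in global_map]
    if not matches:
        return previous_pose  # no match: keep the last estimate
    x = sum(m[0] for m in matches) / len(matches)
    y = sum(m[1] for m in matches) / len(matches)
    return (x, y)

def generate_control_signal(pose, route, reach_radius=0.5):
    """Drop waypoints already reached, then head toward the next one."""
    while route and abs(route[0][0] - pose[0]) + abs(route[0][1] - pose[1]) < reach_radius:
        route = route[1:]
    if not route:
        return {"steer": 0.0, "speed": 0.0}  # destination reached
    dx = route[0][0] - pose[0]
    dy = route[0][1] - pose[1]
    # Steer proportionally to the lateral offset; cap the speed.
    return {"steer": dx, "speed": min(1.0, abs(dx) + abs(dy))}
```

In a real system the marker pattern of (14) would additionally reset accumulated drift in `previous_pose`; here that correction is omitted for brevity.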
10 parking lot, 20 vehicle, 51 parking support device, 52 imaging device, 53 projection device, 61 network, 71 database, 81 parking-support vehicle-side device, 101 entering-vehicle detection unit, 102 vehicle information acquisition unit, 103 projection control unit, 104 pattern holding unit, 105 route creation unit, 106 global map holding unit, 107 global map update unit, 108 communication unit, 201 self-position estimation processing unit, 202 information reception instruction unit, 203 communication unit, 204 global map update determination unit, 205 information holding unit, 206 local route generation unit, 207 vehicle control signal generation unit, 208 notification generation unit

Claims (16)

  1.  An information processing device comprising:
      a projection control unit that controls a plurality of projection devices that each project a predetermined pattern; and
      a route creation unit that creates a route for a moving body to travel to a destination,
      wherein the projection control unit controls the projection of the projection devices located on the route created by the route creation unit.
  2.  The information processing device according to claim 1, wherein the projection control unit sets the pattern projected by a projection device installed at a predetermined position as a marker pattern, and sets the pattern projected by a projection device installed at any other position as a random pattern.
  3.  The information processing device according to claim 2, wherein the predetermined position is at least one of a corner and the vicinity of the destination.
  4.  The information processing device according to claim 1, wherein the route creation unit creates a route along which the moving body can travel to the destination, using at least one of the width, length, height, and minimum turning radius of the moving body.
  5.  The information processing device according to claim 1, wherein the destination is a parking space, and the route creation unit creates the route up to a position at which the moving body starts an approach for parking in the parking space.
  6.  The information processing device according to claim 1, further comprising:
      a holding unit that holds a global map indicating the positions of feature points in a three-dimensional space within a predetermined region; and
      a global map update unit that acquires a local map indicating the positions, in a three-dimensional space, of feature points in an image captured from the moving body, and updates the global map based on the local map.
  7.  The information processing device according to claim 6, wherein the route creation unit creates the route with reference to the global map held in the holding unit.
  8.  The information processing device according to claim 2, wherein the route created by the route creation unit, marker information of the marker pattern, and a global map indicating the positions of feature points in a three-dimensional space within a predetermined region are supplied to the moving body.
  9.  An information processing method including the steps of:
      controlling a plurality of projection devices that each project a predetermined pattern; and
      creating a route for a moving body to travel to a destination,
      wherein the projection control is performed by controlling the projection of the projection devices located on the created route.
  10.  An information processing device comprising:
      an imaging unit that captures a predetermined pattern projected by a projection device;
      an estimation unit that estimates a self-position using the pattern captured by the imaging unit; and
      a control signal generation unit that generates, based on a route supplied from another device and the estimated self-position, a control signal for controlling each unit so as to move along the route.
  11.  The information processing device according to claim 10, further comprising a local map generation unit that generates a local map indicating the positions, in a three-dimensional space, of feature points in the image captured by the imaging unit.
  12.  The information processing device according to claim 11, further comprising:
      a holding unit that holds a global map, supplied from the other device, indicating the positions of feature points in a three-dimensional space within a predetermined region; and
      a determination unit that determines whether the position in the global map corresponding to the local map has changed,
      wherein the local map is supplied to the other device when the determination unit determines that there has been a change.
  13.  The information processing device according to claim 11, wherein the route is corrected using the local map.
  14.  The information processing device according to claim 10, wherein the self-position estimated by the estimation unit is corrected when a marker is detected from the predetermined pattern.
  15.  The information processing device according to claim 10, wherein the control signal is a signal for notifying a user of the route.
  16.  An information processing method including the steps of:
      capturing a predetermined pattern projected by a projection device;
      estimating a self-position using the captured pattern; and
      generating, based on a route supplied from another device and the estimated self-position, a control signal for controlling each unit so as to move along the route.
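To make the infrastructure-side control of claims 1 through 3 concrete, the sketch below shows one plausible selection rule: only projectors whose position lies on the created route are driven, and those at a corner or at the destination are assigned the marker pattern while the remaining on-route projectors are assigned the random pattern. The projector table, grid-cell positions, and command strings are hypothetical illustrations, not part of the claims.

```python
# Hypothetical sketch of the projection control in claims 1-3: activate
# only projectors on the created route; projectors at a corner or at the
# destination project a marker pattern (an absolute-position reference),
# all other on-route projectors project a random pattern (texture for
# the vehicle's self-localization).

def plan_projection(projectors, route, corners, destination):
    """Map each projector id to a projection command based on its cell."""
    route_cells = set(route)
    commands = {}
    for pid, cell in projectors.items():
        if cell not in route_cells:
            commands[pid] = "off"      # not on the route: stay dark
        elif cell in corners or cell == destination:
            commands[pid] = "marker"   # predetermined position (claim 3)
        else:
            commands[pid] = "random"   # other on-route positions (claim 2)
    return commands
```

A route change (for example, after a global map update under claim 6) would simply be handled by calling `plan_projection` again with the new route.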
PCT/JP2016/077426 2015-09-30 2016-09-16 Information processing device, information processing method WO2017057053A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015193359 2015-09-30
JP2015-193359 2015-09-30

Publications (1)

Publication Number Publication Date
WO2017057053A1 true WO2017057053A1 (en) 2017-04-06

Family

ID=58423718

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/077426 WO2017057053A1 (en) 2015-09-30 2016-09-16 Information processing device, information processing method

Country Status (1)

Country Link
WO (1) WO2017057053A1 (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010256974A (en) * 2009-04-21 2010-11-11 Mitsubishi Electric Corp Position acquisition transmission device
JP2011186808A (en) * 2010-03-09 2011-09-22 Sony Corp Information processing apparatus, map update method, program, and information processing system
JP2012185202A (en) * 2011-03-03 2012-09-27 Toyota Central R&D Labs Inc Local map generation device, global map generation device and program
JP2013178718A (en) * 2012-02-29 2013-09-09 Casio Comput Co Ltd Parking support system
JP2013178213A (en) * 2012-02-29 2013-09-09 Casio Comput Co Ltd Portable lighting device, and guiding method and program
JP2015096411A (en) * 2013-10-11 2015-05-21 本田技研工業株式会社 Parking support system

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018225177A1 (en) * 2017-06-07 2018-12-13 三菱電機株式会社 Empty space notification device, empty space notification system, and empty space notification method
JPWO2018225177A1 (en) * 2017-06-07 2019-11-07 三菱電機株式会社 Free space notification device, free space notification system, and free space notification method
CN110709910A (en) * 2017-06-07 2020-01-17 三菱电机株式会社 Free space notification device, free space notification system, and free space notification method
WO2019144622A1 (en) * 2018-01-26 2019-08-01 京东方科技集团股份有限公司 Parking management system and method
US11410553B2 (en) 2018-01-26 2022-08-09 Boe Technology Group Co., Ltd. Parking management device, system and method
JP2020027321A (en) * 2018-08-09 2020-02-20 三菱重工機械システム株式会社 Travel control device, automatic travel vehicle, operation system, travel control method, and program
JP7177627B2 (en) 2018-08-09 2022-11-24 三菱重工機械システム株式会社 Driving control device, automatic driving vehicle, operation system, driving control method, and program
CN110972111B (en) * 2018-10-01 2024-04-23 现代自动车株式会社 Method for detecting a caller by an autonomous vehicle
CN110972111A (en) * 2018-10-01 2020-04-07 现代自动车株式会社 Method for detecting caller by autonomous vehicle
JP2020102127A (en) * 2018-12-25 2020-07-02 清水建設株式会社 Buried object visualizing system
EP3674182A1 (en) * 2018-12-31 2020-07-01 Hyundai Motor Company System, method, infrastructure, and vehicle for automated valet parking
KR20200092442A (en) * 2018-12-31 2020-08-04 현대자동차주식회사 Automated Valet Parking System and method, infrastructure and vehicle thereof
CN111508251A (en) * 2018-12-31 2020-08-07 现代自动车株式会社 System, method, infrastructure and vehicle for autonomous valet parking
KR102651412B1 (en) 2018-12-31 2024-03-27 현대자동차주식회사 Automated Valet Parking System and method, infrastructure and vehicle thereof
US11377097B2 (en) 2018-12-31 2022-07-05 Hyundai Motor Company System, method, infrastructure, and vehicle for automated valet parking
CN111508251B (en) * 2018-12-31 2023-02-28 现代自动车株式会社 System, method, infrastructure and vehicle for autonomous valet parking
JP2020152234A (en) * 2019-03-20 2020-09-24 クラリオン株式会社 On-vehicle processing device, and movement support system
JP7393128B2 (en) 2019-03-20 2023-12-06 フォルシアクラリオン・エレクトロニクス株式会社 In-vehicle processing equipment, mobility support system
US20220041179A1 (en) * 2020-02-14 2022-02-10 Panasonic Intellectual Property Management Co., Ltd. Information processing method, information processing system, and information processing device
US11866065B2 (en) * 2020-02-14 2024-01-09 Panasonic Intellectual Property Management Co., Ltd. Information processing method, information processing system, and information processing device
WO2021161671A1 (en) * 2020-02-14 2021-08-19 パナソニックIpマネジメント株式会社 Information processing method, information processing system, and information processing device
JP7442381B2 (en) 2020-04-21 2024-03-04 Ihi運搬機械株式会社 Warehousing support method and warehousing support device
CN114120692A (en) * 2021-12-01 2022-03-01 航天科工哈尔滨风华有限公司电站设备分公司 Parking lot vacancy indication navigation system and method
WO2023166275A1 (en) * 2022-03-01 2023-09-07 Eloy Limited Method and apparatus for in-vehicle navigation and vehicle bay directions

Similar Documents

Publication Publication Date Title
WO2017057053A1 (en) Information processing device, information processing method
CN108732589B (en) Automatic acquisition of training data for object recognition using 3D LIDAR and localization
US9983020B2 (en) Vehicle operation device and method
KR102399591B1 (en) System for determining the location of entrances and areas of interest
CN108628324B (en) Unmanned vehicle navigation method, device, equipment and storage medium based on vector map
JP4771147B2 (en) Route guidance system
US20190228664A1 (en) Vehicle calling system
WO2018142852A1 (en) Movement assistance system, movement assistance device, movement assistance terminal, movement assistance method, map generating system, map generating device, and information acquisition terminal
JPWO2006064544A1 (en) Car storage equipment
RU2746684C1 (en) Parking control method and parking control equipment
KR20140003987A (en) Slam system for mobile robot based on vision sensor data and motion sensor data fusion
CN113997931A (en) Bird&#39;s-eye view image generation device, bird&#39;s-eye view image generation system, and automatic parking device
CN110858452A (en) Parking management system and method
JP2015131713A (en) Management system, flight control method, flight control program, and recording medium
CN110858453A (en) Autonomous parking in an indoor parking facility
US11785430B2 (en) System and method for real-time indoor navigation
JP2010086416A (en) Autonomous movement device
CN110597265A (en) Recharging method and device for sweeping robot
CN112835359B (en) AVP control method and device based on visual SLAM technology
US20190114911A1 (en) Method and system for determining the location of a vehicle
CN110766962A (en) Intelligent vehicle searching method, device and system based on unmanned aerial vehicle and server
JP2008238383A (en) Robot
JP2019135579A (en) Mobile body control system, mobile body, and mobile body control method
JP2001277969A (en) Vehicle guiding method, vehicle guiding system and computer readable storage medium
US11402215B2 (en) Indoor positioning method for a moving apparatus using first and second two-dimensional maps of z-axis areas

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16851227

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16851227

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP