WO2021029016A1 - Guide display method - Google Patents

Guide display method

Info

Publication number
WO2021029016A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
snow
data
display
guide
Application number
PCT/JP2019/031863
Other languages
French (fr)
Japanese (ja)
Inventor
Daisuke Yamaoka
Kazuhiko Tanaka
Yu Takiguchi
Hitomi Hamamura
Original Assignee
Honda Motor Co., Ltd.
Application filed by Honda Motor Co., Ltd.
Priority to JP2021539747A (JP7212788B2)
Priority to PCT/JP2019/031863
Publication of WO2021029016A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments

Definitions

  • The present invention relates to a guide display.
  • Patent Document 1 describes an agricultural work machine 1 that is a self-propelled boom sprayer.
  • Patent Document 1 states that a captured image A2, taken by a camera 32 attached to the vehicle 10, is superimposed on a guide image A1 by the arithmetic unit 35b of the computer 35 and displayed as AR on the display 36. Because the guide image A1 and the captured image A2 of the actual field are superimposed to display a guide image A3, the worker can grasp the relationship between the information shown in the guide image A1 (for example, the guide line L, the traveling locus X, and the work area S) and the position of the crop C, structures, and the like.
  • In Patent Document 1, the guide image A1 is corrected based on the captured image A2 of a grid pattern drawn on the ground of the field, which is the moving region of the vehicle 10, and the corrected guide image A1 is superimposed on the captured image A2.
  • An object of the present invention is to provide a mechanism for appropriately displaying a guide for traveling in a state after snowfall.
  • The guide display method includes a first acquisition step of acquiring shooting data representing the post-snowfall state at a predetermined position; a second acquisition step of acquiring, based on the shooting data acquired in the first acquisition step, image data corresponding to the post-snowfall state represented by the shooting data from a storage means that stores image data representing the pre-snowfall state at the predetermined position; and a generation step of generating, based on the shooting data and the acquired image data, display data for displaying a guide for traveling in the post-snowfall state represented by the shooting data.
  • Alternatively, the guide display method is executed in a guide display system including a server and a vehicle capable of communicating with the server. The server performs a first receiving step of receiving, from the vehicle, shooting data representing the post-snowfall state at a predetermined position, and, based on the shooting data received in the first receiving step, acquires corresponding image data from a storage means that stores image data representing the pre-snowfall state at the predetermined position.
  • According to the present invention, a guide for traveling in the post-snowfall state can be displayed appropriately.
  • FIG. 1 is a diagram showing a configuration of a guide display system 100 according to the present embodiment.
  • The guide display system 100 is a system that displays, in AR (Augmented Reality), guide information indicating the travel path superimposed on an image photographed by a vehicle traveling on a snow surface, and provides it to the passengers of the vehicle.
  • the guide display system 100 includes a server 101, vehicles 104, 105, 106, and a portable terminal 107.
  • The server 101 is a server for providing the guide display service described below. Although the server 101 is shown as one device in FIG. 1, it may be composed of a plurality of devices.
  • That is, the server 101 in the present embodiment covers both a configuration in which one device provides the server function and a configuration in which a plurality of devices cooperate to provide the server function.
  • The "vehicle" here is not limited to a particular form, and includes a four-wheeled vehicle, a saddle-riding two-wheeled vehicle, and a work vehicle, as shown by the vehicles 104, 105, and 106 in FIG. 1.
  • the work vehicle is a vehicle for special purposes, such as a snowplow.
  • the passengers of the vehicles 104, 105, 106 can enjoy the guide display service.
  • A portable terminal 107 such as a smartphone can also be connected to the guide display system 100, and a person who carries the portable terminal 107 while moving over a snow surface can likewise use the guide display service of the guide display system 100.
  • the passengers of the vehicle include the driver of the vehicle and other occupants.
  • The wireless base station 103 is, for example, a wireless base station provided in an area where the server 101 can provide the guide display service, and can communicate with the vehicles 104, 105, 106 and the portable terminal 107. The server 101 is configured to communicate with the wireless base station 103 via the network 102. With such a configuration, the vehicles 104, 105, 106 and the portable terminal 107 can transmit vehicle information such as position information, or terminal information, to the server 101, and the server 101 can transmit display data to the vehicles 104, 105, 106 and the portable terminal 107.
  • the network 102 is shown as a single network, but a plurality of networks may be configured by a router or the like. Further, the server 101, the vehicles 104, 105, 106, and the portable terminal 107 can be connected to a network other than the network 102 shown in FIG. 1, and can be connected to, for example, the Internet.
  • The communication device 108 is, for example, a roadside device provided along the road, and can communicate with at least one of the vehicles 104, 105, 106 and the portable terminal 107 by DSRC (Dedicated Short Range Communication), road-to-vehicle communication, or the like.
  • the communication device 108 is used to transmit the vehicle information of the vehicle 104 to the server 101 and to transmit the road surface condition information (snow cover state, frozen state, etc.) to the server 101.
  • the guide display system 100 is not limited to the configuration shown in FIG. 1, and may include other devices as appropriate.
  • a database server for map information, traffic information, or weather information may be included, and the server 101 may be configured to have access to those database servers.
  • FIG. 2 is a diagram showing the configuration of the vehicle 106, which is a snowplow.
  • The vehicle 106 is a crawler-type snow removal vehicle that collects snow with an auger and discharges it through a shooter (chute). A part of the snow removal mechanism of the vehicle 106 will be described below.
  • the vehicle 106 has a snow removal unit 201, which includes an auger 203, a blower 204, a sled 210, and a shooter 205 driven by an engine 221.
  • The power of the engine 221 is transmitted through the small-diameter pulley 207, the belt 208, the large-diameter pulley 209, the drive shaft 206, and the auger shaft 202, in that order, to rotate the auger 203.
  • As the auger 203 rotates, it collects the snow on the road surface in the front-rear direction of the drawing and sends it to the blower 204, and the centrifugal force of the blower 204 discharges the snow outward through the shooter 205.
  • the vehicle 106 also has an engine cover 220, an engine air-cooled fan 222, an air-cooled fan shaft 223, and an output pulley connected to the drive wheels 217.
  • The vehicle 106 has a crawler frame 212 provided with an idler wheel 211 at the front and three lower wheels 213, 214, 215 at the bottom. A body frame 219 is connected to the rear part of the crawler frame 212, and the snow removal unit 201 is connected to the front part of the body frame 219.
  • The pivot shaft 218 connects the body frame 219 to the crawler frame 212.
  • A drive wheel 217 is arranged at the rear of the body frame 219, and a crawler belt 216 is wound between the drive wheel 217 and the idler wheel 211, whereby the vehicle 106 is configured as a crawler-type snow removal vehicle.
  • the vehicle 106 is equipped with a headlamp 225 and a sub headlamp 226 capable of illuminating the front.
  • the operation unit 224 is a human-machine interface capable of receiving instructions for controlling each unit related to the traveling mechanism, the snow removal mechanism, and the like of the vehicle 106.
  • the vehicle 106 further includes a control unit 229.
  • the control unit 229 acquires the detection signal from the detection units 227 and 228 for recognizing the external environment attached to the vehicle 106.
  • the detection units 227 and 228 are, for example, cameras and sensors.
  • The vehicle 106 may be a ride-on type operated by an on-board operator or a walk-behind type pushed by the operator. Further, the vehicle 106 may be capable of fully automatic driving with no operator present. In the ride-on type, a boarding unit (not shown) including a seat, controls such as a steering wheel and pedals, and the operation unit 224 is provided at the rear of the engine cover 220.
  • FIG. 3 is a block diagram showing the configuration of the vehicle 106, which is a snowplow.
  • the control unit 300 includes a main processor 301 and a memory 302.
  • the processor 301 reads the program stored in the storage unit 315 into the memory 302 and executes it, so that the control unit 300 controls the vehicle 106 in an integrated manner.
  • the control unit 300 corresponds to the control unit 229 in FIG.
  • The external world recognition camera 305 and the external world recognition sensor 306 of FIG. 3 correspond to the detection units 227 and 228 of FIG. 2.
  • The operation unit 314 of FIG. 3 corresponds to the operation unit 224 of FIG. 2.
  • The drive control unit 303 of FIG. 3 corresponds to the engine 221 of FIG. 2.
  • The traveling mechanism 310 and the snow removal mechanism 311 correspond to the respective parts of the traveling mechanism and the snow removal mechanism described with reference to FIG. 2.
  • The lighting unit 312 includes the headlamp 225 and the sub headlamp 226, and a notification mechanism such as a direction indicator (not shown).
  • the display control unit 304 controls the display on the display unit 313 based on the display data.
  • the storage unit 315 stores various parameters, data, etc. in addition to the above program.
  • the operation unit 314 includes a steering wheel and pedal for receiving a maneuvering operation from a passenger, an instrument panel, a panel (display unit 313) for displaying various user interface screens, and an input unit for receiving a setting operation.
  • the communication interface (I / F) 309 includes an antenna for communicating with the radio base station 103 and a signal format conversion unit.
  • the GPS sensor 307 is a GPS (Global Positioning System) sensor, and detects the current position of the vehicle 106.
  • the gyro sensor 308 detects the posture of the vehicle 106.
  • the posture of the vehicle 106 is, for example, a roll angle, a pitch angle, and a yaw angle.
  • the external world recognition camera 305 is a camera that images the external environment, and is attached so that it can image the front, side, and rear of the vehicle 106.
  • The external world recognition sensor 306 is, for example, an ultrasonic sensor, and can detect the state of the snow surface, for example that the snow surface is mounded, by detecting the reflected waves of ultrasonic waves emitted ahead of the vehicle 106. The external world recognition sensor 306 may also be an infrared sensor provided above the shooter 205, for example at the position of the detection unit 227 in FIG. 2, which can detect the snow depth ahead by projecting infrared rays onto the snow surface.
  • The mounting positions of the outside world recognition camera 305 and the outside world recognition sensor 306 are not limited to the positions of the detection units 227 and 228 in FIG. 2. For example, they may be on the side surface of the vehicle 106, near the headlamp 225, or near the tip of the shooter 205. By attaching a camera near the tip of the shooter 205, even when the snow depth is about 2 m it is possible to image the area ahead that is a blind spot for the passenger of the vehicle 106.
  • The configuration of the vehicle 106, which is a snowplow, has been described with reference to FIGS. 2 and 3. However, the present embodiment is not limited to the vehicle 106: any device provided with a configuration for recognizing the outside world (a camera or the like), a configuration for acquiring position and attitude information (a GPS sensor, a gyro sensor, or the like), a display unit, and a communication I/F may be used, such as a four-wheeled vehicle like the vehicle 104, a two-wheeled vehicle like the vehicle 105, or the portable terminal 107.
  • Although FIGS. 2 and 3 describe an example in which the vehicle 106 incorporates the configuration for recognizing the outside world, that configuration may be removable from the vehicle 106.
  • a smartphone equipped with a camera or GPS may be attached to the vehicle 106 by an attachment or the like.
  • the vehicle 106 is not limited to a snowplow as long as it is a vehicle for special purposes, and may be, for example, an agricultural work machine.
  • a configuration example of a four-wheeled vehicle will be described as a vehicle other than the vehicle 106.
  • FIG. 4 is a block diagram showing an example of the configuration of the vehicle 104, which is a four-wheeled vehicle.
  • the control unit 400 includes a main processor 401 and a memory 402.
  • the processor 401 reads the program stored in the storage unit into the memory 402 and executes it, so that the control unit 400 controls the vehicle 104 in an integrated manner.
  • the control unit 400 includes an outside world recognition unit 403, an action planning unit 404, a drive control unit 405, and a device control unit 406.
  • Each block is realized by one ECU or a plurality of ECUs.
  • the outside world recognition unit 403 recognizes the outside world information of the vehicle 104 based on the signals from the outside world recognition camera 407 and the outside world recognition sensor 408.
  • the external world recognition camera 407 is, for example, a camera that photographs the front of the vehicle 104, and is a camera that is attached to the vehicle interior side of the front window at the front of the roof.
  • The external world recognition sensor 408 is, for example, a LIDAR (Light Detection and Ranging) sensor or a millimeter-wave radar that detects targets around the vehicle 104 and measures the distance to them.
  • The outside world recognition unit 403 recognizes, for example, intersections, railroad crossings, tunnels, free spaces such as road shoulders, and the behavior (speed and traveling direction) of other vehicles, based on the signals from the outside world recognition camera 407 and the outside world recognition sensor 408.
  • the GPS sensor 409 detects the current position of the vehicle 104
  • the gyro sensor 410 detects the posture of the vehicle 104.
  • the posture of the vehicle 104 is, for example, a roll angle, a pitch angle, and a yaw angle.
  • the action planning unit 404 plans the actions of the vehicle 104 such as the optimum route and the risk avoidance route based on the position information detected by the external world recognition unit 403 and the GPS sensor 409.
  • the action planning unit 404 performs, for example, an approach determination based on the start point and the end point of an intersection or a railroad crossing, and an action plan based on the behavior prediction of another vehicle.
  • the drive control unit 405 controls the drive force output mechanism 412, the steering mechanism 413, and the brake mechanism 414 based on the action plan by the action planning unit 404.
  • the driving force output mechanism 412 is, for example, a power plant
  • the steering mechanism 413 is, for example, an electric power steering device
  • the brake mechanism 414 is, for example, a disc brake device.
  • the device control unit 406 controls the device connected to the control unit 400.
  • the device control unit 406 controls the speaker 415 to output a predetermined voice message such as a warning or a message for navigation.
  • the device control unit 406 controls the display device 416 to display various interface screens.
  • the device control unit 406 controls the navigation device 417 and acquires the setting information in the navigation device 417.
  • the communication interface (I / F) 411 includes an antenna for communicating with the radio base station 103 and a signal format conversion unit.
  • The control unit 400 may include functional blocks other than those shown in FIG. 4 as appropriate; for example, it may include an optimum route calculation unit that calculates the optimum route to a destination based on map information. The control unit 400 may also acquire outside world information from sources other than the outside world recognition camera 407 and the outside world recognition sensor 408, for example via another vehicle. Further, the control unit 400 receives detection signals not only from the GPS sensor 409 and the gyro sensor 410 but also from various sensors provided in the vehicle 104. For example, the control unit 400 receives the detection signals of a door open/close sensor and a door lock mechanism sensor provided in the door portion of the vehicle 104 via an ECU configured in the door portion. The control unit 400 can thereby detect the unlocking of a door and door opening/closing operations.
  • FIG. 5 is a block diagram showing the configuration of the server 101.
  • The control unit 500 includes a main processor 505, such as a CPU or GPU, and a memory 506, such as ROM and RAM, and controls the server 101 in an integrated manner.
  • the operation of the present embodiment is realized by the processor 505 reading the program stored in the storage unit 501 into the memory 506 and executing the program.
  • That is, the server 101 can be a computer that implements the present invention according to a program stored in a storage medium.
  • the pre-snow state acquisition unit 507 acquires pre-snow image data representing the pre-snow state at a predetermined position.
  • The pre-snow state acquisition unit 507 acquires, as the pre-snow image data, for example image data photographed by a user at the predetermined position before snow cover, or street view data photographed before snow cover by a drone or the like.
  • the pre-snow state acquisition unit 507 stores the acquired pre-snow image data in the storage unit 501 as pre-snow image data 514.
  • the pre-snow state acquisition unit 507 periodically acquires pre-snow image data and updates the pre-snow image data 514.
  • the shooting data acquisition unit 508 acquires shooting data shot by the vehicles 104, 105, 106, and the portable terminal 107.
  • The shooting data acquired by the shooting data acquisition unit 508 is obtained by photographing the snow cover state ahead in the traveling direction of the vehicles 104, 105, 106 or in the moving direction of the holder of the portable terminal 107.
  • the shooting data acquisition unit 508 stores the acquired shooting data in the storage unit 501 as shooting data 517.
  • the vehicle information acquisition unit 509 acquires vehicle information from the vehicles 104, 105, and 106.
  • the vehicle information is, for example, vehicle position information, vehicle behavior information, vehicle attitude information, and vehicle attribute information.
  • the position information of the vehicle is, for example, latitude and longitude information
  • the information regarding the behavior of the vehicle is, for example, speed and acceleration information, driving information of each part of the vehicle, and detection information such as a camera and a sensor.
  • the information regarding the posture of the vehicle is, for example, tilt information of the vehicle having a roll angle, a pitch angle, and a yaw angle.
  • The vehicle attribute information is, for example, information on the size (small, medium, or large) of the snowplow vehicle 106.
  • the vehicle information acquisition unit 509 acquires terminal information from the portable terminal 107.
  • the terminal information includes, for example, position information of the portable terminal and information regarding the behavior of the portable terminal.
  • the position information of the portable terminal is, for example, latitude and longitude information
  • the information regarding the behavior of the portable terminal is, for example, speed and acceleration information, detection information of a camera, a sensor, and the like.
  • Hereinafter, the vehicle 106 will be described as a representative example of the vehicles 104, 105, 106 and the portable terminal 107; accordingly, the vehicle information in the description can be read as terminal information where appropriate.
  • the vehicle information acquisition unit 509 stores the acquired vehicle information in the storage unit 501 as vehicle information 518 (or terminal information).
  • the image recognition unit 510 analyzes the shooting data acquired by the shooting data acquisition unit 508 and performs image recognition.
  • the image recognition unit 510 extracts, for example, an object from the image represented by the shooting data. The extraction of objects by the image recognition unit 510 will be described later.
  • the AR object generation unit 511 generates an AR object based on the pre-snow image data acquired by the pre-snow state acquisition unit 507. The generation of the AR object by the AR object generation unit 511 will be described later.
  • the display data generation unit 512 generates display data for AR display based on the shooting data acquired by the shooting data acquisition unit 508 and the AR object generated by the AR object generation unit 511. The generation of display data by the display data generation unit 512 will be described later.
  • the environmental information acquisition unit 513 acquires environmental information regarding the environment of the vehicle's driving path.
  • the environmental information is, for example, information on snow cover, information on moving objects (pedestrians, etc.) around the vehicle's travel path, slope information on the travel path, and width information.
  • the environmental information acquisition unit 513 acquires environmental information from, for example, a communication device 108 installed near the road, a road surface condition sensor, or an external database server.
  • the environmental information acquisition unit 513 may acquire information on snow cover from the vehicle 106.
  • the information on snow cover is, for example, information on snow depth and snow quality (hardness, etc.).
  • The environmental information acquisition unit 513 stores the acquired environmental information in the storage unit 501 as the environmental information 515.
  • the environmental information acquisition unit 513 periodically acquires the environmental information and updates the environmental information 515.
  • The storage unit 501 stores programs, various parameters, data, and the like. As described above, the storage unit 501 stores the pre-snow image data 514, the environmental information 515, the shooting data 517, and the vehicle information 518. Since the shooting data 517 and the vehicle information 518 are acquired from each vehicle, they are stored in association with each other as per-vehicle information 516. The per-vehicle information 516 also includes snow removal amount information 519, which will be described later.
  • the display unit 502 is, for example, a display and displays various user interface screens.
  • the operation unit 503 is, for example, a keyboard or a pointing device, and accepts user operations.
  • the communication interface (I / F) 504 is an interface for enabling communication with the network 102, and has a configuration according to the medium of the network 102.
  • Consider a case where the four-wheeled vehicle 104 travels on a snow-covered road, or the snowplow 106 removes snow from a snow-covered road.
  • When the snow depth is several centimeters to several tens of centimeters, the road is completely covered with snow, and it becomes difficult to recognize the boundary between the road and the area outside it.
  • Beneath the snow, a gutter, an irrigation canal, a curb, a flower bed, planted trees, or the like may exist. In such cases, the occupants of the vehicle may unintentionally drive over them.
  • Moreover, in such a state even the outside world recognition camera 407 and the outside world recognition sensor 408 of the four-wheeled vehicle may be unable to recognize the boundary between the travel path and the non-travel area.
  • Therefore, in the present embodiment, an AR object is generated from pre-snow image data photographed in advance by the user or obtained from Street View or the like for the place corresponding to the vehicle's traveling position, and the AR object is superimposed on the shooting data photographed by the vehicle for AR display.
  • the vehicle 106 will be described as a representative example of the vehicles 104, 105, 106, and the portable terminal 107.
  • FIG. 6 is a diagram for explaining the generation of AR display data in the present embodiment.
  • the shooting data 601 is shooting data taken by the vehicle 106.
  • FIG. 16 is a diagram showing an example of shooting data 601.
  • The shooting data 601 is obtained by photographing the area ahead of the vehicle 106; as shown in FIG. 16, the entire view is covered with snow and the travel path cannot be recognized.
  • The pre-snow image data 602 is data stored in the storage unit 501 of the server 101, and is, for example, street view data taken before snow cover at the position where the shooting data 601 was photographed.
  • FIG. 17 is a diagram showing an example of the pre-snow image data 602. As shown in FIG. 17, the pre-snow image data 602 shows the same location as the shooting data 601 of FIG. 16 before snow cover, and the travel path and the gutter can be recognized.
  • the AR object 603 is generated from the pre-snow cover image data 602, and the generated AR object 603 is superimposed on the shooting data 601 to generate the AR display data 604.
  • FIG. 18 is a diagram showing an example of the AR object 603
  • FIG. 19 is a diagram showing an example of the AR display data 604. As shown in FIG. 19, in the AR display data 604 the AR object 603 of FIG. 18 is superimposed on the shooting data 601 of FIG. 16. The AR display data 604 of FIG. 19 is then displayed on the display unit 313 of the vehicle 106.
  • As a result, the passenger of the vehicle 106 can easily recognize which part of the snow surface is the travel path and which is not. A minimal sketch of this superimposition follows.
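As an illustration of the superimposition described above, the following Python sketch alpha-blends an AR object layer onto a captured frame. The function name and the RGBA-layer convention are assumptions made for illustration; the patent does not prescribe any particular implementation.

```python
import numpy as np

def blend_ar(frame: np.ndarray, ar_layer: np.ndarray) -> np.ndarray:
    """Superimpose an RGBA AR object layer onto an RGB camera frame.

    frame:    H x W x 3 uint8 image (cf. shooting data 601).
    ar_layer: H x W x 4 uint8 image (cf. AR object 603); the alpha channel
              marks where guide objects such as 1001-1005 are drawn.
    Returns the AR display data (cf. 604) as an H x W x 3 uint8 image.
    """
    alpha = ar_layer[..., 3:4].astype(np.float32) / 255.0
    blended = (frame.astype(np.float32) * (1.0 - alpha)
               + ar_layer[..., :3].astype(np.float32) * alpha)
    return blended.astype(np.uint8)

# A fully transparent layer leaves the captured frame unchanged.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
layer = np.zeros((480, 640, 4), dtype=np.uint8)
assert np.array_equal(blend_ar(frame, layer), frame)
```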
  • FIG. 7 is a flowchart showing a process of requesting the server 101 to display AR, which is executed in the vehicle 106.
  • the process of FIG. 7 is realized, for example, by the processor 301 loading the program stored in the storage unit 315 into the memory 302 and executing the program.
  • First, the control unit 300 determines whether or not an instruction to perform AR display has been received. This may be determined, for example, by whether a menu item (such as an "AR display" item) on the user interface screen displayed on the display unit 313 of the vehicle 106 has been selected via the operation unit 314.
  • The above user interface screen is, for example, a screen displayed by starting an application provided by the server 101. If it is determined that an instruction to perform AR display has been received, the process proceeds to S102; otherwise, the process of FIG. 7 ends.
  • In S102, the control unit 300 acquires shooting data of the area ahead of the vehicle 106 in the traveling direction. The shooting data may be still image data or moving image data.
  • In S103, the control unit 300 transmits a request for AR display data to the server 101 together with the identification information of the vehicle 106.
  • The control unit 300 also transmits the vehicle information and the shooting data acquired in S102 to the server 101 together with the identification information of the vehicle 106.
  • the vehicle information is, for example, position information detected by the GPS sensor 307.
  • the vehicle information may include other information, for example, information on the posture and behavior of the vehicle 106 detected by the gyro sensor 308, and information on snow cover detected by the external world recognition sensor 306.
  • The control unit 300 then receives the AR display data from the server 101, and in S106 performs AR display based on the received AR display data.
  • Next, the control unit 300 determines whether or not to continue the AR display. For example, when the control unit 300 detects that the vehicle 106 has stopped and that an instruction to end the AR display has been received, it determines that the AR display is not to be continued. When the AR display is to be continued, the process is repeated from S102; otherwise, the process of FIG. 7 ends. This vehicle-side flow is sketched below.
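The vehicle-side flow of FIG. 7 can be summarized in the following minimal sketch. The `vehicle` and `server` objects are hypothetical interfaces standing in for the control unit 300, the sensors, the display unit 313, and the communication I/F 309; none of the method names come from the patent, and the numbering of the first step is assumed.

```python
import time

def run_ar_display(vehicle, server, interval_s: float = 0.2) -> None:
    """Vehicle-side loop corresponding to FIG. 7 (sketch under assumed APIs)."""
    if not vehicle.ar_display_requested():            # S101 (assumed): instruction?
        return
    while True:
        shot = vehicle.capture_front_image()          # S102: still or video frame
        server.request_ar_display(vehicle.id)         # S103: request AR display data
        server.send(vehicle.id,                       # vehicle info + shooting data
                    vehicle_info=vehicle.gps_and_attitude(),
                    shooting_data=shot)
        display_data = server.receive_ar_display()    # AR display data (or notice)
        vehicle.display(display_data)                 # S106: AR display on unit 313
        if vehicle.stopped() and vehicle.end_requested():
            break                                     # do not continue (end of FIG. 7)
        time.sleep(interval_s)                        # repeat from S102
```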
  • FIG. 8 is a flowchart showing a process of generating AR display data executed on the server 101.
  • the process of FIG. 8 is realized, for example, by the processor 505 loading the program stored in the storage unit 501 into the memory 506 and executing the program.
  • First, in S201, the control unit 500 determines whether or not a request for AR display data has been received. If it is determined that a request has been received, the process proceeds to S202; if not, the process of S201 is repeated.
  • In S202, the control unit 500 acquires the vehicle information and stores it as the vehicle information 518 in the storage unit 501 together with the identification information of the vehicle 106.
  • In S203, the control unit 500 acquires the shooting data and stores it as the shooting data 517 in the storage unit 501 together with the identification information of the vehicle 106.
  • In S204, the control unit 500 acquires the pre-snow image data.
  • FIG. 9 is a flowchart showing a process of acquiring pre-snow image data of S204.
  • First, the control unit 500 performs image recognition on the shooting data acquired in S203, and in S302 extracts objects.
  • By the image recognition, a plurality of objects such as trees, buildings, and utility poles can be recognized.
  • In S302, the control unit 500 extracts at least two objects from the image of the shooting data. For example, in the case of the shooting data of FIG. 16, the object 801 and the object 802 are extracted. Extracting two objects makes it possible, as described later, to detect a directional deviation between the shooting data and the pre-snow image data.
  • In S303, the control unit 500 determines whether or not pre-snow image data including the objects extracted in S302 exists. Specifically, among the pre-snow image data 514 stored in the storage unit 501, the control unit 500 examines the pre-snow image data corresponding to the position of the vehicle 106 and determines whether data including the extracted objects exists. If such pre-snow image data exists, the control unit 500 acquires it in S306; after S306, the process of FIG. 9 ends and the process proceeds to S205 of FIG. 8. If no such pre-snow image data exists, the process proceeds to S304.
  • For example, the pre-snow image data of FIG. 17 includes the objects 901 and 902 corresponding to the objects 801 and 802 extracted from the shooting data of FIG. 16. In that case, it is determined that pre-snow image data including the objects extracted in S302 exists.
  • In S304, the control unit 500 determines whether the determination of S303 has been completed for all the objects extracted by image recognition of the shooting data. If not, the process returns to S302 and other objects are extracted. On the other hand, when the determination has been completed for all extracted objects without a match, the AR object of S205 in FIG. 8 cannot be generated, so in S305 notification message data indicating that AR display is not possible is transmitted to the control unit 300 of the vehicle 106. After S305, the processes of FIGS. 9 and 8 end. The control unit 300 of the vehicle 106 that received the notification message data displays the notification message on the display unit 313.
  • the notification message data may be display data or voice output data.
  • By the process of FIG. 9, the pre-snow image data corresponding to the position of the vehicle 106 is acquired, as sketched below.
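The matching procedure of FIG. 9 can be sketched as follows. `extract_objects` is a hypothetical recognizer returning a set of object labels, and `pre_snow_db` is a list of (label set, image) pairs already filtered to the vicinity of the vehicle position; neither name comes from the patent.

```python
def find_pre_snow_image(extract_objects, pre_snow_db, shooting_data):
    """Sketch of FIG. 9: look for stored pre-snow image data containing at
    least two objects (trees, buildings, utility poles, ...) that were also
    recognized in the current shooting data (cf. objects 801/802 vs 901/902).
    Returns the matching pre-snow image, or None when AR display is impossible.
    """
    recognized = extract_objects(shooting_data)   # image recognition
    for pair in _object_pairs(recognized):        # S302: pick two objects
        for labels, image in pre_snow_db:         # S303: data containing them?
            if pair <= labels:
                return image                      # S306: acquire the image
    return None                                   # all pairs tried (S304) -> S305

def _object_pairs(labels):
    """Yield every unordered pair of recognized object labels."""
    items = sorted(labels)
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            yield {items[i], items[j]}
```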
  • the control unit 500 generates an AR object based on the pre-snow image data acquired in S204.
  • FIG. 10 is a flowchart showing a process of generating the AR object of S205.
  • First, the control unit 500 compares the shooting data acquired in S203 with the pre-snow image data acquired in S306.
  • In S402, as a result of the comparison, the control unit 500 determines whether or not there is a directional deviation between the shooting data acquired in S203 and the pre-snow image data acquired in S306, caused by a difference in shooting position and shooting angle.
  • The deviation in direction is determined as follows. As shown in FIG. 11, when two objects 701 and 702 are photographed from direction A, suppose the distance between the objects in the captured image is the distance 703. When photographing from direction B, rotated by an angle θ1, the distance between the objects in the captured image is the distance 704, and when photographing from direction C, rotated by an angle θ2, it is the distance 705. As shown in FIG. 11, as the rotation angle increases, the distance between the objects in the captured image becomes shorter. In other words, there is a correlation between the directional deviation and the inter-object distance in the captured image.
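The patent states only that the inter-object distance in the captured image shrinks as the viewing direction rotates. As a hedged illustration, if one assumes two landmarks at comparable depth and a pure rotation of the viewing direction, the projected separation falls roughly as the cosine of the rotation angle, so the deviation can be estimated from the distance ratio:

```python
import math

def estimate_rotation_deg(d_ref: float, d_obs: float) -> float:
    """Estimate the directional deviation between two views (sketch).

    Assumption (not stated in the patent): the projected distance between two
    landmarks scales roughly with cos(theta) under a rotation by theta, so the
    angle can be recovered from the ratio of observed to reference distance.

    d_ref: inter-object distance 703 seen from direction A (pre-snow image).
    d_obs: inter-object distance (704 or 705) seen from the current direction.
    """
    ratio = max(-1.0, min(1.0, d_obs / d_ref))
    return math.degrees(math.acos(ratio))

print(estimate_rotation_deg(100.0, 100.0))  # 0.0 -> no deviation
print(estimate_rotation_deg(100.0, 86.6))   # ~30 -> view rotated by theta1
```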
  • In S403, the control unit 500 corrects the pre-snow image data according to the directional deviation.
  • The correction here matches the pre-snow image data acquired in S306 to the direction of the shooting data acquired in S203, that is, eliminates the deviation determined in S402; for example, it is a rotation by the angle θ1 in FIG. 11. After the rotation is performed based on the shooting coordinates of the pre-snow image data and the position of the vehicle 106, a translation along the shooting direction (for example, direction B), moving closer to or farther from the objects 701 and 702, may also be performed.
  • In S404, the control unit 500 acquires the AR display mode specified by the passenger of the vehicle 106.
  • The AR display mode is information indicating whether or not to display the additional objects described later.
  • In the present embodiment, the AR display that makes the travel path and the non-travel area distinguishable, as shown in FIG. 19, is defined as the standard AR display mode.
  • In addition, display modes that warn of the inclination of the vehicle 106, the presence of a moving body such as a person, and the like are possible.
  • The selection information for these display modes is transmitted, for example, together with the request for AR display data in S103 of FIG. 7. Here, the description assumes that only the standard AR display mode is selected.
  • Next, the control unit 500 generates an AR object for travel path identification using the image recognition result of the image recognition unit 510.
  • FIG. 18 is a diagram showing an example of an AR object.
  • the image recognition unit 510 of the server 101 performs image recognition on the pre-snow image data 602 of FIG. 17 and extracts an object.
  • The objects extracted here are objects for recognizing the boundary between the travel path and the non-travel area, such as a gutter, an irrigation canal, a curb, a flower bed, or planted trees.
  • These objects to be image-recognized are learned in advance on the server 101 by deep learning or the like, and the image recognition unit 510 extracts them from the pre-snow image data 602 using the trained model.
  • For example, as a result of the image recognition of the pre-snow image data 602 of FIG. 17, the image recognition unit 510 recognizes the gutter 905, the bollard 903, and the road edge 904. Then, the AR objects 1001, 1002, 1003, 1004, 1005 of FIG. 18 are generated.
  • The AR object 1001 corresponds to the gutter 905, and the AR object 1002 corresponds to the road edge 904.
  • the AR object 1003 corresponds to the bollard 903.
  • The AR object 1004 is a warning mark indicating that the vehicle cannot travel (cannot enter), and is arranged near the AR objects 1001, 1003, and 1002 corresponding to the gutter 905, the bollard 903, and the road edge 904.
  • the AR object 1005 is an AR object for representing a travelable area.
  • Although the AR object 1005 has the shape of an arrow pointing in the traveling direction, it may have any shape as long as it indicates that travel is possible, and is not limited to the shape shown in FIG. 18.
  • The AR object 1005 may be displayed dynamically, for example so that its colors blink sequentially toward the far side of the screen.
  • In S406, the control unit 500 determines whether or not to generate an additional AR object based on the AR display mode acquired in S404. If an additional AR object is to be generated, the process proceeds to S407 and the additional AR object is generated; the processing of S407 will be described later. If no additional AR object is to be generated, the process of FIG. 10 ends and the process proceeds to S206 of FIG. 8. When only the standard AR display mode is selected, as assumed here, it is determined in S406 that no additional AR object is generated.
  • In S206 of FIG. 8, the control unit 500 generates the AR display data. As described with reference to FIG. 6, the control unit 500 generates the AR display data by superimposing the AR object generated in S205 on the shooting data acquired in S203.
  • In S207, the control unit 500 transmits the AR display data generated in S206 to the vehicle 106 via the communication I/F 504.
  • The control unit 300 of the vehicle 106 causes the display unit 313 to display the AR display screen based on the received AR display data.
  • The process of FIG. 8 then ends. The overall server-side flow is sketched below.
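Putting FIGS. 8 to 10 together, the server-side handling of one request can be sketched as follows; the `server` object bundles hypothetical accessors for the storage unit 501 and the units 507-513, and none of the method names come from the patent.

```python
def handle_ar_request(server, vehicle_id):
    """Server-side sketch of FIG. 8 (S201-S207) under assumed interfaces."""
    vehicle_info = server.receive_vehicle_info(vehicle_id)     # S202
    shooting = server.receive_shooting_data(vehicle_id)        # S203
    server.store_per_vehicle(vehicle_id, vehicle_info, shooting)
    pre_snow = server.acquire_pre_snow_image(                  # S204 (FIG. 9)
        shooting, vehicle_info["position"])
    if pre_snow is None:
        server.notify_ar_unavailable(vehicle_id)               # S305 notification
        return
    ar_object = server.generate_ar_object(shooting, pre_snow)  # S205 (FIG. 10)
    display_data = server.superimpose(ar_object, shooting)     # S206
    server.send_display_data(vehicle_id, display_data)         # S207
```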
  • A description of an AR object may be displayed in response to an operation by the passenger of the vehicle 106, such as aligning a cursor with or clicking the AR object on the screen. For example, aligning the cursor with the AR object 1003 may display a balloon such as "This is a bollard. It is a dead end, so you cannot pass." The descriptions may also be presented by another method, for example in the user interface screen of the application provided in advance by the server 101.
  • FIG. 12 is a flowchart showing the process of generating the additional AR object of S407. The determinations of S501, S503, S505, and S507 described below are performed based on, for example, the selection state of menu items on the user interface screen at the time the instruction to perform AR display is received. For example, on the user interface screen, a submenu item called "additional AR display" is provided under the menu item "AR display", and detailed items are displayed so that the necessity of each of the displays "warning", "work line", "snow cover information", and "snow removal amount" can be selected. The selection information is received, for example, in S201 together with the request for AR display data.
  • In S501, the control unit 500 determines whether or not to display a warning object based on the above selection information. If the warning object is to be displayed, the process proceeds to S502 and a warning object is generated. If not, no warning object is generated and the process proceeds to S503.
  • One warning object is an AR object for giving a warning notification by AR display when the inclination of the vehicle 106 becomes equal to or greater than a predetermined value.
  • Depending on the snow depth and the snow quality, the vehicle may tilt, and a large tilt may cause the vehicle to tip over. Therefore, in the present embodiment, the control unit 500 monitors the tilt state of the vehicle 106 and, when it becomes equal to or greater than a predetermined value, causes the vehicle 106 to display a warning object.
  • the control unit 500 monitors the inclination of the vehicle 106 based on the vehicle information of the vehicle 106. For example, the control unit 500 acquires information on the posture from the vehicle 106 at predetermined time intervals. The process of FIG. 12 proceeds from S502 to S503 in parallel with the monitoring of the inclination of the vehicle 106 being performed.
  • FIG. 13 is a flowchart showing the process of monitoring the inclination of the vehicle 106.
  • In S601, the control unit 500 acquires information on the inclination (attitude) from the vehicle 106. Then, in S602, it determines whether or not the acquired inclination information, for example the roll angle or the pitch angle, is equal to or greater than a threshold value. If it is below the threshold, the process of S601 is repeated. If it is equal to or greater than the threshold, the control unit 500 generates a warning object in S603.
  • FIG. 22 is a diagram showing an example of a warning object that warns of the inclination of the vehicle 106.
  • In the example of FIG. 22, the warning object 1301 is displayed.
  • Although the warning object 1301 is dynamically displayed so as to tilt repeatedly in the direction of the arrow, the display form is not limited to this as long as it warns of the tilt of the vehicle 106.
  • Other display forms such as blinking display and text display may be used.
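A minimal sketch of the threshold check of FIG. 13 follows; the numeric limits are assumptions, since the patent says only "equal to or greater than a predetermined value".

```python
ROLL_LIMIT_DEG = 15.0   # assumed threshold
PITCH_LIMIT_DEG = 20.0  # assumed threshold

def needs_tilt_warning(roll_deg: float, pitch_deg: float) -> bool:
    """S601-S603 sketch: compare the attitude reported by the gyro sensor
    against thresholds and decide whether to generate warning object 1301."""
    return abs(roll_deg) >= ROLL_LIMIT_DEG or abs(pitch_deg) >= PITCH_LIMIT_DEG

assert not needs_tilt_warning(3.0, 5.0)   # gentle tilt: keep monitoring (S601)
assert needs_tilt_warning(18.0, 5.0)      # excessive roll: warn (S603)
```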
  • Another warning object is an AR object for giving a warning when a moving body such as a person is present at a position that is a blind spot for the passenger of the vehicle 106.
  • For example, suppose the snow depth is several meters; even if a person is on the other side of the wall of snow, the person may not be visible to the passenger of the vehicle 106. If the passenger removes snow without being aware of such a person, the snow discharged from the shooter 205 may fall on the person's head. Therefore, in the present embodiment, when it is estimated that a moving body exists in the passenger's blind spot, the control unit 500 displays a warning object to that effect in AR.
  • FIG. 14 is a flowchart showing a process of generating the warning object of S502.
  • In S701, the control unit 500 acquires environmental information around the position of the vehicle 106.
  • the environmental information is, for example, information regarding the existence of a moving body around the position of the vehicle 106.
  • the moving body is, for example, a pedestrian, a bicycle, or a vehicle.
  • The environmental information may be acquired from a communication device 108 such as a roadside unit, or from an external database server.
  • The control unit 500 may also acquire information on the existence of a moving body from the vehicle information.
  • For example, the control unit 500 may acquire image data from a camera attached to the vehicle 106 near the tip of the shooter 205.
  • Next, the control unit 500 determines whether or not a moving body is recognized, based on the environmental information acquired in S701. If a moving body is recognized, the process proceeds to S703. If no moving body is recognized, no warning object needs to be generated, so the process of FIG. 14 ends and the process proceeds to S503 of FIG. 12.
  • In S703, the control unit 500 performs image recognition on the shooting data acquired in S203, and in S704 determines whether or not the moving body is recognized in the shooting data.
  • If the moving body is recognized in the shooting data, a warning object is generated in S705.
  • The case of proceeding to S705 is, for example, a state in which the environmental information indicates that a pedestrian exists ahead and to the right in the traveling direction of the vehicle 106, and the pedestrian is also recognized in the shooting data of the vehicle 106. In S705, therefore, the control unit 500 generates a warning object that makes the pedestrian easy for the passenger to identify, such as a quadrangle surrounding the pedestrian on the screen.
  • If the moving body is not recognized in the shooting data, a warning object is generated in S706.
  • The case of proceeding to S706 is, for example, a state in which the environmental information indicates that a pedestrian exists ahead and to the right in the traveling direction of the vehicle 106, but the pedestrian is not recognized in the shooting data of the vehicle 106. This can occur when, as shown in FIG. 23, the snow depth ahead and to the right of the vehicle 106 is several meters and the pedestrian on the other side of the snow is invisible to the passenger. In S706, therefore, a warning object such as the warning object 1401 of FIG. 23 is generated, and a message such as "There is a person on the other side of the snow!" is displayed in its vicinity.
  • Such a warning display allows the passenger to recognize a moving body present in a blind spot of the passenger of the vehicle 106. The warning object 1401 may be displayed dynamically, by blinking or the like, to enhance the effect. The decision logic is sketched below.
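The branch between S705 and S706 can be captured in a small sketch; the return labels are hypothetical names for the two warning object styles described above.

```python
from typing import Optional

def moving_body_warning(env_reports_body: bool,
                        visible_in_shot: bool) -> Optional[str]:
    """Sketch of the decisions in FIG. 14.

    env_reports_body: a moving body (pedestrian, bicycle, vehicle) is reported
                      near the vehicle by the environmental information.
    visible_in_shot:  the body is also recognized in the shooting data (S704).
    """
    if not env_reports_body:
        return None                  # no warning object needed
    if visible_in_shot:
        return "highlight"           # S705: e.g. frame the pedestrian on screen
    return "blind_spot_message"      # S706: e.g. warning object 1401 + message

assert moving_body_warning(True, False) == "blind_spot_message"
assert moving_body_warning(False, True) is None
```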
  • In S503, the control unit 500 determines whether or not to display a work line based on the selection information. If the work line is to be displayed, the process proceeds to S504 and a work line object is generated. If not, no work line object is generated and the process proceeds to S505.
  • The work line object represents the snow removal work line of the vehicle 106, which is a snowplow.
  • For example, when the vehicle 106 is a small snowplow and the travel path is wide, the work is performed while reciprocating over a certain distance.
  • Therefore, in the present embodiment, the control unit 500 displays the work line in AR, as with the work line object 1101 of FIG. 20.
  • In S504, the control unit 500 generates the work line object according to the vehicle information acquired from the vehicle 106 and the environmental information. For example, a medium-sized snowplow and a small snowplow require different numbers of round trips for the same road width. Therefore, the control unit 500 determines the number of round trips based on the size of the snowplow obtained from the vehicle information and the travel path width obtained from the environmental information, and generates the work line object accordingly (a minimal sketch follows below). For example, the work line object 1101 of FIG. 20 guides the user out of the travel path after three round trips, so the passenger of the vehicle 106 can easily recognize a route that leaves the travel path through the portion without the gutter 905.
  • The display of the work line object 1101 may be changed according to the traveling of the vehicle 106. For example, when the vehicle 106 travels on a line that deviates greatly from the work line of the work line object 1101, the work line object 1101 may blink red and a message such as "off the line" may be displayed. After S504, the process proceeds to S505.
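The round-trip count of S504 can be illustrated with a hedged sketch; the clearing widths per size class are assumed values, since the patent states only that the number of round trips depends on the snowplow size and the travel path width.

```python
import math

def round_trips(path_width_m: float, clearing_width_m: float) -> int:
    """Sketch for S504: passes needed to clear the travel path width
    (environmental information) with a machine of the given clearing width
    (derived from the size class in the vehicle information)."""
    return math.ceil(path_width_m / clearing_width_m)

CLEARING_WIDTH_M = {"small": 0.6, "medium": 1.0, "large": 1.5}  # assumed

print(round_trips(3.0, CLEARING_WIDTH_M["medium"]))  # 3, cf. the three round
                                                     # trips of FIG. 20
print(round_trips(3.0, CLEARING_WIDTH_M["small"]))   # 5: smaller machine,
                                                     # more passes
```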
  • In S505, the control unit 500 determines whether or not to display information on the snow cover based on the selection information. If the information is to be displayed, the process proceeds to S506 and a snow information object is generated. If not, no snow information object is generated and the process proceeds to S507.
  • The information on the snow cover is, for example, snow quality, such as the density of the snow.
  • In the present embodiment, in addition to displaying the travel path and the non-travel area in an identifiable manner, the control unit 500 displays differences in snow quality in AR, as with the snow information objects 1201, 1202, and 1203 shown in FIG. 21.
  • In S506, the control unit 500 generates the snow information objects 1201, 1202, 1203 according to the snow quality information. For example, the snow information objects 1201, 1202, and 1203 indicate that the snow density (hardness) increases in that order.
  • the snow quality information may be acquired from the vehicle information (sensor information) from the vehicle 106 or from the environmental information.
  • the display as shown in FIG. 21 makes it possible for the passengers of the vehicle 106 to easily recognize that the snow quality becomes harder as the distance from the traveling path increases.
  • Although the object 1005 is omitted from FIG. 21 for ease of explanation, it may be displayed together with the snow information objects.
  • The information on the snow cover is not limited to snow quality; it may be other information, for example snow depth.
  • In that case, the control unit 500 displays the snow depth information in AR, in addition to displaying the travel path and the non-travel area in a distinguishable manner.
  • For example, a numerical value such as "1 m" may be displayed beside an object recognized as a mound of snow.
  • Alternatively, contour lines (a 50 cm line, a 1 m line, and so on) may be superimposed on the object recognized as the snow mound.
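Returning to the snow quality display, a sketch of how a quality reading could be binned into the three levels of the objects 1201-1203 follows; the density bins are assumptions for illustration, since the patent does not quantify snow quality.

```python
def snow_quality_level(density_kg_m3: float) -> int:
    """Map a snow density reading (from vehicle sensors or environmental
    information) to the display levels of the snow information objects
    1201 (softest) to 1203 (hardest). Bin edges are assumed values."""
    if density_kg_m3 < 200:
        return 1201   # fresh, soft snow
    if density_kg_m3 < 400:
        return 1202   # settled snow
    return 1203       # hard, dense snow

assert snow_quality_level(150) == 1201
assert snow_quality_level(300) == 1202
assert snow_quality_level(450) == 1203
```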
  • In S507, the control unit 500 determines whether or not to display the snow removal amount based on the selection information. If the snow removal amount is to be displayed, the process proceeds to S508 and a snow removal amount object is generated. If not, no snow removal amount object is generated, the process of FIG. 12 ends, and the process proceeds to S206 of FIG. 8.
  • FIG. 15 is a flowchart showing an example of a process for generating a snow removal amount object.
  • In S801, the control unit 500 determines whether or not snow removal is being performed, based on the vehicle information from the vehicle 106. If snow is being removed, the process proceeds to S802. If not, for example when the vehicle 106 is stopped, the processes of FIGS. 15 and 12 end and the process proceeds to S206 of FIG. 8.
  • In S802, the control unit 500 acquires snow removal amount information.
  • For example, the control unit 500 may acquire the engine torque from the vehicle information of the vehicle 106 as the snow removal amount information.
  • In S803, the control unit 500 estimates the snow removal amount based on the acquired engine torque.
  • In S804, the control unit 500 generates a snow removal amount object according to the snow removal amount estimated in S803.
  • The snow removal amount object is an object indicating a numerical value, for example a score-like display such as "10 t/h" shown at the edge of the screen. Such a display can give the passenger a game-like experience, and can also be used to convert the snow removal work into income.
  • After S804, the process returns to S801 and is repeated. In parallel with the execution of S801 to S804, the processes from S206 onward in FIG. 8 are executed.
  • The control unit 500 may store the snow removal amount information acquired in S802 in the storage unit 501 as the snow removal amount information 519 and use it for maintenance notifications. For example, based on the accumulated snow removal amount information 519, the control unit 500 may notify the vehicle 106 that it is time to replace consumables such as the battery.
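The estimation of S802-S804 can be sketched as follows; the linear torque-to-throughput coefficient is an assumption, since the patent says only that the snow removal amount is estimated from the engine torque.

```python
TONS_PER_HOUR_PER_NM = 0.12  # assumed linear coefficient, for illustration only

def snow_removal_display(torque_nm: float) -> str:
    """Sketch of S802-S804 in FIG. 15: turn an engine torque reading from the
    vehicle information into the score-like text of the snow removal amount
    object, e.g. "10 t/h" shown at the edge of the screen."""
    tons_per_hour = torque_nm * TONS_PER_HOUR_PER_NM
    return f"{tons_per_hour:.0f} t/h"

print(snow_removal_display(83.0))  # -> "10 t/h"
```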
  • As described above, according to the present embodiment, when the vehicle travels on a snow surface, the passenger can easily recognize the travel path and the non-travel area.
  • Although FIG. 1 describes a mode in which the server 101 is configured as a device separate from the vehicles 104, 105, 106 and the portable terminal 107, at least a part of the functions of the server 101 may be configured in the vehicles 104, 105, 106 or the portable terminal 107; this can improve the real-time performance from the acquisition of the shooting data to the generation of the AR display data and the AR display.
  • The guide display method of the above embodiment includes a first acquisition step (S203) of acquiring shooting data representing the post-snowfall state at a predetermined position; a second acquisition step (S204) of acquiring, based on the shooting data acquired in the first acquisition step, image data corresponding to the post-snowfall state represented by the shooting data from a first storage means that stores image data representing the pre-snowfall state at the predetermined position; and a generation step (S205) of generating, based on the shooting data acquired in the first acquisition step and the image data acquired in the second acquisition step, display data for displaying a guide for traveling in the post-snowfall state represented by the shooting data.
  • The guide includes an object representing the travel path and an object representing the non-travel area. Further, the objects representing the non-travel area include an object representing a gutter.
  • With this configuration, the passenger can easily recognize the gutter.
  • In the second acquisition step, the image data including the objects recognized in the shooting data acquired in the first acquisition step is acquired from the first storage means (FIG. 9).
  • The guide display method further includes a determination step (S402) of determining whether or not there is a directional deviation between the shooting data acquired in the first acquisition step and the image data acquired from the first storage means, and a correction step (S403) of correcting the image data acquired from the first storage means so as to eliminate the directional deviation when the determination step determines that there is a directional deviation; the second acquisition step acquires the image data corrected in the correction step.
  • The guide display method further includes a display control step (S206, S207) of causing a display device to display, based on the display data generated in the generation step, a screen in which the guide is superimposed on the post-snowfall state. Further, the display device (313) is configured in a moving body.
  • The guide display method further includes a third acquisition step (FIG. 13) of acquiring information on the inclination of the moving body, and the generation step further generates an object for notifying that the inclination of the moving body is equal to or greater than a predetermined value.
  • The moving body is a snowplow. The guide display method further includes a fourth acquisition step (FIG. 15) of acquiring information on the amount of snow removed by the snowplow, and the generation step further generates an object for notifying the information on the snow removal amount. The guide display method further includes a storage step of storing the snow removal amount information acquired in the fourth acquisition step in a second storage means (519).
  • 100: guide display system, 101: server, 104, 105, 106: vehicle, 107: portable terminal, 313: display unit, 416: display device, 500: control unit
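As an illustration of the score-style readout and the consumable-replacement notification described in the items above, the following is a minimal Python sketch; every name and the wear threshold are assumptions made for illustration, not details taken from the embodiment.

```python
# Hypothetical sketch of the snow removal readout and maintenance check.
from dataclasses import dataclass

@dataclass
class SnowRemovalObject:
    """AR object carrying a numeric readout shown at the screen edge."""
    text: str
    anchor: tuple  # normalized (x, y) screen position, e.g. near the top-right

def make_snow_removal_object(rate_t_per_h: float) -> SnowRemovalObject:
    # Format the estimated removal rate like a game score, e.g. "10 t/h".
    return SnowRemovalObject(text=f"{rate_t_per_h:.0f} t/h", anchor=(0.95, 0.05))

# Assumed wear threshold for consumables such as the battery (cf. info 519).
BATTERY_REPLACEMENT_THRESHOLD_T = 500.0

def maintenance_due(accumulated_removal_t: float) -> bool:
    # Notify the vehicle when the accumulated amount suggests replacement time.
    return accumulated_removal_t >= BATTERY_REPLACEMENT_THRESHOLD_T

print(make_snow_removal_object(10.2).text)  # "10 t/h"
print(maintenance_due(612.0))               # True
```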

Abstract

Provided is a design that appropriately displays a guide for traveling in a post-snowfall state. Photograph data representing a post-snowfall state in a prescribed position is acquired, and on the basis of the acquired photograph data, image data corresponding to the post-snowfall state represented by the photograph data is acquired from a storage means that stores image data representing a pre-snowfall state. Display data is generated on the basis of the acquired photograph data and image data, the display data being for displaying a guide for traveling in the post-snowfall state represented by the photograph data.

Description

Guide display method
The present invention relates to a guide display.
In recent years, a display technology known as AR (Augmented Reality) has been used in various fields. Patent Document 1 describes an agricultural work machine 1 that is a self-propelled boom sprayer. Patent Document 1 states that a captured image A2, taken by a camera 32 attached to the vehicle 10 (agricultural work machine 1), is superimposed on a guide image A1 by the arithmetic unit 35b of a computer 35 and displayed as AR on a display 36. It further states that, since the guide image A1 and the captured image A2 of the actual field are superimposed to display a guide image A3, the worker can grasp the relationship between the information shown in the guide image A1 (for example, the guide line L, the traveling locus X, the work area S, etc.) and the positions of the crops C, structures, and the like.
JP-A-2015-164001
In Patent Document 1, the guide image A1 is corrected based on the captured image A2 of a grid pattern drawn on the ground of the field, which is the moving region of the vehicle 10, and the corrected guide image A1 is superimposed on the captured image A2.
However, when a vehicle travels on a snow surface, the ground is covered with snow, and objects that hinder travel, such as irrigation canals and curbs, may exist under the snow surface. Therefore, further improvement is required for the guide display in the state after snowfall.
An object of the present invention is to provide a mechanism for appropriately displaying a guide for traveling in the state after snowfall.
The guide display method according to the present invention includes: a first acquisition step of acquiring shooting data representing a state after snowfall at a predetermined position; a second acquisition step of acquiring, based on the shooting data acquired in the first acquisition step, image data corresponding to the state after snowfall represented by the shooting data from a first storage means that stores image data representing a state before snowfall at the predetermined position; and a generation step of generating, based on the shooting data acquired in the first acquisition step and the image data acquired in the second acquisition step, display data for displaying a guide for traveling in the state after snowfall represented by the shooting data.
The guide display method according to the present invention is a guide display method executed in a guide display system including a server and a vehicle capable of communicating with the server. The server performs: a first receiving step of receiving, from the vehicle, shooting data representing a state after snowfall at a predetermined position; an acquisition step of acquiring, based on the shooting data received in the first receiving step, image data corresponding to the state after snowfall represented by the shooting data from a storage means that stores image data representing a state before snowfall at the predetermined position; a generation step of generating, based on the shooting data received in the first receiving step and the image data acquired in the acquisition step, display data for displaying a guide for traveling in the state after snowfall represented by the shooting data; and a first transmission step of transmitting the display data generated in the generation step to the vehicle. The vehicle performs: a second transmission step of transmitting the shooting data captured by a shooting means to the server; a second receiving step of receiving the display data from the server; and a display step of displaying, based on the display data received in the second receiving step, a screen on which the guide is superimposed on the state after snowfall.
A guide for traveling in the state after snowfall can be displayed appropriately.
Other features and advantages of the present invention will become apparent from the following description with reference to the accompanying drawings. In the accompanying drawings, the same or similar configurations are given the same reference numbers.
The accompanying drawings are included in the specification, constitute a part thereof, show embodiments of the present invention, and are used together with the description to explain the principles of the present invention.

FIG. 1 is a diagram showing the configuration of the guide display system. FIG. 2 is a diagram showing the structure of the snowplow. FIG. 3 is a block diagram showing the configuration of the snowplow. FIG. 4 is a block diagram showing the configuration of a four-wheeled vehicle. FIG. 5 is a block diagram showing the configuration of the server. FIG. 6 is a diagram for explaining the generation of AR display data. FIG. 7 is a flowchart showing processing on the vehicle side. FIGS. 8 to 10 are flowcharts showing processing on the server side. FIG. 11 is a diagram for explaining a directional deviation. FIGS. 12 to 15 are flowcharts showing processing on the server side. FIG. 16 is a diagram showing an image of shooting data. FIG. 17 is a diagram showing an image of pre-snow image data. FIG. 18 is a diagram showing AR objects. FIG. 19 is a diagram showing an image in which AR objects are superimposed on shooting data. FIG. 20 is a diagram showing a work line object. FIG. 21 is a diagram showing a snow cover information object. FIGS. 22 and 23 are diagrams showing warning objects.
FIG. 1 is a diagram showing the configuration of the guide display system 100 according to the present embodiment. The guide display system 100 is a system that provides the passengers of a vehicle traveling on a snow surface with guide information indicating the traveling road, displayed in AR (Augmented Reality) on an image captured by the vehicle. As shown in FIG. 1, the guide display system 100 includes a server 101, vehicles 104, 105, 106, and a portable terminal 107. The server 101 is a server for providing the above guide display service. Although the server 101 is shown as one device in FIG. 1, it may be composed of a plurality of devices; that is, the server 101 in the present embodiment covers both a configuration in which one device provides the server function and a configuration in which a plurality of devices cooperate to provide the server function. In the present embodiment, the "vehicle" is not limited in form, and includes four-wheeled vehicles, saddle-riding two-wheeled vehicles, and work vehicles, as shown by the vehicles 104, 105, and 106 in FIG. 1. A work vehicle is a vehicle for special purposes, for example a snowplow. In the guide display system 100, the passengers of the vehicles 104, 105, 106 can enjoy the guide display service. A portable terminal 107 such as a smartphone can also be connected to the guide display system 100, and a person who holds the portable terminal 107 and moves on a snow surface can likewise enjoy the guide display service of the guide display system 100. In the present embodiment, the passengers of a vehicle include the driver of the vehicle and other occupants.
The wireless base station 103 is, for example, a wireless base station provided in an area where the server 101 can provide the guide display service, and can communicate with the vehicles 104, 105, 106 and the portable terminal 107. The server 101 is configured to communicate with the wireless base station 103 via a network 102. With such a configuration, for example, the vehicles 104, 105, 106 and the portable terminal 107 can transmit vehicle information or terminal information, such as position information, to the server 101, and the server 101 can transmit display data to the vehicles 104, 105, 106 and the portable terminal 107. In FIG. 1, the network 102 is shown as a single network, but a plurality of networks may be configured by routers or the like. The server 101, the vehicles 104, 105, 106, and the portable terminal 107 can also connect to networks other than the network 102 shown in FIG. 1, for example, the Internet.
The communication device 108 is, for example, a roadside unit provided along a road, and can communicate with at least one of the vehicles 104, 105, 106 and the portable terminal 107 by, for example, DSRC (Dedicated Short Range Communication) (road-to-vehicle communication, etc.). For example, the communication device 108 is used to transmit the vehicle information of the vehicle 104 to the server 101 and to transmit road surface condition information (snow cover state, frozen state, etc.) to the server 101.
The guide display system 100 is not limited to the configuration of FIG. 1 and may include other devices as appropriate. For example, it may include database servers for map information, traffic information, or weather information, and the server 101 may be configured to be able to access those database servers.
FIG. 2 is a diagram showing the structure of the vehicle 106, which is a snowplow. In the present embodiment, the vehicle 106 is a crawler type snow removal vehicle that gathers snow with an auger and projects the snow from a shooter. The snow removal mechanism of the vehicle 106 is described first. The vehicle 106 has a snow removal unit 201, which includes an auger 203, a blower 204, a sled 210, and a shooter 205 driven by an engine 221. The power of the engine 221 is transmitted in the order of a small-diameter pulley 207, a belt 208, a large-diameter pulley 209, a drive shaft 206, and an auger shaft 202, rotating the auger 203. As it rotates, the auger 203 gathers the snow on the road in the front-back direction of the drawing and feeds it into the blower 204, and the centrifugal force of the blower 204 projects the snow outside through the shooter 205. The vehicle 106 also has an engine cover 220, an engine air-cooling fan 222, an air-cooling fan shaft 223, and an output pulley connected to drive wheels 217.
The traveling mechanism of the vehicle 106 is described next. In the vehicle 106, a body frame 219 is connected to the rear of a crawler frame 212 that rotatably supports an idler wheel 211 at its front and three lower rollers 213, 214, 215 at its bottom, and the snow removal unit 201 is configured at the front of the body frame 219. A pivot shaft 218 connects the body frame 219 to the crawler frame 212. A drive wheel 217 is configured at the rear of the body frame 219, and a crawler belt 216 is wound between the drive wheel 217 and the idler wheel 211, whereby the vehicle 106 is configured as a crawler type snow removal vehicle. The vehicle 106 is also fitted with a headlamp 225 and a sub headlamp 226 capable of illuminating the area ahead. The operation unit 224 is a human-machine interface that can receive instructions for controlling the units related to the traveling mechanism, the snow removal mechanism, and the like of the vehicle 106.
In the present embodiment, the vehicle 106 further includes a control unit 229. The control unit 229 acquires detection signals from detection units 227 and 228, which are attached to the vehicle 106 for recognizing the external environment. The detection units 227 and 228 are, for example, cameras and sensors.
In the present embodiment, the vehicle 106 may be of a type on which an operator boards, or of a hand-push type operated by the operator. The vehicle 106 may also be a vehicle capable of fully automated operation without an operator. For the type on which an operator boards, a boarding section (not shown) including a seat, controls such as a steering wheel and pedals, and the operation unit 224 is provided at the rear of the engine cover 220.
FIG. 3 is a block diagram showing the configuration of the vehicle 106, which is a snowplow. The control unit 300 includes a main processor 301 and a memory 302. For example, the processor 301 reads a program stored in the storage unit 315 into the memory 302 and executes it, whereby the control unit 300 controls the vehicle 106 in an integrated manner. The control unit 300 corresponds to the control unit 229 in FIG. 2. The external world recognition camera 305 and the external world recognition sensor 306 of FIG. 3 correspond to the detection units 227 and 228 of FIG. 2. The operation unit 314 of FIG. 3 corresponds to the operation unit 224 of FIG. 2. The drive control unit 303 of FIG. 3 corresponds to the engine 221 of FIG. 2, a mechanism control unit (not shown) that controls the mechanisms of each part of the vehicle 106, and an electric system control unit (not shown). The traveling mechanism 310 and the snow removal mechanism 311 correspond to the parts of the traveling mechanism and the snow removal mechanism described above with reference to FIG. 2. The lighting unit 312 includes the headlamp 225 and the sub headlamp 226, as well as notification mechanisms such as a direction indicator (not shown). The display control unit 304 controls the display on the display unit 313 based on display data.
The storage unit 315 stores various parameters, data, and the like in addition to the above program. The operation unit 314 includes a steering wheel and pedals for receiving maneuvering operations from the passenger, an instrument panel, a panel (display unit 313) that displays various user interface screens, and an input unit for receiving setting operations. The communication interface (I/F) 309 includes an antenna for communicating with the wireless base station 103 and a signal format conversion unit. The GPS sensor 307 is a GPS (Global Positioning System) sensor and detects the current position of the vehicle 106. The gyro sensor 308 detects the attitude of the vehicle 106, for example, its roll angle, pitch angle, and yaw angle.
The external world recognition camera 305 is a camera that images the external environment and is attached so as to be able to image the front, sides, and rear of the vehicle 106. The external world recognition sensor 306 is, for example, an ultrasonic sensor, and can detect the state of the snow surface, for example, that the snow surface is hill-shaped, by detecting reflected waves of ultrasonic waves emitted in front of the vehicle 106. The external world recognition sensor 306 may also be, for example, an infrared sensor provided above the shooter 205, for example at the position of the detection unit 227 in FIG. 2, which can detect the snow depth ahead by projecting infrared rays onto the snow surface. Further, for example, the snow quality, such as whether the snow surface is dense and hard or low-density and soft, can be detected based on the intensity of the reflected waves. The external world recognition camera 305 and the external world recognition sensor 306 are not limited to the positions of the detection units 227 and 228 in FIG. 2. For example, they may be on the side surface of the vehicle 106, near the headlamp 225, or near the tip of the shooter 205. By being attached near the tip of the shooter 205, for example, the camera can capture the area far ahead that would be a blind spot for the passenger of the vehicle 106 even when the snow depth reaches about 2 m.
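The depth and quality detection described above can be illustrated with the following minimal sketch; the sensor interface, the mounting-height geometry, and the intensity threshold are assumptions, since the embodiment states only that depth is obtained by projecting infrared rays onto the snow surface and that quality correlates with reflected-wave intensity.

```python
# Hypothetical sketch of the sensor post-processing: snow depth from an
# infrared range reading, snow quality from reflected-wave intensity.

def estimate_snow_depth_m(ir_range_to_surface_m: float, sensor_height_m: float) -> float:
    # Depth = mounting height of the sensor minus measured range to the surface.
    return max(0.0, sensor_height_m - ir_range_to_surface_m)

def classify_snow_quality(reflected_intensity: float, hard_threshold: float = 0.6) -> str:
    # A dense, hard snow surface reflects more strongly than low-density, soft snow.
    return "hard" if reflected_intensity >= hard_threshold else "soft"

print(estimate_snow_depth_m(ir_range_to_surface_m=0.8, sensor_height_m=2.0))  # 1.2
print(classify_snow_quality(0.72))  # "hard"
```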
FIGS. 2 and 3 describe the configuration of the vehicle 106, which is a snowplow. However, as long as a device has a configuration for recognizing the outside world (a camera, etc.), a configuration for recognizing position and attitude information (a GPS sensor, a gyro sensor, etc.), a display unit, and a communication I/F, it is not limited to the configuration of the vehicle 106 and may be a four-wheeled vehicle such as the vehicle 104, a two-wheeled vehicle such as the vehicle 105, or the portable terminal 107. Although FIGS. 2 and 3 describe an example in which the vehicle 106 is equipped with a configuration for recognizing the outside world, that configuration may be detachable from the vehicle 106; for example, a smartphone equipped with a camera and GPS may be attached to the vehicle 106 with an attachment or the like. Further, the vehicle 106 is not limited to a snowplow as long as it is a special-purpose vehicle, and may be, for example, an agricultural work machine. Hereinafter, a configuration example of a four-wheeled vehicle will be described as a vehicle other than the vehicle 106.
FIG. 4 is a block diagram showing an example of the configuration of the vehicle 104, which is a four-wheeled vehicle. The control unit 400 includes a main processor 401 and a memory 402. For example, the processor 401 reads a program stored in a storage unit into the memory 402 and executes it, whereby the control unit 400 controls the vehicle 104 in an integrated manner. The control unit 400 includes an external world recognition unit 403, an action planning unit 404, a drive control unit 405, and a device control unit 406. Each block is realized by one ECU or a plurality of ECUs.
The external world recognition unit 403 recognizes information on the world outside the vehicle 104 based on signals from the external world recognition camera 407 and the external world recognition sensor 408. Here, the external world recognition camera 407 is, for example, a camera that photographs the area in front of the vehicle 104 and is attached to the vehicle-interior side of the front window at the front of the roof. The external world recognition sensor 408 is, for example, a LIDAR (Light Detection and Ranging) or a millimeter-wave radar that detects targets around the vehicle 104 and measures the distance to a target. Based on the signals from the external world recognition camera 407 and the external world recognition sensor 408, the external world recognition unit 403 recognizes, for example, free spaces such as intersections, railroad crossings, tunnels, and road shoulders, and the behavior (speed and traveling direction) of other vehicles. The GPS sensor 409 detects the current position of the vehicle 104, and the gyro sensor 410 detects the attitude of the vehicle 104, for example, its roll angle, pitch angle, and yaw angle.
The action planning unit 404 plans the actions of the vehicle 104, such as the optimum route and a risk avoidance route, based on the recognition by the external world recognition unit 403 and the position information detected by the GPS sensor 409. The action planning unit 404 performs, for example, entry determination based on the start and end points of intersections and railroad crossings, and action planning based on predictions of the behavior of other vehicles. The drive control unit 405 controls the driving force output mechanism 412, the steering mechanism 413, and the brake mechanism 414 based on the action plan made by the action planning unit 404. Here, the driving force output mechanism 412 is, for example, a power plant, the steering mechanism 413 is, for example, an electric power steering device, and the brake mechanism 414 is, for example, a disc brake device.
The device control unit 406 controls devices connected to the control unit 400. For example, the device control unit 406 controls the speaker 415 to output predetermined voice messages, such as warnings and navigation messages. The device control unit 406 also controls the display device 416 to display various interface screens, and controls the navigation device 417 to acquire setting information from the navigation device 417. The communication interface (I/F) 411 includes an antenna for communicating with the wireless base station 103 and a signal format conversion unit.
The control unit 400 may include functional blocks other than those shown in FIG. 4 as appropriate, for example, an optimum route calculation unit that calculates the optimum route to a destination based on map information. The control unit 400 may also acquire external world information from sources other than the external world recognition camera 407 and the external world recognition sensor 408, for example via another vehicle. Further, the control unit 400 receives detection signals not only from the GPS sensor 409 and the gyro sensor 410 but also from various sensors provided in the vehicle 104. For example, the control unit 400 receives the detection signals of a door open/close sensor and a door lock mechanism sensor provided in a door of the vehicle 104 via an ECU configured in the door. This allows the control unit 400 to detect the unlocking of a door and door opening/closing operations.
FIG. 5 is a block diagram showing the configuration of the server 101. The control unit 500 includes a main processor 505, such as a CPU or GPU, and a memory 506, such as a ROM or RAM, and controls the server 101 in an integrated manner. For example, the operation of the present embodiment is realized by the processor 505 reading a program stored in the storage unit 501 into the memory 506 and executing it. The server 101 can be a computer that implements the present invention according to a program stored in a storage medium.
The pre-snow state acquisition unit 507 acquires pre-snow image data representing the state before snowfall at a predetermined position. For example, the pre-snow state acquisition unit 507 acquires the data when a user photographs the state of a predetermined position without snow cover. Alternatively, the pre-snow state acquisition unit 507 acquires, as pre-snow image data, street view data captured before snowfall by a drone or the like. The pre-snow state acquisition unit 507 stores the acquired pre-snow image data in the storage unit 501 as pre-snow image data 514, and periodically acquires new pre-snow image data to update the pre-snow image data 514.
The shooting data acquisition unit 508 acquires shooting data captured by the vehicles 104, 105, 106 and the portable terminal 107. In the present embodiment, the shooting data acquired by the shooting data acquisition unit 508 is data obtained by shooting the snow cover state ahead in the traveling direction of the vehicles 104, 105, 106 or the moving direction of the holder of the portable terminal 107. The shooting data acquisition unit 508 stores the acquired shooting data in the storage unit 501 as shooting data 517.
The vehicle information acquisition unit 509 acquires vehicle information from the vehicles 104, 105, and 106. Here, the vehicle information is, for example, vehicle position information, information on vehicle behavior, information on vehicle attitude, and vehicle attribute information. The vehicle position information is, for example, latitude and longitude information; the information on vehicle behavior is, for example, speed and acceleration information, driving information of each part of the vehicle, and detection information from cameras, sensors, and the like; and the information on vehicle attitude is, for example, tilt information of the vehicle such as roll angle, pitch angle, and yaw angle. The vehicle attribute information is, for example, information on the size (small, medium, or large) of the snowplow vehicle 106. The vehicle information acquisition unit 509 also acquires terminal information from the portable terminal 107. Here, the terminal information includes, for example, the position information of the portable terminal and information on the behavior of the portable terminal: the position information is, for example, latitude and longitude information, and the behavior information is, for example, speed and acceleration information and detection information from cameras, sensors, and the like. In the present embodiment, the vehicle 106 is described as a representative example of the vehicles 104, 105, 106 and the portable terminal 107; accordingly, the vehicle information in the description can be read as terminal information. The vehicle information acquisition unit 509 stores the acquired vehicle information in the storage unit 501 as vehicle information 518 (or terminal information).
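A minimal sketch of how the per-vehicle record described above might be organized follows; the field names and types are assumptions, since the embodiment only enumerates the kinds of information stored as the vehicle information 518.

```python
# Hypothetical per-vehicle record kept by the server (cf. info 516/518).
from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleInfo:
    vehicle_id: str
    latitude: float                       # position information
    longitude: float
    speed_mps: Optional[float] = None     # behavior information
    roll_deg: Optional[float] = None      # attitude information
    pitch_deg: Optional[float] = None
    yaw_deg: Optional[float] = None
    size_class: Optional[str] = None      # attribute, e.g. "small"/"medium"/"large"

# Stored per vehicle, keyed by its identification information.
vehicle_db: dict = {}
vehicle_db["veh-106"] = VehicleInfo("veh-106", 43.06, 141.35, speed_mps=1.2,
                                    roll_deg=0.5, pitch_deg=2.0, yaw_deg=88.0,
                                    size_class="small")
```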
The image recognition unit 510 analyzes the shooting data acquired by the shooting data acquisition unit 508 and performs image recognition. For example, the image recognition unit 510 extracts objects from the image represented by the shooting data. Object extraction by the image recognition unit 510 will be described later.
The AR object generation unit 511 generates AR objects based on the pre-snow image data acquired by the pre-snow state acquisition unit 507; the generation of AR objects will be described later. The display data generation unit 512 generates display data for AR display based on the shooting data acquired by the shooting data acquisition unit 508 and the AR objects generated by the AR object generation unit 511; the generation of display data will also be described later.
The environmental information acquisition unit 513 acquires environmental information on the environment of the vehicle's traveling road. The environmental information is, for example, information on snow cover, information on moving objects (pedestrians, etc.) around the traveling road, and slope and width information of the traveling road. The environmental information acquisition unit 513 acquires the environmental information from, for example, the communication device 108 installed near the road, road surface condition sensors, or an external database server. The environmental information acquisition unit 513 may also acquire information on snow cover from the vehicle 106. The information on snow cover is, for example, information on snow depth and snow quality (hardness, etc.). The environmental information acquisition unit 513 stores the acquired environmental information in the storage unit 501 as environmental information 515, and periodically acquires environmental information to update the environmental information 515.
The storage unit 501 stores programs, various parameters, data, and the like. As described above, the storage unit 501 stores the pre-snow image data 514, the environmental information 515, the shooting data 517, and the vehicle information 518. Since the shooting data 517 and the vehicle information 518 are acquired from each vehicle, they are stored in association with each other as per-vehicle information 516. The per-vehicle information 516 also includes snow removal amount information 519, which will be described later.
The display unit 502 is, for example, a display that shows various user interface screens. The operation unit 503 is, for example, a keyboard or a pointing device and accepts user operations. The communication interface (I/F) 504 is an interface for enabling communication with the network 102 and has a configuration corresponding to the medium of the network 102.
The operation of the present embodiment is described below. The present embodiment assumes, for example, a situation in which the four-wheeled vehicle 104 travels on a snow-covered road, or a situation in which the vehicle 106, a snowplow, performs snow removal work on a snow-covered road. When the snow depth reaches several centimeters to several tens of centimeters, the traveling road is completely covered with snow, and it becomes difficult to recognize the boundary between the traveling road and the non-traveling road. At that boundary there may be, for example, gutters, irrigation canals, curbs, flower beds, or planted trees, and the passengers of a vehicle may unintentionally drive over them. Without snow cover, the boundary between the traveling road and the non-traveling road can be recognized by, for example, the external world recognition camera 407 and the external world recognition sensor 408 of a four-wheeled vehicle. In the situation described above, however, recognizing the boundary is difficult even with those devices.
Therefore, in the present embodiment, an AR object is generated from pre-snow image data captured in advance, by the user or from street view or the like, at the location corresponding to the traveling position of the vehicle, and the AR object is superimposed on the shooting data captured by the vehicle for AR display. Hereinafter, the vehicle 106 will be described as a representative example of the vehicles 104, 105, 106 and the portable terminal 107.
FIG. 6 is a diagram for explaining the generation of AR display data in the present embodiment. The shooting data 601 is shooting data captured by the vehicle 106. FIG. 16 is a diagram showing an example of the shooting data 601. The shooting data 601 captures the area in front of the vehicle 106; as shown in FIG. 16, the entire scene is covered with snow and the traveling road cannot be recognized.
The pre-snow image data 602 is data stored in the storage unit 501 of the server 101, for example, pre-snow street view data corresponding to the position of the captured image of the shooting data 601. FIG. 17 is a diagram showing an example of the pre-snow image data 602. As shown in FIG. 17, it represents the pre-snow state of the scene captured in the shooting data 601 of FIG. 16, and the traveling road and the gutter can be recognized.
In the present embodiment, an AR object 603 is generated from the pre-snow image data 602, and the generated AR object 603 is superimposed on the shooting data 601 to generate AR display data 604. FIG. 18 is a diagram showing an example of the AR object 603, and FIG. 19 is a diagram showing an example of the AR display data 604. As shown in FIG. 19, in the AR display data 604, the AR object 603 of FIG. 18 is superimposed on the shooting data 601 of FIG. 16. The AR display data 604 of FIG. 19 is then displayed on the display unit 313 of the vehicle 106.
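As an illustration of the compositing in FIG. 6 (shooting data 601 plus AR object 603 yielding AR display data 604), the following is a minimal sketch assuming the AR object layer is an RGBA image already aligned with the camera frame; OpenCV and NumPy are used only for illustration, and the file names are hypothetical.

```python
# Hypothetical compositing step: alpha-blend the AR object layer over the frame.
import cv2
import numpy as np

def make_ar_display_data(shooting_bgr: np.ndarray, ar_rgba: np.ndarray) -> np.ndarray:
    """Blend an RGBA AR layer over a BGR camera frame of the same size."""
    alpha = ar_rgba[:, :, 3:4].astype(np.float32) / 255.0
    ar_bgr = ar_rgba[:, :, :3].astype(np.float32)
    out = shooting_bgr.astype(np.float32) * (1.0 - alpha) + ar_bgr * alpha
    return out.astype(np.uint8)

# Usage: frame (cf. 601) plus AR object layer (cf. 603) yields display data (cf. 604).
frame = cv2.imread("shooting_601.png")                           # assumed file name
ar_layer = cv2.imread("ar_object_603.png", cv2.IMREAD_UNCHANGED) # assumed RGBA PNG
display_604 = make_ar_display_data(frame, ar_layer)
```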
In the present embodiment, since the AR display for guiding the traveling road is performed as described above, the passenger of the vehicle 106 can easily recognize which part of the snow surface is the traveling road and which is not.
FIG. 7 is a flowchart showing the process, executed in the vehicle 106, of requesting AR display from the server 101. The process of FIG. 7 is realized, for example, by the processor 301 loading a program stored in the storage unit 315 into the memory 302 and executing it.
In S101, the control unit 300 determines whether an instruction to perform AR display has been received. This determination may be made, for example, according to whether a menu item (such as a "Perform AR display" item) on a user interface screen displayed on the display unit 313 of the vehicle 106 has been selected via the operation unit 314. The user interface screen is, for example, a screen displayed by starting an application provided by the server 101. If it is determined that an instruction to perform AR display has been received, the process proceeds to S102; otherwise, the process of FIG. 7 ends.
In S102, the control unit 300 acquires shooting data of the area ahead in the traveling direction of the vehicle 106. The shooting data may be still image data or moving image data. In S103, the control unit 300 transmits a request for AR display data to the server 101 together with the identification information of the vehicle 106.
In S104, the control unit 300 transmits the vehicle information and the shooting data acquired in S102 to the server 101 together with the identification information of the vehicle 106. Here, the vehicle information is, for example, the position information detected by the GPS sensor 307. The vehicle information may include other information, for example, information on the attitude and behavior of the vehicle 106 detected by the gyro sensor 308 and information on snow cover detected by the external world recognition sensor 306.
Then, in S105, the control unit 300 receives the AR display data from the server 101, and in S106, performs AR display based on the received AR display data. In S107, the control unit 300 determines whether to continue the AR display. For example, when the control unit 300 detects that the vehicle 106 has stopped or that an instruction to end the AR display has been received, it determines that the AR display is not to be continued. If the AR display is to be continued, the process from S102 is repeated; otherwise, the process of FIG. 7 ends.
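The vehicle-side flow of FIG. 7 (S101 to S107) can be condensed into the following sketch; the vehicle, server, and display interfaces are placeholders assumed for illustration, not an actual API of the embodiment.

```python
# Placeholder interfaces stand in for the camera, sensors, communication I/F,
# and display unit; only the control flow mirrors FIG. 7.

def ar_display_loop(vehicle, server, display):
    if not vehicle.ar_display_requested():                        # S101
        return
    while True:
        frame = vehicle.capture_forward_image()                   # S102
        server.send_request(vehicle.vehicle_id)                   # S103
        server.send(vehicle.vehicle_id,                           # S104
                    vehicle_info=vehicle.read_gps_and_gyro(),
                    shooting_data=frame)
        display.show(server.receive_display_data())               # S105, S106
        if vehicle.stopped() or vehicle.ar_display_cancelled():   # S107
            break
```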
FIG. 8 is a flowchart showing the process of generating AR display data, executed in the server 101. The process of FIG. 8 is realized, for example, by the processor 505 loading a program stored in the storage unit 501 into the memory 506 and executing it.
In S201, the control unit 500 determines whether a request for AR display data has been received. If a request for AR display data has been received, the process proceeds to S202; otherwise, the process of S201 is repeated.
In S202, the control unit 500 acquires the vehicle information and stores it in the storage unit 501 as the vehicle information 518 together with the identification information of the vehicle 106. In S203, the control unit 500 acquires the shooting data and stores it in the storage unit 501 as the shooting data 517 together with the identification information of the vehicle 106. In S204, the control unit 500 acquires the pre-snow image data.
FIG. 9 is a flowchart showing the process of acquiring the pre-snow image data in S204. In S301, the control unit 500 performs image recognition on the shooting data acquired in S203, and in S302 extracts objects. As a result of the image recognition, a plurality of objects, such as trees, buildings, and utility poles, can be recognized. In S302, the control unit 500 extracts at least two objects in the image of the shooting data; for example, in the case of the shooting data of FIG. 16, the objects 801 and 802 are extracted. Extracting objects in this way makes it possible, as described later, to detect a directional deviation between the shooting data and the pre-snow image data.
In S303, the control unit 500 determines whether pre-snow image data including the objects extracted in S302 exists, targeting, among the pre-snow image data 514 stored in the storage unit 501, the pre-snow image data corresponding to the position of the vehicle 106. If it is determined that pre-snow image data including the objects extracted in S302 exists, the control unit 500 acquires that pre-snow image data in S306. After S306, the process of FIG. 9 ends and the process proceeds to S205 of FIG. 8. On the other hand, if it is determined in S303 that no pre-snow image data including the extracted objects exists, the process proceeds to S304. For example, the pre-snow image data of FIG. 17 includes objects 901 and 902 corresponding to the objects 801 and 802 extracted from the shooting data of FIG. 16; in that case, it is determined that pre-snow image data including the objects extracted in S302 exists.
In S304, the control unit 500 determines whether the determination of S303 has been completed for all the objects extracted as a result of the image recognition of the shooting data. If it has not been completed for all the extracted objects, the process returns to S302 and other objects are extracted. On the other hand, if it has been completed for all the extracted objects, the AR object generation of S205 in FIG. 8 cannot be executed, so in S305 notification message data indicating that AR display is not possible is transmitted to the control unit 300 of the vehicle 106. After S305, the processes of FIGS. 9 and 8 end. The control unit 300 of the vehicle 106 that has received the notification message data displays the notification message on the display unit 313. The notification message data may be data for display or data for audio output.
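A minimal sketch of the search of FIG. 9 (S301 to S306) follows, assuming object recognition yields a set of labels per image and that each candidate pre-snow image near the vehicle position carries its own recognized labels; both assumptions are for illustration only.

```python
# Hypothetical matching of extracted object pairs against pre-snow images.
from itertools import combinations

def find_pre_snow_image(shot_labels: set, candidates: list):
    """candidates: pre-snow images near the vehicle position, each a dict
    {'image': ..., 'labels': set of objects recognized in that image}."""
    for pair in combinations(sorted(shot_labels), 2):   # S302: at least two objects
        for cand in candidates:                         # S303
            if set(pair) <= cand["labels"]:
                return cand                             # S306: matching image found
    return None                                         # -> S305: notify "AR unavailable"

db = [{"image": "street_view_42.png", "labels": {"tree", "utility_pole", "building"}}]
print(find_pre_snow_image({"tree", "utility_pole"}, db) is not None)  # True
```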
When the process of S204 in FIG. 8 is performed as described above, pre-snow image data corresponding to the position of the vehicle 106 is acquired. After S204, in S205, the control unit 500 generates AR objects based on the pre-snow image data acquired in S204.
FIG. 10 is a flowchart showing the process of generating AR objects in S205. In S401, the control unit 500 compares the shooting data acquired in S203 with the pre-snow image data acquired in S306. In S402, the control unit 500 determines whether, as a result of the comparison, there is a directional deviation between the shooting data acquired in S203 and the pre-snow image data acquired in S306 caused by differences in shooting position and shooting angle.
The directional deviation is determined, for example, as follows. As shown in FIG. 11, when two objects 701 and 702 are photographed from direction A, let the distance between the objects in the captured image be the distance 703. When photographed from direction B, rotated by an angle θ1, the distance between the objects in the captured image becomes the distance 704, and when photographed from direction C, rotated by an angle θ2, it becomes the distance 705. As shown in FIG. 11, the larger the rotation angle, the shorter the distance between the objects in the captured image; that is, there is a correlation between the shooting direction and the distance between objects in the captured image. Therefore, if, as a result of the comparison in S401, the distances between the plurality of objects extracted in S302 differ by a predetermined amount or more between the shooting data and the pre-snow image data, it is determined that there is a directional deviation. If it is determined in S402 that there is a directional deviation, the process proceeds to S403; otherwise, the process proceeds to S404.
In S403, the control unit 500 corrects the pre-snow image data according to the directional deviation. The correction here matches the pre-snow image data acquired in S306 to the directionality of the shooting data acquired in S203, that is, it eliminates the deviation determined in S402, for example by a rotation of the angle θ1 in FIG. 11. After the rotation is performed based on the shooting coordinates of the pre-snow image data and the position of the vehicle 106, a translation along the shooting direction (for example, direction B), approaching or receding from the objects 701 and 702, may also be performed.
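The geometry of FIG. 11 suggests that the projected separation of two fixed objects shrinks roughly with the cosine of the viewing-direction rotation, so the deviation angle can be estimated from the distance ratio. The following sketch works that arithmetic and applies an in-plane rotation as a simple stand-in for the correction of S403; the cosine model and the OpenCV warp are illustrative assumptions, not the embodiment's stated method.

```python
# Hypothetical estimate of the deviation angle from inter-object distances.
import math
import cv2
import numpy as np

def estimate_rotation_deg(dist_in_shot_px: float, dist_in_pre_snow_px: float) -> float:
    # Projected separation ~ d * cos(theta)  =>  theta ~ arccos(ratio).
    ratio = min(1.0, dist_in_shot_px / dist_in_pre_snow_px)
    return math.degrees(math.acos(ratio))

def correct_pre_snow_image(pre_snow_bgr: np.ndarray, theta_deg: float) -> np.ndarray:
    # Rotate the pre-snow image about its center to cancel the deviation (S403).
    h, w = pre_snow_bgr.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), theta_deg, 1.0)
    return cv2.warpAffine(pre_snow_bgr, m, (w, h))

theta = estimate_rotation_deg(dist_in_shot_px=180.0, dist_in_pre_snow_px=200.0)
print(f"estimated deviation: {theta:.1f} deg")  # about 25.8 deg
```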
In S404, the control unit 500 acquires the AR display mode specified by the passenger of the vehicle 106. The AR display mode is information indicating whether to display additional objects, which will be described later. In the present embodiment, the AR display that makes the traveling road and the non-traveling road distinguishable, as shown in FIG. 19, is the standard AR display mode. In addition to the standard AR display, display modes are available that warn, by AR display, of the inclination of the vehicle 106 or of a moving object such as a person being nearby. The selection information of those display modes is transmitted together with the request for AR display data in S103 of FIG. 7, for example. The description here assumes that only the standard AR display mode is selected.
 In S405, the control unit 500 generates an AR object for travel-path identification using the image recognition result from the image recognition unit 510.
 FIG. 18 shows an example of AR objects. For example, the image recognition unit 510 of the server 101 performs image recognition on the pre-snow image data 602 of FIG. 17 and extracts objects. The objects extracted here are those used to recognize the boundary between the travel path and the non-travel path, such as gutters, irrigation channels, curbs, flower beds, and planted trees. These image recognition targets are learned in advance on the server 101 by deep learning or the like, and the image recognition unit 510 extracts them from the pre-snow image data 602 using the trained model.
 For example, as a result of image recognition on the pre-snow image data 602 of FIG. 17, the image recognition unit 510 recognizes the gutter 905, the bollard 903, and the road edge 904, and then generates the AR objects 1001, 1002, 1003, 1004, and 1005 of FIG. 18. AR object 1001 corresponds to the gutter 905, AR object 1002 to the road edge 904, and AR object 1003 to the bollard 903. AR object 1004 is a warning mark meaning that the area cannot be traveled (entered), and is placed near AR objects 1001, 1003, and 1002, which correspond to the gutter 905, the bollard 903, and the road edge 904. AR object 1005 represents the travelable area. Although AR object 1005 is shaped as an arrow pointing in the travel direction, any shape indicating that travel is possible may be used instead of the one shown in FIG. 18. AR object 1005 may also be displayed dynamically, for example with its color blinking progressively toward the back of the screen.
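As a non-authoritative illustration, the mapping from recognized boundary objects to AR objects such as 1001 through 1005 could be structured as in the sketch below; the data structure and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ARObject:
    kind: str              # e.g. "gutter", "bollard", "road_edge", "warning", "travel_arrow"
    polygon: list          # image-space outline points, [(x, y), ...]
    dynamic: bool = False  # e.g. arrow 1005 may blink toward the back of the screen

def build_track_ar_objects(recognized):
    """Map recognized boundary objects (gutter 905, bollard 903, road edge
    904) to AR objects like 1001-1003, place a no-entry warning mark (1004)
    near each, and add a travelable-area arrow (1005)."""
    ar_objects = []
    for obj in recognized:  # each obj: {"kind": ..., "polygon": [(x, y), ...]}
        ar_objects.append(ARObject(kind=obj["kind"], polygon=obj["polygon"]))
        # Warning mark anchored near the boundary object's first vertex.
        ar_objects.append(ARObject(kind="warning", polygon=[obj["polygon"][0]]))
    ar_objects.append(ARObject(kind="travel_arrow", polygon=[], dynamic=True))
    return ar_objects
```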
 In S406, the control unit 500 determines, based on the AR display mode acquired in S404, whether to generate additional AR objects. If it is determined that additional AR objects are to be generated, the process proceeds to S407, where they are generated; the processing of S407 is described later. If it is determined that no additional AR object is to be generated, the processing of FIG. 10 ends and the process proceeds to S206 of FIG. 8. When only the standard AR display mode is selected, as described above, S406 determines that no additional AR object is to be generated.
 In S206 of FIG. 8, the control unit 500 generates the AR display data. As described with reference to FIG. 6, the control unit 500 generates the AR display data by superimposing the AR objects generated in S205 on the shooting data acquired in S203.
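A minimal sketch of this superimposition follows, assuming Pillow for compositing and the hypothetical ARObject structure from the earlier sketch; the overlay colors are placeholders.

```python
from PIL import Image, ImageDraw

def generate_ar_display_data(shot: Image.Image, ar_objects) -> Image.Image:
    """Superimpose the AR objects (S205) on the shooting data (S203)."""
    frame = shot.convert("RGBA")
    overlay = Image.new("RGBA", frame.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    colors = {"gutter": (255, 0, 0, 120), "road_edge": (255, 128, 0, 120),
              "bollard": (255, 0, 0, 120), "travel_arrow": (0, 128, 255, 120),
              "warning": (255, 255, 0, 200)}
    for ar in ar_objects:
        if len(ar.polygon) >= 3:  # only closed outlines are drawn here
            draw.polygon(ar.polygon, fill=colors.get(ar.kind, (255, 255, 255, 120)))
    return Image.alpha_composite(frame, overlay)
```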
 In S207, the control unit 500 transmits the AR display data generated in S206 to the vehicle 106 via the communication I/F 504. Based on the received AR display data, the control unit 300 of the vehicle 106 causes the display unit 313 to display an AR display screen, for example a screen in which the AR objects are superimposed on the shooting data as shown in FIG. 19. After S207, the processing of FIG. 8 ends. A description of an AR object may be displayed in response to an operation by the passenger of the vehicle 106, such as moving a pointer over the AR object on the screen or clicking it. For example, pointing at AR object 1003 may produce a balloon such as "This is a bollard. It is a dead end and cannot be passed." The descriptions may instead be presented by other means, such as within the user interface screen of an application provided in advance by the server 101.
 The processing performed when S406 of FIG. 10 determines that additional AR objects are to be generated is described below.
 FIG. 12 is a flowchart of the processing in S407 that generates the additional AR objects. The subsequent determinations in S501, S503, S505, and S507 are made, for example, based on the selection state of menu items on the user interface screen used to accept the AR display instruction. For example, under the menu item "Enable AR display" the screen provides a submenu item "Additional AR display" with detailed items "Warning", "Work line", "Snow information", and "Snow removal amount", each of which can be selected individually. This selection information is received, for example, together with the request for AR display data in S201.
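The selection information could be carried as a simple set of flags alongside the AR display request, as in the sketch below; the field names are hypothetical, not part of the embodiment.

```python
from dataclasses import dataclass

@dataclass
class ARDisplayMode:
    """Hypothetical encoding of the 'Additional AR display' submenu items
    received with the AR display request in S201."""
    warning: bool = False         # checked in S501
    work_line: bool = False       # checked in S503
    snow_info: bool = False       # checked in S505
    removal_amount: bool = False  # checked in S507

    @property
    def standard_only(self) -> bool:
        """True when none of the additional AR displays is selected."""
        return not (self.warning or self.work_line
                    or self.snow_info or self.removal_amount)
```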
 In S501, the control unit 500 determines, based on the above selection information, whether to display a warning object. If it is determined that the warning object is to be displayed, the process proceeds to S502 and the warning object is generated. If not, the process proceeds to S503 without generating a warning object.
 An example of warning object generation is as follows. The warning object is, for example, an AR object that notifies the passenger via the AR display when the tilt of the vehicle 106 reaches or exceeds a predetermined value. When traveling on a snow-covered path, the vehicle may tilt depending on the snow depth and snow quality, and a large tilt can cause the vehicle to overturn. In the present embodiment, therefore, the control unit 500 monitors the tilt state of the vehicle 106 and, when it reaches or exceeds the predetermined value, causes the vehicle 106 to display the warning object.
 If S501 determines that the warning object is to be displayed, then in S502 the control unit 500 starts monitoring the tilt of the vehicle 106 based on the vehicle information from the vehicle 106. For example, the control unit 500 acquires attitude information from the vehicle 106 at predetermined time intervals. In parallel with this tilt monitoring, the processing of FIG. 12 proceeds from S502 to S503.
 FIG. 13 is a flowchart of the tilt-monitoring process for the vehicle 106. In S601, the control unit 500 acquires tilt (attitude) information from the vehicle 106. In S602, it determines whether the acquired tilt information, for example the roll angle or the pitch angle, is at or above a threshold. If not, the processing of S601 is repeated. If so, the control unit 500 generates a warning object in S603.
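A sketch of the S601 to S603 loop is shown below, assuming a polling interface that returns roll and pitch in degrees; the interface, polling interval, and threshold values are assumptions.

```python
import time

ROLL_LIMIT_DEG = 15.0   # assumed threshold for S602
PITCH_LIMIT_DEG = 15.0  # assumed threshold for S602

def monitor_tilt(get_attitude, emit_warning_object, interval_s=0.5):
    """Poll the vehicle attitude (S601); when roll or pitch reaches the
    threshold (S602), generate the tilt warning object (S603)."""
    while True:
        roll_deg, pitch_deg = get_attitude()
        if abs(roll_deg) >= ROLL_LIMIT_DEG or abs(pitch_deg) >= PITCH_LIMIT_DEG:
            emit_warning_object("tilt")  # e.g. warning object 1301 of FIG. 22
            return
        time.sleep(interval_s)
```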
 FIG. 22 shows an example of a warning object that warns of the tilt of the vehicle 106: warning object 1301 is displayed. In FIG. 22, warning object 1301 is displayed dynamically, repeatedly tilting in the direction of the arrow, but the display form is not limited to this; any form that warns of the tilt of the vehicle 106, such as a blinking display or a text display, may be used. After S603, the processing of FIG. 13 ends and the processing of FIG. 12 resumes, or, if the processing of FIG. 12 has already finished, the process proceeds to S206 of FIG. 8.
 Another example of the warning object generated in S502 is as follows. This warning object is an AR object that notifies the passenger when a moving body such as a person is at a position in the blind spot of the passenger of the vehicle 106. In extremely heavy snowfall the snow depth can reach several meters, and a person on the far side of a wall of snow may be invisible to the passenger of the vehicle 106. If the passenger performs snow removal work unaware of such a person, the snow projected from the chute 205 may fall onto the person's head. In the present embodiment, therefore, when a moving body is estimated to be in the passenger's blind spot, the control unit 500 causes a warning object to that effect to be displayed in AR.
 FIG. 14 is a flowchart of the processing that generates the warning object in S502. In S701, the control unit 500 acquires environment information around the position of the vehicle 106. The environment information is, for example, information on the presence of moving bodies, such as pedestrians, bicycles, or vehicles, around the position of the vehicle 106. The control unit 500 may acquire it from a communication device 108 such as a roadside unit, or from an external database server. Alternatively, the control unit 500 may obtain information on the presence of moving bodies from the vehicle information; for example, it may acquire image data from a camera mounted near the tip of the chute 205 of the vehicle 106.
 In S702, the control unit 500 determines, based on the environment information acquired in S701, whether a moving body has been recognized. If so, the process proceeds to S703. If not, no warning object needs to be generated, so the processing of FIG. 14 ends and the process proceeds to S503 of FIG. 12.
 In S703, the control unit 500 performs image recognition on the shooting data acquired in S203, and in S704 it determines whether a moving body has been recognized in the shooting data. If so, a warning object is generated in S705. The case that proceeds to S705 is, for example, one in which the environment information indicates that a pedestrian is ahead and to the right in the travel direction of the vehicle 106, and the pedestrian is also recognized in the shooting data from the vehicle 106. In S705, therefore, the control unit 500 generates a warning object that makes the pedestrian easy for the passenger to identify, for example by enclosing the recognized pedestrian in a rectangle. After S705, the processing of FIG. 14 ends and the process proceeds to S503 of FIG. 12.
 If, on the other hand, S704 determines that no moving body has been recognized in the shooting data, a warning object is generated in S706. The case that proceeds to S706 is, for example, one in which the environment information indicates that a pedestrian is ahead and to the right in the travel direction of the vehicle 106, but the pedestrian is not recognized in the shooting data from the vehicle 106. This can happen when, as shown in FIG. 23, the snow ahead and to the right of the vehicle 106 is several meters deep and a pedestrian beyond it is invisible to the passenger. In S706, therefore, a warning object such as warning object 1401 of FIG. 23 is generated, and a message such as "There is a person on the other side of the snow!" is displayed near it. Such a warning display lets the passenger recognize a moving body located in a blind spot of the passenger of the vehicle 106. Warning object 1401 may also be displayed dynamically, for example blinking, to make it more noticeable. After S706, the processing of FIG. 14 ends and the process proceeds to S503 of FIG. 12.
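The S702 to S706 branching can be summarized as in the sketch below; the two boolean inputs are hypothetical values derived from the environment information and from the image recognition on the shooting data, respectively.

```python
def blind_spot_warning(env_sees_mover: bool, camera_sees_mover: bool):
    """Decide which warning object to generate (S702-S706)."""
    if not env_sees_mover:
        return None                  # S702: no moving body, nothing to warn about
    if camera_sees_mover:
        return "highlight_mover"     # S705: e.g. enclose the visible pedestrian
    return "blind_spot_message"      # S706: e.g. object 1401 with its message
```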
 Referring again to FIG. 12: in S503, the control unit 500 determines, based on the selection information, whether to display the work line. If so, the process proceeds to S504 and the work line object is generated. If not, the process proceeds to S505 without generating the work line object.
 An example of work line object generation is as follows. The work line object is, for example, the snow removal work line of the vehicle 106, which is a snowplow. When the vehicle 106 is a small snowplow and the travel path is wide, the work is performed while traveling back and forth over a fixed distance. With a walk-behind snowplow, it can also be difficult to keep a straight line, depending on the snow conditions and the operator's skill. In the present embodiment, therefore, in addition to displaying the travel path and the non-travel path so that they can be distinguished, the control unit 500 displays the work line in AR, as with work line object 1101 shown in FIG. 20. In doing so, the control unit 500 generates the work line object according to the vehicle information acquired from the vehicle 106 and the environment information. For example, a medium-sized snowplow and a small snowplow require different numbers of passes for the same travel path width. In S504, therefore, the control unit 500 determines the number of passes based on the snowplow size obtained from the vehicle information and the travel path width obtained from the environment information, and generates a work line object reflecting that number, as in the sketch after this paragraph. For example, the work line object 1101 of FIG. 20 guides the vehicle to exit the travel path after three round trips. In this way, the passenger of the vehicle 106 can easily recognize a route that leaves the travel path where there is no gutter 905.
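A minimal sketch of the pass-count decision in S504 follows, assuming the clearing width comes from the vehicle information and the travel path width from the environment information; the overlap margin is an assumption.

```python
import math

def plan_work_line(road_width_m: float, clearing_width_m: float, overlap_m: float = 0.1):
    """Determine the number of passes needed to cover the travel path (S504)
    and return the lateral offset of each pass center from the road edge,
    e.g. to draw a work line object like 1101 of FIG. 20."""
    effective = max(clearing_width_m - overlap_m, 0.01)  # width cleared per pass
    passes = math.ceil(road_width_m / effective)
    return [clearing_width_m / 2 + i * effective for i in range(passes)]
```

For instance, a 3 m wide path and a 1.1 m clearing width with 0.1 m overlap would yield three passes, matching the three round trips of the FIG. 20 example.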
 The display of the work line object 1101 may be changed according to the travel of the vehicle 106. For example, when the vehicle 106 is traveling on a line that deviates greatly from the work line of work line object 1101, the work line object 1101 may be blinked in red and a message such as "Off the line" may be displayed. After S504, the process proceeds to S505.
 In S505, the control unit 500 determines, based on the selection information, whether to display information about the snow cover. If so, the process proceeds to S506 and the snow information object is generated. If not, the process proceeds to S507 without generating the snow information object.
 An example of snow information object generation is as follows. Snow quality, such as the density of the accumulated snow, can vary from place to place, and such differences can affect not only the travel of the vehicle 106 but also the snow removal work of the snowplow. In the present embodiment, therefore, in addition to displaying the travel path and the non-travel path so that they can be distinguished, the control unit 500 displays differences in snow quality in AR, as with snow information objects 1201, 1202, and 1203 shown in FIG. 21. The control unit 500 generates the snow information objects 1201, 1202, and 1203 according to the snow quality information; for example, they indicate, in that order, increasing snow density (increasing hardness). The snow quality information may be obtained from the vehicle information (sensor information) from the vehicle 106 or from the environment information. A display like FIG. 21 lets the passenger of the vehicle 106 easily recognize, for example, that the snow becomes harder with distance from the travel path. Although the display of object 1005 is omitted in FIG. 21 for clarity of explanation, object 1005 may be displayed as well.
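The snow information objects could be generated by binning a density estimate into classes, as sketched below; the band edges are illustrative assumptions, not values from the embodiment.

```python
def snow_quality_class(density_kg_m3: float) -> str:
    """Bin a snow density estimate into the three classes rendered as
    objects 1201, 1202, and 1203 (increasing hardness)."""
    if density_kg_m3 < 150:
        return "object_1201"  # fresh, low-density snow
    if density_kg_m3 < 350:
        return "object_1202"  # settled snow
    return "object_1203"      # hard, high-density snow
```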
 The information about the snow cover is not limited to snow quality; other information, such as the snow depth, may be used. In that case, in addition to displaying the travel path and the non-travel path so that they can be distinguished, the control unit 500 displays the snow depth information in AR. As a form of this AR display, for example, a numerical label such as "1 m" may be placed next to an object recognized in the shooting data as a hill-shaped mound of snow, or contour lines (a 50 cm line, a 1 m line, and so on) may be superimposed on that object. After S506, the process proceeds to S507.
 In S507, the control unit 500 determines, based on the selection information, whether to display the snow removal amount. If so, the process proceeds to S508 and the snow removal amount object is generated. If not, the snow removal amount object is not generated, the processing of FIG. 12 ends, and the process proceeds to S206 of FIG. 8.
 An example of snow removal amount object generation is as follows. FIG. 15 is a flowchart showing an example of the processing that generates the snow removal amount object. In S801, the control unit 500 determines, based on the vehicle information from the vehicle 106, whether snow removal is in progress. If so, the process proceeds to S802. If not, for example because the vehicle is stopped, the processing of FIGS. 15 and 12 ends and the process proceeds to S206 of FIG. 8.
 In S802, the control unit 500 acquires snow removal amount information. For example, the control unit 500 may obtain the engine torque from the vehicle information of the vehicle 106 as the snow removal amount information. In S803, the control unit 500 then estimates the snow removal amount based on the acquired engine torque.
 In S804, the control unit 500 generates a snow removal amount object according to the snow removal amount estimated in S803. The snow removal amount object is an object that shows a numerical value, for example "10 t/h", displayed like a score at the edge of the screen. Such a display can give the passenger a game-like experience or be used to convert the snow removal work into income. After S804, the processing of S801 is repeated. In parallel with the execution of S801 through S804, the processing from S206 of FIG. 8 onward is executed.
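A sketch of the S802 to S804 estimation follows, assuming a linear mapping from engine torque to throughput; the coefficient and idle offset are placeholders, not calibrated values from the embodiment.

```python
TORQUE_TO_TONS_PER_HOUR = 0.05  # assumed calibration coefficient

def estimate_removal_rate(engine_torque_nm: float, idle_torque_nm: float = 20.0) -> float:
    """Estimate snow removal throughput in t/h (S803) from the engine
    torque obtained as snow removal amount information (S802)."""
    load = max(engine_torque_nm - idle_torque_nm, 0.0)  # torque above idle
    return load * TORQUE_TO_TONS_PER_HOUR  # e.g. rendered as "10 t/h" (S804)
```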
 The control unit 500 may accumulate the snow removal amount information acquired in S802 in the storage unit 501 as snow removal amount information 519 and use it for maintenance notifications. For example, based on the accumulated snow removal amount information 519, the control unit 500 may notify the vehicle 106 that it is time to replace consumables such as the battery.
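The accumulation and the consumables notification could look like the sketch below; the service threshold is an illustrative assumption.

```python
class RemovalLog:
    """Accumulate snow removal amount information (519) and notify the
    vehicle when a consumable-replacement threshold is reached."""
    def __init__(self, battery_service_tons: float = 500.0):
        self.total_tons = 0.0
        self.battery_service_tons = battery_service_tons  # assumed threshold

    def record(self, tons: float, notify) -> None:
        """Add one measurement; fire the notification when due."""
        self.total_tons += tons
        if self.total_tons >= self.battery_service_tons:
            notify("Time to replace consumables such as the battery.")
            self.total_tons = 0.0  # restart the count after servicing
```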
 As described above, according to the present embodiment, when the vehicle travels on a snow surface, the passenger can easily recognize the travel path and the non-travel path. Although the present embodiment, as shown in FIG. 1, describes a configuration in which the server 101 is a separate device from the vehicles 104, 105, and 106 and the portable terminal 107, at least part of the server 101 may instead be implemented in the vehicles 104, 105, and 106 or the portable terminal 107, so as to improve the real-time performance from the acquisition of the shooting data to the generation and display of the AR display data.
 <Summary of Embodiment>
 The guide display method of the above embodiment includes: a first acquisition step (S203) of acquiring shooting data representing a state after snow accumulation at a predetermined position; a second acquisition step (S204) of acquiring, based on the shooting data acquired in the first acquisition step, image data corresponding to the post-snow state represented by the shooting data from a first storage means (514) that stores image data representing the state before snow accumulation at the predetermined position; and a generation step (S205) of generating, based on the shooting data acquired in the first acquisition step and the image data acquired in the second acquisition step, display data for displaying a guide for traveling in the post-snow state represented by the shooting data.
 With such a configuration, when the vehicle travels on a snow surface, the passenger can easily recognize the travel path and the non-travel path.
 The guide includes an object representing the travel path and an object representing the non-travel path. The object representing the non-travel path includes, for example, a gutter.
 With such a configuration, for example, the passenger can easily recognize the gutter.
 The second acquisition step acquires from the first storage means the image data including the objects recognized in the shooting data acquired in the first acquisition step (FIG. 9).
 With such a configuration, image data representing the pre-snow state can be acquired based on the objects.
 The guide display method further includes a determination step (S402) of determining whether there is a directional deviation between the shooting data acquired in the first acquisition step and the image data acquired from the first storage means, and a correction step (S403) of correcting, when the determination step determines that there is a directional deviation, the image data acquired from the first storage means so as to eliminate the deviation; the second acquisition step acquires the image data corrected in the correction step.
 With such a configuration, even if the direction of the image data representing the pre-snow state deviates from that of the shooting data, the image data can still be used to generate the display data.
 The guide display method further includes a display control step (S206, S207) of causing a display device to display, based on the display data generated in the generation step, a screen in which the guide is superimposed on the post-snow state. The display device (313) is mounted on a moving body.
 With such a configuration, a screen in which the guide is superimposed on the post-snow state can be displayed on the display device.
 The guide display method further includes a third acquisition step (FIG. 13) of acquiring information on the tilt of the moving body, and the generation step further generates an object for notifying that the tilt of the moving body is at or above a predetermined value.
 With such a configuration, a notification can be given when the tilt of the moving body reaches or exceeds the predetermined value.
 The moving body is, for example, a snowplow. The guide display method further includes a fourth acquisition step (FIG. 15) of acquiring information on the amount of snow removed by the snowplow, and the generation step further generates an object for notifying the snow removal amount information. The guide display method further includes a storage step of storing the snow removal amount information acquired in the fourth acquisition step in a second storage means (519).
 With such a configuration, the vehicle, which is a snowplow, can be notified of the snow removal amount information.
 The present invention is not limited to the above embodiment, and various changes and modifications are possible without departing from the spirit and scope of the present invention. Accordingly, the following claims are attached in order to make the scope of the present invention public.
 100 guide display system: 101 server: 104, 105, 106 vehicle: 107 portable terminal: 313 display unit: 416 display device: 500 control unit

Claims (12)

  1.  A guide display method comprising:
     a first acquisition step of acquiring shooting data representing a state after snow accumulation at a predetermined position;
     a second acquisition step of acquiring, based on the shooting data acquired in the first acquisition step, image data corresponding to the post-snow state represented by the shooting data from a first storage means that stores image data representing a state before snow accumulation at the predetermined position; and
     a generation step of generating, based on the shooting data acquired in the first acquisition step and the image data acquired in the second acquisition step, display data for displaying a guide for traveling in the post-snow state represented by the shooting data.
  2.  The guide display method according to claim 1, wherein the guide includes an object representing a travel path and an object representing a non-travel path.
  3.  The guide display method according to claim 2, wherein the object representing the non-travel path includes a gutter.
  4.  The guide display method according to any one of claims 1 to 3, wherein the second acquisition step acquires from the first storage means the image data including an object recognized in the shooting data acquired in the first acquisition step.
  5.  The guide display method according to any one of claims 1 to 4, further comprising:
     a determination step of determining whether there is a directional deviation between the shooting data acquired in the first acquisition step and the image data acquired from the first storage means; and
     a correction step of correcting, when the determination step determines that there is a directional deviation, the image data acquired from the first storage means so as to eliminate the deviation,
     wherein the second acquisition step acquires the image data corrected in the correction step.
  6.  The guide display method according to any one of claims 1 to 5, further comprising a display control step of causing a display device to display, based on the display data generated in the generation step, a screen in which the guide is superimposed on the post-snow state.
  7.  The guide display method according to claim 6, wherein the display device is mounted on a moving body.
  8.  The guide display method according to claim 7, further comprising a third acquisition step of acquiring information on a tilt of the moving body, wherein the generation step further generates an object for notifying that the tilt of the moving body is at or above a predetermined value.
  9.  The guide display method according to claim 7 or 8, wherein the moving body is a snowplow.
  10.  The guide display method according to claim 9, further comprising a fourth acquisition step of acquiring information on an amount of snow removed by the snowplow, wherein the generation step further generates an object for notifying the snow removal amount information.
  11.  The guide display method according to claim 10, further comprising a storage step of storing the snow removal amount information acquired in the fourth acquisition step in a second storage means.
  12.  A guide display method executed in a guide display system including a server and a vehicle capable of communicating with the server, wherein
     the server performs:
     a first reception step of receiving, from the vehicle, shooting data representing a state after snow accumulation at a predetermined position;
     an acquisition step of acquiring, based on the shooting data received in the first reception step, image data corresponding to the post-snow state represented by the shooting data from a storage means that stores image data representing a state before snow accumulation at the predetermined position;
     a generation step of generating, based on the shooting data received in the first reception step and the image data acquired in the acquisition step, display data for displaying a guide for traveling in the post-snow state represented by the shooting data; and
     a first transmission step of transmitting the display data generated in the generation step to the vehicle; and
     the vehicle performs:
     a second transmission step of transmitting the shooting data captured by a photographing means to the server;
     a second reception step of receiving the display data from the server; and
     a display step of displaying, based on the display data received in the second reception step, a screen in which the guide is superimposed on the post-snow state.
PCT/JP2019/031863 2019-08-13 2019-08-13 Guide display method WO2021029016A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021539747A JP7212788B2 (en) 2019-08-13 2019-08-13 Guide display method
PCT/JP2019/031863 WO2021029016A1 (en) 2019-08-13 2019-08-13 Guide display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/031863 WO2021029016A1 (en) 2019-08-13 2019-08-13 Guide display method

Publications (1)

Publication Number Publication Date
WO2021029016A1 true WO2021029016A1 (en) 2021-02-18

Family

ID=74569552

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/031863 WO2021029016A1 (en) 2019-08-13 2019-08-13 Guide display method

Country Status (2)

Country Link
JP (1) JP7212788B2 (en)
WO (1) WO2021029016A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006350617A (en) * 2005-06-15 2006-12-28 Denso Corp Vehicle driving support apparatus
JP2010257253A (en) * 2009-04-24 2010-11-11 Denso Corp Display device for vehicle
JP2012158950A (en) * 2011-02-02 2012-08-23 Nichijo Manufacturing Co Ltd Monitoring system for rotary snow blower and rotary snow blower
JP2013235378A (en) * 2012-05-08 2013-11-21 Toyota Motor Corp Vehicle information providing device
JP2015164001A (en) * 2014-02-28 2015-09-10 株式会社丸山製作所 guide device
WO2016051447A1 (en) * 2014-09-29 2016-04-07 三菱電機株式会社 Information display control system and information display control method
JP2017081378A (en) * 2015-10-27 2017-05-18 矢崎総業株式会社 Drive assisting device for vehicle
JP2018097431A (en) * 2016-12-08 2018-06-21 株式会社デンソーテン Driving support apparatus, driving support system and driving support method

Also Published As

Publication number Publication date
JPWO2021029016A1 (en) 2021-02-18
JP7212788B2 (en) 2023-01-25

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19941384

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021539747

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19941384

Country of ref document: EP

Kind code of ref document: A1