WO2021172514A1 - Information output device and control program


Info

Publication number: WO2021172514A1 (PCT/JP2021/007330)
Authority: WO (WIPO, PCT)
Prior art keywords: detection, detection position, unit, point, passage
Application number: PCT/JP2021/007330
Other languages: English (en), Japanese (ja)
Inventor: 和行 泉
Original Assignee: 株式会社 ミックウェア
Application filed by 株式会社 ミックウェア
Priority to JP2022503748A (JPWO2021172514A1)
Publication of WO2021172514A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 - Map spot or coordinate position indicators; Map reading aids

Definitions

  • the present invention relates to an information output device and a control program.
  • Patent Document 1 discloses a reverse-way driving determination device in which a first reverse-way driving determination area, a second reverse-way driving determination area, and a third reverse-way driving determination area are set along an expressway in order to notify a vehicle of reverse-way driving. These reverse-way driving determination areas are set in the order opposite to the traveling direction of the expressway.
  • In the reverse-way driving determination device disclosed in Patent Document 1, when a vehicle passes through the three reverse-way driving determination areas in the order of the first, second, and third reverse-way driving determination areas, it is determined that the vehicle is driving in reverse.
  • Patent Document 2 discloses a technique for determining whether or not the current position and a reference point are located on opposite sides of the boundary line of a region, in order to determine the region to which the current position belongs.
  • In this technique, the reference point is set outside the region. Specifically, in the technique disclosed in Patent Document 2, it is detected whether or not the boundary line of the region intersects the line segment connecting the current position and the reference point, and whether or not the current position belongs to the region is determined based on the number of intersections.
  • In the reverse-way driving determination device disclosed in Patent Document 1, it is necessary to set a plurality of reverse-way driving determination areas in order to determine that a vehicle is driving in reverse. This is because the passing direction in which the vehicle has passed a reverse-way driving determination area cannot be determined by setting only one such area.
  • The technique disclosed in Patent Document 2 can determine that the boundary line of the region intersects the line segment connecting the current position and the reference point, but it cannot, for example, determine the passing direction in which a vehicle has passed the boundary line. A technique capable of determining the passing direction in which a vehicle or the like has passed a boundary line could be applied to various purposes such as reverse-way driving detection. Therefore, a new technique capable of detecting that a vehicle or the like has passed a virtual boundary line is desired.
  • In the reverse-way driving determination device disclosed in Patent Document 1, there is also the problem that the larger the reverse-way driving determination areas are set, the larger the amount of data stored in the device becomes. This is because the reverse-way driving determination device disclosed in Patent Document 1 includes all the information regarding the reverse-way driving determination areas in the map data and stores it in the storage unit.
  • In an information output device such as a car navigation device, it is not desirable, from the viewpoint of the operation of the device and the required capacity of the storage unit, for the amount of stored data to be large.
  • One aspect of the present invention is to realize an information output device and a control program capable of detecting a passing direction in which a moving body has passed a virtual boundary line.
  • One aspect of the present invention is to realize an information output device and a control program capable of detecting reverse driving while suppressing the amount of data to be stored.
  • An information output device according to one aspect of the present invention includes a position acquisition unit, a storage unit, and a control unit.
  • The position acquisition unit acquires position data. The position data is acquired at predetermined time intervals.
  • The storage unit stores point data.
  • The point data indicates each of two points. The two points are arbitrarily set.
  • The control unit detects a passing direction.
  • The control unit outputs predetermined information to an output unit.
  • The passing direction is the direction in which the position, changing sequentially, passes between the two points.
  • The position is the position indicated by the position data, and the position data is the position data acquired by the position acquisition unit.
  • An information output device according to another aspect of the present invention includes a position acquisition unit, a communication unit, an output unit, and a control unit.
  • The position acquisition unit acquires position data. The position data is acquired at predetermined time intervals.
  • The communication unit performs communication.
  • The output unit outputs reverse-way driving notification information.
  • The reverse-way driving notification information indicates reverse-way driving.
  • The control unit controls the communication unit and the output unit.
  • The position is the position indicated by the position data, and the position data is the position data acquired by the position acquisition unit.
  • The point data indicates a detection position. The detection position is set in the passage.
  • The point data also indicates another detection position. The other detection position is set in the reverse traveling direction from the detection position.
  • The control unit outputs the reverse-way driving notification information to the output unit.
  • According to one aspect of the present invention, reverse-way driving can be detected while suppressing the amount of data to be stored.
  • FIG. 1 is a schematic block diagram showing an information output system according to Embodiment 1 of the present invention.
  • It is a diagram for explaining the process of reverse-run detection by the information output system according to Embodiment 1 of the present invention.
  • It is a diagram schematically showing the configuration of a server.
  • It is an explanatory diagram explaining the outline of a location trigger using a detection position.
  • It is a diagram schematically showing the configuration of a user terminal.
  • It is an explanatory diagram for explaining a detection position.
  • It is a flowchart showing the procedure of passage detection processing for detecting the passage of a detection position by a vehicle and the passing direction.
  • It is an explanatory diagram for explaining the passage detection processing shown in FIG. 7.
  • It is an explanatory diagram for explaining the passage detection processing shown in FIG. 7.
  • It is a diagram showing Example 1 according to Embodiment 2 of the present invention.
  • It is a diagram showing Example 2 according to Embodiment 2 of the present invention.
  • It is a diagram showing Example 3 according to Embodiment 2 of the present invention.
  • It is a flowchart showing the procedure of passage detection processing for detecting the passage of a detection position by a vehicle and the passing direction. It is an explanatory diagram for explaining the passage detection processing shown in that flowchart.
  • It is a hardware block diagram showing an example of a computer that realizes the functions of the user terminal.
  • FIG. 1 is a schematic configuration diagram showing an information output system 1 according to the first embodiment of the present invention.
  • the information output system 1 includes a user terminal 100 and a server 10.
  • the user terminal 100 corresponds to an information output device.
  • the user terminal 100 and the server 10 are connected to each other via a network 30 so as to be able to communicate by wire or wirelessly.
  • the number of each device included in the information output system 1 is not particularly limited.
  • the information output system 1 may be configured to include two or more user terminals 100.
  • the information output system 1 may be configured to include two or more servers 10.
  • the user terminal 100 is a terminal attached to a moving body such as a vehicle.
  • the user terminal 100 is, for example, a mobile terminal or a personal computer.
  • the mobile terminal is, for example, a mobile phone, a smart phone, a tablet, a notebook computer, a laptop computer, or a wearable computer.
  • the moving body is, for example, a person, a vehicle, a ship, an aircraft, an underwater probe, or an unmanned aerial vehicle.
  • the unmanned aerial vehicle is sometimes called a "drone".
  • Vehicles are, for example, automobiles, motorcycles, bicycles, and trains.
  • the user terminal 100 may be a mobile terminal held by a mobile body such as a human being.
  • the detection position is set to a position corresponding to a line segment connecting two different points.
  • the shape of the detection position is set linearly.
  • in the present embodiment, a mechanism capable of detecting that a moving body has passed a linearly set detection position is referred to as a "boundary type geo-fence".
  • in the present embodiment, the user terminal 100 is a car navigation device, and the predetermined information is reverse-way driving detection information indicating that the vehicle is traveling in reverse in the passage. Further, in the present embodiment, the user terminal 100 outputs, as predetermined information, reverse-way driving caution information indicating that the vehicle may be traveling in reverse in the passage.
  • a passage means a road or space through which a moving body can pass freely. In this embodiment, the passages are highways and general roads. The passage may be a road other than a highway and a general road.
  • FIG. 2 is a diagram for explaining the reverse-way driving detection process by the information output system 1 of the present embodiment.
  • the information output system 1 detects the reverse driving of the vehicle by using the boundary type geo-fence.
  • the information output system 1 outputs information that calls attention to a vehicle that may run in reverse.
  • the information output system 1 outputs information for notifying the reverse-way driving to the vehicle running in the reverse direction.
  • the example shown in FIG. 2 is a map showing an interchange on an expressway and the area around the interchange. In the information output system 1, a plurality of detection positions F are set.
  • in the example shown in FIG. 2, the detection position F1, the detection position F2, the detection position F3, the detection position F4, the detection position F5, the detection position F6, the detection position F7, and the detection position F8 are set.
  • the detection position F can be said to be a virtual boundary line set at an arbitrary position.
  • detection position F will be used as a generic term for the detection positions F1 to F8 and each detection position described later.
  • the detection position F is set by, for example, a business operator that provides the content. That is, the detection position F is set by determining each point at both ends by the business operator who provides the content. The business operator sets the detection position F, for example, at a place where the vehicle has run backwards in the past. Point data indicating each point at both ends of the detection position F is stored in the server 10.
  • FIG. 3 is a diagram schematically showing the configuration of the server 10. As shown in FIG. 3, the server 10 includes a server communication unit 11, a data storage unit 12, and a server control unit 13.
  • the server communication unit 11 is realized by, for example, a NIC (Network Interface Card) or the like.
  • the server communication unit 11 is connected to the network 30 shown in FIG. 1 by wire or wirelessly.
  • the server communication unit 11 transmits and receives information to and from the user terminal 100.
  • the data storage unit 12 is realized by a storage device.
  • examples of the storage device include semiconductor memory elements such as a RAM (Random Access Memory) and a flash memory, a hard disk, an optical disk, and the like.
  • the data storage unit 12 stores the content DB (DATABASE) 121 and the location trigger DB 122.
  • the content DB 121 is a database for accumulating content data.
  • This content data includes reverse-way driving notification information and reverse-way driving caution information. That is, the content DB 121 stores content data for notifying the user of reverse-way driving or of the possibility of reverse-way driving.
  • the content DB 121 may store either the reverse-way driving notification information or the reverse-way driving caution information. Further, the content DB 121 may store information other than these information.
  • the reverse-way driving notification information and the reverse-way driving caution information are at least one of character data, image data, and moving image data.
  • the user terminal 100 outputs the reverse-way driving notification information by displaying the reverse-way driving notification information on the screen. Further, the user terminal 100 outputs the reverse-way driving caution information by displaying the reverse-way driving caution information on the screen.
  • the reverse-way driving notification information and the reverse-way driving caution information may be voice data.
  • the audio data is information for performing TTS (text to speech) or data in a normal audio format. TTS is also called text-to-speech synthesis information.
  • the user terminal 100 outputs the reverse-way driving notification information by transmitting a voice based on the reverse-way driving notification information from the speaker. Further, the user terminal 100 outputs the reverse-way driving caution information by transmitting a voice based on the reverse-way driving caution information from the speaker.
  • the reverse-way driving notification information may be output from both the screen and the speaker of the user terminal 100. Further, the reverse-way driving caution information may be output from both the screen of the user terminal 100 and the speaker.
  • the location trigger DB 122 is a database that stores content data related to the location trigger.
  • FIG. 4 is an explanatory diagram illustrating an outline of the location trigger D using the detection position F. As shown in FIG. 4, the location trigger D is a mechanism in which a specific action is executed by a combination of the detection position F and the trigger action C associated with the detection position F. However, although not essential, the combination may include a state B associated with the detection position F.
  • State B is the state of the vehicle when the vehicle passes the detection position F.
  • the state B is, for example, a state of the vehicle relating to at least one of the speed of the vehicle, the direction in which the vehicle travels, the type of the passage through which the vehicle travels, the time zone in which the vehicle travels, and the like.
  • the state B is, for example, a state in which the vehicle is traveling at a speed of 100 km / h.
  • State B is, for example, a state in which the vehicle is traveling due north.
  • the state B is, for example, a state in which the vehicle is traveling on a road whose passage type is a general road.
  • the state B is, for example, a state in which the vehicle is traveling in the time zone from 7:00 to 12:00.
  • the state B is, for example, a state in which the vehicle has passed the detection position F from the left side to the right side.
  • Trigger action C is an action executed by the user terminal 100 when a trigger condition is satisfied. Examples of actions to be executed include enabling or disabling another location trigger D, deleting another location trigger D, requesting the server 10 to download data related to a new location trigger D, reading a location trigger D stored in the user terminal 100 itself, and playing back content.
  • the trigger condition is that the user terminal 100 detects that the vehicle has passed the detection position F and that the vehicle is in the state B.
  • the content data accumulated in the location trigger DB 122 includes data related to the detection position F, data related to the state B, and data related to the trigger action C.
  • the data related to the detection position F indicates the shape of the detection position F, the coordinates in which the detection position F is set, the state of the detection position F, and the like.
  • Examples of the shape of the detection position F include the above-mentioned linear shape, as well as a circular shape, a polygonal shape, and a point. The circular, polygonal, and point shapes will be described later.
  • the coordinates for which the detection position F is set are represented by two or more point data.
  • the point data is data indicating each point at both ends of the detection position F. Each point is represented by, for example, latitude and longitude.
  • the point data is set for each detection position F.
  • the state of the detection position F is represented by the state data.
  • the state data is data indicating whether or not the user terminal 100 can detect the detection position F.
  • the state data is set to the valid state or the invalid state.
  • the state indicated by the state data can be switched from the valid state to the invalid state.
  • the state indicated by the state data can be switched from the invalid state to the valid state.
  • Each of the server 10 and the user terminal 100 is configured so that the state of the state data can be switched.
  • the data related to the state B indicates the speed of the vehicle, the direction in which the vehicle travels, the type of road on which the vehicle travels, and the like.
  • the data related to the trigger action C is information indicating the trigger action C.
  • for example, the data related to the trigger action C is a URL (Uniform Resource Locator) that is interpreted and used by the user terminal 100, and indicates that the trigger data is to be transmitted to the server 10.
  • the trigger data is the data transmitted to the server 10 as a result of the trigger action shown in FIG. 4. That is, the trigger data is information sent to the server 10 when the trigger condition is satisfied.
  • the trigger data is vehicle position data, time data, speed data, direction data, and the like.
  • the time data is data indicating the time when the position data of the vehicle is acquired.
  • the speed data is data indicating the speed of the vehicle.
  • the direction data is data indicating the direction in which the vehicle travels.
  • when information is to be notified from the user terminal 100 to the server 10, the data related to the trigger action C is a URL that is interpreted and used by the user terminal 100, and the trigger data is notified from the user terminal 100.
  • when another location trigger is to be enabled, disabled, or deleted, the data related to the trigger action C includes data indicating the location trigger name and data indicating the operation.
  • when image content is to be reproduced on the user terminal 100, the data related to the trigger action C is the URL of the image, which is interpreted and used by the user terminal 100. Similarly, when the user terminal 100 reproduces audio content, the user terminal 100 interprets and uses the URL of the audio.
  • for example, the location trigger D stored in the location trigger DB 122 shown in FIG. 3 is a combination of the detection position F1 shown in FIG. 4, the state B associated with the detection position F1, and the trigger action C associated with the detection position F1.
  • the user terminal 100 may acquire the data related to the location trigger D from the server 10. Alternatively, the user terminal 100 may read and use the data related to the location trigger D stored in the user terminal 100 itself, without going through the network 30.
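  • As an illustration of the combination described above, the following is a minimal sketch of how a location trigger record, such as one stored in the location trigger DB 122, could be represented. The Python representation, the field names, and the example values are assumptions made for illustration, not the actual data format of the system.

        from dataclasses import dataclass, field
        from typing import Optional

        @dataclass
        class DetectionPosition:
            """Boundary type detection position F defined by two end points (latitude, longitude)."""
            point_a: tuple          # first point A, e.g. (35.0000, 135.0000)
            point_b: tuple          # second point B, e.g. (35.0005, 135.0000)
            enabled: bool = True    # state data: valid (True) or invalid (False)

        @dataclass
        class LocationTrigger:
            """Location trigger D: detection position F, optional state B, and trigger action C."""
            detection_position: DetectionPosition
            state_condition: Optional[dict] = None               # state B, e.g. {"road_type": "general"}
            trigger_action: dict = field(default_factory=dict)   # trigger action C

        # Example: report trigger data to a (hypothetical) server URL when the boundary
        # is crossed while the vehicle is traveling on a general road.
        trigger = LocationTrigger(
            detection_position=DetectionPosition(point_a=(35.0000, 135.0000),
                                                 point_b=(35.0005, 135.0000)),
            state_condition={"road_type": "general"},
            trigger_action={"type": "post_trigger_data", "url": "https://example.com/trigger"},
        )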
  • the server 10 shown in FIG. 3 may further store the map DB.
  • the user terminal 100 may be configured to display the map data acquired from the server 10 on the screen.
  • the server control unit 13 is realized by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like.
  • the CPU, MPU, and the like execute various programs stored in a predetermined storage device inside the server 10 using the RAM as a work area.
  • the server control unit 13 is realized by, for example, a semiconductor integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the server control unit 13 has a server reception unit 14 and a server distribution unit 15.
  • the server reception unit 14 receives trigger data from the user terminal 100 via the server communication unit 11.
  • the server distribution unit 15 determines the content data to be distributed to the user terminal 100 based on the trigger data. After that, the server distribution unit 15 distributes the content data to the user terminal 100 via the server communication unit 11. That is, the server 10 controls the distribution content to be distributed to the user terminal 100 based on the trigger data. In the present embodiment, the server distribution unit 15 distributes the data corresponding to the detection position F1 to the user terminal 100 in advance as the content data. Then, when it is detected that the vehicle has passed the detection position F1 shown in FIG. 2, the server distribution unit 15 distributes the content data to the user terminal 100.
  • This content data includes point data, state data, reverse-way driving notification information, and reverse-way driving caution information.
  • the point data and the state data are data corresponding to the detection positions F2 to the detection position F8.
  • FIG. 5 is a diagram schematically showing the configuration of the user terminal 100.
  • the user terminal 100 includes a user terminal communication unit 110, a user input unit 120, an output unit 130, a user terminal storage unit 140, a sensor input unit 150, a GPS receiving unit 160, and a user. It has a terminal control unit 170.
  • the user terminal communication unit 110 is realized by, for example, a NIC or the like.
  • the user terminal communication unit 110 is connected to the network 30 shown in FIG. 1 by wire or wirelessly.
  • the user terminal communication unit 110 transmits / receives information to / from the server 10.
  • the user input unit 120 shown in FIG. 5 is an input device that receives various operations from the user.
  • the user input unit 120 is realized by a keyboard, a mouse, operation keys, or the like.
  • the output unit 130 is a device for displaying various information.
  • the output unit 130 is a screen of the user terminal 100.
  • the output unit 130 is realized by a liquid crystal display or the like.
  • the user input unit 120 and the output unit 130 are integrated.
  • the output unit 130 may be a speaker.
  • the output unit 130 may be both the screen of the user terminal 100 and the speaker.
  • the user terminal storage unit 140 is realized by a storage device. Examples of the storage device include semiconductor memory elements such as RAM and flash memory, hard disks, optical disks, and the like.
  • the user terminal storage unit 140 stores various types of information.
  • the user terminal storage unit 140 stores various programs and the like executed by the user terminal 100.
  • the user terminal storage unit 140 temporarily stores the control program and the like.
  • the user terminal storage unit 140 corresponds to a storage unit.
  • the processing data storage area 141 is an area for storing content data.
  • the processing data storage area 141 stores the content data corresponding to the detection position F1 shown in FIG. 2 in advance. Further, the processing data storage area 141 temporarily stores the content data corresponding to the detection positions F2 to F8 shown in FIG.
  • the sensor input unit 150 shown in FIG. 5 acquires, for example, the angular velocity and acceleration measured by the gyro sensor and the acceleration sensor.
  • the user terminal control unit 170 can improve the positioning accuracy of the vehicle position data by using the angular velocity and acceleration acquired by the sensor input unit 150 at the time of positioning by the GPS receiving unit 160.
  • the sensor input unit 150 is not an indispensable configuration requirement for the user terminal 100.
  • the GPS receiving unit 160 receives radio waves from GPS satellites by a GPS (Global Positioning System) sensor (not shown).
  • the GPS receiving unit 160 positions position data indicating the current position of the vehicle by using radio waves received by the GPS sensor.
  • the position is represented by, for example, latitude and longitude.
  • Positioning by the GPS receiving unit 160 is performed at predetermined time intervals.
  • the predetermined time interval is, for example, 1 second.
  • the GPS receiving unit 160 corresponds to a position acquisition unit.
  • the user terminal control unit 170 is realized by, for example, a CPU, an MPU, or the like.
  • the CPU, MPU, and the like execute various programs stored in the storage device inside the user terminal 100 using the RAM as a work area. These various programs are examples of control programs according to the present embodiment.
  • the user terminal control unit 170 is realized by, for example, a semiconductor integrated circuit such as an ASIC or an FPGA.
  • the user terminal control unit 170 includes a request unit 171, an acquisition unit 172, a drawing unit 173, a detection unit 174, and an output control unit 175.
  • the request unit 171 requests the server 10 to deliver the content data via the user terminal communication unit 110.
  • the acquisition unit 172 acquires content data from the server 10 via the user terminal communication unit 110.
  • the drawing unit 173 causes the output unit 130, which is the screen of the user terminal 100, to display a map or the like.
  • the detection unit 174 detects that the vehicle has passed the detection position F shown in FIG. Further, the detection unit 174 determines the passing direction in which the vehicle has passed the detection position F. When the detection unit 174 detects that the vehicle has passed the detection position F, the output control unit 175 causes the output unit 130 to output the content data.
  • FIG. 6 is an explanatory diagram for explaining the detection position F.
  • the detection position F is set by designating at least two arbitrary points.
  • the two points are the first point A and the second point B.
  • the second point B is set at a position sandwiching the passage L from the first point A.
  • the detection position F is set so as to be orthogonal to the direction in which the passage L extends.
  • the two points may be set inside the passage L instead of outside the passage L, respectively.
  • the two points may be set at positions unrelated to the passage L, not at positions sandwiching the passage L.
  • FIG. 7 is a flowchart showing the procedure of the passage detection process for detecting the passage of the detection position F by the vehicle and the passage direction.
  • 8A and 8B are explanatory views for explaining the passage detection process shown in FIG. 7.
  • the detection unit 174 calculates a vector based on the two position data and the two point data.
  • the two position data are the first position data acquired by the GPS receiving unit 160 and the second position data acquired next by the GPS receiving unit 160.
  • the first position data shows each first position P of FIGS. 8A and 8B.
  • the second position data shows each second position Q in FIGS. 8A and 8B.
  • the two point data indicate two first point A and second point B, which are both ends of each detection position F in FIGS. 8A and 8B, respectively.
  • Step S101: The detection unit 174 calculates the vector PQ from the first position P to the second position Q, the vector PB from the first position P to the second point B, and the vector PA from the first position P to the first point A.
  • the states shown in FIGS. 8A and 8B are states in which the vehicle has moved from the first position P toward the second position Q through the detection position F. If the vehicle has not passed the detection position F, the second position Q is on the same side of the detection position F as the first position P.
  • Step S102 The detection unit 174 obtains the outer product vector of the vector PQ and the vector PB.
  • this outer product vector is referred to as an outer product 1.
  • Step S103 The detection unit 174 obtains the outer product vector of the vector PQ and the vector PA.
  • this outer product vector is referred to as an outer product 2.
  • Step S104: The detection unit 174 compares the sign of the Z component of the outer product 1 with the sign of the Z component of the outer product 2.
  • the sign of the Z component of the outer product 1 is given by the direction in which a right-handed screw advances when it is turned in the direction from the vector PQ toward the vector PB.
  • the direction in which the screw advances corresponds to the Z component. More specifically, when the screw advances in the positive direction, the sign of the Z component of the outer product 1 is positive; when the screw advances in the negative direction, the sign of the Z component of the outer product 1 is negative.
  • the detection unit 174 multiplies the sign of the Z component of the outer product 1 and the sign of the Z component of the outer product 2.
  • the detection unit 174 determines whether or not the result of multiplying the sign of the Z component of the outer product 1 by the sign of the Z component of the outer product 2 is positive. If the result of the multiplication is positive, the determination in step S104 is YES and the detection unit 174 ends the passage detection process. If the determination in step S104 is YES, the detection unit 174 has not detected that the vehicle has passed the detection position F. If the result of the multiplication is not positive, the determination in step S104 is NO and the detection unit 174 advances the passage detection process to step S105.
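  • Expressed in planar (x, y) coordinates, the comparison in steps S102 to S104 can be written as follows; this is only a restatement of the above steps, with the Z axis assumed to point out of the plane of the map:

        (PQ × PB)z = (Qx − Px)(By − Py) − (Qy − Py)(Bx − Px)
        (PQ × PA)z = (Qx − Px)(Ay − Py) − (Qy − Py)(Ax − Px)

    If the product of the signs of these two Z components is positive, the first point A and the second point B lie on the same side of the straight line through the first position P and the second position Q, so the movement from P to Q cannot have crossed the detection position F.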
  • the detection unit 174 calculates the vector AB, the vector AP, and the vector AQ.
  • the vector AB is a vector from the first point A to the second point B.
  • the vector AP is a vector from the first point A to the first position P.
  • the vector AQ is a vector from the first point A to the second position Q.
  • Step S106 The detection unit 174 obtains the outer product vector of the vector AB and the vector AP.
  • this outer product vector is referred to as an outer product 3.
  • Step S107 The detection unit 174 obtains the outer product vector of the vector AB and the vector AQ.
  • this outer product vector is referred to as an outer product 4.
  • Step S108: The detection unit 174 compares the sign of the Z component of the outer product 3 with the sign of the Z component of the outer product 4. Specifically, the detection unit 174 multiplies the sign of the Z component of the outer product 3 by the sign of the Z component of the outer product 4 and determines whether or not the result is positive. If the result of the multiplication is positive, the determination in step S108 is YES and the detection unit 174 ends the passage detection process. If the determination in step S108 is YES, the detection unit 174 has not detected that the vehicle has passed the detection position F. If the result of the multiplication is not positive, the determination in step S108 is NO and the detection unit 174 advances the passage detection process to step S109.
  • If the vehicle does not pass between the first point A and the second point B but passes the extension line of the line segment connecting the first point A and the second point B, the result of multiplying the sign of the Z component of the outer product 1 by the sign of the Z component of the outer product 2 is not positive in step S104, and the result of multiplying the sign of the Z component of the outer product 3 by the sign of the Z component of the outer product 4 is positive in step S108. That is, when the vehicle passes the extension line of the line segment connecting the first point A and the second point B, the determination in step S108 is YES.
  • In this way, the detection unit 174 first determines whether or not the vehicle has passed the detection position F based on the outer products of the vectors starting from the first position P, and then obtains, in steps S106 and S107, the outer products of the vectors starting from the first point A and determines in step S108 whether or not the vehicle has passed the detection position F based on them. Therefore, the detection unit 174 does not erroneously determine that the vehicle has passed the detection position F when the vehicle has not actually passed the detection position F. Note that the detection unit 174 may determine whether or not the vehicle has passed the detection position F based on vectors starting from the second position Q instead of vectors starting from the first position P. Further, the detection unit 174 may determine whether or not the vehicle has passed the detection position F based on vectors starting from the second point B instead of vectors starting from the first point A.
  • Step S109: The detection unit 174 determines whether or not the sign of the Z component of the outer product 4 is negative. If the sign of the Z component of the outer product 4 is negative, the determination in step S109 is YES and the detection unit 174 advances the passage detection process to step S110. If the sign of the Z component of the outer product 4 is not negative, the determination in step S109 is NO and the detection unit 174 advances the passage detection process to step S111.
  • Step S110: The detection unit 174 determines that the vehicle has passed the detection position F from the second position Q toward the first position P, and ends the passage detection process. That is, the detection unit 174 determines that the passing direction in which the vehicle passed the detection position F is the direction from the second position Q toward the first position P.
  • the sign of the Z component of the outer product 4 being negative in step S109 means that the second position Q exists on the side opposite to the right-handed screw direction with respect to the vector AB. In this case, the direction in which the screw advances is downward, so the sign of the Z component of the outer product 4 is negative.
  • the presence of the second position Q on the side opposite to the right-handed screw direction indicates that the passing direction is the direction from the second position Q toward the first position P.
  • Step S111: The detection unit 174 determines whether or not the sign of the Z component of the outer product 4 is positive. If the sign of the Z component of the outer product 4 is positive, the determination in step S111 is YES and the detection unit 174 advances the passage detection process to step S112. If the sign of the Z component of the outer product 4 is not positive, the determination in step S111 is NO and the detection unit 174 advances the passage detection process to step S113.
  • Step S112: The detection unit 174 determines that the vehicle has passed the detection position F from the first position P toward the second position Q, and ends the passage detection process. That is, the detection unit 174 determines that the passing direction in which the vehicle passed the detection position F is the direction from the first position P toward the second position Q.
  • the sign of the Z component of the outer product 4 being positive in step S111 means that the second position Q exists in the right-handed screw direction with respect to the vector AB. In this case, the direction in which the screw advances is upward, so the sign of the Z component of the outer product 4 is positive.
  • the presence of the second position Q in the right-handed screw direction indicates that the passing direction is the direction from the first position P toward the second position Q.
  • Step S113 The detection unit 174 detects that the vehicle is at a position overlapping the detection position F. As a result, the detection unit 174 ends the passage detection process.
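  • The procedure of FIG. 7 can be illustrated with a short sketch in Python, assuming planar (x, y) coordinates for the positions and points; the function names are illustrative, and the passing direction is reported here simply as the sign of the outer product 4 relative to the direction from the first point A to the second point B, rather than as the directions named in steps S110 and S112.

        def cross_z(o, u, v):
            """Z component of the outer product of the vectors o->u and o->v."""
            return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])

        def detect_passage(p, q, a, b):
            """Passage detection for a move from first position P to second position Q
            across the detection position F between first point A and second point B.

            Returns None if no passage was detected, 0 if the second position Q overlaps
            the detection position F, and otherwise +1 or -1 according to the side of the
            line A->B on which Q lies (the passing direction)."""
            # Steps S101-S104: are A and B on the same side of the line P->Q?
            outer1 = cross_z(p, q, b)          # outer product 1: PQ x PB
            outer2 = cross_z(p, q, a)          # outer product 2: PQ x PA
            if outer1 * outer2 > 0:
                return None                    # same side: the detection position was not passed

            # Steps S105-S108: are P and Q on the same side of the line A->B?
            outer3 = cross_z(a, b, p)          # outer product 3: AB x AP
            outer4 = cross_z(a, b, q)          # outer product 4: AB x AQ
            if outer3 * outer4 > 0:
                return None                    # same side: the detection position was not passed

            # Steps S109-S113: the sign of the outer product 4 gives the passing direction.
            if outer4 < 0:
                return -1                      # Q ended up on the clockwise side of A->B
            if outer4 > 0:
                return +1                      # Q ended up on the counter-clockwise side of A->B
            return 0                           # Q overlaps the detection position F

        # Example: a move from P=(-1, 1) to Q=(1, 1) crossing the segment A=(0, 0) to B=(0, 2).
        print(detect_passage(p=(-1.0, 1.0), q=(1.0, 1.0), a=(0.0, 0.0), b=(0.0, 2.0)))  # prints -1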
  • As described above, the passage detection process can detect not only that the vehicle has passed the detection position F but also the passing direction in which the vehicle passed the detection position F. Therefore, the user terminal 100 can also detect the passage of the detection position F by the vehicle only when the passing direction is a predetermined direction. For example, the user terminal 100 can be configured not to detect the passage when a vehicle traveling in the passage passes the detection position F from the direction prescribed by law, and to detect the passage of the detection position F by the vehicle only when the vehicle passes the detection position F from the direction opposite to the direction prescribed by law.
  • the law includes not only norms established by the Diet but also constitutions, treaties, ordinances, ministerial ordinances, cabinet orders, and rules and orders established by bodies such as the national government, prefectures, states, or municipalities.
  • for example, the law includes the Road Traffic Act.
  • Since the boundary type geo-fence can set a detection position F with two pieces of point data, the increase in the amount of data is suppressed even if the number of set detection positions F is increased. Therefore, the information output system 1 can reduce the load on the server 10 and the user terminal 100.
  • In addition, the user terminal 100 detects the passage of the detection position F regardless of whether the vehicle passes near an end of the detection position F or near the center of the detection position F, and there is no difference in detection timing between the two cases. Therefore, the information output system 1 can easily adjust the timing of outputting the predetermined information.
  • FIG. 9 is a schematic view showing an example in which the detection position Fa and the detection position Fb are set at the branch point of the first passage H1.
  • the branch point is a position where the first passage H1 and the second passage H2 branch.
  • the detection position Fa and the detection position Fb are set based on a total of three point data.
  • the two point data corresponding to the detection position Fa indicate a first point A1 and a second point B set at a position sandwiching the first passage H1 from the first point A1.
  • the two point data corresponding to the detection position Fb indicate a first point A2 and a second point B set at a position sandwiching the second passage H2 from the first point A2, respectively.
  • the detection position Fa is set so as to be orthogonal to the direction in which the first passage H1 extends.
  • the detection position Fb is set so as to be orthogonal to the direction in which the second passage H2 extends.
  • One end of the detection position Fa and one end of the detection position Fb are at the same position. Therefore, the shape connecting the detection position Fa and the detection position Fb becomes a bent linear shape.
  • the information output system 1 can suppress an increase in the amount of data related to the detection position F.
  • the detection position F may be set in three dimensions instead of two dimensions.
  • FIG. 10 is a schematic view showing an example in which the detection position Fc is set in the space existing between the building 21 and the building 22.
  • the configuration set between the building 21 and the building 22 is only an example.
  • the place where the detection position Fc is set is not limited to between the building 21 and the building 22.
  • the user terminal 100 can detect the passage of the moving object along the direction d101 to the detection position Fc.
  • the detection position Fc is set based on the first point data indicating the first point A, the second point data indicating the second point B, and the altitude data indicating the altitude h.
  • the user terminal 100 can detect that the moving body has passed between the first point A and the second point B at a height less than the altitude h.
  • Such a detection position Fc is suitable when it is arranged in space.
  • Altitude data may be set in a range where the lower limit is a position higher than the ground surface. That is, the altitude data may be set to indicate a range of the first altitude or higher, which is higher than the ground surface, and the second altitude or lower, which is higher than the first altitude.
  • In this case, the user terminal 100 detects, among the moving bodies passing between the first point A and the second point B, a moving body that moves through the range of the first altitude or higher and the second altitude or lower.
  • the user terminal 100 does not detect a moving body that passes between the first point A and the second point B at a height lower than the first altitude or at a height higher than the second altitude.
  • for example, the altitude data is set so as to indicate a range of the first altitude or higher and the second altitude or lower when the detection position Fc is set on a road above and below which other roads pass, for example, an urban expressway.
  • the altitude data may be set in a range in which a position lower than the ground surface is a lower limit value and a position higher than the ground surface is an upper limit value.
  • Altitude data may be set in a range in which a position lower than the ground surface is set as an upper limit value and a lower limit value.
  • Such altitude data is used when the detection position Fc is set on the road provided on the seabed or underground. Roads provided on the seabed or underground are, for example, undersea tunnels.
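  • A possible way to combine the altitude data described above with the planar passage detection is to accept a detected passage only when the altitude of the moving body at the time of the crossing lies within the configured range. The helper below is a sketch under that assumption; the parameter names and the metre units are illustrative.

        def within_altitude_range(altitude_m, lower_m=None, upper_m=None):
            """Return True if the moving body's altitude lies inside the configured range.

            lower_m and upper_m correspond to the first altitude and the second altitude.
            Either bound may be omitted: for example, only an upper bound for a detection
            position Fc reaching up from the ground surface to the altitude h, or both
            bounds below the ground surface for a road such as an undersea tunnel."""
            if lower_m is not None and altitude_m < lower_m:
                return False
            if upper_m is not None and altitude_m > upper_m:
                return False
            return True

        # A detection position on an elevated road: detect only crossings between 10 m and 30 m.
        print(within_altitude_range(22.0, lower_m=10.0, upper_m=30.0))  # True
        print(within_altitude_range(5.0, lower_m=10.0, upper_m=30.0))   # False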
  • the circular geo-fence is a mechanism that can detect that a moving body has passed a detection position whose shape is set to be circular.
  • the polygonal geo-fence is a mechanism that can detect that a moving object has passed a detection position whose shape is set to be polygonal.
  • the point-type geo-fence is a mechanism that can detect that a moving object has passed a detection position set at a certain point.
  • FIG. 11 is an explanatory diagram illustrating a circular geo-fence.
  • the circular detection position F11 is set in the region including the intersection of the road.
  • the user terminal 100 detects that the vehicle has entered the detection position F11.
  • the user terminal 100 detects that the vehicle has left the detection position F11.
  • the user terminal 100 detects that the vehicle has passed the detection position F11 by detecting that the vehicle has entered and exited the detection position F11.
  • FIG. 12 is an explanatory diagram for explaining the polygonal geo-fence.
  • FIG. 13 is an explanatory diagram for explaining another example of the polygonal geo-fence.
  • the detection position F12a is set to a quadrangle.
  • the shape of the detection position F12a is not limited to a quadrangle, and may be a triangle, a pentagon, a heptagon as shown in FIG. 13, or the like.
  • the user terminal 100 can detect the entry of the vehicle into the detection position F12a. Further, the user terminal 100 can detect the exit of the vehicle from the detection position F12a.
  • the user terminal 100 detects that the vehicle has passed the detection position F12a by detecting that the vehicle has entered and exited the detection position F12a.
  • the user terminal 100 can detect the entry of the vehicle into the detection position F13a. Further, the user terminal 100 can detect the exit of the vehicle from the detection position F13a.
  • the user terminal 100 detects that the vehicle has passed the detection position F13a by detecting that the vehicle has entered and exited the detection position F13a.
  • the detection position F12a is set based on the point data indicating each of the four points. More specifically, the detection position F12a is set based on the point F12a-1, the point F12a-2, the point F12a-3, and the point F12a-4.
  • the polygonal geo-fence is set based on at least three point data. Therefore, the polygonal geo-fence requires more data to set the detection position than the boundary type geo-fence.
  • FIG. 14 is an explanatory diagram for explaining the point-type geo-fence.
  • a point-type geo-fence is a geo-fence that uses links.
  • the detection position of the point-type geo-fence is set to the node F141.
  • Node F141 is set at an end point of a road, an intersection, or the like.
  • the link F142 is a virtual line segment connecting two or more nodes F141.
  • the link F142 and the node F141 are both concepts used in map matching techniques. Since a point-type geo-fence can be set by designating one node F141 and a radius, the amount of data can be small. However, the point-type geo-fence has the problem that the designated range becomes too rough, especially when the area to be designated is elongated or large.
  • In the example of FIG. 14, the user terminal 100 detects the entry of the vehicle into the node F141 in which the detection position is set, and the exit of the vehicle from the node F141. The user terminal 100 detects the passage of the vehicle through the link F142 by detecting the entry into and exit from the node F141.
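  • Since a point-type geo-fence is specified by one node F141 and a radius, entry and exit can be checked with a simple distance comparison between successive position fixes. The sketch below uses an equirectangular approximation that is adequate for short distances; the function names, the radius, and the coordinates are assumptions for illustration.

        import math

        EARTH_RADIUS_M = 6_371_000.0

        def distance_m(lat1, lon1, lat2, lon2):
            """Approximate distance in metres between two latitude/longitude points."""
            mean_lat = math.radians((lat1 + lat2) / 2.0)
            dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * EARTH_RADIUS_M
            dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M
            return math.hypot(dx, dy)

        def exited_point_geofence(prev_pos, cur_pos, node, radius_m):
            """True if the previous fix was inside the circle around the node and the
            current fix is outside, i.e. the exit that follows an entry has been detected."""
            was_inside = distance_m(*prev_pos, *node) <= radius_m
            is_inside = distance_m(*cur_pos, *node) <= radius_m
            return was_inside and not is_inside

        # Node F141 at an intersection with a 30 m radius; fixes arrive at the predetermined interval.
        node_f141 = (35.0000, 135.0000)
        print(exited_point_geofence((35.0001, 135.0001), (35.0010, 135.0010), node_f141, 30.0))  # True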
  • the detection position F1 is set near the tollgate T101. More specifically, the detection position F1 is set on the highway side with respect to the tollgate T101. Further, the detection position F1 is set across the opposite lane R101-1, which goes out from the expressway to the general road, and the main lane R103-1, which enters the expressway from the general road. The detection position F1 is set between the first point A1-1 and the second point B1-2. The first point A1-1 and the second point B1-2 are set at positions sandwiching the opposite lane R101-1 and the main lane R103-1.
  • the user terminal 100 detects that the vehicle has passed the detection position F1 in the d101 direction.
  • the direction of d101 is opposite to the direction of the road in the opposite lane R101-1.
  • the detection position F2 is set in the opposite lane R101-2, which connects to the opposite lane R101-1 at the branch point B101.
  • the detection position F2 is set upstream of the detection position F1 with respect to the road direction of the opposite lane R101-1 and the opposite lane R101-2.
  • the detection position F2 is set between the first point A2-1 and the second point B2-2.
  • the first point A2-1 and the second point B2-2 are set at positions sandwiching the opposite lane R101-2.
  • the user terminal 100 detects that the vehicle has passed the detection position F2 in the d102 direction.
  • the d102 direction is opposite to the road direction of the opposite lane R101-1.
  • the detection position F3 is set in the opposite lane R101-2.
  • the detection position F3 is set upstream of the detection position F2 with respect to the road direction of the opposite lane R101-2.
  • the detection position F3 is set between the first point A3-1 and the second point B3-2.
  • the first point A3-1 and the second point B3-2 are set at positions sandwiching the opposite lane R101-2.
  • the user terminal 100 detects that the vehicle has passed the detection position F3 in the d103 direction.
  • the d103 direction is opposite to the road direction of the opposite lane R101-1.
  • the detection position F4 is set in the opposite lane R102-1 which is connected to the opposite lane R101-1 at the branch point B101.
  • the detection position F4 is set upstream of the detection position F1 with respect to the road direction of the opposite lane R101-1 and the opposite lane R102-1.
  • the detection position F4 is set between the first point A4-1 and the second point B4-2.
  • the first point A4-1 and the second point B4-2 are set at positions sandwiching the opposite lane R102-1.
  • the user terminal 100 detects that the vehicle has passed the detection position F4 in the d104 direction.
  • the d104 direction is opposite to the road direction of the opposite lane R102-1.
  • the detection position F5 is set in the opposite lane R102-1.
  • the detection position F5 is set upstream of the detection position F4 with respect to the road direction of the opposite lane R102-1.
  • the detection position F5 is set between the first point A5-1 and the second point B5-2.
  • the first point A5-1 and the second point B5-2 are set at positions sandwiching the opposite lane R102-1.
  • the user terminal 100 detects that the vehicle has passed the detection position F5 in the d105 direction.
  • the d105 direction is opposite to the road direction of the opposite lane R102-1.
  • the detection position F6 is set in the opposite lane R102-1.
  • the detection position F6 is set upstream of the detection position F5 with respect to the road direction of the opposite lane R102-1.
  • the detection position F6 is set between the first point A6-1 and the second point B6-2.
  • the first point A6-1 and the second point B6-2 are set at positions sandwiching the opposite lane R102-1.
  • the user terminal 100 detects that the vehicle has passed the detection position F6 in the d106 direction.
  • the d106 direction is opposite to the road direction of the opposite lane R102-1.
  • the detection position F7 is set on the main lane R103-2 connected to the main lane R103-1 at the branch point B102 and the general road G101 running in parallel with the main lane R103-2.
  • the detection position F7 is set downstream of the detection position F1 with respect to the road direction of the main lane R103-1 and the main lane R103-2.
  • the detection position F7 is set between the first point A7-1 and the second point B7-2.
  • the first point A7-1 and the second point B7-2 are set at positions sandwiching the main lane R103-1 and the main lane R103-2.
  • the user terminal 100 according to the present embodiment detects that the vehicle has passed the detection position F7 regardless of the passing direction.
  • the detection position F7 corresponds to the cancellation detection position.
  • the user terminal 100 may be configured to detect the passage of the vehicle to the detection position F7 only when the vehicle passes the detection position F7 from a predetermined direction.
  • the detection position F8 is set in the main lane R104-1. Further, the detection position F8 may be set across a general road running in parallel with the main lane R104-1. The detection position F8 is set downstream of the detection position F1 with respect to the road direction of the main lane R104-1. The detection position F8 is set between the first point A8-1 and the second point B8-2. The first point A8-1 and the second point B8-2 are set at positions sandwiching the main lane R104-1.
  • the user terminal 100 according to the present embodiment detects that the vehicle has passed the detection position F8 regardless of the passing direction. The detection position F8 corresponds to the cancellation detection position.
  • the user terminal 100 may be configured to detect the passage of the vehicle through the detection position F8 only when the vehicle passes the detection position F8 from a predetermined direction.
  • the user terminal 100 can set two reverse run detection routes and two data cancellation routes by using the detection positions F1 to F8.
  • Routes L101 and L102 correspond to reverse-way detection routes.
  • Routes L101 and L102 are routes for detecting reverse driving of the vehicle.
  • Routes L103 and L104 correspond to data cancellation routes.
  • the routes L103 and L104 are routes for deleting unnecessary content data from the user terminal 100.
  • the detection positions F2 to F8 correspond to other detection positions F.
  • the route L101 is a route through which the vehicle passes through the detection position F1, the detection position F2, and the detection position F3 in this order.
  • When the user terminal 100 detects that the vehicle has passed the detection position F1 from the direction d101, the user terminal 100 outputs the reverse-way driving caution information to the output unit 130.
  • the output of the reverse-way driving caution information is the trigger action C shown in FIG. 4.
  • When the user terminal 100 detects the passage of the vehicle through the detection position F1, the user terminal 100 acquires the content data corresponding to the detection positions F2 to F8 from the server 10. That is, the information output system 1 is configured to acquire the content data corresponding to the detection positions F within a certain range from the server 10 in advance, at a timing when communication is possible.
  • the user terminal 100 temporarily stores the content data corresponding to the detection positions F2 to the detection position F8 in the processing data storage area 141.
  • the acquisition of this content data is the trigger action C shown in FIG. 4. In this way, by collectively acquiring the content data corresponding to the detection positions F2 to F8, the information output system 1 can stably execute reverse-run detection regardless of the communication state between the user terminal 100 and the server 10.
  • the state data corresponding to the detection position F2, the detection position F4, the detection position F7, and the detection position F8 is acquired from the server 10 in the valid state.
  • the detection position F2 and the detection position F4 correspond to the second detection position.
  • the detection position F3 and the detection position F5 correspond to the third detection position set at a position at a predetermined interval from the second detection position.
  • the user terminal 100 may be configured to acquire, from the server 10, the content data corresponding to the detection position F set ahead of the vehicle each time it detects the passage of the vehicle through a detection position F. That is, the user terminal 100 may be configured to acquire content data from the server 10 in real time. Further, the user terminal 100 may be configured to store the content DB. In the case of this configuration, the user terminal 100 can acquire the content data without communicating with the server 10. Further, the user terminal 100 may have a configuration in which the above-described configurations relating to the acquisition of content data are combined.
  • When the user terminal 100 detects that the vehicle has passed the detection position F2 from the direction d102, it switches the state data corresponding to the detection position F3, which is set ahead of the vehicle, to the valid state. Switching the state data corresponding to the detection position F3 is the trigger action C shown in FIG.
  • When the user terminal 100 detects that the vehicle has passed the detection position F3 from the direction d103, it outputs reverse-way driving notification information from the output unit 130.
  • the output of the reverse run notification information is the trigger action C shown in FIG.
  • the detection position F3 corresponds to the final detection position.
  • the final detection position is the detection position set most upstream in the traveling direction defined by law on the route L101.
  • the route L102 is a route through which the vehicle passes through the detection position F1, the detection position F4, the detection position F5, and the detection position F6 in this order.
  • When the user terminal 100 detects that the vehicle that has passed the detection position F1 has passed the detection position F4 from the direction d104, it switches the state data corresponding to the detection position F5 to the valid state. Switching the state data corresponding to the detection position F5 is the trigger action C shown in FIG.
  • When the user terminal 100 detects that the vehicle has passed the detection position F5 from the direction d105, it switches the state data corresponding to the detection position F6 to the valid state. Switching the state data corresponding to the detection position F6 is the trigger action C shown in FIG.
  • When the user terminal 100 detects that the vehicle has passed the detection position F6 from the direction d106, it outputs reverse-way driving notification information from the output unit 130.
  • the output of the reverse run notification information is the trigger action C shown in FIG.
  • the detection position F6 corresponds to the final detection position.
  • the final detection position is the detection position set most upstream in the traveling direction defined by law on the route L102.
  • the route L103 is a route through which the vehicle passes through the detection position F1 and the detection position F7 in this order.
  • When the user terminal 100 detects that the vehicle that has passed the detection position F1 has passed the detection position F7, it deletes the content data stored in the processing data storage area 141.
  • the content data to be deleted is the content data acquired from the server 10 when the passage of the vehicle to the detection position F1 is detected.
  • the vehicle that passes from the detection position F1 to the detection position F7 is a vehicle that is not running in reverse. In this case, it is not necessary to output the reverse run notification information.
  • the content data is also deleted from the processing data storage area 141 when it is detected that the vehicle has moved away from the detection position F by a predetermined distance or more, in addition to when the vehicle passes through the cancellation detection position.
  • the user terminal 100 detects that the vehicle is separated from the detection position F by a predetermined distance or more, for example, based on the position data of the vehicle.
  • the detection position F7 is set across a general road that runs parallel to the main lane R104-1. Therefore, even if the content data has been transmitted from the server 10 to a vehicle that passed near the detection position F while traveling on the general road, the content data can be deleted from the processing data storage area 141 at the detection position F7.
  • the route L104 is a route through which the vehicle passes through the detection position F1 and the detection position F8 in this order.
  • When the user terminal 100 detects that the vehicle has passed the detection position F8 from the direction d108, it deletes the content data stored in the processing data storage area 141.
  • the content data to be deleted is the content data acquired from the server 10 when the passage of the vehicle to the detection position F1 is detected.
  • the vehicle that passes from the detection position F1 to the detection position F8 is a vehicle that is not running in reverse. In this case, it is not necessary to output the reverse run notification information.
  • FIG. 15 is a flowchart showing the procedure of the reverse run detection process in the information output system 1 of the present embodiment.
  • Step S201 The GPS receiving unit 160 acquires the position data of the vehicle.
  • Step S202 The detection unit 174 determines whether or not the vehicle has passed the detection position F.
  • the detection position here is the detection position F1 shown in FIG.
  • the detection unit 174 determines whether or not the vehicle has passed the detection position F1 based on the position data acquired by the GPS receiving unit 160 in step S201 and the position data previously acquired by the GPS receiving unit 160.
  • If the vehicle has passed the detection position, the determination in step S202 is YES, and the detection unit 174 advances the reverse driving detection process to step S203. If the vehicle has not passed the detection position, the detection unit 174 returns the reverse driving detection process to step S201.
  • Step S203 The request unit 171 acquires the content data corresponding to the other detection positions F from the server 10.
  • the content data is content data corresponding to the detection positions F2 to F8 shown in FIG.
  • the state data corresponding to the detection position F2, the detection position F4, the detection position F7, and the detection position F8 are valid states.
  • the state data corresponding to the detection position F3, the detection position F5, and the detection position F6 is an invalid state.
  • the trigger condition for acquiring the content data from the server 10 includes the vehicle's passing direction with respect to the detection position F, but the trigger condition may also include other traveling conditions in addition to the passing direction. This also applies to other trigger actions.
  • Step S204 The output control unit 175 causes the output unit 130 to output the reverse driving caution information.
  • Step S205 The GPS receiving unit 160 acquires the position data of the vehicle.
  • Step S206 The detection unit 174 determines whether or not the vehicle has passed the detection position F.
  • the detection positions that may be detected in this step S206 are the detection position F2, the detection position F4, the detection position F7, or the detection position F8 shown in FIG.
  • If the vehicle has passed the detection position F, the determination in step S206 is YES, and the detection unit 174 advances the reverse driving detection process to step S207. If the vehicle has not passed the detection position F, the detection unit 174 advances the reverse driving detection process to step S212.
  • Step S207 The detection unit 174 determines whether or not the detection position F whose passage is detected in step S206 is the final detection position.
  • the final detection position is the detection position F3 or the detection position F6 shown in FIG.
  • If the detection position F whose passage is detected in step S206 is the final detection position, the detection unit 174 advances the reverse run detection process to step S208. If it is not the final detection position, the detection unit 174 advances the reverse run detection process to step S209.
  • Step S208 The output control unit 175 causes the output unit 130 to output the reverse run notification information.
  • Step S209 The detection unit 174 determines whether or not the detection position F whose passage is detected in step S206 is the cancellation detection position.
  • the cancellation detection position that may be detected in this step S209 is the detection position F7 or the detection position F8 shown in FIG.
  • If the detection position F whose passage is detected in step S206 is the cancellation detection position, the detection unit 174 advances the reverse run detection process to step S210. If it is not the cancellation detection position, the detection unit 174 advances the reverse run detection process to step S211.
  • Step S210 The detection unit 174 deletes the content data from the processing data storage area 141. As a result, the reverse run detection process ends.
  • Step S211 The detection unit 174 switches the state data of the detection position F set ahead of the detection position F whose passage is detected in step S206 to the valid state.
  • Step S212 The detection unit 174 determines whether or not the current position of the vehicle is separated from the detection positions F in the valid state by a predetermined distance. When the current position of the vehicle is separated from the valid detection positions F by the predetermined distance, the detection unit 174 advances the reverse driving detection process to step S210. When it is not, the detection unit 174 returns the reverse driving detection process to step S205.
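  • The flow of steps S201 to S212 can be summarized by the following non-limiting sketch. The terminal object and its methods (get_position, passed, passed_any, and so on) are assumptions introduced only for this illustration and do not describe the disclosed implementation.

```python
# Hypothetical sketch of the reverse-run detection loop (steps S201-S212).
# The terminal interface used here is an assumption for illustration only.

def reverse_run_detection(terminal) -> None:
    # S201/S202: wait until the vehicle passes the first detection position F1
    while True:
        pos = terminal.get_position()              # S201
        if terminal.passed("F1", pos):             # S202
            break

    terminal.fetch_downstream_content()            # S203: content data for F2-F8
    terminal.output("reverse-way driving caution") # S204

    while True:
        pos = terminal.get_position()              # S205
        hit = terminal.passed_any(pos)             # S206: any valid detection position
        if hit is None:
            # S212: cancel once the vehicle is far enough from every valid position
            if terminal.far_from_valid_positions(pos):
                terminal.delete_content()          # S210
                return
            continue
        if terminal.is_final(hit):                 # S207
            terminal.output("reverse-way driving notification")  # S208
            return
        if terminal.is_cancellation(hit):          # S209
            terminal.delete_content()              # S210
            return
        terminal.enable_next(hit)                  # S211: switch next position to valid
```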
  • the information output system 1 as described above can transmit the latest information to the user terminal 100 by keeping the information stored in the server 10 up to date. Therefore, in the information output system 1, even when the position of a road is changed or a new road is provided, for example, the content data stored in the data storage unit 12 is updated, so that reverse driving of the vehicle can also be detected on these roads.
  • FIG. 16 is a diagram for explaining the state data 16 stored in the processing data storage area 141 shown in FIG.
  • FIG. 16 describes the detection position F included in the location trigger D, and the state data 16 relating to the state B and the trigger action C associated with that detection position F.
  • the state data 16 is stored in a table format.
  • the state data 16 includes valid flag data 161, ID 162, and coordinate data 163.
  • the detection unit 174 searches for the detection position F to be set using the valid flag data 161 as a search key. By using the valid flag data 161, the search of the detection position F by the detection unit 174 can be made more efficient.
  • ID 162 is an identifier given to the detection position F and to the location trigger D including that detection position F.
  • the coordinate data 163 includes the coordinate data 163-1 of the first point A and the coordinate data 163-2 of the second point B.
  • the coordinate data 163-1 of the first point A and the coordinate data 163-2 of the second point B indicate two points at both ends of each detection position F.
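  • One possible in-memory representation of this table is sketched below; the field names mirror the valid flag data 161, the ID 162, and the coordinate data 163-1 and 163-2, while the concrete Python types are assumptions introduced only for this sketch.

```python
# Hypothetical sketch of one row of the state data held in the processing data
# storage area 141: a valid flag, an identifier, and the coordinates of the two
# end points (first point A and second point B) of a detection position.
from dataclasses import dataclass
from typing import List, Tuple

LatLon = Tuple[float, float]  # (latitude, longitude)


@dataclass
class StateRecord:
    valid: bool        # valid flag data 161
    trigger_id: str    # ID 162
    point_a: LatLon    # coordinate data 163-1 (first point A)
    point_b: LatLon    # coordinate data 163-2 (second point B)


def searchable_positions(table: List[StateRecord]) -> List[StateRecord]:
    """Return only the records whose valid flag is set; searching with the valid
    flag as the key keeps the detection unit's lookup over the table cheap."""
    return [row for row in table if row.valid]
```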
  • the detection position F is set by a human.
  • the detection position F is not limited to the configuration set by a human, and may be set by a computer, for example.
  • the computer is, for example, a computer owned by a business operator that provides content.
  • FIG. 17 is a flowchart showing the procedure of the setting process for setting the detection position F using a computer.
  • a procedure for setting the detection position F1, the detection position F2, the detection position F3, the detection position F4, the detection position F5, the detection position F6, the detection position F7, and the detection position F8 shown in FIG. 2 will be described with reference to FIG.
  • Step S301 The computer detects, from the map displayed on the screen of the user terminal 100, a shape similar to a road shape on which reverse driving can occur. From the similar shape, the computer detects a travel start point at which a reverse-run pattern can occur. Examples of the travel start points include the point P18A1 in FIG. 18A, the point P18B1 in FIG. 18B, the point P18C1 in FIG. 18C, the point P18D1 in FIG. 18D, the point P18E1 in FIG. 18E, the point P18F1 in FIG. 18F, the point P18G1 in FIG. 18G, the point P18H1 in FIG. 18H, and the point P18I1 in FIG. 18I.
  • the reference numerals in FIGS. 18A, 18B, 18C, 18D, 18E, 18F, 18G, 18H and 18I are as follows.
  • Codes starting with "H": Expressways; codes starting with "R": Ramps; codes starting with "N": General roads; codes starting with "L": Vehicle travel routes; codes starting with "B": Branching locations; codes starting with "U": U-turn occurrence locations; codes starting with "SA": Service areas; codes starting with "J": Interchanges
  • Step S302 The computer searches for a route connecting the forward route to the reverse route from the detected travel start point.
  • the computer searches for multiple routes.
  • Step S303 The computer detects a point on the searched reverse route at which it can be determined that there is no surrounding road, or that the directional difference from the surrounding roads is large, so that there is little risk of misrecognition with a surrounding road.
  • the computer sets the detected point as the reverse-run detection point. However, if such a determination cannot be made even after following the route for a certain distance, the computer does not set a detection point on that route.
  • the computer may further extend the searched reverse route.
  • the computer may set the above-mentioned reverse run detection point for the extended reverse run path.
  • Step S304 The computer sets intermediate detection points, at regular intervals or at points where the direction changes, between the travel start point and the reverse-run detection point.
  • Step S305 The computer extends the route from the starting point of travel and searches for a forward route that connects only to the forward route.
  • the computer searches for multiple routes.
  • Step S306 The computer detects a point on the searched forward route at which it can be determined that there is no surrounding road, or that the directional difference from the surrounding roads is large, so that there is little risk of misrecognition with a surrounding road.
  • the computer sets the detected point as the cancel point.
  • the cancellation point corresponds to the cancellation detection position.
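  • The selection of a reverse-run detection point in steps S302 and S303 can be illustrated by the following toy sketch, which walks a directed road graph away from a travel start point and keeps the first node that is far enough from any surrounding road. The graph, the distances, and the 30 m threshold are assumptions introduced only for this sketch, not values from the disclosed procedure.

```python
# Hypothetical sketch: pick a reverse-run detection point by walking a toy road
# graph from a travel start point and stopping at the first node that is far
# enough from surrounding roads (steps S302-S303). The graph, distances, and
# threshold below are assumptions for illustration only.
from collections import deque
from typing import Dict, List, Optional

# Directed adjacency list: node -> nodes reachable along the searched reverse route.
ROAD_GRAPH: Dict[str, List[str]] = {
    "start": ["n1"], "n1": ["n2"], "n2": ["n3"], "n3": [],
}
# Distance (metres) from each node to the nearest surrounding road.
NEAREST_ROAD_M: Dict[str, float] = {"start": 5.0, "n1": 12.0, "n2": 45.0, "n3": 60.0}


def find_detection_point(start: str, min_gap_m: float = 30.0,
                         max_nodes: int = 10) -> Optional[str]:
    """Return the first node unlikely to be confused with a surrounding road,
    or None if no such node is found within the search budget (in that case no
    detection point is set on the route, as in step S303)."""
    queue, seen, visited = deque([start]), {start}, 0
    while queue and visited < max_nodes:
        node = queue.popleft()
        visited += 1
        if NEAREST_ROAD_M.get(node, 0.0) >= min_gap_m:
            return node
        for nxt in ROAD_GRAPH.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None


print(find_detection_point("start"))  # -> "n2" in this toy graph
```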
  • Embodiment 2 of the present invention will be described below.
  • the same reference numerals will be added to the members having the same functions as the members described in the above embodiment, and the description will not be repeated.
  • FIG. 19 is a schematic diagram for explaining Example 1 of the present embodiment.
  • the detection position F191 is set on the road H191.
  • the position mark P191 is a position mark indicating the position of the vehicle on the map M191.
  • the vehicle is equipped with the user terminal 100.
  • the user terminal 100 detects the passage of the user terminal 100 to the detection position F191 and the passing direction d191.
  • the user terminal 100 detects the passage of the vehicle to the detection position F191 and the passing direction d191.
  • the user terminal 100 outputs the content C191 from the output unit 130.
  • the content C191 is, for example, tourist information, traffic information, and weather information around the position where the detection position F191 is set.
  • FIG. 20 is a schematic diagram for explaining Example 2 of the present embodiment.
  • the detection position F201 is set on the road H201 and the road H202.
  • the position mark P201 is a position mark indicating the position of the vehicle on the map M201.
  • the vehicle is equipped with a user terminal 100.
  • the user terminal 100 detects the passage of the vehicle to the detection position F201 and the passing direction d201.
  • the output unit 130 outputs the content C201 and the content C202.
  • the content C201 is the content associated with the point O201.
  • the content C202 is the content associated with the point O202.
  • the content C201 and the content C202 are, for example, contents considering the passing direction d201, the road H201, or the road H202.
  • the trigger action C is specifically an action in which the user terminal 100 sets a new detection position F on the map, and an action in which the user terminal 100 sets a new location trigger D including that detection position F on the map.
  • the various contents displayed on the map are, for example, images, characters, marks, and symbols superimposed on the map.
  • the content provided to the user is, for example, audio or music emitted from a speaker, or a still image or moving image displayed on a display.
  • the trigger action C is, for example, an action in which the user terminal 100 sends an HTTP (HyperText Transfer Protocol) request to the server 10, and an action in which the user terminal 100 sends related information about the user terminal 100 to the server 10.
  • the HTTP request is, for example, a request for downloading various contents and a request for providing various services.
  • the related information is, for example, information collected by the user terminal 100 or a vehicle equipped with the user terminal 100. The information collected is, for example, an image around the user terminal 100 or the vehicle, and the average speed of the user terminal 100 or the vehicle.
  • FIG. 21 is a diagram for explaining Example 3 of the present embodiment.
  • the screen D211 in FIG. 21 is a screen of the user terminal 100 owned by the user B.
  • the detection position F211 is set on the road H211 and SA (service area).
  • the position mark P211 and the position mark P212 are position marks indicating the position of the vehicle on the map M211.
  • the map M211 is a map displayed on the screen of the user terminal 100 owned by the user A.
  • the user terminal 100 of the user A will be referred to as a user terminal A.
  • the detection position F212 is set to the road H211 and the road H212.
  • the position mark P213 is a position mark indicating the position of the vehicle on the map M212.
  • the map M212 is a map displayed on the screen of the user terminal 100 of the user B.
  • the user terminal 100 of the user B will be referred to as a user terminal B.
  • the user terminal A detects the passage of the vehicle to the detection position F211 and the passing direction d211.
  • the user terminal communication unit 110 transmits the talk data of the talk T212 to the network 30.
  • the user terminal A detects the passage of the vehicle to the detection position F211 and the passing direction d212.
  • the user terminal communication unit 110 transmits the map data of the map M211 and the talk data of the talk T212 to the network 30.
  • the user terminal B detects the passage of the vehicle to the detection position F212 and the passing direction d213.
  • the user terminal communication unit 110 transmits the talk data of the talk T211 to the network 30.
  • the configuration of the user terminal A and the user terminal B is not limited to the above, and the talk data may be generated on the server 10 side in response to a trigger action notification from the user terminal A side or the user terminal B side.
  • the notification information includes vehicle position data and data indicating a traveling situation.
  • a boundary type geo-fence is used as the geo-fence.
  • the information output system 1 is not limited to the configuration using the boundary type geo-fence.
  • the information output system 1 may be configured to use a geo-fence other than the boundary type geo-fence.
  • FIG. 22 is a flowchart showing the procedure of the passage detection process for detecting the passage and the passage direction of the detection position F12a by the vehicle in the polygonal geo-fence.
  • FIG. 23 is an explanatory diagram for explaining the passage detection process shown in FIG. 22.
  • Step S221 The detection unit 174 calculates vectors based on the two position data and the four point data.
  • the two position data are the first position data acquired by the GPS receiving unit 160 and the second position data acquired next by the GPS receiving unit 160.
  • the first position data shows the first position P in FIG.
  • the second position data shows the second position Q in FIG.
  • the four point data respectively indicate the first point F12a-A, the second point F12a-B, the third point F12a-C, and the fourth point F12a-D, which are the vertices of the detection position F12a corresponding to the polygonal geo-fence shown in FIG.
  • the detection unit 174 calculates a vector AB, a vector BC, a vector CD, a vector DA, a vector AP, a vector BP, a vector CP, a vector DP, a vector AQ, a vector BQ, a vector CQ, and a vector DQ.
  • the vector AB is a vector from the first point F12a-A to the second point F12a-B.
  • the vector BC is a vector from the second point F12a-B to the third point F12a-C.
  • the vector CD is a vector from the third point F12a-C to the fourth point F12a-D.
  • the vector DA is a vector from the fourth point F12a-D to the first point F12a-A.
  • the vector AP is a vector from the first point F12a-A to the first position P.
  • the vector BP is a vector from the second point F12a-B to the first position P.
  • the vector CP is a vector from the third point F12a-C to the first position P.
  • the vector DP is a vector from the fourth point F12a-D to the first position P.
  • the vector AQ is a vector from the first point F12a-A to the second position Q.
  • the vector BQ is a vector from the second point F12a-B to the second position Q.
  • the vector CQ is a vector from the third point F12a-C to the second position Q.
  • the vector DQ is a vector from the fourth point F12a-D to the second position Q.
  • FIG. 23 illustrates two states.
  • the first state is a state in which the vehicle at the first position P has entered the detection position F12a and has moved to the second position Q.
  • the movement from the first position P to the second position Q in the first state is shown by a solid line. If the vehicle has not entered the detection position F12a, the second position Q is on the same side as the first position P with respect to the detection position F12a.
  • the second state is a state in which the vehicle has moved from the first position P to the second position Q within the detection position F12a.
  • the movement from the first position P to the second position Q in the second state is shown by a dotted line.
  • Step S222 The detection unit 174 obtains the outer product vector of the vector AB and the vector AP.
  • this outer product vector is referred to as an outer product 5.
  • Step S223 The detection unit 174 obtains the outer product vector of the vector BC and the vector BP.
  • this outer product vector is referred to as an outer product 6.
  • Step S224 The detection unit 174 obtains the outer product vector of the vector CD and the vector CP.
  • this outer product vector is referred to as an outer product 7.
  • Step S225 The detection unit 174 obtains the outer product vector of the vector DA and the vector DP.
  • this outer product vector is referred to as an outer product 8.
  • Step S226 The detection unit 174 obtains the outer product vector of the vector AB and the vector AQ.
  • this outer product vector is referred to as an outer product 9.
  • Step S227 The detection unit 174 obtains the outer product vector of the vector BC and the vector BQ.
  • this outer product vector is referred to as an outer product 10.
  • Step S228 The detection unit 174 obtains the outer product vector of the vector CD and the vector CQ.
  • this outer product vector is referred to as an outer product 11.
  • Step S229 The detection unit 174 obtains the outer product vector of the vector DA and the vector DQ.
  • this outer product vector is referred to as an outer product 12.
  • the detection unit 174 compares the sign of the Z component of the outer product 5, the sign of the Z component of the outer product 6, the sign of the Z component of the outer product 7, and the sign of the Z component of the outer product 8.
  • the sign of the Z component of the outer product 5 is determined by the direction in which a right-handed screw advances when it is turned from the vector AB toward the vector AP.
  • the direction in which the screw advances corresponds to the Z component. More specifically, when the screw advances in the positive direction, the sign of the Z component of the outer product 5 is positive; when the screw advances in the negative direction, the sign of the Z component of the outer product 5 is negative.
  • Step S230 The detection unit 174 determines whether or not the sign of the Z component of the outer product 5, the sign of the Z component of the outer product 6, the sign of the Z component of the outer product 7, and the sign of the Z component of the outer product 8 are all negative.
  • When the detection unit 174 determines that the signs of the Z components of the outer products 5, 6, 7, and 8 are all negative, the determination in step S230 is YES, and the passage detection process proceeds to step S231.
  • When the detection unit 174 does not determine that the signs of the Z components of the outer products 5, 6, 7, and 8 are all negative, the determination in step S230 is NO, and the passage detection process proceeds to step S234.
  • Step S231 The detection unit 174 determines whether or not the sign of the Z component of the outer product 9, the sign of the Z component of the outer product 10, the sign of the Z component of the outer product 11, and the sign of the Z component of the outer product 12 are all negative.
  • When the detection unit 174 determines that the signs of the Z components of the outer products 9, 10, 11, and 12 are all negative, the determination in step S231 is YES, and the passage detection process proceeds to step S232.
  • When the detection unit 174 does not determine that the signs of the Z components of the outer products 9, 10, 11, and 12 are all negative, the determination in step S231 is NO, and the passage detection process proceeds to step S233.
  • Step S232 The detection unit 174 determines that the vehicle is in the detection position F12a. This means that the vehicle remains in the area corresponding to the detection position F12a, which is the second state. As a result, the detection unit 174 ends the passage detection process.
  • Step S233 The detection unit 174 determines that the vehicle has exited the detection position F12a. As a result, the detection unit 174 ends the passage detection process. In this passage detection process, the boundary line of the detection position F12a is treated as outside the detection position F12a. That is, when the vehicle is located at a position overlapping the boundary line of the detection position F12a, the detection unit 174 determines that the vehicle has exited from the detection position F12a.
  • Step S234 The detection unit 174 determines whether or not the sign of the Z component of the outer product 9, the sign of the Z component of the outer product 10, the sign of the Z component of the outer product 11, and the sign of the Z component of the outer product 12 are all negative.
  • When the detection unit 174 determines that the signs of the Z components of the outer products 9, 10, 11, and 12 are all negative, the determination in step S234 is YES, and the passage detection process proceeds to step S235.
  • When the detection unit 174 does not determine that the signs of the Z components of the outer products 9, 10, 11, and 12 are all negative, the determination in step S234 is NO. This means that the vehicle has not reached the detection position F12a.
  • Step S235 The detection unit 174 determines that the vehicle is in the detection position F12a.
  • Step S236 The detection unit 174 calculates the moving direction based on the first position P and the second position Q.
  • the moving direction is calculated by an inverse tangent function based on the latitude information and longitude information of the first position P and the latitude information and longitude information of the second position Q. More specifically, the moving direction is calculated by the inverse tangent function based on the difference between the latitude information of the first position P and the latitude information of the second position Q, and the difference between the longitude information of the first position P and the longitude information of the second position Q.
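  • A minimal sketch of this moving-direction calculation is shown below. It assumes a flat-earth approximation over the short distance between the two GPS fixes, and the angle convention (degrees measured counter-clockwise from east) is an assumption introduced only for this illustration.

```python
# Hypothetical sketch: moving direction from the first position P to the second
# position Q via the inverse tangent of the latitude/longitude differences, plus
# the "within +/-90 degrees of the forward direction" test of step S237.
import math


def moving_direction_deg(lat_p: float, lon_p: float,
                         lat_q: float, lon_q: float) -> float:
    """Direction of the movement from P to Q, in degrees counter-clockwise from
    east (flat-earth approximation; adequate over a short distance)."""
    dlat = lat_q - lat_p
    # Scale the longitude difference by cos(latitude) so both axes use comparable units.
    dlon = (lon_q - lon_p) * math.cos(math.radians((lat_p + lat_q) / 2.0))
    return math.degrees(math.atan2(dlat, dlon))


def within_90_deg(moving_dir: float, forward_dir: float) -> bool:
    """True when the smallest angular difference between the two directions
    is within +/-90 degrees (step S237)."""
    diff = (moving_dir - forward_dir + 180.0) % 360.0 - 180.0
    return abs(diff) <= 90.0
```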
  • Step S237 The detection unit 174 determines whether or not the difference between the moving direction and the forward direction is within ±90°.
  • the forward direction is determined by the business operator that provides the content, together with the latitude information and the longitude information of the four points that are the vertices of the detection position F12a.
  • the forward direction is, for example, the direction of travel specified by law. Alternatively, the forward direction may be the direction opposite to the direction of travel specified by law.
  • If the difference between the moving direction and the forward direction is within ±90°, the detection unit 174 advances the passage detection process to step S238. If the difference is not within ±90°, the detection unit 174 advances the passage detection process to step S239.
  • Step S238 The detection unit 174 determines that the vehicle has entered the detection position F12a from the forward direction. This means that it is the first state illustrated by the solid line in FIG. 23. As a result, the detection unit 174 ends the passage detection process.
  • Step S239 The detection unit 174 determines that the vehicle has entered the detection position F12a from a reverse direction different from the forward direction. As a result, the detection unit 174 ends the passage detection process.
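  • The inside/outside test of steps S222 to S235 and the classification of the two positions can be sketched as follows. The quadrilateral is assumed to be given in a winding order for which every edge cross product has a negative Z component for a point strictly inside, as in the description above, and a point on the boundary is treated as outside. The function names are assumptions introduced only for this sketch.

```python
# Hypothetical sketch of the polygonal geo-fence passage test (steps S222-S235):
# a point is inside the quadrilateral A-B-C-D when the Z components of the edge
# cross products are all negative; a point on the boundary counts as outside.
from typing import Tuple

Point = Tuple[float, float]  # (x, y), e.g. (longitude, latitude)


def cross_z(o: Point, a: Point, p: Point) -> float:
    """Z component of the cross product (a - o) x (p - o)."""
    return (a[0] - o[0]) * (p[1] - o[1]) - (a[1] - o[1]) * (p[0] - o[0])


def inside(quad: Tuple[Point, Point, Point, Point], p: Point) -> bool:
    a, b, c, d = quad
    return (cross_z(a, b, p) < 0 and cross_z(b, c, p) < 0 and
            cross_z(c, d, p) < 0 and cross_z(d, a, p) < 0)


def classify(quad: Tuple[Point, Point, Point, Point],
             first_p: Point, second_q: Point) -> str:
    """Mirror of the flowchart branches for the first position P and the second
    position Q: stayed inside (S232), exited (S233), entered (S235), or not reached."""
    if inside(quad, first_p):
        return "stayed inside" if inside(quad, second_q) else "exited"
    return "entered" if inside(quad, second_q) else "not reached"


# Toy example with a clockwise unit square: P outside, Q inside -> "entered".
square = ((0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0))
print(classify(square, (-0.5, 0.5), (0.5, 0.5)))
```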
  • the user terminal 100 and the server 10 are each realized by a computer 200 having a configuration as shown in FIG. 24, for example.
  • the user terminal 100 will be described as an example.
  • FIG. 24 is a hardware configuration diagram showing an example of a computer 200 that realizes the functions of the user terminal 100.
  • the computer 200 has a CPU 210, a RAM 220, a ROM 230, a storage device 240 including an HDD and the like, a communication interface (I/F) 250, an input/output interface (I/F) 260, and a media interface (I/F) 270.
  • the CPU 210 operates based on a program stored in the ROM 230 or the storage device 240, and controls each part.
  • the ROM 230 stores a boot program executed by the CPU 210 when the computer 200 is started, a program that depends on the hardware of the computer 200, and the like.
  • the storage device 240 stores a program executed by the CPU 210, data used by such a program, and the like.
  • the communication interface 250 receives data from another device via the network 30 shown in FIG. 1, for example, and sends the data to the CPU 210. Further, the communication interface 250 transmits the data generated by the CPU 210 to another device via the network 30.
  • the CPU 210 controls an output device such as a display or a printer and an input device such as a keyboard or a mouse via the input / output interface 260.
  • the CPU 210 acquires data from the input device via the input/output interface 260. Further, the CPU 210 outputs generated data to the output device via the input/output interface 260.
  • the media interface 270 reads a program or data stored in a recording medium (not shown) and provides the program or data to the CPU 210 via the RAM 220.
  • the CPU 210 loads the program from the recording medium onto the RAM 220 via the media interface 270, and executes the loaded program.
  • the recording medium is, for example, an optical recording medium such as a DVD (Digital Versatile Disc), a magneto-optical recording medium such as an MO (Magneto-Optical disk), a tape medium, a magnetic recording medium, or a semiconductor memory.
  • the CPU 210 of the computer 200 realizes the function of the user terminal control unit 170 by executing the program loaded in the RAM 220.
  • the CPU 210 of the computer 200 may acquire these programs from another device via the network 30.
  • the CPU 210 may read these programs from the recording medium.
  • the control block of the user terminal 100 may be realized by a logic circuit formed in an integrated circuit such as an IC chip or by software.
  • the control block is, in particular, a detection unit 174 and an output control unit 175.
  • Logic circuits are hardware.
  • the user terminal 100 includes a computer that executes the instructions of a program, which is software that realizes each function.
  • the computer includes, for example, one or more processors and a computer-readable recording medium that stores the program. Then, in the computer, the processor reads the program from the recording medium and executes it, thereby achieving the object of the present invention.
  • the processor for example, a CPU can be used.
  • as the recording medium, a "non-transitory tangible medium" such as a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. Further, a RAM or the like for loading the above program may be provided.
  • the program may be supplied to the computer via any transmission medium capable of transmitting the program.
  • the transmission medium is a communication network, a broadcast wave, or the like. It should be noted that one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the above program is embodied by electronic transmission.
  • the information output device includes a position acquisition unit, a storage unit, a detection unit, and an output control unit.
  • the position acquisition unit acquires position data at predetermined time intervals.
  • the storage unit stores point data indicating two different points.
  • the point data is the first point data and the second point data.
  • the first point data indicates the first point.
  • the first point is arbitrarily set.
  • the second point data indicates the second point.
  • the second point is set at a position across the passage from the first point.
  • the detection unit detects that the moving object has passed the detection position.
  • the moving body moves along the passage.
  • the detection position is a position corresponding to a line segment connecting the first point and the second point.
  • the detection unit determines the passing direction.
  • the passing direction is the direction in which the moving body has passed the detection position.
  • the output control unit outputs predetermined information to the output unit.
  • the output control unit outputs the information when the passage of the detection position by the moving body is detected.
  • when the detection unit determines that the first position and the second position are located with the line segment in between, it determines that the moving body has passed the detection position.
  • the detection unit determines that the first position and the second position are positioned with the line segment in between, based on the first position data, the second position data, the first point data, and the second point data.
  • the detection unit obtains the passing direction based on the outer product of two different vectors.
  • the two different vectors are vectors that go from the first point to the second point, the first position, or the second position, respectively.
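  • A minimal sketch of this boundary-type determination is given below: the moving body has passed the detection position when the first position P and the second position Q lie on opposite sides of the segment from the first point A to the second point B, and the sign of the cross product for P distinguishes the two passing directions. Which sign corresponds to the legally defined traveling direction is an assumption here and would in practice be fixed together with the point data.

```python
# Hypothetical sketch of the boundary-type (line segment) detection position:
# the move from P to Q has passed the segment A-B when P and Q lie on opposite
# sides of the line through A and B and, conversely, A and B lie on opposite
# sides of the line through P and Q. The passing direction follows from the
# sign of the cross product for P.
from typing import Optional, Tuple

Point = Tuple[float, float]  # (x, y), e.g. (longitude, latitude)


def cross_z(o: Point, a: Point, b: Point) -> float:
    """Z component of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])


def passage_direction(a: Point, b: Point, p: Point, q: Point) -> Optional[str]:
    """Return 'forward' or 'reverse' if the move from P to Q crosses segment A-B,
    otherwise None. Which sign counts as 'forward' is an assumption."""
    side_p = cross_z(a, b, p)
    side_q = cross_z(a, b, q)
    if side_p == 0 or side_q == 0 or (side_p > 0) == (side_q > 0):
        return None  # P and Q on the same side: no passage
    side_a = cross_z(p, q, a)
    side_b = cross_z(p, q, b)
    if side_a == 0 or side_b == 0 or (side_a > 0) == (side_b > 0):
        return None  # the motion crosses the extension of the segment, not the segment itself
    return "forward" if side_p > 0 else "reverse"


# Toy example: the segment A-B spans the lane, and P -> Q crosses it.
print(passage_direction((0.0, 0.0), (0.0, 2.0), (-1.0, 1.0), (1.0, 1.0)))  # 'forward'
```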
  • the output control unit causes the output unit to output the predetermined information according to the passing direction.
  • the output control unit outputs the above information when it is determined that the passing direction is opposite to the traveling direction specified by law.
  • the information is reverse-way notification information.
  • the reverse run notification information indicates that the moving body is running backward in the aisle.
  • the output control unit outputs the information when it is determined that the passing direction is opposite to the traveling direction specified by law.
  • the above-mentioned information is reverse-way driving caution information.
  • the reverse-way warning information indicates that the moving object may run backwards in the aisle.
  • the information output device further includes a communication unit.
  • the communication unit communicates with the server.
  • the communication unit acquires the point data from the server.
  • the point data is point data corresponding to two or more other detection positions.
  • the other detection position is a detection position set in front of the moving body.
  • the other detection positions include a second detection position and a third detection position.
  • the second detection position is set at a position at a predetermined interval from the detection position.
  • the third detection position is set at a position at a predetermined interval from the second detection position.
  • the second detection position is set to the valid state.
  • the effective state is a state detected by the detection unit.
  • the third detection position is set to the invalid state.
  • the invalid state is a state in which the detection unit does not detect it.
  • when the detection unit detects the passage of the moving body to the second detection position, it switches the setting of the third detection position to the valid state.
  • the other detection positions further include the cancellation detection position.
  • the cancellation detection position is set in the second passage.
  • the second passage is a passage that branches off from the first passage.
  • the first passage is a passage in which the second detection position and the third detection position are set.
  • the detection unit deletes the point data.
  • the point data is the point data acquired from the server. When the detection unit determines that the moving body has passed the cancellation detection position, the detection unit deletes the point data.
  • the information output device may be realized by a computer.
  • in this case, a control program of the information output device that causes the computer to realize the information output device by operating the computer as each part (software element) included in the information output device, and a computer-readable recording medium on which the control program is recorded, also fall within the scope of the present invention.
  • the information output device includes a position acquisition unit, a storage unit, an output unit, a communication unit, a detection unit, an output control unit, and a communication control unit.
  • the position acquisition unit acquires position data at predetermined time intervals.
  • the storage unit stores the point data.
  • the point data indicates an arbitrary position of the passage.
  • the output unit outputs the reverse run notification information.
  • the reverse run notification information is information indicating that the moving body is running backward in the passage.
  • the communication unit communicates with the server.
  • the detection unit detects that the moving body has passed the position based on the position data and the point data.
  • the position is a position indicated by the point data stored in the storage unit.
  • the output control unit controls the output unit.
  • the communication control unit controls the communication unit. Two or more positions are set at intervals along the passage as detection positions for detecting the passage of the moving body.
  • the communication control unit causes the communication unit to acquire point data.
  • the communication control unit acquires point data from the server.
  • the point data is point data corresponding to other detection positions. Other detection positions are set ahead of the moving object that runs in the opposite direction.
  • the communication control unit acquires the point data.
  • the output control unit causes the output unit to output the reverse run notification information.
  • the output control unit outputs the reverse run notification information when the detection unit detects the passage of the other detection position by the moving body.
  • since the information output device acquires the data necessary for detecting reverse driving from the server when the vehicle may run in the reverse direction, it can detect reverse driving while keeping the amount of data stored in the information output device small.
  • other detection positions are set in two or more passages, respectively.
  • a passage is a passage through which a moving object may enter.
  • the moving body is a moving body that has passed the detection position.
  • the information output device can detect reverse driving even in a place where two or more passages such as an interchange intersect or are close to each other.
  • the detection position includes a first detection position, a second detection position, and a third detection position.
  • the second detection position is set at a predetermined interval from the first detection position.
  • the third detection position is set at a predetermined interval from the second detection position.
  • the communication control unit causes the communication unit to acquire the point data.
  • the point data is point data corresponding to the second detection position and the third detection position, respectively.
  • the communication control unit causes the communication unit to acquire the point data.
  • the point data is set to the valid state or the invalid state.
  • the valid state is a state in which the detection unit can search.
  • the search is performed on the point data corresponding to the detection position where the moving object may pass.
  • the invalid state is a state in which the detection unit cannot search.
  • the detection unit switches the state data corresponding to the third detection position from the invalid state to the valid state. The switching of the state data is performed when the passage of the second detection position by the moving body is detected.
  • the information output device can suppress the load for the detection unit to search for the corresponding detection position by disabling a part of the two or more point data.
  • a part of the detection position is set as the cancellation detection position.
  • the cancellation detection position is a position for detecting the passage of a moving object that is not running in reverse.
  • the communication control unit causes the communication unit to acquire the point data corresponding to the cancellation detection position.
  • the cancellation detection position is set in the second passage.
  • the second passage branches off from the first passage.
  • the first passage is a passage in which the second detection position and the third detection position are set.
  • the information output device can delete the point data acquired from the server from the storage unit when the possibility of reverse driving disappears.
  • the detection unit determines the passing direction.
  • the passing direction is the direction in which the moving body has passed the detection position.
  • the detection unit transmits to the communication control unit or the output unit that the moving body has passed the detection position.
  • when the detection unit determines that the moving body has passed the detection position from the direction opposite to the traveling direction specified by law, it transmits that the moving body has passed the detection position.
  • the information output device can thereby notify the user of the reverse-running moving body by means of the reverse run notification information.
  • the information output device may be realized by a computer.
  • the computer operates as each part of the information output device, so that the control program of the information output device and the recording medium on which the control program is recorded also fall within the scope of the present invention.
  • the recording medium is computer readable.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Business, Economics & Management (AREA)
  • Mathematical Physics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

This information output device comprises: a position acquisition unit that acquires position data at prescribed time intervals; and a control unit that causes an output unit to output predetermined information if the position indicated by the position data acquired by the position acquisition unit changes sequentially and passes a prescribed detection position.
PCT/JP2021/007330 2020-02-28 2021-02-26 Dispositif de sortie d'informations et programme de commande WO2021172514A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022503748A JPWO2021172514A1 (fr) 2020-02-28 2021-02-26

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2020-034196 2020-02-28
JP2020034196 2020-02-28
JP2020-061173 2020-03-30
JP2020061173 2020-03-30

Publications (1)

Publication Number Publication Date
WO2021172514A1 true WO2021172514A1 (fr) 2021-09-02

Family

ID=77491295

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/007330 WO2021172514A1 (fr) 2020-02-28 2021-02-26 Dispositif de sortie d'informations et programme de commande

Country Status (2)

Country Link
JP (1) JPWO2021172514A1 (fr)
WO (1) WO2021172514A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001195693A (ja) * 2000-01-14 2001-07-19 Nippon Signal Co Ltd:The 車両情報送信装置および交通管理システム
JP2012003549A (ja) * 2010-06-17 2012-01-05 Toshiba Teli Corp 異常走行車両検出システムおよび道路監視プログラム
JP2016224798A (ja) * 2015-06-02 2016-12-28 株式会社デンソー 制御装置,及び支援システム

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001195693A (ja) * 2000-01-14 2001-07-19 Nippon Signal Co Ltd:The 車両情報送信装置および交通管理システム
JP2012003549A (ja) * 2010-06-17 2012-01-05 Toshiba Teli Corp 異常走行車両検出システムおよび道路監視プログラム
JP2016224798A (ja) * 2015-06-02 2016-12-28 株式会社デンソー 制御装置,及び支援システム

Also Published As

Publication number Publication date
JPWO2021172514A1 (fr) 2021-09-02

Similar Documents

Publication Publication Date Title
JP6923441B2 (ja) 注目点情報を提供する方法および装置
JP6782236B2 (ja) 注目点情報を提供する方法および装置
EP2093539A1 (fr) Dispositif de transmission d'informations de position de corps mobile pour système de navigation, et procédé et programme de transmission d'informations de position de corps mobile pour système de navigation
KR20160053971A (ko) 대안 경로를 생성하기 위한 방법 및 시스템
WO2007037281A1 (fr) Système de création de données de recherche de zones environnantes, système de recherche de zones environnantes, procédé de création de données de recherche de zones environnantes, procédé de recherche de zones environnantes et dispositif de navigation
JP2006308390A (ja) 乗車位置案内システム、経路探索サーバおよびプログラムならびに乗車位置案内端末
US20130046465A1 (en) System and method of generating a route across an electronic map
JP2009093384A (ja) Poi検索システム、経路探索サーバおよびpoi検索方法
JP2013140425A (ja) 車載器、位置情報送信方法、及び位置情報送信プログラム
WO2019171705A1 (fr) Procédé de transmission d'informations d'itinéraire, système de transmission d'informations d'itinéraire et terminal monté sur véhicule
JP5047757B2 (ja) ドライブプラン提供サーバおよびドライブプラン提供システム
WO2021172514A1 (fr) Dispositif de sortie d'informations et programme de commande
JP5773202B2 (ja) ナビゲーションシステム、ナビゲーションプログラム及びナビゲーション方法
JP3832284B2 (ja) ナビゲーションシステム及びナビゲーションプログラム
US8660788B2 (en) Navigation system with audio and method of operation thereof
KR100556688B1 (ko) 이동 통신 단말기에 지도 정보를 제공하는 방법 및 시스템
JP2004301606A (ja) 車両用障害物情報提供システム
JP2004093285A (ja) ナビゲーションシステム及び地図表示方法のプログラム
JP6121060B1 (ja) ルート映像送信装置及びルート映像送信プログラム
JP6385255B2 (ja) 経路探索システム、経路探索方法、コンピュータプログラム
JP2006527837A (ja) 経路探索方法およびシステム
JP2006127134A (ja) 複数の交通流の提供方法及び装置
JP4140361B2 (ja) ナビゲーションシステム及びプログラム
JP6298320B2 (ja) ナビゲーション装置、ナビゲーションシステム、ナビゲーション方法、およびプログラム
JP4023259B2 (ja) ナビゲーションシステム及び地図表示方法のプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21759738

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022503748

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21759738

Country of ref document: EP

Kind code of ref document: A1