WO2019208326A1 - Car Finder System - Google Patents

Car Finder System (カーファインダシステム)

Info

Publication number
WO2019208326A1
WO2019208326A1 (PCT/JP2019/016315)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
force sense
information
mobile terminal
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/016315
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
峻英 高松
隆文 岡安
大介 滑川
中村 則雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Miraisens inc
Denso Corp
Original Assignee
Miraisens inc
Denso Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Miraisens Inc. and Denso Corp.
Publication of WO2019208326A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/005 Traffic control systems for road vehicles including pedestrian guidance indicator
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • This disclosure relates to a car finder system.
  • A known car finder system transmits the position at which a vehicle is parked from an in-vehicle device to an electronic key by radio waves, and the electronic key forwards the received vehicle position to a guidance display device capable of presenting position information.
  • With such a system, the user has to watch the guidance display screen of a smartphone or the like in order to confirm the vehicle position.
  • Since gazing at the guidance display screen is likely to draw attention away from the surroundings, such a system may not be suitable for places that require attention to other vehicles, such as parking lots.
  • This disclosure is intended to provide a car finder system that can guide a user to a parked vehicle without reducing the user's attention to the surroundings.
  • The car finder system comprises an in-vehicle device mounted on a vehicle, a mobile terminal carried by a user, and a force sense presentation device. The in-vehicle device has a position measurement unit that acquires position information of the vehicle, a wireless communication unit that transmits the position information to the mobile terminal, and an in-vehicle device control unit that, when the vehicle is parked, transmits the position information acquired by the position measurement unit to the mobile terminal through the wireless communication unit.
  • The mobile terminal has a position measurement unit that measures the current position, a wireless communication unit that receives the vehicle position information from the wireless communication unit of the in-vehicle device, and a mobile control unit that calculates a guidance direction for the user from the current position information obtained by the position measurement unit and the vehicle position information. The force sense presentation device is connected to the mobile terminal and has a force sense presentation element that presents the guidance direction to the user as a force sense based on the guidance direction information from the mobile terminal.
  • When the vehicle is parked, the in-vehicle device transmits the position information acquired by its position measurement unit from its wireless communication unit to the mobile terminal.
  • The mobile terminal acquires the current position with its position measurement unit, and the mobile control unit calculates the guidance direction to the parking position of the vehicle from the current position and the vehicle position information received by the wireless communication unit and notifies the force sense presentation device, as illustrated by the sketch below.
  • The force sense presentation device presents the guidance direction as a force sense using the force sense presentation element in accordance with the guidance direction information given from the mobile terminal.
  • A user carrying the force sense presentation device can therefore approach the vehicle simply by moving in the direction indicated by the presented force sense. Because the guidance direction is conveyed through the sense of force, the user can move while paying attention to the traffic of surrounding vehicles until reaching the vehicle.
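  • As a concrete illustration of the calculation described above, the following sketch (an assumption for illustration, not code from the disclosure) computes the bearing from the user's current position to the stored parking position, which is the kind of guidance direction the mobile terminal could hand to the force sense presentation device. Both positions are assumed to be WGS84 latitude/longitude in degrees.

```python
import math

def guidance_bearing(cur_lat, cur_lon, park_lat, park_lon):
    """Initial great-circle bearing (degrees clockwise from north) from the
    user's current position toward the parked vehicle."""
    phi1, phi2 = math.radians(cur_lat), math.radians(park_lat)
    dlon = math.radians(park_lon - cur_lon)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

# Example: the vehicle is parked slightly to the north-east of the user.
print(guidance_bearing(35.0000, 139.0000, 35.0005, 139.0005))  # about 39 degrees
```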
  • FIG. 1 is an overall block diagram of the first embodiment.
  • FIG. 2 is an explanatory diagram of a usage scenario in the first embodiment.
  • FIG. 3 is a first flowchart of the processing of the in-vehicle device in the first embodiment.
  • FIG. 4 is a second flowchart of the processing of the in-vehicle device in the first embodiment.
  • FIG. 5 is a first flowchart of the processing of the smart key in the first embodiment.
  • FIG. 6 is a second flowchart of the processing of the smart key in the first embodiment.
  • FIG. 7 is a flowchart of the processing of the force sense presentation device in the first embodiment.
  • FIG. 8 is a sequence diagram of the first embodiment.
  • FIG. 9 is an overall block diagram of the second embodiment.
  • FIG. 10 is a flowchart of the processing of the smart key in the second embodiment.
  • FIG. 11 is an overall block diagram of the third embodiment.
  • FIG. 12 is a third flowchart of the processing of the smart key in the third embodiment.
  • FIG. 13 is a first flowchart of the processing of the smartphone in the third embodiment.
  • FIG. 14 is a fourth flowchart of the processing of the smart key in the third embodiment.
  • FIG. 15 is a second flowchart of the processing of the smartphone in the third embodiment.
  • FIG. 16 is a fifth flowchart of the processing of the smart key in the third embodiment.
  • FIG. 17 is a third flowchart of the processing of the smartphone in the third embodiment.
  • FIG. 18 is a sequence diagram of the third embodiment.
  • FIG. 19 is a block diagram of the force sense presentation device.
  • The vehicle 1 of the user P is parked, for example, at a parking position in a parking lot 100.
  • The parking lot 100 has, for example, eight parking positions K1 to K8, which are separated into groups of four by a partition wall 101 and divided from one another by partition lines 102.
  • The vehicle 1 is parked at the parking position K2.
  • Other vehicles 103a to 103d are parked at, for example, the parking positions K1, K5, K6, and K8.
  • The vehicle 1 is equipped with an in-vehicle device 2 that forms part of the car finder system.
  • The user P carries a smart key 3 as the mobile terminal corresponding to the vehicle 1, together with a force sense presentation device 4.
  • The smart key 3 and the force sense presentation device 4 can be carried by the user P as a mobile device 5 housed, for example, in a single case.
  • The in-vehicle device 2 includes an in-vehicle device control unit 21, a position measurement unit 22, an obstacle sensor 23, a wireless communication unit 24, and a data storage unit 25.
  • The in-vehicle device control unit 21 includes a CPU, a memory, and the like, and executes processing for providing the information used to guide the user to the position of the vehicle 1 according to a program described later.
  • The position measurement unit 22 has a GPS function or the like, detects the current position of the vehicle 1, and acquires current position information.
  • The position measurement unit 22 can also use the position measurement function of a car navigation device mounted on the vehicle 1.
  • The obstacle sensor 23 detects obstacles existing around the vehicle 1 using, for example, an ultrasonic sensor or a surround-view camera, and is also used for collision prevention during parking.
  • The wireless communication unit 24 is provided so as to exchange information with the smart key 3 through an antenna 24a by a wireless communication method such as BLE (Bluetooth Low Energy; Bluetooth is a registered trademark) or Qi.
  • The data storage unit 25 is configured by a non-volatile memory or the like, stores a processing program described later, and also stores position information of the vehicle 1 and the like.
  • The smart key 3 is an electronic key having a security function: when the user gets into the vehicle 1 or parks it, the smart key locks and unlocks the doors by wireless communication and performs ID verification when the engine is started. In addition, as its function in the car finder system, the smart key 3 guides the user so that the user can walk to the vehicle 1 parked at a distant position.
  • The smart key 3 includes a mobile control unit 31, a position measurement unit 32, a wireless communication unit 33, a data storage unit 34, and the like.
  • The mobile control unit 31 includes a CPU, a memory, and the like, stores a program described later, and, by executing the program, performs the processing that calculates the guidance direction to the parking position of the vehicle 1.
  • The position measurement unit 32 includes, as sensors for executing pedestrian dead reckoning (PDR, hereinafter simply referred to as "autonomous navigation"), a gyro 32a, a compass 32b, an acceleration sensor 32c, an atmospheric pressure sensor 32d, and the like.
  • Autonomous navigation is a technique for calculating the current position by detecting the movement from a reference position, without acquiring position information based on GPS signals.
  • The position measurement unit 32 measures the position from the movement of the smart key 3 based on the detection signals of these sensors 32a to 32d.
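  • The following minimal sketch illustrates one way such a dead-reckoning update could look (an assumption for illustration; the disclosure does not specify the algorithm): starting from the parking position expressed in local metres, the estimate is advanced one stride along the compass heading each time a step is detected from the accelerometer. The stride length is an assumed constant.

```python
import math

STEP_LENGTH_M = 0.7  # assumed average stride length

def pdr_update(pos_xy, heading_deg, steps=1, step_length=STEP_LENGTH_M):
    """Advance the (east, north) position estimate by `steps` strides along
    `heading_deg` (degrees clockwise from north, as a compass would report)."""
    east, north = pos_xy
    h = math.radians(heading_deg)
    east += steps * step_length * math.sin(h)
    north += steps * step_length * math.cos(h)
    return (east, north)

# Example: two steps heading due east from the reference (parking) position.
print(pdr_update((0.0, 0.0), 90.0, steps=2))  # approximately (1.4, 0.0)
```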
  • The wireless communication unit 33 is provided so as to exchange information by wireless communication with each of the in-vehicle device 2 and the force sense presentation device 4 through an antenna 33a.
  • The data storage unit 34 is configured by a non-volatile memory or the like, stores a program for the processing described later, and stores the position information of the vehicle 1.
  • The force sense presentation device 4 presents a force sense by vibration corresponding to the guidance direction in accordance with the guidance signal given from the smart key 3 and, as shown in FIG. 19, includes a force sense control unit 41, a force sense presentation element 42, a guidance switch 43, and a wireless communication unit 44.
  • The force sense control unit 41 includes a CPU, a memory, and the like, and stores a program described later. When the power is turned on, executing the program causes the force sense control unit 41 to output a force sense presentation signal to the force sense presentation element 42 based on the guidance information. A power switch (not shown) is turned on during use to activate the device.
  • The force sense presentation element 42 is an element that induces an illusion phenomenon through a combination of a vibration generating mechanism and vibration, and includes, for example, a two-dimensional guided-direction vibration generating mechanism 42a that presents a force sense in two-dimensional directions and a drive circuit 42b.
  • The two-dimensional guided-direction vibration generating mechanism 42a preferably includes an actuator, but is not limited to this and may be any vibration device that generates vibration.
  • The guidance switch 43 is a switch that is turned on when guidance is started, and can also serve as the power switch.
  • The wireless communication unit 44 is provided to transmit the operation information of the guidance switch 43 and to receive the guidance information by wireless communication with the smart key 3 through an antenna 44a.
  • The force sense presentation element 42 is not limited to the two-dimensional guidance direction and may generate vibrations for a one-dimensional or three-dimensional guidance direction.
  • The force sense presentation by the pseudo-tactile sensation presented by the force sense presentation device 4 described above is based on the technology disclosed in, for example, Japanese Unexamined Patent Application Publication Nos. 2017-73101, 2007-248478, and 2005-190465 (National Institute of Advanced Industrial Science and Technology). It uses the principle that a sensation of being pulled can be generated and presented by exploiting the nonlinearity of human perception. Accordingly, the user P holding the force sense presentation device 4 in a predetermined state can be given a sensation that urges the user P to move in the guidance direction.
  • The pseudo-tactile sensation includes an illusory tactile force sense.
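  • Purely as an illustration of that pull-illusion principle (the actual drive waveform of the force sense presentation element 42 is not disclosed here; the sharp-rise/slow-return shape and the X/Y decomposition below are assumptions), an asymmetric drive signal can be built so that its net displacement is zero while the perceived force is biased toward the guidance direction:

```python
import math

def asymmetric_waveform(n_samples=100, sharp_fraction=0.2):
    """One period of a normalized drive signal: a strong short stroke followed
    by a weak long return, so the areas cancel but perception is biased."""
    n_fast = int(n_samples * sharp_fraction)
    n_slow = n_samples - n_fast
    fast = [+1.0] * n_fast                  # strong, short stroke
    slow = [-n_fast / n_slow] * n_slow      # weak, long return (areas cancel)
    return fast + slow

def drive_components(guidance_deg):
    """Split the waveform onto the X/Y axes of a two-dimensional vibration
    mechanism so that the illusory pull points along guidance_deg."""
    wave = asymmetric_waveform()
    gx = math.sin(math.radians(guidance_deg))   # east component of the pull
    gy = math.cos(math.radians(guidance_deg))   # north component of the pull
    return [s * gx for s in wave], [s * gy for s in wave]

x_drive, y_drive = drive_components(45.0)  # bias the illusory pull toward the north-east
```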
  • When the user P gets into the vehicle 1 with the mobile device 5, that is, the smart key 3 and the force sense presentation device 4, and turns the ignition (IG) on, the in-vehicle device 2 executes the processing shown in FIGS. 3 and 4 and connects wirelessly to the smart key 3 of the mobile device 5. Specifically, the in-vehicle device control unit 21 of the in-vehicle device 2 performs connection communication from the wireless communication unit 24 to the smart key 3 in step A1. Thereafter, the in-vehicle device control unit 21 waits for the ignition to be turned off in step A2.
  • While the vehicle is in use, the position measurement unit 22 sequentially acquires information on the current position of the vehicle 1, and the obstacle sensor 23 detects obstacles existing around the vehicle 1. When the ignition is turned off, the in-vehicle device control unit 21 determines YES in step A2 and proceeds to step A3. In step A3, the in-vehicle device control unit 21 transmits the current position of the vehicle 1 detected by the position measurement unit 22 to the smart key 3 as the parking position.
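  • The disclosure does not specify the over-the-air format of this parking position notification; the sketch below simply shows one hypothetical way the payload could be packed on the in-vehicle device side and unpacked on the smart key side (the message id and field layout are assumptions).

```python
import struct

PARKING_MSG = struct.Struct("<Bdd")   # message id, latitude, longitude (little-endian)
MSG_PARKING_POSITION = 0x01

def encode_parking_position(lat, lon):
    """Pack the parking position into a compact payload for the radio link."""
    return PARKING_MSG.pack(MSG_PARKING_POSITION, lat, lon)

def decode_parking_position(payload):
    """Unpack the payload on the receiving side and check the message id."""
    msg_id, lat, lon = PARKING_MSG.unpack(payload)
    assert msg_id == MSG_PARKING_POSITION
    return lat, lon

payload = encode_parking_position(35.1234567, 139.7654321)
print(decode_parking_position(payload))
```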
  • When the obstacle information has been acquired, the in-vehicle device control unit 21 performs the connection processing in step A11, then transmits the acquired obstacle information to the smart key 3 in step A12 and ends the process.
  • Since the smart key 3 is in a power-on state, when the wireless communication unit 33 receives a connection request from the in-vehicle device 2, the mobile control unit 31 performs the connection processing with the in-vehicle device 2 in step B1. Thereafter, the mobile control unit 31 waits in step B2 until a radio notification of ignition-off is received from the in-vehicle device 2.
  • When the ignition is turned off after the vehicle 1 is stopped, the mobile control unit 31 determines YES in step B2 on receiving the radio notification from the in-vehicle device 2 and proceeds to step B3.
  • On receiving the parking position of the vehicle 1 transmitted from the in-vehicle device 2, the mobile control unit 31 stores it in the data storage unit 34 as the parking position of the vehicle 1 and activates the autonomous navigation calculation function.
  • In step B4, the mobile control unit 31 calculates the current position by autonomous navigation with the received parking position of the vehicle 1 as the reference and stores it in the data storage unit 34 as current position information. Subsequently, in step B5, the mobile control unit 31 waits for the guidance switch 43 of the force sense presentation device 4 to be turned on. During this time, the mobile control unit 31 repeats steps B4 and B5, updating and storing the position information, until step B5 becomes YES.
  • In the autonomous navigation, the mobile control unit 31 calculates the movement path based on the detection signals from the gyro 32a, the compass 32b, the acceleration sensor 32c, and the atmospheric pressure sensor 32d of the position measurement unit 32. The mobile control unit 31 can thereby calculate the current position relative to the parking position of the vehicle 1 from the calculated movement path and stores it in the data storage unit 34. While steps B4 and B5 are repeated in this way, the current position of the smart key 3 is sequentially rewritten and stored, so the current position of the user P is tracked.
  • When the guidance switch 43 is turned on, the mobile control unit 31 determines YES in step B5 and proceeds to step B6.
  • In step B6, the mobile control unit 31 calculates the direction for guiding the user to the parking position of the vehicle 1 from the parking position information stored in the data storage unit 34 and the current position information calculated in step B4.
  • When the connection processing with the in-vehicle device 2 has not yet been performed, the mobile control unit 31 performs the connection processing in step B11 and then waits in step B12 for the reception of the obstacle information.
  • On receiving the obstacle information, the mobile control unit 31 proceeds to step B13, stores the acquired obstacle information in the data storage unit 34, and ends the process.
  • When, in step B6 of FIG. 5, the mobile control unit 31 calculates the direction for guiding the user from the current position to the parking position of the vehicle 1, it calculates the guidance direction taking the obstacle information into account if such information has been acquired and stored.
  • In the parking lot 100 described above, the parking position of the vehicle 1 lies in the direction of the arrow Rx1, but the other vehicle 103d and the partition wall 101 are in the way, so if the direction of the arrow Rx1 were used as the guidance direction, the vehicle 1 could not be reached directly. Similarly, the route indicated by the arrow Rx2 cannot reach the vehicle 1 directly because the partition wall 101 is an obstacle. Therefore, if not only the parking position of the vehicle 1 and the current position of the smart key 3 but also the position information of the partition wall 101 and the other vehicle 103d as obstacles can be acquired, guidance directions free of obstacles, such as the arrows R1 and R2, can be calculated, as illustrated by the sketch below.
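  • The following sketch is only an illustration of that idea (the patent does not prescribe an algorithm): obstacles are modelled as circles in local east/north metres, and if the straight-line bearing to the parking position passes too close to one of them, the guidance direction is rotated in small steps until the path is clear.

```python
import math

def _min_distance_to_segment(p, a, b):
    """Shortest distance from point p to the segment a-b (2-tuples, metres)."""
    ax, ay = a
    bx, by = b
    px, py = p
    abx, aby = bx - ax, by - ay
    t = ((px - ax) * abx + (py - ay) * aby) / (abx * abx + aby * aby + 1e-9)
    t = max(0.0, min(1.0, t))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def clear_guidance_bearing(user, target, obstacles, clearance=1.0):
    """Bearing (degrees from north) from user to target that keeps at least
    `clearance` metres from every (centre, radius) obstacle; the direct bearing
    is tried first, then alternating left/right offsets in 10-degree steps."""
    direct = math.degrees(math.atan2(target[0] - user[0], target[1] - user[1])) % 360.0
    dist = math.hypot(target[0] - user[0], target[1] - user[1])
    offsets = [0] + [sign * step for step in range(10, 100, 10) for sign in (+1, -1)]
    for offset in offsets:
        bearing = (direct + offset) % 360.0
        end = (user[0] + dist * math.sin(math.radians(bearing)),
               user[1] + dist * math.cos(math.radians(bearing)))
        if all(_min_distance_to_segment(centre, user, end) > radius + clearance
               for centre, radius in obstacles):
            return bearing
    return direct  # no clear direction found; fall back to the straight line

# Example: another vehicle parked 3 m straight ahead forces the guidance to swing aside.
print(clear_guidance_bearing((0.0, 0.0), (0.0, 10.0), [((0.0, 3.0), 1.0)]))  # about 50 degrees
```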
  • The mobile control unit 31 then transmits information on the calculated guidance direction from the wireless communication unit 33 to the force sense presentation device 4 in step B7. Subsequently, the mobile control unit 31 determines in step B8 whether the user has arrived at the parking position; if not, it determines NO, returns to step B4, and repeats steps B4 to B8. When the current position of the smart key 3 reaches the parking position, the mobile control unit 31 determines YES in step B8, proceeds to step B9, notifies the force sense presentation device 4 of the arrival, and ends the process. A compact sketch of this B4 to B9 loop follows.
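  • Sketch of the loop just described, with hypothetical helper callables standing in for the PDR update, the radio link to the force sense presentation device, and the arrival notification (the names and the arrival threshold are assumptions):

```python
import math

ARRIVAL_RADIUS_M = 2.0   # assumed threshold for "arrived at the parking position"

def guidance_loop(get_current_position, parking_position, send_guidance, notify_arrival):
    """Repeat steps B4 to B8: update the position, compute and send the guidance
    direction, and stop with an arrival notification (step B9) once close enough."""
    while True:
        east, north = get_current_position()                 # step B4: PDR position update
        de = parking_position[0] - east
        dn = parking_position[1] - north
        if math.hypot(de, dn) <= ARRIVAL_RADIUS_M:           # step B8: arrived?
            notify_arrival()                                 # step B9: notify arrival
            return
        bearing = math.degrees(math.atan2(de, dn)) % 360.0   # step B6: guidance direction
        send_guidance(bearing)                               # step B7: notify device 4
```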
  • The force sense presentation device 4 starts the processing shown in FIG. 7 when the power is turned on by the user P.
  • When the wireless communication unit 44 receives a connection request from the smart key 3, the force sense control unit 41 performs the wireless connection with the smart key 3 in step C1.
  • The force sense control unit 41 then waits in step C2 for the guidance switch 43 to be turned on by the user.
  • When the guidance switch 43 is turned on, the force sense control unit 41 determines YES in step C2, proceeds to step C3, and transmits a guidance-on notification from the wireless communication unit 44 to the smart key 3.
  • In step C4, the force sense control unit 41 waits for the reception of the guidance direction information from the smart key 3. When the guidance direction information is received, the force sense control unit 41 proceeds to step C5 and causes the force sense presentation element 42 to present a force sense in the two-dimensional direction.
  • The force sense control unit 41 keeps driving the force sense presentation element 42 according to the guidance direction information transmitted from the smart key 3 until the user P arrives at the parking position of the vehicle 1 in the parking lot 100. When the arrival notification for the parking position is transmitted from the smart key 3, step C6 becomes YES and the processing proceeds to step C7.
  • In step C7, the force sense control unit 41 causes the force sense presentation element 42 to present a force sense with a vibration pattern indicating arrival, and ends the process.
  • The operation of the in-vehicle device 2, the smart key 3, and the force sense presentation device 4 described above will now be explained with reference to the sequence diagram of FIG. 8.
  • While the user P is using the vehicle 1, the in-vehicle device 2 is wirelessly connected to the smart key 3. This corresponds to step A1 in FIG. 3 and step B1 in FIG. 5.
  • As indicated by S1 in FIG. 8, when the user P stops the vehicle 1 and turns off the ignition switch, the in-vehicle device 2 recognizes this and, as indicated by S2 in FIG. 8, stores the current position of the vehicle 1 and transmits it to the smart key 3 as the parking position.
  • The smart key 3 receives this, activates autonomous navigation, and repeatedly executes the processing of calculating the current position from the sensor detection signals with the parking position as the reference. This corresponds to steps B2 to B4 in FIG. 5.
  • In this example, the vehicle 1 is parked at the parking position K2 of the parking lot 100.
  • When the user P later returns to the parking lot 100, the position of the vehicle 1 may not be known.
  • As indicated by S4 and S5 in FIG. 8, the user P turns on the power switch of the force sense presentation device 4 and then turns on the guidance switch 43.
  • The force sense presentation device 4 performs the connection processing with the smart key 3 and notifies the smart key 3 of guidance-on, as indicated by S6 in FIG. 8. This corresponds to steps C1 to C3 in FIG. 7.
  • The force sense presentation device 4 is then ready to receive the guidance direction information from the smart key 3.
  • When the smart key 3 receives the guidance-on notification from the force sense presentation device 4, it calculates the direction to the parking position and transmits the guidance direction to the force sense presentation device 4, as indicated by S7 in FIG. 8. This corresponds to steps B5 to B7 in FIG. 5.
  • The force sense presentation device 4 performs the force sense presentation operation with the force sense presentation element 42. This corresponds to steps C4 and C5 in FIG. 7.
  • The smart key 3 repeatedly notifies the force sense presentation device 4 of the guidance direction until the user arrives at the parking position, and notifies arrival when the parking position is reached, as indicated by S9 in FIG. 8. This corresponds to steps B8 and B9 in FIG. 5.
  • The force sense presentation device 4 then performs a force sense presentation indicating arrival with the force sense presentation element 42, as indicated by S10 in FIG. 8. This corresponds to steps C6 and C7 in FIG. 7.
  • In this way, the smart key 3 and the force sense presentation device 4 guide the user P to the vehicle 1 at the parking position K2.
  • The force sense presentation device 4 carried by the user P first presents a force sense guiding the user in the R1 direction.
  • Because the guidance direction is conveyed to the user P by the force sense presented by the force sense presentation device 4, the user P can move in the R1 direction while watching the situation around the parking lot 100, without looking at the smart key 3 or the force sense presentation device 4.
  • Subsequently, the force sense presentation device 4 presents a force sense of guidance in the R2 direction, and the user P can move further to the front of the vehicle 1.
  • In the first embodiment described above, the in-vehicle device 2 is mounted on the vehicle 1, and the user P carries the smart key 3 and the force sense presentation device 4 when leaving the vehicle 1.
  • After parking and leaving the vehicle 1, the user P can move to the position of the vehicle 1 by receiving, in his or her hand, the force sense in the guidance direction from the smart key 3 and the force sense presentation device 4 that the user carries. This makes it possible to move toward the vehicle 1 while paying attention to the passage of other vehicles when walking in the parking lot 100.
  • In addition, when an obstacle is detected by the obstacle sensor 23 provided on the vehicle 1 side, the information can be notified from the in-vehicle device 2 to the smart key 3.
  • The smart key 3 can therefore calculate the guidance direction to the parking position of the vehicle 1 so as to avoid the obstacle, and the user P can thus be guided around the obstacle.
  • In the above description, the obstacle sensor 23 is provided; a configuration that does not use the obstacle information is also possible.
  • In that case, the guidance direction of the user P given by the smart key 3 becomes a straight-line direction from the current position to the parking position of the vehicle 1, like the Rx1 direction.
  • Even so, the user P knows the straight-line direction toward the parking position of the vehicle 1 and, watching the surroundings, walks toward the vehicle 1 while avoiding any obstacle, so the vehicle 1 can still be reached safely.
  • FIG. 9 and FIG. 10 show the second embodiment. Only the parts different from the first embodiment are described below.
  • In the second embodiment, the force sense presentation device 4a does not include the force sense control unit 41, the guidance switch 43, or the wireless communication unit 44, and the force sense presentation element 42 directly receives a guidance command signal through an external terminal Y.
  • The smart key 3a has the same configuration as the smart key 3 but is additionally provided with an external terminal X.
  • The force sense presentation device 4a mainly consists of the force sense presentation element 42 and a drive power source.
  • The force sense presentation device 4a is provided with the external terminal Y, which is connected to the mobile control unit 31 via the external terminal X of the smart key 3a.
  • The force sense presentation device 4a is thus configured so that a force sense control signal is given directly from the mobile control unit 31 to the force sense presentation element 42.
  • Because the force sense control unit 41, the power switch, the guidance switch 43, and the wireless communication unit 44 are omitted, the force sense presentation device 4a is smaller as a whole, and in the state of the mobile device 5a connected to the external terminal X of the smart key 3a it is compact and easy to carry.
  • FIG. 10 shows the processing contents of the smart key 3a, which are obtained by incorporating the processing of FIG. 7 into the processing shown in FIG. 5 of the first embodiment. That is, the mobile control unit 31 of the smart key 3a proceeds in the same manner as described above, and when calculating the direction for guiding from the current position to the parking position of the vehicle 1 in step B6, it calculates the guidance direction taking the obstacle information into account if such information has been acquired and stored.
  • The mobile control unit 31 then outputs the calculated guidance direction as a control signal to the force sense presentation element 42 through the external terminal X in step B7a and causes the force sense presentation element 42 to perform the force sense presentation operation. Thereafter, the mobile control unit 31 repeats steps B4 to B8 until the current position of the smart key 3a reaches the parking position. On arrival at the parking position, the mobile control unit 31 causes the force sense presentation element 42 in step B9a to present a force sense with a vibration pattern indicating arrival, and ends the process.
  • The second embodiment provides substantially the same effects as the first embodiment. Moreover, since the force sense presentation device 4a can be configured mainly with the force sense presentation element 42, it can be reduced in size, giving a compact mobile device 5a that is easy to carry.
  • In the above, the configuration in which the force sense presentation device 4a includes only the force sense presentation element 42 was described; alternatively, the force sense presentation device 4 shown in the first embodiment may be provided with the external terminal Y.
  • In that configuration, by connecting the external terminal Y directly to the external terminal X of the smart key 3a without using the wireless communication unit 44, a direct guidance direction instruction can be obtained.
  • FIG. 11 to FIG. 18 show the third embodiment; only the parts different from the first embodiment are described below.
  • In the third embodiment, the user P also uses a smartphone 6 as a mobile communication terminal to improve convenience.
  • The overall configuration is the configuration of the first embodiment with the smartphone 6 added.
  • As the configuration related to the car finder system, the smartphone 6 includes a communication control unit 61, a position measurement unit 62, a wireless communication unit 63, a data storage unit 64, and the like.
  • The communication control unit 61 includes a CPU, a memory, and the like, and, in addition to performing the functions of an ordinary smartphone, stores a program described later; by executing this program it can enhance the functions of the car finder system.
  • The position measurement unit 62 has a function of receiving GPS signals and identifying the current position.
  • The wireless communication unit 63 is provided so as to exchange information with the in-vehicle device 2, the smart key 3, and the force sense presentation device 4 through an antenna 63a by wireless communication.
  • The data storage unit 64 is configured by a non-volatile memory or the like and stores a program for the processing described later.
  • Since the smartphone 6 has the function of receiving GPS signals and identifying the current position, guidance can be based on the current position information obtained by the smartphone 6 instead of relying only on the autonomous navigation of the smart key 3.
  • When the smartphone 6 is in a state in which it can communicate with the in-vehicle device 2 of the vehicle 1, it performs the connection processing and then, in the same manner as the smart key 3, receives the parking position information transmitted from the in-vehicle device 2 when the vehicle is parked.
  • The first added function is processing that uses current position information based on GPS signals acquired by the smartphone 6.
  • The second is processing performed when the smartphone 6 is set to flight mode while the user P is moving away from the vehicle 1.
  • The third is processing performed when the smartphone 6 detects that the user P is moving in the vertical direction.
  • In the first case, the smartphone 6 can acquire current position information from GPS signals, and this can be used to correct the error of the current position calculated by the autonomous navigation of the smart key 3.
  • FIG. 12 shows the third process of the smart key 3, and FIG. 13 shows the first process of the smartphone 6.
  • As shown in FIG. 12, the mobile control unit 31 of the smart key 3 wirelessly connects to the smartphone 6 through the wireless communication unit 33 in step B21 and requests the positioning status from the smartphone 6 in the following step B22.
  • The mobile control unit 31 then waits to receive whether the smartphone 6 is able to perform GPS positioning. When the GPS positioning status is received, the mobile control unit 31 determines YES and proceeds to step B24.
  • The mobile control unit 31 requests the current position information from the smartphone 6 in step B24.
  • In the following step B25, the mobile control unit 31 acquires the current position information and stores it in the data storage unit 34.
  • As shown in FIG. 13, the communication control unit 61 of the smartphone 6 wirelessly connects to the smart key 3 through the wireless communication unit 63 in step D1 and waits in the following step D2 for a GPS positioning status request from the smart key 3. When the GPS positioning status request is received, the communication control unit 61 determines YES in step D2 and transmits the GPS status to the smart key 3 in step D3.
  • In step D4, the communication control unit 61 repeatedly executes steps D2 to D4 and waits until a request for the current position information is received from the smart key 3.
  • When the request for the current position information is received, the communication control unit 61 determines YES in step D4, proceeds to step D5, and transmits the current position information to the smart key 3.
  • In this way, the smart key 3 can sequentially acquire, as needed, the current position information obtained by GPS positioning on the smartphone 6.
  • In the calculation of the current position by the autonomous navigation of the smart key 3, the guidance direction data can therefore be calculated with the accumulated error corrected by referring at any time to the current position information acquired from the smartphone 6, for example as in the sketch below.
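  • A minimal sketch of such a correction, assuming a simple fixed-gain blend (the disclosure does not specify the correction method, and the gain value is an assumption):

```python
GPS_GAIN = 0.8   # assumed weight given to the GPS fix

def fuse_position(pdr_xy, gps_xy, gain=GPS_GAIN):
    """Blend the dead-reckoned position with a GPS fix (both in local metres),
    pulling the PDR estimate toward the fix so that drift stays bounded."""
    return (pdr_xy[0] + gain * (gps_xy[0] - pdr_xy[0]),
            pdr_xy[1] + gain * (gps_xy[1] - pdr_xy[1]))

# Example: PDR has drifted 3 m east of the GPS fix; the blend removes most of it.
print(fuse_position((13.0, 40.0), (10.0, 40.0)))   # -> (10.6, 40.0)
```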
  • For the second case, FIG. 14 shows the fourth process of the smart key 3, and FIG. 15 shows the second process of the smartphone 6.
  • As shown in FIG. 14, the mobile control unit 31 of the smart key 3 wirelessly connects to the smartphone 6 through the wireless communication unit 33 in step B31 and waits in the following step B32 to receive a flight mode notification from the smartphone 6. The smart key 3 thereby monitors the setting of flight mode on the smartphone 6.
  • When the flight mode notification is received, the mobile control unit 31 determines YES in step B32 and proceeds to step B33.
  • In step B33, the mobile control unit 31 temporarily saves the current position information of the smart key 3 measured up to that point in the data storage unit 34 and ends the process.
  • As shown in FIG. 15, the communication control unit 61 of the smartphone 6 wirelessly connects to the smart key 3 through the wireless communication unit 63 in step D11 and waits in the following step D12 for an operation that switches the smartphone to flight mode. The smartphone 6 thereby monitors the flight mode setting operation by the user P.
  • When the switching operation is performed, the communication control unit 61 determines YES in step D12 and notifies the smart key 3 of the flight mode setting in the next step D13. Thereafter, in step D14, the communication control unit 61 switches its own state to flight mode and ends the process.
  • The third case will be described with reference to FIG. 16 and FIG. 17.
  • The smartphone 6 has a function of detecting, from movement in the height direction, that the user P is riding a moving body that involves vertical movement, such as an escalator or an elevator.
  • Since the smart key 3 detects movement in the two-dimensional direction by autonomous navigation, the autonomous navigation of the smart key 3 is temporarily stopped in this situation, as outlined below.
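  • The disclosure only states that movement in the height direction is detected; the sketch below shows one hypothetical way the smartphone 6 could flag elevator or escalator use from its barometer, with the pressure-to-height constant and the threshold being assumptions.

```python
HPA_PER_METRE = 0.12          # approximate barometric pressure change near sea level
HEIGHT_THRESHOLD_M = 2.0      # assumed height change that triggers the notification

def vertical_motion_detected(pressure_window_hpa):
    """pressure_window_hpa: recent pressure samples in hPa (oldest first),
    e.g. the last few seconds; returns True when the implied height change
    within the window exceeds the threshold."""
    delta_hpa = pressure_window_hpa[-1] - pressure_window_hpa[0]
    height_change_m = abs(delta_hpa) / HPA_PER_METRE
    return height_change_m >= HEIGHT_THRESHOLD_M

# Example: pressure drops by 0.5 hPa while riding up an escalator (about 4 m climb).
print(vertical_motion_detected([1013.2, 1013.0, 1012.7]))   # True
```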
  • FIG. 16 shows the fifth process of the smart key 3, and FIG. 17 shows the third process of the smartphone 6.
  • As shown in FIG. 16, the mobile control unit 31 of the smart key 3 wirelessly connects to the smartphone 6 through the wireless communication unit 33 in step B41 and waits in the following step B42 to receive a notification of moving body use from the smartphone 6. The smart key 3 thereby monitors the detection of moving body use by the smartphone 6.
  • When the notification is received, the mobile control unit 31 determines YES in step B42 and proceeds to step B43.
  • In step B43, the mobile control unit 31 temporarily stores the current position information in the data storage unit 34 and stops the pedestrian autonomous navigation.
  • The mobile control unit 31 then waits in step B44 for a notification of the end of moving body use from the smartphone 6. When the end-of-use notification is received from the smartphone 6, the mobile control unit 31 determines YES in step B44 and proceeds to step B45.
  • As shown in FIG. 17, the communication control unit 61 of the smartphone 6 wirelessly connects to the smart key 3 through the wireless communication unit 63 in step D21 and determines in the following step D22 whether it is connected to the in-vehicle device 2.
  • When the smartphone 6 is connected to the in-vehicle device 2, that is, when step D22 is YES, this function is not used, and the smartphone 6 therefore waits until the connection with the in-vehicle device 2 is disconnected.
  • When the connection is disconnected, the communication control unit 61 waits for the use of a moving body in the next step D23. Thereafter, the communication control unit 61 monitors the use of a moving body by repeatedly executing steps D22 and D23.
  • When use of a moving body is detected, the communication control unit 61 determines YES in step D23, proceeds to step D24, and notifies the smart key 3 of the use of the moving body. Thereafter, the communication control unit 61 waits in step D25 for the end of use of the moving body; when the end of use is detected, it determines YES and proceeds to step D26. In step D26, the communication control unit 61 notifies the smart key 3 of the end of use of the moving body, and in the following step D27 notifies the smart key 3 of the current position information and ends the process.
  • In this way, when the user P moves in the height direction using an escalator, an elevator, or the like, the smartphone 6 can detect this state and the autonomous navigation of the smart key 3 can be stopped during that time.
  • FIG. 18 is a sequence diagram showing the series of operations and corresponds to FIG. 8 of the first embodiment. Only the functions added by the smartphone 6 are described below.
  • In the third embodiment, the in-vehicle device 2 also wirelessly connects to the smartphone 6, and, as indicated by S20 in FIG. 18, the parking position information is notified from the in-vehicle device 2 when the vehicle 1 is parked.
  • In response to a request for current position information from the smart key 3, the smartphone 6 notifies the smart key 3 of the current position acquired by GPS positioning; on the smart key side this corresponds to passing through steps B24 and B25 in FIG. 12, and on the smartphone side to steps D4 and D5 in FIG. 13.
  • In this way, the functions of the smart key 3 can be supported by using the functions of the smartphone 6 as a portable communication terminal.
  • Since the smartphone 6 has a GPS positioning function, errors in the autonomous navigation of the smart key 3 can be corrected. When flight mode is set on the smartphone 6, the current position information of the smart key 3 can be temporarily saved because the smart key 3 is notified of the flight mode. Furthermore, by notifying the smart key 3 from the smartphone 6 of the use of a moving body involving movement in the height direction, the pedestrian autonomous navigation can be stopped while the moving body is being used.
  • In the above, the cases in which the smartphone 6 detects flight mode and the use of a moving body were described; in addition, it is also effective to apply this to situations in which a train or another vehicle is used. In such cases as well, by notifying the smart key 3 of the moving body use state, the pedestrian autonomous navigation can be temporarily stopped.
  • In the above embodiments, the smart key 3 is provided with the gyro 32a, the compass 32b, the acceleration sensor 32c, and the atmospheric pressure sensor 32d as the position measurement unit 32.
  • However, as long as the position measurement unit includes the gyro 32a and the acceleration sensor 32c, autonomous navigation can be implemented.
  • Instead of the smart key 3, a smartphone may also be used as the mobile terminal that gives the guidance instruction to the force sense presentation device 4.
  • In addition to a smartphone, tablet terminals and wearable terminals can also be used as mobile terminals.
  • Any tablet terminal or wearable terminal that has a communication function and a positioning function can be used in cooperation in the same manner.
  • In the above embodiments, the force sense presentation device 4 is provided with the force sense presentation element 42 including the two-dimensional guided-direction vibration generating mechanism 42a that presents a force sense in two-dimensional directions; a force sense presentation element that presents a force sense in three-dimensional directions may be provided instead. In that case, the user P can be guided three-dimensionally, for example through vertical movement in a multi-level parking garage.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/JP2019/016315 2018-04-24 2019-04-16 Car Finder System Ceased WO2019208326A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018083041A 2018-04-24 2018-04-24 Car Finder System
JP2018-083041 2018-04-24

Publications (1)

Publication Number Publication Date
WO2019208326A1 (ja) 2019-10-31

Family

ID=68294008

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/016315 Ceased WO2019208326A1 (ja) 2018-04-24 2019-04-16 カーファインダシステム

Country Status (2)

Country Link
JP (1) JP6909386B2
WO (1) WO2019208326A1

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021231052A1 (en) * 2020-05-15 2021-11-18 Nike Innovate C.V. Intelligent electronic footwear and logic for navigation assistance by automated tactile, audio, and visual feedback
US11464275B2 (en) 2018-05-31 2022-10-11 Nike, Inc. Intelligent electronic footwear and control logic for automated infrastructure-based pedestrian tracking

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021081339A (ja) * 2019-11-20 2021-05-27 ヤフー株式会社 Information processing device, information processing method, and information processing program
WO2024177095A1 (ja) * 2023-02-23 2024-08-29 株式会社村田製作所 Illusory tactile force sense generation device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000352521A * 1999-04-29 2000-12-19 Fuji Xerox Co Ltd System and method for providing navigation assistance to a user, tactile direction indicating device for providing navigation assistance to a user via directional cues, and method for providing navigation assistance to a user via a tactile direction indicating device
JP2003240588A * 1999-11-18 2003-08-27 Equos Research Co Ltd Navigation device and information center
JP2011027425A * 2009-07-21 2011-02-10 Tokai Rika Co Ltd Car finder system
JP2014186638A * 2013-03-25 2014-10-02 Tokai Rika Co Ltd Portable device
JP2016180655A * 2015-03-24 2016-10-13 Pioneer Corp Guidance information presentation device, guidance information presentation method, and program for a guidance information presentation device
JP2017073100A * 2015-10-05 2017-04-13 Miraisens Inc Tactile force sense information presentation system
US20170372565A1 * 2015-01-13 2017-12-28 Ck Materials Lab Co., Ltd. Haptic information provision device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000352521A * 1999-04-29 2000-12-19 Fuji Xerox Co Ltd System and method for providing navigation assistance to a user, tactile direction indicating device for providing navigation assistance to a user via directional cues, and method for providing navigation assistance to a user via a tactile direction indicating device
JP2003240588A * 1999-11-18 2003-08-27 Equos Research Co Ltd Navigation device and information center
JP2011027425A * 2009-07-21 2011-02-10 Tokai Rika Co Ltd Car finder system
JP2014186638A * 2013-03-25 2014-10-02 Tokai Rika Co Ltd Portable device
US20170372565A1 * 2015-01-13 2017-12-28 Ck Materials Lab Co., Ltd. Haptic information provision device
JP2016180655A * 2015-03-24 2016-10-13 Pioneer Corp Guidance information presentation device, guidance information presentation method, and program for a guidance information presentation device
JP2017073100A * 2015-10-05 2017-04-13 Miraisens Inc Tactile force sense information presentation system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11464275B2 (en) 2018-05-31 2022-10-11 Nike, Inc. Intelligent electronic footwear and control logic for automated infrastructure-based pedestrian tracking
US11763676B2 (en) 2018-05-31 2023-09-19 Nike, Inc. Intelligent electronic footwear and control logic for automated pedestrian collision avoidance
US11900810B2 (en) 2018-05-31 2024-02-13 Nike, Inc. Intelligent electronic footwear and control logic for executing automated footwear features
US11915592B2 (en) 2018-05-31 2024-02-27 Nike, Inc. Intelligent electronic footwear and control logic for executing automated footwear features
WO2021231052A1 (en) * 2020-05-15 2021-11-18 Nike Innovate C.V. Intelligent electronic footwear and logic for navigation assistance by automated tactile, audio, and visual feedback
CN115605728A (zh) * 2020-05-15 2023-01-13 Nike Innovate C.V. Intelligent electronic footwear and logic for navigation assistance by automated tactile, audio, and visual feedback

Also Published As

Publication number Publication date
JP2019191862A (ja) 2019-10-31
JP6909386B2 (ja) 2021-07-28

Similar Documents

Publication Publication Date Title
US20170092127A1 (en) Electronic key, vehicle-mounted device, guidance device, and car-finder system
WO2019208326A1 (ja) Car finder system
US8818714B2 (en) Portable navigation device and method with active elements
CN111344215B (zh) Method for controlling a parking process of a vehicle
JP6639379B2 (ja) Vehicle periphery monitoring device
US20190360837A1 (en) Navigation system, computer program product, and in-vehicle apparatus
JP2017200812A (ja) Support system, mobile terminal, and in-vehicle device
JP2013148419A (ja) Guidance system, mobile terminal device, and in-vehicle device
US11397440B2 (en) Vehicle control system, external electronic control unit, vehicle control method, and application
US20210046949A1 (en) Automated driving system, portable device and travel control device
WO2017187759A1 (ja) Support system, mobile terminal, and in-vehicle device
KR101401063B1 (ko) Terminal and method for providing a vehicle travel route, and vehicle using the same
JP2019191862A5 (ja)
JP2018087742A (ja) Smart navigation system utilizing smart navigation footwear
JP2022174351A (ja) Automatic parking assistance system
JP6432572B2 (ja) Display device and display system
JP5454212B2 (ja) Information control system and information control method
JP2013186868A (ja) Driving assistance device
US9693309B2 (en) Mobile electronic device
JP4223305B2 (ja) Parking assistance system, server computer, and method of controlling a server computer
JP6100710B2 (ja) Wireless in-vehicle device and evacuation guidance system
WO2021124781A1 (ja) Automatic parking assistance system, automatic parking assistance program, and automatic parking assistance method
KR20190023730A (ko) Drone control system and method using a smartphone for drone mounting / drone remote control
KR101379630B1 (ko) Parking assistance apparatus and assistance method, and parking assist apparatus and assist method for a vehicle
JP6457321B2 (ja) Navigation system and in-vehicle device

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 19792952

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 19792952

Country of ref document: EP

Kind code of ref document: A1