WO2019123622A1 - Guiding device - Google Patents

Guiding device

Info

Publication number
WO2019123622A1
Authority
WO
WIPO (PCT)
Prior art keywords
force sense
distance
unit
obstacle
person
Application number
PCT/JP2017/046037
Other languages
French (fr)
Japanese (ja)
Inventor
Nikon Corporation (株式会社ニコン)
Kenji Toyoda (堅二 豊田)
Original Assignee
Nikon Corporation (株式会社ニコン)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation
Priority to PCT/JP2017/046037
Publication of WO2019123622A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F: FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F9/00: Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F9/08: Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H: PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00: Appliances for aiding patients or disabled persons to walk about
    • A61H3/06: Walking aids for blind persons

Definitions

  • The present invention relates to a guidance device.
  • According to one aspect, the guidance device comprises an obstacle detection unit that detects the distance and direction to a surrounding obstacle, a determination unit that determines an alternative direction in place of the traveling direction based on the detection result of the obstacle detection unit, and a pseudo force sense generation unit that generates a pseudo force sense in the alternative direction.
  • According to another aspect, the guidance apparatus comprises a leader detection unit that detects a leader, a calculation unit that calculates the direction to the leader, and a pseudo force sense generation unit that generates a pseudo force sense in the direction of the leader.
  • FIG. 1(a) is a perspective view illustrating the guiding device according to the first embodiment.
  • FIG. 1(b) illustrates the state in which the guiding device is held by the user.
  • FIG. 5(a) is a view explaining the situation of surrounding objects.
  • FIG. 5(b) is a view illustrating a distance profile.
  • FIG. 6(a) is a view explaining the situation of surrounding objects at a certain time; FIG. 6(b) explains the situation three seconds later.
  • the guidance device assists and guides the walking of a visually impaired person.
  • the guidance device guides the user in a direction avoiding the obstacle.
  • FIG. 1A is a perspective view illustrating the guiding device 1 according to the first embodiment.
  • An obstacle detection unit 200, a pseudo force sense generation unit 300, and a notification unit 600 are provided in a cylindrical main body 10.
  • FIG. 1(b) illustrates the state in which the user holds the guiding device 1.
  • The user keeps the main body 10 vertical with respect to the ground (floor surface), keeps the obstacle detection unit 200 horizontal, and faces the front of the obstacle detection unit 200 in the desired direction of movement (that is, the traveling direction).
  • the main body 10 of the guiding device 1 is provided with a guide groove 11 so that the user can correctly grip the guiding device 1 without looking at it.
  • the position where the guide groove 11 is provided is the front surface of the main body 10 and the front surface side of the obstacle detection unit 200. Therefore, when the user grips the guide groove 11 of the main body 10 in the traveling direction, the front surface of the obstacle detection unit 200 faces the traveling direction.
  • Hereinafter, the direction in which the user wants to move is referred to as the traveling direction.
  • the obstacle detection unit 200 faces in the traveling direction.
  • The user can turn the obstacle detection unit 200 toward the traveling direction, for example, by adjusting the grip on the main body 10 so that the pad of the index finger touches the guide groove 11.
  • FIG. 2 is a block diagram illustrating the configuration of the guidance device 1.
  • the guidance device 1 includes an obstacle detection unit 200, a control unit 400, a simulated force sense generation unit 300, and a notification unit 600.
  • the obstacle detection unit 200 includes an imaging unit 210 and a signal processing unit 220, and detects an object within a predetermined range centered on the traveling direction.
  • the imaging unit 210 images the traveling direction
  • the signal processing unit 220 detects the distance to the object imaged by the imaging unit 210, and generates information indicating a distance profile based on the detection result.
  • the distance profile refers to information indicating the distance from the guidance device 1 to the object and the direction of the object viewed from the guidance device 1.
  • In other words, the distance profile expresses the distance to an object as a function of the angle indicating its direction.
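As a concrete illustration of this idea, a distance profile can be sketched as a mapping from viewing angle to measured distance. The function name, one-degree angle step, and 5 m clamp below follow the embodiment's figures but are otherwise hypothetical:

```python
# Hypothetical sketch of the "distance profile": a mapping from
# angle (degrees from the traveling direction, negative = left,
# positive = right) to distance in meters, clamped to the 5 m
# detection range used in the embodiment.

MAX_RANGE_M = 5.0

def build_distance_profile(detections, angle_step=1):
    """detections: list of (angle_deg, distance_m) polar coordinates.
    Returns {angle_deg: distance_m} over -90..+90 degrees, holding
    MAX_RANGE_M wherever nothing nearer was detected."""
    profile = {a: MAX_RANGE_M for a in range(-90, 91, angle_step)}
    for angle, dist in detections:
        a = int(round(angle))
        if -90 <= a <= 90:
            profile[a] = min(profile[a], dist, MAX_RANGE_M)
    return profile

# Example loosely matching FIG. 5: three persons at illustrative
# positions (-40 deg, 3.0 m), (0 deg, 2.0 m), (35 deg, 4.0 m).
profile = build_distance_profile([(-40, 3.0), (0, 2.0), (35, 4.0)])
```

Any direction whose profile value dips below 5 m then indicates a nearby object, as the text describes for direction H2.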
  • FIG. 3 is a diagram showing an example of the configuration of the obstacle detection unit 200.
  • the imaging unit 210 in FIG. 2 is configured by a pair of cameras 210 a and 210 b which are disposed at a predetermined distance L (also referred to as a base length).
  • the respective cameras 210a and 210b have fisheye lenses 211a and 211b, and imaging elements 212a and 212b disposed on the image forming planes of the respective fisheye lenses 211a and 211b.
  • the imaging elements 212a and 212b capture subject images formed by the fisheye lenses 211a and 211b.
  • the image signals output from the imaging elements 212a and 212b are sent to the signal processing unit 220, respectively.
  • Based on the correlation between the image signal from the imaging element 212a and the image signal from the imaging element 212b, the signal processing unit 220 calculates the distance from the obstacle detection unit 200 (guiding device 1) to each imaged object, and the distance profile as a function of the angle from the front direction of the obstacle detection unit 200 (in this example, the traveling direction).
  • the fisheye lenses 211a and 211b are configured by equidistant projection type fisheye lenses.
  • the distance from each of the optical axes Xa and Xb represents the direction of the object.
  • the difference in position at which the same object is imaged represents parallax. For this reason, based on the image signal by the imaging element 212a and the image signal by the imaging element 212b, the distance to the object can be obtained using the principle of triangulation.
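The triangulation step can be sketched as follows. This simplified example assumes an ordinary pinhole stereo pair rather than the equidistant-projection fisheye optics of the embodiment, and all numeric values are illustrative:

```python
# Simplified stereo triangulation (pinhole model, not the
# equidistant fisheye projection of the embodiment):
# distance = f * L / disparity, where L is the base length
# between the two cameras. Values are illustrative.

def stereo_distance(focal_px, base_m, x_left_px, x_right_px):
    """Distance to a point imaged at horizontal pixel positions
    x_left_px / x_right_px in the left / right images."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    return focal_px * base_m / disparity

# A point with 50 px disparity, focal length 500 px, and a 0.1 m
# base length lies 1.0 m away.
d = stereo_distance(focal_px=500, base_m=0.1, x_left_px=300, x_right_px=250)
```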
  • The control unit 400 (FIG. 2) comprises a CPU, ROM, RAM, and the like, and controls the operation of the guidance device 1.
  • Control unit 400 includes ambient condition determination unit 410, storage unit 420, and drive signal generation unit 430.
  • The surrounding situation determination unit 410 analyzes the information indicating the above-mentioned distance profile output from the signal processing unit 220, and determines the situation of objects around the guiding device 1, the possibility of collision with an object, the possibility of avoiding the object, and the direction in which to avoid it.
  • The storage unit 420 stores information indicating the original traveling direction before a change (for example, the azimuth) when the traveling direction is changed, as described later in detail.
  • Based on the information determined by the surrounding situation determination unit 410, the drive signal generation unit 430 generates a pseudo force sense control signal, a drive signal that controls the current supplied to the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 300 (FIG. 4), described in detail later.
  • The pseudo force sense control signal is output from the control unit 400 to the pseudo force sense generation unit 300.
  • the drive signal generation unit 430 also generates a drive signal for driving a notification unit 600 described later.
  • The pseudo force sense generation unit 300 makes the user feel a sensation of being pulled in a certain direction (referred to as a pseudo force sense), as disclosed, for example, in JP-A-2015-226388.
  • FIG. 4 is a diagram for explaining the configuration of the artificial force sense generation unit 300.
  • the artificial force sense generation unit 300 includes a thin cylindrical container 301, a magnetic inertia member 302 disposed inside the container 301, and an elastic member 303 for holding the inertia member 302 in the cylindrical container 301.
  • a drive unit 304 is provided with a coil 305-1, a coil 305-2, a coil 305-3, and a coil 305-4.
  • The inertial body 302, driven by electromagnetic induction, moves in the in-plane direction to generate a pseudo force sense.
  • The current flowing through each of the coils 305-1, 305-2, 305-3, and 305-4 is controlled based on the pseudo force sense control signal from the control unit 400 (drive signal generation unit 430).
  • Based on the pseudo force sense control signal, a drive current is applied to each of the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 300. As a result, the inertial body 302 of the pseudo force sense generation unit 300 is driven to produce asymmetric vibration in a predetermined direction. Asymmetric vibration is vibration that is strong and fast in one direction and weak and slow in the opposite direction. As a result, the user holding the guidance device 1 feels a pseudo force sense, as if pulled in the predetermined direction.
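The asymmetric vibration can be sketched as a drive waveform; this is a hypothetical illustration of the principle (a fast stroke one way, a slow return the other), not the actual coil-current control of the embodiment:

```python
# Sketch of the asymmetric-vibration idea: one period of a
# displacement waveform that moves the inertial body quickly in the
# pull direction and returns it slowly, so only the fast stroke is
# felt as a pull. Parameters are illustrative.

def asymmetric_waveform(n_samples=100, fast_fraction=0.2):
    """Return one period of displacement samples: a fast stroke over
    the first fast_fraction of the period, a slow return over the rest."""
    n_fast = int(n_samples * fast_fraction)
    n_slow = n_samples - n_fast
    fast = [i / n_fast for i in range(1, n_fast + 1)]    # 0 -> 1 quickly
    slow = [1 - i / n_slow for i in range(1, n_slow + 1)]  # 1 -> 0 slowly
    return fast + slow
```

Reversing which stroke is fast reverses the perceived pull direction, which is how the same actuator can signal "left", "right", or "backward".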
  • The notification unit 600 (FIG. 2) is provided to alert people around the guidance device 1 or to transmit a message, and includes at least one of a speaker, a buzzer, an LED lamp, a display device, and the like. When the notification unit 600 includes a speaker, a warning sound or a voice message is reproduced from the speaker by the drive signal from the control unit 400 (drive signal generation unit 430).
  • When the notification unit 600 includes a buzzer, a buzzer sound is emitted by a buzzer drive signal from the control unit 400 (drive signal generation unit 430).
  • When the notification unit 600 includes an LED lamp, the LED lamp is turned on or blinked by a lamp drive signal from the control unit 400 (drive signal generation unit 430).
  • When the notification unit 600 includes a display device, a warning screen or a message is displayed on the display device by a display drive signal from the control unit 400 (drive signal generation unit 430).
  • When using the guiding device 1 configured as described above, the user holds it with the obstacle detection unit 200 directed in the traveling direction, as illustrated in FIG. 1(b).
  • The obstacle detection unit 200 detects objects over a total of 180 degrees, 90 degrees to each side of the traveling direction. The obstacle detection unit 200 then calculates a distance profile indicating the position and direction of each object detected within this 180-degree range.
  • FIG. 5(a) is a view explaining the situation of objects around the user (that is, the guidance device 1), viewed from above. The guidance device 1 is assumed to be directed in the traveling direction (arrow direction) of the user. According to FIG. 5(a), a wall 2, indicated by a vertically long rectangle, is present on the left side of the guiding device 1, and three persons 3, 4, and 5 are also present.
  • The obstacle detection unit 200 of the guidance device 1 sets as its detection range, for example, the semicircle on the traveling-direction side of a circle of radius 5 m centered on the guidance device 1. The obstacle detection unit 200 therefore detects as objects the wall 2 to the left of the guiding device 1, the person 3 to the front left, the person 4 directly ahead, and the person 5 diagonally ahead to the right.
  • the detection range by the obstacle detection unit 200 may be determined based on the walking speed of the user or the person.
  • For example, if a walking speed faster than 1.3 m/sec is assumed, the radius of the detection range may be made longer than 5 m; if a slower walking speed is assumed, the radius may be made shorter than 5 m. The detection range may be changed as appropriate in this way.
  • FIG. 5B is a diagram illustrating a distance profile calculated when the obstacle detection unit 200 detects an object shown in FIG. 5A.
  • The distance profile is displayed in polar coordinates indicating the distance r from the guiding device 1 to the object and the angle θ from the traveling direction to the direction of the object.
  • the maximum value of the distance r is clamped to 5 m because the detection range is set to 5 m.
  • In the distance profile of FIG. 5(b), an object is present near the guidance device 1 in any direction in which the distance r falls below 5 m (for example, the direction indicated by H2).
  • the pair of cameras 210a and 210b constituting the imaging unit 210 of the obstacle detection unit 200 performs imaging at a constant cycle such as 30 frames / sec, for example.
  • the signal processing unit 220 of the obstacle detection unit 200 generates and updates a distance profile each time imaging is performed by the cameras 210a and 210b.
  • the obstacle detection unit 200 sends information indicating the latest distance profile to the control unit 400.
  • the distance profile is calculated by the obstacle detection unit 200 as follows.
  • the obstacle detection unit 200 causes the signal processing unit 220 to detect an object present in front of the guidance device 1 based on the images captured by the imaging elements 212 a and 212 b of the imaging unit 210.
  • The position of each object is calculated as polar coordinates of the angle θ and the distance r.
  • For example, the persons 3 to 5 are detected as (θ1, r1), (θ2, r2), and (θ3, r3), respectively.
  • A distance profile as shown in FIG. 5(b) is created using these three polar coordinates.
  • The vibration direction of the pseudo force sense instructing a change of traveling direction is calculated from the data (θ1, r1), (θ2, r2), and (θ3, r3).
  • the situation around the guiding device 1 is determined by the control unit 400 as follows.
  • Using the plurality of distance profiles created over a plurality of frames, the surrounding situation determination unit 410 of the control unit 400 calculates the moving direction of each moving object and its moving speed relative to the user (guidance device 1) (referred to as the relative speed).
  • Each detected object is identified by shape, color, and the like, and is numbered; by following the object with the same number across a plurality of frames, the position and moving direction of the object and its relative speed with respect to the user of the guidance device 1 are calculated.
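The per-frame tracking described above can be sketched as follows. The function name is hypothetical, and the frame interval is taken from the 30 frames/sec imaging cycle mentioned earlier:

```python
# Hypothetical sketch of per-object tracking: given the polar
# coordinates of the same numbered object in two consecutive frames,
# estimate its velocity relative to the device. Angles in degrees,
# distances in meters, dt from the 30 frames/sec imaging cycle.
import math

def relative_velocity(prev, curr, dt=1 / 30):
    """prev, curr: (angle_deg, distance_m) of the same object in two
    consecutive frames. Returns (vx, vy) in m/s in the device frame,
    x = right, y = traveling direction."""
    def to_xy(angle_deg, r):
        a = math.radians(angle_deg)
        return r * math.sin(a), r * math.cos(a)
    x0, y0 = to_xy(*prev)
    x1, y1 = to_xy(*curr)
    return (x1 - x0) / dt, (y1 - y0) / dt

# An object dead ahead closing from 2.0 m to 1.95 m in one frame
# is approaching at 1.5 m/s (vy = -1.5).
vx, vy = relative_velocity((0, 2.0), (0, 1.95))
```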
  • When no obstacle is detected, the control unit 400 does not cause the pseudo force sense generation unit 300 to generate a pseudo force sense. More specifically, in the above-described distance profile, the surrounding situation determination unit 410 of the control unit 400 defines an area A, bounded by a range of 30 degrees centered on the traveling direction (a predetermined angle range of plus or minus 15 degrees) and a first predetermined distance (for example, 2.5 m) from the guidance device 1, and an area B, bounded by the same 30-degree range and a second predetermined distance (for example, 1.5 m) from the guidance device 1. It then determines whether an object is present in each of areas A and B and, based on the result, whether or not to cause the pseudo force sense generation unit 300 to generate a pseudo force sense.
  • FIG. 6(a) is a view explaining the situation of objects around the user (that is, the guidance device 1) at a certain point in time, showing the detection range of the obstacle detection unit 200 viewed from above.
  • A solid semicircle 6 represents a distance of 2.5 m from the guidance device 1.
  • A broken semicircle 7 represents a distance of 1.5 m from the guidance device 1.
  • the straight line 8a indicates the right 15 degrees with respect to the traveling direction
  • the straight line 8b indicates the left 15 degrees with respect to the traveling direction.
  • the area surrounded by the semicircle 6 and the straight lines 8a and 8b is the above-mentioned area A
  • the area surrounded by the semicircle 7 and the straight lines 8a and 8b is the above-mentioned area B.
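The membership tests for areas A and B can be sketched as simple sector checks. The 15-degree half-angle and the 2.5 m / 1.5 m radii follow the embodiment; the names are hypothetical:

```python
# Sketch of the area A / area B test: a detected object at
# (angle_deg, distance_m) is in an area if it lies within the
# +/-15-degree sector ahead and within that area's radius.
# Values follow the embodiment's example.

AREA_A = {"half_angle_deg": 15, "radius_m": 2.5}
AREA_B = {"half_angle_deg": 15, "radius_m": 1.5}

def in_area(angle_deg, distance_m, area):
    """True if the object lies inside the given sector-shaped area."""
    return (abs(angle_deg) <= area["half_angle_deg"]
            and distance_m <= area["radius_m"])

# The person 4 of FIG. 6(a), roughly straight ahead at about 2 m,
# would fall inside area A but not yet inside the closer area B.
```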
  • When no object is detected, the signal processing unit 220 does not calculate polar coordinates for any object. Therefore, based on the distance profile input from the signal processing unit 220, the surrounding situation determination unit 410 can determine that there is no object obstructing the user's movement (referred to as an obstacle). That is, the surrounding situation determination unit 410 determines that the user can safely proceed in the traveling direction.
  • Based on the above determination, the drive signal generation unit 430 of the control unit 400 generates a pseudo force sense control signal such that no current is supplied to the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 300 (FIG. 4), and outputs it to the pseudo force sense generation unit 300.
  • As a result, the pseudo force sense generation unit 300 does not generate a pseudo force sense.
  • The above description is an example in which a pseudo force sense is generated only to transmit a changed traveling direction upon detection of an obstacle. However, even when no obstacle is detected, a pseudo force sense may be generated forward, in the traveling direction, to transmit the current traveling direction.
  • FIG. 6B is a view for explaining the situation of an object around the user (that is, the guiding device 1) after, for example, 3 seconds in FIG. 6A. Since the guiding device 1 moves in the traveling direction as the user walks, the relative position of the wall 2 to the guiding device 1 moves downward in FIG. 6 (b). In addition, since the person 5 remains at that place, the relative position of the person 5 with respect to the guiding device 1 moves downward in FIG. On the other hand, since the person 3 moves in the same traveling direction as the user, the relative position of the person 3 with respect to the guidance device 1 is hardly changed as compared with FIG.
  • the relative position of the person 4 with respect to the guiding device 1 largely moves downward in FIG. 6B and approaches the guiding device 1.
  • the presence / absence of movement of each object, the movement direction, and the movement speed can be detected and calculated based on the change in polar coordinates of the same object (obstacle) detected for each frame.
  • Based on the polar coordinates of the person 4 corresponding to FIG. 6(b), the surrounding situation determination unit 410 detects the presence of an object in the area A. The surrounding situation determination unit 410 determines that the person 4 may become an obstacle if the user proceeds as is, and searches for alternative routes on both sides of the traveling direction. For example, regions DR and DL, each spanning a predetermined angle range (for example, 30 degrees) centered on directions shifted 30 degrees clockwise and counterclockwise from the traveling direction, are set as alternative direction candidates on both sides of the area in the traveling direction. These regions DR and DL are bounded by a radius of 5 m from the guiding device 1. Whether an object is present in each of these regions is determined, and if no object is present, that direction is taken as an alternative direction candidate. In FIG. 6(b), since no object exists in the region DR, that direction is taken as the alternative direction.
  • In this way, the surrounding situation determination unit 410 can calculate an alternative direction that avoids the collision with a margin while minimizing the amount of change (angle) from the current traveling direction.
  • When the moving speed (relative speed) of the person 4 approaching the user is low (for example, about 2 m/sec), the direction with the smaller change angle from the current traveling direction is determined as the alternative direction from among the predetermined angle ranges in which no object exists.
  • the surrounding situation determination unit 410 adopts the direction closest to the current traveling direction as the alternative direction. For example, when the direction of 30 degrees to the right from the traveling direction and the direction of 40 degrees to the left from the traveling direction become candidates, the direction of 30 degrees to the right from the traveling direction is adopted.
  • In FIG. 6(b), the only alternative direction is in the region DR on the right side of the traveling direction.
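The alternative-direction search described above can be sketched as follows. The candidate offsets, 30-degree sector width, and 5 m radius follow the embodiment; the extra ±60-degree candidates, tie-breaking, and function names are assumptions for illustration:

```python
# Sketch of the alternative-direction search: candidate sectors on
# both sides of the traveling direction are checked for objects, and
# among the free ones the smallest deflection from the current
# course wins.

CANDIDATES_DEG = [30, -30, 60, -60]   # checked outward from the course
SECTOR_HALF_DEG = 15                  # 30-degree-wide candidate sectors
SEARCH_RADIUS_M = 5.0

def sector_is_free(objects, center_deg):
    """objects: list of (angle_deg, distance_m). True if no object
    lies inside the candidate sector around center_deg."""
    return not any(
        abs(a - center_deg) <= SECTOR_HALF_DEG and r <= SEARCH_RADIUS_M
        for a, r in objects
    )

def pick_alternative(objects):
    """Return the free candidate direction with the smallest
    deflection, or None if every candidate is blocked."""
    free = [c for c in CANDIDATES_DEG if sector_is_free(objects, c)]
    return min(free, key=abs) if free else None

# FIG. 6(b)-like case: a person ahead at (0 deg, 2 m) and another at
# (-35 deg, 3 m) blocking the left sector, so +30 degrees is chosen.
alt = pick_alternative([(0, 2.0), (-35, 3.0)])
```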
  • Based on the information indicating the alternative direction determined by the surrounding situation determination unit 410, the drive signal generation unit 430 of the control unit 400 generates a pseudo force sense control signal for controlling the current supplied to the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 300 (FIG. 4) so that a pseudo force sense is generated in the alternative direction, and outputs it to the pseudo force sense generation unit 300.
  • As a result, the pseudo force sense generation unit 300 generates a pseudo force sense in the alternative direction.
  • Based on the above determination, the drive signal generation unit 430 of the control unit 400 generates a pseudo force sense control signal for controlling the current supplied to the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 300 (FIG. 4) so that a pseudo force sense is generated in the direction opposite to the traveling direction, and outputs it to the pseudo force sense generation unit 300.
  • As a result, the pseudo force sense generation unit 300 generates a pseudo force sense in the direction opposite to the traveling direction.
  • the reason for generating the pseudo force sense in the reverse direction is to urge the user to stop because it is difficult to avoid collision by guidance.
  • The surrounding situation determination unit 410 determines the position of the person 4 based on the polar coordinates. In other words, the presence of an object (obstacle) in the region B is detected based on the distance profile created from the polar coordinates. When an object is present in the area B, the surrounding situation determination unit 410 determines that the user may collide with the person 4.
  • the drive signal generation unit 430 of the control unit 400 generates a drive signal (referred to as a warning instruction signal) for driving the notification unit 600 based on the above determination, and outputs the drive signal to the notification unit 600.
  • the notification unit 600 receives a warning instruction signal from the drive signal generation unit 430 and issues a warning.
  • the warning may be notified to both the user and the person 4 by the above-mentioned warning sound, buzzer sound, voice message or the like, or may be notified to the person 4 by a light of an LED lamp, a warning screen or a display message.
  • the control unit 400 may cause the artificial force sense generation unit 300 to generate an artificial force sense in the direction opposite to the traveling direction to urge the user to stop.
  • In step S10 of FIG. 7, the control unit 400 sends an instruction to the obstacle detection unit 200 and causes the cameras 210a and 210b of the imaging unit 210 to start imaging in the traveling direction. The imaging unit 210 thereby repeatedly images the front at a predetermined frame rate.
  • In step S20, the signal processing unit 220 of the obstacle detection unit 200 generates a distance profile.
  • the signal processing unit 220 performs image processing of obstacle detection based on the image signals of the two cameras 210a and 210b, and calculates the position of the obstacle in the coordinate system with the guidance device 1 as the coordinate origin as polar coordinates.
  • the distance profile is created as shown in FIG. 5 (b) based on the calculated polar coordinates of the obstacle.
  • FIG. 5 (b) shows an example of the profile, and is risk avoidance information indicated by the position and direction of the obstacle, and is not limited to the form of FIG. 5 (b).
  • Steps S10 and S20 described above are operation processing performed by the obstacle detection unit 200 that has received an instruction from the control unit 400 when the guidance device 1 is turned on.
  • the following step S30 and subsequent steps show processing executed by the control unit 400.
  • In step S30, the control unit 400 causes the surrounding situation determination unit 410 to determine whether an obstacle object is present in the region B (FIG. 6). If the presence of an obstacle object in the region B is detected, the determination in step S30 is affirmative and the process proceeds to step S160 in FIG. 8. If not, the determination in step S30 is negative and the process proceeds to step S40.
  • In step S40, the control unit 400 determines, by the surrounding situation determination unit 410, whether an obstacle object is present in the area A (FIG. 6). If the presence of an obstacle object is detected in the area A, the determination in step S40 is affirmative and the process proceeds to step S50. If not, the determination in step S40 is negative and the process proceeds to step S80 in FIG. 8.
  • In step S50, the control unit 400 causes the surrounding situation determination unit 410 to search for an alternative direction.
  • In step S60, the control unit 400 determines, by the surrounding situation determination unit 410 and based on the alternative direction search result, whether an alternative direction exists. If an alternative direction has been calculated, the determination in step S60 is affirmative and the process proceeds to step S70. If no alternative direction could be calculated (detected), the determination in step S60 is negative and the process proceeds to step S160 in FIG. 8.
  • In step S70, the control unit 400 determines, by the surrounding situation determination unit 410 and based on the alternative direction search result, whether there are a plurality of alternative directions. If a plurality of alternative direction candidates have been calculated, an affirmative decision is made in step S70 and the process proceeds to step S90 in FIG. 8. If only one alternative direction has been calculated, the determination in step S70 is negative and the process proceeds to step S110 in FIG. 8.
  • In step S80, the control unit 400 causes the drive signal generation unit 430 to generate a pseudo force sense control signal so that no pseudo force sense is generated, outputs the generated signal to the pseudo force sense generation unit 300 (FIG. 4), and ends the process. Thereby, the pseudo force sense generation unit 300 does not generate a pseudo force sense.
  • In step S90, the control unit 400 causes the surrounding situation determination unit 410 to calculate the deflection angle of each of the plurality of alternative directions, that is, the angle of each relative to the direction the user faces.
  • In step S100, the control unit 400 causes the surrounding situation determination unit 410 to select the alternative direction that minimizes the deflection angle. The control unit 400 further causes the drive signal generation unit 430 to generate a pseudo force sense control signal so that a pseudo force sense is generated by the pseudo force sense generation unit 300 in that alternative direction, outputs it to the pseudo force sense generation unit 300 (FIG. 4), and proceeds to step S120.
  • In step S110, the control unit 400 causes the drive signal generation unit 430 to generate a pseudo force sense control signal so that a pseudo force sense is generated by the pseudo force sense generation unit 300 in the alternative direction, outputs it to the pseudo force sense generation unit 300 (FIG. 4), and proceeds to step S120.
  • By the process of step S100 or step S110, the pseudo force sense generation unit 300 generates a pseudo force sense. The user thereby changes the traveling direction to the alternative direction.
  • In step S120, the control unit 400 determines, by the surrounding situation determination unit 410, whether the traveling direction of the guidance device 1 has been changed.
  • The change in the traveling direction is detected by the surrounding situation determination unit 410 analyzing the output from the signal processing unit 220 of the obstacle detection unit 200. More specifically, the change in the traveling direction of the guiding device 1 can be understood from the temporal change (inter-frame correlation) of the image signals from the imaging elements 212a and 212b.
  • If the surrounding situation determination unit 410 determines that the course has changed to a traveling direction different from the original one, the determination in step S120 is affirmative and the process proceeds to step S130. If not, the determination in step S120 is negative and a change in traveling direction is awaited.
  • In step S130, the control unit 400 records the original traveling direction (for example, the azimuth) in the internal storage unit 420, and proceeds to step S140. At this point, the user is walking along the changed course in the alternative direction.
  • The surrounding situation determination unit 410 detects the original traveling direction by analyzing the output from the signal processing unit 220 of the obstacle detection unit 200. That is, since the change in the traveling direction of the guidance device 1 can be known from the temporal change (inter-frame correlation) of the image signals from the imaging elements 212a and 212b, the original traveling direction is calculated from it.
• In step S140, while the user is progressing in the alternative direction, the control unit 400 causes the surrounding situation determination unit 410 to determine whether or not an obstacle exists in the original traveling direction (recorded in the storage unit 420). If the presence of an object is detected in the original traveling direction, the determination in step S140 is positive and the determination process is repeated, waiting until no object is detected in the original traveling direction. On the other hand, when the presence of an object is not detected in the original traveling direction, the determination in step S140 is negative and the process proceeds to step S150.
• In step S150, the control unit 400 causes the drive signal generation unit 430 to generate a pseudo force sense control signal for generating a pseudo force sense in the original traveling direction and to output it to the pseudo force sense generation unit 300 (FIG. 4), and the process according to FIG. 8 ends. As a result, since the pseudo force sense generation unit 300 generates the pseudo force sense in the original traveling direction, the user can walk in the same direction as the original traveling direction.
• In step S160, the control unit 400 causes the pseudo force sense generation unit 300 to generate a pseudo force sense in the direction opposite to the traveling direction. That is, the drive signal generation unit 430 generates a pseudo force sense control signal for generating a pseudo force sense in the direction opposite to the traveling direction and outputs it to the pseudo force sense generation unit 300 (FIG. 4), and the process ends.
• The user (guided person) can stop and avoid a collision by means of the pseudo force sense generated by the pseudo force sense generation unit 300.
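The flow of steps S120 to S150 can be sketched as a simple record-wait-return loop. Here `detect_obstacle_toward` is a hypothetical stand-in for the surrounding situation determination unit 410, and the polling structure is an assumption for illustration:

```python
def guide_back(original_heading, detect_obstacle_toward):
    """Sketch of steps S120-S150: after the user has turned to the
    alternative direction, wait until the original heading is clear,
    then indicate the heading in which the pseudo force sense should
    be generated to pull the user back.

    detect_obstacle_toward(heading) -> bool is a hypothetical callback
    standing in for the surrounding situation determination unit 410.
    """
    recorded = original_heading               # step S130: record the azimuth
    while detect_obstacle_toward(recorded):   # step S140: obstacle still there?
        pass                                  # repeat the determination
    return recorded                           # step S150: pull toward it
```

A real implementation would poll at the camera frame rate rather than spin; the loop here only mirrors the "repeat the determination" structure of the flowchart.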
• The guiding device 1 held by the user includes an obstacle detection unit 200 that detects the distance and direction to surrounding obstacles, a control unit 400 (surrounding situation determination unit 410) that detects an alternative direction to replace the traveling direction based on the detection result, and a pseudo force sense generation unit 300 that generates a pseudo force sense in the alternative direction. With this configuration, when the user would be likely to collide with an obstacle by continuing in the traveling direction, the user can be appropriately guided in a safe direction that avoids the obstacle. Specifically, since the pseudo force sense pulls the user in the alternative direction, the alternative direction can be conveyed in a manner easily understandable to the user.
• The guidance device 1 generates a distance profile (FIG. 5B) of the distance and direction to obstacles by the obstacle detection unit 200, and the control unit 400 (surrounding situation determination unit 410) detects the alternative direction based on the distance profile, so an appropriate alternative direction can be easily detected.
• Since the pseudo force sense generation unit 300 generates the pseudo force sense in the direction opposite to the traveling direction when the control unit 400 (surrounding situation determination unit 410) does not detect an alternative direction, the user can be guided in the direction in which the impact of a collision with an obstacle or the like becomes smaller.
• Since the obstacle detection unit 200 includes the stereo cameras 210a and 210b that capture obstacles from different viewpoints, the distance and direction to surrounding obstacles can be detected with a simple configuration, without emitting light or radio waves.
• Since the pseudo force sense generation unit 300 includes the inertial body 302 and the drive unit 304 that causes the inertial body 302 to vibrate asymmetrically in a predetermined direction, a pseudo force sense can be generated with a simple configuration.
• Since the control unit 400 starts detection of the alternative direction when an obstacle exists within a first distance (for example, 2.5 m) in a first angle range (for example, 30 degrees) including the traveling direction, the user can be guided in the alternative direction at a time suitable for avoiding the obstacle.
• If no obstacle exists in a first area within a second distance (for example, 5 m) farther than the first distance (for example, 2.5 m) in a second angle range (for example, 30 degrees) including a first direction different from the traveling direction, the control unit 400 (surrounding situation determination unit 410) detects the first direction as the alternative direction. With this configuration, a direction in which no obstacle exists even at a distance can be detected as the alternative direction.
• When an obstacle exists in the first area, the control unit 400 (surrounding situation determination unit 410) does not detect the first direction as the alternative direction. This reduces the risk that the user, after being guided in the alternative direction, immediately approaches an obstacle.
• When no obstacle exists in the first area, and no obstacle exists in a second area within the second distance (for example, 5 m) in a second angle range (for example, 30 degrees) including a second direction different from the traveling direction, the control unit 400 (surrounding situation determination unit 410) detects whichever of the first direction and the second direction is closer to the traveling direction as the alternative direction. With this configuration, a direction whose change angle from the original traveling direction is as small as possible can be detected as the alternative direction.
• When an obstacle exists within a third distance (for example, 1.5 m) closer than the first distance (for example, 2.5 m) in the first angle range (for example, 30 degrees) including the traveling direction, the pseudo force sense generation unit 300 generates a pseudo force sense in the direction opposite to the traveling direction. With this configuration, the user can be urged to stop before colliding with a nearby obstacle.
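The search over the distance profile described above might be sketched as follows. The `profile` data structure (direction in degrees mapped to obstacle distance in metres), the parameter names, and the nearest-candidate ordering are assumptions for illustration, not part of the embodiment:

```python
def find_alternative(profile, travel=0, half_width=15,
                     first_dist=2.5, second_dist=5.0):
    """Sketch of the alternative-direction search using the first/second
    angle range and distance thresholds.

    profile: dict mapping direction (degrees, 0 = traveling direction)
             to obstacle distance in metres, as in a distance profile.
    Returns None when the traveling direction is clear (no search needed),
    otherwise the candidate direction, closest to the traveling direction,
    whose sector is free of obstacles out to second_dist.
    """
    def clear(center, limit):
        # True when every profile entry in the sector exceeds the limit.
        return all(d > limit for a, d in profile.items()
                   if abs(a - center) <= half_width)

    # Start searching only when an obstacle lies within the first
    # distance in the first angle range around the traveling direction.
    if clear(travel, first_dist):
        return None

    # Candidate directions outside the first angle range, nearest first.
    candidates = sorted((a for a in profile if abs(a - travel) > half_width),
                        key=lambda a: abs(a - travel))
    for a in candidates:
        if clear(a, second_dist):
            return a
    return None  # no safe sector: the caller falls back to reversing
```

Note the sketch conflates "traveling direction clear" and "no safe sector" in the `None` return; a full implementation would distinguish them so the reverse-direction fallback of step S160 can be triggered.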
• Although the stereo cameras 210a and 210b provided with the fisheye lenses 211a and 211b were used for the imaging unit 210 of the obstacle detection unit 200, LIDAR (Light Detection and Ranging; Laser Imaging Detection and Ranging), an ultrasonic scanner, or the like may be used instead.
  • the camera may use an imaging element sensitive to visible light or an imaging element sensitive to infrared light.
• Alternatively, an imaging element having sensitivity to both visible light and infrared light may be used. By using infrared light, guidance can be properly performed even at night or in a dark place (under an overpass, in a tunnel, etc.).
  • the surrounding situation determination unit 410 uses the output of the signal processing unit 220 of the obstacle detection unit 200 to detect a change in the traveling direction of the guidance device 1 and to calculate the original traveling direction.
• Alternatively, the change in the traveling direction or the original traveling direction may be calculated using a separate angular velocity sensor or acceleration sensor.
• A position sensor using GPS (Global Positioning System) may also be used.
• A posture sensor may be incorporated inside the guiding device 1, and the notification unit 600 may warn the user when the device is not held in the correct posture.
  • a helper may lead the visually impaired person.
• In the second embodiment, when the user walks carrying the guidance device, the guidance device guides the user to follow the leading assistant. The details of such a guidance device will be described with reference to the drawings.
  • FIG. 9 is a schematic view illustrating a use situation of the guidance device 1A according to the second embodiment.
• The guidance device 1A guides the user 12 so that the user 12 follows the person 13, who leads the user 12 as a helper.
  • the appearance of the guiding device 1A is the same as that of the guiding device 1 of the first embodiment.
  • the user 12 holds the guiding device 1A in front of the body as illustrated in FIG. 1 (b).
  • FIG. 10 is a block diagram illustrating the configuration of the guidance device 1A.
  • the guidance device 1A includes a leading person detection unit 1200 that detects a leading person 13, a control unit 1400, a notification unit 1600, and a simulated force sense generation unit 1300.
• FIG. 10 differs from FIG. 2 of the first embodiment in that a leading person detection unit 1200 is provided instead of the obstacle detection unit 200 of FIG. 2.
  • the control unit 1400 of FIG. 10 corresponds to the control unit 400 of FIG. 2
  • the pseudo force sense generation unit 1300 of FIG. 10 corresponds to the pseudo force sense generation unit 300 of FIG. 2
• The notification unit 1600 of FIG. 10 corresponds to the notification unit 600 of FIG. 2.
  • the leader detection unit 1200 includes an imaging unit 1210 and a signal processing unit 1220, captures an image of a scene centered on the front direction, and detects the position of the image of the person 13 on the imaging screen.
  • the imaging unit 1210 captures an image in the front direction
  • the signal processing unit 1220 sends information indicating the position of the image of the person 13 on the imaging screen captured by the imaging unit 1210 to the control unit 1400.
  • FIG. 11 is a diagram showing an example of the configuration of the leader detection unit 1200.
  • an imaging unit 1210 is configured of a monocular camera.
  • the camera 1210 has an imaging lens 1211 and an imaging element 1212 disposed on the image forming surface of the imaging lens 1211.
  • the imaging lens 1211 is not a fisheye lens shown by reference numerals 211a and 211b in FIG. 3 but a normal imaging lens.
  • the imaging element 1212 captures a subject image formed by the imaging lens 1211.
  • the image signal output from the imaging element 1212 is sent to the signal processing unit 1220.
  • the signal processing unit 1220 detects an image of the person 13 who is a leader from the captured image.
  • a known image recognition technology is used. For example, data of an image obtained by imaging the person 13 from behind is recorded in advance in a storage unit provided in the signal processing unit 1220.
  • the signal processing unit 1220 compares the data of the image stored in the storage unit with the data of the image captured by the camera 1210. Then, the signal processing unit 1220 detects an image of the person 13 corresponding to data similar to the stored data on the imaging screen captured by the camera 1210.
  • Control Unit 1400 includes a CPU, a ROM, a RAM, and the like, and controls the operation of each unit of the guidance device 1A based on a control program.
  • Control unit 1400 includes ambient condition determination unit 1410, storage unit 1420, and drive signal generation unit 1430.
• The surrounding situation determination unit 1410 calculates the direction in which the person 13 is present relative to the guidance device 1A. Specifically, when the distance from the center of the screen of the imaging element 1212 to the center of the image of the person 13 in FIG. 11 is L2 and the focal length of the imaging lens 1211 is f2, the directional angle θ2 is given by the following equation:
• θ2 = arctan(L2 / f2) (1)
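Equation (1) can be checked numerically. The unit choice (millimetres) and the sample values below are illustrative only, not values from the embodiment:

```python
import math

def directional_angle(l2_mm, f2_mm):
    """Equation (1): theta2 = arctan(L2 / f2), where L2 is the offset of
    the leader's image from the screen centre and f2 is the focal length
    of the imaging lens 1211.  The units cancel, so millimetres or
    pixels both work as long as both arguments use the same unit."""
    return math.degrees(math.atan(l2_mm / f2_mm))

# e.g. a 3 mm offset with a 6 mm lens puts the leader about 26.6 degrees
# off the optical axis (illustrative numbers).
angle = directional_angle(3.0, 6.0)
```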
• The surrounding situation determination unit 1410 (FIG. 10) further calculates the distance from the guidance device 1A to the person 13 based on the information, output from the signal processing unit 1220, indicating the size of the image of the person 13 on the imaging screen. For example, the surrounding situation determination unit 1410 detects the shoulder width on the imaging surface from the image of the person 13 on the imaging element 1212, and calculates the image magnification β based on the ratio of the detected value to the actual shoulder width of the person 13. From the image magnification β, the distance Y is obtained by the following equation:
• Y = f2 / β (2)
  • the actual shoulder width of the person 13 is recorded in advance in the storage unit 1420 provided in the control unit 1400.
• The value of the shoulder width recorded in the storage unit 1420 may be a value calculated in advance based on the imaging distance when imaging the person 13 from behind and the size of the captured image, or an average shoulder width value may be adopted.
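A minimal sketch of this magnification-based distance estimate follows, assuming a pinhole camera model. The helper names, the sensor pixel pitch, and the relation Y ≈ f2/β are illustrative assumptions consistent with the description, not the embodiment's exact computation:

```python
def image_magnification(size_px, actual_size_mm, pixel_pitch_mm):
    """Image magnification beta = (size of the image on the sensor) /
    (actual size of the subject), e.g. using the shoulder width."""
    return (size_px * pixel_pitch_mm) / actual_size_mm

def distance_from_magnification(beta, f2_mm):
    """Pinhole approximation: a subject at distance Y is imaged with
    magnification beta ~ f2 / Y, hence Y ~ f2 / beta."""
    return f2_mm / beta

# 450 mm shoulders spanning 90 pixels at a 0.005 mm pixel pitch give
# beta = 0.001; with a 6 mm lens that corresponds to Y = 6000 mm (6 m).
beta = image_magnification(90, 450.0, 0.005)
distance_mm = distance_from_magnification(beta, 6.0)
```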
  • the surrounding situation determination unit 1410 sends, to the drive signal generation unit 1430, information indicating the calculated direction in which the person 13 is present and information indicating the distance Y from the guidance device 1A to the person 13.
• Based on the information indicating the direction in which the person 13 is present and the distance Y from the guidance device 1A to the person 13, the drive signal generation unit 1430 generates a pseudo force sense control signal for controlling the current supplied to each coil of the pseudo force sense generation unit 1300, and outputs the pseudo force sense control signal to the pseudo force sense generation unit 1300.
• The pseudo force sense generation unit 1300 (FIG. 10) is similar to the pseudo force sense generation unit 300 of FIG. 2. When the pseudo force sense control signal is input from the drive signal generation unit 1430, the pseudo force sense generation unit 1300 causes asymmetric vibration in the direction in which the person 13 is present (indicated by the directional angle θ2 in this example).
  • the user 12 who holds the guiding device 1A feels a pseudo-force sense that is pulled in the direction of the leading person 13.
• The notification unit 1600 (FIG. 10), like the notification unit 600 in FIG. 2, warns people around the guidance device 1A or transmits a message.
• When using the guiding device 1A of the above configuration, as illustrated in FIG. 1B, the user 12 directs the leading person detection unit 1200 toward the person 13, who is the leader, and holds the guiding device 1A.
  • the leader detection unit 1200 repeatedly performs imaging by the camera 1210 and detection of the person 13 by the signal processing unit 1220, for example, at a fixed cycle such as 30 frames / sec.
  • the control unit 1400 also repeatedly calculates the direction in which the person 13 is present and the distance Y to the person 13 based on the information from the leading person detection unit 1200.
• The control unit 1400 generates a pseudo force sense from the pseudo force sense generation unit 1300 when the guidance device 1A does not face the direction of the person 13 and when the distance Y from the guidance device 1A to the person 13 is not appropriate.
• The determination that the guidance device 1A faces the direction of the person 13 is performed by detecting that the image of the person 13 has entered a predetermined range from the screen center on the imaging screen of the imaging element 1212. That is, when it is detected that the directional angle θ2 of the above equation (1) is smaller than a predetermined angle θ0, it is determined that the guiding device 1A is directed toward the person 13.
• The determination as to whether or not the distance Y from the guiding device 1A to the person 13 is appropriate is performed by detecting whether the distance Y is within a predetermined range of a preset distance. For example, when it is detected that the value of the image magnification β is within the predetermined range, the control unit 1400 determines that the distance Y from the guidance device 1A to the person 13 is appropriate.
• Since the optimum distance Y between the guiding device 1A (user 12) and the person 13 changes depending on the degree of congestion around the user 12 and the like, the optimum distance Y may be set in advance in the guiding device 1A in consideration of the surrounding conditions.
• When the control unit 1400 determines that the directional angle θ2 calculated by the surrounding situation determination unit 1410 is smaller than the predetermined angle θ0 and the calculated image magnification β is within the predetermined range, the pseudo force sense generation unit 1300 does not generate a pseudo force sense. That is, when the traveling direction of the user 12 matches the direction of the person 13 and the distance Y from the guiding device 1A to the person 13 is appropriate, no pseudo force sense is generated from the pseudo force sense generation unit 1300. With this configuration, since there is no pseudo force sense from the guiding device 1A, the user 12 can keep the current traveling direction and moving speed.
• When the directional angle θ2 calculated by the surrounding situation determination unit 1410 becomes larger than the predetermined angle θ0, the control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense in the direction of the directional angle θ2.
• That is, the drive signal generation unit 1430 of the control unit 1400 generates a pseudo force sense control signal for controlling the current supplied to the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 1300 (FIG. 10) so as to generate a pseudo force sense in the direction indicated by the directional angle θ2, and outputs it to the pseudo force sense generation unit 1300.
• As a result, the pseudo force sense generation unit 1300 generates the pseudo force sense in the direction of the leading person 13.
• When the control unit 1400 determines that the directional angle θ2 calculated by the surrounding situation determination unit 1410 is smaller than the predetermined angle θ0 and the calculated image magnification β is smaller than the value β1, which is the lower limit of the predetermined range, the control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense in the traveling direction. That is, when the traveling direction of the user 12 matches the direction of the person 13 but the distance Y from the guiding device 1A to the person 13 is longer than the appropriate distance, the pseudo force sense generation unit 1300 generates a pseudo force sense in the traveling direction. With this configuration, the guidance device 1A can prompt the user 12 to increase the moving speed in the traveling direction.
• The pseudo force sense is given to the user 12 both when the leader (person 13) changes course and when the distance to the leader (person 13) becomes long. The form of the force sense given to the user 12 may differ between the former case and the latter case. For example, the vibration frequency in the latter case may be increased compared to the former, or the magnitude of the amplitude may be changed.
• When the control unit 1400 determines that the directional angle θ2 calculated by the surrounding situation determination unit 1410 is smaller than the predetermined angle θ0 and the calculated image magnification β is larger than the value β2 (β2 > β1), which is the upper limit of the predetermined range, the pseudo force sense generation unit 1300 generates a pseudo force sense in the direction opposite to the traveling direction. That is, when the traveling direction of the user 12 matches the direction of the person 13 and the distance Y from the guiding device 1A to the person 13 is shorter than the appropriate distance, the pseudo force sense generation unit 1300 generates a pseudo force sense in the direction opposite to the traveling direction. With this configuration, the guiding device 1A urges the user 12 to lower the moving speed in the direction of the person 13, so the user 12 can follow while keeping the distance to the leader constant.
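The cases above amount to a small decision rule over θ2 and β. The threshold values below are illustrative assumptions, not values from the embodiment:

```python
def force_direction(theta2, beta, theta0=10.0, beta1=0.0008, beta2=0.0015):
    """Sketch of the control rules of the second embodiment.

    theta2: directional angle to the leader (degrees, equation (1)).
    beta:   image magnification of the leader's image.
    Returns the direction of the pseudo force sense:
    'toward_leader', 'forward', 'backward', or None (no force sense;
    keep the current course and speed)."""
    if theta2 >= theta0:   # leader off to the side: steer toward them
        return "toward_leader"
    if beta < beta1:       # image too small: leader too far, speed up
        return "forward"
    if beta > beta2:       # image too large: leader too close, slow down
        return "backward"
    return None            # within the dead band: no force sense
```

The dead band between β1 and β2 corresponds to the description's point about avoiding force senses that flip direction at a single fixed distance.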
  • step S210 in FIG. 12 the control unit 1400 sends an instruction to the leading person detection unit 1200, and causes the camera of the imaging unit 1210 to start imaging in the traveling direction (forward).
  • step S220 the signal processing unit 1220 of the leading person detection unit 1200 detects the image of the leading person (in this example, the person 13) in the captured image.
• In step S230, the control unit 1400 causes the surrounding situation determination unit 1410 to calculate the directional angle θ2, and the process proceeds to step S240.
• In step S240, the control unit 1400 determines whether the calculated directional angle θ2 is smaller than the predetermined angle θ0. When θ2 < θ0 holds, the determination in step S240 is positive and the process proceeds to step S260. When θ2 < θ0 does not hold, the determination in step S240 is negative and the process proceeds to step S250.
• In step S250, the control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense in the direction of the directional angle θ2. That is, the control unit 1400 causes the drive signal generation unit 1430 to generate a pseudo force sense control signal for generating a pseudo force sense in the direction indicated by the directional angle θ2 and to output it to the pseudo force sense generation unit 1300 (FIG. 10), and the process returns to step S210.
  • the guiding device 1A urges the user 12 to change the moving direction in the direction of the person 13. After returning to step S210, the processing described above is repeated.
• In step S260, the control unit 1400 causes the surrounding situation determination unit 1410 to calculate the image magnification β, and the process proceeds to step S270.
• In step S270, the control unit 1400 causes the surrounding situation determination unit 1410 to determine whether the image magnification β is smaller than the value β1, which is the lower limit of the predetermined range. If β < β1 holds, the determination in step S270 is affirmative and the process proceeds to step S290. If β < β1 does not hold, the determination in step S270 is negative and the process proceeds to step S280.
• In step S280, the control unit 1400 causes the surrounding situation determination unit 1410 to determine whether the calculated image magnification β is larger than the value β2, which is the upper limit of the predetermined range. If β > β2 holds, the determination in step S280 is positive and the process proceeds to step S310. If β > β2 does not hold, the determination in step S280 is negative and the process proceeds to step S300.
• When proceeding to step S290, the image magnification β is smaller than the value β1, which is the lower limit of the predetermined range. Since the distance Y from the guiding device 1A to the leading person 13 can be obtained by the above equation (2), the distance Y when proceeding to step S290 is longer than the appropriate distance corresponding to an image magnification β within the predetermined range. Therefore, in step S290, the control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense in the traveling direction. That is, the control unit 1400 causes the drive signal generation unit 1430 to generate a pseudo force sense control signal for generating a pseudo force sense in the traveling direction and to output it to the pseudo force sense generation unit 1300 (FIG. 10), and the process returns to step S260.
• As a result, the guiding device 1A urges the user 12 to increase the moving speed in the traveling direction and approach the person 13. After returning to step S260, the processing described above is repeated.
• When proceeding to step S300, the image magnification β falls within the predetermined range. That is, the distance Y when proceeding to step S300 is the appropriate distance corresponding to an image magnification β within the predetermined range.
• The control unit 1400 causes the drive signal generation unit 1430 to generate a pseudo force sense control signal so as not to generate a pseudo force sense, and outputs the generated signal to the pseudo force sense generation unit 1300 (FIG. 10).
  • the guiding device 1A urges the user 12 to maintain the current traveling direction and the current traveling speed without generating a pseudo force sense. After returning to step S260, the processing described above is repeated.
  • step S310 the control unit 1400 causes the artificial force sense generation unit 1300 to generate an artificial force sense in the direction opposite to the traveling direction. That is, the control unit 1400 causes the drive signal generation unit 1430 to generate a pseudo force sense control signal for generating a simulated force sense in the direction opposite to the traveling direction, and outputs it to the simulated force sense generation unit 1300 (FIG. 10).
  • the guiding device 1A urges the user 12 to move away from the person 13 by reducing the moving speed in the traveling direction. After returning to step S260, the processing described above is repeated.
  • the guiding device 1A held by the user 12 includes a leading person detection unit 1200 that detects the person 13 who is a leading person, and a control unit 1400 (a surrounding situation determination unit 1410) that calculates the direction to the person 13; And a pseudo force sense generation unit 1300 for generating a pseudo force sense in the direction of the person 13.
• The control unit 1400 (surrounding situation determination unit 1410) calculates the distance Y to the person 13, and the pseudo force sense generation unit 1300 generates a pseudo force sense in the direction of the person 13 if the distance Y is longer than a first predetermined distance. With this configuration, if the user 12 falls too far behind the leading person 13, the user 12 can be urged to approach the person 13.
  • the pseudo force sense generation unit 1300 generates a pseudo force sense in the opposite direction to the person 13 when the distance Y is shorter than the second distance, which is shorter than the first distance. With such a configuration, when the user 12 approaches the leading person 13 too much, the user 12 can be urged to leave the person 13.
  • the simulated force sense generation unit 1300 does not generate the simulated force sense if the distance Y is longer than the second distance and shorter than the first distance. With such a configuration, it is possible to prevent the user 12 from giving an extra sense of force when the user 12 follows the person 13 leading with keeping an appropriate distance. In addition, since a width from the second distance to the first distance is given as a distance that does not give an extra sense of force, it is possible to prevent generation of a sense of force that changes direction frequently at a certain distance.
• Instead of step S260, the control unit 1400 (surrounding situation determination unit 1410) obtains the size of the image of the person 13 on the imaging surface of the imaging element 1212, and the process proceeds to step S270.
• Instead of step S270, the control unit 1400 determines whether the size of the image of the person 13 on the imaging surface of the imaging element 1212 is larger or smaller than a predetermined value. If the size of the image of the person 13 is larger than the predetermined value, it is determined that the distance Y from the guidance device 1A to the person 13 is shorter than the appropriate distance, and the process proceeds to step S310. In this case, the pseudo force sense generation unit 1300 generates a pseudo force sense in the direction opposite to the traveling direction. As a result, the guidance device 1A urges the user 12 to reduce the moving speed and move away from the person 13.
• On the other hand, when the size of the image of the person 13 is smaller than the predetermined value, it is determined that the distance Y from the guidance device 1A to the person 13 is longer than the appropriate distance, and the process proceeds to step S290.
• In this case, the pseudo force sense generation unit 1300 generates a pseudo force sense in the traveling direction.
• As a result, the guidance device 1A urges the user 12 to increase the moving speed and approach the person 13.
  • the process proceeds to step S300, and the pseudo force sense generation unit 1300 does not generate the pseudo force sense.
  • the guidance device 1A urges the user 12 to maintain the current traveling direction and moving speed.
• In the second embodiment, the control unit 1400 (surrounding situation determination unit 1410) detected the shoulder width on the imaging surface from the image of the person 13 on the imaging element 1212, and calculated the image magnification β based on the ratio of the detected value to the actual shoulder width of the person 13.
• In the third modification, instead of the shoulder width, the size of a mark printed on the clothing or the like of the person 13 is detected, and the image magnification β is calculated based on the ratio of the detected value to the size of the actual mark.
  • Clothing and the like includes all kinds of clothing such as shirts, sweaters, jackets, coats and kimonos.
  • FIG. 13 is a view illustrating a set of marks 131 and 132 printed on the back portion of the coat of the person 13.
• The control unit 1400 causes the surrounding situation determination unit 1410 to detect the length W between the marks 131 and 132 from the image of the person 13 on the imaging element 1212, and to calculate the image magnification β based on the ratio of the detected value to the length W1 (not shown) between the actually printed marks 131 and 132. Here, it is assumed that the length W1 is recorded in advance in the storage unit 1420 provided in the control unit 1400.
• Detection based on the shoulder width of the image of the person 13 is prone to error, for example, depending on the shape of the shoulders. Since the marks 131 and 132 described above serve as clear indicators regardless of the shape of the shoulders, detection errors can be suppressed.
• The control unit 1400 (surrounding situation determination unit 1410) detects the length of a figure, character, or illustration from the image of the person 13 on the imaging element 1212, and calculates the image magnification β based on the ratio of the detected value to the length of the actually printed figure, character, or illustration. It is assumed that the lengths of the actually printed figures, characters, or illustrations are stored in advance in the storage unit 1420 provided in the control unit 1400.
  • a bib on which figures, characters, or illustrations are printed may be used.
  • FIG. 14 is a block diagram illustrating the configuration of a guiding device 1B according to the third embodiment.
• The guidance device 1B includes an obstacle detection unit 2200 that calculates a distance profile indicating the distance to objects and the direction of objects viewed from the guidance device 1B, a control unit 2400, a notification unit 2600, a pseudo force sense generation unit 2300, and a motion trajectory detection unit 2700.
• FIG. 14 differs from FIG. 2 of the first embodiment in that a motion trajectory detection unit 2700 is added.
  • the obstacle detection unit 2200 in FIG. 14 corresponds to the obstacle detection unit 200 in FIG. 2
  • the control unit 2400 in FIG. 14 corresponds to the control unit 400 in FIG. 2
  • the informing unit 2600 in FIG. 14 corresponds to the informing unit 600 in FIG. 2.
• The motion trajectory detection unit 2700 includes, for example, a gyro sensor and an acceleration sensor, detects the distance and direction in which the guidance device 1B has moved at each predetermined time, calculates the motion trajectory of the guidance device 1B based on the detected information, and stores the calculation result in the internal storage unit 2710.
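Computing a motion trajectory from per-interval sensor readings can be sketched as simple dead reckoning. The sampling model, the (speed, yaw-rate) input form, and the initial heading are assumptions for illustration, not the embodiment's exact processing:

```python
import math

def integrate_trajectory(samples, dt=0.1):
    """Sketch of the motion trajectory detection unit 2700: dead
    reckoning from per-interval speed and yaw rate (e.g. derived from
    an acceleration sensor and a gyro sensor).

    samples: list of (speed_m_s, yaw_rate_deg_s) pairs, one per interval.
    Returns the list of (x, y) points of the movement locus in metres.
    """
    x = y = 0.0
    heading = 90.0          # degrees; 90 = initial traveling direction (+y)
    track = [(x, y)]
    for speed, yaw_rate in samples:
        heading += yaw_rate * dt                      # integrate the gyro
        x += speed * dt * math.cos(math.radians(heading))
        y += speed * dt * math.sin(math.radians(heading))
        track.append((round(x, 6), round(y, 6)))
    return track
```

Walking straight ahead at 1 m/s for ten 0.1 s intervals, for example, should yield a locus ending about 1 m forward of the start point.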
  • when an obstacle is detected in the traveling direction, the guidance device 1B guides the user in an alternative direction so as to bypass the obstacle. Then, after the obstacle is avoided, the user is guided back from the alternative direction to the original traveling direction based on the motion trajectory detected by the motion trajectory detection unit 2700.
  • FIG. 15 is a diagram for explaining the guiding direction by the control unit 2400.
  • the user 12 travels from the point S in the solid arrow direction (upward direction).
  • the user 12 holds the guiding device 1B in front of the body as illustrated in FIG. 1 (b).
  • the guidance device 1B detects the object 30 present in the traveling direction with the obstacle detection unit 2200, and when the user 12 reaches point T, causes the pseudo force sense generation unit 2300 to generate a pseudo force sense in the alternative direction (upper right in FIG. 15).
  • the guided user 12 changes the traveling direction to the upper right.
  • the control unit 2400 reads, from the motion trajectory detection unit 2700, information indicating the motion trajectory from the movement start point S to point T, and records it in the storage unit 2420 provided in the control unit 2400.
  • based on the read motion trajectory information, the control unit 2400 extends the motion trajectory (from point S to point T) to calculate the original route, and records information indicating the original route in the storage unit 2420. The original route is the route before being guided in the alternative direction (that is, the route heading upward from point S), and is indicated by a dashed arrow in FIG. 15.
  • the user 12 bypasses the object 30 by advancing in the alternative direction (from point T in the direction of the solid arrow (upper right)) in accordance with the pseudo force sense. While the user travels in the alternative direction, the obstacle detection unit 2200 repeatedly detects objects in the traveling direction, and also repeatedly detects the object 30 in the direction returning to the original route (upper left in FIG. 15).
  • the control unit 2400 reads information indicating the motion trajectory from the point T from the motion trajectory detection unit 2700 and records the information in the storage unit 2420 provided in the control unit 2400.
  • the control unit 2400 (surrounding situation determination unit 2410) determines the presence or absence of an object in, for example, the direction orthogonal to the traveling direction, that is, the direction returning to the original route (upper left in FIG. 15).
  • when it determines that no object is present in that direction, the control unit 2400 causes the pseudo force sense generation unit 2300 to generate a pseudo force sense in the direction returning to the original route. Thereby, the guided user 12 changes the traveling direction to the upper left.
  • the user 12 travels to point V, where the original route is rejoined, by advancing in the direction returning to the original route (from point U in the direction of the solid arrow (upper left)) in accordance with the pseudo force sense.
  • while the user travels to point V, the obstacle detection unit 2200 repeatedly detects objects in the traveling direction.
  • the control unit 2400 reads information indicating the motion trajectory from the point U from the motion trajectory detection unit 2700 and records the information in the storage unit 2420 provided in the control unit 2400.
  • the control unit 2400 confirms, based on the motion trajectory information stored in the storage unit 2420, that the user 12 has reached point V, where the original route is rejoined. That is, point V is the intersection of the path extending in the traveling direction changed at point U and the original route stored at point T. When point V is detected, the control unit 2400 causes the pseudo force sense generation unit 2300 to generate a pseudo force sense in the direction of the original route (upward in FIG. 15). Thereby, the guided user 12 changes the traveling direction upward.
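  The detection of the junction point V described above can be sketched as a 2-D ray intersection: one ray is the stored original route extending from point S, the other the return path followed after point U. The function and the coordinates mirroring FIG. 15 are illustrative assumptions.

```python
# Illustrative sketch: point V as the intersection of the return path
# (a ray from point U) with the stored original route (a ray from point S).
# Coordinates are invented to mirror FIG. 15; nothing here is from the patent.

def ray_intersection(p0, d0, p1, d1, eps=1e-9):
    """Intersection of rays p0 + t*d0 and p1 + u*d1 (t, u >= 0), or None."""
    det = d0[0] * d1[1] - d0[1] * d1[0]
    if abs(det) < eps:  # parallel paths never rejoin
        return None
    rx, ry = p1[0] - p0[0], p1[1] - p0[1]
    t = (rx * d1[1] - ry * d1[0]) / det
    u = (rx * d0[1] - ry * d0[0]) / det
    if t < 0 or u < 0:  # intersection lies behind one of the start points
        return None
    return (p0[0] + t * d0[0], p0[1] + t * d0[1])

# Original route: from S=(0, 0) heading up. Return path: from U=(2, 3),
# heading upper left. They meet at point V on the original route.
print(ray_intersection((0.0, 0.0), (0.0, 1.0), (2.0, 3.0), (-1.0, 1.0)))  # (0.0, 5.0)
```

  When the user's current position reaches (or passes) this intersection, the device switches the pseudo force sense back to the direction of the original route.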
  • the obstacle detection unit 2200 repeatedly detects objects in the traveling direction while the user travels in the direction of the original route.
  • the control unit 2400 reads information indicating the motion trajectory from point V from the motion trajectory detection unit 2700 and records it in the storage unit 2420 provided in the control unit 2400. When it confirms that the user 12 is proceeding in the direction of the original route, the control unit 2400 causes the pseudo force sense generation unit 2300 to stop generating the pseudo force sense.
  • FIG. 16 is a diagram for explaining the guiding direction by the control unit 2400.
  • in FIG. 16, objects 40 and 50 are present, in addition to the object 30, in the direction returning to the original route (upper left in FIG. 16). In this case, the control unit 2400 causes the pseudo force sense generation unit 2300 to generate, from point U, a pseudo force sense in a direction substantially parallel to the original route (upward in FIG. 16). Thereby, the guided user 12 changes the traveling direction upward.
  • the user 12 bypasses the objects 40 and 50 by advancing from point U in the direction of the solid arrow (upward) in accordance with the pseudo force sense. While the user travels upward from point U, the obstacle detection unit 2200 repeatedly detects objects in the traveling direction, and also repeatedly detects the objects 40 and 50 in the direction returning to the original route (upper left in FIG. 16).
  • the control unit 2400 reads information indicating the motion trajectory from the point U from the motion trajectory detection unit 2700 and records the information in the storage unit 2420 provided in the control unit 2400.
  • when the obstacle detection unit 2200 no longer detects the objects 40 and 50 in the direction returning to the original route (upper left in FIG. 16), the control unit 2400 confirms that the objects 40 and 50 have been bypassed.
  • the control unit 2400 then causes the pseudo force sense generation unit 2300 to generate a pseudo force sense in the direction returning to the original route. Thereby, the guided user 12 changes the traveling direction to the upper left.
  • the user 12 travels to point V2, where the original route is rejoined, by advancing in the direction returning to the original route (from point W in the direction of the solid arrow (upper left)) in accordance with the pseudo force sense.
  • while the user travels to point V2, the control unit 2400 causes the obstacle detection unit 2200 to detect objects in the traveling direction.
  • the control unit 2400 reads information indicating the motion trajectory from the point W from the motion trajectory detection unit 2700 and records the information in the storage unit 2420 provided in the control unit 2400.
  • the control unit 2400 confirms, based on the motion trajectory information stored in the storage unit 2420, that the user 12 has reached point V2, where the original route is rejoined.
  • the control unit 2400 causes the pseudo force sense generation unit 2300 to generate a pseudo force sense in the direction of the original route (upward in FIG. 16). Thereby, the guided user 12 changes the traveling direction upward.
  • the obstacle detection unit 2200 repeatedly detects objects in the traveling direction while the user travels in the direction of the original route.
  • the control unit 2400 reads information indicating the motion trajectory from the point V2 from the motion trajectory detection unit 2700 and records the information in the storage unit 2420 provided in the control unit 2400.
  • the control unit 2400 causes the pseudo force sense generation unit 2300 to stop the generation of the pseudo force sense.
  • as described above, the guidance device 1B held by the user 12 can guide the user 12 back to the original traveling direction (the original route) after guiding the user 12 in a safe alternative direction that avoids the obstacle 30. Further, since the direction of the original route is conveyed to the user 12 by the pseudo force sense generated by the pseudo force sense generation unit 2300, the direction in which to advance can be conveyed to the user 12 in an easy-to-understand manner.
  • while guiding the user 12 in the alternative direction, the guidance device 1B continues detecting, with the obstacle detection unit 2200, whether or not the obstacle 30 is present in the direction of the original route. With this configuration, the control unit 2400 can appropriately determine whether the obstacle 30 has been bypassed.
  • after the obstacle 30 has been bypassed, the guidance device 1B changes the direction of the pseudo force sense generated by the pseudo force sense generation unit 2300 to the direction toward the original route. With this configuration, the user 12 can be naturally guided toward the original route.
  • in the guidance device 1B, the control unit 2400 determines whether the original route has been rejoined based on the motion trajectory detected by the motion trajectory detection unit 2700 and the information on the original route before guidance in the alternative direction. With this configuration, arrival at the point of rejoining the original route can be determined appropriately.
  • upon rejoining the original route, the guidance device 1B changes the generation direction of the pseudo force sense by the pseudo force sense generation unit 2300 to the direction of the original route. With this configuration, the user 12 can be properly guided along the original route.
  • after the user 12 has returned to the original route, the guidance device 1B causes the pseudo force sense generation unit 2300 to stop generating the pseudo force sense. With this configuration, the user 12 can be properly notified of having returned to the original route.
  • after guiding the user 12 in the alternative direction, the guidance device 1B may refrain from returning toward the original route even upon reaching point U in FIG. 15 (that is, even after confirming that the object 30 has been bypassed); in that case, the generation direction of the pseudo force sense by the pseudo force sense generation unit 2300 is set, from point U, to the direction substantially parallel to the original route (upward in FIG. 16).
  • in the above description, the control unit 2400 of the guidance device 1B extends the motion trajectory (from point S to point T) based on the motion trajectory information to calculate the original route, but the original route may instead be acquired as follows.
  • the control unit 2400 of the guidance device 1B acquires map information in advance via a network (not shown), and records the map information in the storage unit 2420 provided in the control unit 2400.
  • the control unit 2400 sets the current position, calculated based on information from GPS (Global Positioning System) satellites, as point S, sets the destination input in advance by the user 12 as point G, and takes the route from point S to point G, calculated based on the map information stored in the storage unit 2420, as the original route.
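  The map/GPS variant above can be sketched minimally as computing the heading from the GPS-derived point S to the destination G. A real implementation would follow the stored map data; here the route is reduced to a straight-line heading on a flat-earth approximation, and the coordinates and function name are invented for the example.

```python
import math

# Illustrative sketch: compass heading from the GPS position S to the preset
# destination G, using a flat-earth approximation valid for short distances.
# Coordinates below are made up; nothing here is from the patent.

def heading_to_destination(s_lat, s_lon, g_lat, g_lon):
    """Compass heading in degrees (0 = north, 90 = east) from S to G."""
    dy = g_lat - s_lat
    dx = (g_lon - s_lon) * math.cos(math.radians(s_lat))  # shrink longitude
    return math.degrees(math.atan2(dx, dy)) % 360.0

# S in Tokyo; G a short walk due east of S.
print(round(heading_to_destination(35.681, 139.767, 35.681, 139.777)))  # 90
```

  The pseudo force sense would then be generated toward this heading whenever the user deviates from the computed route.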
  • use of the guidance device is not limited to the visually impaired.
  • it may also be used as a guiding device when an evacuation route is difficult to confirm due to smoke or the like during a disaster, or as a guiding device in a mountain-climbing accident.
  • in that case, as the obstacle detection unit, a detection unit that receives infrared light or the like, which is not easily affected by scattering by smoke or fog, may be used.

Landscapes

  • Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Rehabilitation Therapy (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Pain & Pain Management (AREA)
  • Epidemiology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Vascular Medicine (AREA)
  • Traffic Control Systems (AREA)

Abstract

This guiding device includes: an obstacle detection unit that detects the distance to and the direction of an obstacle in the surrounding area; an alternative direction detection unit that detects an alternative direction to the direction of travel on the basis of the detection result from the obstacle detection unit; and a pseudo-haptic force generation unit that generates pseudo-haptic force in the alternative direction.

Description

Guidance device
The present invention relates to a guidance device.
In order to assist the walking of a visually impaired person or the like, a device has been proposed that detects obstacles and conveys the information to the user by a method such as vibration (see Patent Document 1).
However, with the conventional technique there is a limit to the information that can be conveyed by vibration, and there is the problem that sufficient detail cannot be conveyed to the user.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2010-158472
A guidance device according to a first aspect of the present invention includes: an obstacle detection unit that detects the distance and direction to surrounding obstacles; an alternative direction detection unit that detects, based on the detection result of the obstacle detection unit, an alternative direction to replace the traveling direction; and a pseudo force sense generation unit that generates a pseudo force sense in the alternative direction.
A guidance device according to a second aspect of the present invention includes: a leader detection unit that detects a leader; a calculation unit that calculates the direction to the leader; and a pseudo force sense generation unit that generates a pseudo force sense in the direction of the leader.
FIG. 1(a) is a perspective view illustrating the guidance device according to the first embodiment, and FIG. 1(b) is a view illustrating a state in which the guidance device is held in the right hand.
FIG. 2 is a block diagram illustrating the configuration of the guidance device.
FIG. 3 is a view showing an example of the configuration of the obstacle detection unit.
FIG. 4 is a view explaining the configuration of the pseudo force sense generation unit.
FIG. 5(a) is a view explaining the situation of surrounding objects, and FIG. 5(b) is a view illustrating a distance profile.
FIG. 6(a) is a view explaining the situation of surrounding objects at a certain time, and FIG. 6(b) is a view explaining the situation of the surrounding objects three seconds later.
FIG. 7 is a flowchart explaining the flow of processing executed by the control unit.
FIG. 8 is a flowchart explaining the flow of processing executed by the control unit.
FIG. 9 is a schematic view explaining a use scene of the guidance device according to the second embodiment.
FIG. 10 is a block diagram illustrating the configuration of the guidance device.
FIG. 11 is a view showing an example of the configuration of the leader detection unit.
FIG. 12 is a flowchart explaining the flow of processing executed by the control unit.
FIG. 13 is a view illustrating marks printed on the back of a jacket.
FIG. 14 is a block diagram illustrating the configuration of the guidance device according to the third embodiment.
FIG. 15 is a view explaining the guidance direction by the control unit.
FIG. 16 is a view explaining the guidance direction by the control unit.
(First Embodiment)
The guidance device according to the first embodiment of the invention assists and guides the walking of a visually impaired person. When the user walks carrying the guidance device, the guidance device guides the user in a direction that avoids obstacles. The details of such a guidance device will be described with reference to the drawings.
FIG. 1(a) is a perspective view illustrating the guidance device 1 according to the first embodiment. In FIG. 1(a), the guidance device 1 has a cylindrical main body 10 provided with an obstacle detection unit 200, a pseudo force sense generation unit 300, and a notification unit 600.
When using the guidance device 1, the user holds the main body 10 of the guidance device 1 with one hand, for example in front of the body. FIG. 1(b) is a view illustrating a state in which the guidance device 1 is held in the right hand. The user keeps the posture of the main body 10 perpendicular to the ground (floor surface), keeps the obstacle detection unit 200 horizontal, and faces the front of the obstacle detection unit 200 squarely in the direction in which the user moves (that is, the traveling direction).
The main body 10 of the guidance device 1 is provided with a guide groove 11 so that the user can grip it correctly without looking at it. The guide groove 11 is located on the front of the main body 10, on the front side of the obstacle detection unit 200. Therefore, when the user grips the main body 10 with the guide groove 11 facing the traveling direction, the front of the obstacle detection unit 200 squarely faces the traveling direction. In the present embodiment, the direction in which the user wants to move is referred to as the traveling direction.
According to FIG. 1(b), when the user grips the main body 10 so that the pad of a finger touches the guide groove 11, the obstacle detection unit 200 faces the traveling direction. The user can point the obstacle detection unit 200 in the traveling direction by, for example, adjusting the grip so that the pad of the index finger touches the guide groove 11.
<Configuration of the Guidance Device>
FIG. 2 is a block diagram illustrating the configuration of the guidance device 1. In FIG. 2, the guidance device 1 includes an obstacle detection unit 200, a control unit 400, a pseudo force sense generation unit 300, and a notification unit 600.
1. Obstacle Detection Unit
The obstacle detection unit 200 includes an imaging unit 210 and a signal processing unit 220, and detects objects within a predetermined range centered on the traveling direction. For example, the imaging unit 210 images the traveling direction, and the signal processing unit 220 detects the distance to each object imaged by the imaging unit 210 and generates information indicating a distance profile based on the detection result. In the present embodiment, the distance profile is information indicating the distance from the guidance device 1 to an object and the direction of the object as viewed from the guidance device 1. For example, a function giving the distance to an object in terms of the angle indicating its direction is a distance profile.
FIG. 3 is a view showing an example of the configuration of the obstacle detection unit 200. In FIG. 3, the imaging unit 210 of FIG. 2 is constituted by a pair of cameras 210a and 210b arranged a predetermined distance L (also referred to as the baseline length) apart. The cameras 210a and 210b have fisheye lenses 211a and 211b, and image sensors 212a and 212b arranged on the image-forming planes of the fisheye lenses 211a and 211b, respectively.
The image sensors 212a and 212b capture the subject images formed by the fisheye lenses 211a and 211b. The image signals output from the image sensors 212a and 212b are each sent to the signal processing unit 220. Based on the correlation between the image signal from the image sensor 212a and the image signal from the image sensor 212b, the signal processing unit 220 calculates the distance from the obstacle detection unit 200 (guidance device 1) to each imaged object as a function of the angle from the front direction of the obstacle detection unit 200 (in this example, the traveling direction), that is, the distance profile.
The fisheye lenses 211a and 211b are equidistant-projection fisheye lenses. With this configuration, in the images obtained by the image sensors 212a and 212b, the distance from each optical axis Xa, Xb represents the direction of an object. Also, the difference between the positions at which the same object is imaged in the two images represents the parallax. Therefore, based on the image signals from the image sensors 212a and 212b, the distance to the object can be obtained using the principle of triangulation.
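  The triangulation described above can be sketched as follows. With equidistant projection the image height is r = f·θ, so a pixel's distance from the optical axis directly gives the ray angle. The baseline, focal length, and object position below are illustrative assumptions, not values from the patent.

```python
import math

# Illustrative sketch of stereo triangulation with equidistant-projection
# fisheye lenses. Cameras a (left) and b (right) look in the same direction;
# ray angles are signed positive toward the right. All numbers are assumed.

def ray_angle(image_radius_mm: float, focal_length_mm: float) -> float:
    """Equidistant projection r = f * theta, so theta = r / f (radians)."""
    return image_radius_mm / focal_length_mm

def triangulate_z(theta_a: float, theta_b: float, baseline_m: float) -> float:
    """Forward distance z to the object from the two ray angles."""
    return baseline_m / (math.tan(theta_a) - math.tan(theta_b))

# Object about 4 m straight ahead, baseline 0.1 m, focal length 2 mm:
# it appears 0.025 mm right of the left camera's axis and 0.025 mm left
# of the right camera's axis.
theta_a = ray_angle(0.0250, 2.0)   # +0.0125 rad
theta_b = ray_angle(-0.0250, 2.0)  # -0.0125 rad
print(round(triangulate_z(theta_a, theta_b, 0.1), 2))  # ~4.0 (meters)
```

  Repeating this for every matched image point over the 180-degree field yields the (angle, distance) pairs that make up the distance profile.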
2. Control Unit
The control unit 400 (FIG. 2) is composed of a CPU, ROM, RAM, and the like, and controls the operation of each part of the guidance device 1 based on a control program. The control unit 400 includes a surrounding situation determination unit 410, a storage unit 420, and a drive signal generation unit 430. The surrounding situation determination unit 410 analyzes the information indicating the distance profile output from the signal processing unit 220, and determines the situation of objects around the guidance device 1, the possibility of collision with an object, whether an object can be avoided, the direction in which to avoid it, and so on.
When the traveling direction is changed, as described in detail later, the storage unit 420 stores information indicating the original traveling direction before the change, for example the azimuth.
Based on the information determined by the surrounding situation determination unit 410, the drive signal generation unit 430 generates a pseudo force sense control signal, which is a drive signal for controlling the currents supplied to the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 300 (FIG. 4), described in detail later. The pseudo force sense control signal is output from the control unit 400 to the pseudo force sense generation unit 300. The drive signal generation unit 430 also generates drive signals for driving the notification unit 600, described later.
3. Pseudo Force Sense Generation Unit
The pseudo force sense generation unit 300 (FIG. 2) makes the user feel a sensation of being pulled in a certain direction (referred to as a pseudo force sense), as disclosed, for example, in Japanese Unexamined Patent Application Publication No. 2015-226388. FIG. 4 is a view explaining the configuration of the pseudo force sense generation unit 300. In FIG. 4, the pseudo force sense generation unit 300 is composed of a thin cylindrical container 301, a magnetized inertial body 302 arranged inside it, an elastic body 303 that holds the inertial body 302 in the cylindrical container 301, and a drive unit 304. The drive unit 304 is provided with coils 305-1, 305-2, 305-3, and 305-4.
When current is passed through the coils 305-1, 305-2, 305-3, and 305-4 of the drive unit 304, the inertial body 302, driven by electromagnetic induction, moves in the plane direction and generates a pseudo force sense. The currents flowing through the coils 305-1, 305-2, 305-3, and 305-4 are controlled based on the pseudo force sense control signal from the control unit 400 (drive signal generation unit 430).
Specifically, when the pseudo force sense control signal is input from the drive signal generation unit 430, the pseudo force sense generation unit 300 passes a drive current through each of the coils 305-1, 305-2, 305-3, and 305-4 based on the signal. As a result, the inertial body 302 is driven and produces an asymmetric vibration in a predetermined direction. An asymmetric vibration is a vibration that swings strongly and quickly in one direction and weakly and slowly in the opposite direction. Consequently, the user holding the guidance device 1 feels a pseudo force sense as if being pulled in the predetermined direction.
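  The asymmetric vibration described above can be sketched as a drive waveform with a brief strong stroke one way and a long gentle return stroke. The function name and sample counts are illustrative assumptions; the point is only that the waveform has zero mean (so it is a vibration, not net motion) while its asymmetry is what the hand perceives as a pull.

```python
# Illustrative sketch of one cycle of an asymmetric drive waveform: a fast,
# strong push in one direction, then a slow, weak return. Sample counts and
# amplitude are arbitrary; nothing here is from the patent.

def asymmetric_cycle(strong_samples: int = 2, weak_samples: int = 8,
                     amplitude: float = 1.0):
    """One cycle of drive values. The two phases cancel exactly, so the
    inertial body ends each cycle where it began."""
    push = [amplitude] * strong_samples
    ret = [-amplitude * strong_samples / weak_samples] * weak_samples
    return push + ret

cycle = asymmetric_cycle()
print(cycle)       # [1.0, 1.0, -0.25, -0.25, ..., -0.25]
print(sum(cycle))  # 0.0 -- zero mean: a vibration, not net displacement
```

  Rotating which coils receive the strong phase selects the planar direction of the perceived pull.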
4. Notification Unit
The notification unit 600 (FIG. 2) is provided to alert people around the guidance device 1 or to convey messages, and is composed of at least one of a speaker, a buzzer, an LED lamp, a display device, and the like. When the notification unit 600 includes a speaker, a warning sound is emitted or a voice message is played from the speaker in response to a drive signal from the control unit 400 (drive signal generation unit 430).
When the notification unit 600 includes a buzzer, a buzzer sound is emitted in response to a buzzer drive signal from the control unit 400 (drive signal generation unit 430). When the notification unit 600 includes an LED lamp, the LED lamp is lit or blinked in response to a lamp drive signal from the control unit 400 (drive signal generation unit 430). When the notification unit 600 includes a display device, a warning screen or a message is displayed on the display device in response to a display drive signal from the control unit 400 (drive signal generation unit 430).
<Operation of the Guidance Device>
When using the guidance device 1 configured as described above, the user holds it with the obstacle detection unit 200 facing the traveling direction, as illustrated in FIG. 1(b). The obstacle detection unit 200 detects objects over a range of 90 degrees to the left and right of the traveling direction, 180 degrees in total. The obstacle detection unit 200 then calculates a distance profile indicating the position and direction of each object detected within that 180-degree range.
FIG. 5(a) is a view explaining the situation of objects around the user (that is, the guidance device 1), seen from above. The guidance device 1 is assumed to be pointed in the user's traveling direction (arrow direction). According to FIG. 5(a), a wall 2, indicated by a vertically long rectangle, is present to the left of the guidance device 1, and three persons 3, 4, and 5 are also present.
The obstacle detection unit 200 of the guidance device 1 uses as its detection range, for example, the semicircle on the traveling-direction side of a circle of radius 5 m centered on the guidance device 1. The obstacle detection unit 200 therefore detects, as objects, the wall 2 to the left of the guidance device 1, the person 3 diagonally ahead to the left, the person 4 directly ahead, and the person 5 diagonally ahead to the right.
Here, the detection range of the obstacle detection unit 200 may be determined based on the walking speed of the user or of other people. For example, the radius of the detection range may be made longer than 5 m when a walking speed faster than 1.4 m/sec is assumed, and shorter than 5 m when a walking speed slower than 1.3 m/sec is assumed; it may be changed as appropriate.
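  The speed-dependent detection radius described above can be sketched as radius = speed × look-ahead time. The ~3.5 s horizon is an assumption chosen so that a brisk walk maps to roughly the 5 m default; it is not a value from the patent.

```python
# Illustrative sketch: scale the detection radius with the assumed walking
# speed. The 3.5 s look-ahead horizon is an assumption, not from the patent.

def detection_radius_m(walking_speed_m_s: float, horizon_s: float = 3.5) -> float:
    """Radius needed to see obstacles `horizon_s` seconds of walking away."""
    return walking_speed_m_s * horizon_s

print(detection_radius_m(1.0))  # 3.5 -- slower walker, shorter range
print(detection_radius_m(1.5))  # 5.25 -- faster walker, longer range
```

  Any monotone mapping from speed to radius would serve; the linear form simply keeps the warning time constant.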
FIG. 5(b) is a view illustrating the distance profile calculated when the obstacle detection unit 200 detects the objects shown in FIG. 5(a). In FIG. 5(b), the distance profile is displayed in polar coordinates indicating the distance r from the guidance device 1 to an object and the angle θ from the traveling direction to the direction of the object. The maximum value of the distance r is clamped at 5 m because the detection range is set to 5 m.
According to the distance profile of FIG. 5(b), a direction in which a profile with a distance r shorter than 5 m is generated (for example, the direction indicated by H2) indicates that an object is present near the guidance device 1. A direction in which the profile shows a distance r of 5 m (for example, the direction indicated by H1) indicates that no object exists within 5 m. Accordingly, when the obstacle detection unit 200 detects no object at all, the distance profile is a semicircle of distance r = 5 m.
The pair of cameras 210a and 210b constituting the imaging unit 210 of the obstacle detection unit 200 captures images at a fixed rate, for example 30 frames/sec. The signal processing unit 220 of the obstacle detection unit 200 generates and updates the distance profile each time the cameras 210a and 210b capture an image. The obstacle detection unit 200 sends information indicating the latest distance profile to the control unit 400.
The distance profile is calculated by the obstacle detection unit 200 as follows.
The obstacle detection unit 200 uses the signal processing unit 220 to detect objects present in front of the guidance device 1 based on the images captured by the imaging elements 212a and 212b of the imaging unit 210. Each object is expressed in polar coordinates by the above angle θ and distance r. In FIG. 5(a), the persons 3 to 5 are detected as, for example, (θ1, r1), (θ2, r2), and (θ3, r3), and a distance profile like that of FIG. 5(b) is created from these three polar coordinates. As described later, the vibration direction of the pseudo force sense instructing a change of traveling direction is calculated from the data (θ1, r1), (θ2, r2), and (θ3, r3). It is therefore not essential to generate the visual distance profile of FIG. 5(b) in order to determine the vibration direction of the pseudo force sense; the pseudo force sense may instead be generated directly from polar-coordinate data whose distances are 5 m or less.
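The clamped polar profile described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the 1-degree binning, the function name, and the sample coordinates for persons 3 to 5 are all assumptions.

```python
R_MAX = 5.0  # detection radius in meters (the clamp value of FIG. 5(b))

def build_distance_profile(detections, angular_bins=180):
    """Build a clamped distance profile over the forward semicircle.

    detections: list of (theta_deg, r_m) polar coordinates, with theta
    measured from the traveling direction (-90..+90 degrees).
    Returns one distance per 1-degree bin; bins with no object closer
    than R_MAX stay at R_MAX (the r = 5 m semicircle of FIG. 5(b)).
    """
    profile = [R_MAX] * angular_bins          # default: nothing within 5 m
    for theta_deg, r in detections:
        if -90.0 <= theta_deg <= 90.0 and r < R_MAX:
            i = min(int(theta_deg + 90.0), angular_bins - 1)  # -90..+90 -> 0..179
            profile[i] = min(profile[i], r)   # keep the nearest object per bin
    return profile

# Persons 3-5 of FIG. 5(a), with illustrative coordinates:
profile = build_distance_profile([(-40.0, 3.0), (0.0, 4.2), (35.0, 3.5)])
```

With no detections, the function returns the all-5-m semicircular profile, matching the text above.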
The situation around the guidance device 1 is determined by the control unit 400 as follows.
The control unit 400 uses the surrounding situation determination unit 410 and the distance profiles created over multiple frames to calculate, for each moving object, its moving direction and its relative moving speed (referred to as the relative speed), which accounts for both the moving speed of the user (guidance device 1) and that of the object. This makes it possible to determine, object by object, whether the object is approaching the user.
For example, each detected object is identified by shape, color, and so on and assigned a number; by tracking the object carrying the same number across multiple frames, the position and moving direction of that object and the relative speed between the object and the user of the guidance device 1 can be calculated.
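The frame-to-frame velocity estimate can be sketched as below, assuming the 30 frames/sec cycle stated earlier; the function name and the conversion to Cartesian coordinates are illustrative choices, not part of the patent.

```python
import math

FRAME_PERIOD = 1.0 / 30.0  # 30 frames/sec imaging cycle

def relative_velocity(prev, curr, dt=FRAME_PERIOD):
    """Estimate an object's velocity relative to the device from the polar
    coordinates (theta_deg, r) of the same numbered object in two frames."""
    def to_xy(theta_deg, r):
        a = math.radians(theta_deg)
        return r * math.sin(a), r * math.cos(a)  # x: lateral, y: forward
    x0, y0 = to_xy(*prev)
    x1, y1 = to_xy(*curr)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    speed = math.hypot(vx, vy)                # relative speed in m/sec
    approaching = curr[1] < prev[1]           # radial distance decreasing
    return speed, approaching

# An object directly ahead closing by 5 cm in one frame (1.5 m/sec):
speed, approaching = relative_velocity((0.0, 4.00), (0.0, 3.95))
```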
<When no pseudo force sense is generated>
When the obstacle detection unit 200 detects no object within a predetermined angle range of the traveling direction (for example, plus or minus 15 degrees), the control unit 400 does not cause the pseudo force sense generation unit 300 to generate a pseudo force sense. More specifically, in the distance profile described above, the surrounding situation determination unit 410 of the control unit 400 sets a region A, defined by the 30-degree range centered on the traveling direction and a first predetermined distance from the guidance device 1 (for example, 2.5 m), and a region B, defined by the same 30-degree range and a second predetermined distance from the guidance device 1 (for example, 1.5 m), and determines whether an object is present in each of the regions A and B. Based on this determination, it decides whether or not the pseudo force sense generation unit 300 is to generate a force sense.
FIG. 6(a) is a diagram explaining the situation of objects around the user (that is, the guidance device 1) at a certain point in time, showing the detection range of the obstacle detection unit 200 from above. In FIG. 6(a), the solid semicircle 6 represents a distance of 2.5 m from the guidance device 1, and the broken semicircle 7 represents a distance of 1.5 m. The straight line 8a indicates 15 degrees to the right of the traveling direction, and the straight line 8b indicates 15 degrees to the left. The area enclosed by the semicircle 6 and the straight lines 8a and 8b is the region A described above, and the area enclosed by the semicircle 7 and the straight lines 8a and 8b is the region B.
In the example shown in FIG. 6(a), only the person 4, at 2.5 m < r < 5 m, is detected in the fan-shaped area between the straight lines 8a and 8b, and its polar coordinates are calculated; since no object is detected in the regions A and B in the traveling direction, the signal processing unit 220 calculates no polar coordinates for an object there. The surrounding situation determination unit 410 can therefore determine, from the distance profile input from the signal processing unit 220, that no object obstructing the user's movement (referred to as an obstacle) is present; that is, it determines that the user can safely proceed in the traveling direction. Based on this determination, the drive signal generation unit 430 of the control unit 400 generates a pseudo force sense control signal so that no current is supplied to the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 300 (FIG. 4), and outputs it to the pseudo force sense generation unit 300. As a result, the pseudo force sense generation unit 300 generates no pseudo force sense.
The above description is an example in which a pseudo force sense is generated only when transmitting a traveling direction changed in response to obstacle detection. However, even when no obstacle is detected, a pseudo force sense may be generated toward the front in the traveling direction in order to convey the current traveling direction.
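The region test above reduces to a simple angular and radial check; a sketch using the example thresholds from the text follows (function name and return convention are assumptions):

```python
def classify_region(theta_deg, r):
    """Classify a detected object against the regions of FIG. 6(a).

    Region A: within +/-15 degrees of the traveling direction and r <= 2.5 m.
    Region B: within +/-15 degrees and r <= 1.5 m (nested inside A).
    Returns 'B', 'A', or None (outside both regions).
    """
    if abs(theta_deg) > 15.0:
        return None            # outside the 30-degree forward wedge
    if r <= 1.5:
        return 'B'             # collision warning zone
    if r <= 2.5:
        return 'A'             # avoidance (alternative-direction) zone
    return None

# Person 4 of FIG. 6(a), 4 m directly ahead: in neither region,
# so no pseudo force sense is generated.
```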
<When a pseudo force sense is generated in an alternative direction>
When the surrounding situation determination unit 410 calculates an alternative direction, the control unit 400 causes the pseudo force sense generation unit 300 to generate a pseudo force sense in that alternative direction. FIG. 6(b) is a diagram explaining the situation of objects around the user (that is, the guidance device 1), for example 3 seconds after FIG. 6(a). Because the guidance device 1 moves in the traveling direction as the user walks, the relative position of the wall 2 with respect to the guidance device 1 moves downward in FIG. 6(b). Likewise, since the person 5 remains in place, the relative position of the person 5 with respect to the guidance device 1 moves downward in FIG. 6(b), just as the wall 2 does. In contrast, since the person 3 moves in the same traveling direction as the user, the relative position of the person 3 with respect to the guidance device 1 is almost unchanged from FIG. 6(a).
Since the person 4 moves in the direction opposite to the user, the relative position of the person 4 with respect to the guidance device 1 moves far downward in FIG. 6(b) and approaches the guidance device 1. As described above, whether each object is moving, its moving direction, and its moving speed can be detected and calculated from the change in the polar coordinates of the same object (obstacle) detected frame by frame.
According to FIG. 6(b), the person 4 has entered the region A, so the surrounding situation determination unit 410 detects, from the polar coordinates of the person 4 corresponding to FIG. 6(b), that is, from the distance profile, that an object is present in the region A. The surrounding situation determination unit 410 determines that the person 4 may become an obstacle if the user keeps moving as is, and searches for alternative courses on both sides of the traveling direction. For example, on both sides of the traveling-direction region, regions DR and DL are set as alternative direction candidates, each covering a predetermined angle range (for example, 30 degrees) centered on a direction shifted 30 degrees clockwise or counterclockwise from the traveling direction. The regions DR and DL are limited to a radius of 5 m from the guidance device 1. Whether an object is present in each of these regions is determined, and if none is present, that direction is determined to be an alternative direction candidate. In FIG. 6(b), no object is present in the region DR, so that direction is taken as the alternative direction.
The surrounding situation determination unit 410 then calculates, from the position of the person 4 and the moving speed of the person 4, an alternative direction that avoids a collision with a comfortable margin while minimizing the amount of change (angle) from the current traveling direction.
The calculation of the alternative direction performed by the surrounding situation determination unit 410 is further described below.
(i) For example, when the person 4 is in the outer part of the region A (beyond 2 m) and the moving speed (relative speed) of the person 4 approaching the user (guidance device 1) is slow (for example, 2 m/sec or less), the direction with the smaller change angle from the current traveling direction (for example, a change angle of 30 degrees), among the predetermined angle ranges containing no object, is determined to be the alternative direction, as described above.
(ii) When, for example, the person 4 is in the inner part of the region A (within 2 m), or the moving speed (relative speed) of the person 4 approaching the user (guidance device 1) is fast (2 m/sec or more), the change angle from the current traveling direction, within a predetermined angle range containing no object, is made larger than the above 30 degrees (for example, 35 degrees).
(iii) The above (i) and (ii) are examples of varying the change angle condition by condition. Alternatively, the change angle may be increased continuously as the position of the person 4 approaches the region B, or as the moving speed (relative speed) of the approaching person 4 increases.
(iv) Furthermore, when the velocity vector of the person 4 approaching the user (guidance device 1) points toward the predetermined angle range containing no object, the change angle is increased further according to the degree of that tendency. Referring to FIG. 6(b), when the person 4 moves toward the diagonal right front of the user (guidance device 1), the change angle is made larger than when the person 4 moves straight toward the user (guidance device 1).
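Rules (i) through (iv) can be condensed into a small decision function. The thresholds (2 m, 2 m/sec, 30 and 35 degrees) are the examples given in the text; the extra increment for rule (iv) is an assumed illustrative value, since the text only says the angle is "increased further".

```python
def change_angle(r, rel_speed, toward_free_range=False):
    """Choose the change angle (degrees) from the current traveling
    direction, following rules (i)-(iv) above.

    r: distance to the obstacle in region A (meters).
    rel_speed: relative approach speed (m/sec).
    toward_free_range: True if the obstacle's velocity vector points
    toward the object-free angle range (rule (iv)).
    """
    angle = 30.0                        # (i) default change angle
    if r <= 2.0 or rel_speed >= 2.0:    # (ii) close by, or approaching fast
        angle = 35.0
    if toward_free_range:               # (iv) drifting toward the free range
        angle += 5.0                    # assumed increment, not from the text
    return angle
```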
When the surrounding situation determination unit 410 has calculated multiple alternative direction candidates, it adopts as the alternative direction the one closest to the current traveling direction. For example, when a direction 30 degrees to the right of the traveling direction and a direction 40 degrees to the left are both candidates, the direction 30 degrees to the right is adopted. In the example of FIG. 6(b), the person 3 and the wall 2 are present in the region DL on the left side of the traveling direction, so that region contains no direction in which the distance to an object exceeds 5 m, and the only alternative direction is in the region DR on the right side of the traveling direction.
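The tie-breaking rule above amounts to a minimum-by-absolute-angle selection, sketched here with signed degrees (positive = right of the traveling direction; the convention and function name are assumptions):

```python
def pick_alternative(candidates):
    """From alternative-direction candidates (signed degrees from the
    current traveling direction), adopt the one closest to it."""
    return min(candidates, key=abs) if candidates else None

# 30 degrees right (+30) vs 40 degrees left (-40): the right one is adopted.
```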
Based on the information indicating the alternative direction determined by the surrounding situation determination unit 410, the drive signal generation unit 430 of the control unit 400 generates a pseudo force sense control signal that controls the current supplied to the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 300 (FIG. 4) so as to produce a pseudo force sense in the alternative direction, and outputs it to the pseudo force sense generation unit 300. The pseudo force sense generation unit 300 thereby generates a pseudo force sense in the alternative direction.
<When a pseudo force sense is generated in the direction opposite to the traveling direction>
When the surrounding situation determination unit 410 has determined that collision avoidance against an object (obstacle) is necessary but no alternative direction could be calculated, it determines that no alternative direction exists. Based on this determination, the drive signal generation unit 430 of the control unit 400 generates a pseudo force sense control signal for producing a pseudo force sense in the direction opposite to the traveling direction and sends the drive signal to the pseudo force sense generation unit 300. That is, the drive signal generation unit 430 generates a pseudo force sense control signal that controls the current supplied to the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 300 (FIG. 4), and outputs it to the pseudo force sense generation unit 300. The pseudo force sense generation unit 300 thereby generates a pseudo force sense in the direction opposite to the traveling direction. The pseudo force sense is directed backward in order to prompt the user to stop, because collision avoidance by guidance is difficult.
<When a warning is issued>
For example, when the person 4 approaches the guidance device 1 (user) more closely than in FIG. 6(b) and enters the region B, the surrounding situation determination unit 410 detects, from the polar coordinates representing the position of the person 4, in other words from the distance profile created from those polar coordinates, that an object (obstacle) is present in the region B. When an object is present in the region B, the surrounding situation determination unit 410 determines that the user may collide with the person 4. Based on this determination, the drive signal generation unit 430 of the control unit 400 generates a drive signal for driving the notification unit 600 (referred to as a warning instruction signal) and outputs it to the notification unit 600.
The notification unit 600 receives the warning instruction signal from the drive signal generation unit 430 and issues a warning. The warning may notify both the user and the person 4 by the warning sound, buzzer sound, or voice message described above, or may notify the person 4 by the light of an LED lamp, a warning screen, or a displayed message.
Note that when the surrounding situation determination unit 410 detects that an object is present in the region B, it likewise recognizes that collision avoidance by guidance is difficult. The control unit 400 may therefore cause the pseudo force sense generation unit 300 to generate a pseudo force sense in the direction opposite to the traveling direction to prompt the user to stop.
<Description of the flowcharts>
The flow of processing executed in the guidance device 1 described above will be explained with reference to the flowcharts illustrated in FIG. 7 and FIG. 8. When the user turns the guidance device 1 on, the guidance device 1 repeatedly performs the processing of FIG. 7 and FIG. 8. In step S10 of FIG. 7, the control unit 400 sends an instruction to the obstacle detection unit 200 to make the cameras 210a and 210b of the imaging unit 210 start imaging in the traveling direction; the imaging unit 210 then repeatedly images the scene ahead at a predetermined frame rate. In step S20, the signal processing unit 220 of the obstacle detection unit 200 generates the distance profile. The signal processing unit 220 performs obstacle detection image processing based on the image signals of the two cameras 210a and 210b, and calculates the position of each obstacle as polar coordinates in a coordinate system whose origin is the guidance device 1. The distance profile is created, as in FIG. 5(b), from the calculated polar coordinates of the obstacles. FIG. 5(b) shows only one example of the profile; it is danger avoidance information indicated by the positions and directions of obstacles, and the profile is not limited to the form of FIG. 5(b).
Steps S10 and S20 above are operation processing performed by the obstacle detection unit 200 upon receiving an instruction from the control unit 400 when the guidance device 1 is turned on. Step S30 and the subsequent steps are processing executed by the control unit 400.
In step S30, the control unit 400 uses the surrounding situation determination unit 410 to determine whether an obstructing object is present in the region B (FIG. 6). If the presence of an obstructing object is detected in the region B, an affirmative determination is made in step S30 and the process proceeds to step S160 of FIG. 8. If not, a negative determination is made in step S30 and the process proceeds to step S40.
In step S40, the control unit 400 uses the surrounding situation determination unit 410 to determine whether an obstructing object is present in the region A (FIG. 6). If the presence of an obstructing object is detected in the region A, an affirmative determination is made in step S40 and the process proceeds to step S50. If no object is detected in the region A, a negative determination is made in step S40 and the process proceeds to step S80 of FIG. 8.
The process proceeds to step S50 when a pseudo force sense is to be generated in an alternative direction. In step S50, the control unit 400 uses the surrounding situation determination unit 410 to search for an alternative direction. In step S60, the control unit 400 uses the surrounding situation determination unit 410 to determine, from the search result, whether an alternative direction exists. If an alternative direction is calculated, an affirmative determination is made in step S60 and the process proceeds to step S70. If no alternative direction is calculated (none can be detected), a negative determination is made in step S60 and the process proceeds to step S160 of FIG. 8.
In step S70, the control unit 400 uses the surrounding situation determination unit 410 to determine, from the search result, whether there are multiple alternative directions. If multiple alternative direction candidates are calculated, an affirmative determination is made in step S70 and the process proceeds to step S90 of FIG. 8. If a single alternative direction is calculated, a negative determination is made in step S70 and the process proceeds to step S110 of FIG. 8.
The process proceeds to step S80 of FIG. 8 when no pseudo force sense is to be generated because no obstacle has been detected. In step S80, the control unit 400 uses the drive signal generation unit 430 to generate a pseudo force sense control signal such that no pseudo force sense is generated, outputs it to the pseudo force sense generation unit 300 (FIG. 4), and ends the processing of FIG. 8. The pseudo force sense generation unit 300 thereby generates no pseudo force sense.
The process proceeds to step S90 of FIG. 8 when multiple alternative direction candidates for avoiding the obstacle exist. In step S90, the control unit 400 uses the surrounding situation determination unit 410 to calculate, for each of the multiple alternative directions, its deflection, that is, its deviation angle from the traveling direction the user faces. In step S100, the control unit 400 uses the surrounding situation determination unit 410 to select the alternative direction with the smallest deviation angle. The control unit 400 further uses the drive signal generation unit 430 to generate a pseudo force sense control signal so that the pseudo force sense generation unit 300 generates a pseudo force sense in that alternative direction, outputs it to the pseudo force sense generation unit 300 (FIG. 4), and proceeds to step S120.
The process proceeds to step S110 of FIG. 8 when there is a single alternative direction for avoiding the obstacle. In step S110, the control unit 400 uses the drive signal generation unit 430 to generate a pseudo force sense control signal so that the pseudo force sense generation unit 300 generates a pseudo force sense in that alternative direction, outputs it to the pseudo force sense generation unit 300 (FIG. 4), and proceeds to step S120.
Through the processing of step S100 or step S110, the pseudo force sense generation unit 300 generates a pseudo force sense, and the user changes the traveling direction to the alternative direction. In step S120, the control unit 400 uses the surrounding situation determination unit 410 to determine whether the traveling direction of the guidance device 1 has changed. The change in the traveling direction is detected by having the surrounding situation determination unit 410 analyze the output from the signal processing unit 220 of the obstacle detection unit 200; more specifically, the change in the traveling direction of the guidance device 1 can be determined from the temporal change (inter-frame correlation) of the image signals of the imaging elements 212a and 212b. If the surrounding situation determination unit 410 determines that the course has changed to a traveling direction different from the original one, an affirmative determination is made in step S120 and the process proceeds to step S130. Otherwise, a negative determination is made in step S120 and the process waits for a change in the traveling direction.
In step S130, the control unit 400 records the original traveling direction, for example as an azimuth, in the internal storage unit 420, and proceeds to step S140. At this time, the user is walking along the changed course in the alternative direction. The original traveling direction is detected in the same way the change of the traveling direction was detected in step S120, by having the surrounding situation determination unit 410 analyze the output from the signal processing unit 220 of the obstacle detection unit 200: since the change in the traveling direction of the guidance device 1 can be determined from the temporal change (inter-frame correlation) of the image signals of the imaging elements 212a and 212b, the original traveling direction is calculated from it. In step S140, while the user travels in the alternative direction, the control unit 400 uses the surrounding situation determination unit 410 to determine whether an obstructing object is present in the original traveling direction, that is, the azimuth recorded in the storage unit 420. If the presence of an object is detected in the original traveling direction, an affirmative determination is made in step S140 and the determination process is repeated; the repetition waits for no object to be detected in the original traveling direction any longer. If no object is detected in the original traveling direction, a negative determination is made in step S140 and the process proceeds to step S150.
In step S150, the control unit 400 uses the drive signal generation unit 430 to generate a pseudo force sense control signal for generating a pseudo force sense toward the azimuth that is the original traveling direction, outputs it to the pseudo force sense generation unit 300 (FIG. 4), and ends the processing of FIG. 8. The pseudo force sense generation unit 300 thereby generates a pseudo force sense toward the azimuth of the original traveling direction, so the user can walk in the same azimuth as before.
The process proceeds to step S160 of FIG. 8 when there is a risk that the user will collide with the object. In step S160, the control unit 400 causes the pseudo force sense generation unit 300 to generate a pseudo force sense in the direction opposite to the traveling direction. That is, the drive signal generation unit 430 generates a pseudo force sense control signal producing a pseudo force sense opposite to the traveling direction, outputs it to the pseudo force sense generation unit 300 (FIG. 4), and the process of FIG. 8 ends.
By step S160, the pseudo force sense generated by the pseudo force sense generation unit 300 lets the user (the guided person) stop and avoid the collision.
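The per-cycle branching of steps S130 to S160 can be condensed into one decision function. The sketch below is a simplified reading of the flowchart of FIG. 8, using the example distances 2.5 m and 1.5 m from effects (6) and (10); the function and parameter names are illustrative, not from the embodiment:

```python
def avoidance_force(front_dist, alt_dir, caution_dist=2.5, danger_dist=1.5):
    """One control cycle of the avoidance logic of FIG. 8 (simplified).

    front_dist: distance [m] to the nearest obstacle in the heading window,
                or None when the way ahead is clear.
    alt_dir:    detected alternative direction [deg, relative to heading],
                or None when no safe alternative exists.
    Returns the direction of the pseudo force in degrees relative to the
    heading, or None when no pseudo force is needed.
    """
    if front_dist is None:
        return None             # way ahead is clear: no force
    if front_dist <= danger_dist:
        return 180.0            # S160: pull backwards so the user stops
    if front_dist <= caution_dist:
        if alt_dir is not None:
            return alt_dir      # pull towards the alternative direction
        return 180.0            # no safe alternative: pull backwards
    return None                 # obstacle still far enough away
```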
According to the first embodiment described above, the following operational effects are obtained.
(1) The guidance device 1 held by the user comprises the obstacle detection unit 200, which detects the distance and direction to surrounding obstacles; the control unit 400 (surrounding situation determination unit 410), which detects an alternative direction to replace the traveling direction based on that detection result; and the pseudo force sense generation unit 300, which generates a pseudo force sense in the alternative direction. With this configuration, when the user would collide with an obstacle by continuing in the traveling direction, the user can be guided appropriately toward a safe direction that avoids the obstacle. Specifically, because the pseudo force sense pulls the user toward the alternative direction, the alternative direction is conveyed in a way the user can grasp easily and intuitively.
(2) The guidance device 1 generates, with the obstacle detection unit 200, a distance profile of the distance and direction to obstacles (FIG. 5(b)), and the control unit 400 (surrounding situation determination unit 410) detects the alternative direction based on this distance profile, so an appropriate alternative direction can be detected easily.
(3) When the control unit 400 (surrounding situation determination unit 410) detects no alternative direction, the pseudo force sense generation unit 300 generates a pseudo force sense in the direction opposite to the traveling direction, so the user can be guided in a direction that reduces the impact of a collision with an obstacle or the like.
(4) Since the obstacle detection unit 200 includes the stereo cameras 210a and 210b, which image obstacles from a plurality of different viewpoints, the distance and direction to surrounding obstacles can be detected with a simple configuration that emits neither light nor radio waves.
(5) Since the pseudo force sense generation unit 300 comprises the inertial body 302 and the drive unit 304, which vibrates the inertial body 302 asymmetrically in a predetermined direction, a pseudo force sense can be generated with a simple configuration.
(6) The control unit 400 (surrounding situation determination unit 410) starts detecting an alternative direction when an obstacle exists within a first distance (for example, 2.5 m) inside a first angle range (for example, 30 degrees) including the traveling direction, so the user can be guided toward the alternative direction at a timing suitable for avoiding the obstacle.
(7) When no obstacle exists in a first region within a second distance (for example, 5 m) farther than the first distance (for example, 2.5 m), inside a second angle range (for example, 30 degrees) including a first direction different from the traveling direction, the control unit 400 (surrounding situation determination unit 410) detects the first direction as the alternative direction. With this configuration, a direction in which no obstacle exists out to a long distance can be detected as the alternative direction.
(8) When an obstacle exists in the first region, the control unit 400 (surrounding situation determination unit 410) does not detect the first direction as the alternative direction, which reduces the risk of guiding the user toward the alternative direction only to approach another obstacle immediately.
(9) When no obstacle exists in the first region, and no obstacle exists either in a second region within the second distance (for example, 5 m) inside a second angle range (for example, 30 degrees) including a second direction different from the traveling direction, the control unit 400 (surrounding situation determination unit 410) detects whichever of the first and second directions is closer to the traveling direction as the alternative direction. With this configuration, the direction whose deviation from the original traveling direction is as small as possible is detected as the alternative direction.
(10) When an obstacle exists within a third distance (for example, 1.5 m) closer than the first distance (for example, 2.5 m), inside the first angle range (for example, 30 degrees) including the traveling direction, the pseudo force sense generation unit 300 generates a pseudo force sense in the direction opposite to the traveling direction. With this configuration, the user can be guided in a direction that reduces the impact of a collision with an obstacle or the like.
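Effects (6) to (10) amount to a small set of geometric tests over the distance profile of FIG. 5(b). The sketch below encodes them using the example values given above (30-degree windows, 2.5 m / 5 m / 1.5 m); the profile layout (one nearest-obstacle distance per degree) and all names are illustrative assumptions rather than part of the embodiment:

```python
import numpy as np

def window_clear(profile, center, limit, half_width=15):
    """True when no obstacle lies within `limit` metres inside the
    30-degree window centered on `center` (degrees)."""
    return all(profile[(center + d) % 360] > limit
               for d in range(-half_width, half_width + 1))

def decide(profile, heading=0, first=2.5, second=5.0, third=1.5):
    """Apply effects (6)-(10) to a 360-entry nearest-distance profile.
    Returns ('keep', None), ('turn', direction) or ('reverse', None)."""
    if window_clear(profile, heading, first):
        return ("keep", None)            # (6): nothing within 2.5 m ahead
    if not window_clear(profile, heading, third):
        return ("reverse", None)         # (10): obstacle within 1.5 m
    # (7)-(9): among candidate directions whose 30-degree window is clear
    # out to 5 m, pick the one closest to the original heading.
    candidates = [d for d in range(-179, 180) if d != 0
                  and window_clear(profile, (heading + d) % 360, second)]
    if not candidates:
        return ("reverse", None)         # (3): no alternative direction found
    return ("turn", (heading + min(candidates, key=abs)) % 360)
```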
(Modification of the First Embodiment)
In the description above, the stereo cameras 210a and 210b with the fisheye lenses 211a and 211b were used for the imaging unit 210 of the obstacle detection unit 200, but instead of stereo cameras, for example LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) or an ultrasonic scanning device may be used.
The cameras may use imaging elements sensitive to visible light, imaging elements sensitive to infrared light, or imaging elements sensitive to both. Using infrared light allows guidance to be performed properly at night, or even in places that are dark in the daytime (under an overpass, in a tunnel, and so on).
In addition, although the surrounding situation determination unit 410 (control unit 400) used the output of the signal processing unit 220 of the obstacle detection unit 200 to detect the change in the traveling direction of the guidance device 1 and to calculate the original traveling direction, it may instead be configured to calculate them using a separate angular velocity sensor or acceleration sensor. Alternatively, a position sensor using GPS (Global Positioning System) may be provided, and the change in the traveling direction and the original traveling direction may be calculated from its output.
An attitude sensor may also be built into the guidance device 1 so that, when the device is not held in the correct attitude, the notification unit 600 issues a warning to that effect.
(Second Embodiment)
When a visually impaired person walks, a helper may lead the way. In the guidance device according to the second embodiment of the invention, when the user walks carrying the guidance device, the device guides the user so as to follow the leading helper. Such a guidance device is described in detail below with reference to the drawings.
FIG. 9 is a schematic view illustrating a situation in which the guidance device 1A according to the second embodiment is used. In FIG. 9, the guidance device 1A guides the user 12 so as to follow the person 13, a helper who leads the user 12. The external appearance of the guidance device 1A is the same as that of the guidance device 1 of the first embodiment. When using the guidance device 1A, the user 12 holds it in front of the body as illustrated in FIG. 1(b).
<Configuration of the Guidance Device>
FIG. 10 is a block diagram illustrating the configuration of the guidance device 1A. In FIG. 10, the guidance device 1A includes a leader detection unit 1200 that detects the leading person 13, a control unit 1400, a notification unit 1600, and a pseudo force sense generation unit 1300.
Comparing FIG. 10 with FIG. 2 of the first embodiment, the difference is that the leader detection unit 1200 is provided in place of the obstacle detection unit 200 of FIG. 2. The control unit 1400 of FIG. 10 corresponds to the control unit 400 of FIG. 2, the pseudo force sense generation unit 1300 of FIG. 10 corresponds to the pseudo force sense generation unit 300 of FIG. 2, and the notification unit 1600 of FIG. 10 corresponds to the notification unit 600 of FIG. 2.
1. Leader Detection Unit
The leader detection unit 1200 includes an imaging unit 1210 and a signal processing unit 1220; it images the scene centered on the front direction and detects the position of the image of the person 13 on the imaging screen. For example, the imaging unit 1210 images the front direction, and the signal processing unit 1220 sends information indicating the position of the image of the person 13 on the captured imaging screen to the control unit 1400. FIG. 11 shows an example of the configuration of the leader detection unit 1200. In FIG. 11, the imaging unit 1210 is configured as a monocular camera. The camera 1210 has an imaging lens 1211 and an imaging element 1212 arranged on the image-forming plane of the imaging lens 1211. The imaging lens 1211 is not a fisheye lens such as those denoted 211a and 211b in FIG. 3 but an ordinary imaging lens.
The imaging element 1212 captures the subject image formed by the imaging lens 1211. The image signal output from the imaging element 1212 is sent to the signal processing unit 1220, which detects the image of the person 13, the leader, in the captured image. A known image recognition technique is used for the detection. For example, image data of the person 13 captured from behind is recorded in advance in a storage unit provided in the signal processing unit 1220. The signal processing unit 1220 compares the stored image data with the image data captured by the camera 1210, and detects, on the imaging screen captured by the camera 1210, the image of the person 13 as the region whose data is similar to the stored data.
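The comparison between the stored rear-view image and the current frame can be sketched as an exhaustive template match. This is only a minimal stand-in for the "known image recognition technique" mentioned above (a practical implementation would use normalized correlation or a learned detector); all names are illustrative:

```python
import numpy as np

def find_template(image, template):
    """Return the (row, col) of the top-left corner where `template`
    (the stored image of person 13 seen from behind) best matches
    `image` (the current frame), using sum of squared differences."""
    ih, iw = image.shape
    th, tw = template.shape
    best_ssd, best_pos = np.inf, None
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            ssd = np.sum((image[y:y + th, x:x + tw] - template) ** 2)
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (y, x)
    return best_pos
```

The returned position corresponds to the on-screen image position that the signal processing unit 1220 reports to the control unit 1400.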
2. Control Unit
The control unit 1400 (FIG. 10) is configured with a CPU, ROM, RAM, and the like, and controls the operation of each unit of the guidance device 1A based on a control program. The control unit 1400 includes a surrounding situation determination unit 1410, a storage unit 1420, and a drive signal generation unit 1430. Based on the information output from the signal processing unit 1220 indicating the position of the image of the person 13 on the imaging screen, the surrounding situation determination unit 1410 (FIG. 10) calculates the direction in which the person 13 exists with respect to the guidance device 1A. Specifically, if L2 is the distance from the screen center of the imaging element 1212 in FIG. 11 to the center of the image of the person 13, and f2 is the focal length of the imaging lens 1211, the direction angle θ2 between the optical axis X of the imaging lens 1211 and the direction in which the person 13 exists is given by the following equation (1).
θ2 = arctan(L2 / f2) … (1)
The surrounding situation determination unit 1410 (FIG. 10) further calculates the distance from the guidance device 1A to the person 13 based on the information output from the signal processing unit 1220 indicating the size of the image of the person 13 on the imaging screen. For example, the surrounding situation determination unit 1410 detects the shoulder width of the image of the person 13 on the imaging element 1212 and calculates the image magnification β from the ratio of the detected value to the actual shoulder width of the person 13. Here, the actual shoulder width of the person 13 is recorded in advance in the storage unit 1420 provided in the control unit 1400.
The shoulder width recorded in the storage unit 1420 may be a value calculated beforehand from the imaging distance and the size of the image when the person 13 was imaged from behind, or, for example, an average human shoulder width may be adopted.
The surrounding situation determination unit 1410 calculates the distance Y from the guidance device 1A to the person 13 by dividing the focal length f2 of the imaging lens 1211 of the camera 1210 by the image magnification β, as in the following equation (2).
Y = f2 / β … (2)
The surrounding situation determination unit 1410 sends the information indicating the calculated direction in which the person 13 exists and the information indicating the distance Y from the guidance device 1A to the person 13 to the drive signal generation unit 1430.
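As a numeric check of equations (1) and (2), assume an illustrative 10 mm focal length, an image of the person centered 2 mm off the optical axis, and a 0.45 m actual shoulder width that appears 0.9 mm wide on the sensor (all values are hypothetical examples, not taken from the embodiment):

```python
import math

f2 = 10.0e-3                 # focal length of imaging lens 1211 [m] (assumed)
L2 = 2.0e-3                  # offset of the image of person 13 from screen centre [m]
theta2 = math.atan(L2 / f2)  # equation (1): direction angle to person 13 [rad]

beta = 0.9e-3 / 0.45         # image magnification from the shoulder-width ratio
Y = f2 / beta                # equation (2): distance to person 13 [m]

print(round(math.degrees(theta2), 1), round(Y, 6))  # → 11.3 5.0
```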
Based on the information indicating the direction in which the person 13 exists and the information indicating the distance Y from the guidance device 1A to the person 13, both calculated by the surrounding situation determination unit 1410, the drive signal generation unit 1430 (FIG. 10) generates a pseudo force sense control signal for controlling the current supplied to each coil of the pseudo force sense generation unit 1300. The pseudo force sense control signal is output to the pseudo force sense generation unit 1300.
3. Pseudo Force Sense Generation Unit
The pseudo force sense generation unit 1300 (FIG. 10) is similar to the pseudo force sense generation unit 300 of FIG. 2. When the pseudo force sense control signal is input from the drive signal generation unit 1430, the pseudo force sense generation unit 1300 produces an asymmetric vibration in the direction in which the person 13 exists (indicated in this example by the direction angle θ2). As a result, the user 12 holding the guidance device 1A feels a pseudo force sense as if being pulled in the direction of the leading person 13.
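The asymmetric vibration can be illustrated by a displacement waveform in which the inertial body moves slowly in one direction and snaps back quickly: the return stroke produces a much larger acceleration peak, which is perceived as a net pull in a single direction even though the mass returns to its start point each period. The split ratio and sampling below are arbitrary illustrative choices, not the actual drive currents of the embodiment:

```python
import numpy as np

def asymmetric_cycle(samples=100, forward_frac=0.8):
    """One period of an asymmetric displacement waveform: a slow stroke
    over `forward_frac` of the period and a fast return over the rest,
    so the mass ends where it started but the peak velocity (and hence
    the peak inertial force) is one-sided."""
    t = np.linspace(0.0, 1.0, samples, endpoint=False)
    return np.where(t < forward_frac,
                    t / forward_frac,                   # slow outward stroke
                    (1.0 - t) / (1.0 - forward_frac))   # fast return stroke

pos = asymmetric_cycle()
vel = np.diff(pos)   # per-sample velocity: the return stroke is ~4x faster
```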
4. Notification Unit
Like the notification unit 600 of FIG. 2, the notification unit 1600 (FIG. 10) calls the attention of, or conveys messages to, people around the guidance device 1A.
<Operation of the Guidance Device>
When using the guidance device 1A configured as above, the user 12 holds it with the leader detection unit 1200 directed toward the person 13, the leader, as illustrated in FIG. 1(b). The leader detection unit 1200 repeats imaging by the camera 1210 and detection of the person 13 by the signal processing unit 1220 at a fixed cycle, for example 30 frames/sec. The control unit 1400 likewise repeatedly calculates the direction in which the person 13 exists and the distance Y to the person 13 based on the information from the leader detection unit 1200.
The control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense when the guidance device 1A is not facing the direction of the person 13, and when the distance Y from the guidance device 1A to the person 13 is not appropriate. Whether the guidance device 1A faces the direction of the person 13 is determined by detecting that the image of the person 13 has entered a predetermined range around the screen center on the imaging screen of the imaging element 1212. That is, when the direction angle θ2 of equation (1) becomes smaller than a predetermined angle θ0, it is judged that the guidance device 1A faces the direction of the person 13.
Whether the distance Y from the guidance device 1A to the person 13 is appropriate is determined by detecting whether it lies within a predetermined range of a preset distance. For example, when detecting that the value of the image magnification β falls within a predetermined range, the control unit 1400 judges that the distance Y from the guidance device 1A to the person 13 is appropriate.
Since the optimum distance Y between the guidance device 1A (user 12) and the person 13 varies with, for example, the degree of congestion around the user 12, it is advisable to set it in the guidance device 1A in advance with the surrounding conditions taken into account.
<Case in Which No Pseudo Force Sense Is Generated>
When the direction angle θ2 calculated by the surrounding situation determination unit 1410 is smaller than the predetermined angle θ0 and the calculated image magnification β is within the predetermined range, the control unit 1400 does not cause the pseudo force sense generation unit 1300 to generate a pseudo force sense. That is, no pseudo force sense is generated when the traveling direction of the user 12 coincides with the direction of the person 13 and the distance Y from the guidance device 1A to the person 13 is appropriate. With this configuration, the user 12, feeling no pseudo force sense from the guidance device 1A, can keep the current traveling direction and moving speed.
<Case in Which a Pseudo Force Sense Is Generated Toward the Person 13>
When the direction angle θ2 calculated by the surrounding situation determination unit 1410 is larger than the predetermined angle θ0, the control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense in the direction of the angle θ2. For example, when the leading person 13 changes course to avoid an obstacle, the direction angle θ2 calculated by the surrounding situation determination unit 1410 becomes larger than the predetermined angle θ0.
To generate a pseudo force sense in the direction indicated by the direction angle θ2, the drive signal generation unit 1430 of the control unit 1400 generates a pseudo force sense control signal that controls the currents supplied to the coils 305-1, 305-2, 305-3, and 305-4 of the pseudo force sense generation unit 1300 (FIG. 10), and outputs it to the pseudo force sense generation unit 1300. The pseudo force sense generation unit 1300 thereby generates a pseudo force sense in the direction of the leading person 13. With this configuration, the guidance device 1A can present to the user 12, as a pseudo force sense, the instruction to follow the person 13 who has changed course.
<Case in Which a Pseudo Force Sense Is Generated in the Traveling Direction>
When the direction angle θ2 calculated by the surrounding situation determination unit 1410 is smaller than the predetermined angle θ0 and the calculated image magnification β is smaller than the lower-limit value β1 of the predetermined range, the control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense in the traveling direction. That is, when the traveling direction of the user 12 coincides with the direction of the person 13 but the distance Y from the guidance device 1A to the person 13 is longer than the appropriate distance, a pseudo force sense is generated in the traveling direction. With this configuration, the guidance device 1A can prompt the user 12 to increase the moving speed in the traveling direction.
Thus, a pseudo force sense is given to the user 12 in two cases: when the leader (person 13) changes course, and when the distance to the leader (person 13) has grown too large. The form of the force sense given to the user 12 may differ between the former case and the latter; for example, the vibration frequency in the latter case may be made higher than in the former, or the two cases may be distinguished by the magnitude of the amplitude.
<Case in Which a Pseudo Force Sense Is Generated Opposite to the Traveling Direction>
When the direction angle θ2 calculated by the surrounding situation determination unit 1410 is smaller than the predetermined angle θ0 and the calculated image magnification β is larger than the upper-limit value β2 (β2 > β1) of the predetermined range, the control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense in the direction opposite to the traveling direction. That is, when the traveling direction of the user 12 coincides with the direction of the person 13 and the distance Y from the guidance device 1A to the person 13 is shorter than the appropriate distance, a pseudo force sense is generated opposite to the traveling direction. With this configuration, the guidance device 1A prompts the user 12 to reduce the moving speed toward the person 13, so the user can follow the leader while keeping the distance constant.
<Description of the Flowchart>
The flow of processing executed in the guidance device 1A described above is explained with reference to the flowchart illustrated in FIG. 12. When the user 12 switches the guidance device 1A on, the device repeatedly performs the process of FIG. 12. In step S210 of FIG. 12, the control unit 1400 sends an instruction to the leader detection unit 1200 and causes the camera of the imaging unit 1210 to start imaging in the traveling direction (forward). In step S220, the signal processing unit 1220 of the leader detection unit 1200 detects the image of the leader (in this example, the person 13) in the captured image.
In step S230, the control unit 1400 causes the surrounding situation determination unit 1410 to calculate the direction angle θ2 and proceeds to step S240. In step S240, the control unit 1400 determines whether the calculated direction angle θ2 is smaller than the predetermined angle θ0. If θ2 < θ0 holds, step S240 is judged affirmative and the process proceeds to step S260; if θ2 < θ0 does not hold, step S240 is judged negative and the process proceeds to step S250.
The process proceeds to step S250 when the direction of the leading person 13 differs from the traveling direction of the user 12. In step S250, therefore, the control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense in the direction of the angle θ2. That is, the control unit 1400 causes the drive signal generation unit 1430 to generate a pseudo force sense control signal producing a pseudo force sense in the direction indicated by θ2, outputs it to the pseudo force sense generation unit 1300 (FIG. 10), and returns to step S210. In this way, the guidance device 1A prompts the user 12 to change the moving direction toward the person 13. After returning to step S210, the process described above is repeated.
The process proceeds to step S260 when the traveling direction of the user 12 coincides with the direction of the leading person 13. In step S260, the control unit 1400 causes the surrounding situation determination unit 1410 to calculate the image magnification β and proceeds to step S270. In step S270, the control unit 1400 causes the surrounding situation determination unit 1410 to determine whether the image magnification β is smaller than the lower-limit value β1 of the predetermined range. If β < β1 holds, step S270 is judged affirmative and the process proceeds to step S290; if β < β1 does not hold, step S270 is judged negative and the process proceeds to step S280.
In step S280, the control unit 1400 causes the surrounding situation determination unit 1410 to determine whether the calculated image magnification β is larger than the upper-limit value β2 of the predetermined range. If β > β2 holds, step S280 is judged affirmative and the process proceeds to step S310; if β > β2 does not hold, step S280 is judged negative and the process proceeds to step S300.
 ステップS290へ進む場合は、像倍率βが所定の範囲の下限となる値β1より小さい。上述したように、誘導装置1Aから先導者である人物13までの距離Yは、上式(2)で求められる。ステップS290へ進む場合の距離Yは、所定の範囲の像倍率βに対応する適切な距離よりも長い。このため、制御部1400は、ステップS290において疑似力覚発生部1300から進行方向に疑似力覚を発生させる。すなわち、制御部1400は、駆動信号生成部1430によって、進行方向へ疑似力覚を発生させる疑似力覚制御信号を生成して疑似力覚発生部1300(図10)へ出力し、ステップS260へ戻る。このように構成することにより、誘導装置1Aが使用者12に対し、進行方向への移動速度を上げて人物13に近づくように促す。ステップS260へ戻った後は以上説明した処理を繰り返す。 When the process proceeds to step S290, the image magnification β is smaller than the value β1, the lower limit of the predetermined range. As described above, the distance Y from the guiding device 1A to the leading person 13 is obtained by the above equation (2). The distance Y when proceeding to step S290 is longer than the appropriate distance corresponding to an image magnification β within the predetermined range. Therefore, in step S290, the control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense in the traveling direction. That is, the control unit 1400 causes the drive signal generation unit 1430 to generate a pseudo force sense control signal for generating a pseudo force sense in the traveling direction, outputs it to the pseudo force sense generation unit 1300 (FIG. 10), and returns to step S260. With this configuration, the guiding device 1A urges the user 12 to increase the moving speed in the traveling direction and approach the person 13. After returning to step S260, the processing described above is repeated.
 ステップS300へ進む場合は、像倍率βが所定の範囲内に収まる。すなわち、ステップS300へ進む場合の距離Yは、所定の範囲の像倍率βに対応する適切な距離である。ステップS300において、制御部1400は、駆動信号生成部1430によって、疑似力覚を発生しないように疑似力覚制御信号を生成して疑似力覚発生部1300(図10)へ出力し、ステップS260へ戻る。このように構成することにより、誘導装置1Aが使用者12に対し、疑似力覚を発生させないで、現在の進行方向と現在の移動速度を保つように促す。ステップS260へ戻った後は以上説明した処理を繰り返す。 When the process proceeds to step S300, the image magnification β falls within the predetermined range. That is, the distance Y when proceeding to step S300 is an appropriate distance corresponding to an image magnification β within the predetermined range. In step S300, the control unit 1400 causes the drive signal generation unit 1430 to generate a pseudo force sense control signal so that no pseudo force sense is generated, outputs it to the pseudo force sense generation unit 1300 (FIG. 10), and returns to step S260. With this configuration, the guiding device 1A urges the user 12 to maintain the current traveling direction and the current moving speed without generating a pseudo force sense. After returning to step S260, the processing described above is repeated.
 ステップS310へ進む場合は、像倍率βが所定の範囲の上限となる値β2より大きい。すなわち、ステップS310へ進む場合の距離Yは、所定の範囲の像倍率βに対応する適切な距離よりも短い。このため制御部1400は、ステップS310において、疑似力覚発生部1300から進行方向と逆方向に疑似力覚を発生させる。すなわち、制御部1400は、駆動信号生成部1430によって、進行方向と逆方向に疑似力覚を発生させる疑似力覚制御信号を生成して疑似力覚発生部1300(図10)へ出力し、ステップS260へ戻る。このように構成することにより、誘導装置1Aが使用者12に対し、進行方向への移動速度を下げて人物13から離れるように促す。ステップS260へ戻った後は以上説明した処理を繰り返す。 When the process proceeds to step S310, the image magnification β is larger than the value β2, the upper limit of the predetermined range. That is, the distance Y when proceeding to step S310 is shorter than the appropriate distance corresponding to an image magnification β within the predetermined range. Therefore, in step S310, the control unit 1400 causes the pseudo force sense generation unit 1300 to generate a pseudo force sense in the direction opposite to the traveling direction. That is, the control unit 1400 causes the drive signal generation unit 1430 to generate a pseudo force sense control signal for generating a pseudo force sense in the direction opposite to the traveling direction, outputs it to the pseudo force sense generation unit 1300 (FIG. 10), and returns to step S260. With this configuration, the guiding device 1A urges the user 12 to reduce the moving speed in the traveling direction and move away from the person 13. After returning to step S260, the processing described above is repeated.
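The branch structure of steps S250 through S310 amounts to a simple decision rule on the image magnification β with a no-force band between β1 and β2. The following is a minimal sketch of that rule; the function name, the tuple return form, and the handling of the direction angle θ2 are illustrative assumptions, not part of the patent text.

```python
def force_direction(beta, beta1, beta2, theta2=None):
    """Decide the pseudo-force cue from the image magnification beta.

    beta1/beta2 are the lower/upper bounds of the acceptable range
    (beta1 < beta2).  theta2, when given and nonzero, is the direction
    angle of the leader relative to the user's heading; it means the
    user must first turn toward the leader (step S250).
    """
    if theta2:  # headings differ -> steer toward the leader first (S250)
        return ("turn", theta2)
    if beta < beta1:        # leader too far   -> push forward (S290)
        return ("forward", 0.0)
    if beta > beta2:        # leader too close -> push backward (S310)
        return ("backward", 0.0)
    return ("none", 0.0)    # within the band  -> no force (S300)
```

Using two thresholds rather than one yields the dead band that effect (4) relies on, so the force cue does not flip direction around a single threshold distance.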
 上述した第2の実施形態によれば、次の作用効果が得られる。
(1)使用者12が把持する誘導装置1Aは、先導者である人物13を検出する先導者検出部1200と、人物13までの方向を算出する制御部1400(周囲状況判定部1410)と、人物13の方向に疑似力覚を発生する疑似力覚発生部1300とを備える。このように構成したので、使用者12は、人物13の身体につかまらなくても、人物13に追従して進むことができる。
According to the second embodiment described above, the following effects can be obtained.
(1) The guiding device 1A held by the user 12 includes a leader detection unit 1200 that detects the leading person 13, a control unit 1400 (surrounding situation determination unit 1410) that calculates the direction to the person 13, and a pseudo force sense generation unit 1300 that generates a pseudo force sense in the direction of the person 13. With this configuration, the user 12 can follow the person 13 without holding on to the body of the person 13.
(2)制御部1400(周囲状況判定部1410)は、人物13までの距離Yを算出し、疑似力覚発生部1300は、距離Yがあらかじめ定めた第1距離より長いと、人物13の方向に疑似力覚を発生する。このように構成したので、使用者12が先導する人物13から離れ過ぎると、人物13に近づくように使用者12に促すことができる。 (2) The control unit 1400 (surrounding situation determination unit 1410) calculates the distance Y to the person 13, and the pseudo force sense generation unit 1300 generates a pseudo force sense in the direction of the person 13 when the distance Y is longer than a predetermined first distance. With this configuration, when the user 12 falls too far behind the leading person 13, the user 12 can be urged to approach the person 13.
(3)疑似力覚発生部1300は、距離Yが上記第1距離よりさらに短い第2距離より短いと、人物13と逆方向に疑似力覚を発生する。このように構成したので、使用者12が先導する人物13に近づき過ぎると、人物13から離れるように使用者12に促すことができる。 (3) The pseudo force sense generation unit 1300 generates a pseudo force sense in the opposite direction to the person 13 when the distance Y is shorter than the second distance, which is shorter than the first distance. With such a configuration, when the user 12 approaches the leading person 13 too much, the user 12 can be urged to leave the person 13.
(4)疑似力覚発生部1300は、距離Yが上記第2距離より長く上記第1距離より短いと、疑似力覚を発生しない。このように構成したので、使用者12が先導する人物13に適切な距離を保って追従している場合には、使用者12に余計な力覚を与えないようにすることができる。また、余計な力覚を与えない距離として、第2距離から第1距離まで幅を持たせたので、ある距離を境に頻繁に向きが変わる力覚を発生することを防ぐことができる。 (4) The pseudo force sense generation unit 1300 does not generate a pseudo force sense when the distance Y is longer than the second distance and shorter than the first distance. With this configuration, no unnecessary force sense is given to the user 12 while the user 12 is following the leading person 13 at an appropriate distance. In addition, since the no-force zone spans the width from the second distance to the first distance, a force sense that frequently flips direction around a single threshold distance can be prevented.
(第2の実施形態の変形例1)
 第2の実施形態では、人物13の形状や大きさが既知であるので、1台のカメラ1210を用いて、制御部1400(周囲状況判定部1410)が像倍率βから距離Yを求める例を説明した。この代わりに、第1の実施形態と同様に、障害物検出部200の2台のカメラ210aおよび210bを用いて、その基線長と像の相関から三角測量の原理を用いて距離Yを算出するようにしてもよい。また、LIDAR、超音波測距装置など、他の距離測定手段を用いてもよい。
(Modification 1 of the second embodiment)
In the second embodiment, since the shape and size of the person 13 are known, an example was described in which the control unit 1400 (surrounding situation determination unit 1410) obtains the distance Y from the image magnification β using the single camera 1210. Instead, as in the first embodiment, the two cameras 210a and 210b of the obstacle detection unit 200 may be used to calculate the distance Y by the principle of triangulation from their base length and the correlation between the two images. Other distance measuring means, such as LIDAR or an ultrasonic range finder, may also be used.
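For the stereo alternative mentioned in this modification, the standard triangulation relation Y = B·f/d (baseline B, focal length f in pixels, disparity d in pixels) is the usual way to recover distance from two cameras. The patent does not spell out the formula, so the following is a hypothetical sketch of that standard relation, not the patent's own computation.

```python
def stereo_distance(baseline_m, focal_px, disparity_px):
    """Triangulated distance Y = B * f / d for a rectified stereo pair:
    B is the base length between the two cameras (meters), f the focal
    length expressed in pixels, and d the horizontal disparity of the
    same point between the two images (pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px
```

For example, a 0.1 m baseline, a 700 px focal length, and a 35 px disparity give a distance of 2 m; as the leader approaches, the disparity grows and the computed distance shrinks.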
(第2の実施形態の変形例2)
 第2の実施形態では、誘導装置1Aから人物13までの距離Yを、所定の範囲の像倍率βに対応する適切な距離になるように制御する例を説明した。このような制御は、像倍率βの値を所定の範囲内に制御することと等価であるため、実質的に、撮像素子1212の撮像面上の人物13の像の大きさを所定の範囲内の大きさになるように制御するといえる。
(Modification 2 of the second embodiment)
In the second embodiment, an example was described in which the distance Y from the guiding device 1A to the person 13 is controlled to be an appropriate distance corresponding to an image magnification β within a predetermined range. Since such control is equivalent to keeping the value of the image magnification β within the predetermined range, it effectively controls the size of the image of the person 13 on the imaging surface of the imaging element 1212 so that it stays within a predetermined range.
 変形例2では、撮像素子1212の撮像面上の人物13の像の大きさを所定の範囲内の大きさに制御する。このような処理について、図12のフローチャートを参照して説明する。制御部1400(周囲状況判定部1410)は、ステップS260の代わりに、撮像素子1212の撮像面における撮像面上の人物13の像の大きさを求め、ステップS270へ進む。 In the second modification, the size of the image of the person 13 on the imaging surface of the imaging element 1212 is controlled to a size within a predetermined range. Such processing will be described with reference to the flowchart of FIG. Control unit 1400 (surrounding situation judging unit 1410) obtains the size of the image of person 13 on the imaging surface of the imaging device 1212 instead of step S260, and proceeds to step S270.
 制御部1400(周囲状況判定部1410)は、ステップS270の代わりに、撮像素子1212の撮像面における人物13の像の大きさが所定の値より大きいか小さいかを判定する。人物13の像の大きさが所定の値より大きい場合は、誘導装置1Aから人物13までの距離Yが適切な距離よりも短いと判断し、ステップS310へ進む。この場合、疑似力覚発生部1300から進行方向と逆方向に疑似力覚を発生させる。これにより、誘導装置1Aが使用者12に対し、移動速度を下げて人物13から離れるように促す。 Control unit 1400 (surrounding situation determination unit 1410) determines whether the size of the image of person 13 on the imaging surface of imaging device 1212 is larger or smaller than a predetermined value instead of step S270. If the size of the image of the person 13 is larger than the predetermined value, it is determined that the distance Y from the guidance device 1A to the person 13 is shorter than an appropriate distance, and the process proceeds to step S310. In this case, the artificial force sense generation unit 1300 generates an artificial force sense in the direction opposite to the traveling direction. As a result, the guidance device 1A urges the user 12 to move away from the person 13 at a reduced moving speed.
 反対に、人物13の像の大きさが所定の値より小さい場合は、誘導装置1Aから人物13までの距離Yが適切な距離よりも長いと判断し、ステップS290へ進む。この場合、疑似力覚発生部1300から進行方向に疑似力覚を発生させる。これにより、誘導装置1Aが使用者12に対し、移動速度を上げて人物13に近づくように促す。
 なお、人物13の像の大きさが所定の大きさの範囲内にある場合は、ステップS300へ進み、疑似力覚発生部1300から疑似力覚を発生させない。これにより、誘導装置1Aが使用者12に対し、現在の進行方向と移動速度を保つように促す。
Conversely, when the size of the image of the person 13 is smaller than the predetermined value, it is determined that the distance Y from the guiding device 1A to the person 13 is longer than the appropriate distance, and the process proceeds to step S290. In this case, the pseudo force sense generation unit 1300 generates a pseudo force sense in the traveling direction. As a result, the guiding device 1A urges the user 12 to increase the moving speed and approach the person 13.
When the size of the image of the person 13 is within the range of the predetermined size, the process proceeds to step S300, and the pseudo force sense generation unit 1300 does not generate the pseudo force sense. Thereby, the guidance device 1A urges the user 12 to maintain the current traveling direction and moving speed.
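The variant-2 processing just described replaces the β comparison with a direct comparison of the on-sensor image size against a band of sizes. A minimal sketch with illustrative names, using image height in pixels as the size measure (the patent does not fix which dimension is compared):

```python
def speed_cue_from_image_size(height_px, min_px, max_px):
    """Variant-2 control: compare the on-sensor image size of the
    leader against a fixed band [min_px, max_px].
    Larger than max_px  -> too close -> cue to slow down (S310);
    smaller than min_px -> too far   -> cue to speed up  (S290);
    inside the band     -> no force  (S300)."""
    if height_px > max_px:
        return "slow_down"
    if height_px < min_px:
        return "speed_up"
    return "keep_pace"
```

This is equivalent to the β-threshold logic of the second embodiment, since the image size scales linearly with β for a leader of known physical size.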
(第2の実施形態の変形例3)
 第2の実施形態では、制御部1400(周囲状況判定部1410)が、撮像素子1212上の人物13の像から撮像面上の肩幅の大きさを検出し、検出した値と実際の人物13の肩幅との比に基づき、像倍率βを算出した。変形例3では、肩幅の代わりに、人物13の着衣等にプリントされたマークの大きさを検出し、検出した値と実際のマークの大きさとの比に基づき、像倍率βを算出する。着衣等には、シャツ、セーター、上着、コート、着物等の衣類全般を含む。
(Modification 3 of the second embodiment)
In the second embodiment, the control unit 1400 (surrounding situation determination unit 1410) detects the shoulder width on the imaging surface from the image of the person 13 on the imaging element 1212 and calculates the image magnification β based on the ratio between the detected value and the actual shoulder width of the person 13. In the third modification, instead of the shoulder width, the size of a mark printed on the clothing of the person 13 is detected, and the image magnification β is calculated based on the ratio between the detected value and the actual size of the mark. Clothing here includes garments in general, such as shirts, sweaters, jackets, coats, and kimonos.
 図13は、人物13の上着の背中部分にプリントされている1組のマーク131、132を例示する図である。制御部1400は、周囲状況判定部1410によって、撮像素子1212上の人物13の像からマーク131および132間の長さWを検出し、検出した値と実際にプリントされているマーク131および132間の長さW1(不図示)との比に基づき、像倍率βを算出する。ここで、長さW1は、予め制御部1400に設けた記憶部1420に記録しておくものとする。 FIG. 13 illustrates a pair of marks 131 and 132 printed on the back of the jacket of the person 13. The control unit 1400 causes the surrounding situation determination unit 1410 to detect the length W between the marks 131 and 132 from the image of the person 13 on the imaging element 1212, and calculates the image magnification β based on the ratio between the detected value and the actually printed length W1 (not shown) between the marks 131 and 132. The length W1 is recorded in advance in the storage unit 1420 provided in the control unit 1400.
 人物13の像の肩幅は、例えば、なで肩の場合に検出誤差が生じやすい。しかしながら、上述したマーク131および132は肩の形状にかかわらず明確な指標となり得ることから、検出誤差を抑えることができる。 The shoulder width of the image of the person 13 is prone to detection error, for example, when the person has sloping shoulders. The marks 131 and 132 described above, however, serve as clear references regardless of the shoulder shape, so detection errors can be suppressed.
 人物13の上着の背中部分にプリントするマーク131、132に代えて、図形、文字、またはイラスト等をプリントしてもよい。制御部1400(周囲状況判定部1410)は、撮像素子1212上の人物13の像から図形、文字、またはイラスト等の長さを検出し、検出した値と実際にプリントされている図形、文字、またはイラスト等の長さとの比に基づき、像倍率βを算出する。実際にプリントされている図形、文字、またはイラスト等の長さは、予め制御部1400に設けた記憶部1420に記録しておくものとする。 Instead of the marks 131 and 132 printed on the back of the jacket of the person 13, a figure, characters, or an illustration may be printed. The control unit 1400 (surrounding situation determination unit 1410) detects the length of the figure, characters, or illustration from the image of the person 13 on the imaging element 1212 and calculates the image magnification β based on the ratio between the detected value and the length of the figure, characters, or illustration as actually printed. The actual printed length is recorded in advance in the storage unit 1420 provided in the control unit 1400.
 また、人物13の着衣等にプリントする代わりに、例えば図形、文字、またはイラスト等がプリントされたゼッケン等を使用してもよい。 Further, instead of printing on the clothes or the like of the person 13, for example, a bib on which figures, characters, or illustrations are printed may be used.
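The mark-based magnification of modification 3 reduces to a single ratio followed by a distance estimate. The sketch below assumes the thin-lens far-field approximation Y ≈ f/β for equation (2), which is not reproduced in this excerpt; the names and units are illustrative only.

```python
def distance_from_marks(w_image_mm, w_actual_mm, focal_mm):
    """Image magnification beta from the detected on-sensor mark
    spacing (w_image_mm) and the known printed spacing (w_actual_mm,
    the length W1 stored in the storage unit), then the distance via
    the thin-lens approximation Y ~ f / beta, valid when Y >> f."""
    beta = w_image_mm / w_actual_mm
    return beta, focal_mm / beta
```

For example, a 250 mm printed spacing imaged as 0.5 mm on the sensor with an 8 mm lens gives β = 0.002 and Y ≈ 4000 mm, i.e. the leader is about 4 m ahead.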
(第3の実施形態)
 発明の第3の実施形態では、第1の実施形態による誘導装置が障害物を避ける向きに使用者を誘導した後、使用者を元の進行方向へ誘導する場合の動作について、図面を参照して詳細に説明する。
Third Embodiment
In the third embodiment of the invention, the operation in which the guiding device according to the first embodiment guides the user back to the original traveling direction after guiding the user in a direction that avoids an obstacle will be described in detail with reference to the drawings.
<誘導装置の構成>
 図14は、第3の実施形態による誘導装置1Bの構成を例示するブロック図である。図14において、誘導装置1Bは、物体までの距離と誘導装置1Bから見た物体の方向とを示す距離プロファイルを算出する障害物検出部2200と、制御部2400と、報知部2600と、疑似力覚発生部2300と、運動軌跡検出部2700とを含む。
<Structure of Guide Device>
FIG. 14 is a block diagram illustrating the configuration of a guiding device 1B according to the third embodiment. In FIG. 14, the guiding device 1B includes an obstacle detection unit 2200 that calculates a distance profile indicating the distance to an object and the direction of the object as seen from the guiding device 1B, a control unit 2400, a notification unit 2600, a pseudo force sense generation unit 2300, and a motion trajectory detection unit 2700.
 第1の実施形態による誘導装置1の構成(図2)と比較すると、運動軌跡検出部2700が追加されている点において相違する。図14の障害物検出部2200は図2の障害物検出部200に対応し、図14の制御部2400は図2の制御部400に対応し、図14の疑似力覚発生部2300は図2の疑似力覚発生部300に対応し、図14の報知部2600は図2の報知部600に対応する。 Compared with the configuration (FIG. 2) of the guidance device 1 according to the first embodiment, the motion trajectory detection unit 2700 is different in that it is added. The obstacle detection unit 2200 in FIG. 14 corresponds to the obstacle detection unit 200 in FIG. 2, the control unit 2400 in FIG. 14 corresponds to the control unit 400 in FIG. 2, and the pseudo force sense generation unit 2300 in FIG. The informing unit 2600 in FIG. 14 corresponds to the informing unit 600 in FIG. 2.
 運動軌跡検出部2700は、例えば、ジャイロセンサと加速度センサを備えており、誘導装置1Bが移動した距離と方向を所定時間ごとに検出し、検出した情報に基づいて誘導装置1Bの運動軌跡を算出し、算出結果を内部の記憶部2710に記録する。 The motion trajectory detection unit 2700 includes, for example, a gyro sensor and an acceleration sensor, detects the distance and direction in which the guidance device 1B has moved for each predetermined time, and calculates the motion trajectory of the guidance device 1B based on the detected information. And stores the calculation result in the internal storage unit 2710.
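The motion trajectory detection unit 2700 described above is, in effect, a dead-reckoning integrator over per-interval (distance, heading) samples from the gyro and acceleration sensors. A minimal sketch of such an integrator, with illustrative conventions (heading measured clockwise from north in degrees, x pointing east, y pointing north); the actual unit may of course integrate differently:

```python
import math

def integrate_trajectory(steps):
    """Dead-reckoning sketch: each step is (distance_m, heading_deg)
    sampled at a fixed interval; the result is the list of (x, y)
    positions of the device, starting at the origin."""
    x, y = 0.0, 0.0
    points = [(0.0, 0.0)]
    for dist, heading_deg in steps:
        h = math.radians(heading_deg)
        x += dist * math.sin(h)   # east component
        y += dist * math.cos(h)   # north component
        points.append((x, y))
    return points
```

Storing the list of points is what later allows the control unit 2400 to extend the S-to-T segment into the original route and to check when that route is rejoined.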
 第3の実施形態による誘導装置1Bは、第1の実施形態と同様に、進行方向の障害物が検知されると代替方向へ迂回して障害物を回避するように使用者を誘導する。そして、障害物を回避した後に、運動軌跡検出部2700で検出された運動軌跡に基づいて、進行方向を代替方向から元の移動方向へ戻すように使用者を誘導する。 Similar to the first embodiment, the guidance device 1B according to the third embodiment guides the user to bypass the obstacle in the alternative direction when an obstacle in the traveling direction is detected. Then, after the obstacle is avoided, the user is guided to return the traveling direction from the alternative direction to the original traveling direction based on the movement locus detected by the movement locus detection unit 2700.
 図15は、制御部2400による誘導方向を説明する図である。図15において、使用者12は、地点Sから実線の矢印方向(上方向)へ進行する。使用者12は、誘導装置1Bを図1(b)に例示したように身体の前に把持する。誘導装置1Bは、図2の誘導装置1と同様に、障害物検出部2200により進行方向に存在する物体30を検出し、使用者12が地点Tへ達した時点で、代替方向(図15において右上)へ疑似力覚発生部2300により疑似力覚を発生させる。これにより、誘導された使用者12が進行方向を右上へ変更する。 FIG. 15 is a diagram for explaining the guiding direction by the control unit 2400. In FIG. 15, the user 12 travels from the point S in the direction of the solid arrow (upward). The user 12 holds the guiding device 1B in front of the body as illustrated in FIG. 1(b). Like the guiding device 1 of FIG. 2, the guiding device 1B detects the object 30 present in the traveling direction with the obstacle detection unit 2200, and when the user 12 reaches the point T, causes the pseudo force sense generation unit 2300 to generate a pseudo force sense in the alternative direction (upper right in FIG. 15). The guided user 12 thereby changes the traveling direction to the upper right.
 制御部2400は、疑似力覚発生部2300から代替方向に疑似力覚を発生させた時点で、移動開始地点Sからその地点Tまでの運動軌跡を示す情報を運動軌跡検出部2700から読み取って、制御部2400内に設けられた記憶部2420に記録する。 When the pseudo force sense is generated from the pseudo force sense generation unit 2300 in the alternative direction, the control unit 2400 reads, from the motion trace detection unit 2700, information indicating a motion trace from the movement start point S to the point T. The data is recorded in the storage unit 2420 provided in the control unit 2400.
 制御部2400は、読み取った運動軌跡を示す情報に基づき運動軌跡(地点Sから地点T)を延長して本来ルートを算出し、本来ルートを示す情報を制御部2400内に設けられた記憶部2420に記録する。本来ルートとは、代替方向へ誘導される前のルート(すなわち地点Sから上方向へ向かうルート)であり、図15において破線の矢印で示す。 The control unit 2400 extends the motion trajectory (from the point S to the point T) based on the read trajectory information to calculate the original route, and records information indicating the original route in the storage unit 2420 provided in the control unit 2400. The original route is the route before the guidance in the alternative direction (that is, the route heading upward from the point S), indicated by the dashed arrow in FIG. 15.
 使用者12は、疑似力覚にしたがって代替方向(地点Tから実線の矢印方向(右上方向))へ進行することにより、物体30を迂回する。代替方向へ進行中に、障害物検出部2200は、進行方向の物体を繰り返し検出するとともに、本来ルートへ戻る方向(図15において左上)の物体30も検出する。制御部2400は、地点Tからの運動軌跡を示す情報を運動軌跡検出部2700から読み取って、制御部2400内に設けられた記憶部2420に記録する。 The user 12 bypasses the object 30 by proceeding in the alternative direction (from the point T in the direction of the solid arrow (upper right)) in accordance with the pseudo force sense. While the user travels in the alternative direction, the obstacle detection unit 2200 repeatedly detects objects in the traveling direction and also detects the object 30 in the direction of returning to the original route (upper left in FIG. 15). The control unit 2400 reads information indicating the motion trajectory from the point T from the motion trajectory detection unit 2700 and records it in the storage unit 2420 provided in the control unit 2400.
 使用者12が物体30を回避する方向に歩行し始めると、制御部2400(周囲状況判定部2410)は、例えば進行方向に直交する方向、すなわち本来ルートへ戻る方向(図15において左上)に存在する物体の有無を判定する。障害物検出部2200によって本来ルートへ戻る方向に物体30が検出されなくなると(図15の例では地点Uへ達すると)、制御部2400は、疑似力覚発生部2300により本来ルートへ戻る方向へ疑似力覚を発生させる。これにより、誘導された使用者12が進行方向を左上へ変更する。 When the user 12 starts walking in the direction avoiding the object 30, the control unit 2400 (surrounding situation determination unit 2410) determines the presence or absence of an object in, for example, the direction orthogonal to the traveling direction, that is, the direction returning to the original route (upper left in FIG. 15). When the obstacle detection unit 2200 no longer detects the object 30 in the direction returning to the original route (when the point U is reached in the example of FIG. 15), the control unit 2400 causes the pseudo force sense generation unit 2300 to generate a pseudo force sense in the direction returning to the original route. The guided user 12 thereby changes the traveling direction to the upper left.
 使用者12は、疑似力覚にしたがって本来ルートへ戻る方向(地点Uから実線の矢印方向(左上方向))へ進行することにより、本来ルートに合流する地点Vへ向かう。地点Vへ進行中に、障害物検出部2200は、進行方向の物体を繰り返し検出する。制御部2400は、地点Uからの運動軌跡を示す情報を運動軌跡検出部2700から読み取って、制御部2400内に設けられた記憶部2420に記録する。 The user 12, following the pseudo force sense, proceeds in the direction returning to the original route (from the point U in the direction of the solid arrow (upper left)) toward the point V where the path rejoins the original route. While the user travels toward the point V, the obstacle detection unit 2200 repeatedly detects objects in the traveling direction. The control unit 2400 reads information indicating the motion trajectory from the point U from the motion trajectory detection unit 2700 and records it in the storage unit 2420 provided in the control unit 2400.
 制御部2400は、記憶部2420に記憶されている運動軌跡を示す情報に基づき、使用者12が本来ルートに合流する地点Vに達したことを確認する。すなわち、地点Uで変更した進行方向に延在するルートと地点Tで記憶した本来ルートとが交差する地点が地点Vであり、この地点Vを検出すると、制御部2400は、疑似力覚発生部2300により本来ルートの方向(図15において上)へ疑似力覚を発生させる。これにより、誘導された使用者12が進行方向を上へ変更する。 The control unit 2400 confirms, based on the trajectory information stored in the storage unit 2420, that the user 12 has reached the point V where the path rejoins the original route. That is, the point V is the point where the line extending in the traveling direction changed at the point U intersects the original route stored at the point T. When this point V is detected, the control unit 2400 causes the pseudo force sense generation unit 2300 to generate a pseudo force sense in the direction of the original route (upward in FIG. 15). The guided user 12 thereby changes the traveling direction upward.
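Detecting the junction point V as described here amounts to intersecting two lines: the original route extended from T, and the heading line taken at U. One way to compute this, with an illustrative point-plus-direction representation that the patent itself does not prescribe:

```python
def line_intersection(p, d1, q, d2):
    """Intersection of two 2-D lines, each given as a point and a
    direction vector.  Returns the intersection point, or None for
    (near-)parallel lines.  Here p/d1 would describe the original
    route extended from T, and q/d2 the heading taken at U."""
    cross = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(cross) < 1e-9:
        return None  # parallel: no rejoin point on this heading
    # solve p + t*d1 == q + s*d2 for the parameter t on the first line
    t = ((q[0] - p[0]) * d2[1] - (q[1] - p[1]) * d2[0]) / cross
    return (p[0] + t * d1[0], p[1] + t * d1[1])
```

With the original route running upward through the origin and the user heading upper-left from (2, 1), the computed rejoin point lies on the original route at (0, 3), matching the geometry of FIG. 15.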
 本来ルートの方向へ進行中に、障害物検出部2200は、進行方向の物体を繰り返し検出する。制御部2400は、地点Vからの運動軌跡を示す情報を運動軌跡検出部2700から読み取って制御部2400内に設けられた記憶部2420に記録するとともに、使用者12が進行方向を本来ルートの方向へ変更したことを検出すると、疑似力覚発生部2300から疑似力覚の発生を停止させる。 While the user proceeds in the direction of the original route, the obstacle detection unit 2200 repeatedly detects objects in the traveling direction. The control unit 2400 reads information indicating the motion trajectory from the point V from the motion trajectory detection unit 2700 and records it in the storage unit 2420 provided in the control unit 2400, and when it detects that the user 12 has changed the traveling direction to the direction of the original route, it stops the generation of the pseudo force sense by the pseudo force sense generation unit 2300.
 図16は、制御部2400による誘導方向を説明する図である。図15を参照した説明では、本来ルートに沿って障害物が続けて存在する場合に本来ルートからだんだん離れてしまうことが想定される。そこで、制御部2400は、図15の地点Uに達しても(すなわち、物体30を迂回したことを確認しても)、本来ルートへ戻る方向(図16において左上)に物体30以外の他の物体40、50が検出されて本来ルートに戻ることができない場合は、地点Uから本来ルートとほぼ平行な方向(図16において上)へ、疑似力覚発生部2300から疑似力覚を発生させる。これにより、誘導された使用者12が進行方向を上へ変更する。 FIG. 16 is a diagram for explaining the guiding direction by the control unit 2400. In the situation described with reference to FIG. 15, if obstacles continue to exist along the original route, the user would gradually drift away from the original route. Therefore, even when the point U in FIG. 15 is reached (that is, even when it is confirmed that the object 30 has been bypassed), if objects 40 and 50 other than the object 30 are detected in the direction returning to the original route (upper left in FIG. 16) and the user cannot return to the original route, the control unit 2400 causes the pseudo force sense generation unit 2300 to generate a pseudo force sense from the point U in a direction substantially parallel to the original route (upward in FIG. 16). The guided user 12 thereby changes the traveling direction upward.
 使用者12は、疑似力覚にしたがって地点Uから実線の矢印方向(上方向)へ進行することにより、物体40、50を迂回する。地点Uから上方向へ進行中に、障害物検出部2200は、進行方向の物体を繰り返し検出するとともに、本来ルートへ戻る方向(図16において左上)の物体40、50をも繰り返し検出する。制御部2400は、地点Uからの運動軌跡を示す情報を運動軌跡検出部2700から読み取って制御部2400内に設けられた記憶部2420に記録する。 The user 12 bypasses the objects 40 and 50 by proceeding from the point U in the direction of the solid arrow (upward) in accordance with the pseudo force sense. While traveling upward from the point U, the obstacle detection unit 2200 repeatedly detects objects in the traveling direction and also repeatedly detects the objects 40 and 50 in the direction returning to the original route (upper left in FIG. 16). The control unit 2400 reads information indicating the motion trajectory from the point U from the motion trajectory detection unit 2700 and records it in the storage unit 2420 provided in the control unit 2400.
 図16の例では、使用者12が物体40、50を回避して地点Wへ達すると、障害物検出部2200によって本来ルートへ戻る方向(図16において左上)に物体40、50が検出されなくなる。制御部2400は、障害物検出部2200により本来ルートへ戻る方向の物体40、50が検出されなくなると、物体40、50を迂回したことを確認する。使用者12が地点Wへ達した時点で、制御部2400は、疑似力覚発生部2300から本来ルートへ戻る方向へ疑似力覚を発生させる。これにより、誘導された使用者12が進行方向を左上へ変更する。 In the example of FIG. 16, when the user 12 avoids the objects 40 and 50 and reaches the point W, the obstacle detection unit 2200 no longer detects the objects 40 and 50 in the direction returning to the original route (upper left in FIG. 16). When the obstacle detection unit 2200 no longer detects the objects 40 and 50 in that direction, the control unit 2400 confirms that the objects 40 and 50 have been bypassed. When the user 12 reaches the point W, the control unit 2400 causes the pseudo force sense generation unit 2300 to generate a pseudo force sense in the direction returning to the original route. The guided user 12 thereby changes the traveling direction to the upper left.
 使用者12は、疑似力覚にしたがって本来ルートへ戻る方向(地点Wから実線の矢印方向(左上方向))へ進行することにより、本来ルートに合流する地点V2へ向かう。制御部2400は、地点V2へ進行中に、障害物検出部2200により進行方向の物体を検出させる。制御部2400は、地点Wからの運動軌跡を示す情報を運動軌跡検出部2700から読み取って、制御部2400内に設けられた記憶部2420に記録する。 The user 12, following the pseudo force sense, proceeds in the direction returning to the original route (from the point W in the direction of the solid arrow (upper left)) toward the point V2 where the path rejoins the original route. While the user travels toward the point V2, the control unit 2400 causes the obstacle detection unit 2200 to detect objects in the traveling direction. The control unit 2400 reads information indicating the motion trajectory from the point W from the motion trajectory detection unit 2700 and records it in the storage unit 2420 provided in the control unit 2400.
 制御部2400は、記憶部2420に記憶されている運動軌跡を示す情報に基づき、使用者12が本来ルートに合流する地点V2に達したことを確認する。制御部2400は、使用者12が地点V2へ達した時点で、疑似力覚発生部2300から本来ルートの方向(図16において上)へ疑似力覚を発生させる。これにより、誘導された使用者12が進行方向を上へ変更する。 The control unit 2400 confirms, based on the trajectory information stored in the storage unit 2420, that the user 12 has reached the point V2 where the path rejoins the original route. When the user 12 reaches the point V2, the control unit 2400 causes the pseudo force sense generation unit 2300 to generate a pseudo force sense in the direction of the original route (upward in FIG. 16). The guided user 12 thereby changes the traveling direction upward.
 本来ルートの方向へ進行中に、障害物検出部2200は、進行方向の物体を繰り返し検出する。制御部2400は、地点V2からの運動軌跡を示す情報を運動軌跡検出部2700から読み取って、制御部2400内に設けられた記憶部2420に記録する。制御部2400は、使用者12が進行方向を本来ルートの方向へ変更したことを検出すると、疑似力覚発生部2300から疑似力覚の発生を停止させる。 While the user proceeds in the direction of the original route, the obstacle detection unit 2200 repeatedly detects objects in the traveling direction. The control unit 2400 reads information indicating the motion trajectory from the point V2 from the motion trajectory detection unit 2700 and records it in the storage unit 2420 provided in the control unit 2400. When the control unit 2400 detects that the user 12 has changed the traveling direction to the direction of the original route, it stops the generation of the pseudo force sense by the pseudo force sense generation unit 2300.
 上述した第3の実施形態によれば、次の作用効果が得られる。
(1)使用者12が把持する誘導装置1Bは、障害物30を回避可能な安全な代替方向へ使用者12を誘導した後、使用者12を元の進行方向(本来ルート)へ誘導することができる。また、疑似力覚発生部2300が発する疑似力覚によって使用者12に本来ルートの方向を伝えるため、進むべき方向を、使用者12に容易に理解できる方法でわかりやすく伝達することができる。
According to the third embodiment described above, the following effects can be obtained.
(1) The guiding device 1B held by the user 12 can guide the user 12 back to the original traveling direction (original route) after guiding the user 12 in a safe alternative direction that avoids the obstacle 30. In addition, since the direction of the original route is conveyed to the user 12 by the pseudo force sense generated by the pseudo force sense generation unit 2300, the direction to proceed can be communicated in a way the user 12 can easily understand.
(2)誘導装置1Bは、使用者12を代替方向へ誘導している間、障害物検出部2200によって本来ルートの方向に障害物30が存在するか否かの検出を続ける。このように構成したので、制御部2400は、障害物30を迂回したか否かの判断を適切に行うことができる。 (2) While guiding the user 12 in the alternative direction, the guiding device 1B continues, via the obstacle detection unit 2200, to detect whether the obstacle 30 is present in the direction of the original route. With this configuration, the control unit 2400 can appropriately determine whether the obstacle 30 has been bypassed.
(3)誘導装置1Bは、制御部2400によって障害物30を迂回したことが判定されると、疑似力覚発生部2300が発生する疑似力覚の方向を本来ルートに向かう方向へ変更する。このように構成したので、障害物30がない安全な経路で、使用者12を本来ルートへ誘導することができる。 (3) When the control unit 2400 determines that the obstacle 30 has been bypassed, the guiding device 1B changes the direction of the pseudo force sense generated by the pseudo force sense generation unit 2300 to the direction toward the original route. With this configuration, the user 12 can be guided back to the original route along a safe path free of the obstacle 30.
(4)誘導装置1Bは、運動軌跡検出部2700によって検出された運動軌跡と、代替方向へ誘導する前の本来ルートの情報とに基づき、制御部2400によって本来ルートに合流したか否かを判定する。このように構成したので、合流地点Vに到達したことを適切に判定することができる。 (4) The guiding device 1B determines, by the control unit 2400, whether the path has rejoined the original route, based on the motion trajectory detected by the motion trajectory detection unit 2700 and the information on the original route before the guidance in the alternative direction. With this configuration, arrival at the junction point V can be determined appropriately.
(5)誘導装置1Bは、制御部2400によって合流地点Vに到達したことを判定すると、疑似力覚発生部2300による疑似力覚の発生方向を本来ルートの方向へ変更する。このように構成したので、使用者12を本来ルートへ適切に誘導することができる。 (5) When the control unit 2400 determines that the junction point V has been reached, the guiding device 1B changes the generation direction of the pseudo force sense by the pseudo force sense generation unit 2300 to the direction of the original route. With this configuration, the user 12 can be appropriately guided along the original route.
(6)誘導装置1Bは、合流地点Vに到達後、使用者12が本来ルートの方向へ進行方向を変更したことが制御部2400によって判定されると、疑似力覚発生部2300による疑似力覚の発生を停止する。このように構成したので、使用者12に対し、本来ルートへ戻ったことを適切に伝達することができる。 (6) When the control unit 2400 determines, after arrival at the junction point V, that the user 12 has changed the traveling direction to the direction of the original route, the guiding device 1B stops the generation of the pseudo force sense by the pseudo force sense generation unit 2300. With this configuration, the user 12 can be appropriately informed of having returned to the original route.
(7) After guiding the user 12 in the alternative direction, if the guidance device 1B reaches the point U in FIG. 15 (that is, confirms that the object 30 has been bypassed) but objects 40 and 50 other than the object 30 are detected in the direction returning to the original route (upper left in FIG. 16), so that the original route cannot be rejoined, the guidance device 1B sets the direction in which the pseudo force sense generation unit 2300 generates the pseudo force sense to a direction from the point U substantially parallel to the original route (upward in FIG. 16). With this configuration, even when obstacles continue along the original route, the user 12 can be guided appropriately while being prevented from gradually drifting away from the original route.
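The detour behavior of points (3) through (7) amounts to a direction-selection rule driven by three conditions. The sketch below is an illustrative assumption, not part of the disclosure: the function name, the vector representation of directions, and the boolean flags are all hypothetical.

```python
# Hypothetical sketch of the detour logic of points (3)-(7).
# Directions are unit-like vectors in the ground plane; all names are illustrative.

def force_direction(bypassed_obstacle, return_path_clear, rejoined_route,
                    toward_route, along_route, alternative):
    """Choose the pseudo force sense direction during a detour.

    bypassed_obstacle: True once the obstacle 30 has been passed (point U).
    return_path_clear: True if no other object (40, 50) blocks the way back.
    rejoined_route:    True once the user has turned onto the original route.
    """
    if rejoined_route:
        return None               # (6) stop generating the pseudo force sense
    if not bypassed_obstacle:
        return alternative        # keep guiding in the alternative direction
    if return_path_clear:
        return toward_route       # (3) steer back toward the original route
    return along_route            # (7) run parallel to the original route


# Example: obstacle bypassed, but objects 40 and 50 still block the way back.
print(force_direction(True, False, False,
                      toward_route=(-1, 1), along_route=(0, 1),
                      alternative=(1, 0)))  # -> (0, 1)
```

In this sketch the "parallel to the route" branch is only reached after the obstacle is confirmed bypassed, mirroring the order of checks described above.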
In the above description, the control unit 2400 of the guidance device 1B calculates the original route by extending the movement locus (from the point S to the point T) based on the information indicating the movement locus; however, the original route may instead be acquired as follows. For example, the control unit 2400 of the guidance device 1B acquires map information in advance via a network (not shown) and records the map information in a storage unit 2420 provided in the control unit 2400.
The control unit 2400 then sets, for example, the current position calculated based on information from GPS (Global Positioning System) satellites as the point S, sets a destination entered in advance by the user 12 as a point G, and takes the route from the point S to the point G, calculated based on the map information stored in the storage unit 2420, as the original route.
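The publication does not specify how the direction from the point S toward the point G would be computed from GPS coordinates; one standard ingredient is the initial great-circle bearing between two latitude/longitude points, sketched below under that assumption.

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing, in degrees clockwise from north,
    from point S (lat1, lon1) to point G (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

# Due-east example: same latitude, destination slightly to the east.
print(round(initial_bearing(35.0, 139.0, 35.0, 139.1)))  # -> 90
```

An actual route through a street network would of course follow the stored map rather than a straight line; the bearing is only the direction cue that a pseudo force sense generator could be pointed along.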
The following modifications are also within the scope of the present invention, and one or more of the modifications may be combined with the above-described embodiments.
Although the above description exemplifies a guidance device for the visually impaired, the guidance device is not limited to such use. For example, it may be used as a guidance device when an evacuation route is difficult to see because of smoke or the like during a disaster, or as a guidance device for a climber lost in the mountains. In such cases, a detection unit that receives infrared light or the like, which is less affected by scattering due to smoke or fog, is used as the obstacle detection unit.
Although various embodiments and modifications have been described above, the present invention is not limited to these contents. Modes in which the configurations shown in the embodiments and modifications are combined are also included within the scope of the present invention, as are other modes conceived within the scope of the technical idea of the present invention.
The disclosure of the following Japanese patent application publication is incorporated herein by reference:
Japanese Patent Application Laid-Open No. 2015-226388
1, 1A, 1B … guidance device
2-5, 30, 40, 50 … obstacle
12 … user
13 … person
200, 2200 … obstacle detection unit
300, 1300, 2300 … pseudo force sense generation unit
400, 1400, 2400 … control unit
1200 … leader detection unit
2700 … movement locus detection unit

Claims (14)

  1.  A guidance device comprising:
      an obstacle detection unit that detects the distance and direction to a surrounding obstacle;
      an alternative direction detection unit that detects, based on a detection result of the obstacle detection unit, an alternative direction to replace the traveling direction; and
      a pseudo force sense generation unit that generates a pseudo force sense in the alternative direction.
  2.  The guidance device according to claim 1, wherein
      the obstacle detection unit generates a distance profile relating the distance and direction to the obstacle, and
      the alternative direction detection unit detects the alternative direction to replace the traveling direction based on the distance profile.
  3.  The guidance device according to claim 1 or 2, wherein
      the pseudo force sense generation unit generates a pseudo force sense in the direction opposite to the traveling direction when the alternative direction is not detected by the alternative direction detection unit.
  4.  The guidance device according to any one of claims 1 to 3, wherein
      the obstacle detection unit includes a plurality of imaging units that image the obstacle from a plurality of different viewpoints.
  5.  The guidance device according to any one of claims 1 to 4, wherein
      the pseudo force sense generation unit includes an inertial body and a drive unit that asymmetrically vibrates the inertial body in a predetermined direction.
  6.  The guidance device according to any one of claims 1 to 5, wherein
      the alternative direction detection unit starts detection of the alternative direction when the obstacle is present within a first distance in a first angle range including the traveling direction.
  7.  The guidance device according to claim 6, wherein
      the alternative direction detection unit detects, as the alternative direction, a first direction different from the traveling direction when the obstacle is not present in a first region within a second distance, farther than the first distance, in a second angle range including the first direction.
  8.  The guidance device according to claim 7, wherein
      the alternative direction detection unit does not detect the first direction as the alternative direction when the obstacle is present in the first region.
  9.  The guidance device according to claim 7, wherein,
      when the obstacle is present in neither the first region nor a second region within the second distance in the second angle range including a second direction different from the traveling direction, the alternative direction detection unit detects, as the alternative direction, whichever of the first direction and the second direction is closer to the traveling direction.
  10.  The guidance device according to any one of claims 6 to 9, wherein
      the pseudo force sense generation unit generates a pseudo force sense in the direction opposite to the traveling direction when the obstacle is present within a third distance, closer than the first distance, in the first angle range including the traveling direction.
  11.  A guidance device comprising:
      a leader detection unit that detects a leader;
      a calculation unit that calculates the direction to the leader; and
      a pseudo force sense generation unit that generates a pseudo force sense in the direction of the leader.
  12.  The guidance device according to claim 11, wherein
      the calculation unit calculates the distance to the leader, and
      the pseudo force sense generation unit generates a pseudo force sense in the direction of the leader when the distance is longer than a first distance.
  13.  The guidance device according to claim 12, wherein
      the pseudo force sense generation unit generates a pseudo force sense in the direction opposite to the leader when the distance is shorter than a second distance that is shorter than the first distance.
  14.  The guidance device according to claim 13, wherein
      the pseudo force sense generation unit does not generate a pseudo force sense when the distance is longer than the second distance and shorter than the first distance.
PCT/JP2017/046037 2017-12-21 2017-12-21 Guiding device WO2019123622A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/046037 WO2019123622A1 (en) 2017-12-21 2017-12-21 Guiding device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/046037 WO2019123622A1 (en) 2017-12-21 2017-12-21 Guiding device

Publications (1)

Publication Number Publication Date
WO2019123622A1 true WO2019123622A1 (en) 2019-06-27

Family

ID=66993259

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/046037 WO2019123622A1 (en) 2017-12-21 2017-12-21 Guiding device

Country Status (1)

Country Link
WO (1) WO2019123622A1 (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11221259A (en) * 1998-02-09 1999-08-17 Mitsubishi Electric Corp Guide apparatus for blind person
JP2001061916A (en) * 1999-08-31 2001-03-13 Toshiba Corp Device and method for obstruction teaching
JP2006134221A (en) * 2004-11-09 2006-05-25 Matsushita Electric Ind Co Ltd Tracking mobile device
JP2007271701A (en) * 2006-03-30 2007-10-18 Taketaka Yamamoto Obstacle recognition device for wheelchair for visually impaired person
JP2010282442A (en) * 2009-06-04 2010-12-16 Panasonic Electric Works Co Ltd Autonomous mobile device
JP2011224136A (en) * 2010-04-20 2011-11-10 Waseda Univ Force sense presentation device
JP2014092863A (en) * 2012-11-01 2014-05-19 Symtec Hozumi:Kk Tracking bogie
JP2015226388A (en) * 2014-05-28 2015-12-14 日本電信電話株式会社 Acceleration generating device
JP2017042251A (en) * 2015-08-25 2017-03-02 雅治 石塚 Walking assisting white stick


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020009050A1 (en) * 2018-07-04 2020-01-09 日本電信電話株式会社 Pseudo-tactile force presentation device
JP2020009006A (en) * 2018-07-04 2020-01-16 日本電信電話株式会社 Pseudo force sense presentation device
JP2021185519A (en) * 2018-07-04 2021-12-09 日本電信電話株式会社 Pseudo force sense presentation device
JP7107419B2 (en) 2018-07-04 2022-07-27 日本電信電話株式会社 Pseudo force sense presentation device
US11450184B2 (en) 2018-07-04 2022-09-20 Nippon Telegraph And Telephone Corporation Pseudo force sense generation apparatus
CN111728834A (en) * 2020-07-27 2020-10-02 王然冉 Handheld portable blind guider

Similar Documents

Publication Publication Date Title
AU2015262344B2 (en) Processing apparatus, processing system, processing program, and processing method
US10387733B2 (en) Processing apparatus, processing system, and processing method
KR101231510B1 (en) System for alarming a danger coupled with driver-viewing direction, thereof method and vehicle for using the same
JP4715325B2 (en) Information display device
US20190047588A1 (en) Driver state recognition apparatus, driver state recognition system, and driver state recognition method
JP5863607B2 (en) Pedestrian warning device
JP2016001170A (en) Processing unit, processing program and processing method
JP6693427B2 (en) Driver status detector
JP2012045706A (en) Device and method of preventing collision for remote control of mobile robot
KR20160125215A (en) The drone, the route guidance drone set and the method of route guidance using them
US11790783B2 (en) Pedestrian device, vehicle-mounted device, mobile body guidance system, and mobile body guidance method
CN109106563A (en) A kind of automation blind-guide device based on deep learning algorithm
KR20190083727A (en) Guide robot and operating method thereof
JP6501035B2 (en) Glasses-type wearable information terminal, control method thereof and control program
WO2019123622A1 (en) Guiding device
JPWO2019131143A1 (en) Information processing equipment, information processing methods, and programs
JP4270010B2 (en) Object danger judgment device
JP7276112B2 (en) Lane change decision device
JP2007230314A (en) Video presenting device and video presenting method
JP2007218655A (en) Navigation device
JP2008305283A (en) Drive support apparatus and drive support method
JP2014174091A (en) Information providing device and information providing program
JP2014174880A (en) Information processor and information program
JP2014174879A (en) Information processor and information program
JP2020144417A (en) Risk acquisition system, risk display system, and risk acquisition program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17935561

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17935561

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP