WO2014122750A1 - Mobile object - Google Patents


Info

Publication number
WO2014122750A1
WO2014122750A1
Authority
WO
WIPO (PCT)
Application number
PCT/JP2013/052888
Other languages
French (fr)
Japanese (ja)
Inventor
丈二 五十棲 (Joji Isozumi)
伸幸 中根 (Nobuyuki Nakane)
英明 野村 (Hideaki Nomura)
Original Assignee
富士機械製造株式会社 (Fuji Machine Mfg. Co., Ltd.)
Application filed by 富士機械製造株式会社 (Fuji Machine Mfg. Co., Ltd.)
Priority application: PCT/JP2013/052888
National-phase application: JP2014560569A (granted as JP6258875B2)
Publication: WO2014122750A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G: TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00: Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/04: Chairs or personal conveyances specially adapted for patients or disabled persons, motor-driven
    • A61G5/10: Parts, details or accessories
    • A61G5/14: Standing-up or sitting-down aids

Definitions

  • The present invention relates to a moving body that is permitted to move within a living area where people live, and that moves within the living area under power from a drive source.
  • As one type of moving body, the one shown in Patent Document 1 is known. As shown in FIG. 1 of Patent Document 1, that moving body includes a notification unit 3 that notifies the operator that the operation mode has been switched to the autonomous movement mode. As the notification unit 3, a display unit 81 provided in the control unit 8 and including an image data processing unit 811 and a display 812 is described. In addition to the display unit 81, a sound source unit 83 including a speaker 831 and a sound data processing unit 832, and a reaction force applying unit 84 that applies a reaction force to the occupant's operation of the joystick 821, are also described.
  • The operator can thus recognize, via the notification unit 3, that the operation mode has been switched to the autonomous movement mode, which is one of the states of the moving body. However, a person in the area cannot recognize the state of the moving body unless he or she approaches the moving body and looks at its display unit 81.
  • As another type of moving body, the one disclosed in Patent Document 2 is known. As shown in Patent Document 2, that moving body is a robot for interacting with people, and includes an annunciator for audibly indicating the presence of the robot and a beacon for optically indicating its presence (paragraph 0051). It is described that, while the robot is moving, the annunciator may emit a characteristic sound continuously or periodically, and the beacon may flash a strobe.
  • The present invention has been made to solve the above-described problems, and its object is to reliably convey the state of the moving body, such as its traveling direction, to people around the moving body.
  • A moving body according to the present invention is a moving body that is permitted to move within a living area where people live and that moves within the living area under power from a drive source, and includes: an irradiation device that projects display content onto the floor, wall, and ceiling surfaces around the moving body; a display content deriving unit that derives display content indicating state information, that is, the state of the moving body; and an irradiation unit that causes the irradiation device to project the display content derived by the display content deriving unit.
  • FIG. 4 is a block diagram showing the control device. Also shown are the arrows indicating the traveling direction of the moving body and the trace figures showing each scheduled passage area.
  • FIG. 1 is a schematic view showing an outline of a care center 10 in which the moving body 20 is arranged.
  • the care center 10 is provided with a station 11, a training room 12, and individual rooms 13a to 13d.
  • the care center 10 is a living area where people live.
  • Persons present in the care center 10 are care recipients M1, who need assistance, and assistants M2, who help the care recipients M1.
  • The station 11 is a place where the assistants M2 are stationed, and is also a base where the moving body 20 waits or is charged.
  • the moving body 20 is allowed to move in a living area where a person lives, and moves in the living area by driving by left and right drive wheel motors 21g and 21h as drive sources.
  • the training room 12 is a room where the person being assisted M1 performs training and rehabilitation.
  • Each of the private rooms 13a to 13d is a room in which the care recipient M1 lives.
  • The station 11, the training room 12, and the individual rooms 13a to 13d are provided with entrances/exits 11a, 12a, and 13a1 to 13d1, respectively, which are connected via a passage 14.
  • an arrow in the vicinity of the moving body 20 indicates the traveling direction of the moving body 20.
  • the moving body 20 is an assistance moving body for assisting the person being assisted M1. As shown in FIGS. 2 and 3, the moving body 20 includes a base 21, a robot arm unit 22, a holding unit 23, a handle 24, an irradiation device 25, and a control device 26.
  • the base 21 includes left and right base portions 21a and 21b and left and right leg portions 21c and 21d.
  • the left and right base portions 21a and 21b are disposed at a predetermined distance in the left and right direction.
  • The left and right base portions 21a and 21b are provided with left and right drive wheels 21e and 21f, respectively, and incorporate left and right drive wheel motors 21g and 21h (drive sources) that drive them.
  • the moving body 20 travels by left and right drive wheels 21e and 21f driven by left and right drive wheel motors 21g and 21h (drive sources), respectively.
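The travel described above is a standard differential-drive arrangement: the moving body goes straight when the two drive wheels turn at the same speed and turns when they differ. As a minimal kinematic sketch (the wheel radius, track width, and function names are assumptions for illustration, not values from this document):

```python
def body_velocity(omega_left, omega_right, wheel_radius=0.08, track=0.45):
    """Return (linear, angular) velocity of the body from the wheel
    angular velocities (rad/s) of the left and right drive wheels."""
    v_left = wheel_radius * omega_left
    v_right = wheel_radius * omega_right
    linear = (v_left + v_right) / 2.0      # forward speed (m/s)
    angular = (v_right - v_left) / track   # yaw rate (rad/s); >0 = left turn
    return linear, angular

# Equal wheel speeds give straight travel (zero yaw rate); a faster
# right wheel yields a left turn.
```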
  • the left and right leg portions 21c and 21d extend horizontally from the left and right base portions 21a and 21b in the forward direction.
  • Left and right driven wheels 21i and 21j are provided at the distal ends of the left and right leg portions 21c and 21d, respectively.
  • a pair of collision prevention sensors 21k and 21l are provided at the ends of the left and right leg portions 21c and 21d, respectively.
  • the collision prevention sensors 21k and 21l are sensors that detect obstacles, and the detection signals thereof are transmitted to the control device 26.
  • The robot arm unit 22 is attached to the base 21 and includes a plurality of arms 22a, 22b, and 22c that can be moved relative to one another by a drive unit including the first and second rotation motors 22a1b and 22b2 and the slide motor 22a2a.
  • The base of the first arm 22a is attached to the base 21.
  • the first arm 22a includes a slide base 22a1, a first slide 22a2, and a second slide 22a3.
  • the slide base 22a1 is formed in a substantially rectangular parallelepiped shape, and its base end is attached to the base 21 so as to be rotatable around the first rotation shaft 22a1a.
  • the slide base 22a1 is rotationally driven by a first rotation motor 22a1b.
  • The first rotation motor 22a1b is provided on the base 21.
  • a knee sensor 22a1c that detects the distance from the knee of the person being assisted M1 is provided on the front surface of the slide base 22a1.
  • the knee sensor 22a1c is composed of, for example, an ultrasonic sensor or a distance sensor using a laser.
  • the first slide portion 22a2 is formed in a substantially rectangular parallelepiped shape, and is configured to be smaller than the slide base portion 22a1.
  • the first slide portion 22a2 slides in the longitudinal direction (axial direction) with respect to the slide base portion 22a1, and is configured to be substantially accommodated in the slide base portion 22a1 when contracted.
  • the second slide portion 22a3 is formed in a substantially rectangular parallelepiped shape, and is configured to be smaller than the first slide portion 22a2.
  • The second slide portion 22a3 slides in the longitudinal direction (axial direction) with respect to the first slide portion 22a2, and is configured to be substantially accommodated in the first slide portion 22a2 when contracted.
  • the first slide portion 22a2 is provided with a slide motor 22a2a and a drive belt (not shown) driven by the slide motor 22a2a.
  • a slide base portion 22a1 and a second slide portion 22a3 are fixed to the drive belt.
  • When the slide motor 22a2a is driven, the first slide portion 22a2 extends (or contracts) along the axial direction with respect to the slide base portion 22a1, and at the same time the second slide portion 22a3 extends (or contracts) with respect to the first slide portion 22a2.
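Because the slide base portion 22a1 and the second slide portion 22a3 are both fixed to the belt carried on the first slide portion 22a2, moving the first slide portion by some amount relative to the base moves the second slide portion by the same amount relative to the first slide portion, that is, by twice that amount relative to the base. A minimal sketch of this coupling (the function and units are illustrative assumptions):

```python
def telescope_positions(first_slide_extension):
    """Extensions relative to the slide base 22a1, given the first
    slide portion's extension x (any consistent length unit)."""
    x = first_slide_extension
    # Belt coupling: the second slide travels x relative to the first
    # slide, so x + x relative to the base.
    return {"first": x, "second": 2 * x}

# Extending the first slide by 2 units extends the second slide by
# 4 units relative to the base.
```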
  • the second arm 22b is formed in a substantially rectangular parallelepiped shape, and is formed at the distal end portion of the second slide portion 22a3 so as to extend in a direction orthogonal to the longitudinal direction (forward direction).
  • the third arm 22c is formed in a substantially rectangular parallelepiped shape, and the base end portion thereof is attached to the distal end portion of the second arm 22b so as to be rotatable around the second rotation shaft 22b1.
  • the third arm 22c is rotationally driven by the second rotation motor 22b2.
  • the second rotation motor 22b2 is accommodated in the second arm 22b.
  • the holding part 23 is fixed to the tip of the third arm 22c.
  • The holding portion 23 is a member that supports both armpits of the care recipient M1 from below, while facing the care recipient, during the care recipient's standing and sitting motions, and is formed in a substantially U shape in plan view that opens forward.
  • the holding portion 23 is formed of, for example, a relatively soft material on the premise that the holding portion 23 contacts the person being assisted.
  • the handle 24 is fixed to the upper surface of the third arm 22c.
  • the handle 24 is composed of a pair of left and right bar-shaped handles, and is gripped by the left and right hands of the person being assisted M1.
  • the handle 24 is provided with contact sensors 24a and 24b that detect gripping.
  • the handle 24 is provided with a left turning switch 24c for turning the moving body 20 to the left and a right turning switch 24d for turning the moving body 20 to the right. Further, the handle 24 is provided with a stop switch 24e for stopping the moving body 20.
  • The third arm 22c is provided with a load sensor 22c1 that detects the force received from the care recipient M1 when the care recipient M1 walks while supported by the holding unit 23 or while gripping the handle 24.
  • The load sensor 22c1 is, for example, a strain-gauge sensor that detects, as a voltage change, the strain of a strain-generating body that deforms as the load changes, or a semiconductor pressure sensor in which the gauge resistance on a silicon chip changes with deflection when pressure is applied and is converted into an electrical signal.
  • the irradiation device 25 is a device that irradiates display contents on the floor surface, wall surface, and ceiling surface around the moving body 20.
  • the irradiation device 25 is a device that displays an image or video by enlarging and projecting the image or video on a floor surface, a wall surface, or a ceiling surface.
  • The irradiation device 25 is configured in the same manner as, for example, a projector, and includes a light source, a lens, and the like.
  • The irradiation device 25 is configured to transmit the light emitted from the light source through a transmissive liquid crystal panel on which an image, video, or the like is displayed, and to project the image or video onto the floor, wall, or ceiling surface through the lens.
  • Alternatively, the white light from the light source may be reflected by a reflecting mirror made up of many micromirrors whose reflection angles can be controlled independently, with the reflected light projected onto the floor, wall, or ceiling surface through the lens. In this case, the reflection angle of each micromirror is controlled according to the image or video.
  • the irradiation device 25 is provided on each of the front surface of the slide base portion 22a1 and the back surface of the first slide portion 22a2.
  • the irradiation device 25 provided on the front surface of the slide base 22a1 irradiates an image or video on the floor or wall surface in front of the moving body 20.
  • the irradiation device 25 provided on the back surface of the first slide portion 22a2 irradiates the floor surface, wall surface, or ceiling surface behind the moving body 20 with an image or video.
  • The irradiation device 25 may adjust the focus and brightness of the projected image or video based on the result of imaging by the imaging device 28.
  • the control device 26 controls the traveling and irradiation of the moving body 20.
  • The control device 26 is connected to the above-described collision prevention sensors 21k and 21l, the knee sensor 22a1c, the load sensor 22c1, the contact sensors 24a and 24b, the left turning switch 24c, the right turning switch 24d, and the stop switch 24e, as well as to the left and right drive wheel motors 21g and 21h, the first rotation motor 22a1b, the slide motor 22a2a, the second rotation motor 22b2, the irradiation device 25, the storage device 27, the imaging device 28, and the guide devices 29 and 31.
  • the control device 26 includes a microcomputer (not shown), and the microcomputer includes an input / output interface, a CPU, a RAM, and a ROM (all not shown) connected via a bus.
  • the control device 26 includes an information acquisition unit 26a, an information deriving unit 26b, a display content deriving unit 26c, and an irradiation unit 26d.
  • The information acquisition unit 26a acquires information related to the control of the moving body 20.
  • One example of the information acquisition unit 26a is a steering angle information acquisition unit that acquires steering angle information related to the steering angle of the moving body 20. The steering angle information is, for example, a left steering angle (or right steering angle) obtained by converting the on-time of the left turning switch 24c (or right turning switch 24d) into a steering angle, or a steering angle derived from the difference in rotational speed between the left drive wheel motor 21g and the right drive wheel motor 21h.
  • the information deriving unit 26b derives information regarding the state of the moving body 20 as state information from the information acquired by the information acquiring unit 26a.
  • One example of the information deriving unit 26b is a progress information deriving unit that derives, as state information, progress information regarding the travel of the moving body 20 from the steering angle information acquired by the steering angle information acquisition unit 26a. Examples of the progress information include a left turn, a right turn, going straight forward, going straight backward, and the degree of turning (turning angle, turning radius).
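As a rough illustration of how progress information might be classified from steering angle information, the following sketch assumes a sign convention (positive angle = left turn) and a small dead band; none of these conventions or constants come from this document:

```python
def derive_progress(steering_angle_deg, reverse=False, dead_band=5.0):
    """Classify travel as left turn / right turn / straight / reverse
    and report the degree of turning."""
    if abs(steering_angle_deg) <= dead_band:
        # Within the dead band the body is treated as going straight
        # (forward or backward).
        direction = "reverse" if reverse else "straight"
    elif steering_angle_deg > 0:
        direction = "left turn"
    else:
        direction = "right turn"
    return {"direction": direction, "turn_angle": abs(steering_angle_deg)}
```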
  • The information acquisition unit 26a also acquires information on whether the moving body 20 is in a normal state or an abnormal state (a control signal indicating the normal or abnormal state), and information related to the operation state, such as whether the moving body is performing a standing operation or a seating operation (operation signals such as a standing operation signal instructing a standing operation and a seating operation signal instructing a seating operation).
  • The abnormal state is a state in which the moving body 20 cannot travel (for example, due to an abnormality of the left drive wheel motor 21g) or in which the robot arm unit 22 cannot be driven (for example, due to an abnormality of the first rotation motor 22a1b); the normal state is a state in which the moving body 20 is normal as a whole. The information deriving unit 26b derives, as state information from these signals, information on the state of the moving body 20 (normal or abnormal state, standing or seated state).
  • the display content deriving unit 26c derives display content indicating state information that is the state of the moving body 20.
  • The display contents are composed of characters, figures, symbols, shapes, and the like representing the content to be conveyed. For example, when the state information is progress information, display content indicating the traveling direction and the degree of turning is derived. That is, the display content deriving unit 26c derives display content corresponding to the progress information from the progress information derived by the progress information deriving unit 26b.
  • Specifically, the corresponding arrow is read from the storage device 27: when the traveling direction is a left turn, the arrow indicating a left turn shown in FIG. 6 is read; when it is a right turn, the arrow indicating a right turn is read; when the moving body is going straight forward, the arrow indicating straight travel is read; and when it is reversing, the arrow indicating reverse is read. Further, when the progress information includes a turning degree, the curvature of the arrow indicating a left or right turn may be set according to the turning degree.
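The read-out step can be pictured as a simple lookup keyed by the derived traveling direction; here a dictionary stands in for the storage device 27, and the entry names are purely illustrative assumptions:

```python
# Stand-in for the storage device 27 holding one arrow per direction.
ARROW_CONTENT = {
    "straight":   "arrow_straight",
    "reverse":    "arrow_reverse",
    "left turn":  "arrow_left_turn",
    "right turn": "arrow_right_turn",
}

def read_arrow(direction):
    """Return the stored arrow display content for a traveling direction."""
    return ARROW_CONTENT[direction]
```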
  • the display content deriving unit 26c derives the display content corresponding to the scheduled passage area through which the moving body 20 is scheduled to pass from the steering angle information acquired by the steering angle information acquiring unit 26a.
  • The scheduled passage area is calculated by tracing, at predetermined short intervals, each position the moving body 20 will occupy up to several seconds ahead, as computed from the steering angle information and the speed of the moving body 20. This scheduled passage area is an interference area: if a person is present in it, the person may collide with the moving body 20.
  • The display content corresponding to the scheduled passage area is a figure (hereinafter referred to as a trace figure) created by tracing the positions up to several seconds ahead and having, when projected on the floor surface, a width substantially the same as or slightly wider than the width of the moving body 20.
  • The display content corresponding to the scheduled passage area may be composed of display contents in a plurality of stages corresponding to the magnitude of the steering angle acquired by the steering angle information acquisition unit 26a; that is, trace figures with several stages of curvature may be stored in the storage device 27, and the control device 26 reads the trace figure corresponding to the steering angle.
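The tracing of the scheduled passage area described above can be sketched as follows, reinterpreting the steering angle as a constant yaw rate and choosing an arbitrary horizon and time step; these choices are assumptions of the sketch, not values from the document:

```python
import math

def trace_scheduled_passage(speed, yaw_rate, horizon=3.0, step=0.1):
    """Trace the positions the moving body is scheduled to pass through,
    at fixed short intervals up to `horizon` seconds ahead, starting
    from the origin facing +x. Returns a list of (x, y) points."""
    x = y = heading = 0.0
    points = [(x, y)]
    for _ in range(int(round(horizon / step))):
        heading += yaw_rate * step
        x += speed * step * math.cos(heading)
        y += speed * step * math.sin(heading)
        points.append((round(x, 3), round(y, 3)))
    return points

# Widening each traced point to slightly more than the body width would
# give the trace figure projected on the floor.
```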
  • If the state information is information regarding a normal state (or abnormal state), the control device 26 reads characters, figures, symbols, shapes, and the like indicating the normal state (or abnormal state) from the storage device 27.
  • If the state information is information related to the operation state, characters, figures, symbols, shapes, and the like indicating the operation state are read from the storage device 27.
  • The irradiation unit 26d causes the irradiation device 25 to project the display content derived by the display content deriving unit 26c onto the floor, wall, and ceiling surfaces around the moving body 20.
  • the storage device 27 stores each display content indicating each state information.
  • The display contents include an arrow indicating forward travel, an arrow indicating reverse travel, an arrow indicating a left turn, an arrow indicating a right turn, a trace figure indicating the scheduled passage area when going straight, a trace figure indicating the scheduled passage area when turning left, and a trace figure indicating the scheduled passage area when turning right.
  • The arrow indicating a left turn and the arrow indicating a right turn shown in FIG. 6, and the trace figures indicating the scheduled passage areas when turning left and when turning right, are examples for a turn of 90 degrees.
  • the imaging device 28 is provided on each of the front surface of the slide base portion 22a1 and the back surface of the first slide portion 22a2.
  • the imaging device 28 provided on the front surface of the slide base 22a1 captures an object in front of the moving body 20.
  • the imaging device 28 provided on the back surface of the first slide portion 22a2 captures an object behind or above the moving body 20.
  • The moving body 20 is provided with a guide device 29 that announces the state of the moving body 20, by voice or display, to surrounding people, including the care recipient M1 and the assistant M2.
  • the guidance device 29 may be a speaker that outputs voice, or a display device such as an LCD or LED that displays characters, graphics, and the like.
  • The moving body 20 is also provided with a guide device 31 that displays whether a standing operation, a seating operation, or standby is in progress.
  • the guide device 31 includes an upward arrow 31a, a circle 31b, and a downward arrow 31c arranged in order from the top.
  • the upward arrow 31a, the circle 31b, and the downward arrow 31c are turned on and off, respectively.
  • An upward arrow 31a, a circle 31b, and a downward arrow 31c indicate a standing operation, a standby operation, and a seating operation, respectively.
  • Next, the operation of the moving body 20 configured as described above will be described, beginning with its movement.
  • a case where the moving body 20 moves independently from the station 11 to each of the individual rooms 13a to 13d (or from each of the individual rooms 13a to 13d to the station 11) will be described.
  • In this case, the moving body 20 moves along routes, stored in the storage device 27, from the entrance/exit 11a of the station 11 to each of the entrances/exits 13a1 to 13d1 of the individual rooms 13a to 13d.
  • the moving body 20 reads the guide mark 14a provided in the passage 14 via the imaging device 28, calculates the remaining journey from the information, and moves based on the result.
  • the guide mark 14a is, for example, a two-dimensional barcode.
  • The two-dimensional barcode describes information such as the current point (for example, an intersection of the passage 14) and the distance and direction from the current point toward the destination (for example, when the moving body 20 moves from the station 11 to the first private room 13a, the distance and direction (left turn) from the intersection of the passage 14 to the first private room 13a once the intersection is reached).
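The route information said to be carried in the guide marks can be pictured with a hypothetical data layout; the document only states that the current point and the distance and direction toward the destination are described in the barcode, so the structure, names, and values below are illustrative assumptions:

```python
# Hypothetical decoded content of the guide mark at the passage
# intersection: the current point plus, per destination, the next leg.
GUIDE_MARK_AT_INTERSECTION = {
    "current": "passage-intersection",
    "legs": {
        "private-room-13a": {"distance_m": 4.0, "direction": "left turn"},
        "training-room-12": {"distance_m": 7.5, "direction": "right turn"},
    },
}

def next_leg(mark, destination):
    """Return the (direction, remaining distance) toward the destination
    read from a decoded guide mark."""
    leg = mark["legs"][destination]
    return leg["direction"], leg["distance_m"]

# En route to the first private room, reading this mark tells the moving
# body to turn left and travel the remaining leg.
```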
  • the guide marks 14a are provided at the corners of the entrance / exit 11a of the station 11, the entrances / exits 13a1 to 13d1 of the individual chambers 13a to 13d, and predetermined locations of the passage 14 (for example, the corners of the intersections and the ceiling surface).
  • irradiation by the irradiation device 25 when the moving body 20 moves will be described.
  • the moving body 20 moves from the station 11 to the first private room 13a.
  • the moving body 20 turns left at the entrance / exit 11a of the station 11 and exits to the passage 14, turns right at the intersection of the passage 14, turns left at the entrance / exit 13a1 of the first private room 13a, and enters the first private room 13a.
  • The moving body 20 moves with its front surface facing the traveling direction. At this time, irradiation is performed from the front irradiation device 25.
  • When going straight, the moving body 20 moves while projecting onto the floor or wall surface (including the ceiling surface, in the case of the rear irradiation device 25) an arrow indicating straight travel (or a trace figure indicating the scheduled passage area during straight travel) (see the upper part of FIG. 8).
  • When making a left turn, the moving body 20 moves while projecting onto the floor or wall surface an arrow indicating a left turn (or a trace figure indicating the scheduled passage area during a left turn) (see the middle part of FIG. 8). When making a right turn, it moves while projecting onto the floor or wall surface an arrow indicating a right turn (or a trace figure indicating the scheduled passage area during a right turn) (see the lower part of FIG. 8).
  • In FIG. 8, an arrow indicating the traveling direction is superimposed on the trace figure indicating the scheduled passage area, but only the trace figure, or only the arrow indicating the traveling direction, may be displayed.
  • When the moving body 20 is moving to the place where the care recipient M1 is located (for example, the first private room 13a), the guide device 29 informs at least one of the care recipient M1 and the persons present around the moving body 20 that the moving body 20 is moving to that place.
  • the moving body 20 enters the first private room 13a from the entrance / exit 13a1 of the first private room 13a, and then approaches the person M1 sitting on the side of the bed.
  • the moving body 20 moves forward with the front surface of the moving body 20 in the traveling direction.
  • The moving body 20 reads the guide mark 14b provided in the vicinity of the care recipient M1 through the front imaging device 28, and approaches the care recipient M1 based on that information.
  • The moving body 20 moves while projecting, from the front irradiation device 25, an arrow indicating straight travel (or a trace figure indicating the scheduled passage area during straight travel) onto the floor or wall surface (see the upper part of FIG. 8).
  • Display content indicating that the moving body is approaching may also be projected from the rear irradiation device 25 onto the wall surface (the wall surface in front of the care recipient M1) or the ceiling surface.
  • When the moving body 20 approaches to a distance at which the knee sensor 22a1c detects the knee of the care recipient M1 (see FIG. 9), the moving body 20 stops the irradiation from the front surface and instead projects, from the rear surface onto the wall surface (the wall surface in front of the care recipient M1) or the ceiling surface, display content indicating that the moving body 20 is approaching.
  • The moving body 20 then moves up to a predetermined position at which the distance from the seated care recipient M1 is a predetermined distance, using the detection result of the knee sensor 22a1c (the distance between the moving body 20 and the knee of the care recipient M1). This predetermined position is the optimum position (optimum standing position) for raising the care recipient M1.
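The approach to the optimum standing position can be sketched as a stop condition on the knee sensor's measured distance; the target distance, gain, and speed limit below are illustrative assumptions, not values from the document:

```python
def approach_speed(knee_distance_m, target_m=0.35, max_speed=0.2):
    """Forward speed command while approaching the seated care recipient.

    Slows in proportion to the remaining distance and stops once the
    assumed optimum standing distance is reached.
    """
    remaining = knee_distance_m - target_m
    if remaining <= 0.0:
        return 0.0            # optimum standing position reached: stop
    return min(max_speed, 0.5 * remaining)
```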
  • Next, the moving body 20 guides the care recipient M1 to "hold the handle". For example, the moving body 20 projects display content prompting the care recipient to grip the handle 24 from the rear surface onto the wall surface (the wall surface in front of the care recipient M1) or the ceiling surface. A voice prompting the care recipient to grip the handle 24 may also be output from the guide device 29.
  • When the care recipient M1 grips the handle 24 with both hands and the contact sensors 24a and 24b detect that the handle 24 has been gripped, the moving body 20 performs a standing operation to raise the care recipient M1. At this time, the moving body 20 informs the care recipient M1 (and nearby persons) that a standing operation is in progress. For example, the moving body 20 projects display content indicating that the standing operation is in progress from the rear surface onto the wall surface (the wall surface in front of the care recipient M1) or the ceiling surface. A voice announcing the standing operation may also be output from the guide device 29.
  • When the standing operation starts, the moving body 20 holds the upper body of the seated care recipient M1 with the holding unit 23 (see FIG. 10), and raises the care recipient M1 while holding him or her.
  • The moving body 20 thus assists the care recipient M1 in the standing state.
  • The care recipient M1 walks and moves while his or her armpits are held by the holding unit 23.
  • When the moving body 20 assisting the walking of the care recipient M1 moves from the first private room 13a to the training room 12, it moves along the route stored in the storage device 27, or by reading the guide marks 14a with the imaging device 28, in the same manner as when moving alone as described above.
  • Specifically, the moving body 20 turns right at the entrance/exit 13a1 of the first private room 13a, exits into the passage 14, turns right at the intersection of the passage 14, turns left at the entrance/exit 12a of the training room 12, and enters the training room 12.
  • In this case, the moving body 20 moves with its back surface facing the traveling direction, and the rear irradiation device 25 projects an arrow or trace figure indicating the traveling direction as shown in FIG. 8, in the same manner as when the front surface faces the traveling direction.
  • To seat the care recipient, the moving body 20 causes the care recipient M1 in the standing state (see FIG. 11) to be seated while the upper body of the care recipient is held by the holding unit 23 (see FIG. 10).
  • the moving body 20 provides guidance to the person being assisted M1 (or a nearby person) that the seating operation is being performed.
  • For example, the moving body 20 projects display content indicating that the seating operation is in progress from the rear surface onto the wall surface (the wall surface in front of the care recipient M1) or the ceiling surface.
  • In addition, the downward arrow 31c of the guide device 31 may be turned on.
  • Next, the moving body 20 guides the care recipient M1 to "let go of the handle". For example, the moving body 20 projects display content prompting the care recipient to release the handle 24 onto the wall surface (the wall surface in front of the care recipient M1) or the ceiling surface. A voice prompting the care recipient to release the handle 24 may also be output from the guide device 29.
  • Upon detecting with the contact sensors 24a and 24b that the hands have been released from the handle 24, the moving body 20 moves away from the person being assisted M1.
  • The display content deriving unit 26c derives display content indicating state information, i.e., the state of the moving body 20 (for example, the traveling direction of the moving body, or whether the moving body is normal or abnormal), and the irradiation unit 26d projects the display content derived by the display content deriving unit 26c with the irradiation device 25 onto the floor surface, wall surface, or ceiling surface around the moving body 20.
  • A person around the moving body 20 can directly and visually recognize the display content indicating the state information of the moving body 20 projected on the surrounding floor, wall, or ceiling, and can therefore recognize it reliably even at a distance, without approaching the moving body 20. In other words, the moving body 20 can reliably convey its state even to a person who is some distance away from it.
  • The steering angle information acquisition unit 26a acquires steering angle information related to the steering angle of the moving body 20, and the progress information deriving unit 26b derives, as state information, progress information regarding the progress of the moving body 20 from the steering angle information acquired by the steering angle information acquisition unit 26a.
  • The display content deriving unit 26c derives display content corresponding to the progress information from the progress information derived by the progress information deriving unit 26b.
  • The display content deriving unit 26c derives display content corresponding to a scheduled passage area through which the moving body 20 is scheduled to pass, from the steering angle information acquired by the steering angle information acquisition unit 26a (see FIG. 6).
  • The display content corresponding to the scheduled passage area is composed of display contents in a plurality of stages according to the magnitude of the steering angle acquired by the steering angle information acquisition unit. This allows a person around the moving body 20 to grasp the scheduled passage area of the moving body 20 more accurately.
  • The moving body 20 is an assistance moving body for assisting the person being assisted M1, who needs assistance.
  • Because the moving body 20 is an assistance moving body, a person around it can intuitively and accurately grasp its progress information.
  • The guide device 29 informs at least one of the person being assisted M1 and the people around the assistance moving body 20 that the assistance moving body 20 is moving to where the person being assisted M1 is. Thus, while the assistance moving body 20 is moving to the person being assisted M1, the guide device 29 announces this in addition to the state information projected by the irradiation device 25, so the movement can be conveyed even more reliably.
  • The moving body according to the present invention is not limited to the assistance moving body; it includes any moving body that is allowed to move within a living area where people live (or within a pedestrian area where people walk) and that moves within that area driven by a drive source, such as an automated guided vehicle in a factory, an electric wheelchair, or a so-called senior car (a single-seat electric vehicle).
  • The steering angle information acquisition unit may be configured to acquire steering angle information related to the steering angle from a steering angle sensor that detects the steering angle of a steering wheel operated by the person.
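The pipeline summarised in these points, state information in, display content out, then projection, can be sketched as follows. This is a minimal illustration only: all names and stored contents are hypothetical and do not come from the document.

```python
from dataclasses import dataclass

@dataclass
class StateInfo:
    direction: str   # "straight", "left", "right", or "reverse"
    status: str      # "normal" or "abnormal"

# Stand-ins for the display contents held in a storage device like 27.
DISPLAY_CONTENTS = {
    ("straight", "normal"): "straight-arrow",
    ("left", "normal"): "left-turn-arrow",
    ("right", "normal"): "right-turn-arrow",
    ("reverse", "normal"): "reverse-arrow",
}

def derive_display_content(state: StateInfo) -> str:
    """Display-content deriving step: pick the figure for the current state.
    An abnormal state overrides the direction arrow with a warning figure."""
    if state.status == "abnormal":
        return "abnormal-warning"
    return DISPLAY_CONTENTS[(state.direction, state.status)]

def project(content: str, surface: str) -> str:
    """Projection step: the real device would drive the irradiation device;
    here it simply reports what would be projected where."""
    return f"projecting '{content}' onto {surface}"
```

For example, a left turn in the normal state would yield the left-turn arrow, which the projection step would then send to the floor surface.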

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)
  • Rehabilitation Tools (AREA)

Abstract

Provided is a mobile object that reliably conveys the state of the mobile object, such as the traveling direction thereof, to people around the mobile object, even at a distance. This mobile object (20) is a mobile object that is permitted to move within a residential area where people reside, and that moves within the residential area by being driven by a drive source, the mobile object comprising: a projection device (25) that projects a display content onto a floor surface, a wall surface, and/or a ceiling surface in the periphery of the mobile object (20); a display content derivation unit (26c) that derives a display content indicating state information on the state of the mobile object (20); and a projection unit (26d) that projects, by means of the projection device (25), the display content derived by the display content derivation unit (26c).

Description

Moving body
The present invention relates to a moving body that is allowed to move within a living area where people live and that moves within the living area driven by a drive source.
As one type of moving body, the one shown in Patent Document 1 is known. As shown in FIG. 1 of Patent Document 1, the moving body includes a notification unit 3 that notifies the operator that the mode has switched from the operation mode to the autonomous movement mode. As the notification unit 3, a display unit 81 provided in the control unit 8 and including an image data processing unit 811 and a display 812 is described. In addition to the display unit 81, a sound source unit 83 including a speaker 831 and a sound data processing unit 832, and a reaction force applying unit 84 that applies a reaction force against the occupant's operation of the joystick 821, are also described.
In the moving body configured in this way, the operator can recognize through the notification unit 3 that the mode has switched from the operation mode to the autonomous movement mode, which is one aspect of the state of the moving body. A person around the moving body, however, cannot recognize the state of the moving body unless he or she approaches it and looks at its display unit 81.
As another type of moving body, the one shown in Patent Document 2 is known. As shown in Patent Document 2, the moving body is a robot for interacting with people, and can include an annunciator for audibly indicating the robot's presence and a beacon for optically indicating its presence (paragraph 0051). It is described that, while the robot is moving, the annunciator may emit a characteristic sound continuously or periodically, and the beacon may flash a strobe light.
Patent Document 1: JP 2010-064215 A
Patent Document 2: JP 2009-509673 A
In the moving body described in Patent Document 2 above, a person around the moving body can recognize that the moving body is moving without approaching it, but cannot recognize detailed information such as its traveling direction. In addition, states of the moving body other than the fact that it is moving cannot be recognized.
The present invention has been made to solve the problems described above, and its object is to reliably convey the state of a moving body, such as its traveling direction, to people around the moving body even at a distance.
To solve the above problems, a moving body according to the present invention is a moving body that is allowed to move within a living area where people live and that moves within the living area driven by a drive source, the moving body comprising: an irradiation device that projects display content onto a floor surface, wall surface, or ceiling surface around the moving body; a display content deriving unit that derives display content indicating state information, i.e., the state of the moving body; and an irradiation unit that projects, by means of the irradiation device, the display content derived by the display content deriving unit.
FIG. 1 is a schematic diagram showing an outline of a care center in which a moving body according to an embodiment of the present invention is arranged.
FIG. 2 is a right side view of the moving body shown in FIG. 1.
FIG. 3 is a plan view of the moving body shown in FIG. 1.
FIG. 4 is a block diagram of the moving body shown in FIG. 1.
FIG. 5 is a block diagram showing the control device of FIG. 4.
FIG. 6 shows arrows indicating the traveling direction of the moving body and trace figures indicating scheduled passage areas.
FIG. 7 shows the guide device 31.
FIG. 8 is a plan view showing the moving body traveling while the irradiation device projects an arrow or a trace figure; the upper row shows straight travel, the middle row a left turn, and the lower row a right turn.
FIG. 9 is a side view showing the moving body approaching the person being assisted.
FIG. 10 is a side view showing the moving body supporting the seated person being assisted.
FIG. 11 is a side view showing the moving body supporting the standing person being assisted.
An embodiment of the moving body according to the present invention will now be described. FIG. 1 is a schematic diagram showing an outline of a care center 10 in which the moving body 20 is deployed. The care center 10 is provided with a station 11, a training room 12, and private rooms 13a to 13d. The care center 10 is a living area where people live. The people in the care center 10 are persons being assisted M1, who need assistance, and assistants M2, who assist them.
As shown in FIG. 1, the station 11 is the assistants' M2 duty station and the base where the moving body 20 waits and is charged. The moving body 20 is allowed to move within the living area where people live and moves within the living area driven by the left and right drive wheel motors 21g and 21h, which serve as drive sources. The training room 12 is a room where the persons being assisted M1 undergo training and rehabilitation. The private rooms 13a to 13d are rooms where the persons being assisted M1 live.
The station 11, the training room 12, and the private rooms 13a to 13d are provided with entrances/exits 11a, 12a, and 13a1 to 13d1, respectively, which are connected via a passage 14. In FIG. 1, the arrow near the moving body 20 indicates its traveling direction.
The moving body 20 is an assistance moving body for assisting the person being assisted M1. As shown in FIGS. 2 and 3, the moving body 20 includes a base 21, a robot arm unit 22, a holding unit 23, a handle 24, an irradiation device 25, and a control device 26.
The base 21 includes left and right base portions 21a and 21b and left and right leg portions 21c and 21d. The left and right base portions 21a and 21b are arranged a predetermined distance apart in the left-right direction and are provided with left and right drive wheels 21e and 21f, respectively; they house the left and right drive wheel motors 21g and 21h (drive sources) that drive the left and right drive wheels 21e and 21f, respectively. The moving body 20 travels on the left and right drive wheels 21e and 21f, each driven by the corresponding drive wheel motor 21g or 21h.
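A base of this kind, with independently driven left and right wheels, steers by running the two wheels at different speeds. A minimal sketch of the differential-drive kinematics follows; the wheel radius and track width are illustrative values, not taken from this document.

```python
WHEEL_RADIUS = 0.10  # m, assumed wheel radius
TRACK_WIDTH = 0.50   # m, assumed distance between the left and right drive wheels

def body_velocity(omega_left: float, omega_right: float):
    """Return (forward speed, yaw rate) of the base from the wheel angular
    velocities in rad/s: equal speeds give straight travel, unequal speeds
    turn the base toward the slower wheel (positive yaw rate = left turn)."""
    v_left = WHEEL_RADIUS * omega_left
    v_right = WHEEL_RADIUS * omega_right
    v = (v_left + v_right) / 2.0
    yaw_rate = (v_right - v_left) / TRACK_WIDTH
    return v, yaw_rate
```

With both wheels at the same speed the yaw rate is zero (straight travel); driving the right wheel faster than the left produces a positive yaw rate, i.e. a left turn.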
The left and right leg portions 21c and 21d extend horizontally forward from the left and right base portions 21a and 21b. Left and right driven wheels 21i and 21j are provided at the distal ends of the left and right leg portions 21c and 21d, respectively, as are a pair of collision prevention sensors 21k and 21l. The collision prevention sensors 21k and 21l detect obstacles, and their detection signals are transmitted to the control device 26.
The robot arm unit 22 has its base attached to the base 21 and includes a plurality of arms 22a, 22b, and 22c that can move relative to one another by means of a drive unit composed of first and second rotation motors 22a1b and 22b2 and a slide motor 22a2a.
The first arm 22a has its base attached to the base 21. The first arm 22a includes a slide base 22a1, a first slide portion 22a2, and a second slide portion 22a3. The slide base 22a1 is formed in a substantially rectangular parallelepiped shape, and its base end is attached to the base 21 so as to be rotatable about a first rotation shaft 22a1a. The slide base 22a1 is rotationally driven by a first rotation motor 22a1b, which is provided on the base 21.
A knee sensor 22a1c that detects the distance to the knees of the person being assisted M1 is provided on the front surface of the slide base 22a1. The knee sensor 22a1c is composed of, for example, an ultrasonic sensor or a laser-based distance sensor.
The first slide portion 22a2 is formed in a substantially rectangular parallelepiped shape and is smaller than the slide base 22a1. The first slide portion 22a2 slides in the longitudinal (axial) direction relative to the slide base 22a1 and, when retracted, is almost entirely housed within the slide base 22a1.
The second slide portion 22a3 is formed in a substantially rectangular parallelepiped shape and is smaller than the first slide portion 22a2. The second slide portion 22a3 slides in the longitudinal (axial) direction relative to the first slide portion 22a2 and, when retracted, is almost entirely housed within the first slide portion 22a2.
The first slide portion 22a2 is provided with a slide motor 22a2a and a drive belt (not shown) driven by the slide motor 22a2a. The slide base 22a1 and the second slide portion 22a3 are fixed to the drive belt. When the slide motor 22a2a is driven, the first slide portion 22a2 extends (or retracts) along the axial direction relative to the slide base 22a1, and at the same time the second slide portion 22a3 extends (or retracts) relative to the first slide portion 22a2.
The second arm 22b is formed in a substantially rectangular parallelepiped shape and extends from the distal end of the second slide portion 22a3 in a direction (forward) orthogonal to the longitudinal direction.
The third arm 22c is formed in a substantially rectangular parallelepiped shape, and its base end is attached to the distal end of the second arm 22b so as to be rotatable about a second rotation shaft 22b1. The third arm 22c is rotationally driven by a second rotation motor 22b2, which is housed within the second arm 22b.
The holding unit 23 is fixed to the tip of the third arm 22c. The holding unit 23 is a member that supports the arms (the armpits) of the person being assisted M1 from below when facing him or her during, for example, standing and sitting operations, and is formed in a substantially U shape in plan view that opens forward. The holding unit 23 is made of, for example, a relatively soft material, on the premise that it comes into contact with the person being assisted M1.
The handle 24 is fixed to the upper surface of the third arm 22c. The handle 24 consists of a pair of left and right bar-shaped grips to be gripped by the left and right hands of the person being assisted M1. The handle 24 is provided with contact sensors 24a and 24b that detect gripping, a left turn switch 24c for turning the moving body 20 left, a right turn switch 24d for turning it right, and a stop switch 24e for stopping it.
The third arm 22c is also provided with a load sensor 22c1 that detects the force received from the person being assisted M1 when he or she walks while supported by the holding unit 23 or while gripping the handle 24. The load sensor 22c1 is, for example, a sensor that detects, as a voltage change, the strain of a strain-generating body that changes with the load, or a semiconductor pressure sensor whose gauge resistance changes with the deflection of a silicon chip under pressure and is converted into an electric signal.
The irradiation device 25 projects display content onto the floor surface, wall surface, or ceiling surface around the moving body 20. The irradiation device 25 displays images or video by enlarging and projecting them onto the floor, wall, or ceiling. It is configured in the same manner as, for example, a projector, and includes a light source, a lens, and the like.
That is, the irradiation device 25 passes the light emitted from the light source through a transmissive liquid crystal panel on which the image or video is displayed, and projects the image or video through a lens onto the floor, wall, or ceiling. Alternatively, the white light emitted from the light source may be collected by a condenser lens and reflected by a reflecting mirror composed of many tiny micromirrors whose reflection angles can be controlled independently, with the reflected light projected through a lens onto the floor, wall, or ceiling. In this case, the reflection angle of each micromirror is controlled according to the image or video.
The irradiation device 25 is provided on each of the front surface of the slide base 22a1 and the back surface of the first slide portion 22a2. The irradiation device 25 on the front surface of the slide base 22a1 projects images or video onto the floor or wall in front of the moving body 20. The irradiation device 25 on the back surface of the first slide portion 22a2 projects images or video onto the floor, wall, or ceiling behind the moving body 20.
The irradiation device 25 may also adjust its focus and brightness based on the result of the imaging device 28 capturing the image or video that the irradiation device 25 has projected.
The control device 26 controls the traveling and projection of the moving body 20. As shown in FIG. 4, connected to the control device 26 are the collision prevention sensors 21k and 21l, the knee sensor 22a1c, the load sensor 22c1, the contact sensors 24a and 24b, the left turn switch 24c, the right turn switch 24d, the stop switch 24e, the left and right drive wheel motors 21g and 21h, the first rotation motor 22a1b, the slide motor 22a2a, the second rotation motor 22b2, the irradiation device 25, the storage device 27, the imaging device 28, and the guide devices 29 and 31. The control device 26 also has a microcomputer (not shown), which includes an input/output interface, a CPU, a RAM, and a ROM (none shown) connected via a bus.
As shown in FIG. 5, the control device 26 includes an information acquisition unit 26a, an information deriving unit 26b, a display content deriving unit 26c, and an irradiation unit 26d.
The information acquisition unit 26a acquires information related to the control of the moving body 20. For example, the information acquisition unit 26a includes a steering angle information acquisition unit that acquires steering angle information related to the steering angle of the moving body 20. In the present embodiment, the steering angle information includes a left (or right) steering angle obtained by converting the on-time of the left turn switch 24c (or right turn switch 24d) into a steering angle, and a left (or right) steering angle obtained by converting the control signal of the left drive wheel motor 21g (or right drive wheel motor 21h) into a steering angle.
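The conversion of the turn-switch on-time into a steering angle could look like the following minimal sketch. The conversion rate and the 90-degree cap are illustrative assumptions; the document does not specify either value.

```python
DEGREES_PER_SECOND = 30.0  # assumed rate at which switch on-time maps to angle

def steering_angle_from_switch(on_time_s: float, side: str) -> float:
    """Return a signed steering angle in degrees from the on-time of the
    left or right turn switch: negative for a left turn, positive for a
    right turn, capped at 90 degrees in either direction."""
    angle = min(on_time_s * DEGREES_PER_SECOND, 90.0)
    return -angle if side == "left" else angle
```

Holding the left turn switch for one second would thus yield a steering angle of -30 degrees under these assumptions.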
The information deriving unit 26b derives information on the state of the moving body 20 as state information from the information acquired by the information acquisition unit 26a. For example, the information deriving unit 26b includes a progress information deriving unit that derives, as state information, progress information on the progress of the moving body 20 from the steering angle information acquired by the steering angle information acquisition unit 26a. In the present embodiment, the progress information includes the traveling direction (left turn, right turn, straight ahead, or reverse) and the degree of turning when turning (turning angle and turning radius). Both the traveling direction and the degree of turning can be derived from the steering angle information.
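The derivation of progress information from the steering angle can be sketched as follows. The sign convention (negative = left, positive = right) and the straight-travel threshold are assumptions made for illustration.

```python
STRAIGHT_THRESHOLD = 2.0  # degrees within which travel counts as straight (assumed)

def derive_progress_info(steering_angle_deg: float, reversing: bool = False):
    """Return (traveling direction, degree of turning) as state information.
    A small steering angle counts as straight travel; otherwise the sign
    of the angle selects a left or right turn."""
    if reversing:
        return "reverse", 0.0
    if abs(steering_angle_deg) <= STRAIGHT_THRESHOLD:
        return "straight", 0.0
    direction = "left" if steering_angle_deg < 0 else "right"
    return direction, abs(steering_angle_deg)
```

A steering angle of -45 degrees, for instance, would be classified as a left turn with a turning degree of 45.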
The information acquisition unit 26a also acquires information on whether the moving body 20 is in a normal or abnormal state (a control signal indicating the normal or abnormal state) and information on its operating state, such as whether it is performing a standing operation or a seating operation (operation signals such as a standing operation signal instructing a standing operation and a seating operation signal instructing a seating operation). The abnormal state is a state in which the moving body 20 cannot travel (for example, due to a fault in the left drive wheel motor 21g) or in which the robot arm unit 22 cannot be driven (for example, due to a fault in the first rotation motor 22a1b); the normal state is a state in which the moving body 20 as a whole is normal. The information deriving unit 26b further derives from these signals, as state information, information on the state of the moving body 20 (whether it is normal or abnormal, or whether it is performing a standing or seating operation).
The display content deriving unit 26c derives display content indicating the state information, i.e., the state of the moving body 20. The display content is composed of characters, figures, symbols, shapes, and the like representing that content. For example, when the state information is progress information, display content indicating the traveling direction and degree of turning is derived. That is, the display content deriving unit 26c derives, from the progress information derived by the progress information deriving unit 26b, the display content corresponding to that progress information.
When the progress information is a traveling direction, the corresponding arrow is read from the storage device 27. Specifically, when the traveling direction is a left turn, the arrow indicating a left turn shown in FIG. 6 is read. Similarly, for a right turn, the arrow indicating a right turn is read; for straight travel, the arrow indicating straight travel; and for reverse, the arrow indicating reverse.
When the progress information is a degree of turning, the curvature of the arrow indicating a left or right turn may be set according to the degree of turning.
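Reading the arrow for the traveling direction and bending it according to the degree of turning might look like the following sketch. The stored figure names and the curvature formula are hypothetical stand-ins for the contents of the storage device 27.

```python
# Stand-ins for the arrow figures stored in the storage device.
ARROWS = {
    "straight": "straight-arrow",
    "reverse": "reverse-arrow",
    "left": "left-turn-arrow",
    "right": "right-turn-arrow",
}

def arrow_for(direction: str, turning_degree: float = 0.0):
    """Return the stored arrow name plus a curvature parameter in [0, 1]
    that a renderer could use to bend the turn arrow; straight and reverse
    arrows get a curvature of 0.0."""
    arrow = ARROWS[direction]
    curvature = turning_degree / 90.0 if direction in ("left", "right") else 0.0
    return arrow, curvature
```

A left turn with a turning degree of 45, for example, would map to the left-turn arrow bent halfway toward a full 90-degree curve.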
The display content deriving unit 26c further derives, from the steering angle information acquired by the steering angle information acquisition unit 26a, the display content corresponding to the scheduled passage area through which the moving body 20 is scheduled to pass. The scheduled passage area is calculated by tracing the positions of the moving body 20 at predetermined short intervals up to several seconds ahead, computed from the steering angle information and the speed of the moving body 20. The scheduled passage area is an interference area: a person within it risks colliding with the moving body 20. The display content corresponding to the scheduled passage area is a figure (hereinafter, a trace figure) that, when projected onto the floor, has a width roughly equal to or slightly wider than that of the moving body 20 and traces the path up to the position several seconds ahead.
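The tracing computation can be sketched as follows: the body's position is stepped forward at short intervals from its speed and yaw rate, and each traced point carries the half-width to be painted on the floor. The horizon, step, and body-width constants are illustrative assumptions, not values from the document.

```python
import math

BODY_WIDTH = 0.6   # m, assumed width of the moving body
HORIZON_S = 3.0    # trace "several seconds" ahead (assumed)
STEP_S = 0.1       # "predetermined short time" between traced positions (assumed)

def scheduled_passage_area(speed: float, yaw_rate: float):
    """Return a list of (x, y, half_width) samples: the centreline of the
    area the body is expected to sweep over the next few seconds, plus the
    half-width to paint on either side of it."""
    n_steps = round(HORIZON_S / STEP_S)
    x = y = heading = 0.0
    samples = [(x, y, BODY_WIDTH / 2)]
    for _ in range(n_steps):
        x += speed * math.cos(heading) * STEP_S
        y += speed * math.sin(heading) * STEP_S
        heading += yaw_rate * STEP_S
        samples.append((x, y, BODY_WIDTH / 2))
    return samples
```

With zero yaw rate the trace is a straight strip ahead of the body; a nonzero yaw rate curves it to the corresponding side, matching the curved trace figures of FIG. 6.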
The display content corresponding to the scheduled passage area may also be composed of display contents in a plurality of stages according to the magnitude of the steering angle acquired by the steering angle information acquisition unit 26a. In this case, trace figures corresponding to the degrees of curvature of the respective stages may be stored in the storage device 27, and the control device 26 reads the trace figure corresponding to the steering angle.
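The multi-stage selection can be sketched as a quantisation of the steering angle to the nearest stored stage. The stage set and the figure names here are hypothetical illustrations of what the storage device might hold.

```python
STORED_STAGES = [-90, -60, -30, 0, 30, 60, 90]  # degrees; negative = left (assumed)

def select_trace_figure(steering_angle_deg: float) -> str:
    """Pick the stored trace figure whose stage is closest to the current
    steering angle, so only a small set of figures needs to be stored."""
    stage = min(STORED_STAGES, key=lambda s: abs(s - steering_angle_deg))
    if stage == 0:
        return "trace-straight"
    side = "left" if stage < 0 else "right"
    return f"trace-{side}-{abs(stage)}deg"
```

A steering angle of -50 degrees would thus select the 60-degree left-turn trace figure, the nearest stored stage.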
 When the state information relates to a normal state (or an abnormal state), the control device 26 reads characters, figures, symbols, shapes, and the like indicating the normal state (or abnormal state) from the storage device 27. When the state information relates to an operation state, it reads characters, figures, symbols, shapes, and the like indicating the operation state from the storage device 27.
 The irradiation unit 26d causes the irradiation device 25 to project the display content derived by the display content deriving unit 26c onto the floor, wall, or ceiling surfaces around the moving body 20.
 The storage device 27 stores the display contents indicating the respective pieces of state information. As shown in FIG. 6, the display contents include an arrow indicating the forward direction, an arrow indicating the backward direction, arrows indicating a left turn and a right turn, and trace figures indicating the scheduled passage areas when traveling straight, when turning left, and when turning right. The left-turn and right-turn arrows and trace figures shown in FIG. 6 correspond to a 90-degree turn, but the storage device 27 may store versions for each turning angle.
 Imaging devices 28 are provided on the front surface of the slide base 22a1 and on the back surface of the first slide portion 22a2. The imaging device 28 on the front surface of the slide base 22a1 captures objects in front of the moving body 20; the imaging device 28 on the back surface of the first slide portion 22a2 captures objects behind or above the moving body 20.
 The moving body 20 includes a guide device 29 that announces the state of the moving body 20 by voice or display to surrounding people, including the care receiver M1 and the caregiver M2. The guide device 29 may be a speaker that outputs voice, or a display device such as an LCD or LED that displays characters, figures, and the like.
 The moving body 20 further includes a guide device 31 that indicates whether a standing operation, a seating operation, or standby is in progress. As shown in FIG. 7, the guide device 31 consists of an upward arrow 31a, a circle 31b, and a downward arrow 31c arranged in order from the top, each of which can be lit and extinguished. The upward arrow 31a, the circle 31b, and the downward arrow 31c indicate the standing operation, standby, and the seating operation, respectively.
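The one-lamp-at-a-time behavior of guide device 31 can be sketched as a simple lookup; the state names below are illustrative, not terminology from the patent.

```python
# lamp of guide device 31 lit for each operation state
INDICATORS = {"standing": "up_arrow", "standby": "circle", "seating": "down_arrow"}

def set_indicator(state):
    """Return a lamp-name -> lit? mapping with exactly one lamp on."""
    lit = INDICATORS[state]
    return {name: (name == lit) for name in INDICATORS.values()}
```

For example, during standby only the circle 31b is lit and both arrows are extinguished.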
 Next, the operation of the moving body 20 configured as described above will be described, beginning with its movement. Consider the case where the moving body 20 moves alone from the station 11 to one of the private rooms 13a to 13d (or from one of the private rooms 13a to 13d to the station 11). When traveling along the passage 14 from the station 11 to the private rooms 13a to 13d, the moving body 20 follows a route, stored in advance in the storage device 27, from the entrance/exit 11a of the station 11 to the respective entrances/exits 13a1 to 13d1 of the private rooms 13a to 13d.
 The moving body 20 also reads the guide marks 14a provided along the passage 14 via the imaging device 28, calculates the remaining journey from that information, and moves based on the result. A guide mark 14a is, for example, a two-dimensional barcode encoding the current point (for example, an intersection of the passage 14) and the distance and direction from the current point to the destination (for example, when the moving body 20 travels from the station 11 to the first private room 13a, the distance from the intersection to the first private room 13a and the direction (left turn) upon reaching the intersection). The guide marks 14a are provided at the entrance/exit 11a of the station 11, at the corners of the entrances/exits 13a1 to 13d1 of the private rooms 13a to 13d, and at predetermined locations along the passage 14 (for example, intersection corners and the ceiling surface).
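Interpreting a guide mark's encoded data could be sketched as below. The patent specifies only that the barcode carries the current point, remaining distance, and direction; the JSON encoding and field names here are assumptions for illustration.

```python
import json

def next_move(barcode_payload):
    """Decode a guide mark payload and return the maneuver it prescribes
    as a (direction, remaining_distance) pair."""
    info = json.loads(barcode_payload)
    return (info["direction"], info["distance_m"])

# a hypothetical mark at a corridor intersection: turn left, 12.5 m to go
payload = '{"point": "corridor-intersection", "direction": "left", "distance_m": 12.5}'
```

After decoding, the moving body would turn as directed and continue for the remaining distance.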
 Projection by the irradiation device 25 while the moving body 20 moves in these cases will now be described, taking as an example the case where the moving body 20 travels from the station 11 to the first private room 13a. The moving body 20 turns left at the entrance/exit 11a of the station 11 into the passage 14, turns right at the intersection of the passage 14, turns left at the entrance/exit 13a1 of the first private room 13a, and enters the room. In principle, the moving body 20 advances with its front surface facing the traveling direction, projecting from the front irradiation device 25. When traveling straight, it moves while projecting an arrow indicating straight travel (or the trace figure for the straight-travel scheduled passage area) onto the floor or wall surfaces (including the ceiling surface in the case of the rear irradiation device 25) (see the upper part of FIG. 8). When turning left, it moves while projecting an arrow indicating a left turn (or the trace figure for the left-turn scheduled passage area) onto the floor or wall surfaces (see the middle part of FIG. 8). When turning right, it moves while projecting an arrow indicating a right turn (or the trace figure for the right-turn scheduled passage area) onto the floor or wall surfaces (see the lower part of FIG. 8).
 In FIG. 8, an arrow indicating the traveling direction is superimposed on the trace figure indicating the scheduled passage area; however, only the trace figure may be displayed, or only the arrow indicating the traveling direction may be displayed.
 When the assistance moving body 20 is moving toward the place where the care receiver M1 is (for example, the first private room 13a), the guide device 29 informs at least one of the care receiver M1 and the people around the assistance moving body 20 that the assistance moving body 20 is moving toward that place.
 Next, the case where the moving body 20 approaches the seated care receiver M1 will be described. The moving body 20 enters the first private room 13a through its entrance/exit 13a1 and then approaches the care receiver M1 seated beside the bed, advancing with its front surface facing the traveling direction. The moving body 20 reads the guide mark 14b provided near the care receiver M1 via the front imaging device 28 and approaches the care receiver M1 based on that information.
 At this time, the moving body 20 moves while projecting, from the front irradiation device 25, an arrow indicating straight travel (or the trace figure for the straight-travel scheduled passage area) onto the floor or wall surfaces (see the upper part of FIG. 8). Display content indicating that the moving body is approaching may also be projected from the rear irradiation device 25 onto the wall surface (the wall in front of the care receiver M1) or the ceiling surface.
 When the moving body 20 comes close enough for the knee sensor 22a1c to detect the knees of the care receiver M1 (see FIG. 9), it stops projecting from the front and instead projects display content indicating that it is approaching from the rear onto the wall surface (the wall in front of the care receiver M1) or the ceiling surface.
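The switch between front and rear projection described above can be summarized in a small selector; this is a sketch of the described behavior, and the argument names are illustrative.

```python
def active_projector(advancing_front_first, knee_detected):
    """Choose which irradiation device projects: the front device while
    advancing front-first, switching to the rear device once the knee
    sensor detects the care receiver (or when advancing back-first)."""
    if knee_detected:
        return "rear"
    return "front" if advancing_front_first else "rear"
```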
 The standing and seating operations of the moving body 20 will now be described with reference to FIGS. 10 and 11. Using the detection result of the knee sensor 22a1c (the distance between the moving body 20 and the knees of the care receiver M1), the moving body 20 moves to a predetermined position at which its distance from the seated care receiver M1 equals a predetermined distance. This predetermined position is the optimum position for raising the care receiver M1 to a standing posture (the optimum standing position).
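One control step of this knee-sensor-guided approach could look like the following. The control law, target distance, and gains are illustrative assumptions; the patent only states that the body stops at the optimum standing position.

```python
def approach_step(knee_distance, optimum=0.45, tolerance=0.02, max_speed=0.3):
    """Return a forward speed command (m/s) from the knee sensor reading,
    slowing proportionally near the target and stopping within tolerance
    of the optimum standing distance (all numbers illustrative, metres)."""
    error = knee_distance - optimum
    if abs(error) <= tolerance:
        return 0.0
    # proportional slow-down near the target, capped at max_speed
    return max(-max_speed, min(max_speed, 0.5 * error))
```

Far from the care receiver the command saturates at the speed cap; within tolerance of the optimum distance the body stops and the standing sequence can begin.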
 The moving body 20 then instructs the care receiver M1 to "please grip the handle". For example, it projects display content prompting the care receiver to grip the handle 24 from the rear onto the wall surface (the wall in front of the care receiver M1) or the ceiling surface. A voice prompting the care receiver to grip the handle 24 may also be output from the guide device 29.
 When the care receiver M1 grips the handle 24 with both hands, the contact sensors 24a and 24b detect the grip, and the moving body 20 performs the standing operation to raise the care receiver M1. At this time, the moving body 20 informs the care receiver M1 (or nearby people) that the standing operation is in progress. For example, it projects display content indicating the standing operation from the rear onto the wall surface (the wall in front of the care receiver M1) or the ceiling surface. A voice indicating that the standing operation is in progress may also be output from the guide device 29. In addition, the upward arrow of the guide device 31 may be lit.
 When the standing operation starts, the moving body 20 holds the upper body of the seated care receiver M1 with the holding unit 23 (see FIG. 10), and then, while holding the upper body, brings the care receiver M1 to a standing state (see FIG. 11). Throughout the transition from the seated state to the standing state, as described above, the moving body 20 informs the care receiver M1 (or nearby people) that the standing operation is in progress.
 The moving body 20 then assists the care receiver M1 in the standing state: the care receiver M1 walks while his or her armpits are supported by the holding unit 23. When the moving body 20 assisting the walking care receiver M1 moves from the first private room 13a to the training room 12, it moves along a prestored route or by reading the guide marks 14a with the imaging device 28, in the same manner as when moving alone.
 In this case, the moving body 20 turns right at the entrance/exit 13a1 of the first private room 13a into the passage 14, turns right at the intersection of the passage 14, turns left at the entrance/exit 12a of the training room 12, and enters the training room 12, advancing with its back surface facing the traveling direction. As when advancing front-first, it projects an arrow or trace figure indicating the traveling direction, as shown in FIG. 8, from the rear irradiation device 25.
 When the seating operation for seating the care receiver M1 starts, the moving body 20 lowers the standing care receiver M1 (see FIG. 11) to the seated state while holding his or her upper body with the holding unit 23 (see FIG. 10). Throughout the transition from the standing state to the seated state, the moving body 20 informs the care receiver M1 (or nearby people) that the seating operation is in progress. For example, it projects display content indicating the seating operation from the rear onto the wall surface (the wall in front of the care receiver M1) or the ceiling surface. A voice indicating that the seating operation is in progress may also be output from the guide device 29. In addition, the downward arrow of the guide device 31 may be lit.
 When the seating operation finishes, the moving body 20 instructs the care receiver M1 to "please release the handle". For example, it projects display content prompting the care receiver to release the handle 24 from the rear onto the wall surface (the wall in front of the care receiver M1) or the ceiling surface. A voice prompting the care receiver to release the handle 24 may also be output from the guide device 29.
 When the care receiver M1 releases the handle 24, the contact sensors 24a and 24b detect the release, and the moving body 20 moves away from the care receiver M1.
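The grip-to-release assist sequence above can be sketched as a small state machine. The state and event names are assumptions chosen for illustration; the patent describes the sequence only in prose.

```python
def assist_transition(state, event):
    """Advance the stand/walk/sit assist sequence by one event; unknown
    (state, event) pairs leave the state unchanged."""
    transitions = {
        ("waiting_grip", "both_hands_gripped"): "standing_up",
        ("standing_up", "stand_complete"): "assisting_walk",
        ("assisting_walk", "seat_requested"): "seating",
        ("seating", "seat_complete"): "waiting_release",
        ("waiting_release", "hands_released"): "departing",
    }
    return transitions.get((state, event), state)
```

For instance, only the grip of both hands (both contact sensors active) moves the machine from waiting for a grip to the standing operation, mirroring the sensor-gated behavior described above.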
 According to the present embodiment, the display content deriving unit 26c derives display content indicating state information of the moving body 20 (for example, its traveling direction, or whether it is normal or abnormal), and the irradiation unit 26d causes the irradiation device 25 to project that display content onto the floor, wall, or ceiling surfaces around the moving body 20. People around the moving body 20 can therefore directly see the projected display content indicating the state of the moving body 20 and can reliably recognize that state even from a distance, without approaching the moving body 20. In other words, the moving body 20 can reliably convey its state to surrounding people even when they are at a distance.
 The steering angle information acquisition unit 26a acquires steering angle information on the steering angle of the moving body 20, and the progress information deriving unit 26b derives, as state information, progress information on the progress of the moving body 20 from the acquired steering angle information. The display content deriving unit 26c then derives, from that progress information, the display content corresponding to it. People around the moving body 20 can thus directly see the projected display content indicating the progress information of the moving body 20 and grasp it intuitively and accurately.
 The display content deriving unit 26c also derives, from the steering angle information acquired by the steering angle information acquisition unit 26a, the display content corresponding to the scheduled passage area through which the moving body 20 is expected to pass (see FIG. 6). People around the moving body 20 can thus intuitively and accurately grasp its scheduled passage area (see FIG. 8), so contact between them and the moving body 20 can be avoided as much as possible.
 Moreover, the display content corresponding to the scheduled passage area is composed of display contents in a plurality of stages corresponding to the magnitude of the steering angle acquired by the steering angle information acquisition unit, allowing people around the moving body 20 to grasp its scheduled passage area even more accurately.
 The moving body 20 is an assistance moving body for assisting the care receiver M1, who needs assistance. Even when the moving body 20 is such an assistance moving body, people around it can intuitively and accurately grasp its progress information.
 Furthermore, when the assistance moving body 20 is moving toward the place where the care receiver M1 is, the guide device 29 informs at least one of the care receiver M1 and the people around the assistance moving body 20 that the assistance moving body 20 is moving toward that place. In that case, in addition to the state information of the assistance moving body projected by the irradiation device 25, the guide device 29 announces that the assistance moving body 20 is moving, so the movement can be communicated even more reliably.
 The moving body according to the present invention is not limited to an assistance moving body; it may be any moving body that is permitted to move within a living area where people live (or a pedestrian area where people walk) and that moves within that area by being driven by a drive source, including, for example, automatic guided vehicles in factories, electric wheelchairs, and so-called senior cars (single-seat electric vehicles). The moving body may operate either indoors or outdoors. When a person rides the moving body and operates (steers) a handle, the steering angle information acquisition unit may acquire the steering angle information from a steering angle sensor that detects the steering angle of the handle operated by the person.
DESCRIPTION OF REFERENCE NUMERALS: 10 ... care center; 11 ... station; 12 ... training room; 13a to 13d ... first to fourth private rooms; 14 ... passage; 20 ... moving body; 21 ... base; 21g, 21h ... left and right drive wheel motors (drive source); 22 ... robot arm unit; 23 ... holding unit; 24 ... handle; 25 ... irradiation device; 26 ... control device; 26a ... information acquisition unit (steering angle information acquisition unit); 26b ... information deriving unit (progress information deriving unit); 26c ... display content deriving unit; 26d ... irradiation unit; 27 ... storage device; 28 ... imaging device; 29, 31 ... guide devices; M1 ... care receiver; M2 ... caregiver.

Claims (6)

  1.  A moving body that is permitted to move within a living area where people live and that moves within the living area by being driven by a drive source, the moving body comprising:
     an irradiation device that projects display content onto a floor surface, a wall surface, or a ceiling surface around the moving body;
     a display content deriving unit that derives the display content indicating state information representing a state of the moving body; and
     an irradiation unit that causes the irradiation device to project the display content derived by the display content deriving unit.
  2.  The moving body according to claim 1, further comprising:
     a steering angle information acquisition unit that acquires steering angle information on a steering angle of the moving body; and
     a progress information deriving unit that derives, as the state information, progress information on the progress of the moving body from the steering angle information acquired by the steering angle information acquisition unit,
     wherein the display content deriving unit derives, from the progress information derived by the progress information deriving unit, the display content corresponding to that progress information.
  3.  The moving body according to claim 2, wherein the display content deriving unit derives, from the steering angle information acquired by the steering angle information acquisition unit, the display content corresponding to a scheduled passage area through which the moving body is expected to pass.
  4.  The moving body according to claim 3, wherein the display content corresponding to the scheduled passage area is composed of display contents in a plurality of stages corresponding to the magnitude of the steering angle acquired by the steering angle information acquisition unit.
  5.  The moving body according to any one of claims 1 to 4, wherein the moving body is an assistance moving body for assisting a care receiver who needs assistance.
  6.  The moving body according to claim 5, further comprising a guide device that, when the assistance moving body is moving toward a place where the care receiver is, informs at least one of the care receiver and people around the assistance moving body that the assistance moving body is moving toward that place.