WO2024090200A1 - Moving Body, Method for Controlling a Moving Body, and Program - Google Patents
Moving Body, Method for Controlling a Moving Body, and Program
- Publication number
- WO2024090200A1 (PCT/JP2023/036744; JP2023036744W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- moving body
- light
- area
- distance
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/243—Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/26—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
- B60Q1/50—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
- B60Q1/503—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking using luminous text or symbol displays in or on the vehicle, e.g. static text
- B60Q1/5035—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking using luminous text or symbol displays in or on the vehicle, e.g. static text electronic displays
- B60Q1/5037—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking using luminous text or symbol displays in or on the vehicle, e.g. static text electronic displays the display content changing automatically, e.g. depending on traffic situation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/26—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
- B60Q1/50—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
- B60Q1/507—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking specific to autonomous vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/617—Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
- G05D1/622—Obstacle avoidance
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/656—Interaction with payloads or external entities
- G05D1/686—Maintaining a relative position with respect to moving targets, e.g. following animals or humans
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2101/00—Details of software or hardware architectures used for the control of position
- G05D2101/20—Details of software or hardware architectures used for the control of position using external object recognition
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2105/00—Specific applications of the controlled vehicles
- G05D2105/20—Specific applications of the controlled vehicles for transportation
- G05D2105/28—Specific applications of the controlled vehicles for transportation of freight
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2107/00—Specific environments of the controlled vehicles
- G05D2107/60—Open buildings, e.g. offices, hospitals, shopping areas or universities
- G05D2107/67—Shopping areas
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2109/00—Types of controlled vehicles
- G05D2109/10—Land vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2111/00—Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
- G05D2111/10—Optical signals
Definitions
- the present invention relates to a moving object, a method for controlling a moving object, and a program.
- a known autonomous mobile device includes a detection means for detecting the distance to surrounding objects and their shapes, a control means for controlling a moving means so that the device moves following the movements of a person recognized as the person to be followed, and a control means for controlling a notification means to notify the person to be followed of a specific movement, using a notification method preset for that specific movement (see, for example, Patent Document 1).
- the present invention has been made in consideration of the above circumstances, and one of its objectives is to provide a moving object, a method for controlling a moving object, and a program that enable a user to easily recognize, while the user is moving, whether the moving object is behaving in accordance with the user's intentions.
- a moving object, a control method for a moving object, and a program according to the present invention employ the following configuration.
- a moving body includes an illumination device that emits light from a predetermined area of an emission-capable area from which light can be emitted; a recognition unit that recognizes objects and a user in the vicinity of the moving body based on an image of the vicinity of the moving body; a movement control unit that controls the moving body to move together with the user while avoiding the objects in the vicinity; and an emission control unit that controls the illumination device to control a radiation pattern in which the illumination device emits light, wherein the recognition unit recognizes the direction in which the user is located relative to the moving body, and the emission control unit controls the radiation pattern based on the direction that changes in accordance with the movement of one or both of the moving body and the user.
- the emission area of the lighting device is a circular or approximately circular area oriented in a radial direction relative to the vertical axis of the moving body and within a predetermined angular range of the radiation direction, and the emission control unit controls the lighting device so that light is emitted within a predetermined angular range of the radiation direction, centered on a position in the emission area directly facing the direction in which the user is present.
- the emission control unit controls the lighting device so that the strongest light is emitted from or near a position directly facing the direction in which the user is present in the emission-enabled area, and weaker light is emitted as the light moves away from or near the directly facing position in the circumferential direction.
- the emission area of the lighting device is a circular or approximately circular area oriented in a radial direction relative to the vertical axis of the moving body within a predetermined angular range of the radiation direction, and the emission control unit changes the radiation pattern for emitting the light in the emission area depending on the distance from the moving body to the user.
- the emission control unit emits light from a region of a first width in the emission possible region when the distance from the moving body to the user is a first distance, and emits light from a region of a second width wider than the first width when the distance from the moving body to the user is a second distance shorter than the first distance.
- the emission area of the lighting device is a circular or approximately circular area oriented in a radial direction relative to the vertical axis of the moving body and within a predetermined angular range of the radial direction, and the emission control unit changes the color of the light emitted in the emission area depending on the distance from the moving body to the user.
- when the distance from the moving body to the user is a third distance, the emission control unit emits light of a warm color or a color close to a warm color in the emission-enabled area, more so than when the distance from the moving body to the user is a fourth distance that is longer than the third distance.
- the user is a user being tracked by the moving object.
- the emission area of the lighting device is a circular or approximately circular area oriented in a radial direction relative to the vertical axis of the moving body and within a predetermined angular range of the radiation direction.
- the emission control unit causes light to be emitted from an area of a first width centered on a position in the emission area directly facing the direction in which the user is present when the distance from the moving body to the user is a first distance, and causes light to be emitted from an area of a second width wider than the first width centered on that position when the distance from the moving body to the user is a second distance shorter than the first distance.
- a method for controlling a moving body includes a computer that recognizes objects and a user in the vicinity of the moving body based on an image of the periphery of the moving body, controls the moving body to move together with the user while avoiding the objects in the vicinity, controls a lighting device that emits light from a predetermined area of an emission-capable area from which light can be emitted, and controls a radiation pattern in which the lighting device emits light, recognizes the direction in which the user is located relative to the moving body, and controls the radiation pattern based on the direction that changes in accordance with the movement of one or both of the moving body and the user.
- a program causes a computer to execute the following processes: recognize objects and a user around a moving body based on an image of the periphery of the moving body; control the moving body to move together with the user while avoiding the objects in the periphery; control a lighting device that emits light from a predetermined area of an emission-capable area from which light can be emitted, thereby controlling a radiation pattern in which the lighting device emits light; recognize the direction in which the user is located relative to the moving body; and control the radiation pattern based on the direction that changes in accordance with the movement of one or both of the moving body and the user.
- the moving object controls the radiation pattern based on the direction that changes in response to the movement of either or both of the moving object and the user, thereby making it possible for the user to easily recognize whether the moving object is behaving in accordance with the user's intention while the user is moving.
- the radiation pattern changes depending on the distance between the moving object and the user, so that the user can easily grasp the position of the moving object.
- the color of the light changes depending on the distance between the moving object and the user, so that the user can easily recognize the behavior and position of the moving object.
- FIG. 1 is a diagram showing an example of the configuration of a mobile body system 1 including a mobile body 100.
- FIG. 2 is a diagram for explaining a usage mode of the moving body 100.
- FIG. 3 is a perspective view showing the moving body 100.
- FIG. 4 is a diagram illustrating an example of a functional configuration of the moving body 100.
- FIG. 5 is a diagram showing an example of the contents of control information 232.
- FIG. 6 is a diagram showing an example of a radiation pattern when the distance from the moving body 100 to the user is a first distance.
- FIG. 7 is a diagram showing an example of a radiation pattern when the distance from the moving body 100 to the user is a second distance.
- FIG. 8 is a diagram for explaining the relationship between the user's position and the center C of the area from which light is emitted in the emission possible area of the lighting device 160.
- FIG. 9 is a diagram showing a scene in which the moving body 100 is following a user.
- FIG. 10 is a flowchart of a process executed by the control device 200 of the moving body 100.
- FIG. 11 is a diagram showing an example of control information 232 in a modified example.
- FIG. 12 is a diagram for explaining an example of a divided region.
- FIG. 13 is a diagram showing another example of an emission possible region.
- FIG. 14 is a diagram for explaining information output by an information output unit 116 according to the second embodiment.
- FIG. 15 is a diagram showing the relationship between the operation of the moving body 100 and information output to the information output unit 116.
- FIG. 16 is a diagram showing a moving body X of a comparative example.
- FIG. 17 is a diagram showing an example of the information output unit 116 outputting a charging rate.
- First Embodiment: FIG. 1 is a diagram showing an example of the configuration of a mobile body system 1 including a mobile body 100.
- the mobile body system 1 includes, for example, one or more terminal devices 2, a management device 10, and one or more mobile bodies 100. These communicate with each other, for example, via a network NW.
- the network NW is any network, such as a LAN, a WAN, or an Internet line.
- the terminal device 2 is, for example, a computer device such as a smartphone, a tablet terminal, etc.
- the terminal device 2 requests the provision of authority to use the mobile object 100 from the management device 10 based on, for example, an operation by a user, and obtains information indicating that the use has been permitted.
- in response to a request from the terminal device 2, the management device 10 grants the user of the terminal device 2 the authority to use the mobile object 100 and manages reservations for the use of the mobile object 100.
- the management device 10 generates and manages, for example, schedule information that associates preregistered user identification information with the date and time of reservations for the use of the mobile object 100.
- FIG. 2 is a diagram for explaining the manner of use of the mobile body 100.
- the mobile body 100 is placed, for example, at a predetermined position in a facility or a town.
- when the user wants to use the mobile body 100, the user can start using it by operating the HMI (described later) of the mobile body 100, or by operating the terminal device 2.
- for example, when a user goes shopping and ends up with a lot of luggage, the user starts using the mobile body 100 and puts the luggage in a storage unit of the mobile body 100. Then, the mobile body 100 moves together with the user so as to autonomously follow the user.
- the user can continue shopping or head to the next destination with the luggage stored in the mobile body 100.
- for example, the mobile body 100 moves together with the user along sidewalks and crosswalks on roadways.
- the mobile body 100 can move in an area where pedestrians can pass, such as a roadway and a sidewalk.
- for example, the mobile body 100 may be used in indoor or outdoor facilities or on private grounds, such as shopping centers, airports, parks, and theme parks, and is capable of moving through areas where pedestrians can pass.
- the moving body 100 may be capable of moving autonomously in a guidance mode, emergency mode, or other mode in addition to (or instead of) the following mode in which the moving body 100 follows the user as described above.
- the guidance mode is a mode in which the moving body 100 guides the user to a destination specified by the user, and moves autonomously in front of the user at the user's moving speed to lead the user.
- the emergency mode is a mode in which the moving body 100 moves autonomously to seek help from nearby people or facilities in order to help the user if something unusual happens to the user while moving with the user (for example, if the user falls down).
- the moving body 100 may move while maintaining a proper distance from the user in addition to (or instead of) following or guiding as described above.
- in the following, the respective controls are described as being applied to a moving body that moves with the user while avoiding surrounding objects; however, the controls are not limited to this, and each control may also be applied to a moving body that does not move with the user.
- FIG. 3 is a perspective view showing the moving body 100.
- in the following description, the forward direction of the moving body 100 is the positive X direction, the rearward direction of the moving body 100 is the negative X direction, the rightward direction in the width direction of the moving body 100 (with respect to the positive X direction) is the positive Y direction, the leftward direction is the negative Y direction, and the height direction of the moving body 100, which is perpendicular to the X and Y directions, is the positive Z direction.
- the mobile body 100 for example, comprises a base body 110, a door section 112 provided on the base body 110, and wheels (first wheel 120, second wheel 130, and third wheel 140) attached to the base body 110.
- a user can open the door section 112 to put luggage in a storage section provided on the base body 110 or take luggage out of the storage section.
- the first wheel 120 and the second wheel 130 are driving wheels
- the third wheel 140 is an auxiliary wheel (driven wheel).
- An information output unit 116 is provided on the positive X direction side face of the base 110, below the door section of the base 110 (i.e., in the negative Z direction of the door section).
- An information output unit 118 is also provided on the side of the base 110 in the negative X direction, similar to the positive X direction.
- the area where information from the information output unit 116 is output is an example of a "first information output area provided in the front of the moving body so that people around the moving body who are in front of the moving body can be seen”
- the area where information from the information output unit 118 is output is an example of a "second information output area provided in the rear of the moving body so that people around the moving body who are in the rear of the moving body can be seen".
- a cylindrical support 150 extending in the positive Z direction is provided on the positive Z direction surface of the base 110. Near the connection between this support 150 and the base 110, an HMI (Human Machine Interface) 114, which is an operation unit operated by the user, is provided.
- the HMI 114 is, for example, an operation unit such as a touch panel.
- the HMI 114 may be omitted.
- a disk-shaped lighting device 160 is provided at the positive Z end of the support 150.
- a cylindrical support 170 extending in the positive Z direction is provided on the positive Z surface of the lighting device 160.
- a camera 180 that captures images of the surroundings of the moving body 100 is provided at the positive Z end of the support 170.
- the position at which the camera 180 is provided may be any position different from the above.
- the lighting device 160 includes one or more light sources and controls the one or more light sources to emit light from a predetermined area of the emission-capable area.
- the emission-capable area is an area facing the horizontal direction (a direction perpendicular to the vertical Z direction). This emission-capable area is provided in a circular or approximately circular shape. In other words, the emission-capable area is an area that faces the radiation direction with respect to the vertical axis (Z direction) of the moving body 100 and is provided in a circular or approximately circular shape within a predetermined angular range of the radiation direction.
- the predetermined angular range may be a range of 360 degrees or an angular range less than 360 degrees.
- the predetermined angular range may be, for example, an angular range of 180 degrees or 270 degrees.
- Camera 180 is, for example, a camera capable of capturing images of the periphery of moving body 100 at a wide angle (e.g., 360 degrees). Camera 180 may include multiple cameras. Camera 180 may be realized, for example, by combining multiple 120-degree cameras or multiple 60-degree cameras.
- the information output unit 116 and the information output unit 118 are, for example, rectangular or approximately rectangular in shape with each side measuring several centimeters to several tens of centimeters. The shape is not limited to a rectangle and may be other shapes.
- the information output unit 116 and the information output unit 118 are devices capable of displaying information such as a segment display using LEDs (light-emitting diodes), a liquid crystal display, or an organic EL (Electro Luminescence) display.
- the information output unit 116 and the information output unit 118 notify the user or people other than the user around the mobile unit 100 of various information such as the mode set in the mobile unit 100, information indicating the direction in which the mobile unit 100 is traveling, and information indicating the amount of charge stored in the battery 134.
- FIG. 4 is a diagram showing an example of the functional configuration of the moving body 100.
- the moving body 100 further includes a first motor 122, a second motor 132, a battery 134, a brake device 136, a steering device 138, a communication unit 190, and a control device 200.
- the first motor 122 and the second motor 132 operate on power supplied from the battery 134.
- the first motor 122 drives the first wheel 120
- the second motor 132 drives the second wheel 130.
- the first motor 122 may be an in-wheel motor provided in the wheel of the first wheel 120
- the second motor 132 may be an in-wheel motor provided in the wheel of the second wheel 130.
- the brake device 136 outputs a brake torque to each wheel based on instructions from the control device 200.
- the steering device 138 includes an electric motor.
- the electric motor for example, applies a force to a rack and pinion mechanism based on instructions from the control device 200 to change the direction of the first wheel 120 or the second wheel 130, thereby changing the course of the moving body 100.
- the communication unit 190 is a communication interface for communicating with the terminal device 2 or the management device 10.
- the control device 200 includes, for example, an information processing unit 202, a recognition unit 204, a trajectory generating unit 206, a traveling control unit 208, a lighting control unit 210, an output control unit (notification control unit) 220, and a storage unit 230.
- the information processing unit 202, the recognition unit 204, the trajectory generating unit 206, the traveling control unit 208, the lighting control unit 210, and the output control unit 220 are realized by, for example, a hardware processor such as a CPU (Central Processing Unit) executing a program (software).
- Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or may be realized by cooperation between software and hardware.
- the program may be stored in advance in a storage device (a storage device having a non-transient storage medium) such as a hard disk drive (HDD) or a flash memory, or may be stored in a removable storage medium (a non-transient storage medium) such as a DVD or a CD-ROM, and may be installed by mounting the storage medium in a drive device.
- the storage unit 230 is realized by a storage device such as a HDD, a flash memory, or a random access memory (RAM).
- the storage unit 230 stores control information 232, which will be described later.
- Either or both of the trajectory generation unit 206 and the traveling control unit 208 are examples of a "movement control unit”.
- the information processing unit 202 manages information acquired from the terminal device 2 or the management device 10.
- the recognition unit 204 recognizes the position (distance from the moving body 100 and direction relative to the moving body 100) of objects around the moving body 100, as well as the state of the object, such as speed and acceleration, based on, for example, an image captured by the camera 180.
- the object includes traffic participants and obstacles in facilities and on roads.
- the recognition unit 204 recognizes and tracks the user of the moving body 100. For example, the recognition unit 204 tracks the user based on an image registered by the user when starting to use the moving body 100 (e.g., a face image of the user), or based on a face image of the user (or a feature amount obtained from the face image) provided by the terminal device 2 or the management device 10.
- the recognition unit 204 recognizes gestures made by the user.
- the moving body 100 may be provided with a detection unit different from the camera, such as a radar device or LIDAR.
- in this case, the recognition unit 204 recognizes the situation around the moving body 100 using the detection results of the radar device or LIDAR instead of (or in addition to) the image.
- the trajectory generating unit 206 generates a trajectory along which the moving body 100 should travel in the future based on, for example, the user's gesture, a destination set by the user, surrounding objects, the user's position, and the like.
- the trajectory generating unit 206 generates a trajectory that allows the moving body 100 to move smoothly to the target point.
- the trajectory generating unit 206 generates a trajectory according to the behavior of the moving body 100 based on, for example, a correspondence between a predetermined gesture and an action, or generates a trajectory for heading toward the destination while avoiding surrounding objects.
- the trajectory generating unit 206 also generates, for example, a trajectory for following a user being tracked.
- the trajectory generating unit 206 generates, for example, a trajectory according to an action based on a preset mode.
- the trajectory generating unit 206 generates multiple trajectories according to the actions of the moving body 100, calculates the risk for each trajectory, and if the total value of the calculated risks or the risk of each trajectory point meets a preset criterion (for example, if the total value is equal to or less than a threshold Th1 and the risk of each trajectory point is equal to or less than a threshold Th2), the trajectory that meets the criterion is adopted as the trajectory along which the moving body 100 moves.
- the risk tends to be higher the closer the obstacle is to the trajectory (to a trajectory point on the trajectory), and lower the farther the obstacle is from the trajectory.
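As a minimal sketch of the selection criterion described above (the inverse-distance risk form, the threshold values Th1 and Th2, and all function names are illustrative assumptions rather than part of the disclosure):

```python
import math

def point_risk(point, obstacles):
    """Risk of one trajectory point: higher when the nearest obstacle is closer.
    The inverse-distance form is an assumption; the text only states the tendency."""
    nearest = min(math.dist(point, obs) for obs in obstacles)
    return 1.0 / max(nearest, 1e-3)

def select_trajectory(candidates, obstacles, th1=5.0, th2=1.0):
    """Adopt the first candidate whose total risk is at most Th1 and whose
    per-point risks are all at most Th2, mirroring the criterion above."""
    for trajectory in candidates:
        risks = [point_risk(p, obstacles) for p in trajectory]
        if sum(risks) <= th1 and max(risks) <= th2:
            return trajectory
    return None  # no candidate meets the preset criterion
```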
- the traveling control unit 208 controls the motors (the first motor 122 and the second motor 132), the brake device 136, and the steering device 138 so that the moving body 100 travels along a trajectory that satisfies the preset criteria.
- the lighting control unit 210 controls the lighting device 160 to control the radiation pattern in which the lighting device 160 radiates light.
- the lighting control unit 210 controls the radiation pattern based on, for example, the distance between the moving body 100 and the user, which changes according to the movement of either or both of the moving body 100 and the user, and the direction of the user relative to the moving body 100.
- the radiation pattern is, for example, the area in which light is radiated in the radiation-capable area of the lighting device 160, the light intensity for each area, the color of the light, etc.
- the lighting control unit 210 controls the lighting device 160, for example, by referring to the control information 232.
- FIG. 5 is a diagram showing an example of the contents of the control information 232.
- the control information 232 is information in which the distance from the moving body 100 to the user being tracked and the radiation pattern are associated with each other.
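A hedged illustration of how the control information 232 could be organized as a simple lookup from distance bands to radiation-pattern parameters; the field names, distance values, and units below are assumptions:

```python
# Hypothetical layout of control information 232: each entry maps a distance
# band (meters, assumed units) to radiation-pattern parameters.
CONTROL_INFO_232 = [
    {"max_distance": 1.5, "width_deg": 120.0, "color": "warm"},    # near user -> wide arc, warm color
    {"max_distance": 3.0, "width_deg": 60.0,  "color": "neutral"},
    {"max_distance": float("inf"), "width_deg": 30.0, "color": "neutral"},  # far -> narrow arc
]

def lookup_pattern(distance_m):
    """Return the radiation-pattern parameters for the given user distance."""
    for entry in CONTROL_INFO_232:
        if distance_m <= entry["max_distance"]:
            return entry
    return CONTROL_INFO_232[-1]
```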
- the output control unit 220 controls the information output unit 116 and the information output unit 118 to display desired information on the information output unit 116 and the information output unit 118. Details of the processing of the output control unit 220 will be described in the second embodiment.
- FIG. 6 is a diagram showing an example of a radiation pattern when the distance from the moving body 100 to the user is a first distance.
- the emission area AR of the lighting device 160 is assumed to be oriented horizontally and arranged in a 360-degree circular shape.
- the lighting control unit 210 controls the lighting device 160 so that light is emitted in an area of a first width W1 in the emission area AR.
- the lighting control unit 210 controls the lighting device 160 so that the center C or the vicinity of the center C in the width direction of the first width W1 is the brightest, and the brightness decreases the further away from the center in the circumferential direction (width direction).
- the center C is, for example, the position in the emission possible area AR that directly faces the direction in which the tracked user is located, in other words, the position in the emission possible area AR that is closest to the user.
- the lighting control unit 210 sets the center C based on the direction in which the tracked user is located, and further sets the width (angular range) of the area in the emission possible area AR from which light is emitted based on the distance between the tracked user and the moving body 100.
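As a non-authoritative sketch of this control, assuming the lighting device 160 is approximated by a ring of discrete light sources, the center C and the lit arc could be turned into per-source intensities as follows (NUM_LEDS, the linear falloff, and the function name are assumptions):

```python
NUM_LEDS = 60  # assumed number of light sources around the ring

def led_intensities(user_bearing_deg, width_deg, peak=1.0):
    """Brightness per LED: maximum at the LED facing the user (center C) and
    decreasing toward the edges of the lit arc; zero outside the arc."""
    intensities = []
    for i in range(NUM_LEDS):
        led_angle = i * 360.0 / NUM_LEDS
        # smallest angular difference between this LED and the user's bearing
        diff = abs((led_angle - user_bearing_deg + 180.0) % 360.0 - 180.0)
        half_width = width_deg / 2.0
        if diff <= half_width:
            intensities.append(peak * (1.0 - diff / half_width))  # linear falloff
        else:
            intensities.append(0.0)
    return intensities
```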
- FIG. 7 is a diagram showing an example of a radiation pattern when the distance from the moving body 100 to the user is a second distance.
- the second distance is shorter than the first distance.
- the lighting control unit 210 controls the lighting device 160 so that light is emitted in an area of a second width W2 in the emission possible area AR.
- the second width W2 is wider than the first width W1.
- the center C is determined by the direction in which the user is present, as described above.
- the lighting control unit 210 may change the radiation pattern in a manner different from that described above depending on the distance between the moving body 100 and the user.
- the intensity of the emitted light may be changed depending on the width of the area in the emission possible area AR where the light is emitted. For example, when the width is narrow, stronger light may be emitted than when the width is wide. For example, the intensity of light at or near the center C when the width is narrow may be stronger than the intensity of light at or near the center C when the width is wide. This allows a user who is far away to more easily recognize that light is being emitted.
- FIG. 8 is a diagram for explaining the relationship between the user's position and the center C of the area from which light is emitted in the emission-capable area of the lighting device 160. If the user is located in the positive X direction and the positive Y direction of the moving body 100 at time T, the center C is located directly opposite the user. When the user views the lighting device 160 from this position, the area from which the strongest light is emitted is located directly in front or near the front.
- when the user moves, the center C moves in the negative Y direction in response to the user's movement, and the center C becomes the position directly facing the user after the movement.
- when the user views the lighting device 160 from the new position, the area from which the strongest light is emitted is again located directly in front of or near the front of the user.
- here, the center C is described as changing when the user moves, but the center C also changes in the same way when the moving body 100 moves without the user moving, or when both the user and the moving body 100 move.
- the lighting control unit 210 controls the radiation pattern based on the distance and direction that change in response to the movement of either or both of the moving body 100 and the user, allowing the user to easily recognize whether the moving body 100 is behaving in accordance with the user's intentions while the user is moving.
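A small companion sketch, under the same assumptions, of how the direction used for the center C might be recomputed each control cycle from the positions of the moving body 100 and the user (the coordinate convention and the function name are hypothetical):

```python
import math

def user_bearing_deg(robot_xy, robot_heading_deg, user_xy):
    """Direction of the user relative to the moving body's heading (0 deg = straight
    ahead), recomputed every cycle so that the lit arc keeps facing the user as
    either the moving body or the user moves."""
    dx = user_xy[0] - robot_xy[0]
    dy = user_xy[1] - robot_xy[1]
    world_angle = math.degrees(math.atan2(dy, dx))
    return (world_angle - robot_heading_deg) % 360.0
```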
- FIG. 9 is a diagram showing a scene in which the moving body 100 is following a user.
- in an area with many traffic participants, when the moving body 100 is following the user, a traffic participant may be located between the moving body 100 and the user, and the moving body 100 may be unable to follow the user.
- as shown in FIG. 9, when the path of the moving body 100 is blocked by traffic participants U1 and U2, the moving body 100 may move along a trajectory that goes around the traffic participants U1 and U2 to follow the user.
- in such a scene, the moving body 100 may stop or move in a direction not intended by the user.
- when the moving body 100 behaves as described above, the user may feel that the moving body 100 has lost sight of the user and stopped following.
- as a result of going around, the mobile body 100 may come into contact with an obstacle. In such a case, the mobile body 100 may wait until the obstacle is no longer present, or may move away from the obstacle, thereby also moving away from the user. In such a case, too, the user may feel that the mobile body 100 has lost sight of the user and stopped following.
- the lighting control unit 210 changes the radiation pattern of the lighting device 160 based on the distance between the moving body 100 and the user and the direction of the user relative to the moving body 100 as described above, thereby allowing the user to understand that the moving body 100 recognizes the user.
- FIG. 10 is a flowchart of the process executed by the control device 200 of the moving body 100.
- the recognition unit 204 of the control device 200 tracks the user (step S100).
- the lighting control unit 210 of the control device 200 determines whether the recognition unit 204 continues tracking the user without losing sight of the user (step S102).
- if tracking is continuing, the lighting control unit 210 identifies the distance from the moving body 100 to the user and the direction of the user relative to the moving body 100 (step S104). Next, the lighting control unit 210 sets the center of the emission area (the area in the emission-enabled area from which light is emitted) according to the identified direction, and sets the width of the emission area according to the distance (step S106). Next, the lighting control unit 210 controls the lighting device 160 so that light is emitted in a radiation pattern based on the set center and width (step S108).
- if tracking is not continuing, the lighting control unit 210 executes a predetermined control (step S110). For example, the lighting control unit 210 causes light to be emitted from all areas of the emission possible area, or causes a predetermined area of the emission possible area to emit light of a color different from the color emitted while tracking is continuing. In this way, when tracking is not continuing, the lighting control unit 210 controls the lighting device 160 to notify the user that tracking is not continuing. Note that in this process, the emission pattern may be determined using either the distance or the direction.
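The flow of FIG. 10 might be summarized as one control cycle like the following sketch; the recognizer, lighting, and robot objects, the 2.0 m boundary, and the width values are assumptions used only for illustration:

```python
def lighting_control_cycle(recognizer, lighting, robot):
    """One pass through the flow of FIG. 10 (S100-S110); recognizer, lighting,
    and robot are hypothetical wrappers around the units described above."""
    user = recognizer.track_user()                           # S100: track the user
    if user is not None:                                     # S102: still tracking?
        distance, direction = robot.relative_to(user)        # S104: distance and direction
        width = 120.0 if distance < 2.0 else 60.0            # S106: wider arc when closer (assumed values)
        lighting.emit(center_deg=direction, width_deg=width)  # S108: emit with the set center and width
    else:
        lighting.emit_all(color="notify")                    # S110: predetermined control when tracking is lost
```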
- in the example described above, the radiation pattern is controlled based on the direction of the user and the distance between the moving body 100 and the user. Instead (or in addition), the radiation pattern may be controlled based on which of the divided areas around the moving body 100 the user is present in.
- FIG. 11 is a diagram showing an example of control information 232 in a modified example.
- the control information 232 is, for example, information in which regions and radiation patterns are associated with each other.
- FIG. 12 is a diagram for explaining an example of a divided area.
- a radial area centered on the moving body 100 is divided into an area within a predetermined distance from the moving body 100 and an area beyond the predetermined distance from the moving body 100, and each area is further divided into eight equal parts.
- the area is divided by shifting the dividing line by 45 degrees from a reference direction set for the moving body 100.
- the area within a predetermined distance from the moving body 100 is divided into areas AR1A-AR1H, and the area beyond the predetermined distance from the moving body 100 is divided into areas AR2A-AR2H.
- for example, when the user is present in the area AR1A, the second width W2 is used, and the center C is located directly opposite the center of the arc that bounds the area AR1A.
- when the user is present in the area AR2A, the first width W1 is used, and the center C is likewise located directly opposite the center of the arc that bounds the area AR1A.
- the control device 200 can easily cause the lighting device 160 to emit light in a radiation pattern that is suitable for the user based on the area in which the user is present.
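A hedged sketch of this modified example, assuming the 45-degree sectors of FIG. 12 and an arbitrary 2.0 m boundary between the near ring (AR1A-AR1H) and the far ring (AR2A-AR2H); the sector labeling order and the width values are assumptions:

```python
def area_for_user(distance_m, bearing_deg, boundary_m=2.0):
    """Name of the divided area that contains the user; the 2.0 m boundary and
    the A-H labelling order are assumptions, not taken from the disclosure."""
    ring = "AR1" if distance_m <= boundary_m else "AR2"
    sector = "ABCDEFGH"[int(bearing_deg % 360.0 // 45)]
    return ring + sector

# Radiation pattern chosen directly from the ring: wider arc (like W2) when the
# user is near, narrower arc (like W1) when the user is far; angles illustrative.
AREA_PATTERNS = {"AR1": {"width_deg": 120.0}, "AR2": {"width_deg": 60.0}}

def pattern_for_area(area_name):
    return AREA_PATTERNS[area_name[:3]]
```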
- in Modification 2, the color of the light is changed according to the distance between the moving body 100 and the user.
- the lighting control unit 210 changes the color of the light emitted in the emission possible area according to the distance from the moving body 100 to the user.
- when the distance from the moving body 100 to the user is a third distance, the lighting control unit 210 may emit light of a warm color or a color close to a warm color (e.g., red, orange, or yellow light) in the emission possible area, more so than when the distance from the moving body 100 to the user is a fourth distance longer than the third distance. This process may be performed together with changing the width of the area from which light is emitted as described above.
- control device 200 changes the color of the light depending on the distance between the moving body 100 and the user, allowing the user to more easily recognize whether the moving body 100 is behaving in accordance with the user's intentions while moving.
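A minimal sketch of this color rule; the RGB values and the threshold standing in for the third distance are illustrative assumptions:

```python
def light_color_for_distance(distance_m, third_distance_m=1.0):
    """At or below the shorter (third) distance, shift toward a warm color;
    farther away, keep a neutral color. Values here are assumed for illustration."""
    warm = (255, 120, 0)       # orange-ish, a warm color
    neutral = (255, 255, 255)
    return warm if distance_m <= third_distance_m else neutral
```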
- in the above example, the emission possible area of the lighting device 160 has been described as being a single area without any partitions, but as shown in FIG. 13, the emission possible area may be composed of a plurality of areas AR1 to ARn. Also, the emission possible area may be polygonal rather than circular, as shown in FIG. 13. In this case, for example, the lighting control unit 210 may emit light from one of the plurality of emission possible areas selected according to the direction of the user relative to the mobile body 100, or may select two or more emission possible areas according to the direction of the user relative to the mobile body 100 and emit light from the selected emission possible areas. Note that in the example of FIG. 13, the plurality of emission possible areas are described as being connected without any gaps, but instead, there may be gaps between the plurality of emission possible areas.
- in the above example, the light is adjusted so that the brightness decreases from the center C in the circumferential direction (in a gradation, so that the brightness decreases step by step).
- the brightness may be adjusted in other ways.
- many small lights may be arranged in a mesh pattern in the emission-enabled area. In this case, many lights may be turned on at or near the center C, and the number of lights turned on may be controlled to decrease in the circumferential direction.
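As a rough sketch of this mesh-type alternative, the number of lights turned on in one vertical column of the mesh could shrink with angular distance from the center C (the column height and the falloff rule are assumptions):

```python
def lit_count_per_column(column_angle_deg, center_deg, width_deg, column_height=8):
    """Number of small lights to turn on in one vertical column of the mesh:
    largest at or near the center C, shrinking toward the edges of the lit arc."""
    diff = abs((column_angle_deg - center_deg + 180.0) % 360.0 - 180.0)
    half = width_deg / 2.0
    if diff > half:
        return 0  # outside the lit arc: column stays off
    return max(1, round(column_height * (1.0 - diff / half)))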
- the color of light emitted in the emission-enabled area may be set for each user.
- for example, the user may operate the operation unit of the terminal device 2 or of the mobile body 100 to set a preferred color to be emitted.
- the color of light emitted in the emission-enabled area becomes the color set by the user.
- the user can easily recognize the mobile body 100 that he or she is using.
- the color of light emitted in the emission possible area may be set for each mode of the moving body 100.
- the modes may be, for example, the above-mentioned following mode, guidance mode, emergency mode, etc.
- the modes may also include a home delivery mode.
- the home delivery mode is a mode in which a package is delivered to a specified location. For example, a light color is determined for each mode, and light of a color corresponding to the mode is emitted in the emission possible area.
- the color of light emitted in the emission possible area may be set for each combination of user and mode. In this case, light of a color corresponding to the combination of user and mode is emitted in the emission possible area. This allows third parties to easily understand in what mode the moving body 100 is moving.
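A hedged sketch of such per-user, per-mode, and combined color settings; all identifiers and color values below are hypothetical:

```python
# Hypothetical color settings; none of these names or values appear in the text.
USER_COLORS = {"user_123": (0, 128, 255)}
MODE_COLORS = {"following": (0, 255, 0), "guidance": (255, 255, 0), "emergency": (255, 0, 0)}
COMBINED_COLORS = {("user_123", "emergency"): (255, 0, 255)}

def emission_color(user_id, mode, default=(255, 255, 255)):
    """Color for the emission possible area: a (user, mode) combination takes
    priority, then a per-mode color, then a per-user color, then a default."""
    return (COMBINED_COLORS.get((user_id, mode))
            or MODE_COLORS.get(mode)
            or USER_COLORS.get(user_id)
            or default)
```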
- the control device 200 recognizes the distance from the moving body 100 to the user and the direction in which the user is located relative to the moving body 100, and controls the light emission pattern based on the distance between the moving body 100 and the user and the direction of the user relative to the moving body 100, which change depending on the movement of either or both of the moving body 100 and the user, thereby making it possible for the user to easily recognize whether the moving body is behaving in accordance with the user's intentions while moving.
- Second Embodiment: The second embodiment will be described below.
- in the first embodiment, the control mode of the lighting device 160 has been described.
- in the second embodiment, the control mode of the information output units 116 and 118 will be described.
- the control of the first embodiment and the control of the second embodiment may be executed in a superimposed manner, or only one of the controls may be executed.
- the contents of the second embodiment will be described below.
- for example, when the moving body 100 is moving straight, the output control unit 220 causes the information output unit 116 to output light of a first color in the emission-enabled area (information output area) of the information output unit 116.
- when the moving body 100 is scheduled to start a turning operation, the output control unit 220 causes the information output unit 116 to output, in the emission-enabled area, light of a second color that does not include information indicating the turning direction and is lower in brightness than the light of the first color, for a predetermined time starting a first predetermined time before the start of the turning operation.
- after that, the output control unit 220 causes the information output unit 116 to output information indicating the turning direction in the emission-enabled area, and continues to output the information indicating the turning direction while the moving body 100 is turning.
- the output control unit 220 may turn off the emission-enabled area for a predetermined time a first predetermined time before the start of the turning operation, and then cause the information output unit 116 to output information indicating the turning direction in the emission-enabled area, and cause the information output unit 116 to output information indicating the turning direction while the moving body 100 is turning.
- the specified time for which the second color light is output is, for example, 0.2 seconds or more.
- the second color is, for example, black or gray
- the first color is a color brighter than black or gray.
- the information indicating the turning direction is, for example, an output of an arrow indicating the turning direction, or an arrow indicating the turning direction flowing in the turning direction, as shown in FIG. 14.
- the information indicating the turning direction may be, in addition to the above, characters or symbols.
- the second color light is described as having a lower brightness than the first color light, but instead, the brightness of the second color light may be different from the brightness of the first color light. For example, the second color light may be brighter than the first color light.
- FIG. 15 is a diagram showing the relationship between the operation of the moving body 100 and the information output to the information output unit 116.
- when the moving body 100 is set to a specified mode and starts operating at time T, the information output unit 116 outputs light of a color corresponding to the set mode. At time T+1, the moving body 100 starts moving and moves straight ahead. In this case as well, light of a color corresponding to the set mode is output.
- time T+2 is a first predetermined time before time T+4, which is the timing when the moving body 100 starts to turn right. From time T+2, light of the second color is output for the predetermined time.
- time T+3 is a timing that is before the moving body 100 starts to turn right and at which the predetermined time has passed since time T+2. At this time, light including information indicating a right turn is output. Output of this information continues until time T+5.
- Time T+5 is the timing when the right turn operation is completed.
- the right turn operation is completed when, for example, the orientations of the first wheel 120 and the second wheel 130 coincide or substantially coincide with the X direction.
- at this time, the information indicating a right turn is erased, and light of the second color is output.
- this output state continues for a predetermined time. Then, at time T+6, when the predetermined time has passed, light of a color according to the set mode is output again.
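The sequence of FIG. 15 around a right turn could be sketched as follows; output_unit.show_color(), output_unit.show_arrow(), and the turn_done() predicate are hypothetical, and only the 0.2-second lower bound for the second-color phase comes from the text above:

```python
import time

def signal_right_turn(output_unit, mode_color, turn_done, second_color="gray", blank_s=0.2):
    """Rough timeline of FIG. 15 for a right turn; the caller invokes this a
    first predetermined time before the turn is scheduled to start."""
    output_unit.show_color(second_color)   # T+2: second color, before the turn
    time.sleep(blank_s)
    output_unit.show_arrow("right")        # T+3: information indicating the right turn
    while not turn_done():                 # T+4..T+5: keep the arrow up during the turning operation
        time.sleep(0.05)
    output_unit.show_color(second_color)   # T+5: arrow erased, second color again
    time.sleep(blank_s)
    output_unit.show_color(mode_color)     # T+6: back to the color for the set mode
```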
- the moving body 100 causes the information output unit 116 to output a color corresponding to the set mode, and when turning, causes the information output unit 116 to temporarily output light of a second color before turning, and then causes the information output unit 116 to output information indicating the turning direction, thereby making it easier for people in the vicinity to recognize the state of the moving body.
- the moving body X of the comparative example shown in FIG. 16 turns on the direction indicator L to indicate the direction of the turn.
- moving bodies X come in various types, shapes, and behaviors, and even if the direction indicator L turns on, people in the vicinity may not assume that the moving body X is about to turn; they may instead assume that it is performing another action or has received some specific instruction. In this case, when the moving body X turns, people in the vicinity may come into contact with it or have their movement hindered by the unanticipated action, so countermeasures are necessary.
- in the moving body 100 of the present embodiment, by contrast, the output mode of the information output by the information output unit 116 is controlled according to the operation of the mobile body 100.
- by changing the color to the second color once before turning, the moving body 100 attracts the attention of people in the vicinity, and then outputs information indicating the turning direction. This allows people in the vicinity to easily anticipate the behavior of the mobile body 100.
- the information output unit 116 has been described as outputting information indicating the turning direction, but in addition to (or instead of) this, information indicating various states of the mobile body 100 may be output. For example, the charge rate of the battery 134, the time during which the mobile body 100 can operate, the distance the mobile body 100 can travel, etc. may be output.
- FIG. 17 is a diagram showing an example of the information output unit 116 outputting the charging rate.
- the information output unit 116 outputs the charging rate of the battery 134. This allows the administrator to easily recognize the need to charge or replace the battery 134.
- the output control unit 220 causes the information output unit 116 (or 118) to output light of a first color in the emission possible area (information output area), and when the moving body 100 is planning to start turning from a straight-moving state, causes the information output unit 116 to output light of a second color that does not include information indicating the turning direction and is lower in brightness than the first color in the emission possible area for a predetermined time a first predetermined time before the start, and then causes the information output unit 116 to output information indicating the turning direction in the emission possible area, thereby making it possible to provide information so that people in the vicinity can more easily recognize the state of the moving body.
- a storage medium for storing computer-readable instructions
- a processor coupled to the storage medium;
- the processor executes the computer-readable instructions to: recognize objects and a user around a moving body based on images; control the moving body so as to move together with the user while avoiding the surrounding objects; control a lighting device that emits light from a predetermined area of an emission possible area capable of emitting light, thereby controlling a radiation pattern in which the lighting device emits light; recognize a direction in which the user is present relative to the moving body; and control the radiation pattern based on the direction that changes in response to movement of one or both of the moving body and the user. A moving body.
- a storage medium for storing computer-readable instructions
- the processor executes the computer-readable instructions to perform: a process of, when a moving body that can move in an area where pedestrians can pass is moving straight, controlling an information output unit having an information output area that is visible to objects around the moving body and that outputs information indicating a turning direction of the moving body, so as to cause the information output unit to output light of a first color in the information output area; and a process of, when the moving body is scheduled to start a turning operation, a first predetermined time before the start, causing the information output unit to output, in the information output area, light of a second color that does not include information indicating the turning direction and has a lightness different from that of the first color for a predetermined time, or turning off the information output area for the predetermined time, and then causing the information output unit to output the information indicating the turning direction in the information output area. A moving body.
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Circuit Arrangement For Electric Light Sources In General (AREA)
Abstract
Description
本発明は、このような事情を考慮してなされたものであり、ユーザが移動中において移動体がユーザの意図に応じた行動を行っているかを容易に認識することができる移動体、移動体の制御方法、およびプログラムを提供することを目的の一つとする。
(1):この発明の一態様に係る移動体は、光を放射可能な放射可能領域のうち所定の領域から光を放射させる照明装置と、移動体の周辺が撮像された画像に基づいて前記移動体の周辺の物体およびユーザを認識する認識部と、前記周辺の物体を避けながら前記ユーザと共に移動するように前記移動体を制御する移動制御部と、前記照明装置を制御して前記照明装置が光を放射させる放射パターンを制御する放射制御部と、を備え、前記認識部は、前記移動体に対するユーザが存在する方向を認識し、前記放射制御部は、前記移動体と前記ユーザとの一方または双方の移動に応じて変化する前記方向に基づいて、前記放射パターンを制御する。
図1は、移動体100を含む移動体システム1の構成の一例を示す図である。移動体システム1は、例えば、一以上の端末装置2と、管理装置10と、一以上の移動体100とを含む。これらは、例えば、ネットワークNWを介して通信を行う。ネットワークNWは、ネットワークNWは、例えば、LAN、WAN、インターネット回線などの任意のネットワークである。
端末装置2は、例えば、スマートフォンや、タブレット端末などのコンピュータ装置である。端末装置2は、例えば、利用者の操作に基づいて、管理装置10から移動体100の利用の権限の提供をリクエストしたり、利用の許可がされたことを示す情報を取得したりする。
管理装置10は、端末装置2のリクエストに応じて、移動体100の利用の権限を端末装置2のユーザに付与したり、移動体100の利用の予約を管理したりする。管理装置10は、例えば、予め登録されたユーザの識別情報と、移動体100の利用予約の日時とを対応付けたスケジュール情報を生成し管理する。
移動体100は、以下のような利用態様でユーザに利用される。図2は、移動体100の利用態様について説明するための図である。移動体100は、例えば、施設や街の所定の位置に配置されている。ユーザは、移動体100を利用したいとき、移動体100のHMI(後述)を操作して利用を開始したり、端末装置2を操作した移動体100の利用を開始したりすることができる。例えば、ユーザが、買い物に出かけて荷物が多くなったとき、移動体100の利用を開始して、移動体100の収納部に荷物を入れる。そして、移動体100は、ユーザに自律的に追従するようにユーザと共に移動する。ユーザは、荷物を移動体100に収納した状態で買い物を続けたり、次の目的地に向かったりすることができる。例えば、移動体100は、ユーザと共に歩道や車道の横断歩道を移動しながら移動する。移動体100は、車道および歩道などの歩行者が通行可能な領域を移動可能である。例えば、移動体100は、ショッピングセンターや空港、公園、テーマパークなど屋内または屋外の施設や私有地内において利用されてもよく、歩行者が通行可能な領域を移動可能である。
制御装置200は、例えば、情報処理部202と、認識部204と、軌道生成部206と、走行制御部208と、照明制御部210と、出力制御部(通知制御部)220と、記憶部230とを備える。情報処理部202と、認識部204と、軌道生成部206と、走行制御部208と、照明制御部210と、出力制御部220とは、例えば、CPU(Central Processing Unit)などのハードウェアプロセッサがプログラム(ソフトウェア)を実行することにより実現される。これらの構成要素のうち一部または全部は、LSI(Large Scale Integration)やASIC(Application Specific Integrated Circuit)、FPGA(Field-Programmable Gate Array)、GPU(Graphics Processing Unit)などのハードウェア(回路部;circuitryを含む)によって実現されてもよいし、ソフトウェアとハードウェアの協働によって実現されてもよい。プログラムは、予めHDD(Hard Disk Drive)やフラッシュメモリなどの記憶装置(非一過性の記憶媒体を備える記憶装置)に格納されていてもよいし、DVDやCD-ROMなどの着脱可能な記憶媒体(非一過性の記憶媒体)に格納されており、記憶媒体がドライブ装置に装着されることでインストールされてもよい。記憶部230は、HDDやフラッシュメモリ、RAM(Random Access Memory)などの記憶装置により実現される。記憶部230には、後述する制御情報232が記憶されている。軌道生成部206と走行制御部208の双方または一方は「移動制御部」の一例である。
図9は、移動体100がユーザに追従している場面を示す図である。交通参加者が多い領域において、移動体100がユーザUに追従している場合、移動体100とユーザとの間に交通参加者が位置して、移動体100がユーザに追従できないことがある。図9のように、交通参加者U1および交通参加者U2によって、移動体100の進路がふさがれた場合、移動体100は、交通参加者U1および交通参加者U2を回避するように回り込んでユーザに追従する軌道を移動することがある。このような場面では、移動体100が停止したり、ユーザが意図しない方向に移動体100が進んだりすることがある。上記のように移動体100が挙動する場合、ユーザは、移動体100がユーザを見失い追従が停止したと感じることがある。
図10は、移動体100の制御装置200が実行する処理の流れのフローチャートである。まず、制御装置200の認識部204が、ユーザをトラッキングする(ステップS100)。次に、制御装置200の照明制御部210が、認識部204がユーザを見失わずにトラッキングを継続しているか否かを判定する(ステップS102)。
上述した例では、ユーザの方向と移動体100とユーザとの距離とに基づいて放射パターンが制御されるものとして説明したが、これに代えて(または加えて)、移動体100の周辺において区分された領域のうちどの領域にユーザが存在するかによって、放射パターンが制御されてもよい。
変形例2では、移動体100とユーザとの距離に応じて、光の色が変更される。照明制御部210は、移動体100からユーザまでの距離に応じて、放射可能領域において放射させる光の色を変更する。照明制御部210は、移動体100からユーザまでの距離が第3距離である場合、移動体100からユーザまでの距離が第3距離よりも長い第4距離である場合よりも、放射可能領域において暖色系または暖色系に近い色の光(例えば赤やオレンジ色、黄色の光等)を放射させてもよい。この処理は、上述したように光を放射させる領域の幅の変更と共に実行されてもよい。
上記の例では、照明装置160の放射可能領域は、区切れがない一つの領域であるものとして説明したが、図13に示すように放射可能領域は複数の領域AR1-AR-nで構成されていてもよい。また、放射可能領域は円状でなく図13に示すように多角形であってもよい。この場合、例えば、照明制御部210は、複数の放射可能領域のうちから移動体100に対するユーザの方向に応じた一つの放射可能領域から光を放射させてもよいし、複数の放射可能領域のうちから移動体100に対するユーザの方向に応じた二つまたは二つ以上の放射可能領域を選択し、選択した放射可能領域から光を放射させてもよい。なお、図13の例では、複数の放射可能領域は、隙間なく連結されているものとして説明したが、これに代えて、複数の放射可能領域の間に隙間が存在していてもよい。
In the above example, the light was described as being adjusted so that the brightness decreases from the center C toward the circumferential direction (decreasing in steps, as a gradation). Instead, the brightness may be adjusted in other ways. For example, a large number of small lights may be arranged in a mesh pattern in the emission-capable area. In this case, the lights may be controlled so that many of them are lit at or near the center C and the number of lit lights decreases toward the circumferential direction.
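Both variants can be expressed with a single falloff function, as in the sketch below: the gradation case maps angular distance from the center C to a brightness value, and the mesh-of-lights case maps the same value to a number of lit lights. The linear falloff, the 60-degree half-width, and the lights-per-cell count are illustrative assumptions.

```python
def brightness(angle_from_center_deg: float, half_width_deg: float = 60.0) -> float:
    """Brightness in [0, 1] decreasing with circumferential distance from the center C.
    The linear falloff and 60-degree half-width are illustrative assumptions."""
    return max(0.0, 1.0 - abs(angle_from_center_deg) / half_width_deg)

def lit_count(angle_from_center_deg: float, lights_per_cell: int = 9) -> int:
    """Mesh-of-lights variant: fewer of the small lights are lit as the angular
    distance from the center grows."""
    return round(brightness(angle_from_center_deg) * lights_per_cell)

for a in (0.0, 30.0, 60.0):
    print(f"{a:>4.0f} deg from center -> brightness {brightness(a):.2f}, lights lit {lit_count(a)}")
```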
The second embodiment will now be described. The first embodiment described the control mode of the lighting device 160; the second embodiment describes the control mode of the information output units 116 and 118. The control of the first embodiment and the control of the second embodiment may be executed in combination, or only one of them may be executed. The contents of the second embodiment are described below.
A moving body comprising:
a storage medium storing computer-readable instructions; and
a processor connected to the storage medium,
the processor executing the computer-readable instructions to:
recognize objects and a user around the moving body based on an image;
control the moving body so that it moves together with the user while avoiding the surrounding objects;
control a lighting device that emits light from a predetermined area within an emission-capable area from which light can be emitted, thereby controlling an emission pattern in which the lighting device emits light;
recognize the direction in which the user is present relative to the moving body; and
control the emission pattern based on the direction, which changes according to the movement of one or both of the moving body and the user.
A moving body comprising:
a storage medium storing computer-readable instructions; and
a processor connected to the storage medium,
the processor executing the computer-readable instructions to perform:
a process of, when the moving body, which can travel in areas where pedestrians can pass, is traveling straight, controlling an information output unit having an information output area that is visible to objects around the moving body and that outputs information indicating the turning direction of the moving body, so as to cause the information output unit to output light of a first color in the information output area; and
a process of, when the moving body is scheduled to start a turning operation, a first predetermined time before the start, causing the information output unit to output, for a predetermined time, light of a second color that does not include the information indicating the turning direction and that differs in brightness from the light of the first color in the information output area, or turning off the information output area for the predetermined time, and then causing the information output unit to output the information indicating the turning direction in the information output area.
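The turning-notification sequence recited in the statement above can be sketched as the following Python fragment. The `show` callback, the concrete durations, and the color names are stand-ins assumed for illustration; the statement itself only fixes the order of the three output states.

```python
import time

def signal_turn(show, turn_direction: str, second_color_s: float = 0.5) -> None:
    """Sketch of the recited sequence: a first color while traveling straight; a first
    predetermined time before the turn starts, a second color of different brightness
    (or the area turned off) for a predetermined time; then the turning direction."""
    show("first color (traveling straight)")
    # ...a first predetermined time before the turn is scheduled to start:
    show("second color, different brightness (or output area off)")
    time.sleep(second_color_s)                  # hold for the predetermined time
    show(f"turn indication: {turn_direction}")

signal_turn(print, "left")
```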
2 Terminal device
10 Management device
100 Moving body
110 Base body
112 Door section
116, 118 Information output unit
120 First wheel
122 First motor
130 Second wheel
132 Second motor
134 Battery
140 Third wheel
160 Lighting device
180 Camera
200 Control device
202 Information processing unit
204 Recognition unit
206 Trajectory generation unit
208 Travel control unit
210 Lighting control unit
220 Output control unit
230 Storage unit
232 Control information
Claims (11)
- A moving body comprising:
a lighting device that emits light from a predetermined area within an emission-capable area from which light can be emitted;
a recognition unit that recognizes objects and a user around the moving body based on an image capturing the surroundings of the moving body;
a movement control unit that controls the moving body so that it moves together with the user while avoiding the surrounding objects; and
an emission control unit that controls the lighting device to control an emission pattern in which the lighting device emits light,
wherein the recognition unit recognizes the direction in which the user is present relative to the moving body, and
the emission control unit controls the emission pattern based on the direction, which changes according to the movement of one or both of the moving body and the user.
- The moving body according to claim 1, wherein
the emission-capable area of the lighting device is an area that faces in an emission direction relative to the vertical axis of the moving body and is provided in a circular or substantially circular shape over a predetermined angular range of the emission direction, and
the emission control unit controls the lighting device so that light is emitted in an area of a preset angular range with respect to the emission direction, centered on the position in the emission-capable area directly facing the direction in which the user is present.
- The moving body according to claim 2, wherein
the emission control unit controls the lighting device so that the strongest light is emitted at or near the position in the emission-capable area directly facing the direction in which the user is present, and weaker light is emitted with increasing circumferential distance from that position.
- The moving body according to claim 1, wherein
the emission-capable area of the lighting device is an area that faces in an emission direction relative to the vertical axis of the moving body and is provided in a circular or substantially circular shape over a predetermined angular range of the emission direction, and
the emission control unit changes the emission pattern in which the light is emitted in the emission-capable area according to the distance from the moving body to the user.
- The moving body according to claim 4, wherein
the emission control unit emits light from an area of a first width in the emission-capable area when the distance from the moving body to the user is a first distance, and emits light from an area of a second width wider than the first width when the distance from the moving body to the user is a second distance shorter than the first distance.
- The moving body according to claim 1, wherein
the emission-capable area of the lighting device is an area that faces in an emission direction relative to the vertical axis of the moving body and is provided in a circular or substantially circular shape over a predetermined angular range of the emission direction, and
the emission control unit changes the color of the light emitted in the emission-capable area according to the distance from the moving body to the user.
- The moving body according to claim 4, wherein,
when the distance from the moving body to the user is a third distance, the emission control unit emits light of a warm color or a color close to a warm color in the emission-capable area, compared with when the distance from the moving body to the user is a fourth distance longer than the third distance.
- The moving body according to any one of claims 1 to 7, wherein
the user is a user being tracked by the moving body.
- The moving body according to claim 8, wherein
the emission-capable area of the lighting device is an area that faces in an emission direction relative to the vertical axis of the moving body and is provided in a circular or substantially circular shape over a predetermined angular range of the emission direction, and
the emission control unit
emits light from an area of a first width centered on the position in the emission-capable area directly facing the direction in which the user is present when the distance from the moving body to the user is a first distance, and
emits light from an area of a second width wider than the first width, centered on the position in the emission-capable area directly facing the direction in which the user is present, when the distance from the moving body to the user is a second distance shorter than the first distance.
- A method for controlling a moving body, the method comprising, by a computer:
recognizing objects and a user around the moving body based on an image capturing the surroundings of the moving body;
controlling the moving body so that it moves together with the user while avoiding the surrounding objects;
controlling a lighting device that emits light from a predetermined area within an emission-capable area from which light can be emitted, thereby controlling an emission pattern in which the lighting device emits light;
recognizing the direction in which the user is present relative to the moving body; and
controlling the emission pattern based on the direction, which changes according to the movement of one or both of the moving body and the user.
- A program that causes a computer to execute:
a process of recognizing objects and a user around a moving body based on an image capturing the surroundings of the moving body;
a process of controlling the moving body so that it moves together with the user while avoiding the surrounding objects;
a process of controlling a lighting device that emits light from a predetermined area within an emission-capable area from which light can be emitted, thereby controlling an emission pattern in which the lighting device emits light;
a process of recognizing the direction in which the user is present relative to the moving body; and
a process of controlling the emission pattern based on the direction, which changes according to the movement of one or both of the moving body and the user.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2024552943A JPWO2024090200A1 (ja) | 2022-10-28 | 2023-10-10 | |
CN202380072273.4A CN120019346A (zh) | 2022-10-28 | 2023-10-10 | 移动体、移动体的控制方法及程序 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022-172898 | 2022-10-28 | ||
JP2022172898 | 2022-10-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024090200A1 true WO2024090200A1 (ja) | 2024-05-02 |
Family
ID=90830713
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/036744 WO2024090200A1 (ja) | 2022-10-28 | 2023-10-10 | 移動体、移動体の制御方法、およびプログラム |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2024090200A1 (ja) |
CN (1) | CN120019346A (ja) |
WO (1) | WO2024090200A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019204414A (ja) * | 2018-05-25 | 2019-11-28 | 株式会社豊田自動織機 | 自律走行台車 |
JP2020086757A (ja) | 2018-11-21 | 2020-06-04 | 富士ゼロックス株式会社 | 自律移動装置およびプログラム |
JP2022123742A (ja) * | 2021-02-12 | 2022-08-24 | トヨタ自動車東日本株式会社 | 認識状態伝達装置 |
-
2023
- 2023-10-10 CN CN202380072273.4A patent/CN120019346A/zh active Pending
- 2023-10-10 WO PCT/JP2023/036744 patent/WO2024090200A1/ja active Application Filing
- 2023-10-10 JP JP2024552943A patent/JPWO2024090200A1/ja active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019204414A (ja) * | 2018-05-25 | 2019-11-28 | 株式会社豊田自動織機 | 自律走行台車 |
JP2020086757A (ja) | 2018-11-21 | 2020-06-04 | 富士ゼロックス株式会社 | 自律移動装置およびプログラム |
JP2022123742A (ja) * | 2021-02-12 | 2022-08-24 | トヨタ自動車東日本株式会社 | 認識状態伝達装置 |
Also Published As
Publication number | Publication date |
---|---|
CN120019346A (zh) | 2025-05-16 |
JPWO2024090200A1 (ja) | 2024-05-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
RU2750763C1 (ru) | Интерактивная внешняя связь транспортного средства с пользователем | |
JP6651678B2 (ja) | 自律的車両制御のためのニューラルネットワークシステム | |
US10160378B2 (en) | Light output system for a self-driving vehicle | |
US10766500B2 (en) | Sensory stimulation system for an autonomous vehicle | |
US9969326B2 (en) | Intention signaling for an autonomous vehicle | |
JP6941120B2 (ja) | 車両用照明システム、車両システム、車両及びデータ通信システム | |
US20170334453A1 (en) | Vehicle control system, vehicle control method, and vehicle control program product | |
CN108290519A (zh) | 用于划分运动区域的控制单元和方法 | |
CN107728610A (zh) | 自动驾驶系统 | |
US10933802B2 (en) | Vehicle illumination system and vehicle | |
KR102241603B1 (ko) | 이동 로봇 | |
WO2024090200A1 (ja) | 移動体、移動体の制御方法、およびプログラム | |
WO2024090203A1 (ja) | 移動体、移動体の制御方法、およびプログラム | |
US12264922B2 (en) | Pick-up/drop-off zone tracking system | |
KR102069765B1 (ko) | 이동 로봇 | |
JP2022022556A (ja) | 車両用灯具 | |
KR102350931B1 (ko) | 이동 로봇 | |
WO2025069392A1 (ja) | 移動体の制御装置、移動体の制御方法、およびプログラム | |
WO2025069390A1 (ja) | 移動体の制御装置、移動体の制御方法、およびプログラム | |
WO2025069378A1 (ja) | 移動体の制御装置、移動体の制御方法、およびプログラム | |
CN115593307A (zh) | 自动驾驶的提示方法、装置、设备及存储介质 | |
US20210225214A1 (en) | Display control apparatus, control method of display control apparatus, moving object, and storage medium for controlling visible direction in display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23882409 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2024552943 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023882409 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2023882409 Country of ref document: EP Effective date: 20250424 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |