CN110119141B - Mobile body control system, electric wheelchair as mobile body, and mobile body control method - Google Patents

Mobile body control system, electric wheelchair as mobile body, and mobile body control method

Info

Publication number
CN110119141B
CN110119141B CN201910100417.8A
Authority
CN
China
Prior art keywords
mobile body
electric wheelchair
travel
virtual line
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910100417.8A
Other languages
Chinese (zh)
Other versions
CN110119141A (en)
Inventor
越膳孝方
宫本修吾
鸟本康夫
松本贵成
川濑广明
樋口诚
佐藤文明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Imasen Electric Industrial Co Ltd
Toho University
Original Assignee
Honda Motor Co Ltd
Imasen Electric Industrial Co Ltd
Toho University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd, Imasen Electric Industrial Co Ltd, Toho University filed Critical Honda Motor Co Ltd
Publication of CN110119141A publication Critical patent/CN110119141A/en
Application granted granted Critical
Publication of CN110119141B publication Critical patent/CN110119141B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Abstract

The invention provides a mobile body control system capable of setting a travel route that ensures the comfort of the occupant. The mobile body control system is characterized by comprising an imaging means (208) mounted on a mobile body (102), a virtual line generation means (308) that generates a smooth virtual line for guiding the mobile body (102) along a predetermined travel route on the basis of an image captured by the imaging means (208), and a travel control means (390) that controls the mobile body so that it travels along the virtual line.

Description

Mobile body control system, electric wheelchair as mobile body, and mobile body control method
Technical Field
The present invention relates to a mobile body control system, an electric wheelchair as a mobile body, and a mobile body control method.
Background
Conventionally, there is known a travel system in which a guide route formed by a continuous line of tactile paving blocks is captured by a vehicle-mounted imaging unit and the travel of the vehicle is controlled based on the captured image (see, for example, Patent Document 1). According to this travel system, the vehicle can be guided based on the captured image alone, and the system itself can therefore be simplified.
Documents of the prior art
Patent document
Patent Document 1: Japanese Patent Laid-Open Publication No. 2017-102601
However, when the vehicle in such a travel system is a wheelchair, it must travel in confined environments such as indoors. In the conventional travel system (see, for example, Patent Document 1), there is therefore a demand to suppress swaying of the traveling wheelchair and keep it centered on the travelable path so that it does not collide with its surroundings. Further, unlike the conventional travel system, a travel system that is also applicable to environments without tactile paving is desired.
Disclosure of Invention
Therefore, an object of the present invention is to provide a mobile body control system, a mobile body, and a mobile body control method that can suppress hunting (swaying) during travel and that are applicable to a wider range of environments than before.
A mobile body control system according to the present invention for solving the above-described problems includes: an imaging unit mounted on a mobile body; a virtual line generating unit that generates a smooth virtual line for guiding the moving object along a predetermined travel path based on the captured image captured by the imaging unit; and a travel control means for controlling the moving body to travel along the virtual line.
The moving object of the present invention is provided with such a moving object control system.
Further, a mobile body control method according to the present invention includes: a virtual line generation step of generating a smooth virtual line for guiding the moving object along a predetermined travel route based on the captured image captured by the imaging means; and a travel control step of controlling the moving object to travel along the virtual line.
Effects of the invention
According to the mobile body control system, the mobile body provided with the mobile body control system, and the mobile body control method of the present invention, it is possible to set a travel route that ensures comfort for the occupant.
Drawings
Fig. 1 is a schematic diagram of a mobile body control system according to an embodiment of the present invention.
Fig. 2 is an overall perspective view of the mobile body according to the embodiment of the present invention.
Fig. 3 is a block diagram of a mobile unit control system according to an embodiment of the present invention.
Fig. 4 is a map illustrating an area to which the mobile body control system of the embodiment of the present invention is applied.
Fig. 5 is a flowchart showing a procedure of a traveling control process of the mobile body in the mobile body control system according to the embodiment of the present invention.
Fig. 6 (a) to (c) are captured images of the travel route captured by the imaging means of the moving object.
Description of the reference numerals
10. Captured image
11. Drivable region
13. Touch panel switch
14. Virtual line
100. Mobile body control system
102. Moving body
106. Beacon transmitter
107. AR marker
108. Mobile body control device
110. Server
208. Image pickup unit
210. Terminal (Mobile terminal)
212. Input unit
214. Display unit
216. Geomagnetic detection unit
370. Travelable region extraction unit
380. Virtual line generation unit
390. Travel control unit
L. AR line
R. Route
Detailed Description
A mobile body control system, a mobile body, and a mobile body control method that implement an embodiment (present embodiment) of the present invention will be described in detail.
Hereinafter, the present embodiment will be described in detail by taking as an example a case where an electric wheelchair as a moving body is guided from a current position to a destination in a predetermined facility.
The mobile body control system according to the present embodiment causes the electric wheelchair to travel along a smooth virtual line generated from a captured image taken by an imaging unit mounted on the electric wheelchair.
< Overall structure of the mobile body control system >
As shown in fig. 1, which is a schematic diagram, a mobile body control system 100 according to the present embodiment includes a mobile body 102, which is an electric wheelchair, a plurality of beacon transmitters 106 and AR (Augmented Reality) markers 107 provided around a travel path 104 of the mobile body 102, and a server 110.
The mobile body 102 further includes a mobile body control device 108 communicably connected to the server 110, and a terminal 210 communicably connected to the mobile body control device 108 (see fig. 2).
The transmission frequency and installation position of each beacon transmitter 106 are set so that, at least where the arrival ranges of their radio waves overlap, the transmitters use different frequencies.
A plurality of AR markers 107 are arranged along the travel path 104 so that they can be imaged by an imaging unit 208 (see fig. 3), described later, mounted on the mobile body 102 traveling on the travel path 104.
In the mobile body control system 100 according to the present embodiment, when the AR marker 107 appears in the image captured by the imaging unit 208 (see fig. 3), the route to the destination is superimposed on the captured image shown on the display unit 214 (see fig. 3) of the terminal 210. In addition, the absolute position (coordinates) of each AR marker 107 is stored in advance in the server 110. The server 110 thus specifies the current position of the mobile body 102 from the captured image of the AR marker 107, as described later.
The AR marker 107 may be formed of predetermined characters, graphics, symbols, or a combination thereof. The AR marker 107 may also be configured from visual elements such as the appearance or color of surrounding everyday objects (chairs, tables, furnishings) or a characteristic road-surface pattern. In addition, the AR marker 107 can be formed by a characteristic spatial shape (passage shape) bounded by walls, floor, and ceiling. Together with the beacons described later, such AR markers constitute the location transmission indicators of the present embodiment.
The mobile body control device 108 transmits to the server 110 the various information data output from the beacon transmitters 106 or from a terminal 210 (see fig. 3), described later, mounted on the mobile body 102. The mobile body control device 108 also receives information on the target vehicle speed and the target steering angle from the server 110 and performs travel control of the mobile body 102.
These mobile object control device 108 and server 110 will be described in detail later.
Fig. 2 is an overall perspective view of the mobile body 102 (electric wheelchair).
As shown in fig. 2, the mobile body 102 (electric wheelchair) includes a wheelchair main body 103 having drive wheels 103a and 103a (one drive wheel is not shown in the figure) on both sides thereof, and an actuator 204 for independently driving the drive wheels 103a and 103 a. The above-described mobile body control device 108 is mounted on the wheelchair main body 103.
Further, a terminal holder 103d is attached to the frame 103b of the wheelchair main body 103 via a support arm 103 c. The terminal 210 is detachably attached to the terminal holder 103d.
Although not shown in the drawing, a rotation angle sensor that outputs a signal pulse (rotation signal pulse) every time the drive wheels 103a and 103a rotate by a predetermined rotation angle is disposed on each of the drive wheels 103a and 103 a. The rotation angle sensor cooperates with a distance measurement information processing unit 360 (see fig. 3) of the moving body control device 108, which will be described later.
Fig. 3 is a block diagram of the mobile body control system 100 (see fig. 1).
As described above, the mobile object control system 100 is mainly configured by the terminal 210, the mobile object control device 108, and the server 110.
"terminal
As shown in fig. 3, the terminal 210 (mobile terminal) includes a communicator 200 communicating with the mobile object control device 108, an input unit 212 for inputting information such as a destination to the terminal 210, an image pickup unit 208 for picking up an image of an environment in front of the mobile object 102, a display unit 214 for displaying a virtual line 14 (see fig. 6 c) and the like described later, and a geomagnetic detection unit 216 for measuring and outputting the geomagnetism at a point where the mobile object 102 is located.
The terminal 210 in the present embodiment is assumed to be a tablet as a mobile terminal, but as will be described in detail later, the terminal is not particularly limited as long as the terminal can perform mutual communication with the mobile object control device 108. Thus, the terminal 210 may also be a smartphone, a laptop, or the like, for example. The relationship between the terminal 210 and the mobile object control device 108 will be described in more detail later.
"Mobile body control device"
Mobile object control device 108 is mainly configured by including a communicator 300, a processing device 304, a travelable region extraction unit 370, a virtual line generation unit 380, and a travel control unit 390.
The communicator 300 performs communication between the terminal 210 and the server 110.
The processing device 304 in the present embodiment outputs the various information data input thereto to the server 110 via the communicator 300.
Specifically, the processing device 304 is mainly configured by an input information processing unit 310, a beacon processing unit 320, an AR marker processing unit 340, a geomagnetism processing unit 350, an image processing unit 330, and a distance measurement information processing unit 360.
The input information processing unit 310 cooperates with the input unit 212 of the terminal 210.
Specifically, when the input information processing unit 310 detects that the user has turned on a predetermined switch (not shown) of the input unit 212, it transmits to the server 110, according to that switch, a travel control start request signal, destination information, and the like (described later) together with the identification ID.
As will be described later, the input information processing unit 310 also transmits to the server 110 a request signal specifying which of the elements — a beacon, an AR (Augmented Reality) marker, or geomagnetism — the server 110 should use to determine the route of the mobile body 102, in accordance with the selection made by the user via the input unit 212.
As will be described later, the mobile body control system 100 according to the present embodiment is assumed to determine the route by any one of a beacon, an AR (Augmented Reality) marker, and geomagnetism. However, in the present embodiment, the server 110 may also be configured to combine two or more of these elements for optimal route determination.
The beacon processing unit 320 measures the reception intensity of the beacon radio wave received from each beacon transmitter 106 (see fig. 1) around the mobile body 102 (see fig. 1).
Beacon processing section 320 transmits beacon reception information including the measurement value and the frequency of the beacon radio wave to server 110 via communicator 300.
Note that the beacon radio waves are received by the beacon processing unit 320 via a beacon receiver (not shown) incorporated in it. The reception intensity is measured for each beacon radio wave (that is, for each received frequency) at predetermined time intervals. The beacon processing unit 320 then transmits this beacon reception information to the server 110 at prescribed time intervals.
The image processing unit 330 transmits the data of the captured image acquired by the image capturing unit 208 of the terminal 210 to the server 110 together with the identification ID. The data of the captured image is shared with the AR marker processing unit 340, the travelable region extraction unit 370, and the virtual line generation unit 380, which will be described below.
The AR marker processing unit 340 extracts the AR marker 107 (see fig. 1) reflected in the captured image from the captured image of the imaging unit 208 of the terminal 210. The AR marker processing unit 340 transmits the data of the captured image of the extracted AR marker 107 to the server 110 together with the identification ID.
The geomagnetic processing unit 350 receives the geomagnetic information (magnetic bearing, magnetic flux density, and the like) detected by the geomagnetic detection unit 216 of the terminal 210 and transmits it to the server 110 together with the identification ID. In the present embodiment, the geomagnetic detection unit 216 is assumed to be an optional unit mounted on a terminal 210 such as a tablet. Such a geomagnetic detection unit 216 can easily be implemented as a magnetic orientation sensor that uses an API (Application Programming Interface) exposed by the OS (operating system) of the terminal 210.
The distance measurement information processing unit 360 detects a rotation signal pulse from the rotation angle sensor (not shown) disposed on each of the drive wheels 103a and 103a (see fig. 2) interlocked with the actuator 204 of the mobile body 102. The distance measurement information processing unit 360 transmits the respective travel distances of the driving wheels 103a, which are converted based on the number of generated rotation signal pulses, to the server 110 as distance measurement information.
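As an illustration of the pulse-to-distance conversion performed by the distance measurement information processing unit 360, the following sketch converts accumulated rotation-signal pulses into a travel distance. The wheel radius and pulses-per-revolution values are hypothetical; the patent specifies neither.

```python
import math

# Hypothetical parameters -- the patent gives neither the wheel size nor
# the rotation-angle sensor resolution; these values are illustrative only.
WHEEL_RADIUS_M = 0.15      # drive-wheel radius in metres
PULSES_PER_REV = 360       # rotation-signal pulses per full wheel revolution

def pulses_to_distance(pulse_count: int) -> float:
    """Convert accumulated rotation-signal pulses into travel distance (m)."""
    revolutions = pulse_count / PULSES_PER_REV
    return revolutions * 2.0 * math.pi * WHEEL_RADIUS_M
```

One such conversion would be kept per drive wheel, since each wheel carries its own rotation angle sensor.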
The travelable region extraction unit 370 extracts a region in which the mobile body 102 can travel from the captured image of the image capturing unit 208 of the terminal 210.
Specifically, the travelable region extraction unit 370 determines the boundary line between the floor and the wall surfaces from the captured image and sets the floor side of that boundary as the travelable region. When an obstacle that would hinder the travel of the mobile body 102 appears in the captured image, the travelable region is set by image determination so as to avoid the obstacle.
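The boundary-line determination is described only at this level of detail. As a minimal sketch of the idea, assuming a binary floor mask has already been obtained from the captured image by some segmentation step (the patent does not specify one), the floor/wall boundary could be found per image column as follows:

```python
def travelable_columns(floor_mask):
    """For each column of a binary floor mask (list of rows, 1 = floor pixel,
    row 0 = top of image), return the topmost row index that is still floor.
    Everything below that boundary row in the column is treated as travelable.
    This is an illustrative sketch, not the patent's actual image processing."""
    rows = len(floor_mask)
    cols = len(floor_mask[0])
    boundary = []
    for c in range(cols):
        top = rows  # default: no floor visible in this column
        for r in range(rows):
            if floor_mask[r][c] == 1:
                top = r
                break
        boundary.append(top)
    return boundary
```

An obstacle detected inside the region would then be cut out of the mask before this boundary pass, shrinking the travelable region around it.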
The virtual line generation unit 380 generates the virtual line 14 in the travelable region (see fig. 6 c). The virtual line 14 is generated based on a travel command, described later, transmitted from the server 110 and on the captured image transmitted from the imaging unit 208 of the terminal 210. As will be described later, the virtual line generation unit 380 generates the virtual line by, for example, the Bezier curve method.
The travel control unit 390 controls the actuator 204 that drives each of the drive wheels 103a and 103a (see fig. 2) independently so that the mobile body 102 travels along the virtual line 14 (see fig. 6 c).
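Because the two drive wheels are driven independently, steering along the virtual line reduces to differential-drive kinematics. The sketch below shows the standard wheel-speed split; the track width parameter and this particular control interface are assumptions, not taken from the patent.

```python
def wheel_speeds(v: float, omega: float, track_width: float):
    """Differential-drive kinematics: given a commanded linear velocity v (m/s)
    and yaw rate omega (rad/s, positive = turn left), return the (left, right)
    wheel ground speeds in m/s for wheels separated by track_width metres."""
    left = v - omega * track_width / 2.0
    right = v + omega * track_width / 2.0
    return left, right
```

The travel control unit would derive v and omega from the mobile body's offset and heading error relative to the virtual line, then command each actuator with the corresponding wheel speed.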
As described above, the mobile body control device 108 can be configured by a processor such as a CPU (Central Processing Unit), a ROM (Read Only Memory) in which a program is written, a RAM (Random Access Memory) for temporarily storing data, and the like.
"Server
As shown in fig. 3, the server 110 includes a communicator 400, a processing device 402, and a storage device 404. Examples of the server 110 in the present embodiment include a cloud server connected to the mobile object control device 108 via a wireless network, and a server in a facility including the travel path 104 of the mobile object 102.
The communicator 400 performs communication between the server 110 and the mobile body control device 108.
The storage device 404 is configured by a computer-readable device such as a hard disk drive, DVD, or CD-ROM.
The storage device 404 is mainly composed of a map information database 450, an AR marker information database 451, a beacon information database 452, and a geomagnetism map information database 453.
The map information database 450 stores a map for identifying the route and shape of the traveling path 104 (see fig. 1) of the mobile object 102 (see fig. 1) and the situation around the traveling path 104.
The AR marker information database 451 stores embedded information and installation position coordinates (for example, latitude and longitude) of each AR marker 107 (see fig. 1).
The beacon information database 452 stores the transmission frequency and the installation position coordinates (for example, latitude and longitude) of each beacon transmitter 106 (see fig. 1). In the beacon information database 452, each map stored in the map information database 450 is divided into, for example, 100 × 100 cells, and the beacon information (the reception intensity and frequency of the beacon radio waves) of each beacon transmitter 106 (see fig. 1) is stored for each cell in association with the map. The beacon information is measured in advance for every cell of each map. Because the beacon information differs from cell to cell, the position on the map can be identified from a specific set of beacon measurements. The map division is not limited to the 100 × 100 described above and can be set as appropriate according to the size of the map.
In the geomagnetic map information database 453, each map stored in the map information database 450 is likewise divided into, for example, 100 × 100 cells, and the geomagnetism corresponding to each cell is stored in association with the map. The geomagnetism is measured in advance for every cell of each map. Because the geomagnetism differs from cell to cell, the position on the map can be identified from a specific geomagnetic measurement. The map division is not limited to the 100 × 100 described above and can be set as appropriate according to the size of the map.
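Both lookups — beacon reception information against the beacon information database 452, and geomagnetism against the geomagnetic map information database 453 — amount to a nearest-fingerprint search over the grid of cells. A minimal sketch follows; the squared-error matching metric and the placeholder value for unheard beacons are assumptions, since the patent does not specify how a measurement is matched to a cell.

```python
def locate_by_fingerprint(observed, grid_db, missing=-100.0):
    """Return the grid cell whose stored fingerprint is closest (squared
    error) to the observation.

    observed: dict mapping beacon frequency -> measured reception intensity
    grid_db:  dict mapping (row, col) cell -> stored fingerprint dict
    missing:  intensity assumed for beacons absent from a fingerprint
    """
    def sq_error(stored):
        keys = set(observed) | set(stored)
        return sum((observed.get(k, missing) - stored.get(k, missing)) ** 2
                   for k in keys)
    return min(grid_db, key=lambda cell: sq_error(grid_db[cell]))
```

A geomagnetic lookup would use the same pattern with magnetic bearing and flux density components in place of per-frequency intensities.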
The processing device 402 in the present embodiment receives various information data transmitted from the mobile object control device 108 via the communicator 400. The processing device 402 determines a route for the mobile object 102 to move from the current position to the destination with reference to the storage device 404 based on the various pieces of information data that are input. Then, the processing device 402 transmits the determined path to the mobile object control device 108 via the communicator 400.
Specifically, the processing device 402 includes a current position specifying unit 410, a destination specifying unit 412, and a route determining unit 414.
The current position specifying unit 410 calculates the current position of the mobile body 102 with reference to the storage device 404 based on various information data from the mobile body control device 108.
Specifically, when the information data from the moving object control device 108 is data of a captured image of the AR marker 107 captured by the moving object 102 at the current position, the current position specifying unit 410 refers to the AR marker information database 451 of the storage device 404. Thus, current position specifying section 410 specifies the current position of moving object 102, that is, the coordinates of AR mark 107 (see fig. 1) of the captured image.
When the information data from the mobile object control device 108 is beacon information (reception intensity and frequency of a beacon radio wave), the current position specifying unit 410 refers to the beacon information database 452 of the storage device 404. Thereby, the current position specifying unit 410 specifies the current position of the moving object 102 on the map corresponding to the beacon information.
Further, when the information data from mobile object control device 108 is geomagnetic information, current position specifying section 410 refers to geomagnetic map information database 453 of storage device 404. Thereby, the current position specifying unit 410 specifies the current position of the mobile body 102 on the geomagnetic map corresponding to the geomagnetic information.
The destination specifying unit 412 specifies the destination of the mobile object 102 with reference to the map information database 450 based on the destination information from the mobile object control device 108.
The route determining unit 414 determines a route from the current position to the destination with reference to the map information database 450 of the storage device 404, based on the current position and the destination specified by the current position specifying unit 410 and the destination specifying unit 412. This route determination can use a known algorithm commonly employed for route search on publicly available maps.
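The patent leaves the route-search algorithm open ("a known algorithm"). Dijkstra's algorithm over a weighted graph of corridor junctions is one conventional choice; the compact sketch below is illustrative, not the patent's implementation.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's shortest path. graph: node -> {neighbour: edge_cost}.
    Returns the list of nodes from start to goal along the cheapest route."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for nbr, cost in graph.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(queue, (nd, nbr))
    # Walk the predecessor chain back from the goal.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return list(reversed(path))
```

In the facility example of fig. 4, nodes would be junctions such as the ends of routes R1 to R7, with edge costs taken from corridor lengths in the map information database 450.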
Then, the route determined by the route determination means 414 is transmitted to the mobile object control device 108 via the communicator 400.
The processing device 402 can be configured by a processor such as a CPU (Central Processing Unit), a ROM (Read Only Memory) in which a program is written, a RAM (Random Access Memory) for temporarily storing data, and the like.
< Operation of the mobile body control system >
Next, operations will be described while showing control steps of the mobile body control system 100.
Fig. 4 is a map illustrating an area to which the mobile body control system 100 of the present embodiment is applied. Fig. 5 is a flowchart showing a procedure of a travel control process of the mobile object 102 (see fig. 1) performed by the mobile object control system 100 according to the present embodiment.
As shown in fig. 4, the example area to which the mobile body control system 100 is applied consists of an area α on 1F (the first floor) and an area β on 2F (the second floor).
The "current position A" in fig. 4 refers to the departure point of the mobile body 102. In the example shown in fig. 4, it is assumed that the mobile body control system 100 sets routes R1 to R7 from the current position A on 1F (the first floor) to the destination B on 2F (the second floor) via the elevator. The routes R1 to R7 are determined by the mobile body control system 100, as described below, based on the beacons and AR markers shown in fig. 4 and/or the geomagnetism not shown in fig. 4.
As shown in fig. 5, the control process of the mobile object control system 100 includes a route determination process in steps S501 to S503 of the server 110, a virtual line generation process in steps S505 and S506 of the mobile object control device 108, and a travel control process of the mobile object 102 in step S507 of the mobile object control device 108.
In the route determination step, the server 110 specifies the current position of the mobile body 102 as described above (step S501). Note that although the "current position A" in fig. 4 refers to the departure point of the mobile body 102, the current position in step S501 may differ from the departure point. That is, once the mobile body 102 has entered the predetermined route from the actual departure point, its current position no longer coincides with the departure point.
In this route determination step, when the mobile object 102 is located at the "current position" in a state where the terminal 210, the mobile object control device 108, and the server 110 are activated, the imaging unit 208 of the terminal 210 captures a captured image in which the AR marker 107 located near the mobile object 102 is reflected. In addition, the geomagnetic detection unit 216 of the terminal 210 detects the geomagnetism at the departure point. The mobile object control device 108 transmits beacon information (reception intensity and frequency of a beacon radio wave) from the beacon transmitter 106 to the server 110.
In addition, the mobile body control device 108 transmits the AR marker information and the geomagnetic information from the terminal 210 to the server 110. As described above, the server 110 refers to at least one of the map information database 450, the AR mark information database 451, the beacon information database 452, and the geomagnetic map information database 453 of the storage device 404. Thus, the server 110 specifies the "current position" of the mobile body 102 based on at least one of the AR marker information, the beacon information, and the geomagnetic information.
In the present embodiment, the AR marker and the beacon in the "AR marker information" and the "beacon information" constitute "location transmission indicator" described in the claims.
Next, the mobile body control system 100 receives the destination information from the mobile body 102 via the server 110 (refer to step S502).
Further, it is assumed that the destination information from the mobile body 102 is input by the user of the mobile body 102 via the input unit 212 of the terminal 210. As described above, the destination information from such a mobile body 102 is transmitted to the server 110 via the input information processing unit 310 of the mobile body control device 108.
As described above, the server 110 in the mobile body control system 100 determines a travel route from the current position to the destination (step S503).
Then, the server 110 outputs a traveling command for the mobile object 102 to the destination to the mobile object control device 108 (see step S504).
Triggered by receiving the travel command to the destination, the mobile body control device 108 extracts the travelable region of the mobile body 102 based on the captured image of the imaging unit 208 of the terminal 210 (see step S505). As described above, the travelable region is extracted by the travelable region extraction unit 370 of the mobile body control device 108.
In the mobile body control system 100 according to the present embodiment, the travelable region extracted by the mobile body control device 108 is output to the terminal 210 and displayed on the display unit 214.
Fig. 6 (a) to (c) are captured images 10 of the path captured by the imaging unit 208 of the moving object 102.
As shown in fig. 6 (a), the travelable region 11 is displayed on the display unit 214 of the terminal 210 so as to overlap with the actual captured image 10 captured by the imaging unit 208. In fig. 6 (a), reference numeral 17 denotes an obstacle.
In the mobile body control system 100 according to the present embodiment, an AR (Augmented Reality) line L is set along the route through the center portion of the extracted travelable region 11. This AR line L is shown by a dotted arrow in fig. 6 (a).
A trigger for displaying the travelable region 11 on the display unit 214 is that the user inputs a display request to the input unit 212.
The travelable region 11 may also be embedded in the AR marker 107 in advance as information. In that case, when the AR marker 107 appears in the captured image 10, the travelable region 11 is displayed superimposed on the captured image 10.
When the travelable region 11 is displayed on the display unit 214 by the AR mark 107, the display is switched to the selection screen 12 shown in fig. 6 (b) after a predetermined time (for example, several seconds) has elapsed. A plurality of touch panel switches 13 are set on the selection screen 12. Then, the on/off of the travelable region 11 and the switching from the selection screen 12 to the normal screen can be performed via these touch panel switches 13. In fig. 6 (b), reference symbol L denotes an AR line L.
Returning to Fig. 5, the mobile body control system 100 sets the AR line L in the travelable region 11 (see Fig. 6(a)) by means of the mobile body control device 108 (see step S506). The AR line L is set at the center of the travelable region 11 by the travelable region extraction unit 370 of the mobile body control device 108.
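The patent does not spell out the extraction algorithm itself, but the idea of placing the AR line L at the center of the travelable region 11 can be sketched as follows. This minimal Python sketch assumes the travelable region arrives as a binary image mask and takes the midpoint of the free span in each row; the function name and mask layout are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch (not from the patent text): derive a centerline, analogous
# to the AR line L, from a travelable region given as a binary mask.

def centerline_from_mask(mask):
    """For each row of a binary mask (1 = travelable), return the
    (row, center-column) point at the middle of the free span."""
    points = []
    for r, row in enumerate(mask):
        cols = [c for c, v in enumerate(row) if v == 1]
        if cols:  # skip rows with no travelable pixels
            points.append((r, (cols[0] + cols[-1]) / 2.0))
    return points

mask = [
    [0, 1, 1, 1, 0],
    [0, 0, 1, 1, 1],
    [1, 1, 1, 0, 0],
]
print(centerline_from_mask(mask))  # [(0, 2.0), (1, 3.0), (2, 1.0)]
```

The resulting point list is the kind of polyline that the smoothing step described next would then round off.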
Next, the virtual line generation unit 380 of the mobile body control device 108 smooths the AR line L to form the virtual line 14 (see step S507).
The "smoothing process" is not particularly limited as long as it is a process for rounding the corner of the AR line L, and examples thereof include a spline curve method and a bezier curve method. Among them, the bezier curve method is preferable.
Fig. 6(c) shows a smooth virtual line 14 formed by the Bezier curve method.
The paths R1 to R7 shown in Fig. 4 set by the server 110 (hereinafter referred to as the destination path R) form a corner at the branch path 15 toward the destination.
On the other hand, when the virtual line generation unit 380 performs image determination on the branch path 15 based on the captured image 10 shown in Fig. 6(c), it sets the last passing point 16 of the travel trajectory traveled so far along the destination path R as control point B0 of the Bezier curve method. Further, the virtual line generation unit 380 sets, for example, an arbitrary point in the destination path R away from the corner, toward the entrance of the branch path 15, as control point B2 of the Bezier curve method. Then, the virtual line generation unit 380 sets an arbitrary point on the destination path R extending between control point B0 and control point B2 as control point B1 of the Bezier curve method.
The virtual line generation unit 380 in the present embodiment takes the 2nd-order Bezier curve based on control points B0 to B2 as the virtual line 14. The Bezier curve may be of 3rd or higher order.
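The 2nd-order Bezier curve adopted as the virtual line 14 can be evaluated directly from the three control points. A small sketch follows; the coordinate values are hypothetical, chosen only to illustrate a right-angle corner like the one at the branch path.

```python
# Sketch of the 2nd-order (quadratic) Bezier curve described above:
# P(t) = (1-t)^2*B0 + 2t(1-t)*B1 + t^2*B2, sampled to form the virtual line.
# Point coordinates below are illustrative, not taken from the patent.

def quadratic_bezier(b0, b1, b2, t):
    """Evaluate the quadratic Bezier curve at parameter t in [0, 1]."""
    x = (1 - t) ** 2 * b0[0] + 2 * t * (1 - t) * b1[0] + t ** 2 * b2[0]
    y = (1 - t) ** 2 * b0[1] + 2 * t * (1 - t) * b1[1] + t ** 2 * b2[1]
    return (x, y)

# B0: last passing point 16, B2: point past the corner toward the branch
# entrance, B1: a point between them (here placed at the corner itself).
b0, b1, b2 = (0.0, 0.0), (2.0, 0.0), (2.0, 2.0)
line = [quadratic_bezier(b0, b1, b2, i / 10) for i in range(11)]
print(line[0], line[5], line[10])  # (0.0, 0.0) (1.5, 0.5) (2.0, 2.0)
```

The sampled points start at B0, end at B2, and cut the corner smoothly, which is exactly the behavior wanted for the virtual line 14.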
In addition, a Bezier curve with $N$ control points $B_0, B_1, \ldots, B_{N-1}$ is expressed by the following mathematical formula.
[Mathematical formula 1]

$$P(t) = \sum_{i=0}^{N-1} B_i \, J_{n,i}(t), \qquad n = N - 1, \quad 0 \le t \le 1$$
Here, $J_{n,i}(t)$ is a blending function, expressed by the following equation.
[Mathematical formula 2]

$$J_{n,i}(t) = \binom{n}{i}\, t^{i} (1 - t)^{n - i}$$
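Reading the blending function in its standard Bernstein form, the two formulas can be checked numerically. A minimal Python sketch (function names are illustrative); note that the blending functions sum to 1 at every parameter value, which is why the curve stays inside the control polygon:

```python
# Sketch of the general formula above: P(t) = sum_i B_i * J_{n,i}(t) with
# n = N - 1 for N control points B_0..B_{N-1}, and blending function
# J_{n,i}(t) = C(n, i) * t^i * (1 - t)^(n - i).

from math import comb

def blending(n, i, t):
    """Bernstein blending function J_{n,i}(t)."""
    return comb(n, i) * t ** i * (1 - t) ** (n - i)

def bezier(points, t):
    """Evaluate an N-point (degree n = N - 1) Bezier curve at t in [0, 1]."""
    n = len(points) - 1
    x = sum(p[0] * blending(n, i, t) for i, p in enumerate(points))
    y = sum(p[1] * blending(n, i, t) for i, p in enumerate(points))
    return (x, y)

pts = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]  # cubic example
# Partition of unity: the blending functions sum to 1 for any t.
print(round(sum(blending(3, i, 0.3) for i in range(4)), 12))  # 1.0
print(bezier(pts, 0.5))  # (2.0, 1.5)
```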
Even when the destination path R appears macroscopically to be a straight line, it is microscopically zigzag. The virtual line generation unit 380 in the present embodiment can form a smooth virtual line 14 by the Bezier curve method even for a destination path R that is zigzag when viewed microscopically.
Returning again to Fig. 5, the mobile body control system 100 controls the travel of the mobile body 102 by means of the mobile body control device 108 so that the mobile body 102 travels along the virtual line 14 (see step S508).
Then, the mobile object control system 100 specifies the current position of the mobile object 102 via the server 110 (see step S509). The current position of the mobile object 102 can be specified in the same process as step S501.
Next, the server 110 determines whether or not the specified current position lies within the area including the destination (see step S510). If the current position is not within that area (No in step S510), the process returns to step S504, and the server 110 continues to execute the travel command to the destination.
On the other hand, when the current position lies within the area including the destination (Yes in step S510), the server 110 transmits a stop command for the mobile body 102 to the mobile body control device 108 (step S511). Then, the mobile body control device 108 waits for reception of the travel control end request signal from the mobile body 102 (Yes in step S512) and ends the travel control process of the mobile body 102.
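The loop of steps S509 to S511 can be sketched schematically. All callback names below are hypothetical stand-ins for the server and control-device operations described above, not APIs from the patent:

```python
# Schematic sketch (names hypothetical) of the loop in steps S509-S511:
# keep executing the travel command until the specified current position
# falls inside the destination area, then issue a stop command.

def travel_to_destination(get_position, in_destination_area, drive_step,
                          stop, max_steps=1000):
    """Drive until the current position lies in the destination area."""
    for _ in range(max_steps):
        pos = get_position()          # S509: specify current position
        if in_destination_area(pos):  # S510: destination reached?
            stop()                    # S511: send stop command
            return True
        drive_step(pos)               # S504/S508: continue along the line
    return False  # safety cut-off, not part of the patent flow

# Toy usage: a 1-D mobile body moving toward any position >= 5.
state = {"x": 0, "stopped": False}
ok = travel_to_destination(
    get_position=lambda: state["x"],
    in_destination_area=lambda p: p >= 5,
    drive_step=lambda p: state.update(x=state["x"] + 1),
    stop=lambda: state.update(stopped=True),
)
print(ok, state)  # True {'x': 5, 'stopped': True}
```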
<Operational Effects>
Next, the operational effects achieved by the present embodiment will be described.
According to the present embodiment, since travel of the mobile body 102 is controlled along the smooth virtual line 14, the mobile body 102 is prevented from wobbling while turning and from changing its steering behavior abruptly. Thus, unlike conventional travel systems, the mobile body control system 100 can ensure comfort for the user of the mobile body 102 without requiring guide blocks (tactile paving) or the like. Furthermore, because such sway is suppressed, the mobile body 102 does not collide with surrounding objects even in a narrow environment such as indoors.
In addition, according to the present embodiment, since the virtual line 14 is set within the travelable region 11, travel of the mobile body is not obstructed along the way. This enables the mobile body 102 to move more smoothly to the departure point or the destination, further improving comfort for the user of the mobile body 102.
In addition, according to the present embodiment, the virtual line 14 can be visually confirmed by the display unit 214 of the terminal 210. That is, the user of the mobile body 102 can confirm the movement path in advance. This improves the reliability of the mobile body control system 100.
In addition, according to the present embodiment, the position of the mobile body 102 during travel can be accurately grasped using the beacons. This enables the mobile body 102 to travel the destination route accurately, without deviation.
In addition, according to the present embodiment, the virtual line 14 is generated by, for example, the Bezier curve method, so an even smoother travel path can be formed. This further improves the comfort of the mobile body 102.
In addition, according to the present embodiment, the position of the mobile body 102 can be specified by comparing the measured geomagnetism with the geomagnetic map. That is, the mobile body control system 100 can specify the position of the mobile body 102 more accurately by exploiting the geomagnetism unique to each location.
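The patent does not detail how the measured geomagnetism is compared with the map; a simple nearest-fingerprint sketch is shown below. The function name, map layout, and field values are illustrative assumptions only:

```python
# Illustrative sketch (not the patent's algorithm): specify a position by
# comparing a measured geomagnetic vector against a pre-built geomagnetic
# map, picking the map cell whose stored value is closest to the measurement.

def locate_by_geomagnetism(measurement, geo_map):
    """Return the position whose mapped geomagnetic vector is nearest
    (in squared Euclidean distance) to the measured vector."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(geo_map, key=lambda pos: dist2(geo_map[pos], measurement))

# Hypothetical map: grid position -> (Bx, By, Bz) in microtesla.
geo_map = {
    (0, 0): (30.0, 5.0, -40.0),
    (0, 1): (28.0, 6.5, -41.0),
    (1, 0): (33.0, 4.0, -38.5),
}
print(locate_by_geomagnetism((28.2, 6.4, -41.1), geo_map))  # (0, 1)
```

In practice such fingerprint matching would be fused with the beacon and AR-marker information described earlier, since a single magnetic reading may match several locations.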
In addition, according to the present embodiment, since the geomagnetic detection unit 216 is built into the terminal 210, the configurations of the mobile body control system 100 and the mobile body 102 can be simplified compared with the case where a geomagnetic detection unit 216 is provided separately.
The embodiments of the present invention have been described above, but the present invention is not limited to the above embodiments and can be implemented in various ways.
In the above embodiment, the mobile body control system 100 that controls the travel of an electric wheelchair within a predetermined facility has been described, but the present invention can also be applied to a mobile body control system that controls an electric transport vehicle in a factory. The present invention can further be applied to a mobile body control system that controls the travel of an automobile outdoors.
In the above-described embodiment, the server 110 has been described as setting the destination route R based on the beacon information, the AR marker information, and/or the geomagnetic information, but the setting of the destination route R in the present invention is not limited to this.
As another example, the destination path R in the present invention may be a reference line along a white line or the like on the road, detected by image determination in the captured image. In this case, too, the destination path R is the object from which a virtual line is generated by the Bezier curve method.
In the above embodiment, the mobile body 102 and the server 110 are separate, but the server 110 may be mounted on the mobile body 102.

Claims (8)

1. A mobile body control system is characterized by comprising:
an imaging unit mounted on an electric wheelchair as a mobile body;
a travelable region extraction unit that extracts a travelable region of the electric wheelchair based on the captured image captured by the image capturing unit;
a virtual line generation unit that generates, for a polygonal-line-shaped travel path of the electric wheelchair determined by referring to a map information database, a smooth virtual line from a departure point to a destination of the electric wheelchair, based on image determination of the travel path in augmented reality in which the travel path is superimposed on the captured image, the travel path being set at the center of the travelable region on the route of the electric wheelchair from the departure point to the destination so as to guide the electric wheelchair along the route;
and
a travel control unit that controls the electric wheelchair to travel along the virtual line.
2. The mobile body control system according to claim 1, further comprising:
a display unit that displays the virtual line.
3. The mobile body control system according to claim 1 or 2,
the travel control unit controls travel of the electric wheelchair based on information acquired from position transmission markers arranged at a plurality of locations in a movement area of the electric wheelchair.
4. The mobile body control system according to claim 1 or 2,
wherein the virtual line generation unit generates the smooth virtual line by the Bezier curve method.
5. The mobile body control system according to claim 1 or 2,
further comprising a geomagnetic measurement unit mounted on the electric wheelchair,
wherein the travel control unit controls travel of the electric wheelchair based on the geomagnetism at a predetermined position output from the geomagnetic measurement unit and on a geomagnetic map created in advance so that the movement area of the electric wheelchair is associated with the geomagnetism at a plurality of places in the movement area.
6. The mobile body control system according to claim 5, wherein,
further comprising a display unit that displays the virtual line,
wherein the geomagnetic measurement unit is disposed, together with the display unit, in a mobile terminal mounted on the electric wheelchair.
7. An electric wheelchair as a mobile body, characterized by comprising:
the mobile body control system according to any one of claims 1 to 6.
8. A mobile body control method is characterized by comprising:
a travelable region extraction step of extracting a travelable region of the electric wheelchair as a mobile body based on a captured image captured by an imaging unit;
a virtual line generation step of generating, for a polygonal-line-shaped travel path of the electric wheelchair determined by referring to a map information database, a smooth virtual line from a departure point to a destination of the electric wheelchair, based on image determination of the travel path in augmented reality in which the travel path is superimposed on the captured image, the travel path being set at the center of the travelable region on the route of the electric wheelchair from the departure point to the destination so as to guide the electric wheelchair along the route; and
and a travel control step of controlling the electric wheelchair to travel along the virtual line.
CN201910100417.8A 2018-02-05 2019-01-31 Mobile body control system, electric wheelchair as mobile body, and mobile body control method Active CN110119141B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018017985A JP6774445B2 (en) 2018-02-05 2018-02-05 Mobile control system, mobile and mobile control method
JP2018-017985 2018-02-05

Publications (2)

Publication Number Publication Date
CN110119141A CN110119141A (en) 2019-08-13
CN110119141B true CN110119141B (en) 2022-11-15

Family

ID=67520435

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910100417.8A Active CN110119141B (en) 2018-02-05 2019-01-31 Mobile body control system, electric wheelchair as mobile body, and mobile body control method

Country Status (2)

Country Link
JP (1) JP6774445B2 (en)
CN (1) CN110119141B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7376078B2 (en) 2019-11-20 2023-11-08 国立大学法人 鹿児島大学 Free-range chicken house management system
JP7366373B2 (en) * 2020-10-02 2023-10-23 本田技研工業株式会社 Mobile object control device, mobile object, mobile object control method, and program
JP7447922B2 (en) * 2021-03-22 2024-03-12 株式会社リコー Display system, communication system, display control method and program
WO2023204025A1 (en) * 2022-04-20 2023-10-26 パナソニックIpマネジメント株式会社 Movement management system, movement management method, and program

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3362569B2 (en) * 1995-08-04 2003-01-07 三菱自動車工業株式会社 Vehicle guidance device
JPH1186182A (en) * 1997-09-01 1999-03-30 Honda Motor Co Ltd Automatic driving control system
JP3909300B2 (en) * 2003-04-18 2007-04-25 有限会社ミキシィ Automatic traveling wheelchair, wheelchair automatic traveling system, and wheelchair automatic traveling method
JP2007328486A (en) * 2006-06-07 2007-12-20 Yaskawa Electric Corp Moving robot
JP5093024B2 (en) * 2008-09-23 2012-12-05 株式会社豊田自動織機 Position display system for moving objects
US9014848B2 (en) * 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
JP5218479B2 (en) * 2010-06-10 2013-06-26 株式会社安川電機 Mobile system
CN101984382B (en) * 2010-10-15 2014-06-18 重庆市电力公司检修分公司 Method for intelligently inspection substation equipment by using robot
CN102085875B (en) * 2010-11-25 2012-05-30 中国民航大学 Intelligent baggage car and control method thereof
CN102183958B (en) * 2011-04-12 2014-04-23 中国民航大学 Autonomous guiding vehicle and control method thereof
CN102521566B (en) * 2011-11-25 2013-07-24 中国科学院光电技术研究所 Low-power-consumption real-time road automatic identifying device for ground independent equipment
US8831255B2 (en) * 2012-03-08 2014-09-09 Disney Enterprises, Inc. Augmented reality (AR) audio with position and action triggered virtual sound effects
WO2014091611A1 (en) * 2012-12-13 2014-06-19 株式会社日立製作所 Autonomous travelling apparatus
BR112015019173A2 (en) * 2013-03-11 2017-07-18 Honda Motor Co Ltd server device, congestion forecast information display system, congestion forecast information distribution method, congestion forecast information display method, and program
KR101455236B1 (en) * 2013-04-19 2014-11-07 주식회사 제이엠랩 Mobile apparatus for providing augmented reality and method thereof
WO2015178195A1 (en) * 2014-05-20 2015-11-26 本田技研工業株式会社 Driving characteristic evaluation method, program, and driving characteristic evaluation device
JP6304785B2 (en) * 2014-05-23 2018-04-04 本田技研工業株式会社 Driving support device and control method of driving support device
CN104019813B (en) * 2014-06-19 2017-01-25 无锡知谷网络科技有限公司 Method and system of target immediate location and map establishing
KR20160019760A (en) * 2014-08-12 2016-02-22 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
KR101870067B1 (en) * 2014-08-25 2018-06-22 엑스 디벨롭먼트 엘엘씨 Methods and systems for augmented reality to display virtual representations of robotic device actions
JP6269534B2 (en) * 2015-03-02 2018-01-31 トヨタ自動車株式会社 Travel control device
KR101682880B1 (en) * 2015-03-19 2016-12-20 현대자동차주식회사 Vehicle and remote vehicle manipulating system comprising the same
JP5973610B1 (en) * 2015-03-27 2016-08-23 本田技研工業株式会社 Control equipment for unmanned work vehicles
CN104739519B (en) * 2015-04-17 2017-02-01 中国科学院重庆绿色智能技术研究院 Force feedback surgical robot control system based on augmented reality
CN106575402A (en) * 2015-06-26 2017-04-19 深圳市赛亿科技开发有限公司 Smart shopping guide system
JP2017102601A (en) * 2015-11-30 2017-06-08 本田技研工業株式会社 Automatic travel system, automatic travel vehicle provided with the automatic travel system, method of controlling automatic travel system, and control program therefor
JP6304777B2 (en) * 2016-05-17 2018-04-04 本田技研工業株式会社 Moving body
CN106123892A (en) * 2016-06-22 2016-11-16 武汉科技大学 A kind of robot localization method based on wireless sensor network Yu earth magnetism map
CN106289257A (en) * 2016-07-27 2017-01-04 无锡知谷网络科技有限公司 Indoor orientation method and alignment system
CN106338289A (en) * 2016-08-11 2017-01-18 张满仓 Robot-based indoor positioning and navigation system and method
CN106485233B (en) * 2016-10-21 2020-01-17 深圳地平线机器人科技有限公司 Method and device for detecting travelable area and electronic equipment
CN106584472A (en) * 2016-11-30 2017-04-26 北京贝虎机器人技术有限公司 Method and device for controlling autonomous mobile equipment
CN107272685B (en) * 2017-06-21 2021-01-08 国机智能技术研究院有限公司 Method and system for controlling turning of robot
CN107554642B (en) * 2017-08-03 2019-07-02 求实科技集团有限公司 A kind of active leading type navigating robot

Also Published As

Publication number Publication date
JP2019135579A (en) 2019-08-15
JP6774445B2 (en) 2020-10-21
CN110119141A (en) 2019-08-13

Similar Documents

Publication Publication Date Title
CN110119141B (en) Mobile body control system, electric wheelchair as mobile body, and mobile body control method
CN111065891B (en) Indoor navigation system based on augmented reality
US6725154B2 (en) Image display apparatus
JP4771147B2 (en) Route guidance system
EP3147630B1 (en) 3d helicopter view at destination
US7412326B2 (en) Map display device and map display method
WO2011074165A1 (en) Autonomous mobile device
JP6665506B2 (en) Remote control device, method and program
US20040059500A1 (en) Navigation apparatus
TW200829872A (en) Navigation device and method for displaying navigation information
JP6636260B2 (en) Travel route teaching system and travel route teaching method for autonomous mobile object
US10473480B2 (en) Display control device, and display device having a virtual image displayed on a windshield, and display control method thereof
CN110392908A (en) For generating the electronic equipment and its operating method of map datum
JP4833384B1 (en) Navigation device, navigation method, navigation program, and recording medium
JP6609588B2 (en) Autonomous mobility system and autonomous mobility control method
EP3842321A1 (en) In-vehicle processing apparatus and in-vehicle processing system
JP4381205B2 (en) Navigation device and map display method in the device
JP2012143383A (en) Controller, model device, and control method
JP2006250673A (en) Navigation device and program for navigation
TW202001191A (en) Indoor navigation system capable of accurately guiding a user to walk in an indoor space in an augmented reality real-time navigation manner
JP7366373B2 (en) Mobile object control device, mobile object, mobile object control method, and program
WO2023243221A1 (en) Movement path determination system, landing site determination system, movement path determination device, drone control device, and computer program
JP5356483B2 (en) Navigation device and navigation method
JP2009019970A (en) Navigation device
JP2016200515A (en) Display control device, information processing device, display control method, and control program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant