CN113168793A - Guidance support control device and guidance support system - Google Patents

Guidance support control device and guidance support system

Info

Publication number
CN113168793A
Authority
CN
China
Prior art keywords
image
guidance
guidance support
control unit
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880100013.2A
Other languages
Chinese (zh)
Inventor
坂田礼子
片冈龙成
相川真实
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Publication of CN113168793A

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00Advertising or display means not otherwise provided for
    • G09F19/12Advertising or display means not otherwise provided for using special optical effects
    • G09F19/18Advertising or display means not otherwise provided for using special optical effects involving the use of optical projection means, e.g. projection of images on clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/53Recognition of crowd images, e.g. recognition of crowd congestion

Abstract

A guidance support control device (100) is used in a guidance support system (200) that supports a guiding person in guiding a guided person using an image projected into a guidance target space (R) by an output device (2). The guidance support control device (100) includes: an image selection control unit (31) that performs control for selecting any one of a plurality of preset images in accordance with an operation input by the guiding person during operation of the guidance support system (200); and a selected image output instruction unit (33) that instructs the output device (2) to project the preset image selected by the image selection control unit (31).

Description

Guidance support control device and guidance support system
Technical Field
The present invention relates to a guidance support control device and a guidance support system.
Background
Conventionally, a technique has been developed for conveying vehicle-related information to pedestrians or to occupants of other vehicles using an image projected onto the road surface by a projection device mounted on the vehicle, so-called "road illumination" (see, for example, Patent Document 1). In addition, a system has been developed for guiding a person within a facility using a moving image projected onto a floor, wall, ceiling, or other surface by a projection device installed in the facility, a so-called "moving-image illumination guidance system".
Documents of the prior art
Patent document
Patent document 1: international publication No. 2018/138842
Disclosure of Invention
Problems to be solved by the invention
In recent years, a system (hereinafter, referred to as a "guidance support system") has been developed that supports guidance of a guided person in a guidance target space by using an image projected onto a floor, a wall, a ceiling, or the like by a projection device provided in the guidance target space. The guidance target space is, for example, a lobby of an airport.
In a conventional guidance support system, the projection device projects a predetermined, fixed image. The degree of freedom in guiding the guided person is therefore low, and as a result the system is inconvenient for the guiding person to use.
The present invention has been made to solve the above-described problems, and an object of the present invention is to improve the degree of freedom of guidance of a person in a guidance support system using an image projected into a guidance target space.
Means for solving the problems
A guidance support control device according to the present invention is for a guidance support system that supports a guiding person in guiding a guided person using an image projected into a guidance target space by an output device, the guidance support control device including: an image selection control unit that performs control for selecting any one of a plurality of preset images in accordance with an operation input by the guiding person during operation of the guidance support system; and a selected image output instruction unit that instructs the output device to project the preset image selected by the image selection control unit.
Advantageous Effects of Invention
According to the present invention, since the above configuration is adopted, it is possible to improve the degree of freedom of guidance of a person to be guided in the guidance support system using an image projected into the guidance target space.
Drawings
Fig. 1 is an explanatory diagram showing a system configuration of a guidance support system according to embodiment 1.
Fig. 2A is an explanatory diagram illustrating a projector in a case where each output device included in the guidance support system according to embodiment 1 is configured by the projector.
Fig. 2B is a block diagram showing a main part of each output device included in the guidance support system according to embodiment 1.
Fig. 3A is an explanatory diagram illustrating a tablet PC in a case where a terminal device included in the guidance support system according to embodiment 1 is configured by the tablet PC.
Fig. 3B is a block diagram showing a main part of a terminal device included in the guidance support system according to embodiment 1.
Fig. 4 is a block diagram showing a main part of a control unit in a terminal device included in the guidance support system according to embodiment 1, and is a block diagram showing a main part of the guidance support control device according to embodiment 1.
Fig. 5 is an explanatory diagram showing an example of the selection screen.
Fig. 6A is an explanatory diagram showing a hardware configuration of a control unit in a terminal device included in the guidance support system according to embodiment 1.
Fig. 6B is an explanatory diagram showing another hardware configuration of the control unit in the terminal device included in the guidance support system according to embodiment 1.
Fig. 7 is a flowchart showing the operation of the guidance support control device according to embodiment 1.
Fig. 8 is a block diagram showing a main part of another guidance support control apparatus according to embodiment 1.
Fig. 9 is an explanatory diagram showing a system configuration of another guidance support system according to embodiment 1.
Fig. 10 is a block diagram showing a main part of the guidance support control device according to embodiment 2.
Fig. 11 is a flowchart showing the operation of the guidance support control device according to embodiment 2.
Fig. 12 is a block diagram showing a main part of the guidance support control device according to embodiment 3.
Fig. 13 is an explanatory diagram showing an example of the editing screen.
Fig. 14 is an explanatory diagram showing an example of generating an image.
Fig. 15 is an explanatory diagram showing another example of the generated image.
Fig. 16 is an explanatory diagram showing another example of the generated image.
Fig. 17 is a flowchart showing the operation of the guidance support control device according to embodiment 3.
Fig. 18 is a block diagram showing a main part of the guidance support control device according to embodiment 4.
Fig. 19 is a flowchart showing the operation of the guidance support control device according to embodiment 4.
Fig. 20 is a flowchart showing another operation of the guidance support control apparatus according to embodiment 4.
Fig. 21 is a block diagram showing a main part of another guidance support control apparatus according to embodiment 4.
Fig. 22 is a block diagram showing a main part of the guidance support control device according to embodiment 5.
Fig. 23A is an explanatory diagram showing an example of a projection position of a generated image.
Fig. 23B is an explanatory view showing an example of a projection position of the additional preset image corresponding to the generated image shown in fig. 23A.
Fig. 24 is an explanatory diagram showing an example of the editing screen.
Fig. 25 is an explanatory diagram showing an example of generating an image.
Fig. 26 is an explanatory view showing an example of the selection preset image corrected according to the position of the terminal device in the case where the selection preset image is the additional preset image corresponding to the generated image shown in fig. 25.
Fig. 27 is a flowchart showing the operation of the guidance support control device according to embodiment 5.
Fig. 28 is a flowchart showing another operation of the guidance support control device according to embodiment 5.
Fig. 29 is a block diagram showing a main part of the guidance support control apparatus according to embodiment 6.
Fig. 30 is an explanatory diagram showing an example of the congestion degree distribution shown in the spatial state information.
Fig. 31 is an explanatory diagram showing an example of generating an image.
Fig. 32 is an explanatory diagram showing an example of the corrected image.
Fig. 33 is a flowchart showing the operation of the guidance support control device according to embodiment 6.
Fig. 34 is a flowchart showing another operation of the guidance support control device according to embodiment 6.
Fig. 35 is an explanatory diagram showing an example of the editing screen.
Detailed Description
Hereinafter, in order to explain the present invention in more detail, a mode for carrying out the present invention will be described with reference to the drawings.
Embodiment 1.
Fig. 1 is an explanatory diagram showing a system configuration of a guidance support system according to embodiment 1. Fig. 2A is an explanatory diagram illustrating a projector in a case where each output device included in the guidance support system according to embodiment 1 is configured by the projector. Fig. 2B is a block diagram showing a main part of each output device included in the guidance support system according to embodiment 1. Fig. 3A is an explanatory diagram illustrating a tablet PC in a case where a terminal device included in the guidance support system according to embodiment 1 is configured by the tablet PC. Fig. 3B is a block diagram showing a main part of a terminal device included in the guidance support system according to embodiment 1. A guidance support system 200 according to embodiment 1 will be described with reference to fig. 1 to 3.
Hereinafter, embodiments 1 to 6 will be described mainly using the following example: the guidance target space R is an airport lobby, the guiding person is an airport worker, and the guided person is an airport user.
The guidance support system 200 includes an output device group 1. The output device group 1 is composed of a plurality of output devices 2. In the example shown in fig. 1, the output device group 1 is composed of six output devices 2_1 to 2_6. The plurality of output devices 2 are provided at different positions in the guidance target space R, for example. The guidance support system 200 also includes a terminal device 3. The terminal device 3 is held by the guiding person, for example.
Each output device 2 is constituted by, for example, a projector (see fig. 2A). Each output device 2 projects an image for guidance onto a floor, wall, or ceiling surface in the guidance target space R. Each output device 2 can also communicate with the terminal device 3 via the network 4. As shown in fig. 2B, each output device 2 includes a projection unit 11, a communication unit 12, and a control unit 13. The projection unit 11 is used for projecting the image, the communication unit 12 is used for the communication, and the control unit 13 controls the operations of the projection unit 11 and the communication unit 12.
For example, five gates G1 to G5 are provided in the airport lobby serving as the guidance target space R. Two output devices 2_1 and 2_2 project images for guidance onto the floor in a predetermined area corresponding to the 1st gate G1 in the lobby. One output device 2_3 projects an image for guidance onto the floor in a predetermined area corresponding to the 2nd gate G2. One output device 2_4 projects an image for guidance onto the wall surface in a predetermined area corresponding to the 3rd gate G3. One output device 2_5 projects an image for guidance onto the ceiling surface in a predetermined area corresponding to the 4th gate G4, and one output device 2_6 projects an image for guidance onto the floor in a predetermined area corresponding to the 4th gate G4. Hereinafter, a region in the guidance target space R onto which an output device 2 projects an image for guidance is referred to as a "projection target region".
The guidance support system 200 is not limited to having six output devices 2_1 to 2_6; it may include five or fewer output devices 2, or seven or more. Although an example has been described in which the number of output devices 2 and the projected predetermined regions differ for each of the gates G1 to G5, the same number of devices and the same regions may instead be provided for all gates. In that case, three output devices 2 may be provided for each gate, projecting onto the floor, wall, and ceiling surfaces respectively.
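The gate-to-device assignment described above can be sketched as a simple lookup. This is purely illustrative: the device IDs and gate names follow the example in the text, but the data structure and function names are assumptions, since the patent specifies no implementation.

```python
# Hypothetical mapping of each output device to its projection target region
# (gate and projection surface), following the example of gates G1 to G5.
PROJECTION_TARGET_AREAS = {
    "2_1": {"gate": "G1", "surface": "floor"},
    "2_2": {"gate": "G1", "surface": "floor"},
    "2_3": {"gate": "G2", "surface": "floor"},
    "2_4": {"gate": "G3", "surface": "wall"},
    "2_5": {"gate": "G4", "surface": "ceiling"},
    "2_6": {"gate": "G4", "surface": "floor"},
}

def devices_for_gate(gate: str) -> list[str]:
    """Return the IDs of the output devices whose projection target
    region corresponds to the given gate."""
    return [dev for dev, area in PROJECTION_TARGET_AREAS.items()
            if area["gate"] == gate]
```

With this table, a terminal application could, for instance, look up which devices serve gate G1 before offering them for selection.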
The terminal device 3 is constituted by, for example, a tablet PC (Personal Computer) (see fig. 3A). The terminal apparatus 3 freely communicates with each output apparatus 2 via the network 4. That is, as shown in fig. 3B, the terminal device 3 includes a communication unit 14, a control unit 15, a display unit 16, and an operation input unit 17. The communication unit 14 is used for this communication. The control unit 15 controls various operations of the communication unit 14, the display unit 16, and the operation input unit 17. The display unit 16 is constituted by, for example, a liquid crystal display or an organic EL (Electro Luminescence) display. The operation input unit 17 is composed of, for example, a touch panel and a microphone for voice input.
The guidance support system 200 is configured mainly as described above. The guidance support system 200 supports the guiding person in guiding the guided person through projection of an image for guidance by at least one of the plurality of output devices 2.
Fig. 4 is a block diagram showing a main part of a control unit in a terminal device included in the guidance support system according to embodiment 1, and is a block diagram showing a main part of the guidance support control device according to embodiment 1. The guidance support control apparatus 100 according to embodiment 1 will be described with reference to fig. 4.
The display control unit 21 performs control to display various screens of the guidance support control apparatus 100 on the display unit 16.
The image data storage unit 22 stores image data representing a plurality of images in advance. The plurality of images are used for guiding the person to the guided person, respectively. Hereinafter, the plurality of images are referred to as "preset images". Further, image data representing each preset image is referred to as "preset image data".
The image selection control unit 31 performs control to select one preset image from the plurality of preset images in accordance with an operation input to the operation input unit 17 by the guiding person during operation of the guidance support system 200. Hereinafter, the preset image so selected is referred to as the "selected preset image".
The device selection control unit 32 performs control to select at least one of the plurality of output devices 2 in accordance with an operation input to the operation input unit 17 by the guiding person during operation of the guidance support system 200. Hereinafter, the output device 2 so selected is referred to as the "selected output device".
The selection control by the image selection control unit 31 and the selection control by the device selection control unit 32 are executed, for example, in a state where the display control unit 21 causes the display unit 16 to display a dedicated screen (hereinafter, referred to as "selection screen") S1. Fig. 5 shows a specific example of the selection screen S1.
As shown in fig. 5, the selection screen S1 has an area A1 in which a plurality of thumbnail images corresponding to the plurality of preset images are displayed. Each thumbnail image can be selected by a touch operation on the touch panel of the operation input unit 17. The selection control by the image selection control unit 31 is performed in accordance with a touch operation on area A1.
The selection screen S1 has an area A2 in which the selected preset image is displayed. The display size of the selected preset image in area A2 is larger than that of each thumbnail image in area A1, so the guiding person can easily recognize visually which of the plurality of preset images is currently selected.
The selection screen S1 has an area A3 in which a plurality of button images corresponding to the plurality of output devices 2 are displayed. Each button image can be pressed by a touch operation on the touch panel of the operation input unit 17. The selection control by the device selection control unit 32 is performed in accordance with a touch operation on area A3.
The corresponding preset image data is used for displaying each of the thumbnail images in area A1 and for displaying the selected preset image in area A2. The preset image data is stored in advance in the image data storage unit 22 as described above. For displaying the button images in area A3, information indicating the installation positions of the output devices 2 in the guidance target space R, information indicating their projection target regions (hereinafter referred to as "projection target region information"), and the like are used. These pieces of information are stored in advance in, for example, the memory 43 of the terminal device 3 (described later).
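As a rough sketch of the selection-screen behavior just described (areas A1 to A3), the following class models thumbnail selection and device-button toggling. All class, field, and method names are hypothetical; the patent describes a touch UI, not code.

```python
class SelectionScreenState:
    """Illustrative model of selection screen S1: area A1 lists thumbnails of
    the preset images, area A2 shows the currently selected preset image, and
    area A3 holds button images for selecting output devices."""

    def __init__(self, preset_image_ids, device_ids):
        self._preset_image_ids = list(preset_image_ids)
        self._device_ids = set(device_ids)
        self.selected_preset = None    # shown enlarged in area A2
        self.selected_devices = set()  # toggled via buttons in area A3

    def tap_thumbnail(self, preset_image_id):
        # Touch operation on area A1: select one of the preset images.
        if preset_image_id in self._preset_image_ids:
            self.selected_preset = preset_image_id

    def press_device_button(self, device_id):
        # Touch operation on area A3: toggle selection of an output device
        # (at least one output device can be selected at a time).
        if device_id in self._device_ids:
            self.selected_devices ^= {device_id}
```

Modeling area A3 as a toggling set reflects that the device selection control unit 32 may select more than one output device at once.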
The selected image output instruction unit 33 instructs the selected output device 2 to output, that is, to project, the selected preset image. More specifically, the selected image output instruction unit 33 performs control to transmit a signal instructing output of the selected preset image, that is, a signal instructing projection of the selected preset image (hereinafter referred to as an "output instruction signal"), to the selected output device 2. The communication unit 14 is used for transmitting the output instruction signal.
The communication unit 12 of the selected output device 2 receives the output instruction signal transmitted from the terminal device 3. The control unit 13 of the selected output device 2 then performs control to cause the projection unit 11 of the selected output device 2 to project the selected preset image. The selected preset image is thereby projected in the projection target region of the selected output device 2. The guiding person can use the projected selected preset image to guide the guided person.
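The round trip described above, in which the terminal device transmits an output instruction signal and the control unit of the selected output device reacts to it, might be sketched as follows. The JSON message format and every field and function name are assumptions made for illustration; the patent does not define the signal's encoding.

```python
import json
from dataclasses import dataclass

@dataclass
class OutputInstructionSignal:
    """Hypothetical encoding of the output instruction signal sent from the
    terminal device to a selected output device over the network."""
    preset_image_id: str
    device_id: str

    def to_message(self) -> str:
        return json.dumps({"type": "output_instruction",
                           "preset_image_id": self.preset_image_id,
                           "device_id": self.device_id})

def handle_message(message: str) -> str:
    """Sketch of control unit 13 on the selected output device: on receipt of
    an output instruction signal, instruct projection unit 11 to project the
    selected preset image (reduced here to returning an action string)."""
    payload = json.loads(message)
    if payload.get("type") == "output_instruction":
        return f"project:{payload['preset_image_id']}"
    return "ignore"
```

In a real system the message would travel over the network 4 between communication units 14 and 12; here the two sides are simply wired together in one process.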
The image selection control unit 31, the device selection control unit 32, and the selection image output instruction unit 33 constitute the main part of the guidance support control device 100. The display control unit 21, the image data storage unit 22, and the guidance support control device 100 constitute the main part of the control unit 15.
Next, a hardware configuration of a main part of the control unit 15 will be described with reference to fig. 6.
As shown in fig. 6A, the control unit 15 includes a processor 41, a memory 42, and a memory 43. The memory 42 stores programs corresponding to the functions of the display control unit 21, the image selection control unit 31, the device selection control unit 32, and the selected image output instruction unit 33. The functions of the display control unit 21, the image selection control unit 31, the device selection control unit 32, and the selected image output instruction unit 33 are realized by the processor 41 reading and executing the program stored in the memory 42. Further, the function of the image data storage section 22 is realized by the memory 43.
Alternatively, as shown in fig. 6B, the control unit 15 includes a memory 43 and a processing circuit 44. In this case, the functions of the display control unit 21, the image selection control unit 31, the device selection control unit 32, and the selected image output instruction unit 33 are realized by the dedicated processing circuit 44. Further, the function of the image data storage section 22 is realized by the memory 43.
Alternatively, the control unit 15 includes the processor 41, the memory 42, the memory 43, and the processing circuit 44 (not shown). In this case, some of the functions of the display control unit 21, the image selection control unit 31, the device selection control unit 32, and the selected image output instructing unit 33 are realized by the processor 41 and the memory 42, and the remaining functions are realized by the dedicated processing circuit 44. Further, the function of the image data storage section 22 is realized by the memory 43.
The Processor 41 is configured by at least one of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a microcontroller, and a DSP (Digital Signal Processor), for example.
The memory 42 is composed of a volatile memory and a nonvolatile memory. The volatile Memory in the Memory 42 is, for example, a RAM (Random Access Memory). The nonvolatile Memory in the Memory 42 is configured by at least one of a ROM (Read Only Memory), a flash Memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), an SSD (Solid State Drive) or an HDD (Hard Disk Drive), for example.
The memory 43 is constituted by a nonvolatile memory. Specifically, for example, the memory 43 is configured by at least one of a ROM, a flash memory, an EPROM, an EEPROM, an SSD, or an HDD.
The processing Circuit 44 is configured by at least one of an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field-Programmable Gate Array), an SoC (System-on-a-Chip: single Chip System), or a System LSI (Large Scale Integration), for example.
Next, the operation of the guidance support control apparatus 100 will be described with reference to the flowchart of fig. 7. When an operation for instructing the display of the selection screen S1 is input to the operation input unit 17 during the operation of the guidance support system 200, the display control unit 21 causes the display unit 16 to display the selection screen S1. Of the processes at steps ST1 to ST3 shown in fig. 7, at least the processes at steps ST1 and ST2 are executed in a state where the selection screen S1 is displayed on the display unit 16.
First, in step ST1, the image selection control section 31 performs control of selecting 1 of the plurality of preset images in accordance with an operation input with respect to the selection screen S1. Next, in step ST2, the device selection control section 32 performs control for selecting at least 1 output device 2 of the plurality of output devices 2 in accordance with an operation input to the selection screen S1. The processing in steps ST1 and ST2 may be executed in parallel with each other. Alternatively, the process of step ST2 may be executed first, and then the process of step ST1 may be executed.
Next, in step ST3, the selected image output instruction unit 33 instructs the selected output device 2 to project the selected preset image. More specifically, the selected image output instruction unit 33 performs control to transmit an output instruction signal to the selected output device 2.
Next, the communication unit 12 of the selected output device 2 receives the output instruction signal transmitted from the terminal device 3. The control unit 13 of the selected output device 2 then performs control to cause the projection unit 11 of the selected output device 2 to project the selected preset image. The control unit 13 of the selected output device 2 performs control to end projection of the selected preset image when a predetermined time Tref (for example, 2 or 3 minutes) has elapsed since the projection started.
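The projection-timeout behavior in the last step above can be sketched as a small state holder. Injecting the clock makes the Tref check testable; the class name, the default value of Tref, and the clock-injection design are all illustrative assumptions.

```python
import time

TREF_SECONDS = 120.0  # predetermined time Tref (e.g., 2 minutes); assumed value

class ProjectionSession:
    """Projection of the selected preset image ends once Tref has elapsed
    since projection started, mirroring the behavior of control unit 13."""

    def __init__(self, tref: float = TREF_SECONDS, clock=time.monotonic):
        self._tref = tref
        self._clock = clock      # monotonic clock, replaceable for testing
        self._started_at = None

    def start(self) -> None:
        """Record the moment projection of the selected preset image begins."""
        self._started_at = self._clock()

    def is_projecting(self) -> bool:
        """True while less than Tref has elapsed since start()."""
        if self._started_at is None:
            return False
        return (self._clock() - self._started_at) < self._tref
```

Using `time.monotonic` rather than wall-clock time avoids the timeout jumping when the system clock is adjusted.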
In this way, according to the guidance support control device 100, while guiding the guided person, the guiding person can select in real time which of the plurality of preset images is to be projected. This improves the degree of freedom in guiding the guided person.
Further, according to the guidance support control device 100, while guiding the guided person, the guiding person can also select in real time which of the plurality of output devices 2 projects the selected preset image. This allows the projection position of the selected preset image in the guidance target space R to be set in real time. As a result, when the guided person is guided to a guidance target point far from the guiding person's current position, for example, the guiding person need not move there, enabling efficient guidance. Moreover, the guided person can visually grasp the distance from his or her current position to the guidance target point.
Next, a modification of the guidance support control device 100 and a modification of the guidance support system 200 will be described.
First, the guidance target space R is not limited to an airport lobby. The guidance target space R may be any space in which the guiding person needs to guide a guided person, for example inside a station, a department store, a supermarket, an exhibition hall, or an event venue. The guidance support system 200 is particularly suited to large spaces and to spaces with many blind spots caused by numerous fixtures.
The image selection control unit 31 may be configured so that the guiding person can correct, in area A2, the selected preset image chosen using area A1 of the selection screen S1. For example, as shown in fig. 5, while area A2 displays the selected preset image chosen using area A1, if the guiding person moves, rotates, enlarges, or reduces the image displayed in area A2, the selected preset image projected by the output device 2 changes in the same way.
Further, the guiding person may select a plurality of preset images at a time instead of one preset image at a time. For example, the image selection control unit 31 may display, in area A2, the plurality of selected preset images chosen by the guiding person using area A1 of the selection screen S1, and the output device 2 may then project the plurality of selected preset images as arranged, or as corrected, by the guiding person in area A2.
As shown in fig. 8, the device selection control unit 32 may be provided outside the guidance support control device 100. That is, the image selection control unit 31 and the selected image output instruction unit 33 may constitute the main part of the guidance support control apparatus 100.
The guidance support system 200 may include a server device (not shown). The server apparatus freely communicates with the terminal apparatus 3 via the network 4, and freely communicates with each output apparatus 2 via the network 4. In this case, a part or all of the functions of the guidance support control apparatus 100 may be realized by a server apparatus instead of the terminal apparatus 3.
As shown in fig. 9, the guidance support system 200 may include a plurality of terminal apparatuses 3. The plurality of terminal apparatuses 3 may be held by a plurality of respective leaders, for example. In the example shown in fig. 9, 3 terminal devices 3_1 to 3_3 are included in the guidance support system 200.
Some of the output devices 2 may be configured as displays instead of projectors. For example, one output device 2_4 of the six output devices 2_1 to 2_6 may be a display provided at the 3rd gate G3. When the selected output device 2 is a display, the output instruction signal may be a signal instructing display of the selected preset image instead of a signal instructing its projection.
As described above, the guidance support control device 100 according to embodiment 1 is a guidance support control device 100 for a guidance support system 200 that supports a guiding person in guiding a guided person using an image projected into a guidance target space R by an output device 2, and includes: an image selection control unit 31 that performs control for selecting any one of a plurality of preset images in accordance with an operation input by the guiding person during operation of the guidance support system 200; and a selected image output instruction unit 33 that instructs the output device 2 to project the preset image selected by the image selection control unit 31. Thus, while guiding the guided person, the guiding person can select in real time which of the plurality of preset images is to be projected. As a result, the degree of freedom in guiding the guided person can be improved.
The guidance support system 200 according to embodiment 1 is a guidance support system 200 that supports a guiding person in guiding a guided person using an image projected into a guidance target space R by an output device 2, and includes: an image selection control unit 31 that performs control for selecting any one of a plurality of preset images in accordance with an operation input by the guiding person during operation of the guidance support system 200; and a selected image output instruction unit 33 that instructs the output device 2 to project the preset image selected by the image selection control unit 31. This provides the same effects as those of the guidance support control device 100.
Embodiment 2.
Fig. 10 is a block diagram showing a main part of the guidance support control device according to embodiment 2. The guidance support control device 100a according to embodiment 2 will be described with reference to fig. 10.
The system configuration of the guidance support system 200 is the same as the system configuration described with reference to fig. 1 in embodiment 1, and therefore, illustration and description thereof are omitted. Note that the configuration of each output device 2 is the same as that described with reference to fig. 2 in embodiment 1, and therefore, illustration and description thereof are omitted. Note that the configuration of the terminal device 3 is the same as that described with reference to fig. 3 in embodiment 1, and therefore, illustration and description thereof are omitted. In fig. 10, the same blocks as those shown in fig. 4 are denoted by the same reference numerals, and description thereof is omitted.
The output continuation instruction unit 34 counts, for each of the plurality of preset images, the number of times (hereinafter referred to as the "number of selections") that the preset image is selected by the image selection control unit 31 within a predetermined period ΔT1 (for example, approximately 10 minutes). When the selected image output instruction unit 33 instructs the selected output device 2 to project the selected preset image, the output continuation instruction unit 34 compares the count value Vc of the number of selections corresponding to the selected preset image with a predetermined threshold value Vth (for example, 5).
When the count value Vc is equal to or greater than the threshold value Vth, the output continuation instruction unit 34 instructs the selected output device 2 to continue outputting the selected preset image, that is, to continue projecting the selected preset image. More specifically, the output continuation instruction unit 34 transmits a signal instructing continuation of the output, that is, continuation of the projection, of the selected preset image (hereinafter referred to as a "continuation instruction signal") to the selected output device 2. The continuation instruction signal is transmitted using the communication unit 14.
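As a rough illustration, the continuation decision described above (count selections within the period ΔT1, compare the count value Vc with the threshold Vth) can be sketched as follows. The class and method names are hypothetical, introduced only for this sketch, and the injectable clock is an implementation convenience, not part of the embodiment:

```python
import time
from collections import defaultdict, deque

class OutputContinuationInstructor:
    """Counts preset-image selections within a sliding window (delta_t1)
    and decides whether a continuation instruction should be issued."""

    def __init__(self, delta_t1=600.0, vth=5, clock=time.monotonic):
        self.delta_t1 = delta_t1               # observation period, e.g. ~10 minutes
        self.vth = vth                         # threshold Vth, e.g. 5 selections
        self.clock = clock                     # injectable clock (eases testing)
        self._selections = defaultdict(deque)  # image id -> selection timestamps

    def record_selection(self, image_id):
        """Called each time the image selection control selects an image."""
        self._selections[image_id].append(self.clock())

    def count(self, image_id):
        """Count value Vc: selections of image_id within the last delta_t1."""
        now = self.clock()
        q = self._selections[image_id]
        while q and now - q[0] > self.delta_t1:
            q.popleft()                        # drop selections outside the window
        return len(q)

    def should_continue(self, image_id):
        """True when Vc >= Vth, i.e. a continuation instruction is sent."""
        return self.count(image_id) >= self.vth
```

In this sketch the window slides continuously; the embodiment only requires that the count be taken over a predetermined period ΔT1, so a fixed-interval reset would also fit the description.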
When the terminal device 3 transmits the continuation instruction signal, the communication unit 12 of the selected output device 2 receives the transmitted continuation instruction signal. The control unit 13 of the selected output device 2 then executes control for causing the projection unit 11 of the selected output device 2 to continue projecting the selected preset image until a predetermined termination condition is satisfied.
A specific example of the termination condition is as follows. When an operation instructing the end of projection of the selected preset image, or an operation selecting another preset image, is input to the operation input unit 17, the communication unit 14 transmits a signal indicating this to the selected output device 2. The termination condition is satisfied when the communication unit 12 of the selected output device 2 receives the transmitted signal.
On the other hand, when the continuation instruction signal is not transmitted from the terminal device 3, the control unit 13 of the selected output device 2 executes the same control as that described in embodiment 1. That is, the control unit 13 of the selected output device 2 executes control for ending the projection of the selected preset image when a predetermined time Tref has elapsed since the projection of the selected preset image was started.
The image selection control unit 31, the device selection control unit 32, the selected image output instruction unit 33, and the output continuation instruction unit 34 constitute the main parts of the guidance support control device 100 a.
The hardware configuration of the control unit 15 including the guidance support control device 100a is the same as that described with reference to fig. 6 in embodiment 1, and therefore, illustration and description thereof are omitted. In other words, the functions of the display control unit 21, the image selection control unit 31, the device selection control unit 32, the selected image output instructing unit 33, and the output continuation instructing unit 34 may be realized by the processor 41 and the memory 42, or may be realized by the dedicated processing circuit 44.
Next, the operation of the guidance support control apparatus 100a will be described with reference to the flowchart of fig. 11. In fig. 11, the same steps as those shown in fig. 7 are denoted by the same reference numerals, and description thereof is omitted.
First, the processing of steps ST1 to ST3 is executed. Next, in step ST4, the output continuation instruction unit 34 compares the count value Vc of the number of selections corresponding to the preset image selected in step ST1 with the threshold value Vth. When the count value Vc is equal to or greater than the threshold value Vth (YES in step ST4), in step ST5 the output continuation instruction unit 34 instructs the selected output device 2 to continue projecting the selected preset image. More specifically, the output continuation instruction unit 34 executes control for transmitting the continuation instruction signal to the selected output device 2. On the other hand, when the count value Vc is smaller than the threshold value Vth (NO in step ST4), the processing of step ST5 is skipped.
In this way, according to the guidance support control device 100a, when the guiding person wants to continue projection of the same preset image, the number of times the guiding person must input the operation of selecting that preset image can be reduced. This saves the guiding person effort.
As described in embodiment 1, the guidance support system 200 may include a plurality of terminal devices 3. In this case, the count value Vc may be a total value based on the numbers of selections at the plurality of terminal devices 3. This saves each guiding person effort in a situation where, for example, the same guidance is required for a plurality of guided persons.
Otherwise, the guidance support system 200 can employ various modifications similar to the various modifications described in embodiment 1. Note that the guidance support control device 100a can employ various modifications similar to the various modifications described in embodiment 1.
As described above, the guidance support control device 100a according to embodiment 2 includes the output continuation instruction unit 34, and when the count value Vc of the number of times the selected preset image is selected within the predetermined period ΔT1 is equal to or greater than the threshold value Vth, the output continuation instruction unit 34 instructs the output device 2 to continue projection of the selected preset image. Thus, when the guiding person wants to continue projection of the same preset image, the number of times the guiding person must input the selection operation can be reduced. As a result, the guiding person's operation effort can be reduced.
Embodiment 3.
Fig. 12 is a block diagram showing a main part of the guidance support control device according to embodiment 3. The guidance support control device 100b according to embodiment 3 will be described with reference to fig. 12.
The system configuration of the guidance support system 200 is the same as the system configuration described with reference to fig. 1 in embodiment 1, and therefore, illustration and description thereof are omitted. Note that the configuration of each output device 2 is the same as that described with reference to fig. 2 in embodiment 1, and therefore, illustration and description thereof are omitted. Note that the configuration of the terminal device 3 is the same as that described with reference to fig. 3 in embodiment 1, and therefore, illustration and description thereof are omitted. In fig. 12, the same blocks as those shown in fig. 4 are denoted by the same reference numerals, and description thereof is omitted.
Hereinafter, in embodiments 3 to 6, description will be given mainly of an example in which the projection target areas of the plurality of output devices 2 are set on the floor in the guidance target space R, and in which the plurality of output devices 2 are arranged so that the output device group 1 can project an image onto substantially the entire floor in the guidance target space R.
The image generation control unit 51 performs control for generating, during operation of the guidance support system 200, an image to be output by the output device group 1, that is, an image to be projected by the output device group 1, in accordance with an operation input to the operation input unit 17 by the guiding person. Hereinafter, the generated image is referred to as the "generated image".
The image generation control by the image generation control unit 51 is executed, for example, in a state where the display control unit 21 causes the display unit 16 to display a dedicated screen (hereinafter referred to as an "editing screen") S2. A specific example of the editing screen S2 will be described later with reference to fig. 13. A specific example of the generated image will be described later with reference to fig. 14 to 16.
The generated image output instructing unit 52 instructs the output device group 1 to output the generated image, that is, to project the generated image.
For example, the generated image output instruction unit 52 determines, using the projection target area information stored in the memory 43 of the terminal device 3, whether the generated image falls within the projection target area of any one output device 2. When the generated image falls within the projection target area of one output device 2, the generated image output instruction unit 52 instructs that output device 2 to project the generated image. On the other hand, when the generated image spans the projection target areas of two or more output devices 2, the generated image output instruction unit 52 instructs those two or more output devices 2 to output the generated image. More specifically, the generated image output instruction unit 52 instructs each of the two or more output devices 2 to project the corresponding portion of the generated image. The same output instruction signal as that described in embodiment 1 is used for these projection instructions.
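The dispatch decision above, i.e. one device if the generated image fits entirely inside its projection target area, otherwise every device whose area it overlaps, can be sketched with axis-aligned rectangles. The `Rect` type, coordinate units, and device ids here are assumptions made for this sketch, not details given by the embodiment:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    """Axis-aligned rectangle in floor coordinates."""
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, other: "Rect") -> bool:
        # True when `other` lies entirely inside this rectangle.
        return (self.x0 <= other.x0 and self.y0 <= other.y0
                and other.x1 <= self.x1 and other.y1 <= self.y1)

    def overlaps(self, other: "Rect") -> bool:
        # True when the two rectangles share any interior area.
        return (self.x0 < other.x1 and other.x0 < self.x1
                and self.y0 < other.y1 and other.y0 < self.y1)

def devices_for_image(image_bounds, projection_areas):
    """Return ids of the output devices to be instructed to project.

    If the generated image fits inside a single device's projection
    target area, only that device is instructed; otherwise every
    overlapping device projects its corresponding portion."""
    for dev_id, area in projection_areas.items():
        if area.contains(image_bounds):
            return [dev_id]
    return [dev_id for dev_id, area in projection_areas.items()
            if area.overlaps(image_bounds)]
```

Real projection target areas need not be rectangular; the rectangle model only serves to make the one-device/multi-device branching concrete.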
The image selection control unit 31, the device selection control unit 32, the selected image output instruction unit 33, the image generation control unit 51, and the generated image output instruction unit 52 constitute the main parts of the guidance support control device 100 b.
The hardware configuration of the control unit 15 including the guidance support control device 100b is the same as that described with reference to fig. 6 in embodiment 1, and therefore, illustration and description thereof are omitted. That is, the functions of the display control unit 21, the image selection control unit 31, the device selection control unit 32, the selected image output instructing unit 33, the image generation control unit 51, and the generated image output instructing unit 52 may be realized by the processor 41 and the memory 42, or may be realized by the dedicated processing circuit 44.
Next, a specific example of the editing screen S2 will be described with reference to fig. 13.
As shown in fig. 13, the editing screen S2 includes an image I1 corresponding to the top view of the guidance target space R. Information showing a top view of the guidance target space R (hereinafter referred to as "top view information") is used for displaying the image I1. The top view information is stored in advance in the memory 43 of the terminal device 3, for example.
The editing screen S2 includes a dot-like image I2 corresponding to the position P1 of the terminal device 3. Information showing the position P1 of the terminal apparatus 3 (hereinafter referred to as "terminal position information") is used for display of the image I2. The terminal position information is acquired by, for example, a GPS (Global Positioning System) receiver (not shown) provided in the terminal apparatus 3. Alternatively, for example, when the position P1 of the terminal device 3 in the guidance target space R is fixed, the terminal position information is stored in advance in the memory 43 of the terminal device 3.
Next, a specific example of generating an image will be described with reference to fig. 14 to 16. In other words, a specific example of the image generation control performed by the image generation control unit 51 will be described.
In the example shown in fig. 14, the guiding person draws an arrow-shaped image I3 by a touch operation on the editing screen S2 using the touch panel of the operation input unit 17. The image I3 shows a guidance route from the position P1 of the terminal device 3 to a guidance target point (for example, the 5th gate G5). The image generation control unit 51 includes the arrow-shaped image I3 in the generated image.
Thereby, the arrow-shaped image I3 is projected onto the floor in the guidance target space R. At this time, the projection position of the image I3 on the floor in the guidance target space R is set based on the position P1 of the terminal device 3 using the terminal position information.
Here, the generated image is not limited to the arrow-shaped image I3. For example, as shown in fig. 15, the generated image may include balloon-shaped images I4_1 and I4_2 in addition to the arrow-shaped image I3. The text included in the images I4_1 and I4_2 and the positions indicated by the images I4_1 and I4_2 may be set in accordance with the voice input to the microphone of the operation input unit 17.
That is, in the example shown in fig. 15, the guiding person draws the arrow-shaped image I3 by a touch operation on the editing screen S2 using the touch panel of the operation input unit 17.
Here, the arrow-shaped image I3 has two bent portions B1 and B2. While drawing the image I3 by the touch operation on the editing screen S2, the guiding person explains the guidance route to the guided person by speaking.
For example, when the drawing of the image I3 by the touch operation reaches the bent portion B1, the guiding person says "corner here" as an explanation. The spoken voice is input to the microphone. At this time, the image generation control unit 51 draws a balloon-shaped image I4_1 that includes text corresponding to the content of the utterance (i.e., the text "corner here") and indicates the touched position at that time (i.e., the position corresponding to the bent portion B1). The image generation control unit 51 includes the balloon-shaped image I4_1 in the generated image.
Next, when the drawing of the image I3 by the touch operation reaches the bent portion B2, the guiding person says "next corner" as an explanation. The spoken voice is input to the microphone. At this time, the image generation control unit 51 draws a balloon-shaped image I4_2 that includes text corresponding to the content of the utterance (i.e., the text "next corner") and indicates the touched position at that time (i.e., the position corresponding to the bent portion B2). The image generation control unit 51 includes the balloon-shaped image I4_2 in the generated image.
Next, drawing of the arrow-shaped image I3 based on the touch operation is completed. The image generation control unit 51 includes an arrow-shaped image I3 in the generated image.
Thereby, in addition to the arrow-shaped image I3, the balloon-shaped images I4_1 and I4_2 are projected onto the floor in the guidance target space R. By visually checking the projected images I3, I4_1, and I4_2, the guided person can easily recognize where the "corner here" and the "next corner" in the guiding person's explanation are located in the actual guidance target space R. That is, according to the example shown in fig. 15, even when the guiding person gives an explanation using demonstrative pronouns, the content of the explanation can be conveyed to the guided person accurately and easily.
Alternatively, for example, as shown in fig. 16, the generated image may include balloon-shaped images I5_1 to I5_3 in addition to the arrow-shaped image I3. The text included in the images I5_1 to I5_3 and the positions indicated by the images I5_1 to I5_3 may be set in accordance with the touch operation on the touch panel of the operation input unit 17.
For example, when starting to draw the arrow-shaped image I3, the guiding person touches the position corresponding to the image I2 on the editing screen S2. At this time, the image generation control unit 51 draws a balloon-shaped image I5_1 that includes text indicating that the touch is the first touch (for example, the text "FIRST") and indicates the touched position at that time (i.e., the position corresponding to the image I2). The image generation control unit 51 includes the balloon-shaped image I5_1 in the generated image.
Next, when the drawing of the image I3 by the touch operation reaches the bent portion B1, the guiding person temporarily interrupts the drawing of the image I3 and moves his or her finger away from the touch panel in order to give an explanation by speech, gesture, or the like. After that, the guiding person touches the touch panel again, and the drawing of the image I3 is resumed. At this time, the image generation control unit 51 draws a balloon-shaped image I5_2 that includes text indicating that the touch is the second touch (for example, the text "NEXT") and indicates the touched position at that time (i.e., the position corresponding to the bent portion B1). The image generation control unit 51 includes the balloon-shaped image I5_2 in the generated image.
Next, when the drawing of the image I3 by the touch operation reaches the midpoint between the bent portions B1 and B2, the guiding person again temporarily interrupts the drawing of the image I3 and moves his or her finger away from the touch panel in order to give an explanation by speech, gesture, or the like. After that, the guiding person touches the touch panel again, and the drawing of the image I3 is resumed. At this time, the image generation control unit 51 draws a balloon-shaped image I5_3 that includes text indicating that the touch is the third touch (for example, the text "THIRD") and indicates the touched position at that time (i.e., the position corresponding to the midpoint between the bent portions B1 and B2). The image generation control unit 51 includes the balloon-shaped image I5_3 in the generated image.
Next, drawing of the arrow-shaped image I3 based on the touch operation is completed. The image generation control unit 51 includes an arrow-shaped image I3 in the generated image.
Thus, in addition to the arrow-shaped image I3, the balloon-shaped images I5_1 to I5_3 are projected onto the floor in the guidance target space R. By visually checking the projected images I3 and I5_1 to I5_3, the guided person can easily recognize where the positions corresponding to the guiding person's explanations by speech, gesture, or the like are located in the actual guidance target space R. That is, according to the example shown in fig. 16, even when the guiding person gives an explanation by speech, gesture, or the like, the content of the explanation can be conveyed to the guided person accurately and easily.
Further, balloon-shaped images may be drawn based on the number of clicks made at positions on the editing screen S2. For example, when an arbitrary position on the editing screen S2 is clicked once, a balloon-shaped image including the text "FIRST" and indicating the clicked position may be drawn. Next, when another arbitrary position on the editing screen S2 is clicked twice in succession, a balloon-shaped image including the text "NEXT" and indicating the clicked position may be drawn. Next, when yet another arbitrary position on the editing screen S2 is clicked three times in succession, a balloon-shaped image including the text "THIRD" and indicating the clicked position may be drawn.
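The mapping from the ordinal of a touch (or click count) to the balloon text in the examples above can be sketched as a small helper. The function name, the dictionary shape of the annotation, and the numeric fallback for touches beyond the prepared labels are all assumptions made for this sketch:

```python
# Ordinal labels taken from the example above ("FIRST", "NEXT", "THIRD").
ORDINAL_LABELS = ["FIRST", "NEXT", "THIRD"]

def balloon_for_tap(tap_index, position):
    """Build a balloon annotation for the tap_index-th touch (0-based).

    Touches beyond the prepared labels fall back to a plain numeric
    ordinal so the sketch still produces distinct labels."""
    if tap_index < len(ORDINAL_LABELS):
        text = ORDINAL_LABELS[tap_index]
    else:
        text = f"No. {tap_index + 1}"
    return {"text": text, "position": position}
```

The image generation control unit would then include each returned annotation in the generated image at the touched position, as described for the images I5_1 to I5_3.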
Next, the operation of the guidance support control apparatus 100b will be described with reference to the flowchart of fig. 17, centering on the operations of the image generation control unit 51 and the generated image output instruction unit 52. When an operation for instructing the display of the editing screen S2 is input to the operation input unit 17 during the operation of the guidance support system 200, the display control unit 21 causes the display unit 16 to display the editing screen S2. Of the processes at steps ST11 and ST12 shown in fig. 17, at least the process at step ST11 is executed in a state where the editing screen S2 is displayed on the display unit 16.
First, in step ST11, the image generation control unit 51 performs control for generating an image to be output by the output device group 1, that is, an image to be projected by the output device group 1, in accordance with an operation input to the operation input unit 17 by the guiding person. A specific example of the image generation control by the image generation control unit 51 has already been described with reference to fig. 14 to 16, and therefore description thereof is omitted.
Next, in step ST12, the generated image output instruction unit 52 instructs the output device group 1 to output the generated image, that is, to project the generated image. A specific example of the instruction by the generated image output instruction unit 52 is as described above, and therefore description thereof is omitted.
In this way, according to the guidance support control device 100b, when the guiding person guides the guided person, the image to be projected by the output device group 1 can be generated in real time. This can further improve the degree of freedom in guiding the guided person. In addition, as described with reference to fig. 15, even when the guiding person gives an explanation using demonstrative pronouns, the content of the explanation can be conveyed to the guided person accurately and easily. Further, as described with reference to fig. 16, even when the guiding person gives an explanation by speech, gesture, or the like, the content of the explanation can be conveyed to the guided person accurately and easily.
The generated image is not limited to the specific examples shown in fig. 14 to 16. The guiding person can draw an arbitrary image on the editing screen S2 by touch operation or voice input. For example, an image of any shape may be used, such as an image indicating only the corners in the guidance route at which a turn is required, an image highlighting the final destination, or an image showing a character moving along the guidance route. The image generation control unit 51 includes the arbitrary image thus drawn in the generated image.
The generated image output instruction unit 52 is not limited to instructing the plurality of output devices 2 forming the output device group 1 to output the generated image, that is, to project the generated image, simultaneously. For example, the generated image output instruction unit 52 may instruct the plurality of output devices 2 forming the output device group 1 to project the generated image with a time difference. In this way, the output devices 2 shift the timing at which the generated image is displayed, so that motion is produced in the image projected by the output device group 1 as a whole and the image is projected as a moving image.
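The time-difference projection described above amounts to assigning each output device a staggered start time. A minimal sketch, with hypothetical device ids and a fixed per-device offset (the embodiment does not prescribe how the time difference is chosen):

```python
def staggered_schedule(device_ids, start=0.0, interval=0.5):
    """Assign each output device a projection start time offset by
    `interval` seconds from the previous one, so the projected image
    appears to sweep across the output device group as a moving image."""
    return {dev_id: start + i * interval
            for i, dev_id in enumerate(device_ids)}
```

For example, scheduling three devices at 0.5 s intervals yields start times 0.0, 0.5, and 1.0 s, producing the sweeping motion described above.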
The text of the balloon-shaped images I4_1 and I4_2 is not limited to text input by voice through the microphone of the operation input unit 17. For example, the operation input unit 17 may include a keyboard, and the guiding person may input the text of the balloon-shaped images I4_1 and I4_2 by keyboard input. Alternatively, balloon-shaped images may be stored in advance in the memory 43 and displayed as preset images in an area (not shown) at the edge of the editing screen S2, and the guiding person may select a desired preset image and arrange it on the image I1.
In addition, the guidance support system 200 can employ various modifications similar to the modifications described in embodiments 1 and 2. The guidance support control device 100b can employ various modifications similar to the modifications described in embodiments 1 and 2.
As described above, the guidance support control device 100b according to embodiment 3 includes: the image generation control unit 51 that performs control for generating an image to be projected by the output device 2 in accordance with an operation input by the guiding person during operation of the guidance support system 200; and the generated image output instruction unit 52 that instructs the output device 2 to project the generated image generated by the image generation control unit 51. This can further improve the degree of freedom in guiding the guided person. In addition, even when the guiding person gives an explanation using demonstrative pronouns, the content of the explanation can be conveyed to the guided person accurately and easily. Further, even when the guiding person gives an explanation by speech, gesture, or the like, the content of the explanation can be conveyed to the guided person accurately and easily.
Embodiment 4.
Fig. 18 is a block diagram showing a main part of the guidance support control device according to embodiment 4. The guidance support control device 100c according to embodiment 4 will be described with reference to fig. 18.
The system configuration of the guidance support system 200 is the same as the system configuration described with reference to fig. 1 in embodiment 1, and therefore, illustration and description thereof are omitted. Note that the configuration of each output device 2 is the same as that described with reference to fig. 2 in embodiment 1, and therefore, illustration and description thereof are omitted. Note that the configuration of the terminal device 3 is the same as that described with reference to fig. 3 in embodiment 1, and therefore, illustration and description thereof are omitted. In fig. 18, the same blocks as those shown in fig. 12 are denoted by the same reference numerals, and description thereof is omitted.
The image registration control unit 53 performs control for registering, when an image is generated by the image generation control unit 51, the generated image as a new preset image. More specifically, the image registration control unit 53 performs control for additionally storing image data representing the generated image (hereinafter referred to as "generated image data") in the image data storage unit 22 as new preset image data.
Therefore, in the guidance support control device 100c, when the image selection control unit 31 performs control for selecting one preset image out of the plurality of preset images, past generated images can be included in the plurality of preset images. Hereinafter, among the plurality of preset images, a preset image corresponding to preset image data stored in advance in the image data storage unit 22 is referred to as an "initial preset image". Among the plurality of preset images, a preset image corresponding to preset image data additionally stored in the image data storage unit 22 (i.e., generated image data) is referred to as an "additional preset image".
When the selected preset image is an initial preset image, the selected image output instructing unit 33a instructs the selected output device 2 to project the selected preset image. In this case, the instruction by the selected image output instructing unit 33a is the same as the instruction by the selected image output instructing unit 33, and therefore detailed description thereof is omitted.
On the other hand, when the selected preset image is an additional preset image, the selected image output instructing unit 33a instructs the output device group 1 to project the selected preset image. In this case, the instruction by the selected image output instructing unit 33a is the same as the instruction by the generated image output instruction unit 52, and therefore detailed description thereof is omitted.
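The distinction above, i.e. initial presets go to the selected output device while additional (runtime-registered) presets go to the whole output device group, can be sketched as a small registry. The class name, the string destinations, and the image-id keys are assumptions made for this sketch:

```python
class PresetImageStore:
    """Preset-image registry distinguishing initial presets (stored in
    advance in the image data storage unit) from additional presets
    registered at runtime by the image registration control."""

    def __init__(self, initial):
        self._initial = dict(initial)   # image id -> preset image data
        self._additional = {}           # image id -> generated image data

    def register_generated(self, image_id, image_data):
        """Store a generated image as a new (additional) preset image."""
        self._additional[image_id] = image_data

    def is_additional(self, image_id):
        return image_id in self._additional

    def projection_target(self, image_id):
        """Initial presets are projected by the selected output device;
        additional presets are projected by the output device group."""
        return "device_group" if self.is_additional(image_id) else "selected_device"
```

A short usage example: registering a drawn route and checking where each kind of preset is dispatched.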
The image selection control unit 31, the device selection control unit 32, the selected image output instructing unit 33a, the image generation control unit 51, the generated image output instructing unit 52, and the image registration control unit 53 constitute the main parts of the guidance support control device 100 c.
The hardware configuration of the control unit 15 including the guidance support control device 100c is the same as that described with reference to fig. 6 in embodiment 1, and therefore, illustration and description thereof are omitted. That is, the functions of the display control unit 21, the image selection control unit 31, the device selection control unit 32, the selected image output instructing unit 33a, the image generation control unit 51, the generated image output instructing unit 52, and the image registration control unit 53 may be realized by the processor 41 and the memory 42, or may be realized by the dedicated processing circuit 44.
Next, referring to the flowchart of fig. 19, the operation of the guidance support control apparatus 100c will be described centering on the operations of the image generation control unit 51, the generated image output instruction unit 52, and the image registration control unit 53. In fig. 19, the same steps as those shown in fig. 17 are denoted by the same reference numerals, and description thereof is omitted.
First, the processing in steps ST11 and ST12 is executed.
Next, in step ST13, the image registration control section 53 performs control of registering the generated image in step ST11 as an additional preset image. More specifically, the image registration control unit 53 performs control of additionally storing generated image data representing the generated image in the image data storage unit 22 as new preset image data. The processing in steps ST12 and ST13 may be executed in parallel with each other. Alternatively, the process of step ST13 may be executed first, and then the process of step ST12 may be executed.
Next, referring to the flowchart of fig. 20, the operation of the guidance support control device 100c will be described centering on the operations of the image selection control unit 31, the device selection control unit 32, and the selected image output instruction unit 33 a. In fig. 20, the same steps as those shown in fig. 7 are denoted by the same reference numerals, and description thereof is omitted.
First, the processing of step ST1 is executed. When the preset image selected in step ST1 is an initial preset image (NO in step ST21), the processing of steps ST2 and ST3 is executed.
On the other hand, when the preset image selected in step ST1 is an additional preset image (YES in step ST21), in step ST22 the selected image output instructing unit 33a instructs the output device group 1 to project the selected preset image. The instruction by the selected image output instructing unit 33a in step ST22 is the same as the instruction by the generated image output instruction unit 52 in step ST12, and therefore detailed description thereof is omitted.
In addition, when the generated image is composed of a plurality of images, the generated image data stored in the image data storage unit 22 may indicate only a part of the plurality of images. More specifically, the generated image data stored in the image data storage unit 22 may be only an image with high versatility of guidance among the plurality of images.
For example, the generated image in the example shown in fig. 15 includes the balloon-shaped images I4_1 and I4_2 in addition to the arrow-shaped image I3. The arrow-shaped image I3 is an image that can be commonly used for guidance from the position P1 of the terminal device 3 to a guidance target point (for example, the 5th gate G5). On the other hand, the balloon-shaped images I4_1 and I4_2 correspond to the guide's spoken words at the time of drawing and are unlikely to be used for guidance the next time and thereafter. Accordingly, the image registration control unit 53 causes the image data storage unit 22 to store generated image data representing only the arrow-shaped image I3 of the generated image. Thereby, the arrow-shaped image I3 is registered as an additional preset image. The same applies to the example shown in fig. 16.
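The selective registration described above can be illustrated by a minimal sketch; the component records and the `reusable` flag are hypothetical and stand in for whatever metadata distinguishes the commonly usable arrow-shaped image I3 from the one-off balloon-shaped images I4_1 and I4_2:

```python
# Minimal sketch: keep only the generally reusable components of a generated
# image when registering it as an additional preset image. The component
# records and the "reusable" flag are illustrative assumptions; the balloon
# images tied to the guide's spoken words are dropped.

def extract_registrable_components(components):
    """Return only the components worth reusing in later guidance sessions."""
    return [c for c in components if c["reusable"]]

generated_image = [
    {"id": "I3", "shape": "arrow", "reusable": True},       # route to gate G5
    {"id": "I4_1", "shape": "balloon", "reusable": False},  # one-off remark
    {"id": "I4_2", "shape": "balloon", "reusable": False},  # one-off remark
]

additional_preset = extract_registrable_components(generated_image)
```

Only the arrow-shaped component survives and becomes the stored preset image data.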
The image registration control unit 53 may execute control to delete the generated image data stored in the image data storage unit 22, that is, the preset image data representing the additional preset images, each time a predetermined period ΔT2 (for example, 1 day) elapses. Alternatively, the image registration control unit 53 may execute this control each time the logout process is executed in the terminal device 3. This avoids an excessive increase in the number of preset image data items stored in the image data storage unit 22.
Note that, each time a predetermined period (for example, 1 day) elapses, the image registration control unit 53 may instead execute control to exclude the additional preset images registered before that point from the display objects on the selection screen S1. Alternatively, the image registration control unit 53 may execute this control each time the logout process is executed in the terminal device 3. This avoids an excessive increase in the number of preset images displayed on the selection screen S1, more specifically, an excessive increase in the number of thumbnail images displayed in the area A1.
Although this control excludes the additional preset image from the display objects, control may also be performed to add the additional preset image to the display objects again in response to a request from the guide, while the image remains registered in the image data storage unit 22.
Here, the value of the predetermined period ΔT2 may be set freely by an operation input to the operation input unit 17. Alternatively, the predetermined period ΔT2 may be set based on external information, for example, calendar information. For example, when the guidance target space R is an airport lobby, information indicating consecutive holidays such as the Obon period or the New Year period is acquired as calendar information, and the consecutive holiday period is set as the predetermined period ΔT2. This allows the guide to keep using the same additional preset images throughout a period in which an increase in airport users is expected.
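The deletion, exclusion, and re-display controls described above can be sketched as follows; the registry structure, field names, and the fixed one-day DELTA_T2 are illustrative assumptions, not the actual implementation of the image registration control unit 53:

```python
# Sketch of the lifecycle of additional preset images: entries older than the
# predetermined period DELTA_T2 can either be deleted outright or merely
# hidden from the selection screen S1 while remaining registered, so that a
# guide can later re-display them on request. All names are illustrative.

DELTA_T2 = 24 * 60 * 60  # predetermined period: one day, in seconds

class PresetRegistry:
    def __init__(self):
        self._entries = {}  # image_id -> {"registered_at": ..., "visible": ...}

    def register(self, image_id, now):
        self._entries[image_id] = {"registered_at": now, "visible": True}

    def purge_expired(self, now):
        """Delete entries registered more than DELTA_T2 ago."""
        self._entries = {k: v for k, v in self._entries.items()
                         if now - v["registered_at"] <= DELTA_T2}

    def hide_expired(self, now):
        """Keep expired entries registered but exclude them from screen S1."""
        for v in self._entries.values():
            if now - v["registered_at"] > DELTA_T2:
                v["visible"] = False

    def redisplay(self, image_id):
        """Re-add a hidden image to the display objects on guide request."""
        self._entries[image_id]["visible"] = True

    def visible_ids(self):
        return [k for k, v in self._entries.items() if v["visible"]]

reg = PresetRegistry()
reg.register("arrow_to_G5", now=0)
reg.register("arrow_to_G3", now=100_000)
reg.hide_expired(now=100_000)  # "arrow_to_G5" is now older than one day
```

Hiding rather than deleting is what allows the re-display control of the preceding paragraph without re-registering the image data.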
Next, another guidance support control apparatus 100c according to embodiment 4 will be described with reference to fig. 21.
In the example shown in fig. 21, the image selection control unit 31a has a function of setting a display priority on the selection screen S1 for each of the plurality of preset images. The display priority is represented by, for example, two values, high and low. Hereinafter, a preset image whose display priority is set to the high value among the plurality of preset images is referred to as a "recommended preset image". A preset image whose display priority is set to the low value among the plurality of preset images is referred to as a "non-recommended preset image".
The image selection control unit 31a has a function of including the recommended preset images in the display objects on the selection screen S1 and excluding the non-recommended preset images from the display objects on the selection screen S1. Alternatively, the image selection control unit 31a has a function of including, in the selection screen S1, an area (not shown) that is separate from the area A1 and displays thumbnail images corresponding to the recommended preset images.
That is, setting the display priority for each preset image amounts to determining whether each preset image is a recommended preset image. This determination uses, for example, a database containing the following information (hereinafter referred to as the "recommendation database"). The recommendation database is stored in a recommendation database storage unit (hereinafter referred to as the "recommendation DB storage unit") 23. The function of the recommendation DB storage unit 23 is realized by, for example, the memory 43 of the terminal device 3.
First, the recommendation database includes, for each position P1 of the terminal device 3 and for each guidance target point, information indicating an initially set guidance route (hereinafter referred to as "initial setting information"). When the guidance target space R is an airport lobby, for example, the initially set guidance route is set based on the structure of the airport. The image selection control unit 31a includes the preset images representing the initially set guidance routes among the recommended preset images, for example, until a predetermined period elapses after the guidance support system 200 is introduced.
Second, the recommendation database includes, for each position P1 of the terminal device 3 and for each content of a generated image, information indicating a history of the number of times the generated image has been registered as an additional preset image, that is, the number of times its generated image data has been stored in the image data storage unit 22 (hereinafter referred to as "registration history information"). The image selection control unit 31a includes, among the recommended preset images, for example, the additional preset images whose registration count indicated by the registration history information is a predetermined number (for example, 100) or more.
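The registration-history criterion can be sketched as follows; the keying by (terminal position, image content) and the threshold follow the text, while the data structures themselves are illustrative assumptions:

```python
from collections import Counter

# Sketch of the registration-history criterion: an additional preset image is
# promoted to a recommended preset image once the number of times the same
# content has been registered for the same terminal position reaches a
# threshold. Keys and values are illustrative.

RECOMMEND_THRESHOLD = 100  # the "predetermined number of times" in the text

registration_history = Counter()  # (position, content) -> registration count

def record_registration(position, content):
    registration_history[(position, content)] += 1

def is_recommended(position, content):
    return registration_history[(position, content)] >= RECOMMEND_THRESHOLD

# Example: the same arrow to gate G5 registered repeatedly from position P1,
# versus a balloon note registered only once.
for _ in range(100):
    record_registration("P1", "arrow_to_G5")
record_registration("P1", "balloon_note")
```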
Third, the recommendation database contains information indicating schedules in the guidance target space R (hereinafter referred to as "schedule information"). For example, when the guidance target space R is an airport lobby, the schedule information indicates the departure and arrival schedules of the aircraft of each airline at the airport. Alternatively, when the guidance target space R is inside a station, the schedule information indicates the departure and arrival schedules of the trains at the station. Alternatively, when the guidance target space R is a department store, the schedule information indicates the seasonal event schedules at the department store. The image selection control unit 31a determines whether each preset image is a recommended preset image based on these schedules. For example, a recommended preset image may be set in advance for each schedule, or the guide may register information related to a schedule together with the new registration of generated image data, and the corresponding schedule may be identified based on that information at the time of determination.
For example, when the guidance target space R is a department store, the demand of guides for an additional preset image registered during a year-end event period is considered to decrease after the new year and to increase again during the year-end event period of the following year. The same applies to additional preset images registered during the Obon event period. By using the schedule information, additional preset images for which guide demand is high can thus be included among the recommended preset images in accordance with the currently held event.
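One possible realization of the schedule-based determination, assuming each additional preset image carries a schedule tag recorded by the guide at registration time (the tags and record layout are hypothetical):

```python
# Sketch of the schedule criterion: each additional preset image carries a
# schedule tag recorded together with its registration, and images whose tag
# matches the currently held event are promoted to recommended preset images
# again the next time the event is held. Tags and fields are illustrative.

presets = [
    {"id": "year_end_sale_route", "schedule_tag": "year_end_event"},
    {"id": "obon_event_route", "schedule_tag": "obon_event"},
    {"id": "plain_route", "schedule_tag": None},  # no schedule association
]

def recommended_for_event(presets, current_event):
    """Return the ids of presets tagged with the currently held event."""
    return [p["id"] for p in presets if p["schedule_tag"] == current_event]
```

A preset hidden for most of the year thus resurfaces automatically when its associated event recurs.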
The information included in the recommendation database is not limited to the above specific examples. The image selection control unit 31a can use various kinds of information to determine whether each preset image is a recommended preset image.
In addition, the guidance support system 200 can employ various modifications similar to the modifications described in embodiments 1 to 3. The guidance support control device 100c can employ various modifications similar to those described in embodiments 1 to 3.
As described above, the guidance support control device 100c according to embodiment 4 includes the image registration control unit 53 that performs control to include the generated image in the plurality of preset images, and the plurality of preset images include the additional preset image corresponding to the generated image. This enables the generated image to be used for guidance next time or later.
Embodiment 5.
Fig. 22 is a block diagram showing a main part of the guidance support control device according to embodiment 5. The guidance support control apparatus 100d according to embodiment 5 will be described with reference to fig. 22.
The system configuration of the guidance support system 200 is the same as the system configuration described with reference to fig. 1 in embodiment 1, and therefore, illustration and description thereof are omitted. Note that the configuration of each output device 2 is the same as that described with reference to fig. 2 in embodiment 1, and therefore, illustration and description thereof are omitted. Note that the configuration of the terminal device 3 is the same as that described with reference to fig. 3 in embodiment 1, and therefore, illustration and description thereof are omitted. In fig. 22, the same blocks as those shown in fig. 18 are denoted by the same reference numerals, and description thereof is omitted.
In the guidance support control apparatuses 100b and 100c, the guide can draw an arbitrary image. The guidance support control device 100c can therefore register the drawn arbitrary image (i.e., the generated image) as an additional preset image and use it for guidance the next time and thereafter.
Here, when the position P1 of the terminal device 3 in the guidance target space R is not fixed, the position P1 of the terminal device 3 when an additional preset image corresponding to a generated image is projected (i.e., at the second and subsequent projections) may differ from the position P1 of the terminal device 3 when the generated image was projected (i.e., at the first projection). As a result, depending on the position P1 of the terminal device 3, the projection position of the additional preset image corresponding to the generated image (see fig. 23B) may be offset from the projection position of the generated image (see fig. 23A).
In contrast, the guidance support control device 100d prevents this offset from occurring by constraining how the image to be projected is drawn.
The image generation control unit 51a performs control to generate, during operation of the guidance support system 200, an image to be output by the output device group 1, that is, an image to be projected by the output device group 1, in accordance with an operation input to the operation input unit 17 by the guide. The image generation control by the image generation control unit 51a is executed, for example, in a state where the display control unit 21 displays the editing screen S3 on the display unit 16.
Fig. 24 shows a specific example of the editing screen S3. As shown in fig. 24, the editing screen S3 includes an image I1 corresponding to a top view of the guidance target space R and a dot-shaped image I2 corresponding to the position P1 of the terminal device 3. The editing screen S3 also includes a plurality of dot-shaped images (hereinafter referred to as "auxiliary points") I6 arranged in the image I1. The plurality of auxiliary points I6 correspond to a plurality of positions P2 fixed in the guidance target space R, respectively.
To display each of the plurality of auxiliary points I6, information indicating the corresponding position P2 in the guidance target space R (hereinafter referred to as "auxiliary point position information") is used. The auxiliary point position information is stored in advance in, for example, the memory 43 of the terminal device 3.
The editing screen S3 also includes an icon image I7 indicating the shape of the image to be drawn. In the example shown in fig. 24, the arrow-shaped image I3 is the drawing target, and therefore the icon image I7 shows an arrow shape.
In a state where the editing screen S3 is displayed on the display unit 16, the guide selects 1 or more of the plurality of auxiliary points I6 by a touch operation on the touch panel constituting the operation input unit 17. The image generation control unit 51a draws an arrow-shaped image I3 that passes through the selected auxiliary points (hereinafter referred to as "selected assist points") I6, the arrow-shaped image I3 corresponding to a guidance route from the position P1 of the terminal device 3 to a guidance target point (for example, the 5th gate G5). Fig. 25 shows an example of the arrow-shaped image I3. The image generation control unit 51a includes the arrow-shaped image I3 in the generated image.
Thereby, the arrow-shaped image I3 is projected onto the floor of the guidance target space R. At this time, the projection position of the image I3 on the floor of the guidance target space R is set based on the position P1 of the terminal device 3 and the positions P2 corresponding to the selected assist points I6, using the terminal position information and the auxiliary point position information.
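How the projection path of the arrow-shaped image I3 might be assembled from the terminal position information and the auxiliary point position information can be sketched as follows; the coordinates and identifiers are illustrative:

```python
# Sketch of assembling the projection path of the arrow-shaped image I3:
# the start point is the terminal position P1 (terminal position information)
# and the remaining vertices are the fixed positions P2 of the selected
# assist points (auxiliary point position information). All values are
# illustrative.

ASSIST_POINT_POSITIONS = {  # auxiliary point id -> fixed position P2
    "A1": (2.0, 3.0),
    "A2": (5.0, 3.0),
    "A3": (5.0, 8.0),
}

def arrow_path(terminal_position, selected_assist_points):
    """Polyline from P1 through the selected assist points, in order."""
    return [terminal_position] + [ASSIST_POINT_POSITIONS[a]
                                  for a in selected_assist_points]

path = arrow_path(terminal_position=(0.0, 0.0),
                  selected_assist_points=["A1", "A2", "A3"])
```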
When the image generation control unit 51a generates an image, the image registration control unit 53a performs control to register the generated image as a new preset image. More specifically, the image registration control unit 53a performs control to store, in the image data storage unit 22, information indicating the shape of the generated image, that is, the shape of the icon image I7 used when the generated image was drawn (hereinafter referred to as "image shape information"). In the present embodiment, the image shape information indicates an arrow, which is the shape of the image I3. The image registration control unit 53a also stores, in the image data storage unit 22, information indicating the auxiliary points I6 through which the generated image passes, that is, the selected assist points I6 used when the generated image was drawn (hereinafter referred to as "selection assist point information"). The guidance support control device 100d generates image data from the image shape information and the selection assist point information.
When the selected preset image is the initial preset image, the selected image output instructing unit 33b instructs the selected output device 2 to project the selected preset image. In this case, the instruction of the selected image output instructing unit 33b is the same as that of the selected image output instructing unit 33, and therefore detailed description thereof is omitted.
On the other hand, when the selected preset image is an additional preset image, the selected image output instructing unit 33b instructs the output device group 1 to project the selected preset image. At this time, the selected image output instructing unit 33b instructs the output device group 1, based on the terminal position information, to project the selected preset image corrected in accordance with the position P1 of the terminal device 3.
Fig. 26 shows an example of the selected preset image corrected in accordance with the position P1 of the terminal device 3 when the selected preset image is the additional preset image corresponding to the generated image shown in fig. 25. The arrow-shaped image I3' shown in fig. 26 is obtained from the arrow-shaped image I3 shown in fig. 25 by correcting the start point of the arrow in accordance with the position P1 of the terminal device 3.
Here, the projection position of the image I3' on the floor of the guidance target space R is set based on the position P1 of the terminal device 3 and the positions P2 corresponding to the selected assist points I6, using the terminal position information and the auxiliary point position information. This avoids the occurrence of the offset described above.
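The start-point correction can be sketched as follows; since the remaining vertices are anchored at the fixed positions P2 of the selected assist points, only the first vertex needs to be replaced by the current position P1 (coordinates illustrative):

```python
# Sketch of the correction applied when an additional preset image is
# projected again from a different terminal position: only the start point of
# the stored polyline is replaced by the current position P1, while the
# vertices anchored at fixed assist-point positions P2 stay where they are,
# so the projected route does not drift. Coordinates are illustrative.

def correct_start_point(stored_path, current_terminal_position):
    """Replace the first vertex (the old P1) with the current P1."""
    return [current_terminal_position] + stored_path[1:]

stored_path = [(0.0, 0.0), (2.0, 3.0), (5.0, 3.0), (5.0, 8.0)]  # image I3
corrected = correct_start_point(stored_path, (1.0, -1.0))       # image I3'
```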
The image selection control unit 31, the device selection control unit 32, the selected image output instructing unit 33b, the image generation control unit 51a, the generated image output instructing unit 52, and the image registration control unit 53a constitute the main parts of the guidance support control device 100 d.
The hardware configuration of the control unit 15 including the guidance support control device 100d is the same as that described with reference to fig. 6 in embodiment 1, and therefore, illustration and description thereof are omitted. That is, the functions of the display control unit 21, the image selection control unit 31, the device selection control unit 32, the selected image output instructing unit 33b, the image generation control unit 51a, the generated image output instructing unit 52, and the image registration control unit 53a may be realized by the processor 41 and the memory 42, or may be realized by the dedicated processing circuit 44.
Next, referring to the flowchart of fig. 27, the operation of the guidance support control apparatus 100d will be described centering on the operations of the image generation control unit 51a, the generated image output instruction unit 52, and the image registration control unit 53 a. In fig. 27, the same steps as those shown in fig. 19 are denoted by the same reference numerals, and description thereof is omitted.
First, in step ST11a, the image generation control unit 51a performs control to generate an image to be output by the output device group 1, that is, an image to be projected by the output device group 1, in accordance with an operation input to the operation input unit 17 by the guide. A specific example of the image generation control by the image generation control unit 51a has already been described with reference to figs. 24 and 25, and therefore a description thereof is omitted.
Subsequently, the process of step ST12 is executed.
Next, in step ST13a, the image registration control unit 53a performs control to register the image generated in step ST11a as an additional preset image. More specifically, the image registration control unit 53a performs control to store the image shape information and the selection assist point information on the generated image in the image data storage unit 22. The processes of steps ST12 and ST13a may be executed in parallel with each other. Alternatively, the process of step ST13a may be executed first, followed by the process of step ST12.
Next, referring to the flowchart of fig. 28, the operation of the guidance support control apparatus 100d will be described centering on the operations of the image selection control unit 31, the apparatus selection control unit 32, and the selected image output instruction unit 33 b. In fig. 28, the same steps as those shown in fig. 20 are denoted by the same reference numerals, and description thereof is omitted.
First, the process of step ST1 is executed. If the selected preset image is the initial preset image in step ST1 (no in step ST21), the processes of steps ST2 and ST3 are executed.
On the other hand, when the selected preset image is the additional preset image in step ST1 (yes in step ST21), the selected image output instructing unit 33b instructs the output device group 1 to project the selected preset image in step ST22a. At this time, the selected image output instructing unit 33b instructs the output device group 1, based on the terminal position information, to project the selected preset image corrected in accordance with the position P1 of the terminal device 3.
In addition, the shape of the generated image is not limited to the arrow. The generated image may have any shape as long as it includes a linear image passing through 1 or more auxiliary points I6.
In addition, the guidance support system 200 can employ various modifications similar to the modifications described in embodiments 1 to 4. The guidance support control device 100d can employ various modifications similar to those described in embodiments 1 to 4.
As described above, in the guidance support control device 100d according to embodiment 5, the generated image includes a linear image passing through 1 or more auxiliary points I6 among the plurality of auxiliary points I6, and the plurality of auxiliary points I6 correspond to the plurality of positions P2 in the guidance target space R, respectively. Thus, when the preset image is selected as the additional preset image, the occurrence of the offset can be avoided.
Embodiment 6.
Fig. 29 is a block diagram showing a main part of the guidance support control apparatus according to embodiment 6. The guidance support control device 100e according to embodiment 6 will be described with reference to fig. 29.
The system configuration of the guidance support system 200 is the same as the system configuration described with reference to fig. 1 in embodiment 1, and therefore, illustration and description thereof are omitted. Note that the configuration of each output device 2 is the same as that described with reference to fig. 2 in embodiment 1, and therefore, illustration and description thereof are omitted. Note that the configuration of the terminal device 3 is the same as that described with reference to fig. 3 in embodiment 1, and therefore, illustration and description thereof are omitted. In fig. 29, the same blocks as those shown in fig. 22 are denoted by the same reference numerals, and description thereof is omitted.
The spatial state information acquisition unit 61 acquires information indicating the state in the guidance target space R (hereinafter referred to as "spatial state information") using sensors (not shown) provided in the guidance target space R.
Specifically, for example, 1 or more human motion sensors are provided in the guidance target space R. In this case, the spatial state information includes information indicating the distribution of the degree of congestion in the guidance target space R. Fig. 30 shows an example of the congestion degree distribution indicated by the spatial state information.
When the image generation control unit 51a generates an image, the image correction control unit 62 determines whether or not the generated image needs to be corrected, using the spatial state information. When it is determined that the generated image needs to be corrected, the image correction control unit 62 performs control for correcting the generated image using the spatial state information.
When the image selection control unit 31 selects a preset image and the selected preset image is an additional preset image, the image correction control unit 62 uses the spatial state information to determine whether the selected preset image needs to be corrected. When it determines that the selected preset image needs to be corrected, the image correction control unit 62 performs control to correct the selected preset image using the spatial state information.
For example, as shown in fig. 31, the generated image includes the arrow-shaped image I3, and a part of the image I3 passes through an area with a high degree of congestion. In this case, the image correction control unit 62 determines that the generated image needs to be corrected and corrects the generated image so as to avoid the highly congested area. Fig. 32 shows an example of the generated image corrected by the image correction control unit 62. As shown in fig. 32, the arrow-shaped image I3" avoids the highly congested area. A specific example of the correction of the selected preset image is the same as that of the generated image, and therefore illustration and description thereof are omitted.
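The congestion-avoiding correction can be sketched, under the simplifying assumption that the guidance target space R is discretized into a small grid of congestion-degree cells, as a shortest-path search that never enters a cell above a threshold (the grid, values, and threshold are illustrative):

```python
from collections import deque

# Sketch of the congestion-avoiding correction: the guidance target space is
# modelled as a small grid whose cells carry a congestion degree, and the
# corrected route is the shortest 4-neighbour path that never enters a cell
# above a threshold. The grid, values, and threshold are illustrative; the
# device works on the distribution carried by the spatial state information.

CONGESTION = [
    [0, 0, 9, 0, 0],
    [0, 0, 9, 0, 0],
    [0, 0, 0, 0, 0],
]
THRESHOLD = 5  # cells with congestion above this value are avoided

def corrected_route(start, goal):
    """Breadth-first search avoiding congested cells; returns a cell path."""
    rows, cols = len(CONGESTION), len(CONGESTION[0])
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and CONGESTION[nr][nc] <= THRESHOLD
                    and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(path + [(nr, nc)])
    return None  # no admissible route exists

route = corrected_route((0, 0), (0, 4))
```

The congested column forces the route down through row 2, mirroring how the image I3" detours around the highly congested area in fig. 32.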
Hereinafter, the generated image corrected by the image correction control unit 62 and the selected preset image corrected by the image correction control unit 62 are collectively referred to as "corrected images".
The corrected image output instruction unit 63 instructs the output device group 1 to output the corrected image, that is, to project the corrected image. The instruction of the corrected image output instructing unit 63 is the same as the instruction of the generated image output instructing unit 52 or the instruction of the selected image output instructing unit 33b in the case where the selected preset image is the additional preset image, and therefore, detailed description thereof is omitted.
The image selection control unit 31, the device selection control unit 32, the selected image output instructing unit 33b, the image generation control unit 51a, the generated image output instructing unit 52, and the image registration control unit 53a constitute a1 st guidance support control unit 71. The spatial state information acquisition unit 61, the image correction control unit 62, and the corrected image output instruction unit 63 constitute a2 nd guidance support control unit 72. The 1 st guidance support control unit 71 and the 2 nd guidance support control unit 72 constitute the main part of the guidance support control device 100 e.
The hardware configuration of the control unit 15 including the guidance support control device 100e is the same as that described with reference to fig. 6 in embodiment 1, and therefore, illustration and description thereof are omitted. That is, the functions of the display control unit 21, the image selection control unit 31, the device selection control unit 32, the selected image output instructing unit 33b, the image generation control unit 51a, the generated image output instructing unit 52, the image registration control unit 53a, the spatial state information acquiring unit 61, the image correction control unit 62, and the corrected image output instructing unit 63 may be realized by the processor 41 and the memory 42, or may be realized by the dedicated processing circuit 44.
Next, with reference to the flowchart of fig. 33, the operation of the guidance support control apparatus 100e will be described centering on the operation of the image correction control unit 62 for correcting the generated image. In fig. 33, the same steps as those shown in fig. 27 are denoted by the same reference numerals, and description thereof is omitted.
First, the process of step ST11a is executed.
Next, in step ST31, the spatial state information acquisition unit 61 acquires the spatial state information. Next, in step ST32, the image correction control unit 62 uses the acquired spatial state information to determine whether the image generated in step ST11a needs to be corrected. A specific example of the determination by the image correction control unit 62 is as described above, and a description thereof is therefore omitted. If it is determined that the generated image does not need to be corrected (no in step ST32), the processes of steps ST12 and ST13a are executed.
On the other hand, when it is determined that the generated image needs to be corrected (yes in step ST32), the image correction control unit 62 performs, in step ST33, control to correct the generated image using the acquired spatial state information. A specific example of the correction by the image correction control unit 62 is as described above, and a description thereof is therefore omitted. Next, in step ST34, the corrected image output instruction unit 63 instructs the output device group 1 to project the image corrected in step ST33. Subsequently, the process of step ST13a is executed. Although the generated image is registered here as the additional preset image, the corrected image may be registered instead. In this case, a guidance route that avoids habitually congested areas can be registered as the additional preset image.
Next, with reference to the flowchart of fig. 34, the operation of the guidance support control apparatus 100e will be described centering on the operation of the image correction control unit 62 for correcting the selected preset image when the selected preset image is the additional preset image. In fig. 34, the same steps as those shown in fig. 28 are denoted by the same reference numerals, and description thereof is omitted.
First, the process of step ST1 is executed. In the case where the selected preset image is the initial preset image in step ST1 (no in step ST 21), the processes in steps ST2, ST3 are performed.
On the other hand, when the selected preset image is the additional preset image in step ST1 (yes in step ST 21), the spatial state information acquisition unit 61 acquires the spatial state information in step ST31 a. Next, in step ST32a, the image correction control unit 62 determines whether or not correction is necessary for the selected preset image in step ST1, using the acquired spatial state information. Since a specific example of the determination by the image correction control unit 62 is as described above, a description thereof will be omitted. If it is determined that the selected preset image does not need to be corrected (no in step ST32 a), the process of step ST22a is executed.
On the other hand, when it is determined that the selected preset image needs to be corrected (yes in step ST32a), the image correction control unit 62 performs, in step ST33a, control to correct the selected preset image using the acquired spatial state information. A specific example of the correction by the image correction control unit 62 is as described above, and a description thereof is therefore omitted. Next, in step ST34a, the corrected image output instruction unit 63 instructs the output device group 1 to project the image corrected in step ST33a.
The sensors for acquiring the spatial state information are not limited to 1 or more human motion sensors. The spatial state information is not limited to information indicating the distribution of the degree of congestion in the guidance target space R.
For example, 1 or more smoke sensors may be provided in the guidance target space R. In this case, the spatial state information may include information indicating the smoke density distribution in the guidance target space R, and the correction by the image correction control unit 62 may be a correction that causes the linear image included in the generated image or the selected preset image to avoid regions with a high smoke density.
In addition, 1 or more temperature sensors may be provided in the guidance target space R. In this case, the spatial state information may include information indicating the temperature distribution in the guidance target space R, and the correction by the image correction control unit 62 may be a correction that causes the linear image included in the generated image or the selected preset image to avoid high-temperature regions.
This enables, for example, an appropriate evacuation route to be guided when a fire breaks out.
As shown in fig. 35, an image I8 corresponding to the congestion degree distribution, the smoke density distribution, or the temperature distribution indicated by the spatial state information may be included in the editing screen S4. The guide can then select 1 or more of the plurality of auxiliary points I6 so as to draw a linear image that avoids areas with a high degree of congestion, smoke density, or temperature.
In addition, the guidance support system 200 and the guidance support control device 100e can each employ various modifications similar to those described in embodiments 1 to 5.
As described above, the guidance support control device 100e according to embodiment 6 includes: a spatial state information acquisition unit 61 that acquires spatial state information indicating a state in the guidance target space R; an image correction control unit 62 that performs control for correcting the generated image using the spatial state information; and a corrected image output instruction unit 63 that instructs the output device 2 to project the image corrected by the image correction control unit 62. This enables guidance along an appropriate route based on the congestion degree distribution, the smoke density distribution, the temperature distribution, or the like in the guidance target space R.
The guidance support control device 100e according to embodiment 6 also includes: a spatial state information acquisition unit 61 that acquires spatial state information indicating a state in the guidance target space R; an image correction control unit 62 that, when the selected preset image is the additional preset image, performs control for correcting the selected preset image using the spatial state information; and a corrected image output instruction unit 63 that instructs the output device 2 to project the image corrected by the image correction control unit 62. This likewise enables guidance along an appropriate route based on the congestion degree distribution, the smoke density distribution, the temperature distribution, or the like in the guidance target space R.
Within the scope of the present invention, the embodiments may be freely combined, and any component of any embodiment may be modified or omitted.
Industrial applicability
The guidance support control device and the guidance support system according to the present invention are used, for example, by a staff member to guide a user in an airport concourse.
Description of the reference symbols
1 output device group, 2 output device, 3 terminal device, 4 network, 11 projection unit, 12 communication unit, 13 control unit, 14 communication unit, 15 control unit, 16 display unit, 17 operation input unit, 21 display control unit, 22 image data storage unit, 23 recommendation database storage unit (recommendation DB storage unit), 31a image selection control unit, 32 device selection control unit, 33a, 33b selection image output instruction unit, 34 output continuation instruction unit, 41 processor, 42 memory, 43 memory, 44 processing circuit, 51a image generation control unit, 52 generated image output instruction unit, 53a image registration control unit, 61 spatial state information acquisition unit, 62 image correction control unit, 63 corrected image output instruction unit, 71 first guidance support control unit, 72 second guidance support control unit, 100a, 100b, 100c, 100d, 100e guidance support control device, 200 guidance support system.

Claims (8)

1. A guidance support control device for a guidance support system that supports guidance of a guided person by a guiding person using an image projected into a guidance target space by an output device,
the guidance support control device includes:
an image selection control unit that performs control for selecting any one of a plurality of preset images in accordance with an operation input by the guiding person during operation of the guidance support system; and
a selection image output instruction unit that instructs the output device to project the selected preset image selected by the image selection control unit.
2. The guidance support control device according to claim 1,
the guidance support control device includes an output continuation instruction unit that instructs the output device to continue projecting the selected preset image when the count of the number of times the selected preset image has been selected within a predetermined period is equal to or greater than a threshold value.
3. The guidance support control device according to claim 1,
the guidance support control device includes:
an image generation control unit that executes control for generating an image to be projected by the output device in accordance with an operation input by the guiding person during operation of the guidance support system; and
a generated image output instruction unit that instructs the output device to project the generated image generated by the image generation control unit.
4. The guidance support control device according to claim 3,
the guidance support control device includes an image registration control unit that performs control to include the generated image in the plurality of preset images,
the plurality of preset images include an additional preset image corresponding to the generated image.
5. The guidance support control device according to claim 4,
the generated image includes a linear image passing through one or more auxiliary points among a plurality of auxiliary points,
the plurality of auxiliary points respectively corresponding to a plurality of positions in the guidance target space.
6. The guidance support control device according to claim 5,
the guidance support control device includes:
a spatial state information acquisition unit that acquires spatial state information indicating a state in the guidance target space;
an image correction control unit that performs control for correcting the generated image using the spatial state information; and
a corrected image output instruction unit that instructs the output device to project the image corrected by the image correction control unit.
7. The guidance support control device according to claim 5,
the guidance support control device includes:
a spatial state information acquisition unit that acquires spatial state information indicating a state in the guidance target space;
an image correction control unit that, when the selected preset image is the additional preset image, performs control for correcting the selected preset image using the spatial state information; and
a corrected image output instruction unit that instructs the output device to project the image corrected by the image correction control unit.
8. A guidance support system for supporting guidance of a guided person by a guiding person using an image projected into a guidance target space by an output device,
the guidance support system includes:
an image selection control unit that performs control for selecting any one of a plurality of preset images in accordance with an operation input by the guiding person during operation of the guidance support system; and
a selection image output instruction unit that instructs the output device to project the selected preset image selected by the image selection control unit.
CN201880100013.2A 2018-12-12 2018-12-12 Guidance support control device and guidance support system Pending CN113168793A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/045686 WO2020121439A1 (en) 2018-12-12 2018-12-12 Guidance assistance control device and guidance assistance system

Publications (1)

Publication Number Publication Date
CN113168793A true CN113168793A (en) 2021-07-23

Family

ID=71075300

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880100013.2A Pending CN113168793A (en) 2018-12-12 2018-12-12 Guidance support control device and guidance support system

Country Status (3)

Country Link
JP (1) JP6840305B2 (en)
CN (1) CN113168793A (en)
WO (1) WO2020121439A1 (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1671199A (en) * 2004-03-15 2005-09-21 精工爱普生株式会社 Projector
JP2007149053A (en) * 2005-10-24 2007-06-14 Shimizu Corp Route guidance system and method
CN103037186A (en) * 2011-09-28 2013-04-10 卡西欧计算机株式会社 Projector control device and projector control method
KR20130042229A (en) * 2011-10-18 2013-04-26 성균관대학교산학협력단 Method of image displaying at umbrella, image displaying umbrella, system of image displaying umbrella
CN103888719A (en) * 2012-12-21 2014-06-25 索尼公司 Display control system and recording medium
US20140282009A1 (en) * 2013-03-15 2014-09-18 Daniel Avrahami System and method for content creation
CN104368094A (en) * 2014-10-30 2015-02-25 无锡艾科瑞思产品设计与研究有限公司 Projection type emergency escape indication device
JP2016011905A (en) * 2014-06-30 2016-01-21 アルパイン株式会社 Guide system, guide method, server, and electronic device
JP2016123074A (en) * 2014-12-25 2016-07-07 パナソニックIpマネジメント株式会社 Projector
CN106973274A (en) * 2015-09-24 2017-07-21 卡西欧计算机株式会社 Optical projection system
CN107113950A (en) * 2014-12-26 2017-08-29 日立麦克赛尔株式会社 Lighting device
JP2018036480A (en) * 2016-08-31 2018-03-08 株式会社リコー Image projection system, information processing apparatus, image projection method, and program

Also Published As

Publication number Publication date
WO2020121439A1 (en) 2020-06-18
JPWO2020121439A1 (en) 2021-03-11
JP6840305B2 (en) 2021-03-10

Similar Documents

Publication Publication Date Title
US10788328B2 (en) Methods and systems for determining routing
US10592998B2 (en) Graphical user interface based airline travel planning
US6621423B1 (en) System and method for effectively implementing an electronic visual map device
US9904450B2 (en) System and method for creating and sharing plans through multimodal dialog
US10545623B2 (en) Information processing device and information processing method to coordinate with a plurality of information processing devices
JP2020531941A (en) Map user interaction based on temporal proximity
US20110246059A1 (en) Information processing apparatus, behavior prediction display method, and computer program therefor
JP2021193376A (en) Information processing system, information processing device, information processing program, and information processing method
CN105009114B (en) Search capability is predictably presented
US20180181663A1 (en) Portable information terminal and application recommending method thereof
US9127961B2 (en) Methods and systems for use in planning a trip
US20190362387A1 (en) Information presentation device, information presentation method, and non-transitory computer readable medium storing program
CN113168793A (en) Guidance support control device and guidance support system
US20110022404A1 (en) Development of travel plans including at least one environmental impact indication
JP2007163226A (en) Navigation apparatus
EP3112807A1 (en) Mobile terminal and method for controlling the same
JP2020046939A (en) Server device, guidance device and program
KR20200009812A (en) Method and system for supporting spell checking within input interface of mobile device
JP6378635B2 (en) Client, server and information sharing system
KR20200010144A (en) Method and system for supporting spell checking within input interface of mobile device
KR102434137B1 (en) Method and system for providing departure timer
CN109866221B (en) Guiding robot system
US20210191682A1 (en) Method and system for associating and displaying content and list of contents on dual screen
KR102140435B1 (en) Method for maximizing content display effect using a plurality of geographically adjacent mobile terminals
JP6977780B2 (en) Display device, display system and display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20230228