WO2023203853A1 - Remote experience system - Google Patents

Remote experience system

Info

Publication number
WO2023203853A1
Authority
WO
WIPO (PCT)
Prior art keywords
local
image
guide
unit
client
Prior art date
Application number
PCT/JP2023/005479
Other languages
French (fr)
Japanese (ja)
Inventor
登仁 福田
Original Assignee
株式会社サンタ・プレゼンツ
Priority date
Filing date
Publication date
Application filed by 株式会社サンタ・プレゼンツ
Publication of WO2023203853A1 publication Critical patent/WO2023203853A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This invention relates to a remote experience system for viewing local conditions and shopping remotely.
  • a system has been proposed that allows people to experience traveling and shopping without actually going to the location.
  • Patent Document 1 discloses a system that transmits images captured from a bus or the like running in a local area to a terminal device, which is a user, in real time via a server device. This allows the user to enjoy the local scenery while staying at home or the like.
  • In Patent Document 2, a local person located far away from the user captures a video and transmits it to the user's terminal device so that the user can view it.
  • The user gives instructions to the local person regarding the imaging direction and the like, which allows the user to enjoy images in the desired direction.
  • However, Patent Document 1 and Patent Document 2 do not provide a mechanism for a client to appropriately select and decide on a local guide, so it is not easy to make a request.
  • Furthermore, in Patent Document 2, when a remote user wants to change the imaging direction, he or she has to instruct the local person to change it, making it difficult for the user to see the direction he or she wants to see.
  • the purpose of this invention is to solve any of the above-mentioned problems and provide a more advanced remote experience system that allows people to experience sightseeing and shopping in remote locations.
  • the remote experience system is a remote experience system comprising a plurality of local devices, a server device, and a client device,
  • the local device includes an imaging unit that is attached to the body of a local guide or a local guide robot, or to a moving body that moves with the local guide, a transmitting unit, and captured image transmitting means for transmitting the image captured by the imaging unit to the server device by the transmitting unit,
  • guidance capable transmitting means for transmitting, by the transmitting unit, the location of the device and the fact that guidance is possible to the server device as availability information,
  • and request information receiving means for receiving, by a receiving unit, request information from the server device that causes the device to enter a guide mode for the client device.
  • the server device includes availability information receiving means for receiving, by a receiving section, the availability information from the local devices, and possible device list transmitting means for transmitting, by a transmitting section, a possible device list in which the local devices that have transmitted the availability information are arranged on a map.
  • the client device includes possible device list receiving means for receiving, by a receiving unit, the possible device list, and possible device list display means for displaying the possible device list on a display unit,
  • guidance request transmitting means for transmitting, by the transmitting unit, the identification code of the local device selected by the client's operation, the guidance request, and the identification code of the client device to the server device as request information, captured image receiving means for receiving the captured image by the receiving unit, and captured image display means for displaying the received captured image on the display section.
  • the client can select a guide or guide robot near the place he or she wants to experience and receive guidance.
  • the imaging unit of the field device is a wide-angle imaging unit that outputs a wide-angle captured image
  • the captured image transfer means of the server device, upon receiving a direction instruction from the client device, selects an area corresponding to the direction instruction from the received wide-angle captured image and transmits it as a selected area image to the client device by the transmitting unit,
  • and the client device is characterized by comprising direction instruction transmitting means for transmitting to the server device a direction instruction input by the client while viewing the selected area image displayed on the display unit.
  • the wide-angle imaging unit of the field device detects a change in its own direction and outputs the wide-angle captured image so that a predetermined direction becomes a reference direction regardless of the movement of the local guide, the local guide robot, or the mobile object, and the direction instruction is given with this reference direction as a reference.
  • a plurality of client devices are provided, each of which receives and displays the selected area image from the local device via the server device, and the system is characterized in that the direction instruction given by each client device may be different.
  • each client can enjoy images taken in their desired direction.
  • in the system according to this invention, the local device includes a projection unit that is attached to the body of a local guide or a local guide robot, or to a moving body that moves with the local guide, and projects an instruction image onto the local space based on given instruction image data, a drive unit that changes the projection direction of the projection unit, and direction control means that, in response to outputs from a sensor that detects the orientation of the projection unit, controls the drive unit so that the projection direction of the projection unit is maintained regardless of the movement of the local guide, the local guide robot, or the moving body.
  • the client device is characterized by further comprising fixing command means for giving a fixing command to the local device by a transmitting unit, and instruction image transmitting means for, when there is a fixing command, setting the vicinity of a characteristic partial image of the captured image as a reference captured image and transmitting, to the local device via the server device, instruction image data that specifies the position of the instruction image with reference to the reference captured image.
  • the client can project an image instruction to the local guide or the local guide robot at a precise location on the site.
  • the system according to the present invention is characterized by further comprising correction means for correcting the projection of the instruction image by the projection unit, without depending on the drive unit, so that the instruction image is correctly displayed with reference to a predetermined part of the local space.
  • the on-site device includes a wide-angle projection unit that is attached to the body of the on-site guide or the on-site guide robot, or to a moving object that moves with the on-site guide, and projects an instruction image into the on-site space based on given instruction image data.
  • the client device is characterized by further comprising fixing command means for giving a fixing command to the local device by a transmitting unit, and instruction image transmitting means for, when there is a fixing command, setting the vicinity of a characteristic partial image of the captured image as a reference captured image and transmitting, to the local device via the server device, instruction image data that specifies the position of the instruction image with reference to the reference captured image.
  • the client can project an image instruction to the local guide or the local guide robot at a precise location on the site.
  • the system according to the present invention is characterized by further comprising a correction means for correcting the projection of the instruction image by the projection unit so that the instruction image is correctly displayed with reference to a predetermined part of the local space.
  • the on-site device further comprises a drive unit that changes the imaging direction of the imaging unit, and direction control means that, in response to outputs from a sensor that detects the orientation of the imaging unit, controls the drive unit so that the imaging unit faces in a predetermined direction, based on a direction instruction with the on-site guide or on-site guide robot as the center, regardless of the movement of the on-site guide, the on-site guide robot, or the mobile object;
  • the client device is characterized in that it includes a direction instruction transmitting means for transmitting a direction instruction input by the client to the server device while looking at the captured image displayed on the display unit.
  • the image can be viewed in a desired direction by the client's operation.
  • the local device further includes a projection unit that is attached to a local guide or a local guide robot on his or her own body or a moving body that moves with the local guide, and projects an instruction image onto the local space based on given instruction image data.
  • the drive unit also changes the projection direction of the projection unit, and in response to the output of a sensor that detects the direction of the projection unit, the projection direction is controlled so as to be maintained regardless of the movement of the local guide, the local guide robot, or the mobile object.
  • the client device is characterized by further comprising fixing command means for giving a fixing command to the local device by a transmitting unit, and instruction image transmitting means for, when there is a fixing command, setting the vicinity of the characteristic partial image of the captured image as a captured image of interest and transmitting, to the local device via the server device, instruction image data that specifies the position of the instruction image with reference to the captured image of interest.
  • the client can project an image instruction to the local guide or the local guide robot at a precise location on the site.
  • the system according to the present invention is characterized by further comprising correction means for correcting the projection of the instruction image by the projection unit, without depending on the drive unit, so that the instruction image is correctly displayed with reference to a predetermined part of the local space.
  • the "guidable transmission means" corresponds to step S103.
  • the "request information receiving means" corresponds to step S104.
  • the "captured image transmitting means" corresponds to steps S106, S21, S30, and S35.
  • the "availability information receiving means" corresponds to step S121.
  • the "capable device list transmitting means" corresponds to step S123.
  • the "request information transfer means" corresponds to step S124.
  • the "captured image transfer means" corresponds to steps S125 and S91.
  • the "capable device list display means" corresponds to step S143.
  • the "guidance request transmitting means" corresponds to step S144.
  • the "captured image receiving means" corresponds to steps S145 and S41.
  • the "captured image display means" corresponds to steps S146 and S41.
  • the term "device” is a concept that includes not only one computer but also multiple computers connected via a network. Therefore, when the means (or a part of the means) of the present invention is distributed over multiple computers, these multiple computers correspond to the apparatus.
  • "Program" is a concept that includes not only programs that can be directly executed by the CPU, but also programs in source format, compressed programs, encrypted programs, and programs that cooperate with the operating system to perform their functions.
  • FIG. 1 is a functional configuration diagram of a remote experience system according to an embodiment of the present invention.
  • This is the system configuration of the remote experience system.
  • This is the hardware configuration of Smartphone GT.
  • This is the hardware configuration of the server device SV.
  • This is the hardware configuration of the client device IT.
  • It is a flowchart of local guide determination processing.
  • It is a flowchart of local guide determination processing.
  • It is a flowchart of guidance processing.
  • It is a functional configuration diagram of a remote experience system according to a second embodiment.
  • It is a diagram showing a local guide wearing a local device ST.
  • It is a diagram showing a projector 776 with direction control.
  • It is a diagram showing a cross section of a unit 80.
  • It is a figure showing the attachment position of gel bush 120.
  • This is the hardware configuration of the motor control circuit 400.
  • It is a diagram showing the relationship between the movement of the local guide 56 and the instruction image display.
  • It is a diagram showing the relationship between the movement of the local guide 56 and the instruction image display.
  • It is a diagram showing the relationship between the movement of the local guide 56 and the instruction image display.
  • It is a diagram showing a case where image feature points 512 are used instead of markers 60.
  • It is a functional configuration diagram of a remote experience system according to a third embodiment.
  • It is a figure showing the local guide 54 wearing the local device GT.
  • It is a flowchart of guidance processing.
  • It is a flowchart of guidance processing.
  • It is a functional configuration diagram of a remote experience system according to a fourth embodiment.
  • It is a figure showing the local guide 54 wearing the local device GT.
  • It is a diagram showing a camera/laser projector integrated body 90.
  • It is a flowchart of guidance processing.
  • It is a flowchart of guidance processing.
  • FIG. 1 shows the functional configuration of a remote experience system according to an embodiment of the present invention.
  • Field devices GT1, GT2, . . . GTn are provided to be able to communicate with the server device SV.
  • a client device IT is provided to be able to communicate with the server device SV.
  • the local devices GT1, GT2, . . . GTn are mobile terminal devices such as smartphones owned by each local guide at the site such as a tourist spot or a shopping mall. Although a plurality of local devices GT1, GT2, . . . GTn are provided, the description below will focus on the local device GT1.
  • When the local guide becomes available for guidance, the local guide inputs that fact into the local device GT1.
  • the guiding capability transmitting means 614 of the local device GT1 transmits the fact that guidance is possible, the identification code of the local device GT1, and the current position to the server device SV as possible information by the transmitting unit 620.
  • the current location of the local device GT1 is obtained, for example, from a built-in GPS receiver.
  • Local devices other than the local device GT1 also transmit the above-mentioned availability information to the server device SV when the local guide becomes available for guidance.
  • the availability information receiving means 646 of the server device SV receives these availability information through the receiving unit 642. Therefore, the server device SV can grasp which field devices are currently available for guidance.
  • the server device SV generates a possible device list showing local devices that are ready to guide on the map based on the possibility information received from each local device GT1, GT2, . . . GTn.
  • the possible device list transmitting means 652 uses the transmitting unit 644 to transmit the generated possible device list to the client device IT.
  • the possible device list receiving means 664 of the client device IT receives the possible device list.
  • Possible device list display means 666 displays a list of possible devices on display section 668. This allows the client to confirm on the map the local guides who are available to guide them and their locations.
  • the client refers to the list displayed on the display section 668 and selects one of the local devices GT1, GT2, . . . GTn (i.e., a local guide).
  • the guide request transmitting means 676 transmits the identification code of the selected local device GT1 and the identification code of the requester's device to the server device SV by the transmitting unit 674 as request information.
  • the request information transfer means 648 of the server device SV transfers this request information to the selected local device GT1.
  • the request information receiving means 618 of the local device GT1 receives the request information through the receiving unit 622. This causes the local device GT1 to enter the guidance mode.
  • the local device GT1 images the site using the imaging unit 612.
  • the captured image transmitting means 616 transmits the captured image of the site to the server device SV using the transmitting unit 620.
  • the captured image transfer means 650 of the server device SV transfers the captured image to the client device IT.
  • the captured image receiving means 670 of the client device IT receives the captured image through the receiving unit 662.
  • the captured image display means 672 displays the received captured image of the site on the display section 668.
  • the client can select a local guide and request guidance based on his/her selection.
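  • As an illustration only (not part of the original disclosure), the matching flow above can be sketched in Python as below; the message shapes, class names, and coordinates are hypothetical.

```python
# Minimal sketch of the FIG. 1 matching flow (hypothetical message shapes).
from dataclasses import dataclass, field

@dataclass
class ServerSV:
    available: dict = field(default_factory=dict)   # device id -> (lat, lon)

    def receive_availability(self, device_id, lat, lon):
        # Availability information receiving means 646 (step S121).
        self.available[device_id] = (lat, lon)

    def possible_device_list(self):
        # Possible device list: guidable local devices to be placed on a map (steps S122/S123).
        return [{"id": d, "position": pos} for d, pos in self.available.items()]

    def transfer_request(self, selected_device_id, client_id):
        # Request information transfer means 648 (step S124): selected device enters guide mode.
        return {"to": selected_device_id, "client": client_id, "enter_guide_mode": True}

sv = ServerSV()
sv.receive_availability("GT1", 40.7794, -73.9632)    # e.g. a guide near a museum
print(sv.possible_device_list())
print(sv.transfer_request("GT1", "IT-001"))
```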
  • FIG. 2 shows the system configuration of the remote experience system.
  • smartphones GT1, GT2, . . . GTn are used as local devices.
  • the smartphones GT1, GT2, . . . GTn owned by these local guides can communicate with the server device SV via the Internet.
  • the client device IT used by the client can also communicate with the server device SV via the Internet.
  • FIG. 3 shows the hardware configuration of smartphone GT.
  • Connected to the CPU 202 are a memory 204, a touch display 206, a short-range communication circuit 208, a built-in camera 217 serving as an imaging unit, a nonvolatile memory 212, a speaker 214, a microphone 216, a communication circuit 218, and a GPS receiver 219.
  • the short range communication circuit 208 is a circuit for short range communication such as Bluetooth.
  • the communication circuit 218 is a circuit for communicating with a base station in order to connect to the Internet.
  • the GPS receiver 219 is for receiving radio waves from satellites and acquiring its own position.
  • the built-in camera 217 is for capturing still images and videos of the site.
  • the microphone 216 is used to collect the guide's voice and local sounds.
  • An operating system 222 and a field program 224 are recorded in the nonvolatile memory 212.
  • the field program 224 cooperates with the operating system 222 to perform its functions.
  • FIG. 4 shows the hardware configuration of the server device SV.
  • a memory 554, a display 556, an SSD 558, a DVD-ROM drive 560, and a communication circuit 562 are connected to the CPU 552.
  • Communication circuit 562 is for connecting to the Internet.
  • An operating system 564 and a server program 566 are recorded on the SSD 558.
  • the server program 566 cooperates with the operating system 564 to perform its functions.
  • These programs were recorded on the DVD-ROM 568 and installed into the SSD 558 via the DVD-ROM drive 560.
  • FIG. 5 shows the hardware configuration of the client device IT.
  • a memory 304, a display 306, a microphone 308, a communication circuit 310, an SSD 312, a DVD-ROM drive 314, a mouse/keyboard 316, and a speaker 318 are connected to the CPU 302.
  • Communication circuit 310 is for connecting to the Internet.
  • An operating system 320 and a client program 322 are recorded on the SSD 312 .
  • the client program 322 cooperates with the operating system 320 to perform its functions.
  • These programs were recorded on the DVD-ROM 324 and installed into the SSD 312 via the DVD-ROM drive 314.
  • FIGS. 6 to 8 show flowcharts of remote experience processing. FIGS. 6 and 7 show the local guide determination process, and FIG. 8 shows the guidance process.
  • a local guide who owns a local device GT registers the local guide by transmitting his or her name, photo, address, and payment method (guide fee transfer destination, etc.) to the server device SV in advance.
  • the server device SV records this information by adding an identification code of the local device GT.
  • the local guide inputs information to the effect that he/she is available to provide guidance into the local device (smartphone) GT that he or she owns.
  • the CPU 202 of the local device GT acquires the position of the local device using the GPS receiver 219 (step S102). Further, the local device GT transmits possibility information including the fact that guidance is possible, the identification code of the own device, and the position (latitude and longitude) of the own device to the server device SV (step S103).
  • the server device SV receives the availability information and records it in the nonvolatile memory 212 (step S121). Since there are many local devices, a large amount of availability information accumulates in the server device SV. Note that if the local guide becomes unable to provide guidance because he or she has other things to do, a message to that effect is sent to the server device SV. In response, the server device SV changes the status of that local device GT from guidable to not guidable.
  • a client who wishes to have a remote experience operates the client's device IT, specifies the place or area in which he or she desires the experience, and inputs a request for a list of local devices GT that are available for guidance.
  • the CPU 302 of the client device IT (hereinafter sometimes abbreviated as client device IT) requests the server device SV for a list of local devices GT that can be guided (step S141). For example, if a client wishes to be guided around the Metropolitan Museum of Art, he or she may input "Metropolitan Museum of Art” or specify "Metropolitan Museum of Art” on a map, and request a list of local equipment GTs that are available for guidance.
  • the server device SV extracts local devices GT that are available for guidance near the Metropolitan Museum of Art based on the stored availability information. This is placed on the map to generate a list of possible devices (step S122). Note that the list of possible devices may be generated in advance for each location.
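  • As an aside, one plausible way to extract local devices GT "near" the specified place in step S122 is a simple great-circle distance filter, as in the Python sketch below; the search radius and coordinates are assumptions, not part of the patent.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two latitude/longitude points, in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def devices_near(availability, place, radius_km=1.0):
    # availability: {device_id: (lat, lon)}; place: (lat, lon) of the requested spot.
    return [d for d, (la, lo) in availability.items()
            if haversine_km(la, lo, place[0], place[1]) <= radius_km]

availability = {"GT1 (Emma)": (40.7794, -73.9632), "GT2": (40.7580, -73.9855)}
print(devices_near(availability, (40.7794, -73.9632)))   # -> ['GT1 (Emma)']
```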
  • the server device SV transmits the generated list of possible devices to the client device IT (step S123).
  • the client device IT receives this (step S142) and displays it on the display 306 (step S143).
  • FIG. 9 shows a list of possible devices displayed on the client's device IT.
  • Local equipment GTs located near the designated Metropolitan Museum of Art are displayed together with the name of the local guide. Further, each local device GT is associated with its identification code.
  • When the client places the mouse cursor on a local device GT, the local guide's profile, ratings from past clients, facial photos, etc. can be sent from the server device SV.
  • the client uses the mouse/keyboard 316 to select one of the displayed local devices GT (local guide) with reference to these.
  • the client device IT transmits the identification code of the selected local device GT and the identification code of the client device IT as request information to the server device SV (step S144). For example, assume that Emma's local device GT is selected.
  • the server device SV transfers this request information to the selected Emma's local device GT (step S124).
  • the CPU 202 of Emma's local device GT receives the request information (step S104).
  • Emma's local device GT enters the guidance mode (step S105).
  • In this way, a desired local guide can be selected by comparing the local guides who are near the place where the user desires guidance.
  • FIG. 8 shows a processing flowchart in the guide mode.
  • Emma, the local guide who owns the local device GT, and the client who operates the client device IT can make voice calls to each other using the Internet call function of the local device GT and the client device IT, via the server device SV (or directly) (steps S106, S125, S145, S146, S147, S126, S107, S108 in FIG. 8).
  • the client tells Emma, the local guide, that he wants to go to the Metropolitan Museum of Art.
  • the local guide Emma takes a video with the camera of the local equipment GT and heads to the Metropolitan Museum of Art.
  • the client can view the captured video and surrounding sounds on the client's device IT.
  • Upon arriving at the ticket counter of the Metropolitan Museum of Art, the local guide Emma operates the local device GT to display a payment window 700 superimposed on the video being captured, as shown in FIG. 10.
  • the payment window 700 displays a two-dimensional barcode for accessing a site for paying admission fees to the Metropolitan Museum of Art. It is preferable that such two-dimensional barcodes be obtained in advance from target facilities or shops and stored in the server device SV.
  • the local guide can record it on his or her smartphone GT or obtain it from the server device SV.
  • the client reads the two-dimensional barcode displayed on the client's device IT using a smartphone or the like and makes a payment.
  • the local guide Emma may pay the entrance fee in advance and settle the payment together with the guide fee later.
  • the client can watch a video of the inside of the Metropolitan Museum of Art guided by the local guide Emma using the client's IT device.
  • the client can select a local guide and experience the local location from a remote location.
  • the two-dimensional payment barcode 700 prepared in the same manner as in FIG. 10 can be used.
  • When the client selects a local guide, the client is allowed to check the profile of the local guide. Alternatively, or in addition to this, it may be possible to select a local guide after checking the camera image of the local device GT. This allows the client to check the quality of the camera images from the on-site equipment.
  • the local guide may upload videos of past guidance to the server device SV so that the client can view them.
  • evaluations and comments made by past clients to the local guide may be displayed.
  • the local guide fixes, holds, or wears the local device.
  • the on-site device may be fixed, held, or attached to a robot for on-site guidance. The same applies to the following embodiments.
  • the local guide's smartphone is used as the local device GT in the guidance mode.
  • a case will be explained in which a local device GT is used, which is designed to be able to send more stable videos and to make it easier to convey instructions from the client.
  • FIG. 11 shows the functional configuration of the remote experience system according to the second embodiment.
  • This embodiment does not provide a mechanism for matching clients and local guides. However, a mechanism similar to the first embodiment can be used. Alternatively, other matching methods may be used.
  • the local guide is wearing a wide-angle imaging unit 12. Further, a projection section 14 is provided via a drive section 16.
  • the projection unit 14 is configured so that its projection direction can be changed by a drive unit 16.
  • the projection direction of the projection unit 14 is detected by the sensor 28.
  • the direction control means 20 controls the drive unit 16 based on the output of the sensor 28, and maintains the projection direction of the projection unit 14 in a predetermined direction centered on the local guide, regardless of the movement of the local guide.
  • the imaging unit 12 of the field device GT is a wide-angle camera such as a spherical camera, and captures images in all spherical directions around the local guide to generate a wide-angle captured image.
  • This wide-angle captured image is transmitted to the server device SV by the transmitter 22 under the control of the captured image transmitter 18.
  • the server device SV receives this wide-angle captured image, selects a portion in that direction from the wide-angle captured image based on the direction instruction received from the client device IT, and generates a selected area image.
  • the captured image transfer means of the server device SV transmits the generated selected area image to the client device IT using the transmitter 758.
  • the captured image receiving means 36 of the client's device IT receives the selected area image in the direction specified by the client, and displays it on the captured image display section 40.
  • When the client looks at this selected area image and wants to change the direction, he or she inputs a direction instruction into the client device IT.
  • the direction instruction transmitting means 39 of the client device IT transmits the direction instruction to the server device SV through the transmitter 34.
  • the server device SV receives this direction instruction and uses it to select a selection area image from the wide-angle captured image. Further, the direction command transfer means 752 transmits the direction command to the local device GT using the transmitter 758.
  • the client can select and view images of the local guide in any direction in the celestial sphere. This allows the client to enjoy the experience of viewing images in any direction without having a fixed field of view.
  • each client can independently enjoy the selected area image in their preferred direction.
  • this embodiment has a function for the client to give clear instructions to the local guide. For example, when a local guide enters a store and makes purchases for a client, it is preferable that the client be able to clearly indicate the items purchased. This is not always possible through voice calls. When multiple similar products are lined up, it is difficult to determine which product to purchase.
  • When the client gives an instruction, the client inputs a fixing command into the client device IT.
  • the captured image display unit 40 uses the selected area image at that time as a reference selected area image and displays it as a still image.
  • the client inputs an instruction image (for example, a circle image) from the instruction image input section 44 while viewing the local reference selection area image displayed on the captured image display section 40.
  • the instruction image transmitting means 38 transmits the input instruction image to the server device SV by the transmitting unit 34.
  • the fixed command transfer means 754 of the server device SV transfers the instruction image to the local device GT.
  • the local device GT receives the instruction image through the receiving section 24 and projects the instruction image (for example, a circle image) from the projection section 14.
  • the instruction image 62 is projected onto the product 52c desired by the client among the products 52a, 52b, 52c, and 52d.
  • Since the projection direction of the projection unit 14 is fixed, even if the local guide changes the direction of his or her face, the instruction image will be projected onto the location intended by the client.
  • the correction means 26 of the local device GT compares the characteristic partial image (marker, etc.) in the reference selection area image with the characteristic partial image in the current selection area image so that the instruction image is correctly projected at the intended position. Then, the projection position of the instruction image is corrected and controlled. As a result, even if the local guide moves, the instruction image will be displayed in the correct position.
  • the transmitting unit 34 transmits the fixing command to the local device GT via the server device SV.
  • the receiving unit 24 of the local device GT receives this and records the selected area image at that time as a reference selected area image. Further, the local guide places markers near the products 52a, 52b, 52c, and 52d so that the products can be imaged.
  • the local guide can confirm which product to purchase based on the actually projected instruction image 62.
  • This instruction image 62 is displayed at the correct position by the direction control means 20 even if the local guide changes the direction of his or her head. Therefore, the instruction image 62 does not disappear depending on the direction of the local guide's head, reducing stress. Furthermore, if other local guides are accompanying the guide, the instruction image 62 continues to be projected even if the local guide wearing the local device GT moves his or her head significantly, so the other local guides are not confused. Furthermore, even if the local guide moves, the instruction image 62 is displayed correctly.
  • Figure 12 shows the local guide 54 equipped with the local device GT.
  • the local guide 54 goes to sightseeing spots, facilities, shops, etc., and allows remote clients to experience various things by transmitting videos.
  • a smartphone 772, a laser projector 776 with direction control, a spherical camera 774, and a headset 770 constitute a field device GT.
  • the speaker and microphone of headset 770 are connected to smartphone 772 by short-range communication (such as Bluetooth).
  • the omnidirectional camera 774 (with built-in short-range communication circuit) and the laser projector 776 with direction control (with built-in short-range communication circuit) are similarly connected to the smartphone 772 by short-range communication.
  • the omnidirectional camera 774 is provided at the top of the headset 770. Images in all directions are captured by a hemispherical camera that captures the area in front of the local guide 54 and a hemispherical camera that captures the area behind the local guide.
  • a laser projector 776 with direction control is provided above the omnidirectional camera 774.
  • FIG. 13 shows the external appearance of the laser projector 776 with direction control.
  • the laser projector 776 with direction control is provided with a base 93, and the base 93 is fixed to the top of the omnidirectional camera 774.
  • a unit 80 housing a laser projector 84 is fixed to the base 93 via a triaxial structure 90 (another multiaxial structure may be used) as a drive section.
  • a motor 92 is fixed to the base 93 of the triaxial structure 90, and one end of an intermediate member 92A that is rotated in the XY plane by the motor 92 is connected to the motor 92.
  • the intermediate member 92A is formed in an L-shape, and a motor 94 is fixed to the other end.
  • One end of an intermediate member 94A that is rotated in the ZX plane by the motor 94 is connected to the motor 94.
  • the intermediate member 94A is formed in an L shape, and a motor 96 is fixed to the other end.
  • a mount member 97 that is rotated in the ZY plane by the motor 96 is connected to the motor 96. Note that the XYZ axes shown in FIG. 13 vary as each member 92A, 94A, 97 rotates.
  • the three-axis structure 90 can adjust the orientation of the mount member 97 with three-axis degrees of freedom by driving the motors 92, 94, and 96.
  • the base 93 is provided with a triaxial gyro sensor JS and a triaxial acceleration sensor AS as the sensors 28. Further, the base 93 is provided with a motor control circuit (not shown) that controls the motors 92, 94, and 96 described above. Each of the motors 92, 94, and 96 is controlled by a motor control circuit based on the outputs of the three-axis gyro sensor JS and the three-axis acceleration sensor AS.
  • a unit 80 housing a laser projector 84 is fixed to the mount member 97 of the triaxial structure 90.
  • a laser projector control circuit 104 (including a short-range communication circuit) that controls the laser projector 84 is provided within the housing 81 of the unit 80.
  • Although the laser projector control circuit 104 may be provided in the base 93, it is preferable that at least the MEMS circuit of the laser projector 84 is provided in the unit 80.
  • the casing 81 is attached to the mount top surface 101, mount side surface 97, and mount bottom surface 99 of the triaxial structure 90 via silicone gel bushings 120 (for example, Taica's anti-vibration material gel bushing B-1). Note that in FIG. 13, the mount top surface 101 is omitted for easy understanding.
  • the silicone gel bush 120 includes a ring-shaped silicone gel 114 inserted outside the upper part of the ring-shaped silicone gel 116.
  • the upper part of the silicone gel 116 is inserted into a hole provided in the housing 81.
  • the housing 81 is sandwiched between the silicone gel 114 and the silicone gel 116.
  • the silicone gels 114 and 116 are screwed to the mount bottom surface 99 by bolts 110 and washers 112. With this structure, the housing 81 is held by the silicone gels 116 and 114. This prevents high-frequency vibrations from being transmitted to the housing 81 from the outside.
  • silicone gel bushes 120 are provided at two locations on each of the top, side, and bottom surfaces of the housing 81.
  • FIG. 16 shows the hardware configuration of the motor control circuit 400.
  • a memory 404, a gyro sensor JS, an acceleration sensor AS, a camera 82, a laser projector 84, motors 92, 94, and 96, and a nonvolatile memory 406 are connected to the CPU 402.
  • the operating system 31 and motor control program 32 are recorded in the nonvolatile memory 406.
  • the motor control program 32 cooperates with the operating system 31 to perform its functions.
  • the hardware configuration of the client device IT and the smartphone 772 is the same as in the first embodiment.
  • FIG. 17 shows a flowchart during guidance.
  • the local guide's smartphone 200 acquires a wide-angle captured image (video) of the omnidirectional camera 774 through short-range communication, and transmits it to the server device SV (step S21).
  • the server device SV receives this wide-angle captured image and records it on the SSD 558.
  • the wide-angle captured image output by the omnidirectional camera 774 is an image captured in all directions.
  • the server device SV selects an image in a predetermined direction from this wide-angle captured image according to the direction command acquired and recorded from the client device IT, and generates a selected area image (step S91). Therefore, the selected area image is the same as the image obtained when the local guide takes an image in a predetermined direction with a normal camera.
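  • As a rough illustration of step S91 (not taken from the patent), the sketch below assumes the wide-angle captured image is an equirectangular frame and cuts a fixed-field-of-view window centred on the commanded direction; the field-of-view values are arbitrary assumptions.

```python
import numpy as np

def select_area(equirect, yaw_deg, pitch_deg, h_fov_deg=90.0, v_fov_deg=60.0):
    # Cut the region of an equirectangular frame centred on the commanded direction.
    # Simple rectangular crop (no reprojection), wrapping around the horizontal seam.
    h, w = equirect.shape[:2]
    cx = int(((yaw_deg % 360.0) / 360.0) * w)        # yaw 0..360 deg -> column
    cy = int(((90.0 - pitch_deg) / 180.0) * h)       # pitch +90..-90 deg -> row
    half_w = int(w * h_fov_deg / 360.0 / 2)
    half_h = int(h * v_fov_deg / 180.0 / 2)
    cols = np.arange(cx - half_w, cx + half_w) % w   # wrap around the seam
    rows = np.clip(np.arange(cy - half_h, cy + half_h), 0, h - 1)
    return equirect[np.ix_(rows, cols)]

frame = np.zeros((1024, 2048, 3), dtype=np.uint8)    # dummy wide-angle frame
view = select_area(frame, yaw_deg=45.0, pitch_deg=0.0)
print(view.shape)    # (340, 512, 3): roughly a 90 x 60 degree window
```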
  • the server device SV transmits the generated selection area image to the client device IT.
  • the client device IT receives the selected area image and displays it on the display 306 (step S41).
  • FIG. 18 shows the displayed selected area image. This allows the client to view the local video and enjoy seeing the local scene as the local guide moves around.
  • an image in a predetermined direction is selected and displayed as a selected area image. Since this predetermined direction is determined as a direction in the wide-angle captured image, it is determined as the up, down, left and right directions centered on the local guide.
  • If the client wishes to change this predetermined direction and view the image in a different direction, he or she operates the keyboard/mouse 316 of the client device IT and clicks the direction command button 500.
  • the direction command button 500 can be clicked in 360 degrees in the circumferential direction, up, down, left, and right.
  • the client device IT transmits a direction command corresponding to the click to the server device SV (step S42).
  • Server device SV receives this and updates the direction command. Therefore, the server device SV changes the predetermined direction for selecting the selection area image in step S91, and the image in the direction commanded by the direction command button 500 is displayed on the display 306 of the client device IT. For example, as shown in FIG. 19, selected area images can be viewed in different directions.
  • the wide-angle captured image is an image in all celestial sphere directions, it is possible to view selected area images in any direction, up, down, left, or right, centered on the local guide.
  • the client can enjoy the local image in the desired direction through his/her own operations.
  • the client can give instructions to the local guide by projecting an image onto a local object.
  • An instruction image is transmitted from the client's device IT, and is projected onto an object at the site using a direction-controlled laser projector 776 worn by the local guide.
  • This allows accurate instructions to be conveyed to the guide.
  • a direction-controlled laser projector 776 is used to ensure that the instruction image is projected correctly.
  • the CPU 402 of the motor control circuit 400 acquires the outputs of the gyro sensor JS and acceleration sensor AS of the direction-controlled laser projector 776 (FIG. 17, step S1).
  • a gyro sensor and an acceleration sensor in three orthogonal axes are used.
  • the motor control circuit 400 calculates in which position and in which direction the base 93 (see FIG. 13) is located in three-dimensional space based on the outputs of the gyro sensor JS and the acceleration sensor AS.
  • the rotation angles of the motors 92, 94, and 96 are then controlled so that the unit 80 faces in a predetermined direction regardless of the position and direction of the base 93 (step S2). Therefore, regardless of the orientation of the local guide 54's head, the unit 80 is kept in a constant direction.
  • Such control is similar to that of a gimbal used as a stabilizing device for cameras and the like.
  • the above-mentioned predetermined direction is changed by a direction command from the client device IT (steps S42, S92, S22, S3). Therefore, the projection direction of the laser projector 84 matches the direction of the local image that the client is viewing on the display 306. Thereby, the range of the selected area image and the projection range of the laser projector 84 are made to match.
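  • The control law is not specified in the patent; in the spirit of steps S1 to S3, the sketch below assumes a simple proportional correction per axis that turns unit 80 back toward the commanded direction, whatever the orientation of base 93 reported by the sensors.

```python
def hold_direction(base_yaw_pitch_roll, commanded_yaw_pitch_roll, gain=0.5):
    # One control step: motor angle corrections (motors 92, 94, 96) that rotate
    # unit 80 back toward the commanded direction despite movement of base 93.
    corrections = []
    for current, target in zip(base_yaw_pitch_roll, commanded_yaw_pitch_roll):
        error = ((target - current + 180.0) % 360.0) - 180.0   # shortest-path error, degrees
        corrections.append(gain * error)                        # proportional term only
    return corrections

# Base 93 has turned 30 degrees in yaw with the guide's head; unit 80 should stay at 0.
print(hold_direction((30.0, 0.0, 0.0), (0.0, 0.0, 0.0)))   # -> [-15.0, 0.0, 0.0]
```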
  • FIG. 20 shows a flowchart of instruction image projection.
  • Referring to FIG. 21, a case will be described in which an instruction is given to purchase the product 52c among the products 52a, 52b, 52c, 52d, etc. lined up at a store.
  • the client instructs the local guide 54 to place a marker 60 prepared in advance nearby by voice call or the like.
  • the size and shape of the image of the marker 60 are recorded in advance in the nonvolatile memory 212 of the smartphone 200. Therefore, the smartphone 200 can calculate the distance, direction, etc. from the laser projector 84 to the marker 60 based on the captured image of the marker 60.
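  • For example, under a pinhole-camera assumption (a simplification not stated in the patent), the known printed size of the marker 60 gives its distance and bearing roughly as follows; the focal length and marker size below are hypothetical.

```python
def marker_distance_m(focal_px, marker_width_m, marker_width_px):
    # Pinhole estimate: distance = focal_length * real_width / apparent_width_in_pixels.
    return focal_px * marker_width_m / marker_width_px

def marker_bearing_deg(cx_px, image_width_px, h_fov_deg):
    # Horizontal angle of the marker centre relative to the optical axis.
    return (cx_px - image_width_px / 2) / image_width_px * h_fov_deg

print(marker_distance_m(focal_px=1000, marker_width_m=0.10, marker_width_px=50))   # 2.0 (m)
print(marker_bearing_deg(cx_px=1100, image_width_px=1920, h_fov_deg=90))           # about 6.6 deg
```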
  • the selected area image is displayed as a moving image on the display 306 of the client device IT. While looking at this selected area image, the client clicks the instruction input mode button 501 with the marker 60 captured as shown in FIG. 21 to give a fixing instruction. Note that here, it is assumed that the marker 60 prepared as a card is placed leaning against the product 52b.
  • When the client device IT receives a fixing command by a click of the instruction input mode button 501, the client device IT sets the selected area image at that time as the reference selected area image and displays it on the display 306 as a still image (step S52).
  • the client uses the mouse 316 to input instructions to the local guide as an instruction image for this still image (step S53). For example, as shown in FIG. 21b, a circle mark 62 is drawn using the mouse 316 on the image of the product 52c displayed on the display 306, and inputted.
  • the client device IT transmits the fixed command to the smartphone 200 via the server device SV (steps S51, S93).
  • the smartphone 200 that has received the fixing command records the selected area image at the time of receiving the fixing instruction in the nonvolatile memory 212 as a reference selected area image (step S32). Note that since the smartphone 200 receives and updates the direction command in step S22, it can generate the selected area image from the wide-angle captured image.
  • In order for the client device IT and the smartphone 200 to recognize the selected area image at the same point in time as the reference selected area image, information for identifying the frame (such as a frame number) can be used.
  • By having the smartphone 200 determine the reference selection area image based on this frame-specifying information, deviations due to time lag can be prevented.
  • the client device IT transmits the instruction image to the smartphone 200 via the server device SV (steps S53, S94).
  • the client device IT cancels the instruction input mode, stops displaying the still image as the reference selection area image, and displays the transmitted selection area image as a moving image (step S54). This allows the instructor to see the local situation again.
  • the data structure of the instruction image sent to the smartphone 200 is shown in FIG. 23A.
  • the instruction image data is the actual data of the instruction image input by the instructor, as shown in FIG. 23B.
  • the reference coordinate position is the XY coordinate value of the reference point of the instruction image when the reference point of the marker image (for example, the lower center point of M) is set as the origin.
  • the reference point is the upper left of the rectangle circumscribing the instruction image input by the instructor.
  • the instruction image is transmitted as image data, but the parameters may be transmitted as numerical values depending on the shape of the image determined in advance. For example, if it is a perfect circle, the center coordinates and radius, and if it is a square, the upper left coordinates and side lengths may be expressed numerically and transmitted.
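  • FIG. 23A is not reproduced here; purely as an illustration, the fields described above could be carried in a structure like the one below. The field names and the optional frame identifier are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InstructionMessage:
    image_data: bytes       # actual data of the instruction image drawn by the client (FIG. 23B)
    ref_x: float            # reference coordinate position: X of the upper-left corner of the
    ref_y: float            # rectangle circumscribing the instruction image, measured from the
                            # marker reference point taken as the origin (FIG. 23C)
    frame_id: Optional[int] = None   # hypothetical frame identifier for synchronisation

msg = InstructionMessage(image_data=b"\x89PNG...", ref_x=0.12, ref_y=-0.03, frame_id=4211)
print(msg.ref_x, msg.ref_y)
```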
  • Upon receiving the instruction image data of FIG. 23A, the smartphone 200 stores it in the memory 204. Furthermore, the smartphone 200 acquires the current captured image (selected area image) from the omnidirectional camera 774 (step S33).
  • the range of the selected area image (the imaging range if the selected area image were output by a normal camera) and the projection range of the laser projector 84 are configured to be the same. Therefore, if the current selection area image is exactly the same as the recorded reference selection area image (that is, if the local guide has not moved at all since the reference selection area image was captured), projecting the instruction image data with the laser projector 84 at the position based on the reference coordinate position (FIG. 23C) will project the instruction image 62 onto the product 52c.
  • Since the position of this instruction image 62 matches the position input by the client on the display 306, the target product 52c can be accurately shown to the local guide.
  • the local guide can use the instruction image 62 as a landmark to purchase the product 52c without making a mistake.
  • the smartphone 200 calculates the distance and direction between the laser projector 84 and the marker 60 (and the location where the instruction image 62 is to be projected) based on the image of the marker 60 in the reference selection area image. As mentioned above, since a known pattern is printed on the marker 60 in advance, the distance to the marker 60 placed near the product 52c (and the location where the instruction image 62 should be projected) can be determined based on the captured image. The direction can be calculated.
  • the smartphone 200 calculates the distance and direction to the marker 60 (and the location where the instruction image 62 is to be projected) based on the current selection area image acquired in step S33.
  • the smartphone 200 transforms the instruction image 62 based on a comparison between the distance and direction to the marker 60 (and the place where the instruction image 62 should be projected) in the reference selection area image and the distance and direction to the marker 60 (and that place) in the current selection area image, and controls the position where the instruction image 62 is projected (step S34).
  • the instruction image 62 is controlled to be enlarged/reduced and projected according to the change in the distance between the camera 82 and the marker 60 (or the instruction image 62).
  • control is performed to move the position where the instruction image 62 is projected as the marker 60 moves.
  • Since the direction fixing control (see FIG. 17) is performed separately by the triaxial structure 90, in many cases the instruction image 62 can be displayed at the correct position by performing the controls of FIGS. 24B and 24C.
  • the direction fixing control by the triaxial structure 90 is separately performed and the above control is performed, so the instruction image 62 can be stably displayed at the correct position. Moreover, even if the local guide 54 changes the direction of his head and takes his line of sight away from the object 52, the instruction image 62 continues to be displayed by the direction fixing control. Therefore, stress on the local guide 54 is reduced.
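  • A minimal sketch of the corrections of FIGS. 24B and 24C, assuming the marker is observed as (centre x, centre y, apparent width) in pixels; the numbers are illustrative, not from the patent.

```python
def projection_parameters(ref_marker, cur_marker):
    # Compare the marker seen in the reference selection area image with the marker
    # in the current image and derive a scale and offset for the instruction image.
    rx, ry, rw = ref_marker          # (centre_x_px, centre_y_px, apparent_width_px)
    cx, cy, cw = cur_marker
    scale = cw / rw                  # guide moved closer/farther -> enlarge/shrink (FIG. 24B)
    dx, dy = cx - rx, cy - ry        # marker shifted in the view -> move projection (FIG. 24C)
    return scale, dx, dy

# Guide stepped closer (marker looks 1.25x wider) and drifted 40 px to the left.
print(projection_parameters((960, 540, 80), (920, 540, 100)))   # (1.25, -40, 0)
```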
  • the slope of surface 510 may change from the reference selection area image of FIG. 25A to the current selection area image of FIG. 25B.
  • the smartphone 200 calculates the inclination of the surface 510 of the object 52 based on the image of the marker 60 in the reference selection area image (FIG. 25A). Thereby, the actual distance LL between the marker 60 and the instruction image 62 is calculated based on the reference coordinate position PL1 (X or Y) sent from the client device IT.
  • the inclination of the surface 510 of the object 52 is calculated based on the image of the marker 60 in the current selected area image (FIG. 25B).
  • the position where the instruction image 62 should be displayed is determined based on the actual distance LL calculated above, and the reference coordinate position PL2 (X or Y) is calculated.
  • the smartphone 200 can control the position at which the instruction image 62 is projected based on this reference coordinate position PL2, and can project the instruction image 62 at the correct position. Furthermore, the instruction image 62 is transformed so that the projected instruction image 62 is not distorted.
  • the above process can be performed in the same way in both the vertical and horizontal directions.
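  • A simplified one-dimensional sketch of the FIG. 25 correction: a length LL drawn on the surface 510 appears foreshortened by the cosine of the tilt, so LL is recovered from the reference image and re-projected for the current tilt. The cosine model and the numbers are assumptions for illustration.

```python
import math

def corrected_offset(pl1, tilt_ref_deg, tilt_cur_deg):
    # Recover the actual surface distance LL from the reference offset PL1, then
    # compute the apparent offset PL2 for the current tilt of surface 510.
    ll = pl1 / math.cos(math.radians(tilt_ref_deg))    # actual distance on the surface
    return ll * math.cos(math.radians(tilt_cur_deg))   # apparent offset now (PL2)

# Surface was square-on (0 deg) in the reference image, now tilted 30 deg.
print(round(corrected_offset(pl1=0.20, tilt_ref_deg=0.0, tilt_cur_deg=30.0), 3))   # 0.173
```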
  • the imaging range 504 for the reference selection area image may be tilted diagonally as shown in the imaging range 506.
  • Although FIG. 26 shows a tilt in a direction horizontal to the plane of the paper, such a tilt may occur in any three-dimensional direction. As a result, the projected instruction image 62 will also be distorted.
  • However, by deforming the instruction image 62 (deforming it inversely to the distortion) based on the image of the marker 60 in the reference selection area image and the image of the marker 60 in the current selection area image, it is possible to project the correct instruction image.
  • Smartphone 200 calculates the distance and direction between laser projector 84 and marker 60 based on the image of marker 60 in the reference selection area image. Furthermore, the smartphone 200 calculates the distance and direction to the marker 60 near the target object 52 based on the current selected area image acquired in step S33. The smartphone 200 transforms the instruction image 62 based on the comparison between the distance and direction to the marker 60 in the reference selection area image and the distance and direction to the marker 60 in the current selection area image. Control the projection position.
  • the instruction image intended by the client is projected and displayed on the local object 52.
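  • One concrete way (not prescribed by the patent) to realise this inverse deformation is to estimate a homography between the marker as seen in the reference and current selection area images and warp the instruction image with it, e.g. using OpenCV as below; the corner coordinates are made up for illustration.

```python
import numpy as np
import cv2  # OpenCV

def predistort_instruction(instruction_img, marker_corners_ref, marker_corners_cur):
    # Estimate how the view of the marker plane changed between the reference and
    # current selection area images, and warp the instruction image accordingly so
    # that its projection lands undistorted on the surface of the object 52.
    h_matrix, _ = cv2.findHomography(
        np.float32(marker_corners_ref), np.float32(marker_corners_cur))
    height, width = instruction_img.shape[:2]
    return cv2.warpPerspective(instruction_img, h_matrix, (width, height))

# Four marker corners in each image (pixel coordinates), e.g. from marker detection.
ref = [(100, 100), (200, 100), (200, 200), (100, 200)]
cur = [(110, 105), (205, 98), (212, 196), (118, 204)]
circle_mark = np.zeros((480, 640, 3), np.uint8)
cv2.circle(circle_mark, (320, 240), 60, (255, 255, 255), 3)   # instruction image 62
print(predistort_instruction(circle_mark, ref, cur).shape)
```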
  • the marker 60 is preferably placed on the plane where the instruction image 62 is to be displayed.
  • the smartphone 200 analyzes the captured image to calculate feature points (points on the boundary of the object, etc.) near the object (near the marker 60). By comparing the feature points in the reference selection area image and the feature points in the current selection area image, the positional relationship between the marker 60 and the surface on which the instruction image 62 is to be displayed is determined.
  • the display of the instruction image can be stopped by operating the client device IT or the local device GT.
  • a correction means 26 is provided for correctly displaying the instruction image by performing image processing or projection control.
  • However, only the direction control means 20, which controls the projection direction of the projection section 14 by the drive section 16, may be sufficient.
  • a laser projector 776 with direction control is provided.
  • the laser projector 776 with direction control may not be provided, and only the omnidirectional camera 774 may be provided. Even in this case, the client can view the image in the direction he or she desires.
  • the direction instruction is given based on the reference direction of the wide-angle captured image (for example, in front of the local guide). Therefore, when the local guide changes the orientation of his or her body, the orientation of the selected area image changes accordingly. Even if the client wants to see the buildings on the right side of the road, if the local guide turns to the side, he or she will not be able to see the desired direction.
  • To address this, the omnidirectional camera 774 can be attached via a triaxial structure to control its direction, similar to the laser projector 776 with direction control.
  • the laser projector 84 may be fixed to the omnidirectional camera 774, and the projection direction may be controlled by the above three-axis structure.
  • Alternatively, a sensor (such as a three-axis gyro sensor or a three-axis acceleration sensor) may be installed to detect the orientation of the omnidirectional camera 774, so that selected area images in the same orientation are extracted even if the local guide changes orientation. In this case, a triaxial structure is not required.
  • the orientation of the omnidirectional camera 774 may be detected by analyzing the image itself captured by the omnidirectional camera 774.
  • the instruction image 62 is always projected onto the object 52 by the laser projector 84. However, if there are people in the projection direction, the laser projector 84 may be controlled not to emit.
  • the laser projector 84 is used as the projection section.
  • a normal projector may also be used.
  • a three-axis structure 90 (gimbal) is used, but a single-axis structure, a two-axis structure (gimbal), a structure with four or more axes, etc. may also be used.
  • the local guide 54 attaches the marker 60 to the object 52.
  • the marker 60 may be placed on the object 52 at the site in advance.
  • the marker 60 is used to determine the distance and direction to the laser projector 84.
  • it is also possible to use SLAM or the like to grasp this only from the feature points of the captured image and perform similar processing.
  • the smartphone 200 recognizes the feature points 512 (vertices that characterize the image, etc.), transmits them to the client device IT, and displays them on the display 306 as shown in FIG. 27.
  • the instructor looks at this image, operates the mouse 316, and selects a feature point 512 to be used for position specification. It is preferable to select a feature point 512 on the same plane as the object 52 as the feature point 512 used for position identification.
  • when the instruction input mode button 501 is clicked, information on the selected feature points 512 (their coordinate values on the screen) is transmitted to the smartphone 200.
  • the smartphone 200 can specify the position and direction based on these feature points 512.
  • the client confirms the screen shown in FIG. 21b on the client device IT and clicks the instruction input mode button 502.
  • when the client device IT or the smartphone 200 detects that the marker 60 in the captured image has entered a predetermined area (for example, a predetermined central area) of the captured image, it may automatically switch to the instruction input mode. The same applies when processing is performed using the feature points 512 without using the marker 60.
  • the motor control circuit 400 controls the triaxial structure 90, and the smartphone 200 controls the projection position based on image processing.
  • the three-axis structure 90 may also be controlled by the smartphone 200.
  • a circuit may be provided in the base 93 to control the projection position based on image processing.
  • in that case, the smartphone 200 would be used only for telephone calls.
  • a telephone call function may also be provided within the base 93.
  • the instruction image is transformed by the smartphone 200 so that the instruction image is not projected in a distorted (or changed in size) manner.
  • when the shape of the instruction image is not important and what matters is indicating a specific position (for example, indicating the position by the center point of a cross mark), there is no problem even if the instruction image becomes distorted (or changes in size), as long as the position is shown correctly. In such a case, the process of transforming the instruction image may be omitted.
  • the projection position and the like are controlled based on image processing by the smartphone 200 (FIG. 20).
  • the projection position and the like may not be controlled based on image processing by the smartphone 200, and only the processing by the triaxial structure 90 may be performed.
  • in that case, control by the triaxial structure 90 alone is sufficient.
  • the marker 60 may not be used.
  • the smartphone 200 not only performs the controls corresponding to FIGS. 24B and 24C, but also performs the controls corresponding to FIGS. 25 and 26. However, only the controls corresponding to FIGS. 24B and 24C may be performed.
  • when a fixing command is given to the client device IT, a mode is set in which the reference selection area image is displayed as a still image and an instruction image is input. However, if the local guide does not move, the selected area image may instead be displayed as a moving image.
  • the instructor inputs an instruction image in this state and clicks the instruction image transmission button 502.
  • the client device IT and the smartphone 200 may use the captured image at that time as the reference selection area image.
  • a still image is used as the instruction image.
  • a moving image may be used as the instruction image.
  • the on-site device repeatedly reproduces the video.
  • the smartphone 200, the laser projector 776 with direction control, and the omnidirectional camera 774 constitute the on-site device. However, they may be constructed as one. Further, instead of the smartphone 200, a dedicated device, a PC, a tablet, a stick type PC, etc. may be used.
  • the direction of the displayed captured image is changed by operating the direction change button 500.
  • the direction of the displayed captured image may be changed by dragging the screen (moving the cursor while holding down the mouse button).
  • a camera and a projector are attached to the headset 770. However, it may also be attached to something else worn, such as a helmet.
  • It may be attached to a car, bicycle, cart, etc. operated by a local guide.
  • the driving unit 16 controls the projection direction, and the smartphone 200 performs image processing and projection control (step S34) so that the instruction image is correctly displayed by following the marker 60.
  • the control for tracking the marker 60 and the like may also be performed by the drive section 16.
  • the wide-angle captured image is transmitted from the smartphone 200 to the server device SV, and the selected area image is generated by the server device SV.
  • different selection area images can be transmitted to different clients. For example, as shown in FIG. 22, in the case of a tour in which a plurality of clients share a remote experience with one local guide 54, each of the clients A, B, and C can use the direction change button 500 to view the image in the direction he or she wants to see. In this case, only one client can provide the instruction image.
  • selection area images for each client A, B, and C may be generated using the smartphone 200 and transmitted to the client device IT via the server device SV. Communication load can be reduced.
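The per-client selection of a viewing direction from one wide-angle captured image could look roughly like the following sketch. It is an assumption for illustration only: it simply crops a horizontal window out of an equirectangular panorama around each client's requested direction and omits the perspective re-projection a production system would normally apply.

```python
import numpy as np

def selection_area(equirect_frame, yaw_deg, fov_deg=90):
    """equirect_frame: H x W x 3 equirectangular image covering 360 degrees horizontally.
    yaw_deg: direction requested by a client (0 = the camera's reference direction).
    Returns the horizontal slice of the panorama centred on that direction."""
    w = equirect_frame.shape[1]
    center = int((yaw_deg % 360.0) / 360.0 * w)
    half = int(fov_deg / 360.0 * w / 2)
    cols = [(center + dx) % w for dx in range(-half, half)]  # wrap around the seam
    return equirect_frame[:, cols]

# Clients A, B and C can each be served independently from the same frame:
# view_a = selection_area(frame, yaw_deg=30)
# view_b = selection_area(frame, yaw_deg=200)
```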
  • a spherical camera 774 that captures images in all directions is used.
  • a camera that captures 360 degrees in the horizontal direction and a predetermined range in the vertical direction, or a hemispherical camera that captures images below the horizontal, may also be used.
  • an instruction image of a product selection mark is projected at the site.
  • a barcode used for smartphone payment (for example, a PayPay (trademark) payment barcode) may be read by a camera of the client device IT and projected as an instruction image onto a desk or the like at the site. The local shop can read this barcode to settle the payment with the client.
  • the credit card number may be recorded in the server device SV in advance when the client registers as a user.
  • the local guide accesses the server device SV to use the client's credit card number at the time of payment, and sends amount information to the server device SV.
  • the server device SV transmits the credit card number, amount information, etc. to the client device IT or the client's smartphone to request approval.
  • upon approval, the server device SV uses the credit card number and amount to access the credit company's server and process the payment.
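The payment flow described above might be organised roughly as follows. This is a heavily simplified sketch; every function and object name (lookup_card, request_client_approval, charge_card, notify_guide) is a hypothetical placeholder, since the patent does not name any concrete payment API.

```python
def settle_purchase(server_db, client_id, guide_id, amount):
    card_number = server_db.lookup_card(client_id)         # registered when the client signed up
    approved = request_client_approval(client_id, amount)  # push to the client device IT or smartphone
    if not approved:
        return None                                        # payment is not processed
    receipt = charge_card(card_number, amount)             # call out to the credit company's server
    notify_guide(guide_id, receipt)                        # the local guide completes the purchase
    return receipt
```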
  • a stationary PC is used as an example of the client device IT.
  • a smartphone, tablet, laptop computer, etc. may also be used.
  • an image of the local area may be displayed on a head-mounted display (HMD) worn by the client.
  • a 6DoF head tracker may be provided so that the client can change the direction of the local image according to the direction of his or her head, without having to input direction commands using a mouse or the like, allowing the client to enjoy the local scenery naturally.
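Translating the head-tracker output into the same direction command that the mouse-based UI sends could be as simple as the sketch below; the tracker API shown is hypothetical.

```python
def direction_command_from_hmd(tracker):
    """Convert HMD head orientation into a direction command for the server device SV."""
    yaw_deg, pitch_deg = tracker.get_orientation()   # hypothetical 6DoF head tracker call
    return {"yaw": yaw_deg % 360.0,
            "pitch": max(-90.0, min(90.0, pitch_deg))}
```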
  • FIG. 28 shows the functional configuration of the remote experience system according to the third embodiment.
  • a wide-angle camera such as a spherical camera is used as the imaging section 12, and a spherical laser projector is used as the projection section 14.
  • the imaging unit 12 of the field device GT is a wide-angle camera such as a spherical camera, and captures images of the local guide in all spherical directions to generate a wide-angle captured image.
  • This wide-angle captured image is transmitted to the server device SV by the transmitter 22 under the control of the captured image transmitter 18.
  • the server device SV receives this wide-angle captured image, selects a portion in that direction from the wide-angle captured image based on the direction instruction received from the client device IT, and generates a selected area image.
  • the captured image transfer means of the server device SV transmits the generated selected area image to the client device IT using the transmitter 758.
  • the captured image receiving means 36 of the client's device IT receives the selected area image in the direction specified by the client, and displays it on the captured image display section 40.
  • when the client looks at this selection area image and wants to change the direction, he or she inputs a direction instruction into the client device IT.
  • the direction instruction transmitting means 39 of the client device IT transmits the direction instruction to the server device SV through the transmitter 34.
  • the server device SV receives this direction instruction and uses it to select a selection area image from the wide-angle captured image. Further, the direction command transfer means 752 transmits the direction command to the local device GT using the transmitter 758.
  • the client can select and view images in any celestial-sphere direction around the local guide. This allows the client to enjoy the experience of viewing images in any direction without being limited to a fixed field of view.
  • each client can independently enjoy the selected area image in their preferred direction.
  • when giving an instruction, the client inputs a fixing command while the selected area image containing the target object is displayed.
  • the captured image display unit 40 sets the selected area image in that direction as a reference selected area image and displays it as a still image.
  • the direction when the fixed command is given is transmitted to the local device GT via the server device SV.
  • the client inputs an instruction image from the instruction image input section 44 while viewing the local reference selection image displayed on the captured image display section 40.
  • the instruction image transmitting means 38 transmits the instruction image input by the transmitter 34 to the local device GT via the server device SV.
  • the follow-up control means 21 of the field device GT receives the instruction image via the receiving section 24, and controls the projection section 14 so that the instruction image is projected in the direction based on the direction at the time the fixing command was given. As a result, the instruction image 62 is projected onto the target object 52.
  • the direction in which the instruction image is projected by the projection unit 14 matches the direction of the reference selection area image, so the instruction image is projected onto the location intended by the instructor. It is possible to implement only this control; however, if the local guide moves from place to place, the projected position of the instruction image will shift.
  • the correction means 26 of the local device GT compares the characteristic partial images (markers, etc.) on the reference selection area screen with the characteristic partial images in the current selection area image so that the instruction image is correctly projected at the intended position. Then, the instruction image is deformed and the projection position of the instruction image is corrected and controlled. As a result, even if the local guide moves, the instruction image will be displayed in the correct position.
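One plausible way to realise this correction (an assumption, not the patent's stated implementation) is to warp the instruction image with the homography obtained from the marker or feature-point comparison, so that it still lands on the intended spot after the local guide has moved:

```python
import cv2

def corrected_instruction_image(instruction_img, H, projector_size):
    """instruction_img: image drawn by the client in reference-image coordinates.
    H: homography mapping the reference selection area image onto the current one
       (e.g. from marker or feature-point matching).
    projector_size: (width, height) of the projector frame."""
    return cv2.warpPerspective(instruction_img, H, projector_size)
```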
  • a spherical laser projector 780 is used instead of the laser projector 776 with direction control. Therefore, a spherical camera 774 and a spherical laser projector 780 are fixed to the top of the headset 770.
  • the omnidirectional laser projector 780 is configured to be able to project in all directions, up, down, left and right. It may be constructed by combining a plurality of laser projectors.
  • the hardware configuration of the client device IT is the same as that in the first embodiment (see FIG. 5). Furthermore, since the three-axis structure 90 is not used, the motors 92, 94, and 96 for controlling it are unnecessary, and the motor control circuit 400 is also unnecessary.
  • the hardware configuration of the smartphone 200 is the same as that of the first embodiment (see FIG. 3).
  • the hardware configuration of the server device SV is also similar to that of the first embodiment (see FIG. 4).
  • Remote experience processing: FIG. 30 shows a flowchart of the remote experience processing.
  • the smartphone 200 of the local guide 54 acquires a wide-angle captured image of the omnidirectional camera 774 by short-range communication (wired communication may be used), and transmits it to the server device SV via the Internet (step S21).
  • the server device SV receives this wide-angle captured image and records it on the SSD 558.
  • the wide-angle captured image output by the omnidirectional camera 774 is an image captured in all directions.
  • the server device SV selects an image in a predetermined direction from this wide-angle captured image according to the direction command acquired and recorded from the client device IT, and generates a selected area image (step S91). Therefore, the selected area image is the same as the image obtained when the local guide takes an image in a predetermined direction with a normal camera.
  • the server device SV transmits the generated selection area image to the client device IT.
  • the client device IT receives the selected area image and displays it on the display 306 (step S41).
  • FIG. 18 shows the displayed selection area image. This allows the client to view the local video and enjoy seeing the local scene as the local guide moves around.
  • an image in a predetermined direction is selected and displayed as a selected area image. Since this predetermined direction is determined as a direction in the wide-angle captured image, it is determined as the up, down, left and right directions centered on the local guide.
  • when the client wishes to change this predetermined direction and view the image in a different direction, he or she operates the keyboard/mouse 316 of the client device IT and clicks the direction command button 500.
  • the direction command button 500 can be clicked in 360 degrees in the circumferential direction, up, down, left, and right.
  • the client device IT transmits a direction command corresponding to the click to the server device SV (step S42).
  • Server device SV receives this and updates the direction command (step S92). Therefore, the server device SV changes the predetermined direction for selecting the selection area image in step S91, and the image in the direction commanded by the direction command button 500 is displayed on the display 306 of the client device IT. For example, as shown in FIG. 19, selected area images can be viewed in different directions.
  • server device SV transmits the direction command received from the client device IT to the smartphone 200 (step S92), and the smartphone 200 receives this and updates the direction command (step S22).
  • since the wide-angle captured image covers all celestial-sphere directions, it is possible to view selected area images in any direction, up, down, left, or right, centered on the local guide.
  • the client can enjoy the local image in the desired direction through his/her own operations.
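Schematically, the direction-command relay of steps S42, S92 and S22 amounts to the server both updating its own cropping direction and forwarding the command to the smartphone 200. The sketch below is an assumption; the transport and message format are invented for illustration.

```python
def on_direction_command(server_state, smartphone_conn, client_id, command):
    # Used in step S91 to cut the selection area for this client.
    server_state.direction[client_id] = command
    # Step S92: forward to the smartphone 200, which updates its own copy in step S22.
    smartphone_conn.send({"client": client_id, "direction": command})
```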
  • the selected area image in the direction selected by the client is displayed as a moving image on the display 306 of the client's device IT.
  • the client clicks the instruction input mode button 502 while viewing the partial captured image and with the object 52 and marker 60 displayed.
  • the client device IT sets the selected area image at that time as a reference selected area image and displays it on the display 306 as a still image (step S52).
  • the client uses the mouse 316 to input instructions to the local guide as an instruction image for this still image (step S53). For example, as shown in FIG. 21b, in the product image displayed on the display 306, an instruction image 62 (in this example, a circle image) is drawn on the product 52c desired to be purchased using the mouse 316 and input.
  • the client device IT transmits the fixed command and the direction when the fixed command was given to the smartphone 200 via the server SV (steps S51, S93).
  • the smartphone 200 that has received the fixing command determines a reference selected area image based on the selected area image and direction at the time of receiving the fixing command, and records it in the nonvolatile memory 212 (step S32). Note that since the direction command has been received in step S22 of FIG. 30, the smartphone 200 can generate the selected area image from the wide-angle captured image.
  • in this way, the client device IT and the smartphone 200 can both recognize the selected area image captured at the same point in time as the reference selected area image.
  • information identifying the frame, such as a frame number, may be transmitted as well; by having the smartphone 200 determine the reference selection area image based on this frame-identifying information, deviations due to time lag can be prevented.
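A minimal sketch of that frame-synchronisation idea, assuming the fixing command carries the frame number and the smartphone 200 keeps a short buffer of recent frames (all names are illustrative):

```python
def make_fixing_command(direction_deg, frame_number):
    return {"type": "fix", "direction": direction_deg, "frame": frame_number}

def reference_frame(frame_buffer, fixing_command):
    """frame_buffer: recent frames kept on the smartphone 200, keyed by frame number.
    Both ends pick the same frame, so network delay cannot shift the reference image."""
    return frame_buffer.get(fixing_command["frame"])
```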
  • when the client finishes inputting the instruction image, he or she clicks the instruction image transmission button 502 (displayed when the instruction input mode is entered) at the lower right of the reference selection area image on the display 306. Thereby, the client device IT transmits the instruction image to the smartphone 200 via the server device SV (steps S53, S94). Further, the client device IT cancels the instruction input mode, stops displaying the still image as the reference selection area image, and displays the transmitted selection area image as a moving image (step S54). This allows the client to see the local situation again.
  • upon receiving the instruction image data of FIG. 23A, the smartphone 200 stores it in the memory 204. Furthermore, the smartphone 200 acquires the current wide-angle captured image from the omnidirectional camera 774, and extracts a selected area image based on the direction command received in step S22 (step S33).
  • the smartphone 200 compares the marker in the reference selection area image with the marker in the current selection area image, and controls the projection direction of the spherical laser projector 780 so that the instruction image is correctly projected by following the marker (step S34). Furthermore, based on the marker comparison, the instruction image is transformed and its projection position is controlled so that it is projected correctly.
  • since the omnidirectional laser projector 780 is used, it is possible to simultaneously project images instructed by a plurality of clients.
  • a spherical camera 774 that captures images in all directions and a spherical laser projector 780 that projects images in all directions are used.
  • instead of these, a camera that captures 360 degrees in the horizontal direction and a predetermined range in the vertical direction, a hemispherical camera that captures images below the horizontal, or a hemispherical camera that captures images to the front (or rear) may be used; likewise, a laser projector that projects 360 degrees in the horizontal direction over a predetermined vertical range, a hemispherical laser projector that projects downward (or upward) from the horizontal, a hemispherical laser projector that projects forward (or rearward), etc. may be used.
  • a marker is used as the partial feature image, but feature points of the image may also be used.
  • the omnidirectional camera 774 and the omnidirectional laser projector 780 are directly attached to the headset 770. However, it may be attached via a cushioning material such as silicone gel.
  • the direction instruction is given based on the reference direction of the wide-angle captured image (for example, in front of the local guide). Therefore, when the local guide changes the orientation of his or her body, the orientation of the selected area image changes accordingly. Even if the client wants to see the buildings on the right side of the road, if the local guide turns to the side, he or she will not be able to see the desired direction.
  • the omnidirectional camera 774 (the omnidirectional laser projector 780) can be attached via a three-axis structure to perform direction control.
  • FIG. 32 shows the functional configuration of the remote experience system according to the fourth embodiment.
  • the local guide wears the imaging section 12 and the projection section 14 via the driving section 16.
  • the imaging area of the imaging unit 12 and the projection area of the projection unit 14 are arranged to be substantially the same.
  • imaging section 12 and projection section 14 are integrally configured so that their imaging direction and projection direction can be changed by a driving section 16.
  • the imaging direction and projection direction of the imaging unit 12 and the projection unit 14 are detected by the sensor 28.
  • the direction control means 20 controls the drive unit 16 based on the output of the sensor 28, and maintains the imaging unit 12 and the projection unit 14 pointed in a predetermined direction centered on the local guide, regardless of the movement of the local guide.
  • the imaging unit 12 of the local device GT images the site and generates a captured image.
  • This captured image is transmitted by the transmitting unit 22 to the client device IT via the server device SV under the control of the captured image transmitting means 18.
  • the captured image receiving means 36 of the client device IT receives the captured image by the receiving unit 32.
  • the captured image display unit 40 displays the received captured image. This allows the instructor to view images of the site.
  • when the client wishes to look in a different direction, he or she inputs a direction instruction into the client device IT.
  • the direction instruction transmitting means 39 of the client device IT transmits this direction instruction to the local device GT via the server device SV.
  • the direction control means 20 of the field device GT controls the drive section 16 and changes the predetermined directions of the imaging section 12 and the projection section 14 according to the direction instruction.
  • the imaging direction at the site is changed according to the direction instruction, and the captured image displayed on the client device IT is also changed to one in a different direction.
  • the client can look in the direction he/she wants to see, regardless of the orientation of the local guide.
  • when giving instructions, the client inputs a fixing command. When a fixing command is input, the captured image display unit 40 uses the captured image at that time as a reference captured image and displays it as a still image. The client inputs an instruction image from the instruction image input section 44 while viewing the reference captured image of the site displayed on the captured image display section 40.
  • the instruction image transmitting means 38 transmits the instruction image input by the transmitter 34 to the local device GT via the server device SV.
  • the local device GT receives the instruction image through the receiving section 24 and projects the instruction image from the projection section 14. As a result, the instruction image 62 is projected onto the target object 52. As described above, since the projection direction of the projection unit 14 is fixed, even if the local guide changes the direction of his or her face, the instruction image will be projected onto the location intended by the client. However, if the local guide moves from place to place, the projected position of the instruction image will shift.
  • the correction means 26 of the field device GT compares the characteristic partial image (marker, etc.) in the reference captured image with the characteristic partial image in the current captured image so that the instruction image is correctly projected at the intended position.
  • the projection position of the instruction image is corrected and controlled. As a result, even if the local guide moves, the instruction image will be displayed in the correct position.
  • the transmitting unit 34 transmits the fixing command to the local device GT via the server device SV.
  • the receiving unit 24 of the field device GT receives this and records the captured image at that time as a reference captured image. Further, the local guide places a marker near the object 52 so that the image is taken.
  • the local guide can receive instructions on the products to purchase and the direction in which he or she should proceed based on the instruction image 62 actually projected at the site.
  • This instruction image 62 is displayed at the correct position by the direction control means 20 even if the local guide changes the direction of his or her head. Therefore, the instruction image 62 does not disappear depending on the direction of the local guide's head, reducing stress.
  • the instruction image 62 will continue to be projected, so the local guide will not be confused. Furthermore, even if the local guide moves, the instruction image 62 is displayed correctly.
  • Appearance and hardware configuration: FIG. 33 shows the state in which the local guide 54 is wearing the local device GT.
  • attached to the headset 770 is a camera/laser projector complex 58.
  • the configuration of the camera/laser projector complex 58 is as shown in FIG.
  • the basic configuration is similar to the direction-controlled laser projector 776 of the second embodiment.
  • not only the laser projector 84 but also the camera 82 is housed within the unit 80. Therefore, both the laser projector 84 and the camera 82 are directionally controlled.
  • the projection area of the laser projector 84 and the imaging area of the camera 82 are configured to substantially match.
  • the hardware configuration of the smartphone 200 is the same as that shown in FIG. 3, the hardware configuration of the server device SV is the same as that shown in FIG. 4, the hardware configuration of the client device IT is the same as that shown in FIG. 5, and the hardware configuration of the motor control circuit 400 is the same as that shown in FIG. 16.
  • Remote experience processing: FIG. 35 shows a flowchart of the processing during guidance.
  • the local guide's smartphone 200 acquires the image (video) captured by the camera 82 through short-range communication, and transmits it to the server device SV (step S21).
  • the server device SV receives this captured image and transfers it to the client device IT (step S91).
  • the client device IT receives the captured image and displays it on the display 306 (step S41).
  • FIG. 18 shows the displayed selection area image. This allows the client to view the local video and enjoy seeing the local scene as the local guide moves around.
  • the camera/laser projector integrated body 58 is provided with a gyro sensor JS and an acceleration sensor AS.
  • the motor control circuit 400 receives the outputs of the gyro sensor JS and the acceleration sensor AS, and detects the orientation of (the base 93 of) the camera/laser projector integrated body 58 (step S1).
  • the motor control circuit 400 controls the motors 92, 94, and 96 so that the unit 80 is oriented in a predetermined direction even if the orientation of the base 93 changes (step S2). Therefore, an image in a predetermined direction is captured regardless of the direction of the local guide.
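The stabilisation in steps S1 and S2 can be pictured with the following sketch. It is only an illustration under assumptions: the sensor-fusion and motor interfaces are hypothetical, and real firmware would use proper filtering and PID control rather than a direct correction per axis.

```python
def stabilization_step(orientation_sensor, motors, target):
    """Keep the unit 80 pointed in the commanded direction even when the base 93 rotates.
    target: dict with the commanded yaw / pitch / roll (the 'predetermined direction')."""
    yaw, pitch, roll = orientation_sensor.read()      # fused gyro + accelerometer estimate
    motors.yaw.rotate_by(target["yaw"] - yaw)         # cancel the base's rotation on each axis
    motors.pitch.rotate_by(target["pitch"] - pitch)
    motors.roll.rotate_by(target["roll"] - roll)
```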
  • the direction command button 500 can be clicked in 360 degrees in the circumferential direction, up, down, left, and right.
  • the client device IT transmits a direction command corresponding to the click to the server device SV (step S42).
  • Server device SV receives this and transfers it to smartphone 200 (step S92).
  • the smartphone 200 that has received the direction command transfers it to the motor control circuit 400 (step S22).
  • the motor control circuit 400 changes the direction (predetermined direction) to be constantly controlled (step S3).
  • the client can view a stable image in the desired direction through his/her own operations.
  • the client can give instructions to the local guide by projecting an image onto a local object.
  • An instruction image is transmitted from the client's device IT, and is projected onto an object at the site using a laser projector 84 worn by the local guide.
  • the guide can provide accurate instructions.
  • FIG. 36 shows a flowchart of instruction image projection.
  • as shown in FIG. 21a, a case will be described in which the client instructs the purchase of product 52c from among the products 52a, 52b, 52c, 52d, etc. lined up at a store.
  • the client instructs the local guide 54 to place a marker 60 prepared in advance nearby by voice call or the like.
  • the size and shape of the image of the marker 60 are recorded in advance in the nonvolatile memory 212 of the smartphone 200. Therefore, the smartphone 200 can calculate the distance, direction, etc. from the laser projector 84 to the marker 60 based on the captured image of the marker 60.
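Because the physical size of the marker 60 is known, a rough distance and bearing can be derived from its apparent size and position in the captured image. The sketch below uses a simple pinhole approximation and invented parameter names; a real implementation would use a calibrated camera model (for example, a solvePnP-style pose estimate).

```python
def marker_distance_and_bearing(marker_px_width, marker_real_width_m,
                                marker_center_x, image_width, focal_length_px, hfov_deg):
    """Estimate distance (m) and horizontal bearing (deg) of the marker 60
    relative to the camera / laser projector 84."""
    distance_m = marker_real_width_m * focal_length_px / marker_px_width
    # Horizontal angle of the marker centre relative to the optical axis.
    bearing_deg = (marker_center_x - image_width / 2) / image_width * hfov_deg
    return distance_m, bearing_deg
```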
  • the captured image is displayed as a moving image on the display 306 of the client's device IT. While looking at this captured image, the client clicks the instruction input mode button 501 with the marker 60 captured as shown in FIG. 21a to give a fixing instruction. Note that here, it is assumed that the marker 60 prepared as a card is placed leaning against the product 52b.
  • the client device IT sets the captured image at that time as a reference captured image and displays it on the display 306 as a still image (step S52).
  • the client uses the mouse 316 to input instructions to the local guide as an instruction image for this still image (step S53). For example, as shown in FIG. 21b, a circle mark 62 is drawn using the mouse 316 on the image of the product 52c displayed on the display 306, and inputted.
  • the client device IT transmits the fixed command to the smartphone 200 via the server device SV (steps S51, S93).
  • the smartphone 200 that has received the fixing instruction records the captured image when receiving the fixing instruction in the nonvolatile memory 212 as a reference captured image (step S32).
  • the client device IT and the smartphone 200 can recognize captured images taken at the same time as reference captured images.
  • the client clicks the instruction image transmission button 502 (displayed when the instruction input mode is entered) displayed at the lower right of the reference captured image on the display 306.
  • the client device IT transmits the instruction image to the smartphone 200 via the server device SV (steps S53, S94).
  • the client device IT cancels the instruction input mode, stops displaying the still image as the reference captured image, and displays the transmitted captured image as a moving image (step S54). This allows the instructor to see the local situation again.
  • upon receiving the instruction image data of FIG. 24A, the smartphone 200 stores it in the memory 204. Furthermore, the smartphone 200 acquires the current captured image from the camera 82 (step S33).
  • the imaging range of the camera 82 and the projection range of the laser projector 84 are configured to be the same. Therefore, if the current captured image is exactly the same as the recorded reference captured image (that is, if the local guide has not moved at all since the reference captured image was recorded), projecting the instruction image data with the laser projector 84 causes the instruction image 62 to be projected onto the product 52c.
  • since the position of this instruction image 62 matches the position input by the client on the display 306, the target product 52c can be accurately shown to the local guide.
  • the local guide can use the instruction image 62 as a landmark to purchase the product 52c without making a mistake.
  • a correction means 26 is provided for correctly displaying the instruction image by performing image processing or projection control.
  • alternatively, only the direction control means 20, which controls the projection direction of the projection section 14 via the drive section 16, may be sufficient.
  • a laser projector 94 is provided to project the instruction image 62.
  • the laser projector 94 may not be provided. Even in this case, the client can view the image in the direction he or she desires.
  • the driving unit 16 controls the projection direction, and the smartphone 200 performs image processing and projection control (step S34) so that the instruction image is correctly displayed by following the marker 60.
  • control for tracking the marker 60 and the like may also be controlled by the drive section 16.

Abstract

[Problem] To provide a more advanced remote experience system allowing an experience of sightseeing and shopping at a remote place. [Solution] An available device list transmission means 652 of a server device SV transmits a generated available device list to a requester device IT by means of a transmission unit 644. An available device list display means 666 of the requester device IT displays the available device list on a display unit 668. A requester refers to the list displayed on the display unit 668 and selects any local device GT1, GT2... GTn (i.e., local guide). A request information reception means 618 of the local device GT1 receives request information by means of a reception unit 622. This causes the local device GT1 to enter a guide mode. In the guide mode, the local device GT1 captures an image of the local by means of an image capturing unit 612. A captured-image transmission means 616 transmits the captured image of the local area to the server device SV by means of a transmission unit 620. A captured-image transfer means 650 of the server device SV transfers the captured image to the requester device IT. A captured-image reception means 670 of the requester device IT receives the captured image by means of a reception unit 662. A captured-image display means 672 displays, on the display unit 668, the captured image of the local area that has been received.

Description

Remote experience system
 この発明は、遠隔から現地の状況を見たり、買い物を行ったりする遠隔体験システムに関するものである。 This invention relates to a remote experience system for viewing local conditions and shopping remotely.
 実際に現地までいかなくとも旅行や買物を体験することのできるシステムが提案されている。 A system has been proposed that allows people to experience traveling and shopping without actually going to the location.
 特許文献1には、現地にて走行しているバスなどから撮像した画像を、サーバ装置を介して、リアルタイムでユーザである端末装置に送信するシステムが開示されている。これにより、ユーザは、自宅などにいながら現地の景色などを楽しむことができる。 Patent Document 1 discloses a system that transmits images captured from a bus or the like running in a local area to a terminal device, which is a user, in real time via a server device. This allows the user to enjoy the local scenery while staying at home or the like.
 特許文献2には、ユーザから離れたところにいる現地の人が動画を撮像し、これをユーザの端末装置に送信して、ユーザが閲覧できるようにしている。また、その撮像方向などについて、ユーザが現地の人に指示を行うようにしている。これにより、ユーザの希望する方向の画像を楽しむことができる。 In Patent Document 2, a local person located far away from the user captures a video and transmits it to the user's terminal device so that the user can view it. In addition, the user gives instructions to local people regarding the imaging direction, etc. This allows the user to enjoy images in the desired direction.
 このように、上記システムによれば、遠隔からであっても、観光などの体験を行うことができる。 In this way, according to the above system, it is possible to experience sightseeing even remotely.
JP 2013-66017 A; Japanese Patent No. 6332718
 しかしながら、特許文献1や特許文献2のような従来技術では、依頼人が現地案内人を適切に選択して決定するシステムが用意されておらず、依頼が容易ではないという問題があった。 However, the conventional techniques such as Patent Document 1 and Patent Document 2 do not have a system in place for a client to appropriately select and decide on a local guide, and there is a problem in that it is not easy to make a request.
 また、特許文献1のような従来技術では、設置されたカメラの撮像する方向を、遠隔のユーザが変更することはできなかった。このため、ユーザが見たい方向の景色を見ることができないという問題があった。 Further, in the conventional technology such as Patent Document 1, a remote user cannot change the direction in which an installed camera takes images. For this reason, there was a problem in that the user could not see the scenery in the direction he or she wanted to see.
 特許文献2では、遠隔のユーザが撮像方向を変える場合、現地の人に指示を出して撮像方向を変えなければならず、ユーザが見たい方向を見るのが容易ではなかった。 In Patent Document 2, when a remote user wants to change the imaging direction, he or she has to instruct a local person to change the imaging direction, making it difficult for the user to see the direction he or she wants to see.
 また、ユーザの側から現地案内人に対する音声指示では指示内容を明確に伝えることが難しいという問題があった。 Additionally, there was a problem in that it was difficult to clearly convey the instruction content when the user gave voice instructions to the local guide.
 この発明は、上記のような問題点のいずれかを解決して、遠隔地において観光やショッピングを体験することのできる、より進化した遠隔体験システムを提供することを目的とする。 The purpose of this invention is to solve any of the above-mentioned problems and provide a more advanced remote experience system that allows people to experience sightseeing and shopping in remote locations.
Hereinafter, some independently applicable features of this invention are listed.
(1) to (6) The remote experience system according to the present invention is a remote experience system comprising a plurality of local devices, a server device, and a client device,
When the local guide or the local guide robot is in a state where the local guide or the local guide robot is ready to guide, the local device transmits an image of the local guide or the local guide robot to the imaging unit attached to the body or a moving body that moves with the local guide, and the transmitting unit. A guide capable transmitting means transmits the location and the fact that guidance is possible to the server device as possible information, and a receiving unit receives request information from the server device and requests information to enter a guide mode for the client device. and a captured image transmitting unit for transmitting the captured image of the imaging unit to the server device by the transmitting unit in the guide mode,
The server device includes a possibility information receiving means for receiving possibility information from the local device by a receiving section, and a possibility information receiving means for receiving possibility information from the local device by a receiving section, and a possible device list in which local devices that have transmitted the possibility information are arranged on a map by a transmitting section. a means for transmitting a list of devices that can be transmitted to;
request information transfer means for transmitting request information received from the client's device by the receiving section to the local device by the transmitting section; receiving a captured image from the local device by the receiving section; and a captured image transfer means for transmitting to the client device,
The client device includes, by means of a receiving unit, possible device list receiving means for receiving the possible device list, possible device list display means for displaying the possible device list on a display unit, and a possible device list displaying means for displaying the possible device list on a display unit. A guidance request transmitting means for transmitting the identification code of the local device selected by the client's operation, the guidance request, and the identification code of the client's device as request information to the server device by the transmitting unit, and the receiving unit receives the captured image. and a captured image display unit that displays the received captured image using a display section.
Therefore, the client can select a guide or guide robot near the place he or she wants to experience and receive guidance.
(7) The system according to this invention is characterized in that the imaging unit of the local device is a wide-angle imaging unit that outputs a wide-angle captured image; the captured image transfer means of the server device, upon receiving a direction instruction from the client device, selects the area corresponding to that direction instruction from the received wide-angle captured image and transmits it to the client device as a selected area image by the transmitting unit; and the client device comprises direction instruction transmitting means for transmitting, to the server device, a direction instruction input by the client while viewing the selected area image displayed on the display unit.
Therefore, an image in the direction desired by the client can be obtained regardless of the orientation of the local guide or the local guide robot.
(8) The system according to this invention is characterized in that the wide-angle imaging unit of the local device detects changes in its own orientation and outputs the wide-angle captured image so that a predetermined direction remains the reference direction, regardless of the movement of the local guide, the local guide robot, or the moving body, and that the direction instruction is given with respect to this reference direction.
Therefore, even if the local guide or local guide robot changes direction, the direction of the image seen by the client remains constant.
(9) The system according to this invention is characterized in that a plurality of client devices are provided that receive and display the selected area image from the local device via the server device, and the direction instructions given by the respective client devices differ from one another.
Therefore, even in a remote experience tour in which multiple clients participate, each client can enjoy images in his or her desired direction.
(10) The system according to this invention:
a projection unit in which the local device is attached to a local guide or a local guide robot on his or her own body or a moving body that moves with the local guide, and projects an instruction image onto the local space based on given instruction image data; In response to outputs from a drive unit that changes the projection direction of the projection unit and a sensor that detects the orientation of the projection unit, the projection unit changes regardless of the movement of the local guide, the local guide robot, or the moving body. further comprising a direction control means for controlling a drive unit so that the projection direction is directed in a predetermined direction with the local guide, the local guide robot, or the moving body as the center;
The client device includes a fixing command means for giving a fixing command to the local device by a transmitting unit, and when there is a fixing command, the client device sets a vicinity of the characteristic partial image of the captured image as a reference captured image, and sets the reference captured image as a reference captured image. The present invention is characterized in that it further comprises instruction image transmitting means for transmitting instruction image data specifying the position of the instruction image to the local device via the server device.
Therefore, the client can project an image-based instruction for the local guide or the local guide robot at a precise position on site.
(11) The system according to this invention is characterized by further comprising correction means for correcting the projection of the instruction image by the projection unit, independently of the drive unit, so that the instruction image is correctly displayed with reference to a predetermined part of the local space.
Therefore, the instruction image can be projected even more accurately.
(12) The system according to this invention:
The on-site device is a wide-angle projection unit that is attached to the on-site guide or the on-site guide robot on his or her own body or on a moving object that moves with the on-site guide, and projects an instruction image into the on-site space based on given instruction image data. , a wide-angle projection unit so that the projection direction of the projection unit faces a predetermined direction with the local guide, the local guide robot, or the mobile body as a center regardless of the movement of the local guide, the local guide robot, or the mobile body; directional control means for controlling the projection by the
The client device includes a fixing command means for giving a fixing command to the local device by a transmitting unit, and when there is a fixing command, the client device sets a vicinity of the characteristic partial image of the captured image as a reference captured image, and sets the reference captured image as a reference captured image. The present invention is characterized in that it further comprises instruction image transmitting means for transmitting instruction image data specifying the position of the instruction image to the local device via the server device.
Therefore, the client can project an image-based instruction for the local guide or the local guide robot at a precise position on site.
(13) The system according to this invention is characterized by further comprising correction means for correcting the projection of the instruction image by the projection unit so that the instruction image is correctly displayed with reference to a predetermined part of the local space.
Therefore, the instruction image can be projected even more accurately.
(14) The system according to this invention:
The on-site device receives outputs from a drive unit that changes the imaging direction of the imaging unit and a sensor that detects the orientation of the imaging unit, regardless of the movement of the on-site guide, the on-site guide robot, or the mobile object. further comprising a direction control means for controlling a drive unit so that the imaging unit faces in a predetermined direction based on a direction instruction with the local guide or local guide robot as the center;
The client device is characterized in that it includes a direction instruction transmitting means for transmitting a direction instruction input by the client to the server device while looking at the captured image displayed on the display unit.
Therefore, the client can view an image in the desired direction through his or her own operation.
(15) The system according to this invention:
The local device further includes a projection unit that is attached to a local guide or a local guide robot on his or her own body or a moving body that moves with the local guide, and projects an instruction image onto the local space based on given instruction image data. Prepare,
The drive unit also changes the projection direction of the projection unit, and receives an output from a sensor that detects the direction of the projection unit, and changes the projection direction regardless of the movement of the local guide, the local guide robot, or the mobile object. , further comprising a direction control means for controlling a drive unit so that the projection direction of the projection unit is directed in a predetermined direction with the local guide, the local guide robot, or the moving body as the center;
The client device includes a fixing command means for giving a fixing command to the local device by a transmitting unit, and when there is a fixing command, the client device sets the vicinity of the characteristic partial image of the captured image as a captured image of interest, and sets the captured image of interest to the captured image of interest. The present invention is characterized in that it further comprises instruction image transmitting means for transmitting instruction image data specifying the position of the instruction image to the local device via the server device.
Therefore, the client can project an image-based instruction for the local guide or the local guide robot at a precise position on site.
(16) The system according to this invention is characterized by further comprising correction means for correcting the projection of the instruction image by the projection unit, independently of the drive unit, so that the instruction image is correctly displayed with reference to a predetermined part of the local space.
The "guidance-capable transmitting means" corresponds, in the embodiments, to step S103.
The "request information receiving means" corresponds, in the embodiments, to step S104.
The "captured image transmitting means" corresponds, in the embodiments, to steps S106, S21, S30, and S35.
The "availability information receiving means" corresponds, in the embodiments, to step S121.
The "available device list transmitting means" corresponds, in the embodiments, to step S123.
The "request information transfer means" corresponds, in the embodiments, to step S124.
The "captured image transfer means" corresponds, in the embodiments, to steps S125 and S91.
The "available device list display means" corresponds, in the embodiments, to step S143.
The "guidance request transmitting means" corresponds, in the embodiments, to step S144.
The "captured image receiving means" corresponds, in the embodiments, to steps S145 and S41.
The "captured image display means" corresponds, in the embodiments, to steps S146 and S41.
The term "device" is a concept that covers not only a single computer but also a plurality of computers connected via a network or the like. Therefore, when the means of the present invention (or part of the means) is distributed over a plurality of computers, those computers correspond to the device.
The term "program" is a concept that covers not only programs directly executable by a CPU, but also programs in source form, compressed programs, encrypted programs, programs that perform their functions in cooperation with an operating system, and the like.
Brief description of the drawings:
A functional configuration diagram of a remote experience system according to an embodiment of the present invention.
The system configuration of the remote experience system.
The hardware configuration of the smartphone GT.
The hardware configuration of the server device SV.
The hardware configuration of the client device IT.
A flowchart of the local guide determination process.
A flowchart of the local guide determination process.
A flowchart of the guidance process.
An example screen display of the available device list.
An example of a payment screen.
A functional configuration diagram of a remote experience system according to a second embodiment.
A diagram showing a local guide wearing the local device ST.
A diagram showing the projector 776 with direction control.
A diagram showing a cross section of the unit 80.
A diagram showing the mounting positions of the gel bushes 120.
The hardware configuration of the motor control circuit 400.
A flowchart of the guidance process.
An image of the site displayed on the client device IT.
An image of the site from a different angle displayed on the client device IT.
A flowchart of the guidance process.
An image when the client inputs an instruction at a store.
An image when the client inputs an instruction at a store.
An example image when there are multiple clients.
An example of instruction image data.
A diagram showing the relationship between the movement of the local guide 56 and the instruction image display.
A diagram showing the relationship between the movement of the local guide 56 and the instruction image display.
A diagram showing the relationship between the movement of the local guide 56 and the instruction image display.
A diagram showing the case where image feature points 512 are used instead of the marker 60.
The functional configuration of a remote experience system according to a third embodiment.
A diagram showing the local guide 54 wearing the local device GT.
A flowchart of the guidance process.
A flowchart of the guidance process.
The functional configuration of a remote experience system according to a fourth embodiment.
A diagram showing the local guide 54 wearing the local device GT.
A diagram showing the camera/laser projector integrated body 90.
A flowchart of the guidance process.
A flowchart of the guidance process.
1. First embodiment
1.1 Functional configuration
FIG. 1 shows the functional configuration of a remote experience system according to an embodiment of the present invention. Local devices GT1, GT2, ... GTn are provided so as to be able to communicate with the server device SV. A client device IT is also provided so as to be able to communicate with the server device SV.
The local devices GT1, GT2, ... GTn are mobile terminal devices, such as smartphones, carried by the local guides at sites such as tourist spots and shopping malls. A plurality of local devices GT1, GT2, ... GTn are provided, but the following description focuses on the local device GT1.
When a local guide becomes available for guidance, the local guide inputs that fact into the local device GT1. The guidance capability transmitting means 614 of the local device GT1 uses the transmitting unit 620 to transmit, as availability information, the fact that guidance is possible, the identification code of the local device GT1, and its current position to the server device SV. The current position of the local device GT1 is obtained, for example, from a built-in GPS receiver.
Local devices other than the local device GT1 likewise transmit the above availability information to the server device SV when their local guides become available for guidance.
The availability information receiving means 646 of the server device SV receives these pieces of availability information through the receiving unit 642. The server device SV can therefore determine which local devices are currently available for guidance. Based on the availability information received from the local devices GT1, GT2, ... GTn, the server device SV generates a possible device list showing on a map the local devices that are available for guidance. The possible device list transmitting means 652 uses the transmitting unit 644 to transmit the generated possible device list to the client device IT.
The possible device list receiving means 664 of the client device IT receives the possible device list. The possible device list display means 666 displays the possible device list on the display unit 668. The client can thus confirm, on the map, the local guides who are available to provide guidance and their locations.
Referring to the list displayed on the display unit 668, the client selects one of the local devices GT1, GT2, ... GTn (that is, one of the local guides). Here, the description proceeds on the assumption that the local device GT1 has been selected. The guide request transmitting means 676 uses the transmitting unit 674 to transmit the identification code of the selected local device GT1 and the identification code of the client device to the server device SV as request information.
The request information transfer means 648 of the server device SV transfers this request information to the selected local device GT1.
The request information receiving means 618 of the local device GT1 receives the request information through the receiving unit 622. The local device GT1 thereby enters the guidance mode.
In the guidance mode, the local device GT1 captures images of the site with the imaging unit 612. The captured image transmitting means 616 uses the transmitting unit 620 to transmit the captured image of the site to the server device SV. The captured image transfer means 650 of the server device SV transfers the captured image to the client device IT.
The captured image receiving means 670 of the client device IT receives the captured image through the receiving unit 662. The captured image display means 672 displays the received captured image of the site on the display unit 668.
As described above, the client can select a local guide of his or her choice and request guidance.
1.2 Hardware configuration
FIG. 2 shows the system configuration of the remote experience system. In this embodiment, smartphones GT1, GT2, ... GTn are used as the local devices. The smartphones GT1, GT2, ... GTn carried by the local guides can communicate with the server device SV via the Internet. The client device IT used by the client can also communicate with the server device SV via the Internet.
FIG. 3 shows the hardware configuration of the smartphone GT. Connected to the CPU 202 are a memory 204, a touch display 206, a short-range communication circuit 208, a built-in camera 217 serving as the imaging unit, a nonvolatile memory 212, a speaker 214, a microphone 216, a communication circuit 218, and a GPS receiver 219.
The short-range communication circuit 208 is a circuit for short-range communication such as Bluetooth. The communication circuit 218, in contrast, is a circuit for communicating with a base station in order to connect to the Internet. The GPS receiver 219 receives radio waves from satellites to acquire the device's own position. The built-in camera 217 captures still images and videos of the site. The microphone 216 collects the guide's voice and local sounds.
An operating system 222 and a field program 224 are recorded in the nonvolatile memory 212. The field program 224 performs its functions in cooperation with the operating system 222.
FIG. 4 shows the hardware configuration of the server device SV. A memory 554, a display 556, an SSD 558, a DVD-ROM drive 560, and a communication circuit 562 are connected to the CPU 552. The communication circuit 562 is for connecting to the Internet.
An operating system 564 and a server program 566 are recorded on the SSD 558. The server program 566 performs its functions in cooperation with the operating system 564. These programs were recorded on a DVD-ROM 568 and installed onto the SSD 558 via the DVD-ROM drive 560.
FIG. 5 shows the hardware configuration of the client device IT. A memory 304, a display 306, a microphone 308, a communication circuit 310, an SSD 312, a DVD-ROM drive 314, a mouse/keyboard 316, and a speaker 318 are connected to the CPU 302. The communication circuit 310 is for connecting to the Internet.
An operating system 320 and a client program 322 are recorded on the SSD 312. The client program 322 performs its functions in cooperation with the operating system 320. These programs were recorded on a DVD-ROM 324 and installed onto the SSD 312 via the DVD-ROM drive 314.
1.3 Remote experience processing
FIGS. 6 to 8 show flowcharts of the remote experience processing. FIGS. 6 and 7 show the local guide determination processing, and FIG. 8 shows the guidance processing.
A local guide who owns a local device GT registers in advance with the server device SV by transmitting the local guide's name, photograph, address, and payment details (such as the account to which guidance fees are to be transferred). The server device SV records this information together with the identification code of the local device GT.
When the local guide is available to provide guidance, the local guide inputs that fact into the local device (smartphone) GT that he or she carries. Upon receiving this input (step S101), the CPU 202 of the local device GT (hereinafter sometimes simply called the local device GT) acquires the device's position from the GPS receiver 219 (step S102). The local device GT then transmits availability information, including the fact that guidance is possible, its own identification code, and its own position (latitude and longitude), to the server device SV (step S103).
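As an illustration of step S103 only, the following sketch shows one way the local device might package and send the availability information; the HTTP endpoint, the use of JSON, and the field names are assumptions introduced here for illustration and are not part of the embodiment.

    import json
    import urllib.request

    def send_availability(device_id, latitude, longitude,
                          server_url="https://example.com/api/availability"):
        # Availability information: guidance is possible, plus the device's
        # identification code and its position from the GPS receiver.
        payload = {
            "available": True,
            "device_id": device_id,
            "lat": latitude,
            "lon": longitude,
        }
        req = urllib.request.Request(
            server_url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status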
The server device SV receives the availability information and records it in nonvolatile storage (step S121). Since there are many local devices, a large amount of availability information accumulates in the server device SV. If a local guide becomes unable to provide guidance, for example because something else has come up, the local device transmits that fact to the server device SV, and the server device SV changes the status of that local device GT from available to unavailable.
A client who wishes to have a remote experience operates the client device IT, specifies the place or area to be experienced, and enters a request for a list of local devices GT that are available for guidance. In response, the CPU 302 of the client device IT (hereinafter sometimes simply called the client device IT) requests the list of available local devices GT from the server device SV (step S141). For example, if the client wishes to be guided around the Metropolitan Museum of Art, the client enters "Metropolitan Museum of Art", or designates it on a map, to request a list of local devices GT available for guidance there.
In response to this request, the server device SV extracts, based on the accumulated availability information, the local devices GT near the Metropolitan Museum of Art that are available for guidance, and places them on a map to generate the possible device list (step S122). The possible device list may instead be generated in advance for each location.
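A minimal sketch of this extraction step follows: it filters the accumulated availability records by distance from the requested place and sorts them for placement on the map. The record layout and the fixed search radius are assumptions for illustration.

    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two latitude/longitude points, in km.
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def possible_device_list(records, place_lat, place_lon, radius_km=2.0):
        # records: e.g. {"device_id": ..., "lat": ..., "lon": ..., "available": True}
        nearby = []
        for rec in records:
            if not rec.get("available"):
                continue
            d = haversine_km(place_lat, place_lon, rec["lat"], rec["lon"])
            if d <= radius_km:
                nearby.append({**rec, "distance_km": round(d, 3)})
        # Nearest devices first; the client device IT plots these on the map.
        return sorted(nearby, key=lambda r: r["distance_km"])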
The server device SV transmits the generated possible device list to the client device IT (step S123). The client device IT receives the list (step S142) and displays it on the display 306 (step S143).
FIG. 9 shows the possible device list displayed on the client device IT. The local devices GT near the designated Metropolitan Museum of Art are displayed together with the names of their local guides, and each local device GT is associated with its identification code. When the client places the mouse cursor on a local device GT, that local guide's profile, ratings from past clients, facial photograph, and the like are sent from the server device SV and can be viewed.
Referring to this information, the client selects one of the displayed local devices GT (local guides) with the mouse/keyboard 316. The client device IT transmits the identification code of the selected local device GT together with the identification code of the client device IT to the server device SV as request information (step S144). Assume, for example, that Emma's local device GT is selected.
The server device SV transfers this request information to Emma's selected local device GT (step S124). The CPU 202 of Emma's local device GT (hereinafter sometimes simply called the local device GT) receives the request information (step S104). Emma's local device GT thereby enters the guidance mode (step S105).
In this way, the client can compare the local guides near the place where guidance is desired and select the desired local guide.
FIG. 8 shows a processing flowchart for the guidance mode. Emma, the local guide carrying the local device GT, and the client operating the client device IT can talk to each other by voice, via the server device SV (or directly), using the Internet call functions of the local device GT and the client device IT (steps S106, S125, S145, S146, S147, S126, S107, and S108 in FIG. 8).
The client therefore tells the local guide Emma that he or she wants to go to the Metropolitan Museum of Art. The local guide Emma then captures video with the camera of the local device GT and heads for the Metropolitan Museum of Art. The client can watch the captured video and listen to the surrounding sounds on the client device IT.
Upon arriving at the ticket counter of the Metropolitan Museum of Art, the local guide Emma operates the local device GT to display a payment window 700 superimposed on the video being captured, as shown in FIG. 10. The payment window 700 displays a two-dimensional barcode for accessing a site for paying the admission fee of the Metropolitan Museum of Art. Such two-dimensional barcodes are preferably obtained in advance from the target facilities or shops and stored in the server device SV. The local guide can keep them on his or her own smartphone GT or obtain them from the server device SV.
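Purely as an illustration of how such a barcode could be prepared in advance, the sketch below generates a two-dimensional barcode image from a payment URL; the third-party qrcode package and the URL shown are assumptions, and the embodiment itself only requires that a pre-registered barcode be displayed over the video.

    # Minimal sketch, assuming the facility's payment URL is already known.
    import qrcode  # third-party package: pip install "qrcode[pil]"

    def make_payment_barcode(payment_url, out_path="payment_qr.png"):
        img = qrcode.make(payment_url)  # build the two-dimensional barcode
        img.save(out_path)
        return out_path

    make_payment_barcode("https://example.com/pay/admission")  # placeholder URL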
The client reads the two-dimensional barcode displayed on the client device IT with a smartphone or the like and makes the payment. Alternatively, the local guide Emma may pay the admission fee on the client's behalf and settle it later together with the guidance fee.
Thereafter, the client can watch, on the client device IT, video captured inside the Metropolitan Museum of Art under the guidance of the local guide Emma.
As described above, the client can select a local guide and experience a distant place from a remote location. The guidance fee can likewise be settled using a payment two-dimensional barcode 700 prepared in the same manner as in FIG. 10.
1.4 Other examples
(1) In the above embodiment, the client can check a local guide's profile and other information when selecting a local guide. Instead of, or in addition to, this, the client may be allowed to select a local guide after checking the camera image of that guide's local device GT. This makes it possible to check, for example, the quality of the camera image of the local device.
The client may also be allowed to talk with a local guide by telephone before making a request.
Furthermore, a local guide may upload videos of past guidance sessions to the server device SV so that clients can view them.
Evaluations and comments made by past clients about a local guide may also be displayed.
(2) In the above embodiment, the experience of a facility has been described as an example. The system can, however, also be applied to experiences of tourist spots and shopping.
(3) In the above embodiment, the local guide fixes, holds, or wears the local device. However, the local device may instead be fixed to, held by, or mounted on a robot for local guidance. The same applies to the following embodiments.
(4) The above embodiments and modifications can be implemented in combination with other embodiments and modifications as long as doing so does not contradict their essence.
2. Second embodiment
2.1 Functional configuration
In the above embodiment, the local guide's smartphone is used as the local device GT in the guidance mode. This embodiment describes the use of a local device GT designed to send more stable video and to make the client's instructions easier to convey.
FIG. 11 shows the functional configuration of the remote experience system according to the second embodiment. This embodiment does not itself provide a mechanism for matching clients with local guides; a mechanism similar to that of the first embodiment, or some other matching method, can be used.
The local guide wears the wide-angle imaging unit 12. A projection unit 14 is also provided via a drive unit 16.
The projection unit 14 is configured so that its projection direction can be changed by the drive unit 16. The projection direction of the projection unit 14 is detected by a sensor 28. The direction control means 20 controls the drive unit 16 based on the output of the sensor 28 and keeps the projection direction of the projection unit 14 in a predetermined direction relative to the local guide, regardless of the local guide's movements.
The imaging unit 12 of the local device GT is a wide-angle camera such as an omnidirectional (spherical) camera, and captures all directions around the local guide to generate a wide-angle captured image. Under the control of the captured image transmitting means 18, this wide-angle captured image is transmitted to the server device SV by the transmitting unit 22. The server device SV receives the wide-angle captured image, selects the portion in the direction indicated by the direction instruction received from the client device IT, and generates a selected area image. The captured image transfer means of the server device SV transmits the generated selected area image to the client device IT via the transmitting unit 758.
The captured image receiving means 36 of the client device IT receives the selected area image in the direction specified by the client and displays it on the captured image display unit 40. If the client, viewing this selected area image, wants to change the direction, the client inputs a direction instruction into the client device IT.
The direction instruction transmitting means 39 of the client device IT transmits the direction instruction to the server device SV via the transmitting unit 34. The server device SV receives this direction instruction and uses it to select the selected area image from the wide-angle captured image. The direction command transfer means 752 also transmits the direction instruction to the local device GT via the transmitting unit 758.
In this way, the client can select and view an image in any direction around the local guide. The client's field of view is not fixed, and the client can enjoy the experience while looking in any direction he or she chooses.
Furthermore, even when a plurality of clients are connected to a single local guide, each client can independently enjoy the selected area image in his or her own preferred direction.
In addition to the above, this embodiment provides a function for the client to give clear instructions to the local guide. For example, when the local guide enters a store to do shopping for the client, it is preferable that the client can clearly indicate the product to be purchased. This is not always possible by voice call alone: when several similar products are lined up, it is difficult to indicate which one should be purchased.
In this embodiment, when the client wants to give an instruction, the client inputs a fixing command into the client device IT. When the fixing command is input, the captured image display unit 40 takes the selected area image at that moment as a reference selected area image and displays it as a still image.
While viewing the local reference selected area image displayed on the captured image display unit 40, the client inputs an instruction image from the instruction image input unit 44. For example, in a reference selected area image showing a row of products, the client draws an instruction image (for example, a circle) on the desired product.
The instruction image transmitting means 38 uses the transmitting unit 34 to transmit the input instruction image to the server device SV. The fixing command transfer means 754 of the server device SV transfers the instruction image to the local device GT.
The local device GT receives the instruction image through the receiving unit 24 and projects it from the projection unit 14. As a result, the instruction image (for example, a circle) 62 is projected onto the product 52c that the client wants, among the products 52a, 52b, 52c, and 52d.
As described above, since the projection direction of the projection unit 14 is held fixed, the instruction image is projected at the location intended by the client even if the local guide turns his or her head.
However, if the local guide moves to a different position, the projection position of the instruction image would shift. The correction means 26 of the local device GT therefore compares a characteristic partial image (such as a marker) in the reference selected area image with the characteristic partial image in the current selected area image, and corrects the projection position of the instruction image so that it is projected correctly at the intended position. The instruction image is thus displayed at the correct position even if the local guide moves.
To realize the above, when the client device IT receives the fixing command, the transmitting unit 34 transmits the fixing command to the local device GT via the server device SV. The receiving unit 24 of the local device GT receives it and records the selected area image at that time as the reference selected area image. The local guide also places a marker near the products 52a, 52b, 52c, and 52d so that it appears in the image.
The local guide can confirm which product to purchase based on the actually projected instruction image 62. The instruction image 62 is kept at the correct position by the direction control means 20 even if the local guide turns his or her head, so it does not disappear depending on the direction of the local guide's head, which reduces stress. When several local guides are present, the instruction image 62 continues to be projected even if the local guide wearing the local device GT moves his or her head significantly, so the other local guides are not confused. Furthermore, the instruction image 62 is displayed correctly even if the local guide moves.
2.2 External appearance and hardware configuration
FIG. 12 shows the local guide 54 wearing the local device GT. The local guide 54 goes to tourist spots, facilities, shops, and the like and transmits video so that the remote client can have a variety of experiences.
In this embodiment, the local device GT comprises a smartphone 772, a laser projector 776 with direction control, an omnidirectional camera 774, and a headset 770. The speaker and microphone of the headset 770 are connected to the smartphone 772 by short-range communication (such as Bluetooth). The omnidirectional camera 774 (with a built-in short-range communication circuit) and the laser projector 776 with direction control (with a built-in short-range communication circuit) are likewise connected to the smartphone 772 by short-range communication.
The omnidirectional camera 774 is provided at the top of the headset 770. A half-spherical camera that images the area in front of the local guide 54 and a half-spherical camera that images the area behind together capture images in all directions.
The laser projector 776 with direction control is provided above the omnidirectional camera 774. FIG. 13 shows the external appearance of the laser projector 776 with direction control. The laser projector 776 with direction control has a base 93, and the base 93 is fixed to the top of the omnidirectional camera 774.
A unit 80 housing a laser projector 84 is fixed to the base 93 via a three-axis structure 90 (another multi-axis structure may also be used) serving as the drive unit. A motor 92 is fixed to the base 93 of the three-axis structure 90 and is connected to one end of an intermediate member 92A that the motor 92 rotates in the XY plane. The intermediate member 92A is L-shaped, and a motor 94 is fixed to its other end. The motor 94 is connected to one end of an intermediate member 94A that the motor 94 rotates in the ZX plane. The intermediate member 94A is L-shaped, and a motor 96 is fixed to its other end. The motor 96 is connected to a mount member 97 that the motor 96 rotates in the ZY plane. Note that the XYZ axes shown in FIG. 13 change as the members 92A, 94A, and 97 rotate.
In this way, by driving the motors 92, 94, and 96, the three-axis structure 90 can adjust the orientation of the mount member 97 with three degrees of freedom.
The base 93 is also provided with a three-axis gyro sensor JS and a three-axis acceleration sensor AS as the sensor 28, and with a motor control circuit (not shown) that controls the motors 92, 94, and 96. The motors 92, 94, and 96 are controlled by the motor control circuit based on the outputs of the three-axis gyro sensor JS and the three-axis acceleration sensor AS.
The unit 80 housing the laser projector 84 is fixed to the mount member 97 of the three-axis structure 90. As shown in FIG. 14, a laser projector control circuit 104 (including a short-range communication circuit) that controls the laser projector 84 is provided inside the housing 81 of the unit 80.
The laser projector control circuit 104 may instead be provided on the base 93, but at least the MEMS circuit of the laser projector 84 is preferably provided in the unit 80.
The housing 81 is attached to the mount top surface 101, the mount side surface 97, and the mount bottom surface 99 of the three-axis structure 90 via silicone gel bushes 120 (for example, Taica anti-vibration gel bush B-1). In FIG. 13, the mount top surface 101 is omitted for ease of understanding.
As shown in FIG. 14, the silicone gel bush 120 comprises a ring-shaped silicone gel 114 fitted over the outside of the upper part of a ring-shaped silicone gel 116. The upper part of the silicone gel 116 is inserted into a hole provided in the housing 81, so that the housing 81 is sandwiched between the silicone gel 114 and the silicone gel 116. The silicone gels 114 and 116 are screwed to the mount bottom surface 99 with a bolt 110 and a washer 112. With this structure, the housing 81 is held by the silicone gels 116 and 114, which prevents high-frequency vibration from being transmitted to the housing 81 from outside.
In this embodiment, as shown in FIG. 15, silicone gel bushes 120 are provided at two locations on each of the top, side, and bottom surfaces of the housing 81.
FIG. 16 shows the hardware configuration of the motor control circuit 400. A memory 404, the gyro sensor JS, the acceleration sensor AS, a camera 82, the laser projector 84, the motors 92, 94, and 96, and a nonvolatile memory 406 are connected to the CPU 402.
An operating system 31 and a motor control program 32 are recorded in the nonvolatile memory 406. The motor control program 32 performs its functions in cooperation with the operating system 31.
The hardware configurations of the client device IT and the smartphone 772 are the same as in the first embodiment.
2.3 Remote experience processing
FIG. 17 shows a flowchart for guidance. The local guide's smartphone 200 acquires the wide-angle captured image (video) of the omnidirectional camera 774 by short-range communication and transmits it to the server device SV (step S21). The server device SV receives this wide-angle captured image and records it on the SSD 558.
The wide-angle captured image output by the omnidirectional camera 774 is an image of all directions. In accordance with the direction command obtained from the client device IT and recorded, the server device SV selects the image in the designated direction from this wide-angle captured image and generates the selected area image (step S91). The selected area image is therefore the same as the image that would be obtained if the local guide captured an image in that direction with an ordinary camera. The server device SV transmits the generated selected area image to the client device IT.
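The sketch below illustrates one simple way of cutting a selected area image out of an equirectangular (spherical) frame according to the recorded direction command. It is a plain crop for clarity; an actual implementation would typically re-project the sphere (for example, with a gnomonic projection) so the selected area looks like an ordinary camera image without distortion. The field-of-view values are assumptions.

    import numpy as np

    def select_region(equirect, yaw_deg, pitch_deg, h_fov_deg=90.0, v_fov_deg=60.0):
        # equirect: H x W x 3 equirectangular frame from the omnidirectional camera.
        # yaw_deg in [0, 360) runs across the image width; pitch_deg in [-90, 90]
        # runs from the bottom (-90) to the top (+90) of the frame.
        h, w = equirect.shape[:2]
        cx = int((yaw_deg % 360.0) / 360.0 * w)
        cy = int((90.0 - pitch_deg) / 180.0 * h)
        win_w = int(h_fov_deg / 360.0 * w)
        win_h = int(v_fov_deg / 180.0 * h)
        cols = np.arange(cx - win_w // 2, cx + win_w // 2) % w  # wrap around 360 deg
        top = max(0, cy - win_h // 2)
        bottom = min(h, cy + win_h // 2)
        return equirect[top:bottom][:, cols]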
The client device IT receives the selected area image and displays it on the display 306 (step S41). FIG. 18 shows the displayed selected area image. The client can thus watch the local video and enjoy the local scenery as the local guide moves around.
As described above, the image in a predetermined direction is selected and displayed as the selected area image. Since this predetermined direction is defined as a direction within the wide-angle captured image, it is defined in terms of up, down, left, and right directions centered on the local guide.
If the client wants to change this predetermined direction and view an image in a different direction, the client operates the keyboard/mouse 316 of the client device IT and clicks the direction command button 500. The direction command button 500 can be clicked at any point around its 360-degree circumference, covering the up, down, left, and right directions.
When the direction command button 500 is clicked, the client device IT transmits a direction command corresponding to the click to the server device SV (step S42). The server device SV receives it and updates the direction command. The server device SV accordingly changes the direction used to select the selected area image in step S91, and the display 306 of the client device IT shows the image in the direction commanded with the direction command button 500. For example, as shown in FIG. 19, the client can view a selected area image in a different direction.
Since the wide-angle captured image covers the full sphere, the client can view a selected area image in any direction, up, down, left, or right, centered on the local guide.
In this way, regardless of the direction in which the local guide is facing, the client can enjoy the local image in any desired direction through his or her own operations.
Furthermore, in this embodiment, the client can give instructions to the local guide by projecting an image onto a local object. An instruction image is transmitted from the client device IT and projected onto the local object by the direction-controlled laser projector 776 worn by the local guide. For example, when asking the local guide to do some shopping, the client can give an exact instruction by projecting an instruction image onto the product he or she wants to buy.
To realize this, the instruction image must remain projected on the object even if the local guide turns or moves. In this embodiment, the laser projector 776 with direction control is therefore used so that the instruction image is projected correctly.
The CPU 402 of the motor control circuit 400 (hereinafter sometimes simply called the motor control circuit 400) acquires the outputs of the gyro sensor JS and the acceleration sensor AS of the direction-controlled laser projector 776 (FIG. 17, step S1). In this embodiment, gyro sensors and acceleration sensors for three orthogonal axes are used.
Based on the outputs of the gyro sensor JS and the acceleration sensor AS, the motor control circuit 400 calculates the position and orientation of the base 93 (see FIG. 13) in three-dimensional space. It then controls the rotation angles of the motors 92, 94, and 96 so that the unit 80 faces a predetermined direction regardless of the position and orientation of the base 93 (step S2). The unit 80 is therefore kept facing a constant direction regardless of the orientation of the local guide 54's head. This control is similar to that of a gimbal used as a stabilizer for cameras and the like. The predetermined direction is changed by direction commands from the client device IT (steps S42, S92, S22, and S3). The projection direction of the laser projector 84 therefore matches the direction of the local image that the client is viewing on the display 306, so that the range of the selected area image coincides with the projection range of the laser projector 84.
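A much simplified sketch of this direction-holding control is shown below. It estimates the base orientation with a complementary filter over the gyro and accelerometer outputs and commands the motors by the difference from the direction ordered by the client. A real controller would compose full three-axis rotations and drive all three motors; one angle per axis is shown only to convey the idea, and the filter constant is an assumed value.

    import math

    class DirectionHoldController:
        def __init__(self, alpha=0.98):
            self.alpha = alpha       # complementary-filter weight for the gyro
            self.base_pitch = 0.0    # estimated orientation of the base 93 (deg)
            self.base_yaw = 0.0

        def update(self, gyro_dps, accel_g, dt, target_yaw, target_pitch):
            # gyro_dps: (x, y, z) angular rate in deg/s; accel_g: (x, y, z) in g.
            ax, ay, az = accel_g
            # Pitch from the gravity direction: drift-free but noisy.
            accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
            # Blend the integrated gyro with the accelerometer estimate.
            self.base_pitch = (self.alpha * (self.base_pitch + gyro_dps[1] * dt)
                               + (1.0 - self.alpha) * accel_pitch)
            self.base_yaw += gyro_dps[2] * dt  # no gravity reference for yaw here
            # Command the motors so the unit 80 keeps pointing at the target
            # direction regardless of how the base 93 has moved.
            return target_yaw - self.base_yaw, target_pitch - self.base_pitch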
FIG. 20 shows a flowchart of instruction image projection. Here, as shown in FIG. 21, a case is described in which products 52a, 52b, 52c, 52d, and so on are lined up in a store and the client instructs the purchase of the product 52c.
When giving an instruction with an instruction image, the client tells the local guide 54, by voice call or the like, to place a previously prepared marker 60 nearby. The size and shape of the marker 60 image (characteristic partial image) are recorded in advance in the nonvolatile memory 212 of the smartphone 200. The smartphone 200 can therefore calculate the distance and direction from the laser projector 84 to the marker 60 based on the captured image of the marker 60.
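The following is a minimal pinhole-camera sketch of that calculation: because the physical width of the marker 60 is known, its distance follows from its apparent width in pixels, and its bearing from its offset from the image centre. The focal length in pixels is an assumed calibration value; a practical system would more likely use a full marker pose estimation.

    import math

    def distance_to_marker(marker_width_m, marker_width_px, focal_length_px):
        # Pinhole model: apparent size is inversely proportional to distance.
        return focal_length_px * marker_width_m / marker_width_px

    def direction_to_marker(marker_center_px, image_center_px, focal_length_px):
        # Approximate horizontal/vertical angles (degrees) from the optical axis.
        dx = marker_center_px[0] - image_center_px[0]
        dy = marker_center_px[1] - image_center_px[1]
        return (math.degrees(math.atan2(dx, focal_length_px)),
                math.degrees(math.atan2(dy, focal_length_px)))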
As described above, through the processing shown in FIG. 17, the selected area image is displayed as video on the display 306 of the client device IT. While viewing this selected area image, and with the marker 60 in view as shown in FIG. 21, the client clicks the instruction input mode button 501 to issue the fixing command. Here, the marker 60, which is prepared as a card, is assumed to have been stood up against the product 52b.
When the fixing command is issued by clicking the instruction input mode button 501, the client device IT takes the selected area image at that moment as the reference selected area image and displays it as a still image on the display 306 (step S52). Using the mouse 316, the client inputs an instruction for the local guide on this still image as an instruction image (step S53). For example, as shown in FIG. 21B, the client draws a circle 62 with the mouse 316 over the image of the product 52c displayed on the display 306.
The client device IT transmits the fixing command to the smartphone 200 via the server device SV (steps S51 and S93). Upon receiving the fixing command, the smartphone 200 records the selected area image at the time of reception in the nonvolatile memory 212 as the reference selected area image (step S32). Since the smartphone 200 receives and updates the direction command in step S22, it can generate the selected area image from the wide-angle captured image.
The client device IT and the smartphone 200 can therefore recognize the selected area image at the same point in time as the reference selected area image. To avoid time lag due to communication, information identifying the frame (such as a frame number) may be attached to the fixing command when it is transmitted from the client device IT to the smartphone 200. By determining the reference selected area image based on this frame-identifying information, the smartphone 200 can prevent misalignment caused by the time lag.
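A small sketch of that frame-tagging idea is given below; the message format and the idea of keeping a short buffer of recent frames keyed by frame number are assumptions added for illustration.

    import json

    def make_fix_command(frame_number, client_id):
        # Attach the number of the frame currently shown to the client, so the
        # local side can pick exactly the same frame as the reference image.
        return json.dumps({"type": "fix", "frame": frame_number, "client": client_id})

    def resolve_reference_frame(fix_command, frame_buffer):
        # frame_buffer: dict mapping recent frame numbers to captured frames.
        frame_number = json.loads(fix_command)["frame"]
        return frame_buffer.get(frame_number)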
When the client has finished inputting the instruction image, the client clicks the instruction image transmission button 502 (displayed when the instruction input mode is active) at the lower right of the reference selected area image on the display 306. The client device IT then transmits the instruction image to the smartphone 200 via the server device SV (steps S53 and S94).
The client device IT also cancels the instruction input mode, stops displaying the still reference selected area image, and displays the transmitted selected area image as video again (step S54). The client can thus see the local situation once more.
FIG. 23A shows the data structure of the instruction image transmitted to the smartphone 200. The instruction image data, shown in FIG. 23B, is the actual data of the instruction image input by the client. The reference coordinate position, shown in FIG. 23C, is the XY coordinate value of the reference point of the instruction image when the reference point of the marker image (for example, the point at the lower center of the M) is taken as the origin. In this embodiment, as shown in FIG. 23B, the reference point is the upper left corner of the rectangle circumscribing the instruction image input by the client.
Although the instruction image is transmitted as image data in the above description, its parameters may instead be transmitted as numerical values according to a predetermined image shape. For example, a perfect circle may be expressed and transmitted as its center coordinates and radius, and a square as its upper-left coordinates and side length.
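For example, a parametric transmission of the kind described could look like the sketch below, with coordinates expressed relative to the marker's reference point in the same way as the reference coordinate position; the field names are illustrative only.

    import json

    # Predefined shapes sent as numbers instead of raster image data.
    circle_instruction = {"shape": "circle", "center_x": 120, "center_y": 45, "radius": 30}
    square_instruction = {"shape": "square", "top_left_x": 80, "top_left_y": 10, "side": 60}

    payload = json.dumps({"instructions": [circle_instruction]})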
Upon receiving the instruction image data of FIG. 23A, the smartphone 200 holds it in the memory 204. The smartphone 200 also acquires the current captured image (selected area image) from the omnidirectional camera 774 (step S33).
As described above, the range of the selected area image (the imaging range if the selected area image were output by an ordinary camera) and the projection range of the laser projector 84 are configured to coincide. Therefore, if the current selected area image is exactly the same as the recorded reference selected area image (that is, if the local guide has not moved at all since the reference selected area image), projecting the instruction image data with the laser projector 84 at the position based on the reference coordinate position (FIG. 23C) causes the instruction image 62 to appear on the product 52c.
Since the position of this instruction image 62 matches the position the client input on the display 306, the target product 52c can be indicated accurately to the local guide. Using the instruction image 62 as a landmark, the local guide can purchase the product 52c without error.
As shown in FIG. 24A, even when the headset 770 is rotated horizontally (or vertically) about the vertical (or horizontal) axis of the local guide 54's head, the direction fixing control of FIG. 17 keeps the instruction image 62 displayed at the correct position, as long as the distance and direction from the laser projector 84 to the instruction image do not change (within the mechanical limits imposed by the amount of rotation).
However, as shown by the broken lines in FIG. 24B, when the local guide 54 (headset 770) approaches or moves away from the object 52, the instruction image 62 projected on the site becomes larger or smaller.
Also, as shown by the broken lines in FIG. 24C, when the local guide 54 (headset 770) moves left or right, the direction fixing control of FIG. 17 merely keeps the projection direction, indicated by the arrow, constant relative to the local guide 54. The instruction image 62 projected by the laser projector 84 is therefore displayed at a position shifted to the left or right. The same applies when the local guide 54 moves vertically (for example, standing up from a crouching position): the instruction image 62 shifts up or down.
To eliminate these problems and project the instruction image 62 correctly, this embodiment performs the following processing.
The smartphone 200 calculates the distance and direction from the laser projector 84 to the marker 60 (and to the place where the instruction image 62 is to be projected) based on the image of the marker 60 in the reference selected area image. As described above, a known pattern is printed on the marker 60 in advance, so the distance and direction to the marker 60 placed near the product 52c (and to the place where the instruction image 62 is to be projected) can be calculated from the captured image.
The smartphone 200 also calculates the distance and direction to the marker 60 (and to the place where the instruction image 62 is to be projected) based on the current selected area image acquired in step S33.
The smartphone 200 therefore deforms the instruction image 62 and controls the position at which it is projected (step S34), based on a comparison between the distance and direction to the marker 60 (and to the place where the instruction image 62 is to be projected) at the time of the reference selected area image and those in the current selected area image.
For example, in the case of FIG. 24B, the instruction image 62 is enlarged or reduced before projection according to the change in the distance between the camera 82 and the marker 60 (or the instruction image 62).
In the case of FIG. 24C, the position at which the instruction image 62 is projected is moved so as to follow the movement of the marker 60.
In this embodiment, the direction fixing control by the three-axis structure 90 (see FIG. 17) is performed separately, so in many cases the instruction image 62 can be displayed at the correct position simply by performing the controls described for FIGS. 24B and 24C.
Because the direction fixing control by the three-axis structure 90 is performed separately and the above control is applied on top of it, the instruction image 62 can be displayed stably at the correct position. Even if the local guide 54 turns his or her head and looks away from the object 52, the direction fixing control keeps the instruction image 62 displayed, which reduces the local guide 54's stress.
When the limits of the direction fixing control by the three-axis structure 90 are exceeded, the inclination, relative to the laser projector 84, of the surface 510 of the object 52 to which the marker 60 is attached may change between the time of the reference selected area image (FIG. 25A) and the time of the current selected area image (FIG. 25B), as shown schematically in FIG. 25.
In this case, the smartphone 200 calculates the inclination of the surface 510 of the object 52 based on the image of the marker 60 in the reference selected area image (FIG. 25A). From this and the reference coordinate position PL1 (X or Y) sent from the client device IT, it calculates the actual distance LL between the marker 60 and the instruction image 62.
Next, the smartphone 200 calculates the inclination of the surface 510 of the object 52 based on the image of the marker 60 in the current selected area image (FIG. 25B). From the actual distance LL calculated above, it determines the position at which the instruction image 62 should be displayed and calculates the reference coordinate position PL2 (X or Y). Based on this reference coordinate position PL2, the smartphone 200 controls the projection position so that the instruction image 62 is projected at the correct position, and also deforms the instruction image 62 so that the projected image is not distorted.
The above processing can be performed in the same way for both the vertical and the horizontal direction.

Furthermore, as shown in FIG. 26, the imaging range 504 at the time of the reference selection-area image may become tilted diagonally, as illustrated by the imaging range 506. FIG. 26 shows a tilt in the direction horizontal to the page, but such a tilt can occur in any of the three-dimensional directions, and it distorts the projected instruction image 62 as well.

Here too, by comparing the image of the marker 60 in the reference selection-area image with the image of the marker 60 in the current selection-area image, and deforming the instruction image 62 so as to cancel the distortion (that is, deforming it in the inverse of the distortion) before projecting it, the correct instruction image can be projected.
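One conventional way to realize this "deform in the inverse of the distortion" step is a homography computed from the marker's four corners. The OpenCV-based sketch below shows that idea only; the camera-to-projector calibration that a real device would also need is omitted, and the assumption that the marker corners are already detected in both images is not part of the specification.

    import cv2
    import numpy as np

    def predistort_instruction(instruction_img, ref_marker_corners, cur_marker_corners):
        """Warp the instruction image with the inverse of the observed
        distortion so that the projected result looks undistorted (FIG. 26).

        ref_marker_corners / cur_marker_corners : four (x, y) marker corners
        in the reference and the current selection-area image.
        """
        ref = np.asarray(ref_marker_corners, dtype=np.float32)
        cur = np.asarray(cur_marker_corners, dtype=np.float32)
        # The distortion between frames is the mapping ref -> cur; the text
        # deforms the instruction image by its inverse, i.e. cur -> ref.
        H_inv = cv2.getPerspectiveTransform(cur, ref)
        h, w = instruction_img.shape[:2]
        return cv2.warpPerspective(instruction_img, H_inv, (w, h))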
To summarize the above processing: the smartphone 200 calculates the distance and direction between the laser projector 84 and the marker 60 from the image of the marker 60 in the reference selection-area image. It also calculates the distance and direction to the marker 60 near the object 52 from the current selection-area image acquired in step S33. Based on a comparison of the distance and direction to the marker 60 in the reference selection-area image with those in the current selection-area image, the smartphone 200 deforms the instruction image 62 and controls the position at which it is projected.

In this way, the instruction image intended by the client is projected and displayed on the local object 52.

Note that, in order to display the instruction image 62 correctly, the marker 60 is preferably placed on the plane on which the instruction image 62 is to be displayed.
If the surface on which the marker 60 is placed differs from the surface on which the instruction image 62 is to be displayed (for example, when there is a step between them), it is preferable to also estimate the exact position using image feature points, for example by SLAM.

That is, the smartphone 200 analyzes the captured image and calculates feature points (such as points on the boundaries of objects) near the object (near the marker 60). By comparing the feature points in the reference selection-area image with those in the current selection-area image, the positional relationship between the marker 60 and the surface on which the instruction image 62 is to be displayed is determined.

In this way, the instruction image 62 can be projected correctly even when the surface to which the marker 60 is attached differs from the surface on which the instruction image 62 is to be displayed.
When it is no longer necessary to project the instruction image, the display of the instruction image can be stopped by operating the client device IT or the local device GT.
2.4 Variations
(1) In the embodiment described above, the correction means 26 is provided so that the instruction image is displayed correctly by image processing and projection control. However, when fine precision is not required, the direction control means 20 alone, which controls the projection direction of the projection section 14 via the drive section 16, may be sufficient.
(2) In the embodiment described above, the direction-controlled laser projector 776 is provided. However, the direction-controlled laser projector 776 may be omitted and only the omnidirectional camera 774 provided. Even in this case, the client can view images in any direction he or she desires.

(3) In the embodiment described above, direction instructions are given relative to the reference direction of the wide-angle captured image (for example, straight ahead of the local guide). Consequently, when the local guide turns his or her body, the orientation of the selection-area image changes accordingly; even if the client wants to look at the buildings on the right side of the road, the desired direction cannot be seen if the local guide turns sideways.
Instead of fixing the omnidirectional camera 774 to the headset 770, it may therefore be attached via a three-axis structure, in the same way as the direction-controlled laser projector 776, so that its direction can be controlled. Then, even if the local guide turns, the region in the desired direction (for example, the right side of the road) is still obtained as the selection-area image.

In this case, the laser projector 84 may be fixed to the omnidirectional camera 774 so that its projection direction is controlled by the same three-axis structure.

Alternatively, a sensor that detects the orientation of the omnidirectional camera 774 (a three-axis gyro sensor, a three-axis acceleration sensor, or the like) may be provided, so that a selection-area image of the same direction is extracted even when the local guide changes direction; in that case the three-axis structure is unnecessary. Furthermore, instead of such a sensor, the orientation of the omnidirectional camera 774 may be detected by analyzing the images captured by the camera itself. A sketch of this orientation compensation is given below.
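A minimal sketch of that compensation, assuming the sensor (or the image analysis) yields the camera's current heading in degrees; the function name and angle convention are illustrative only.

    def world_fixed_direction(requested_yaw_deg, guide_heading_deg):
        """Keep the selected viewing direction fixed in world coordinates.

        requested_yaw_deg : direction the client asked for, relative to a
                            world reference (e.g. the heading when guiding began).
        guide_heading_deg : current heading of the omnidirectional camera.
        Returns the yaw to use when cutting the selection-area image out of the
        wide-angle picture, so the client keeps seeing e.g. the right-hand side
        of the road even when the guide turns.
        """
        return (requested_yaw_deg - guide_heading_deg) % 360.0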
(4) In the embodiment described above, the instruction image 62 is always projected onto the object 52 by the laser projector 84. However, when there is a person in the projection direction, projection by the laser projector 84 may be suppressed.

This can be achieved by having the smartphone 200 judge whether a person is present in the captured image (for example, with a trained AI such as YOLO) and, if so, stopping the emission of the laser projector 84. When no person is detected any longer, projection is resumed.

Alternatively, rather than stopping the laser entirely when a person is detected, emission may be stopped only in the region of the recognized person (a rectangular region in the case of YOLO) while continuing in the other regions.

Furthermore, a person's eyes may be detected and emission stopped only in the eye region (and its surroundings). A sketch of this masking follows.
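This is only an illustration: the detector call (for example a trained YOLO model) is assumed to run elsewhere and to return pixel rectangles, and the all-or-nothing policy described above corresponds to simply handing the projector a black frame whenever any person is detected.

    def mask_projection(instruction_frame, person_boxes, eye_boxes=None):
        """Blank the laser output where people (or only their eyes) were detected.

        instruction_frame : numpy image about to be handed to the projector.
        person_boxes      : list of (x0, y0, x1, y1) person rectangles.
        eye_boxes         : optional eye regions; if given, only these are
                            blanked instead of the whole person rectangle.
        """
        out = instruction_frame.copy()
        boxes = eye_boxes if eye_boxes else person_boxes
        if not boxes:
            return out                     # nobody detected: project as-is
        for (x0, y0, x1, y1) in boxes:
            out[y0:y1, x0:x1] = 0          # zero intensity = no laser output here
        return out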
(5) In the embodiment described above, the laser projector 84 is used as the projection section. However, an ordinary projector may be used instead.

(6) In the embodiment described above, the three-axis structure 90 (gimbal) is used, but a one-axis or two-axis structure (gimbal), a structure with four or more axes, or the like may be used instead.

(7) In the embodiment described above, the local guide 54 attaches the marker 60 to the object 52. However, the marker 60 may be placed on the local object 52 in advance.

(8) In the embodiment described above, the marker 60 is used to determine the distance, direction, and so on from the laser projector 84. However, the same processing may be performed without the marker 60, using only the feature points of the captured image, for example by SLAM.
In that case, the smartphone 200 recognizes feature points 512 (such as vertices that characterize the image), transmits them to the client device IT, and displays them on the display 306 as shown in FIG. 27. The instructing person looks at this image and, operating the mouse 316, selects the feature points 512 to be used for position specification. It is preferable to select, as the feature points 512 used for position specification, points lying on the same plane as the object 52.

Alternatively, if there are many irregularities near the object, it is preferable to select feature points 512 that capture as much of that relief as possible.

When the instruction input mode button 501 is clicked, the information on the selected feature points 512 (their coordinate values on the screen) is transmitted to the smartphone 200. The smartphone 200 can then specify the position and direction based on these feature points 512.
(9) In the embodiment described above, the client checks the screen of FIG. 21b on the client device IT and clicks the instruction input mode button 502. However, the client device IT or the smartphone 200 may instead detect that the marker 60 in the captured image has entered a predetermined region of the image (for example, a predefined central region) and enter the instruction input mode automatically; a sketch of such a check is given below. The same applies when the processing uses the feature points 512 instead of the marker 60.
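A minimal sketch of the automatic switch, assuming the marker detector reports a bounding box in image coordinates; the margin value is an illustrative parameter, not taken from the specification.

    def should_enter_instruction_mode(marker_bbox, frame_size, margin=0.25):
        """Return True when the marker centre lies in the central region of the
        frame, which variation (9) uses to switch to instruction-input mode.

        margin : fraction of the frame treated as border on each side;
                 0.25 means the central 50% in each axis counts as "inside".
        """
        x0, y0, x1, y1 = marker_bbox
        w, h = frame_size
        cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
        return (margin * w <= cx <= (1 - margin) * w and
                margin * h <= cy <= (1 - margin) * h)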
(10) In the embodiment described above, the three-axis structure 90 is controlled by the motor control circuit 400, while the control of the projection position and the like based on image processing is performed by the smartphone 200.

However, the three-axis structure 90 may also be controlled by the smartphone 200. Alternatively, a circuit that controls the projection position and the like based on image processing may be provided inside the base 93; in that case the smartphone 200 is used only for calls. Furthermore, the call function may also be provided inside the base 93.
(11) In the embodiment described above, the instruction image is deformed by the smartphone 200 so that it is not projected distorted (or at a changed size). However, when the shape of the instruction image is unimportant and what matters is to indicate a specific position (for example, indicating a position by the center point of a cross mark), there is no harm in the instruction image being distorted (or changing size) as long as the position is indicated correctly. In such a case, the processing that deforms the instruction image may be omitted.
(12) In the embodiment described above, in addition to the control of the three-axis structure 90 (FIG. 17), the projection position and the like are controlled based on image processing by the smartphone 200 (FIG. 20). However, the control based on image processing by the smartphone 200 may be omitted and only the processing by the three-axis structure 90 performed.

This is possible when the local guide 54 moves little or when some deviation of the on-site projection position of the instruction image is acceptable. For example, when an instruction image (such as an arrow indicating a direction) is projected onto the ground from the direction-controlled laser projector 776 worn by the local guide 54 and route guidance is given from the instruction device 30, the control by the three-axis structure 90 alone is sufficient. In this case, if the local guide does not change the orientation of his or her body, the marker 60 need not be used either.
(13) In the embodiment described above, the smartphone 200 performs not only the control corresponding to FIGS. 24B and 24C but also the control corresponding to FIGS. 25 and 26. However, only the control corresponding to FIGS. 24B and 24C may be performed.

(14) In the embodiment described above, when a fixing command is given on the client device IT, the reference selection-area image is displayed as a still image and the mode for inputting the instruction image is entered. However, if the local guide does not move, the selection-area image may continue to be displayed as a moving image.

The instructing person inputs the instruction image in this state and clicks the instruction image transmission button 502. In response, the client device IT and the smartphone 200 may take the captured image at that moment as the reference selection-area image.

(15) In the embodiment described above, a still image is used as the instruction image. However, a moving image may be used as the instruction image; in that case, the on-site device preferably plays the moving image repeatedly.
(16) In the embodiment described above, the on-site device is constituted by the smartphone 200, the direction-controlled laser projector 776, and the omnidirectional camera 774. However, these may be built as a single unit. A dedicated device, a PC, a tablet, a stick PC, or the like may also be used instead of the smartphone 200.

(17) In the embodiment described above, the direction of the displayed captured image is changed by operating the direction change button 500 as shown in FIG. 18. However, the direction of the displayed captured image may instead be changed by dragging on the screen (moving the cursor while holding down the mouse button).

(18) In the embodiment described above, the camera and the projector are attached to the headset 770. However, they may be attached to something else that is worn, such as a helmet.

They may also be attached to a car, bicycle, cart, or the like operated by the local guide.
(19) In the embodiment described above, the drive section 16 controls the projection direction, and in addition the smartphone 200 performs image processing and projection control (step S34) so that the instruction image follows the marker 60 and is displayed correctly. However, the control that follows the marker 60 and the like may also be performed by the drive section 16.
(20) In the embodiment described above, the wide-angle captured image is transmitted from the smartphone 200 to the server device SV, and the selection-area image is generated by the server device SV. Even when a plurality of client devices IT are connected, different selection-area images can therefore be transmitted to them. For example, in a tour in which a plurality of clients share a remote experience with a single local guide 54, as shown in FIG. 22, each of the clients A, B, and C can view images in the direction he or she desires by using the direction change button 500. In this case, only one of the clients can give the instruction image.

Alternatively, the selection-area images for the individual clients A, B, and C may be generated on the smartphone 200 and transmitted to the client devices IT via the server device SV, which reduces the communication load.
(21) In the embodiment described above, the omnidirectional camera 774, which captures images in all directions, is used. However, a camera that captures 360 degrees (or some predetermined angle) in the horizontal direction, a hemispherical camera that captures the view below the horizontal, a hemispherical camera that captures the view ahead (or behind), or the like may be used instead.
(22) In the embodiment described above, a product-selection mark is projected as the instruction image at the site. Instead, the client's own smartphone may be made to display a barcode used for smartphone payment (for example, a PayPay (trademark) payment barcode). This barcode may then be read with a camera of the client device IT and projected, as the instruction image, onto a desk or the like at the site. A local shop can read the projected barcode and settle the payment with the client.

Alternatively, the client's credit card number may be recorded in the server device SV in advance, for example at user registration. In this case, at the time of payment the local guide accesses the server device SV in order to use the client's credit card number and sends the amount information to the server device SV. The server device SV then sends the credit card number, the amount information, and so on to the client's client terminal device IT or to the client's smartphone to request approval. When the client approves, the server device SV uses the credit card number and the amount to access the credit card company's server and carry out the payment processing.
(23) In the embodiment described above, a stationary PC was shown as an example of the client device IT. However, a smartphone, a tablet, a notebook PC, or the like may be used instead. Furthermore, instead of showing the local situation on a display, the local images may be shown on a head-mounted display (HMD) worn by the client. In that case, the HMD may be provided with 6DoF head tracking to detect the movement of the client's head and issue direction commands. The client then does not have to input direction commands with a mouse or the like; the orientation of the local image changes with the orientation of the client's head, and the client can enjoy the local scenery naturally.
(24) The above variations can be applied in combination with one another as long as doing so does not contradict their essence. They can also be applied in combination with the other embodiments and their variations.
3. Third embodiment
3.1 Functional Configuration
FIG. 28 shows the functional configuration of the remote experience system according to the third embodiment. In this embodiment, a wide-angle camera such as an omnidirectional camera is used as the imaging section 12, and an omnidirectional laser projector is used as the projection section 14.
The imaging section 12 of the local device GT is a wide-angle camera such as an omnidirectional camera; it captures the entire sphere around the local guide and generates a wide-angle captured image. Under the control of the captured-image transmitting means 18, this wide-angle captured image is transmitted by the transmitter 22 to the server device SV. The server device SV receives the wide-angle captured image, selects from it the portion in the direction indicated by the direction instruction received from the client device IT, and generates a selection-area image. The captured-image transfer means of the server device SV transmits the generated selection-area image to the client device IT through the transmitter 758.

The captured-image receiving means 36 of the client device IT receives the selection-area image for the direction specified by the client and displays it on the captured-image display section 40. Looking at this selection-area image, the client inputs a direction instruction into the client device IT when he or she wants to change the direction.

The direction-instruction transmitting means 39 of the client device IT transmits the direction instruction to the server device SV through the transmitter 34. The server device SV receives this direction instruction and uses it to select the selection-area image from the wide-angle captured image. The direction-command transfer means 752 also transmits the direction instruction to the local device GT through the transmitter 758.

In this way, the client can select and view an image of any direction of the sphere around the local guide. The client's field of view is therefore not fixed, and he or she can enjoy the experience while looking in any desired direction.

Furthermore, even when a plurality of clients are connected to a single local guide, each client can independently enjoy the selection-area image of his or her own preferred direction.
To give an instruction, the client inputs a fixing command while the selection-area image containing the object is displayed. When the fixing command is input, the captured-image display section 40 takes the selection-area image of that direction as the reference selection-area image and displays it as a still image. The direction at the time the fixing command was given is transmitted to the local device GT via the server device SV.

While viewing the local reference selection image displayed on the captured-image display section 40, the client inputs an instruction image through the instruction-image input section 44. The instruction-image transmitting means 38 transmits the input instruction image through the transmitter 34 to the local device GT via the server device SV.

The tracking control means 21 of the local device GT receives the instruction image through the receiver 24 and controls the projection section 14 so that the instruction image is projected in the direction that was in effect when the fixing command was given. As a result, the instruction image 62 is projected onto the object 52.
Because the direction in which the projection section 14 projects the instruction image matches the direction of the reference selection-area image, the instruction image is projected onto the spot intended by the instructing person. This control alone can be implemented, but if the person on site moves to another place, the projected position of the instruction image will shift.
Therefore, the correction means 26 of the local device GT compares the characteristic partial image (such as a marker) in the reference selection-area image with the characteristic partial image in the current selection-area image, and deforms the instruction image and corrects its projection position so that the instruction image is projected correctly at the intended position. As a result, the instruction image is displayed at the correct position even if the local guide moves.
3.2 Appearance and Hardware Configuration
In this embodiment, as shown in FIG. 29, an omnidirectional laser projector 780 is used instead of the direction-controlled laser projector 776. The omnidirectional camera 774 and the omnidirectional laser projector 780 are therefore fixed to the top of the headset 770.
The omnidirectional laser projector 780 is configured to be able to project in all directions: up, down, left, and right. It may also be constructed by combining a plurality of laser projectors.
The hardware configuration of the client device IT is the same as in the first embodiment (see FIG. 5). Because the three-axis structure 90 is not used, the motors 92, 94, and 96 that control it are unnecessary, and the motor control circuit 400 is also unnecessary. The hardware configuration of the smartphone 200 is the same as in the first embodiment (see FIG. 3), and the hardware configuration of the server device SV is also the same as in the first embodiment (see FIG. 4).
3.3 Remote experience processing
FIG. 30 shows a flowchart of the remote experience processing. The smartphone 200 of the local guide 54 acquires the wide-angle captured image of the omnidirectional camera 774 by short-range communication (wired communication may also be used) and transmits it to the server device SV via the Internet (step S21). The server device SV receives this wide-angle captured image and records it on the SSD 558.
The wide-angle captured image output by the omnidirectional camera 774 covers all directions. In accordance with the direction command acquired from the client device IT and recorded, the server device SV selects the image of the corresponding direction from this wide-angle captured image and generates a selection-area image (step S91). The selection-area image is therefore the same as the image the local guide would obtain by shooting in that direction with an ordinary camera. The server device SV transmits the generated selection-area image to the client device IT.
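The region selection of step S91 can be sketched as follows, assuming the wide-angle frame is stored as an equirectangular image. A production server would re-project to a true rectilinear view; the linear angle-to-pixel crop below only illustrates how a per-client direction command picks out the region, and the field-of-view defaults are arbitrary.

    import numpy as np

    def select_region(equirect, yaw_deg, pitch_deg, hfov_deg=90, vfov_deg=60):
        """Cut a selection-area image out of an equirectangular wide-angle frame.

        Columns are assumed to map linearly to yaw (0..360 deg) and rows to
        pitch (+90..-90 deg); the crop is taken in that angle space.
        """
        h, w = equirect.shape[:2]
        cx = int((yaw_deg % 360.0) / 360.0 * w)
        cy = int((90.0 - pitch_deg) / 180.0 * h)
        crop_w = int(hfov_deg / 360.0 * w)
        crop_h = int(vfov_deg / 180.0 * h)
        xs = np.arange(cx - crop_w // 2, cx + crop_w // 2) % w   # wrap around 360 degrees
        ys = np.clip(np.arange(cy - crop_h // 2, cy + crop_h // 2), 0, h - 1)
        return equirect[np.ix_(ys, xs)]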
The client device IT receives the selection-area image and displays it on the display 306 (step S41). FIG. 18 shows the displayed selection-area image. The client can thus watch the local scene as a moving image and enjoy the surroundings as the local guide moves around.

As described above, the image of a predetermined direction is selected and displayed as the selection-area image. Because this predetermined direction is defined as a direction within the wide-angle captured image, it is defined in terms of up, down, left, and right around the local guide.
When the client wants to change this predetermined direction and view the image of a different direction, he or she operates the keyboard/mouse 316 of the client device IT and clicks the direction command button 500. The direction command button 500 can be clicked at any of 360 degrees around the circumference: up, down, left, and right.

When the direction command button 500 is clicked, the client device IT transmits a direction command corresponding to the click to the server device SV (step S42). The server device SV receives it and updates the direction command (step S92). The server device SV therefore changes the predetermined direction used to select the selection-area image in step S91, and the image of the direction commanded with the direction command button 500 is displayed on the display 306 of the client device IT. For example, as shown in FIG. 19, selection-area images of different directions can be viewed.

The server device SV also transmits the direction command received from the client device IT to the smartphone 200 (step S92), and the smartphone 200 receives it and updates its direction command (step S22).

Because the wide-angle captured image covers the whole sphere, a selection-area image of any direction around the local guide, up, down, left, or right, can be viewed.

In this way, regardless of the direction the local guide is facing, the client can enjoy local images of the desired direction through his or her own operation.
Through the above processing, the selection-area image of the direction chosen by the client is displayed as a moving image on the display 306 of the client device IT. While viewing this image, the client clicks the instruction input mode button 502 with the object 52 and the marker 60 shown on screen.

When a fixing command is given by clicking the instruction input mode button 501, the client device IT takes the selection-area image at that moment as the reference selection-area image and displays it on the display 306 as a still image (step S52). The client then uses the mouse 316 to input, on this still image, an instruction to the local guide in the form of an instruction image (step S53). For example, as shown in FIG. 21b, the client draws and inputs with the mouse 316 an instruction image 62 (in this example, a circle) over the product 52c he or she wishes to purchase in the product image displayed on the display 306.

The client device IT also transmits the fixing command, together with the direction at the time the fixing command was given, to the smartphone 200 via the server device SV (steps S51, S93). On receiving the fixing command, the smartphone 200 determines the reference selection-area image based on the selection-area image and the direction at the time the command was received, and records it in the nonvolatile memory 212 (step S32). Because the direction command has already been received in step S22 of FIG. 30, the smartphone 200 can generate the selection-area image from the wide-angle captured image.
The client device IT and the smartphone 200 can thus recognize the selection-area image of the same moment as the reference selection-area image. To avoid a time lag caused by communication, the fixing command transmitted from the client device IT to the smartphone 200 may carry information identifying the frame (such as a frame number); by determining the reference selection-area image from this frame-identifying information, the smartphone 200 can prevent a mismatch due to the time lag. A sketch of such a tagged command follows.
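A sketch of a frame-tagged fixing command as it might be sent from the client device to the smartphone via the server; the field names and the use of JSON are illustrative assumptions, not taken from the specification.

    import json
    import time

    def make_fix_command(direction, frame_no):
        """Build a fixing command that pins down exactly which frame is the
        reference selection-area image, so a network time lag cannot shift
        which frame the smartphone uses."""
        return json.dumps({
            "type": "fix",
            "direction": direction,    # yaw/pitch the client was viewing
            "frame_no": frame_no,      # frame shown when the button was clicked
            "sent_at": time.time(),
        })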
When the client finishes inputting the instruction image, he or she clicks the instruction image transmission button 502 (displayed while the instruction input mode is active) shown at the lower right of the reference selection-area image on the display 306. The client device IT then transmits the instruction image to the smartphone 200 via the server device SV (steps S53, S94). The client device IT also cancels the instruction input mode, stops displaying the still reference selection-area image, and displays the incoming selection-area image as a moving image again (step S54). The client can thus watch the local situation once more.

On receiving the instruction image data of FIG. 23A, the smartphone 200 holds it in the memory 204. The smartphone 200 further acquires the current wide-angle captured image from the omnidirectional camera 774 and extracts the selection-area image based on the direction command received in step S22 (step S33).

The smartphone 200 compares the marker in the reference selection-area image with the marker in the current selection-area image and controls the direction in which the omnidirectional laser projector 780 projects the instruction image so that the instruction image follows the marker and is projected correctly (step S34). Based on the same marker comparison, it also deforms the instruction image and controls its projection position so that the image is projected correctly.
In this embodiment, because the omnidirectional laser projector 780 is used, instruction images from a plurality of clients can be projected simultaneously.
3.4 Variations
(1) In the embodiment described above, the omnidirectional camera 774, which captures all directions, and the omnidirectional laser projector 780, which projects in all directions, are used. However, it is also possible to use a camera that captures 360 degrees (or some predetermined angle) in the horizontal direction, a hemispherical camera that captures the view below the horizontal, a hemispherical camera that captures the view ahead (or behind), a laser projector that projects over 360 degrees (or some predetermined angle) in the horizontal direction, a hemispherical laser projector that projects below (or above) the horizontal, a hemispherical laser projector that projects ahead (or behind), and the like.
(2) In the embodiment described above, a marker is used as the characteristic partial image, but feature points of the image or the like may be used instead.

(3) In the embodiment described above, the omnidirectional camera 774 and the omnidirectional laser projector 780 are attached directly to the headset 770. However, they may be attached via a cushioning material such as silicone gel.

(4) In the embodiment described above, direction instructions are given relative to the reference direction of the wide-angle captured image (for example, straight ahead of the local guide). Consequently, when the local guide turns his or her body, the orientation of the selection-area image changes accordingly; even if the client wants to look at the buildings on the right side of the road, the desired direction cannot be seen if the local guide turns sideways.

Therefore, instead of fixing the omnidirectional camera 774 (and the omnidirectional laser projector 780) to the headset 770, they may be attached via a three-axis structure so that their direction can be controlled. Then, even if the local guide turns, the region in the desired direction (for example, the right side of the road) is still obtained as the selection-area image.
(5) The above variations can be applied in combination with one another as long as doing so does not contradict their essence. They can also be applied in combination with the other embodiments and their variations.
4. Fourth embodiment
4.1 Functional Configuration
In this embodiment, as in the first embodiment, the direction of the projection section 14 is controlled by the drive section 16. In this embodiment, however, not only the projection section 14 but also the imaging section 12 has its direction controlled by the drive section 16.
FIG. 32 shows the functional configuration of the remote experience system according to the fourth embodiment. The local guide wears the imaging section 12 and the projection section 14 via the drive section 16. The imaging area of the imaging section 12 and the projection area of the projection section 14 are arranged so as to be substantially the same.

The imaging section 12 and the projection section 14 are configured as a single unit whose imaging direction and projection direction can be changed by the drive section 16. Their imaging and projection directions are detected by the sensor 28. Based on the output of the sensor 28, the direction control means 20 controls the drive section 16 so that the directions of the imaging section 12 and the projection section 14 are kept in a predetermined direction, centered on the local guide, regardless of the local guide's movement. One control step of this kind is sketched below.
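The following proportional step is only an illustration of the direction-holding idea (shown for one axis; real firmware would run a tuned PID loop per axis and talk to the actual motor interface of the gimbal). The gain and rate limit are made-up parameters.

    def hold_direction(target_yaw_deg, measured_yaw_deg, gain=2.0, max_rate_deg_s=180.0):
        """One step of direction-holding control for the driven camera/projector
        unit: the sensor gives the unit's current yaw, and the motor is commanded
        to rotate back toward the yaw selected by the client.
        Returns a rate command in deg/s, clamped to the motor's limit.
        """
        # Wrap the error into [-180, 180) so the unit turns the short way round.
        error = (target_yaw_deg - measured_yaw_deg + 180.0) % 360.0 - 180.0
        rate = gain * error
        return max(-max_rate_deg_s, min(max_rate_deg_s, rate))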
The imaging section 12 of the local device GT captures the local scene and generates a captured image. Under the control of the captured-image transmitting means 18, this captured image is transmitted by the transmitter 22 to the client device IT via the server device SV. The captured-image receiving means 36 of the client device IT receives the captured image through the receiver 32, and the captured-image display section 40 displays it, so that the instructing person can view the local images.

When the client wants to look in a different direction, he or she inputs a direction instruction into the client device IT. The direction-instruction transmitting means 39 of the client device IT transmits this direction instruction to the local device GT via the server device SV. The direction control means 20 of the local device GT controls the drive section 16 and changes the predetermined direction of the imaging section 12 and the projection section 14 in accordance with the direction instruction.

The imaging direction at the site is therefore changed in accordance with the direction instruction, and the captured image displayed on the client device IT changes to that of a different direction. In this way, the client can look in the direction he or she wants to see, regardless of the orientation of the local guide.
To give an instruction, the client inputs a fixing command. When the fixing command is input, the captured-image display section 40 takes the captured image at that moment as the reference captured image and displays it as a still image. While viewing the local reference captured image displayed on the captured-image display section 40, the client inputs an instruction image through the instruction-image input section 44. The instruction-image transmitting means 38 transmits the input instruction image through the transmitter 34 to the local device GT via the server device SV.

The local device GT receives the instruction image through the receiver 24 and projects it from the projection section 14, so that the instruction image 62 is projected onto the object 52. As described above, because the projection direction of the projection section 14 is held fixed, the instruction image is projected onto the spot intended by the client even if the local guide turns his or her face. If the local guide moves to another place, however, the projected position of the instruction image will shift.

Therefore, the correction means 26 of the local device GT compares the characteristic partial image (such as a marker) in the reference captured image with the characteristic partial image in the current captured image, and corrects the projection position of the instruction image so that it is projected correctly at the intended position. As a result, the instruction image is displayed at the correct position even if the local guide moves.

To realize this, when the client device IT receives the fixing command, the transmitter 34 transmits the fixing command to the local device GT via the server device SV. The receiver 24 of the local device GT receives it and records the captured image at that moment as the reference captured image. The local guide also places a marker near the object 52 so that it is captured in the image.
Based on the instruction image 62 actually projected at the site, the local guide can receive instructions such as which product to purchase or which way to go. This instruction image 62 is displayed at the correct position by the direction control means 20 even if the local guide turns his or her head, so the instruction image 62 does not disappear depending on the direction of the guide's head, and the guide experiences little stress. When several people are guiding together, the instruction image 62 continues to be projected even if the local guide wearing the local device GT moves his or her head considerably, so the other local guides are not confused. Furthermore, the instruction image 62 is displayed correctly even if the local guide moves.
4.2 Appearance and Hardware Configuration
FIG. 33 shows the local guide 54 wearing the local device GT. A camera/laser projector complex 58 is provided at the top of the headset 770, and its configuration is shown in FIG. 34. The basic configuration is the same as that of the direction-controlled laser projector 776 of the second embodiment. In this embodiment, however, not only the laser projector 84 but also the camera 82 is housed in the unit 80, so the directions of both the laser projector 84 and the camera 82 are controlled. The projection area of the laser projector 84 and the imaging area of the camera 82 are configured to substantially coincide.
The hardware configuration of the smartphone 200 is as in FIG. 3, that of the server device SV as in FIG. 4, that of the client device IT as in FIG. 5, and that of the motor control circuit 400 as in FIG. 16.
4.3 Remote experience processing
FIG. 35 shows a flowchart of the processing during guidance. The local guide's smartphone 200 acquires the image (moving image) captured by the camera 82 through short-range communication and transmits it to the server device SV (step S21). The server device SV receives this captured image and transfers it to the client device IT (step S91).
The client device IT receives the captured image and displays it on the display 306 (step S41). FIG. 18 shows the displayed image. The client can thus watch the local scene as a moving image and enjoy the surroundings as the local guide moves around.
The camera/laser projector complex 58 is provided with a gyro sensor JS and an acceleration sensor AS. The motor control circuit 400 receives the outputs of the gyro sensor JS and the acceleration sensor AS and detects the orientation of the camera/laser projector complex 58 (of its base 93) (step S1). The motor control circuit 400 controls the motors 92, 94, and 96 so that the unit 80 keeps facing the predetermined direction even when the orientation of the base 93 changes (step S2). An image of the predetermined direction is therefore captured regardless of the direction the local guide is facing.
When the client wants to change the imaging direction and view the image of a different direction, he or she operates the keyboard/mouse 316 of the client device IT and clicks the direction command button 500. The direction command button 500 can be clicked at any of 360 degrees around the circumference: up, down, left, and right.

When the direction command button 500 is clicked, the client device IT transmits a direction command corresponding to the click to the server device SV (step S42). The server device SV receives it and transfers it to the smartphone 200 (step S92). On receiving the direction command, the smartphone 200 transfers it to the motor control circuit 400 (step S22). The motor control circuit 400 receives this direction command and changes the direction (the predetermined direction) that it holds constant (step S3).

In this way, the client can view a stable image of the desired direction through his or her own operation.
Furthermore, in this embodiment the client can give instructions to the local guide by projecting an image onto a local object. An instruction image is transmitted from the client device IT and projected onto the local object by the laser projector 84 worn by the local guide. For example, when asking the local guide to do some shopping, the client can give an accurate instruction by projecting the instruction image onto the product he or she wants to purchase.

FIG. 36 shows a flowchart of the instruction image projection. Here, as shown in FIG. 21a, the case of instructing the purchase of product 52c from among products 52a, 52b, 52c, 52d, and so on lined up in a shop will be described.
When giving an instruction by means of an instruction image, the client tells the local guide 54, for example by voice call, to place the marker 60 prepared in advance near the object. The size and shape of the image of the marker 60 (the characteristic partial image) are recorded in advance in the nonvolatile memory 212 of the smartphone 200, so the smartphone 200 can calculate the distance, direction, and so on from the laser projector 84 to the marker 60 from the captured image of the marker 60; a sketch of that estimate is given below.
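The sketch uses a pinhole-camera approximation only; the focal length in pixels, the frame-centre convention, and the helper name are assumptions, and a real implementation would use all four marker corners for a full pose estimate.

    import math

    def marker_range_and_bearing(marker_px_width, marker_px_center,
                                 marker_real_width_m, focal_px, frame_center_px):
        """Estimate distance and direction from the camera/projector to the
        marker from a single image, exploiting the marker's known real size."""
        # Pinhole model: apparent width shrinks in proportion to distance.
        distance_m = focal_px * marker_real_width_m / marker_px_width
        dx = marker_px_center[0] - frame_center_px[0]
        dy = marker_px_center[1] - frame_center_px[1]
        yaw_rad = math.atan2(dx, focal_px)      # left/right bearing
        pitch_rad = math.atan2(-dy, focal_px)   # up/down bearing
        return distance_m, yaw_rad, pitch_rad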
Through the processing of FIG. 35 described above, the captured image is displayed as a moving image on the display 306 of the client device IT. While viewing this image, the client clicks the instruction input mode button 501 to give a fixing command with the marker 60 in view, as shown in FIG. 21a. Here, the marker 60, prepared as a card, is assumed to have been placed leaning against product 52b.

When the fixing command is given by clicking the instruction input mode button 501, the client device IT takes the captured image at that moment as the reference captured image and displays it on the display 306 as a still image (step S52). The client then uses the mouse 316 to input, on this still image, an instruction to the local guide in the form of an instruction image (step S53). For example, as shown in FIG. 21b, the client draws and inputs a circle 62 with the mouse 316 over the image of product 52c displayed on the display 306.

The client device IT transmits the fixing command to the smartphone 200 via the server device SV (steps S51, S93). On receiving the fixing command, the smartphone 200 records the captured image at the time of reception in the nonvolatile memory 212 as the reference captured image (step S32).
 The client device IT and the smartphone 200 can therefore recognize the captured image taken at the same point in time as the reference captured image.
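 A minimal sketch of this synchronization is shown below, assuming a simple JSON message relayed through the server: both ends latch whatever frame is current when the fixing command is issued or received, so that later instruction coordinates refer to the same reference image. The message format, class names, and helper objects are hypothetical and are not part of the embodiment.

```python
import json

# --- client device IT side (sketch) ---
def on_fix_button_clicked(current_frame, send_to_server):
    """Latch the current frame as the reference captured image and notify the site."""
    reference_frame = current_frame.copy()       # still image shown in step S52
    send_to_server(json.dumps({"type": "FIX"}))  # relayed via the server (S51, S93)
    return reference_frame

# --- smartphone 200 side (sketch) ---
class LocalDevice:
    def __init__(self, camera, storage):
        self.camera = camera            # assumed to provide latest_frame()
        self.storage = storage          # stands in for nonvolatile memory 212
        self.reference_frame = None

    def on_message(self, raw):
        msg = json.loads(raw)
        if msg["type"] == "FIX":
            # Record the frame current at reception as the reference image (step S32).
            self.reference_frame = self.camera.latest_frame()
            self.storage.save("reference_captured_image", self.reference_frame)
        elif msg["type"] == "INSTRUCTION_IMAGE":
            # Coordinates are expressed relative to the shared reference image.
            self.handle_instruction(msg["coords"], msg["image"])

    def handle_instruction(self, coords, image):
        # Projection of the instruction image is sketched further below.
        pass
```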
 When the client finishes inputting the instruction image, the client clicks the instruction image transmission button 502 (displayed when the instruction input mode is entered) shown at the lower right of the reference captured image on the display 306. The client device IT thereby transmits the instruction image to the smartphone 200 via the server device SV (steps S53, S94).
 The client device IT then cancels the instruction input mode, stops displaying the still reference captured image, and again displays the incoming captured images as a moving image (step S54). This allows the client to see the local situation again.
 Upon receiving the instruction image data of FIG. 24A, the smartphone 200 holds it in the memory 204. The smartphone 200 then acquires the current captured image from the camera 82 (step S33).
 As described above, the imaging range of the camera 82 and the projection range of the laser projector 84 are configured to coincide. Therefore, if the current captured image is exactly the same as the recorded reference captured image (that is, if the local guide has not moved at all since the reference captured image was taken), projecting the instruction image data with the laser projector 84 at the position based on the reference coordinate position (FIG. 24C) causes the instruction image 62 to be projected onto the product 52c.
 Since the position of this instruction image 62 matches the position the client input on the display 306, the target product 52c can be indicated to the local guide accurately. Using the instruction image 62 as a mark, the local guide can purchase the product 52c without error.
 Even when the local guide changes direction or moves, the process for controlling the instruction image 62 so that it is displayed at the correct position is the same as in the second embodiment.
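 As an illustration only, the sketch below shows one way such a correction could be realized in software: since the camera 82 and the laser projector 84 cover the same range, an instruction position given in reference-image coordinates can be re-anchored in the current frame by the displacement of the marker 60 between the reference image and the current image. The OpenCV template-matching call and the translation-only model are assumptions made for the sketch; the embodiment only requires that the instruction image 62 stay on the correct object.

```python
import cv2
import numpy as np

def locate_marker_center(frame, marker_template):
    """Locate marker 60 in a frame by template matching; returns its centre (x, y)."""
    result = cv2.matchTemplate(frame, marker_template, cv2.TM_CCOEFF_NORMED)
    _, _, _, top_left = cv2.minMaxLoc(result)
    h, w = marker_template.shape[:2]
    return np.array([top_left[0] + w / 2.0, top_left[1] + h / 2.0])

def corrected_projection_point(instr_xy, ref_frame, cur_frame, marker_template):
    """Shift an instruction position from the reference image into the current frame
    by the displacement of the marker between the two frames."""
    shift = (locate_marker_center(cur_frame, marker_template)
             - locate_marker_center(ref_frame, marker_template))
    return tuple(np.asarray(instr_xy, dtype=float) + shift)

# Because the imaging range of camera 82 and the projection range of projector 84
# coincide, the corrected image coordinates can be used directly as the projector
# coordinates at which to draw the circle mark 62.
```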
4.4 Variations
 (1) In the embodiment described above, the correction means 26 is provided to display the instruction image correctly by means of image processing or projection control. However, when fine precision is not required, the direction control means 20 alone, which controls the projection direction of the projection unit 14 via the drive unit 16, may suffice.
 (2) In the embodiment described above, the laser projector 94 is provided to project the instruction image 62. However, when the instruction image 62 need not be projected and the client only needs to view the on-site image, the laser projector 94 may be omitted. Even in this case, the client can view images in the direction he or she desires.
 (3) In the embodiment described above, in addition to the drive unit 16 controlling the projection direction, the smartphone 200 performs image processing and projection control (step S34) so that the instruction image is displayed correctly while following the marker 60 and the like. However, the control for following the marker 60 and the like may also be performed by the drive unit 16 (a sketch of such a drive-based tracking loop is given, for illustration, after these variations).
 (4) The above variations can be applied in combination with one another as long as doing so does not contradict their essence. They can also be applied in combination with the other embodiments and their variations.
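 For illustration only, a drive-based tracking loop such as the one mentioned in variation (3) could be as simple as a proportional controller that steers the drive unit 16 so that the marker stays at a target position in the camera image. The controller gain, the pan/tilt interface, the loop rate, and the sign conventions below are assumptions made for the sketch, not details of the embodiment.

```python
import time

KP = 0.05             # proportional gain (assumed): degrees of pan/tilt per pixel of error
TARGET = (960, 540)   # desired marker position in the image (assumed: image centre)

def track_marker_with_drive(camera, drive, locate_marker_center, marker_template):
    """Keep the projection direction on the marker by commanding the drive unit."""
    while True:
        frame = camera.latest_frame()
        cx, cy = locate_marker_center(frame, marker_template)
        err_x, err_y = cx - TARGET[0], cy - TARGET[1]
        # Small corrective pan/tilt steps proportional to the pixel error.
        # The signs depend on the axis conventions of the drive unit.
        drive.pan_by(KP * err_x)
        drive.tilt_by(KP * err_y)
        time.sleep(0.05)  # roughly 20 Hz control loop (assumed)
```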


Claims (16)

  1.  A remote experience system comprising a plurality of local devices, a server device, and a client device, wherein
     the local device comprises:
     an imaging unit which a local guide or a local guide robot wears on its own body or mounts on a mobile body that moves together with it;
     guidance-possible transmitting means for transmitting, by a transmitting unit, its own position and an indication that guidance is possible to the server device as possible information when the guide or guide robot becomes able to provide guidance;
     request information receiving means for receiving, by a receiving unit, request information from the server device and entering a guide mode with respect to the client device concerned; and
     captured image transmitting means for transmitting, by the transmitting unit, images captured by the imaging unit to the server device in the guide mode;
     the server device comprises:
     possible information receiving means for receiving, by a receiving unit, the possible information from the local devices;
     possible device list transmitting means for transmitting, by a transmitting unit, to the client device a possible device list in which the local devices that have transmitted the possible information are arranged on a map;
     request information transfer means for transmitting, by the transmitting unit, the request information received from the client device by the receiving unit to the local device concerned; and
     captured image transfer means for receiving, by the receiving unit, the captured images from the local device and transmitting, by the transmitting unit, the captured images to the client device; and
     the client device comprises:
     possible device list receiving means for receiving, by a receiving unit, the possible device list;
     possible device list display means for displaying the possible device list on a display unit;
     guidance request transmitting means for transmitting, by a transmitting unit, to the server device as request information an identification code of the local device selected by an operation of the client from the displayed possible device list, a guidance request, and an identification code of the client device;
     captured image receiving means for receiving, by the receiving unit, the captured images; and
     captured image display means for displaying the received captured images on the display unit.
  2.  A local device for constructing a remote experience system together with a server device and a client device, the local device comprising:
     an imaging unit which a local guide or a local guide robot wears on its own body or mounts on a mobile body that moves together with it;
     guidance-possible transmitting means for transmitting, by a transmitting unit, its own position and an indication that guidance is possible to the server device as possible information when the guide or guide robot becomes able to provide guidance, so that the client device can display on a map a list of local devices that are in a guidance-possible state;
     request information receiving means for receiving, by a receiving unit, request information from the server device and entering a guide mode with respect to the client device concerned when this local device is selected from the list; and
     captured image transmitting means for transmitting, by the transmitting unit, images captured by the imaging unit to the server device in the guide mode.
  3.  A local program for causing a computer to implement a local device for constructing a remote experience system together with a server device and a client device, the local program causing the computer to function as:
     guidance-possible transmitting means for transmitting, by a transmitting unit, its own position and an indication that guidance is possible to the server device as possible information when the guide or guide robot becomes able to provide guidance, so that the client device can display on a map a list of local devices that are in a guidance-possible state;
     request information receiving means for receiving, by a receiving unit, request information from the server device and entering a guide mode with respect to the client device concerned when this local device is selected from the list; and
     captured image transmitting means for transmitting, by the transmitting unit, to the server device in the guide mode images captured by an imaging unit which a local guide or a local guide robot wears on its own body or mounts on a mobile body that moves together with it.
  4.  A server device for constructing a remote experience system together with local devices and a client device, the server device comprising:
     possible information receiving means for receiving, by a receiving unit, from a local device, as possible information, the position of the local device whose guide or guide robot is able to provide guidance and an indication that guidance is possible;
     possible device list transmitting means for transmitting, by a transmitting unit, to the client device a possible device list in which the local devices that have transmitted the possible information are arranged on a map;
     request information transfer means for transmitting, by the transmitting unit, request information received from the client device by the receiving unit to the local device concerned; and
     captured image transfer means for receiving, by the receiving unit, captured images from the local device and transmitting, by the transmitting unit, the captured images to the client device.
  5.  A client device for constructing a remote experience system together with a server device and local devices, the client device comprising:
     possible device list receiving means for receiving, by a receiving unit, from the server device a possible device list showing on a map the local devices that are in a guidance-possible state;
     possible device list display means for displaying the possible device list on a display unit;
     guidance request transmitting means for transmitting, by a transmitting unit, to the server device as request information an identification code of the local device selected by an operation of the client from the displayed possible device list, a guidance request, and an identification code of the client device;
     captured image receiving means for receiving, by the receiving unit, captured images; and
     captured image display means for displaying the received captured images on the display unit.
  6.  A client program for causing a computer to implement a client device for constructing a remote experience system together with a server device and local devices, the client program causing the computer to function as:
     possible device list receiving means for receiving, by a receiving unit, from the server device a possible device list showing on a map the local devices that are in a guidance-possible state;
     possible device list display means for displaying the possible device list on a display unit;
     guidance request transmitting means for transmitting, by a transmitting unit, to the server device as request information an identification code of the local device selected by an operation of the client from the displayed possible device list, a guidance request, and an identification code of the client device;
     captured image receiving means for receiving, by the receiving unit, captured images; and
     captured image display means for displaying the received captured images on a display unit.
  7.  The system, device, or program according to any one of claims 1 to 6, wherein
     the imaging unit of the local device is a wide-angle imaging unit that outputs a wide-angle captured image,
     the captured image transfer means of the server device, upon receiving a direction instruction from the client device, selects from the received wide-angle captured image a region corresponding to the direction instruction and transmits it, by the transmitting unit, to the client device as a selected-region image, and
     the client device comprises direction instruction transmitting means for transmitting to the server device a direction instruction input by the client while viewing the selected-region image displayed on the display unit.
  8.  The system, device, or program according to claim 7, wherein
     the wide-angle imaging unit of the local device detects a change in its own orientation and outputs the wide-angle captured image so that a predetermined direction serves as a reference direction regardless of the movement of the local guide, the local guide robot, or the mobile body, and
     the direction instruction is given with respect to the reference direction.
  9.  The system, device, or program according to claim 7, wherein
     a plurality of client devices are provided which receive and display the selected-region image from the local device via the server device, and
     the direction instructions given by the respective client devices are different from one another.
  10.  The system, device, or program according to claim 7, wherein
     the local device further comprises:
     a projection unit which the local guide or the local guide robot wears on its own body or mounts on a mobile body that moves together with it and which projects an instruction image into the local space based on given instruction image data;
     a drive unit that changes the projection direction of the projection unit; and
     direction control means for controlling the drive unit, in response to output from a sensor that detects the orientation of the projection unit, so that the projection direction of the projection unit faces a predetermined direction centered on the local guide, the local guide robot, or the mobile body regardless of their movement, and
     the client device further comprises:
     fixing command means for giving a fixing command to the local device by the transmitting unit; and
     instruction image transmitting means for, when the fixing command is given, setting the vicinity of a characteristic partial image in the captured image as a reference captured image, and transmitting to the local device via the server device instruction image data in which the position of an instruction image is specified within the reference captured image.
  11.  The system, device, or program according to claim 10, further comprising correction means for correcting the projection of the instruction image by the projection unit, without relying on the drive unit, so that the instruction image is displayed correctly with respect to a predetermined part of the local space.
  12.  The system, device, or program according to claim 7, wherein
     the local device further comprises:
     a wide-angle projection unit which the local guide or the local guide robot wears on its own body or mounts on a mobile body that moves together with it and which projects an instruction image into the local space based on given instruction image data; and
     direction control means for controlling projection by the wide-angle projection unit so that the projection direction of the projection unit faces a predetermined direction centered on the local guide, the local guide robot, or the mobile body regardless of their movement, and
     the client device further comprises:
     fixing command means for giving a fixing command to the local device by the transmitting unit; and
     instruction image transmitting means for, when the fixing command is given, setting the vicinity of a characteristic partial image in the captured image as a reference captured image, and transmitting to the local device via the server device instruction image data in which the position of an instruction image is specified within the reference captured image.
  13.  The system, device, or program according to claim 12, further comprising correction means for correcting the projection of the instruction image by the projection unit so that the instruction image is displayed correctly with respect to a predetermined part of the local space.
  14.  The system, device, or program according to any one of claims 1 to 6, wherein
     the local device further comprises:
     a drive unit that changes the imaging direction of the imaging unit; and
     direction control means for controlling the drive unit, in response to output from a sensor that detects the orientation of the imaging unit, so that the imaging unit faces a predetermined direction based on a direction instruction, centered on the local guide or the local guide robot, regardless of the movement of the local guide, the local guide robot, or the mobile body, and
     the client device comprises direction instruction transmitting means for transmitting to the server device a direction instruction input by the client while viewing the captured image displayed on the display unit.
  15.  The system, device, or program according to claim 14, wherein
     the local device further comprises:
     a projection unit which the local guide or the local guide robot wears on its own body or mounts on a mobile body that moves together with it and which projects an instruction image into the local space based on given instruction image data, the drive unit also changing the projection direction of the projection unit; and
     direction control means for controlling the drive unit, in response to output from a sensor that detects the orientation of the projection unit, so that the projection direction of the projection unit faces a predetermined direction centered on the local guide, the local guide robot, or the mobile body regardless of their movement, and
     the client device further comprises:
     fixing command means for giving a fixing command to the local device by the transmitting unit; and
     instruction image transmitting means for, when the fixing command is given, setting the vicinity of a characteristic partial image in the captured image as a reference captured image, and transmitting to the local device via the server device instruction image data in which the position of an instruction image is specified within the reference captured image.
  16.  The system, device, or program according to claim 15, further comprising correction means for correcting the projection of the instruction image by the projection unit, without relying on the drive unit, so that the instruction image is displayed correctly with respect to a predetermined part of the local space.



PCT/JP2023/005479 2022-04-20 2023-02-16 Remote experience system WO2023203853A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-069247 2022-04-20
JP2022069247 2022-04-20

Publications (1)

Publication Number Publication Date
WO2023203853A1 true WO2023203853A1 (en) 2023-10-26

Family

ID=88419658

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/005479 WO2023203853A1 (en) 2022-04-20 2023-02-16 Remote experience system

Country Status (1)

Country Link
WO (1) WO2023203853A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005117621A (en) * 2003-09-16 2005-04-28 Honda Motor Co Ltd Image distribution system
JP2005323310A (en) * 2004-05-11 2005-11-17 Nippon Telegr & Teleph Corp <Ntt> Visual field sharing instrument, visual field movement input unit, picture image display device, photographing scope projection method, control method of visual field movement input unit, control method of picture image display device, program of visual field sharing device. and program of both visual field movement input device and picture image dispaly device
JP2013192029A (en) * 2012-03-14 2013-09-26 Renesas Mobile Corp Portable terminal including photographing function and image acquisition system using the same
WO2014077046A1 (en) * 2012-11-13 2014-05-22 ソニー株式会社 Image display device and image display method, mobile body device, image display system, and computer program
JP2014225108A (en) * 2013-05-16 2014-12-04 ソニー株式会社 Image processing apparatus, image processing method, and program
JP2016126365A (en) * 2014-12-26 2016-07-11 セイコーエプソン株式会社 Display system, display device, information display method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23791509

Country of ref document: EP

Kind code of ref document: A1