WO2020090697A1 - Remote live video amusement facility and method for billing user using said remote live video amusement facility - Google Patents


Info

Publication number
WO2020090697A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
live video
sub
user device
experience
Prior art date
Application number
PCT/JP2019/042078
Other languages
French (fr)
Japanese (ja)
Inventor
井筒政弘
Original Assignee
株式会社Dapリアライズ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Dapリアライズ
Priority to JP2019565360A priority Critical patent/JPWO2020090697A1/en
Publication of WO2020090697A1 publication Critical patent/WO2020090697A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63G: MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
    • A63G31/00: Amusement arrangements
    • A63G31/16: Amusement arrangements creating illusions of travel
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47: End-user applications
    • H04N21/485: End-user interface for client configuration

Definitions

  • The present invention relates to a remote live video entertainment facility in which a user, while staying in one place, can enjoy live video of targets existing in a plurality of remote places.
  • The present invention also relates to a method for charging a user who uses such a remote live video entertainment facility.
  • The tourism industry now accounts for about 10% of the world's GDP and is predicted to become the "largest industry of the 21st century." Within this trend, there is growing demand for tourism not only in traditional tourist destinations but also in unexplored regions and frontier spaces such as outer space and the deep sea.
  • Against this background, a remote live video entertainment facility has been proposed that allows people to experience sightseeing at multiple tourist destinations while staying in one place.
  • An object whose video can be displayed on the live video display means of a user device installed in the remote live video entertainment facility is hereinafter abbreviated as a "displayable target of the user device".
  • However, when a user uses such a remote live video entertainment facility and is to be charged for it, the number of displayable targets of the user devices grows and usage patterns diversify, so there has been the problem that no appropriate charging method could be applied according to each user's usage.
  • The present invention has been made in view of the above circumstances. An object of the present invention is to provide a remote live video entertainment facility in which, even when the user devices installed inside it have a large number of displayable targets, the user is less likely to be confused about which target's live video to enjoy, and the operations required to display the desired live video are not complicated.
  • Another object of the present invention is to provide a remote live video entertainment facility that, even when its user devices have a large number of displayable targets, can give the user the illusion of actually staying in the remote place where the target whose live video is being enjoyed exists.
  • A further object of the present invention is to provide a remote live video entertainment facility, and a method for charging users of that facility, that can apply an appropriate charging method according to each user's usage even as the number of displayable targets of the installed user devices increases.
  • A first aspect of the present invention relating to a remote live video entertainment facility is configured as follows.
  • A remote live video entertainment facility in which a plurality of user devices are installed, each equipped with live video display means for displaying live video of a target existing in a remote place,
  • wherein the targets whose video can be displayed on the live video display means of a user device (hereinafter abbreviated as the "displayable targets of the user device") change according to selections or declarations made by the user who uses that user device.
  • For example, when the user enters the remote live video entertainment facility, the user can select the type or range of displayable targets of the user device to be used, and can declare his or her own demographics (age, sex, place of residence, height/weight, occupation, etc.); the results of these selections and declarations are recorded on an admission card (for example, a contactless IC card) held by the user and are read by the user device.
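The selection/declaration record described above could be sketched as a simple data structure written to the admission card at entry and read back by the user device. The field names and card format here are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdmissionCard:
    """Hypothetical record written to the contactless IC admission card at entry."""
    card_id: str
    selected_targets: list          # displayable-target types/ranges chosen by the user
    age: Optional[int] = None       # optional demographic declarations
    residence: Optional[str] = None
    occupation: Optional[str] = None

    def readable_by_device(self) -> dict:
        """The subset of the record a user device's card reader consults."""
        return {
            "card_id": self.card_id,
            "selected_targets": list(self.selected_targets),
            "age": self.age,
        }

# A visitor selects two target types and declares an age at the entrance.
card = AdmissionCard("C-0001", ["natural scenery", "ruins"], age=34)
info = card.readable_by_device()
```

A user device would then limit its displayable targets to `info["selected_targets"]`.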
  • An immersive 3D head-mounted display (immersive 3D-HMD; hereinafter simply referred to as an "HMD") can be used as the live video display means.
  • In addition to the HMD, the user device may be configured to include seating means, such as a reclining seat on which the user wearing the HMD sits, and a controller manually operated by the user.
  • a second aspect of the present invention relating to a remote live video entertainment facility is as follows.
  • The experience area in which the user devices are installed in the remote live video entertainment facility is divided into a plurality of sub-experience areas,
  • and the displayable targets of a user device are set to differ depending on which sub-experience area the user device is installed in.
  • the experience area can be divided into multiple sub-experience areas using fixed walls or movable partitions.
  • A third aspect of the present invention relating to a remote live video entertainment facility is configured such that, in the remote live video entertainment facility of the second aspect, at least one of the interior decoration, exhibits, brightness, temperature, and humidity in a sub-experience area is adapted to the current conditions of the remote place where at least one of the displayable targets of the user devices installed in that sub-experience area exists.
  • For example, current-condition information such as brightness, temperature, and humidity can be obtained from the remote place, and the lighting equipment and air conditioners can be operated based on that information.
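As a rough illustration of the environment adaptation just described, the following sketch derives lighting and air-conditioning setpoints for a sub-experience area from current-condition readings at the remote place. The key names, units, and normalized brightness scale are assumptions for illustration only.

```python
def adapt_sub_area(remote_conditions: dict) -> dict:
    """Hypothetical controller: derive lighting and air-conditioning commands
    for a sub-experience area from current conditions at the remote place.
    Key names and the normalized 0..1 brightness scale are assumptions."""
    return {
        # clamp brightness into the lighting system's 0..1 range
        "lighting_level": max(0.0, min(1.0, remote_conditions["brightness"])),
        "ac_target_temp_c": remote_conditions["temperature_c"],
        "ac_target_humidity_pct": remote_conditions["humidity_pct"],
    }

# Example: the remote tourist spot reports dusk light, 18 degrees C, 60 % humidity.
cmds = adapt_sub_area({"brightness": 0.3, "temperature_c": 18.0, "humidity_pct": 60.0})
```

In practice the remote-place sensors would stream these readings periodically and the sub-experience area's equipment would track the setpoints.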
  • Also, a typical building of the remote area can be drawn on the wall surface, or typical plants can be exhibited.
  • A fourth aspect of the present invention relating to a remote live video entertainment facility is configured such that, in the remote live video entertainment facility of any one of the first to third aspects, the displayable targets of the user devices change with the date and time.
  • A fifth aspect of the present invention, relating to a method for charging a user who uses a remote live video entertainment facility, is a charging method for a user of the remote live video entertainment facility according to any one of the first to fourth aspects, in which the user is charged according to the type and/or number of displayable targets of the user device that the user uses.
  • A sixth aspect of the present invention, relating to a method for charging a user who uses a remote live video entertainment facility, is a charging method for a user of the remote live video entertainment facility according to any one of the second to fourth aspects, in which the user is charged according to the type and/or number of sub-experience areas that the user may enter.
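A minimal sketch of the sixth aspect's charging rule follows, assuming hypothetical per-area prices and a flat rate for entry to all areas. The prices are invented for illustration and do not appear in the patent.

```python
# Hypothetical tariff: per-area prices and the flat rate are invented for
# illustration and do not appear in the patent.
AREA_PRICE_YEN = {1: 1000, 2: 1000, 3: 1500, 4: 1500, 5: 2000, 6: 2000}
ALL_AREAS_FLAT_YEN = 6000   # an "admission fee A"-style flat rate for every area

def charge_for_areas(admitted_areas: set) -> int:
    """Charge according to the type/number of sub-experience areas the user
    may enter, never exceeding the flat rate for entry to all areas."""
    itemized = sum(AREA_PRICE_YEN[a] for a in admitted_areas)
    return min(itemized, ALL_AREAS_FLAT_YEN)

fee = charge_for_areas({1, 3})   # entry to sub-experience areas 1 and 3
```

Charging by displayable-target type (the fifth aspect) would follow the same pattern with a per-target tariff table.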
  • For example, when the user enters the remote live video entertainment facility, the user can select the displayable targets of the user device to be used from, for example, the targets 1 to 6 (a plurality of targets can be selected).
  • Then, only live video of the displayable targets the user selected is displayed on the live video display means of the user device that the user uses.
  • As a result, when using the user device, the user does not hesitate over which target's live video to enjoy from among a wide range of targets, and the operations required to display the live video one wants to enjoy are simplified.
  • Also, when the user enters the remote live video entertainment facility, the user can declare his or her own demographics (age, sex, religion, nationality, place of residence, height/weight, occupation, etc.), and the displayable targets of the user device that the user uses can be limited based on the declared content.
  • For example, the live video display means of a user device used by a user under 15 years old can be prevented from displaying live video of a so-called "R15" live event. Likewise, the live video display means of a user device used by a believer of a particular religion can be prevented from displaying live video that violates the prohibitions of that religion.
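The demographic-based limitation described above could be sketched as a filter over candidate displayable targets. The rating fields (`min_age`, `contraindicated_for`) are hypothetical names used only for illustration.

```python
def filter_displayable(targets: list, viewer: dict) -> list:
    """Hypothetical filter: drop targets whose age rating or religious
    contraindications conflict with the viewer's declared demographics.
    Field names such as `min_age` are illustrative assumptions."""
    allowed = []
    for target in targets:
        if viewer.get("age", 0) < target.get("min_age", 0):
            continue   # e.g. an "R15" live event for a viewer under 15
        if viewer.get("religion") in target.get("contraindicated_for", ()):
            continue   # video that violates the viewer's religious prohibitions
        allowed.append(target)
    return allowed

targets = [
    {"name": "R15 live event", "min_age": 15},
    {"name": "temple tour"},
]
visible = filter_displayable(targets, {"age": 12})
```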
  • Furthermore, with such a configuration, the user can be charged appropriately according to his or her usage.
  • For example, the experience area is divided into six sub-experience areas (sub-experience areas 1 to 6), and the displayable targets of the user devices arranged in each sub-experience area can be assigned as follows according to the region (tourist area) in which the targets exist.
  • Sub-experience area 1: users who want to enjoy live video of tourist destinations in Japan
  • Sub-experience area 2: users who want to enjoy live video of tourist destinations in Asia other than Japan
  • Sub-experience area 3: users who want to enjoy live video of tourist destinations in the United States
  • Sub-experience area 4: users who want to enjoy live video of tourist destinations in Europe
  • Sub-experience area 5: users who want to enjoy live video of tourist destinations in the Pacific region
  • Sub-experience area 6: users who want to enjoy live video of tourist destinations in Africa
  • With each user entering the sub-experience area matching his or her interest, the user is less likely to be confused about which region's tourist-spot live video to enjoy.
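The assignment above amounts to a simple lookup from sub-experience area to tourist region; a sketch:

```python
# Mapping implied by the assignment above: sub-experience area -> tourist
# region whose live video its user devices can display.
AREA_REGION = {
    1: "Japan",
    2: "Asia other than Japan",
    3: "United States",
    4: "Europe",
    5: "Pacific region",
    6: "Africa",
}

def displayable_region(sub_experience_area: int) -> str:
    """Region shown by user devices installed in the given sub-experience area."""
    return AREA_REGION[sub_experience_area]
```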
  • Also, the operations required to display the live video one wants to enjoy are simplified.
  • Alternatively, the experience area is divided into seven sub-experience areas (sub-experience areas 1 to 7), and the displayable targets of the user devices arranged in each sub-experience area can be allocated as follows according to their types.
  • Sub-experience area 1: land animals
  • Sub-experience area 2: underwater animals
  • Sub-experience area 3: plants
  • Sub-experience area 4: natural scenery
  • Sub-experience area 5: ruins
  • Sub-experience area 6: modern buildings
  • Sub-experience area 7: live events
  • Then, users who want to enjoy live video of land animals enter the sub-experience area 1, users who want to enjoy live video of underwater animals enter the sub-experience area 2, users who want to enjoy live video of plants enter the sub-experience area 3, users who want to enjoy live video of natural scenery enter the sub-experience area 4, users who want to enjoy live video of ruins enter the sub-experience area 5, users who want to enjoy live video of modern buildings enter the sub-experience area 6, and users who want to enjoy live video of live events enter the sub-experience area 7; the user is thus less likely to be confused about which target's live video to enjoy.
  • Also, the operations required to display the live video one wants to enjoy are simplified.
  • Further, all or part of the sub-experience areas can be further divided into a plurality of sub-sub-experience areas, and the displayable targets of the user devices arranged in each sub-sub-experience area can be further limited according to its type.
  • For example, the sub-experience areas 1 to 7 in paragraph [0030] can be divided into sub-sub-experience areas as follows.
  • [Sub-experience area 1] Sub-sub-experience area 1A: Japanese land animals; Sub-sub-experience area 1B: Asian land animals other than Japan; Sub-sub-experience area 1C: ...
  • [Sub-experience area 3] Sub-sub-experience area 3A: tropical plants; Sub-sub-experience area 3B: temperate plants; Sub-sub-experience area 3C: arid-zone plants; Sub-sub-experience area 3D: plateau plants
  • [Sub-experience area 4] Sub-sub-experience area 4A: natural scenery of Japan; Sub-sub-experience area 4B: natural scenery of Asia other than Japan; Sub-sub-experience area 4C: natural scenery of North, Central, and South America; Sub-sub-experience area 4D: natural scenery of Europe; Sub-sub-experience area 4E: natural scenery of the Pacific region; Sub-sub-experience area 4F: natural scenery of Africa
  • [Sub-experience area 5] Sub-sub-experience area 5A: ...
  • Then, a user who wants to enjoy video of Japanese land animals enters the sub-sub-experience area 1A, a user who wants to enjoy video of a live sports event enters the sub-sub-experience area 7A, and so on.
  • As a result, when using the user device, the user is less likely to be confused about which target's live video to enjoy from among a wide range of targets.
  • Also, the operations required to display the live video one wants to enjoy are simplified.
  • In the remote live video entertainment facilities of the second to fourth aspects of the present invention, an independent entrance can be provided for each sub-experience area to manage user entry and exit.
  • Then, by using a charging method that charges a user of the remote live video entertainment facility of the present invention according to the type and/or number of sub-experience areas the user may enter, the user can be charged appropriately according to his or her usage.
  • In the remote live video entertainment facility of the third or fourth aspect of the present invention, at least one of the interior decoration, exhibits, brightness, temperature, and humidity in a sub-experience area can be adapted to the current conditions of the remote place where at least one of the displayable targets of the user devices installed in that sub-experience area exists; it is therefore possible to give the user the illusion of actually staying in the remote place where the target whose live video is being enjoyed exists.
  • For example, the brightness, temperature, and humidity in each sub-experience area can be adjusted according to the current time and weather at each tourist destination, and exhibits that match the customs of each tourist destination can be displayed inside each sub-experience area.
  • In the remote live video entertainment facility of the fourth aspect of the present invention, it can be configured so that only live video that the user can actually enjoy while using the user device is displayed.
  • For example, a sub-sub-experience area for natural scenery can be configured to display only natural phenomena actually occurring while the user is using the user device (for example, an aurora in a polar region, a volcanic eruption, a solar or lunar eclipse, cherry blossoms, or autumn leaves).
  • Similarly, the sub-sub-experience area 7A can be configured to display only sports events (for example, domestic professional baseball games) being held while the user is using the user device, and the sub-sub-experience area 7D can be configured to display only tourist events (for example, festivals, fireworks, or rocket launches) being held while the user is using the user device.
  • As a result, live video that is out of season or out of time is not displayed on the live video display means of the user device, and the operations required to display the live video the user wants to enjoy are simplified.
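The fourth aspect's time restriction can be sketched as filtering candidate events by whether they are in progress at the time of use. The event records and epoch-second timestamps are illustrative assumptions.

```python
def currently_displayable(events: list, now: float) -> list:
    """Hypothetical time filter: keep only events whose live video is in
    progress at time `now` (epoch seconds; the records are illustrative)."""
    return [e["name"] for e in events if e["start"] <= now < e["end"]]

events = [
    {"name": "pro baseball game", "start": 100.0, "end": 200.0},
    {"name": "fireworks festival", "start": 300.0, "end": 400.0},
]
on_now = currently_displayable(events, now=150.0)
```

The user device's selection menu would be rebuilt from this filtered list each time it is shown.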
  • According to the charging method of the fifth or sixth aspect of the present invention for a user of the remote live video entertainment facility, the user can be charged appropriately according to his or her usage.
  • FIG. 3 is an image diagram for explaining a configuration and a function of a local system that transmits a live video to a user device inside a remote live video entertainment facility according to the first embodiment of the present invention.
  • FIG. 3 is an image diagram for explaining the function of the user device that constitutes the remote live video entertainment facility according to the first embodiment of the present invention.
  • FIG. 6 is another image diagram for explaining the function of the user device constituting the remote live video entertainment facility according to the first embodiment of the present invention. A further explanatory drawing illustrates the imaging range of the imaging means, the viewpoint, the gaze direction, and the point of gaze (the center of the imaging range).
  • FIG. 8 is an image diagram for explaining the configuration and function of a space field system and a relay device for transmitting live video to a user device inside a remote live video entertainment facility according to a second embodiment of the present invention.
  • FIG. 1 is a block diagram for explaining the configuration and functions of a remote live video entertainment facility according to a first embodiment of the present invention.
  • the experience area in which the user device is installed is divided into three sub-experience areas 1 to 3.
  • User devices 11_511 to 1i_51i are installed in the sub-experience area 1 (i is the total number of user devices installed in the sub-experience area 1 and can be set arbitrarily).
  • The user devices 11_511 to 3k_53k take the form of reclining seats on which the user sits; each reclining seat is provided with a controller and a headset, and a communication unit is housed inside the seat.
  • The controller has a form similar to that of the transmitter ("propo")-type controllers used for ordinary aerial photography drones and includes buttons, levers, dials, and the like operated by the user.
  • The headset integrates the HMD and the headphones.
  • each of the user devices 11_511 to 3k_53k is provided with an identification number defined within the remote live video entertainment facility.
  • a server area is provided separately from the sub-experience areas 1 to 3, and the facility server 4 is installed inside the server area.
  • The facility server 4 is connected to the Internet 3 and exchanges signals with the local systems α1 to αm (m is the total number of local systems installed in the specific area α and can be set arbitrarily; the figure shows only the local systems α1 to α3), the local systems β1 to βn (n is the total number of local systems installed in the specific area β and can be set arbitrarily; only β1 to β3 are shown), and the local systems γ1 to γp (p is the total number of local systems installed in the specific area γ and can be set arbitrarily; only γ1 to γ3 are shown), all of which are likewise connected to the Internet 3.
  • The local system α1 comprises aerial photography drones α11_1α11 to α1r_1α1r (r is the total number of aerial photography drones included in the local system α1 and can be set arbitrarily; only the aerial photography drones α11_1α11 to α13_1α13 are shown in FIG. 1) and a local parent device α1_2α1. Each of the aerial photography drones α11_1α11 to α1r_1α1r is given an identification number defined within the local system α1.
  • FIG. 2 is an image diagram for explaining in more detail the configuration and function of the local system α1 that transmits live video to the user devices inside the remote live video entertainment facility according to the first embodiment of the present invention (FIG. 2 shows only the aerial photography drones α11_1α11 to α16_1α16).
  • The local parent device α1_2α1 is composed of a transmitting/receiving antenna 2α11 and a local server 2α12, and the local server 2α12 is connected to the Internet 3.
  • The aerial photography drones α11_1α11 to α1r_1α1r are multicopter-type unmanned aerial vehicles that float in the air by controlling the rotation speed of each rotor and can perform straight movement, turning movement, and rotational movement.
  • Each of the aerial photography drones α11_1α11 to α1r_1α1r is equipped with two CMOS camera modules and video signal conversion means for converting the imaging electrical signals from the two image sensors into an uncompressed 3D video signal, and transmits the generated uncompressed 3D video signal to the local parent device α1_2α1 by wireless communication.
  • Each CMOS camera module is mounted on a three-axis platform, and by rotating the platform around its three rotation axes, the imaging device can be tilted and panned forward, backward, left, and right.
  • In this embodiment, CMOS camera modules are used as the imaging means, but other imaging means (for example, CCD camera modules) may be used.
  • The aerial photography drones α11_1α11 to α1r_1α1r are each provided with approach determination means that determines, based on the video information carried by the uncompressed video signal generated by the image sensors and the video signal conversion means, whether the drone is approaching another aerial photography drone, the local parent device, or another obstacle.
  • The flight control means controls the rotation speed of each rotor based not only on the flight control secondary signal transmitted from the local parent device α1_2α1 (described in a later paragraph) but also on the result determined by the approach determination means, so that contact with other aerial photography drones, the local parent device, and other obstacles can be avoided.
  • The aerial photography drones α11_1α11 to α1r_1α1r are also equipped with microphones, and the audio signals each collects are transmitted to the local parent device α1_2α1 by wireless communication.
  • The local server 2α12 constituting the local parent device α1_2α1 receives, via the transmitting/receiving antenna 2α11, the uncompressed 3D video signals that the video signal conversion means converted from the imaging electrical signals captured by the image sensors of the aerial photography drones α11_1α11 to α1r_1α1r, subjects them to compression encoding processing, and transmits them onward as compressed 3D video signals.
  • the functions and configurations related to transmission of live video and live audio are as outlined above.
  • the functions and configurations related to the flight control of the aerial photography drone will be explained in a later paragraph.
  • The local systems α2 to αm, the local systems β1 to βn, and the local systems γ1 to γp have the same functions and configurations as the local system α1, including those described above.
  • The local systems α1 to αm, the local systems β1 to βn, and the local systems γ1 to γp are installed in different areas (hereinafter, the local systems α1 to αm are described as installed in Japan, the local systems β1 to βn in North America, and the local systems γ1 to γp in Europe), and each local system is installed at a different tourist resort within its area.
  • Admission fee A: entry to all of the sub-experience areas 1 to 3
  • Admission fee B: entry to only one of the sub-experience areas 1 to 3 (the user selects which sub-experience area to enter)
  • The HMDs and headphones of the user devices 11_511 to 3k_53k installed in the remote live video entertainment facility are set so that they can display and output only live video and audio transmitted from specific local systems, according to which sub-experience area the user device is installed in.
  • The correspondence between the user devices inside the sub-experience areas 1 to 3 and the regions (tourist resorts) where their assigned local systems are installed is as follows.
  • Sub-experience area 1 (user devices 11_511 to 1i_51i) ... local systems α1 to αm (Japan)
  • Sub-experience area 2 (user devices 21_521 to 2j_52j) ... local systems β1 to βn (North America)
  • Sub-experience area 3 (user devices 31_531 to 3k_53k) ... local systems γ1 to γp (Europe)
  • A user of the remote live video entertainment facility rents an admission card at the entrance to the facility as a whole by paying either admission fee A or admission fee B.
  • The admission card is a contactless IC card on which information about the sub-experience areas the user may enter is recorded according to the admission fee the user paid and the selection the user made.
  • The user who has rented the admission card passes through the access area to the entrance of the sub-experience area he or she wishes to enter.
  • At the entrance of each sub-experience area, an entrance management gate equipped with means for reading the information recorded on the admission card is installed; the gate opens only when an admission card recording permission to enter that sub-experience area is held over its information reading section. In this way, the user can enter only the sub-experience areas corresponding to the admission fee paid and the selection made.
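The entrance-management-gate check described above reduces to testing whether the gate's sub-experience area is among the areas recorded on the admission card; a minimal sketch with an assumed card-record layout:

```python
def gate_opens(card_record: dict, gate_area: int) -> bool:
    """Hypothetical entrance-gate check: the gate opens only if the area it
    guards is among the admitted areas recorded on the admission card."""
    return gate_area in card_record.get("admitted_areas", ())

# Admission fee B: the user selected sub-experience area 2 only.
card_record = {"admitted_areas": {2}}
```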
  • Next, the functions and configurations of the user devices installed inside the sub-experience areas will be described, taking the user devices 11_511 to 1i_51i inside the sub-experience area 1 as representatives. As noted above, the user devices 11_511 to 1i_51i inside the sub-experience area 1 are set so that they can output only live video and live audio of tourist spots in Japan.
  • A user who has entered the sub-experience area 1 sits on the reclining seat of a user device among the user devices 11_511 to 1i_51i that is not being used by another user, found either by the user or with the guidance of an attendant of the remote live video entertainment facility. Hereinafter, the user is described as sitting on the reclining seat of the user device 11_511, but the functions and configurations described below are common to the other user devices 12_512 to 1i_51i.
  • The seat of the user device 11_511 is provided with an admission card insertion unit that reads and writes information from and to the admission card. When the user inserts the admission card into the insertion unit, the user device 11_511 is activated, and the time (t0) at which the user device was activated is recorded on the admission card.
  • After sitting on the reclining seat, the user adjusts the position of the controller so that it is easy to operate by hand, and adjusts the headset so that the HMD covers the field of view and the headphones cover the ears.
  • Basically, the displayable targets of the user devices 11_511 to 3k_53k are determined by which sub-experience area each user device is installed in; however, as illustrated below, the displayable targets can also be varied among user devices installed in the same sub-experience area.
  • Each of the user devices 11_511 to 3k_53k is provided with a reading unit for reading the information recorded on the admission card. When the reading unit reads, for example, that the admission fee the user paid was admission fee C, the virtual space projected on the HMD of the user device used by that user can present an image showing fewer options than in FIG. 3 (for example, "modern building" is not displayed) or an image showing fewer options than in FIG. 4 (for example, only the "Horyuji area" and "Kyoto" are displayed), thereby limiting the displayable targets of the user device the user operates.
  • The facility server 4 stores and switches the image data for the "image for selecting the type of tourist spot" and the "image for selecting the place or property of the tourist spot".
  • When the user makes a selection, a signal conveying that information is transmitted to the facility server 4 via the communication unit.
  • The facility server 4 assigns an IP address to the user device 11_511 based on the identification number given in advance to the user device 11_511.
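The patent does not specify how the identification number maps to an IP address; the following sketch shows one hypothetical scheme in which the facility server derives a private IPv4 address from the facility-internal identification number.

```python
def assign_ip(identification_number: int, subnet: str = "10.0.0") -> str:
    """Hypothetical scheme: derive a private IPv4 address from a device's
    facility-internal identification number. The subnet and the modular
    host-number formula are assumptions, not from the patent."""
    host = identification_number % 254 + 1   # keep the host part in 1..254
    return f"{subnet}.{host}"

ip = assign_ip(511)   # e.g. the identification number of user device 11_511
```

A real deployment would more likely use DHCP reservations or a lookup table keyed by identification number; the point is only that the mapping is deterministic.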
  • Next, the facility server 4 transmits, via the Internet 3, to the local system installed near Munakata Taisha close to Okinoshima (hereinafter described as the local system α1), a signal requesting that live video and live audio of Okinoshima be transmitted to the user device 11_511 to which the IP address has been assigned.
  • Upon receiving the request signal, the local server 2α12 selects an available aerial photography drone belonging to the local system α1 (hereinafter, the selected drone is described as the aerial photography drone α11_1α11) and assigns it an IP address based on the identification number given in advance to the aerial photography drone α11_1α11. The local server 2α12 then sends the facility server 4, via the Internet 3, a signal requesting that the flight control primary signal for controlling the flight of the aerial photography drone α11_1α11 be sent to the IP address assigned to the aerial photography drone α11_1α11.
  • As a result, a communication channel is established between the user device 11_511 and the aerial photography drone α11_1α11; thereafter, communication conforming to the Internet protocol is performed between the user device 11_511 and the aerial photography drone α11_1α11 via the facility server 4, the Internet 3, the local server 2α12, and the transmitting/receiving antenna 2α11.
  • The communication unit of the user device 11_511 receives the compressed 3D video signal and the audio signal streamed from the local server 2α12 via its built-in Internet interface and the facility server 4, performs the necessary decoding processing, and then transmits the results to the HMD and headphones. Specifically, the compressed 3D video signal is subjected to decompression/decoding processing and transmitted as an uncompressed 3D video signal.
  • The HMD projects images from the left and right projection units based on the uncompressed 3D video signal received from the communication unit; as a result, the user can enjoy 3D live video of Okitsunomiya viewed from above Okinoshima. The headphones emit sound from the left and right sound emitting units based on the audio signal received from the communication unit; as a result, the user can enjoy the sound of the waves and wind of Okinoshima.
  • The decoded uncompressed 3D video signal and audio signal can be transmitted from the communication unit by wire as an HDMI (registered trademark) signal or an MHL (registered trademark) signal, or wirelessly as a WirelessHD signal or a WHDI signal.
  • The user can control the flight of the aerial photography drone α11_1α11 by operating the controller while watching the live video displayed on the HMD.
  • The controller generates a flight control primary signal based on the user's button, lever, and dial operations and sends it to the communication unit. The communication unit transmits the flight control primary signal received from the controller to the IP address given to the aerial photography drone α11_1α11, using the established communication channel.
  • The HMD has a built-in acceleration sensor, orientation sensor, and control unit.
  • The control unit identifies the direction of the user's line of sight based on the signals from the acceleration sensor and the orientation sensor, generates a line-of-sight control primary signal based on the result, and sends it to the communication unit. The communication unit transmits the line-of-sight control primary signal received from the control unit to the IP address given to the aerial photography drone α11_1α11, using the established communication channel.
  • The local server 2α12 receives the flight control primary signal and the line-of-sight control primary signal via the transmitting/receiving antenna 2α11, generates from the flight control primary signal a flight control secondary signal for controlling the flight of the aerial photography drone α11_1α11, and transmits the flight control secondary signal and the line-of-sight control primary signal to the aerial photography drone α11_1α11.
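The translation from a coarse primary signal to a per-rotor secondary signal can be sketched as below. The mixing rule shown is a standard quadrotor simplification used purely for illustration, not the patent's actual control law; all names and the four-rotor layout are assumptions.

```python
# Hypothetical sketch: map a primary control signal (stick intent) to a
# secondary signal of per-rotor speed offsets for a four-rotor drone.
def primary_to_secondary(throttle, pitch, roll, yaw):
    """Mix one primary command into four rotor-speed commands."""
    return {
        "front_left":  throttle + pitch + roll - yaw,
        "front_right": throttle + pitch - roll + yaw,
        "rear_left":   throttle - pitch + roll + yaw,
        "rear_right":  throttle - pitch - roll - yaw,
    }

cmd = primary_to_secondary(throttle=50, pitch=5, roll=0, yaw=0)
```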
  • The aerial photography drone α11_1α11 controls the rotational speed of each rotor based on the received flight control secondary signal, while changing the line-of-sight direction of its CMOS camera module based on the line-of-sight control primary signal.
  • The local server 2α12 also has an approach determination function that determines, based on the 3D uncompressed video signals received from the aerial photography drones, whether drones are approaching each other, or whether a drone is approaching the local parent device or some other obstacle. The local server 2α12 generates the flight control secondary signal based on the result of this approach determination function in addition to the flight control primary signal received via the transmitting/receiving antenna 2α11, and transmits it to the aerial photography drone α11_1α11. This allows each child device to avoid contact with other child devices, the parent device, and other obstacles.
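The approach-determination idea can be sketched as a distance check that suppresses motion commands when another object is within a safety margin. The positions, margin value, and function names are hypothetical illustrations, not the patent's method (which determines approach from the video signals).

```python
# Hypothetical sketch of approach determination before relaying a command.
import math

SAFETY_MARGIN_M = 5.0

def too_close(pos_a, pos_b, margin=SAFETY_MARGIN_M):
    """True if two 3-D positions are within the safety margin."""
    return math.dist(pos_a, pos_b) < margin

def filter_command(own_pos, others, requested_velocity):
    """Zero the requested velocity when any other object is too close."""
    if any(too_close(own_pos, p) for p in others):
        return (0.0, 0.0, 0.0)    # hold position instead of approaching
    return requested_velocity

v = filter_command((0, 0, 30), [(3, 0, 30)], (1.0, 0.0, 0.0))
```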
  • In this way, the user can fly the aerial photography drone α11_1α11 over Okinoshima, enjoy 3D live video of Okitsunomiya viewed from above the island, and enjoy live audio such as the sound of the waves and the wind.
  • Moreover, the viewpoint of the imaging means can be changed freely, and its line of sight can be made to follow the user's own line of sight, so the user can enjoy a "first person view" as if landing on Okinoshima, where entry is prohibited.
  • The time (ΔT) for which a user can use a user device in each sub-experience area is fixed; when the time reaches t0 + ΔT, where t0 is the time at which the user device was activated, the user device 11_511 automatically stops.
  • Information that the user device 11_511 has stopped is transmitted to the local server 2α12 via the Internet 3 and so on. Upon receiving this stop information, the local server 2α12 transmits to the aerial photography drone α11_1α11 a flight control secondary signal for returning the drone to a predetermined place.
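The fixed-time auto-stop described above (device stops at t0 + ΔT, then the drone is recalled) can be sketched as follows. The 10-minute ΔT value and the class layout are hypothetical examples, not values given in the text.

```python
# Hypothetical sketch of the per-area usage timer and auto-stop.
DELTA_T_SEC = 600          # assumed fixed usable time per sub-experience area

class UserDeviceSession:
    def __init__(self, start_time):
        self.t0 = start_time       # activation time recorded at start
        self.stopped = False

    def tick(self, now):
        """Stop automatically once t0 + ΔT has elapsed."""
        if not self.stopped and now >= self.t0 + DELTA_T_SEC:
            self.stopped = True
            return "recall_drone"  # stop information sent so the drone returns
        return None

s = UserDeviceSession(start_time=1000)
early = s.tick(1500)
late = s.tick(1000 + DELTA_T_SEC)
```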
  • The correspondence between the sub-experience areas 1 to 3 (and the user devices inside them) and the local systems (sightseeing places) is not limited to the correspondence by area described above; it can also be made according to the type of sightseeing object at a tourist spot, for example:
  • Sub-experience area 3 (user devices 31_531 to 3k_53k) → local systems (modern buildings)
  • In the above description, the experience area in which the user devices are installed is divided into three sub-experience areas 1 to 3, but the number of divisions may be two, or four or more.
  • The interiors and exhibits inside the sub-experience areas 1 to 9 can be matched to the regions of the tourist destinations to which each sub-experience area corresponds. Likewise, the brightness of the lighting and the temperature inside the sub-experience areas 1 to 9 can be matched to the brightness and temperature prevailing at that time in the region corresponding to each sub-experience area.
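Matching the room's lighting and temperature to the remote site could be sketched as below; the function, field names, and the comfort clamp are hypothetical illustrations of the idea, not part of the disclosure.

```python
# Hypothetical sketch: derive room settings from the remote site's conditions.
def sync_environment(remote_conditions):
    """Set lighting and air-conditioning to mirror the remote site."""
    return {
        "lighting_lux": remote_conditions["ambient_lux"],
        # clamp to an assumed comfortable indoor range
        "room_temp_c": max(18.0, min(28.0, remote_conditions["temp_c"])),
    }

settings = sync_environment({"ambient_lux": 320, "temp_c": 31.0})
```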
  • FIG. 6 is a block diagram for explaining the configuration and function of the remote live video entertainment facility according to the second embodiment of the present invention.
  • In the remote live video entertainment facility of this embodiment as well, the experience area in which user devices are installed is divided into three sub-experience areas 1 to 3. User devices 11_511 to 1i_51i are installed in the sub-experience area 1, where i is the total number of user devices installed there and can be set arbitrarily.
  • The remote live video entertainment facility of the present embodiment does not have a server room separate from the sub-experience areas 1 to 3; instead, the sub-experience area server 1_41, the sub-experience area server 2_42, and the sub-experience area server 3_43 are installed inside the sub-experience areas 1 to 3, respectively.
  • Whereas in the remote live video entertainment facility of the first embodiment live video and live audio from a plurality of local systems could be output to the HMD and headphones of the user devices installed inside each sub-experience area, in the present embodiment the HMD and headphones of the user devices installed inside a sub-experience area are configured so that only live video and live audio from a specific local system can be displayed and output.
  • In the following description, the experience area of the remote live video entertainment facility is divided into three sub-experience areas 1 to 3 as in the first embodiment, and these areas are described as being associated with local systems as follows.
  • Sub-experience area 1 ... Multiple local systems installed at tourist spots on the ground around the world
  • Sub-experience area 2 ... One local system installed in the deep sea (hereinafter "deep sea local system")
  • Sub-experience area 3 ... One local system installed in outer space (hereinafter "space local system")
  • FIG. 7 is an image diagram for explaining the configuration and function of the deep sea local system, and of the relay devices for transmitting its live video to the user devices inside the remote live video entertainment facility, according to the second embodiment of the present invention. The deep sea local system consists of underwater drones 1_1B1 to 6_1B6 and a deep sea parent device 2B.
  • The deep sea parent device 2B is connected to a relay ship 6B1 by a communication cable; the relay ship 6B1 and the land relay device 6B2 exchange signals by wireless communication; and the land relay device 6B2 is connected to the Internet 3. Further, each of the underwater drones 1_1B1 to 6_1B6 is given an identification number defined inside the deep sea local system.
  • The underwater drones 1_1B1 to 6_1B6 are carried over long distances to the deep sea site mounted on the deep sea parent device 2B, which travels over and under the sea. After arrival, they are released from the deep sea parent device 2B and move vertically and horizontally as well as rotate. However, since the underwater drones 1_1B1 to 6_1B6 are connected to the deep sea parent device 2B by communication cables, they can move only within the range the cables can extend.
  • The configuration and functions of the underwater drones 1_1B1 to 6_1B6 are basically the same as those of the aerial photography drones α11_1α11 to α1r_1α1r described in Example 1, except that movement in the sea is performed by rotation of the screw propeller, the rudder angle, and the injection and drainage of seawater.
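The point that the same generic motion intent maps onto different actuators per platform (rotors in the air; propeller, rudder, and seawater ballast in the sea) can be sketched as below. The dataclass, field names, scaling factors, and limits are all hypothetical.

```python
# Hypothetical sketch: translate motion intent into underwater actuation.
from dataclasses import dataclass

@dataclass
class UnderwaterCommand:
    propeller_rpm: float     # forward thrust via screw propeller
    rudder_deg: float        # heading change via rudder angle
    ballast_fill: float      # 0.0 (empty) .. 1.0 (full) controls depth

def secondary_for_sea(forward, turn, descend):
    """Build an underwater secondary command from generic motion intent."""
    return UnderwaterCommand(
        propeller_rpm=forward * 100.0,
        rudder_deg=max(-30.0, min(30.0, turn)),      # assumed rudder limit
        ballast_fill=max(0.0, min(1.0, descend)),    # seawater in/out
    )

cmd = secondary_for_sea(forward=2.0, turn=45.0, descend=0.3)
```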
  • The deep sea parent device 2B is a submersible capable of navigating by itself over and under the sea, and is internally equipped with a local server (not shown) having the same function as the local server 2α12 in the local parent device α1_2α1. It also includes an interface unit 1, to which the communication cables connected to the underwater drones 1_1B1 to 6_1B6 are attached, and an interface unit 2, to which the communication cable leading to the relay ship 6B1 is attached.
  • The relay ship 6B1 is a communication ship anchored above the deep sea parent device 2B; it exchanges video signals, audio signals, and flight control signals with the deep sea parent device 2B via the cable, and exchanges the same signals with the land relay device 6B2 by wireless communication.
  • the land relay device 6B2 is a mobile base station installed on land.
  • FIG. 8 is an image diagram for explaining the configuration and function of the space local system, and of the relay devices for transmitting its live video to the user devices inside the remote live video entertainment facility, according to the second embodiment of the present invention. The space local system is composed of cosmic devices 1_1C1 to 6_1C6 and a space parent device 2C.
  • The space parent device 2C exchanges signals with the space relay device 6C1 by wireless communication; the space relay device 6C1 and the ground relay device 6C2 exchange signals by wireless communication; and the ground relay device 6C2 is connected to the Internet 3. Further, each of the cosmic devices 1_1C1 to 6_1C6 is given an identification number defined inside the space local system. The cosmic devices 1_1C1 to 6_1C6 are launched from the ground mounted on the space parent device 2C, which flies outside the atmosphere, and travel long distances outside the atmosphere. After arriving at the site, they are released from the space parent device 2C and move vertically and horizontally as well as rotate. However, since the cosmic devices 1_1C1 to 6_1C6 are connected to the space parent device 2C by cables, they can move only within the range the cables extend.
  • The configuration and functions of the cosmic devices 1_1C1 to 6_1C6 are basically the same as those of the aerial photography drones α11_1α11 to α1r_1α1r described in Example 1, except that movement in outer space is performed by jet ejection.
  • The space parent device 2C is a spacecraft that can fly outside the atmosphere, and is internally equipped with a local server (not shown) having the same function as the local server 2α12 in the local parent device α1_2α1. It also includes an interface unit, to which the communication cables connected to the cosmic devices 1_1C1 to 6_1C6 are attached, and a transmitting/receiving antenna for exchanging signals with the space relay device 6C1 by wireless communication. The space parent device 2C can be configured to separate from the rocket used at launch.
  • The space relay device 6C1 is a communication satellite outside the atmosphere that exchanges video signals, audio signals, and flight control signals with the space parent device 2C and the ground relay device 6C2 by wireless communication.
  • the ground relay device 6C2 is a mobile base station installed on the ground.
  • In the remote live video entertainment facility of the present embodiment, the same entrance management as in the first embodiment is performed: a user who has received an entrance card at the entrance of the entire facility passes through the entrance of the desired sub-experience area to enter it.
  • The functions and configurations of the user devices 11_511 to 1i_51i inside the sub-experience area 1 of the present embodiment are similar to those of the user devices 11_511 to 1i_51i inside the sub-experience area 1 of the first embodiment, and a user who enters the sub-experience area 1 can experience simulated tourism in the same way. However, while a user who entered the sub-experience area 1 in the first embodiment could enjoy only live video and live audio of tourist spots in Japan, a user who enters the sub-experience area 1 of the present embodiment can enjoy live video and live audio of tourist spots around the world.
  • A pool is installed inside the sub-experience area 2. A user who enters the sub-experience area 2 puts on the necessary diving equipment, wears on the head one of the user devices 21_521 to 2j_52j, each of which is a waterproof integrated HMD, and swims in the water of the pool.
  • In the following, the user is assumed to wear the HMD that is the user device 21_521; the functions and configurations described below are common to the other user devices 22_522 to 2j_52j.
  • The HMD that is the user device 21_521 is provided with a reading/writing unit that reads and writes information to and from an entrance card in a contactless manner. When the user holds the entrance card over the reading/writing unit, the user device 21_521 is activated. At the same time, information on the time at which the user device was activated is recorded on the entrance card, and an activation signal is sent from the communication unit to the sub-experience area server 42.
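The card-tap activation step (activate the device, record the activation time on the card, notify the area server) can be sketched as follows. All structures and names here are hypothetical illustrations of the flow just described.

```python
# Hypothetical sketch of activation on an entrance-card tap.
def activate_device(card, device_id, now, server_log):
    """Activate on card tap: record the time on the card, signal the server."""
    card.setdefault("activations", []).append(
        {"device": device_id, "time": now}    # activation time onto the card
    )
    server_log.append(("activation", device_id))   # activation signal to server
    return True

card = {}
log = []
on = activate_device(card, "21_521", now=900, server_log=log)
```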
  • Upon receiving the activation signal, the sub-experience area server 42 gives the user device 21_521 an IP address based on the identification number assigned to it in advance. The sub-experience area server 42 then transmits, via the Internet 3, the land relay device 6B2, and the relay ship 6B1, to the local server in the deep sea parent device 2B constituting the deep sea local system, a signal requesting that deep-sea live video and live audio be sent to the user device 21_521 to which the IP address has been given.
  • Upon receiving the request signal, the local server in the deep sea parent device 2B selects an available underwater drone belonging to the deep sea local system (hereinafter, the selected underwater drone is referred to as the underwater drone 1B1) and gives it an IP address based on the identification number assigned to it in advance. The local server then sends the sub-experience area server 42, via the Internet 3, a signal requesting that swimming control primary signals for controlling the swimming of the underwater drone 1B1 be sent to the IP address given to the drone.
  • As a result, a communication channel is established between the user device 21_521 and the underwater drone 1B1; thereafter, communication conforming to the Internet protocol is performed between them via the sub-experience area server 42, the Internet 3, the land relay device 6B2, the relay ship 6B1, the interface unit 2, the local server, and the interface unit 1.
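The fixed relay chain just described can be made explicit as an ordered list of hops. This is purely an illustrative sketch; the identifier strings merely echo the reference signs used in the text.

```python
# Hypothetical sketch: the ordered hop chain between user device and drone.
DEEP_SEA_PATH = [
    "user_device_21_521",
    "sub_experience_area_server_42",
    "internet_3",
    "land_relay_6B2",
    "relay_ship_6B1",
    "interface_unit_2",
    "local_server_in_2B",
    "interface_unit_1",
    "underwater_drone_1B1",
]

def route(payload, path=DEEP_SEA_PATH):
    """Record the hops a payload passes through, in order."""
    return [(hop, payload) for hop in path]

hops = route("swimming_control_primary")
```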
  • The communication unit of the user device 21_521 receives the streamed 3D compressed video signal and audio signal from the Internet via its built-in Internet interface, performs the necessary decoding processing, and then transmits them to the HMD and headphones. In particular, the 3D compressed video signal is decompressed and decoded, and is transmitted as a 3D uncompressed video signal. The HMD projects video from its left and right projection units based on the 3D uncompressed video signal received from the communication unit, so the user can enjoy 3D video including the observation target at the site. Likewise, the headphones emit sound from their left and right sound emitting units based on the audio signal received from the communication unit, so the user can enjoy the audio at the site.
  • The decoded 3D uncompressed video signal and audio signal can be transmitted from the communication unit by wire as HDMI (registered trademark) or MHL (registered trademark) signals, or wirelessly as WirelessHD or WHDI signals.
  • The control unit built into the HMD identifies the position and movement of the user's head and the direction of the line of sight based on the signals from the acceleration sensor and the orientation sensor, generates a swimming control primary signal and a line-of-sight control primary signal based on the result, and sends those signals to the communication unit. The communication unit transmits the swimming control primary signal and the line-of-sight control primary signal received from the control unit to the IP address given to the underwater drone 1B1, using the established communication channel.
  • The local server in the deep sea parent device 2B receives the swimming control primary signal and the line-of-sight control primary signal via the interface unit 2, generates from the swimming control primary signal a swimming control secondary signal for controlling the swimming of the underwater drone 1B1, and transmits the swimming control secondary signal and the line-of-sight control primary signal to the underwater drone 1B1. The underwater drone 1B1 controls the rotation of the screw propeller, the rudder angle, and the injection and drainage of seawater based on the received swimming control secondary signal, while changing the line-of-sight direction of its CMOS camera module based on the line-of-sight control primary signal.
  • In this way, the user can make the underwater drone 1B1 swim in the deep sea while enjoying live video and live audio of the deep sea. Moreover, since images linked to the position of the user's head and the movement of the face (line of sight) identified by the control unit are projected by the left and right projection units of the HMD, the user can enjoy the images as if actually swimming in the deep sea.
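The derivation of the two primary signals from head pose can be sketched as below. The lean-forward threshold and the signal shapes are hypothetical; the disclosure only states that the sensors' outputs yield a swimming signal and a line-of-sight signal.

```python
# Hypothetical sketch: split head pose into gaze and swim commands.
def derive_primary_signals(head_pitch_deg, head_yaw_deg, forward_accel):
    """Derive a line-of-sight command and a swimming command from head pose."""
    gaze = {"pitch": head_pitch_deg, "yaw": head_yaw_deg}
    # Assumed rule: leaning forward past a small threshold means "swim ahead".
    swim = {"forward": 1.0 if forward_accel > 0.5 else 0.0}
    return gaze, swim

gaze, swim = derive_primary_signals(-10.0, 35.0, forward_accel=0.8)
```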
  • A pool is also installed inside the sub-experience area 3. A user who enters the sub-experience area 3 puts on the necessary diving equipment, wears on the head one of the user devices 31_531 to 3k_53k, each of which is a waterproof integrated HMD, and swims in the water of the pool.
  • In the following, the user is assumed to wear the HMD that is the user device 31_531; the functions and configurations described below are common to the other user devices 32_532 to 3k_53k.
  • The HMD that is the user device 31_531 is provided with a reading/writing unit that reads and writes information to and from an entrance card in a contactless manner. When the user holds the entrance card over the reading/writing unit, the user device 31_531 is activated. At the same time, information on the time at which the user device was activated is recorded on the entrance card, and an activation signal is sent from the communication unit to the sub-experience area server 43.
  • Upon receiving the activation signal, the sub-experience area server 43 gives the user device 31_531 an IP address based on the identification number assigned to it in advance. The sub-experience area server 43 then transmits, via the Internet 3, the ground relay device 6C2, and the space relay device 6C1, to the local server in the space parent device 2C constituting the space local system, a signal requesting that live video of the earth as viewed from outer space be sent to the user device 31_531 to which the IP address has been given.
  • Upon receiving the request signal, the local server in the space parent device 2C selects an available cosmic device belonging to the space local system (hereinafter, the selected cosmic device is referred to as the cosmic device 1C1) and gives it an IP address based on the identification number assigned to it in advance. The local server then sends the sub-experience area server 43, via the Internet 3, a signal requesting that swimming control primary signals for controlling the movement of the cosmic device 1C1 be sent to the IP address given to the device.
  • As a result, a communication channel is established between the user device 31_531 and the cosmic device 1C1; thereafter, communication conforming to the Internet protocol is performed between them via the sub-experience area server 43, the Internet 3, the ground relay device 6C2, the space relay device 6C1, the transmitting/receiving antenna, the local server, and the interface unit.
  • The communication unit of the user device 31_531 receives the streamed 3D compressed video signal and audio signal from the Internet via its built-in Internet interface, performs the necessary decoding processing, and then transmits them to the HMD and headphones. In particular, the 3D compressed video signal is decompressed and decoded, and is transmitted as a 3D uncompressed video signal.
  • The HMD projects video from its left and right projection units based on the 3D uncompressed video signal received from the communication unit, so the user can enjoy 3D video including the observation target at the site.
  • The decoded 3D uncompressed video signal can be transmitted from the communication unit by wire as an HDMI (registered trademark) or MHL (registered trademark) signal, or wirelessly as a WirelessHD or WHDI signal.
  • The control unit built into the HMD identifies the position and movement of the user's head and the direction of the line of sight based on the signals from the acceleration sensor and the orientation sensor, generates a swimming control primary signal and a line-of-sight control primary signal based on the result, and sends those signals to the communication unit. The communication unit transmits the swimming control primary signal and the line-of-sight control primary signal received from the control unit to the IP address given to the cosmic device 1C1, using the established communication channel.
  • The local server in the space parent device 2C receives the swimming control primary signal and the line-of-sight control primary signal via the antenna unit, generates from the swimming control primary signal a swimming control secondary signal for controlling the movement of the cosmic device 1C1, and transmits the swimming control secondary signal and the line-of-sight control primary signal to the cosmic device 1C1.
  • The cosmic device 1C1 controls its jet ejection based on the received swimming control secondary signal, while changing the line-of-sight direction of its CMOS camera module based on the line-of-sight control primary signal.
  • In this way, the user can make the cosmic device 1C1 move through outer space and enjoy live video of the earth as seen from outer space. Moreover, since the left and right projection units of the HMD project images linked to the position of the user's head and the movement of the direction of the face (line of sight) identified by the control unit, the user can enjoy the images as if actually swimming through outer space.
  • In the above embodiments, the HMD of the user device has been described as displaying live video of an object in a remote place, but it can also display recorded video based on earlier shooting of an object in a remote place, or VR (Virtual Reality) video. Recorded video, in particular, can be used as a substitute to alleviate the user's dissatisfaction to some extent.
  • The present invention can be used in the industry of constructing remote live video entertainment facilities and in the industry of providing services related to simulated tourism experiences using such facilities. It can further be used in the industry of manufacturing the user devices used in facilities that provide the simulated tourism experience, particularly in the industry of manufacturing the live video display means of those user devices.
Cosmic device 1
1C2 Cosmic device 2
1C3 Cosmic device 3
1C4 Cosmic device 4
1C5 Cosmic device 5
1C6 Cosmic device 6
2α1 Local parent device α1
2α11 Transmitting/receiving antenna
2α12 Local server
2B Deep sea parent device
2C Space parent device
3 Internet
4 Facility server
41 Sub-experience area server 1
42 Sub-experience area server 2
43 Sub-experience area server 3
511 User device 11
512 User device 12
521 User device 21
522 User device 22
531 User device 31
532 User device 32
6B1 Relay ship
6B2 Land relay device
6C1 Space relay device
6C2 Ground relay device
71 Imaging means
711 ...
72 Gaze direction
73 Imaging range
731 Viewpoint (center of imaging range)


Abstract

[Problem] The purpose of the present invention is to provide: a remote live video entertainment facility in which, even when the number of objects that can be displayed on the user devices installed therein grows substantially, the user is unlikely to be unsure which live video to enjoy or to need cumbersome operations to display the live video he or she wants to enjoy; and a remote live video entertainment facility that enables appropriate billing according to each user's usage. [Solution] Provided is a remote live video entertainment facility in which a plurality of "user devices each equipped with 'live video display means for displaying live video of an object located in a remote place'" are installed, wherein the objects whose video can be displayed on the live video display means of a user device change according to the result selected or declared by the user using that user device.

Description

Remote live video entertainment facility and billing method for users of the remote live video entertainment facility
 The present invention relates to a remote live video entertainment facility in which a user can enjoy live video of objects existing in a plurality of remote places while staying in one place. The present invention also relates to a billing method for users of such a remote live video entertainment facility.
 The tourism industry now accounts for about 10% of world GDP, and it is predicted to become the "largest industry of the 21st century."
 Against this trend, people are no longer satisfied with sightseeing at conventional tourist destinations, and needs are growing for tourism in unexplored regions and in frontier spaces such as outer space and the deep sea.
 However, if large numbers of tourists rush to unexplored regions, rare natural environments and precious ruins are destroyed. Sightseeing in frontier spaces such as outer space and the deep sea not only costs a great deal in order to reach the site; the surrounding environment, such as weightlessness or high pressure, places an excessive load on the body, and there is a risk to life if the means of travel (a spacecraft, deep-sea research vessel, etc.) is damaged.
 As a method of solving the above problems, a remote live video entertainment facility has been proposed in which a plurality of "user devices equipped with 'live video display means for displaying live video of an object existing in a remote place'" are installed, allowing users to have a simulated experience of sightseeing at a plurality of tourist destinations while staying in one place.
 However, in conventional remote live video entertainment facilities, as the number of objects whose video can be displayed on the live video display means of the installed user devices (hereinafter abbreviated as "displayable objects of the user device") increases, the user gains more choices of objects whose live video can be enjoyed; on the other hand, the user may be unsure which object's live video to enjoy, or may need complicated operations to display the desired live video, so that convenience actually decreases.
 In general, in an entertainment facility where users enjoy video, an integrated perceptual experience can be given to the user by synchronizing the facility's interior, exhibits, lighting, temperature, and so on with the video the user is enjoying. In a conventional remote live video entertainment facility, however, when the user devices have multiple displayable objects, it cannot be determined to which object's remote location the interior, exhibits, lighting, temperature, and so on should be adapted, so such an integrated perceptual experience cannot be provided.
 Furthermore, in a conventional remote live video entertainment facility, when the facility is offered for use by users in exchange for payment, usage patterns diversify as the number of displayable objects of the user devices increases, but an appropriate billing method matching those patterns has not been available.
Japanese Patent No. 6089256
 The present invention has been made in view of these circumstances, and an object thereof is to provide a remote live video entertainment facility in which, even when the user devices installed inside have many displayable objects, the user is unlikely to be unsure which object's live video to enjoy or to need complicated operations to display the desired live video.
 Another object of the present invention is to provide a remote live video entertainment facility that can give the user the illusion of actually staying in the remote place where the object whose live video the user is enjoying exists, even when the installed user devices have many displayable objects.
 Another object of the present invention is to provide a remote live video entertainment facility in which, when the installed user devices have many displayable objects, an appropriate billing method according to each user's usage can be adopted, and to provide a billing method for users of such a facility.
 上記目的を達成するために、遠隔ライブ映像娯楽施設に係る本発明の第1の様態は、
内部に「『遠隔地に存在する対象を撮影したライブ映像を表示するライブ映像表示手段』を備えたユーザ装置」を複数設置する遠隔ライブ映像娯楽施設であって、
前記ユーザ装置のライブ映像表示手段に映像が表示され得る対象(以下「ユーザ装置の表示可能対象」と略記する)が、該ユーザ装置を利用するユーザが選択又は申告した結果に応じて変わる、
ことを特徴とする遠隔ライブ映像娯楽施設。
ように構成した。
In order to achieve the above objects, a first aspect of the present invention relating to a remote live video entertainment facility is configured as follows:
a remote live video entertainment facility in which a plurality of user devices, each equipped with live video display means for displaying live video of targets existing in remote places, are installed,
wherein the targets whose video can be displayed on the live video display means of a user device (hereinafter abbreviated as the "displayable targets of the user device") change according to the result of a selection or declaration made by the user who uses that user device.
 なお、本発明の第1の様態においては、ユーザが、該遠隔ライブ映像娯楽施設に入場する際に、ユーザ装置の表示可能対象の種類や範囲を選択したり、デモグラフィックス(年齢、性別、居住地、身長・体重、職業など)を申告したりするようにした上で、それら選択・申告した結果を、該ユーザが保持する入場カード(非接触型ICカードなど)に記録して、ユーザ装置に読み取らせる構成とすることができる。 In the first aspect of the present invention, the configuration may be such that, when entering the remote live video entertainment facility, the user selects the types or range of the displayable targets of the user device and/or declares demographics (age, sex, place of residence, height and weight, occupation, and so on), the results of these selections and declarations are recorded on an admission card held by the user (such as a contactless IC card), and the user device reads that card.
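As a non-limiting illustration of this configuration, the selection and declaration results could be serialized onto the admission card and read back by the user device roughly as sketched below. The record layout, field names, and JSON-on-card encoding are assumptions made for illustration only, not part of the specification.

```python
# Hypothetical sketch: encode a user's selected displayable targets and
# declared demographics for storage on a contactless admission card, then
# let a user device decode them. All names are illustrative assumptions.
import json


def write_card(selected_targets, demographics):
    """Serialize the user's choices into the payload stored on the IC card."""
    record = {
        "targets": sorted(selected_targets),   # e.g. ["land_animals", "ruins"]
        "demographics": demographics,          # e.g. {"age": 14}
    }
    return json.dumps(record).encode("utf-8")


def read_card(payload):
    """A user device decodes the card payload to configure its displayable targets."""
    record = json.loads(payload.decode("utf-8"))
    return record["targets"], record["demographics"]


card = write_card({"ruins", "land_animals"}, {"age": 14})
targets, demo = read_card(card)
```

A real facility would store such a record via the card's own data interface rather than as raw JSON; the sketch only shows the round trip of selection and declaration data between entrance and user device.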
 また、ユーザ装置のライブ映像表示手段として、没入型3D-HMD(Head Mounted Display。以下、「没入型3D-HMD」のことを単に「HMD」という)を使用することができる。 An immersive 3D head-mounted display (hereinafter, an "immersive 3D-HMD" is referred to simply as an "HMD") can be used as the live video display means of the user device.
 また、ユーザ装置は、前記HMDに加えて、該HMDを装着したユーザが着座するリクライニングシートなどの着座手段と、該ユーザがマニュアル操作するコントローラとを備えるような構成とすることができる。 In addition to the HMD, the user device may be configured to include seating means, such as a reclining seat on which the user wearing the HMD sits, and a controller manually operated by that user.
 次に、上記目的を達成するために、遠隔ライブ映像娯楽施設に係る本発明の第2の様態は、
第1の様態の遠隔ライブ映像娯楽施設において、
前記遠隔ライブ映像娯楽施設においてユーザ装置が設置される体験エリアが複数のサブ体験エリアに分割されており、
前記ユーザ装置の表示可能対象が、該ユーザ装置がどのサブ体験エリアに設置されているかに応じて異なるように設定する、
ように構成した。
Next, in order to achieve the above objects, a second aspect of the present invention relating to a remote live video entertainment facility is configured as follows:
the remote live video entertainment facility of the first aspect,
wherein the experience area of the facility in which the user devices are installed is divided into a plurality of sub-experience areas, and
the displayable targets of a user device are set so as to differ depending on the sub-experience area in which that user device is installed.
 なお、体験エリアの複数のサブ体験エリアへの分割は、固定壁によって行うこともできるし、可動式のパーティションによって行うこともできる。 Note that the experience area can be divided into multiple sub-experience areas using fixed walls or movable partitions.
 また、上記目的を達成するために、遠隔ライブ映像娯楽施設に係る本発明の第3の様態は、
第2の様態の遠隔ライブ映像娯楽施設において、
前記サブ体験エリアにおける内装、展示、明るさ、温度、湿度のうちの少なくとも一つを、該サブ体験エリア内に設置されたユーザ装置の表示可能対象のうちの少なくとも一つが存在する遠隔地の現況に適応させる、
ように構成した。
Further, in order to achieve the above objects, a third aspect of the present invention relating to a remote live video entertainment facility is configured as follows:
the remote live video entertainment facility of the second aspect,
wherein at least one of the interior decoration, exhibits, brightness, temperature, and humidity of a sub-experience area is adapted to the current conditions of a remote place where at least one of the displayable targets of the user devices installed in that sub-experience area exists.
 なお、本発明の第3の様態においては、前記遠隔地から明るさ、温度、湿度などの現況情報を入手し、該現況情報に基づいて、照明装置や空調装置を操作する構成とすることができる。また、前記遠隔地の代表的な建築物を壁面に描いたり、代表的な植物を展示したりすることができる。 In the third aspect of the present invention, the configuration may be such that current-condition information such as brightness, temperature, and humidity is obtained from the remote place, and lighting equipment and air conditioning equipment are operated on the basis of that information. In addition, representative buildings of the remote place can be painted on the walls, and representative plants can be exhibited.
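A minimal sketch of this adaptation loop, assuming the remote site reports brightness, temperature, and humidity and the sub-experience area exposes simple actuator setpoints, might look as follows. The field names, thresholds, and actuator interface are all illustrative assumptions.

```python
# Illustrative sketch: compare remote-site conditions with the current
# room conditions of a sub-experience area and emit actuator commands for
# the lighting and air conditioning equipment. Thresholds are assumptions.
def plan_adjustments(remote, room):
    """Return setpoint commands that move the room toward the remote conditions."""
    commands = {}
    if abs(remote["temperature_c"] - room["temperature_c"]) > 0.5:
        commands["hvac_setpoint_c"] = remote["temperature_c"]
    if abs(remote["humidity_pct"] - room["humidity_pct"]) > 2:
        commands["humidifier_target_pct"] = remote["humidity_pct"]
    if remote["brightness_lux"] != room["brightness_lux"]:
        commands["lighting_lux"] = remote["brightness_lux"]
    return commands


remote_now = {"temperature_c": 28.0, "humidity_pct": 70, "brightness_lux": 12000}
room_now = {"temperature_c": 22.0, "humidity_pct": 45, "brightness_lux": 300}
cmds = plan_adjustments(remote_now, room_now)
```

In practice the facility server would poll the local system at the remote site for this current-condition information and push the resulting setpoints to each sub-experience area's equipment.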
 また、上記目的を達成するために、遠隔ライブ映像娯楽施設に係る本発明の第4の様態は、
第1乃至第3のいずれか一つの様態の遠隔ライブ映像娯楽施設において、
前記ユーザ装置の表示可能対象が、時期や時間によって変化する、
ように構成した。
Further, in order to achieve the above objects, a fourth aspect of the present invention relating to a remote live video entertainment facility is configured as follows:
the remote live video entertainment facility of any one of the first to third aspects,
wherein the displayable targets of the user devices change according to the season or time of day.
 さらに、上記目的を達成するために、遠隔ライブ映像娯楽施設を利用するユーザに対する課金方法に係る本発明の第5の様態は、
第1乃至第4のいずれか一つの様態の遠隔ライブ映像娯楽施設を利用するユーザに対する課金方法であって
該ユーザが利用するユーザ装置の表示可能対象の種類及び/又は数に応じて課金する、
ように構成した。
Further, in order to achieve the above objects, a fifth aspect of the present invention, relating to a method for billing a user of a remote live video entertainment facility, is configured as follows:
a method for billing a user of the remote live video entertainment facility of any one of the first to fourth aspects,
wherein the user is charged according to the type and/or number of displayable targets of the user device that the user uses.
 また、上記目的を達成するために、遠隔ライブ映像娯楽施設を利用するユーザに対する課金方法に係る本発明の第6の様態は、
第2乃至第4の様態の遠隔ライブ映像娯楽施設を利用するユーザに対する課金方法であって、
該ユーザが入場することができるサブ体験エリアの種類及び/又は数に応じて課金する、
ように構成した。
Further, in order to achieve the above objects, a sixth aspect of the present invention, relating to a method for billing a user of a remote live video entertainment facility, is configured as follows:
a method for billing a user of the remote live video entertainment facility of any one of the second to fourth aspects,
wherein the user is charged according to the type and/or number of sub-experience areas that the user is allowed to enter.
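As a non-limiting sketch, the charging methods of the fifth and sixth aspects could be combined into a simple fee computation at admission time: a per-target-type fee plus a per-sub-experience-area fee. All fee amounts and identifiers below are hypothetical placeholders, not values from the specification.

```python
# Hypothetical fee schedule: charge according to the type and number of
# selected displayable targets (fifth aspect) and the number of
# sub-experience areas the user may enter (sixth aspect).
TARGET_FEES = {"land_animals": 500, "underwater_animals": 500,
               "ruins": 700, "live_events": 1200}   # yen, illustrative only
AREA_FEE = 800                                       # per sub-experience area


def total_charge(selected_targets, accessible_areas):
    """Sum the per-target fees and the per-area fees for one admission."""
    target_part = sum(TARGET_FEES[t] for t in selected_targets)
    area_part = AREA_FEE * len(accessible_areas)
    return target_part + area_part


fee = total_charge(["ruins", "live_events"], ["sub_area_1", "sub_area_5"])
```

An operator could equally charge on only one of the two axes; the point is that the selection results recorded at admission give the facility the data needed to bill per usage pattern.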
 本発明の第1乃至第4の様態の遠隔ライブ映像娯楽施設によれば、ユーザは、遠隔ライブ映像娯楽施設に入場する際、自身が利用するユーザ装置の表示可能対象として、例えば、下記の表示可能対象1~7から選択する(複数を選択することも可能)ことで、自身が利用するユーザ装置のライブ映像表示手段に、自身が選択した表示可能対象のライブ映像だけが表示されるようにすることができる。
 表示可能対象1‥‥‥陸上の動物
 表示可能対象2‥‥‥水中の動物
 表示可能対象3‥‥‥植物
 表示可能対象4‥‥‥自然景観
 表示可能対象5‥‥‥遺跡
 表示可能対象6‥‥‥現代の建築物
 表示可能対象7‥‥‥ライブイベント
According to the remote live video entertainment facility of the first to fourth aspects of the present invention, when entering the facility, the user selects the displayable targets of the user device he or she will use, for example from displayable targets 1 to 7 below (multiple selections are also possible), so that only live video of the selected displayable targets is shown on the live video display means of that user device.
 Displayable target 1 …… land animals
 Displayable target 2 …… underwater animals
 Displayable target 3 …… plants
 Displayable target 4 …… natural scenery
 Displayable target 5 …… ruins
 Displayable target 6 …… modern buildings
 Displayable target 7 …… live events
 そうすると、ユーザは、ユーザ装置を使用する際に、広範囲の対象の中から「どのような種類の対象のライブ映像を楽しむべきか」を迷うことは少なくなる。また、自身が楽しみたいライブ映像を表示させるための操作の煩雑さも軽減される。 In this way, when using the user device, the user seldom hesitates over what kind of target's live video to enjoy among a wide range of targets, and the operations for displaying the live video the user wants to enjoy become less cumbersome.
 また、本発明の第1乃至第4の様態の遠隔ライブ映像娯楽施設によれば、ユーザは、遠隔ライブ映像娯楽施設に入場する際、自身のデモグラフィックス(年齢、性別、宗教、国籍、居住地、身長・体重、職業など)を申告し、その内容に基づいて、該ユーザが利用するユーザ装置の表示可能映像を限定するようにすることができる。 Further, according to the remote live video entertainment facility of the first to fourth aspects of the present invention, when entering the facility the user can declare his or her demographics (age, sex, religion, nationality, place of residence, height and weight, occupation, and so on), and the video displayable on the user device that the user uses can be limited on the basis of those declarations.
 そうすると、例えば、段落[0023]で例示した構成において、15歳未満の利用者が表示可能対象7を選択した場合に、該15歳未満の利用者が利用するユーザ装置のライブ映像表示手段には、いわゆる「R15」のライブイベントのライブ映像は表示させないようにすることができる。また、特定の宗教の信者が利用するユーザ装置のライブ映像表示手段に、該宗教の禁忌に抵触するようなライブ映像を表示させないようにすることができる。 In that case, for example, in the configuration illustrated in paragraph [0023], when a user under 15 years of age selects displayable target 7, live video of so-called "R15" live events can be kept from being displayed on the live video display means of the user device that the under-15 user uses. Likewise, live video that conflicts with the taboos of a particular religion can be kept from being displayed on the live video display means of a user device used by an adherent of that religion.
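One way such demographic limiting could work is a per-stream attribute check on the user device or facility server, as sketched below. The stream attribute names (`min_age`, `religious_taboo_for`) are assumptions introduced for this illustration.

```python
# Hypothetical content filter corresponding to paragraph [0023]: drop live
# streams whose attributes conflict with the user's declared demographics,
# e.g. an "R15" live event for a viewer under 15 years of age.
def permitted_streams(streams, demographics):
    """Return the names of streams this user may view."""
    allowed = []
    for s in streams:
        if s.get("min_age", 0) > demographics.get("age", 0):
            continue  # age-restricted, such as an "R15" live event
        if demographics.get("religion") in s.get("religious_taboo_for", []):
            continue  # conflicts with the viewer's declared religion
        allowed.append(s["name"])
    return allowed


streams = [
    {"name": "idol_concert", "min_age": 15},
    {"name": "aurora", "min_age": 0},
]
view = permitted_streams(streams, {"age": 14})
```

The same check can be run when the admission card is read, so that restricted targets never even appear in the user's selection menu.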
 また、本発明の第1乃至第4の様態の遠隔ライブ映像娯楽施設によれば、ユーザが利用するユーザ装置の表示可能対象の種類及び/又は数に応じて課金する課金方法を採用すれば、ユーザの利用状況に応じて適切に課金することができる。 Further, according to the remote live video entertainment facility of the first to fourth aspects of the present invention, if a billing method is adopted that charges according to the type and/or number of displayable targets of the user device the user uses, users can be charged appropriately according to their usage patterns.
 また、特に、本発明の第2乃至第4の様態の遠隔ライブ映像娯楽施設によれば、例えば、体験エリアを6つのサブ体験エリア(サブ体験エリア1~6)に分割した上で、各サブ体験エリア内に配置されているユーザ装置の表示可能対象を、それらが存在する地域(観光地)に応じて、下記のように割り振ることができる。
 サブ体験エリア1‥‥‥日本国内の観光地
 サブ体験エリア2‥‥‥日本以外のアジアの観光地
 サブ体験エリア3‥‥‥北中南米域内の観光地
 サブ体験エリア4‥‥‥欧州域内の観光地
 サブ体験エリア5‥‥‥大洋州域内の観光地
 サブ体験エリア6‥‥‥アフリカ域内の観光地
Further, in particular, according to the remote live video entertainment facility of the second to fourth aspects of the present invention, the experience area can be divided, for example, into six sub-experience areas (sub-experience areas 1 to 6), and the displayable targets of the user devices arranged in each sub-experience area can be assigned as follows according to the region (tourist destination) in which those targets exist.
 Sub-experience area 1 …… tourist destinations in Japan
 Sub-experience area 2 …… tourist destinations in Asia outside Japan
 Sub-experience area 3 …… tourist destinations in North, Central and South America
 Sub-experience area 4 …… tourist destinations in Europe
 Sub-experience area 5 …… tourist destinations in Oceania
 Sub-experience area 6 …… tourist destinations in Africa
 そうすると、日本国内の観光地のライブ映像を楽しみたいユーザはサブ体験エリア1に入場し、日本以外のアジアの観光地のライブ映像を楽しみたいユーザはサブ体験エリア2に入場し、北中南米域内の観光地のライブ映像を楽しみたいユーザはサブ体験エリア3に入場し、欧州域内の観光地のライブ映像を楽しみたいユーザはサブ体験エリア4に入場し、大洋州域内の観光地のライブ映像を楽しみたいユーザはサブ体験エリア5に入場し、アフリカ域内の観光地のライブ映像を楽しみたいユーザはサブ体験エリア6に入場することで、ユーザは、ユーザ装置を使用する際に、広範囲の観光地の中から「どの地域の観光地のライブ映像を楽しむべきか」を迷うことは少なくなる。また、自身が楽しみたいライブ映像を表示させるための操作の煩雑さも軽減される。 Then, users who want to enjoy live video of tourist destinations in Japan enter sub-experience area 1, users who want tourist destinations in Asia outside Japan enter sub-experience area 2, users who want North, Central and South America enter sub-experience area 3, users who want Europe enter sub-experience area 4, users who want Oceania enter sub-experience area 5, and users who want Africa enter sub-experience area 6. As a result, when using the user device, users seldom hesitate over which region's tourist destinations to enjoy live video of among a wide range of tourist destinations, and the operations for displaying the live video they want to enjoy become less cumbersome.
 また、本発明の第2乃至第4の様態の遠隔ライブ映像娯楽施設によれば、例えば、体験エリアを7つのサブ体験エリア(サブ体験エリア1~7)に分割した上で、各サブ体験エリア内に配置されているユーザ装置の表示可能対象を、それらの種類に応じて、下記のように割り振ることができる。
 サブ体験エリア1‥‥‥陸上の動物
 サブ体験エリア2‥‥‥水中の動物
 サブ体験エリア3‥‥‥植物
 サブ体験エリア4‥‥‥自然景観
 サブ体験エリア5‥‥‥遺跡
 サブ体験エリア6‥‥‥現代建築物
 サブ体験エリア7‥‥‥ライブイベント
Further, according to the remote live video entertainment facility of the second to fourth aspects of the present invention, the experience area can be divided, for example, into seven sub-experience areas (sub-experience areas 1 to 7), and the displayable targets of the user devices arranged in each sub-experience area can be assigned as follows according to their type.
 Sub-experience area 1 …… land animals
 Sub-experience area 2 …… underwater animals
 Sub-experience area 3 …… plants
 Sub-experience area 4 …… natural scenery
 Sub-experience area 5 …… ruins
 Sub-experience area 6 …… modern buildings
 Sub-experience area 7 …… live events
 そうすると、陸上の動物のライブ映像を楽しみたいユーザはサブ体験エリア1に入場し、水中の動物のライブ映像を楽しみたいユーザはサブ体験エリア2に入場し、植物のライブ映像を楽しみたいユーザはサブ体験エリア3に入場し、自然景観のライブ映像を楽しみたいユーザはサブ体験エリア4に入場し、遺跡のライブ映像を楽しみたいユーザはサブ体験エリア5に入場し、現代建築物のライブ映像を楽しみたいユーザはサブ体験エリア6に入場し、ライブイベントのライブ映像を楽しみたいユーザはサブ体験エリア7に入場することで、ユーザは、ユーザ装置を使用する際に、広範囲の種類の対象の中から「どの種類の対象のライブ映像を楽しむべきか」を迷うことは少なくなる。また、自身が楽しみたいライブ映像を表示させるための操作の煩雑さも軽減される。 Then, users who want to enjoy live video of land animals enter sub-experience area 1, underwater animals sub-experience area 2, plants sub-experience area 3, natural scenery sub-experience area 4, ruins sub-experience area 5, modern buildings sub-experience area 6, and live events sub-experience area 7. As a result, when using the user device, users seldom hesitate over what type of target to enjoy live video of among a wide range of target types, and the operations for displaying the live video they want to enjoy become less cumbersome.
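The assignment of target types to sub-experience areas described above can be held, for example, as a lookup table that the facility server consults when a user device reports which sub-experience area it is installed in. The table and catalog identifiers below are illustrative assumptions.

```python
# Sketch of the sub-experience-area assignment of paragraph [0030]: the
# facility server filters its stream catalog down to the target type of
# the reporting device's sub-experience area.
AREA_TARGETS = {
    1: "land_animals", 2: "underwater_animals", 3: "plants",
    4: "natural_scenery", 5: "ruins", 6: "modern_buildings", 7: "live_events",
}


def streams_for_device(sub_area, catalog):
    """Return only the catalog streams matching the device's sub-experience area."""
    kind = AREA_TARGETS[sub_area]
    return [name for name, target_kind in catalog.items() if target_kind == kind]


catalog = {"savanna_cam": "land_animals", "reef_cam": "underwater_animals",
           "machu_picchu_cam": "ruins"}
visible = streams_for_device(5, catalog)
```

A device in sub-experience area 5 (ruins) would thus offer only ruin streams, so the user never needs to navigate past unrelated target types.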
 さらに、本発明の第2乃至第4の様態の遠隔ライブ映像娯楽施設によれば、サブ体験エリアのすべて又は一部をさらに複数のサブサブ体験エリアに分割した上で、各サブサブ体験エリアに配置されているユーザ装置の表示可能対象を、それらの種類に応じて、さらに限定することができる。例えば、段落[0030]のサブ体験エリア1~7を、以下のサブサブ体験エリアに分割することができる。
[サブ体験エリア1]
  サブサブ体験エリア1A‥‥‥日本の陸上動物
  サブサブ体験エリア1B‥‥‥日本以外のアジアの陸上動物
  サブサブ体験エリア1C‥‥‥北中南米の陸上動物
  サブサブ体験エリア1D‥‥‥欧州の陸上動物
  サブサブ体験エリア1E‥‥‥大洋州の陸上動物
  サブサブ体験エリア1F‥‥‥アフリカの陸上動物
[サブ体験エリア2]
  サブサブ体験エリア2A‥‥‥浅瀬の魚
  サブサブ体験エリア2B‥‥‥深海の魚
  サブサブ体験エリア2C‥‥‥鯨類(クジラ、イルカ、シャチ)
  サブサブ体験エリア2D‥‥‥サメ類
[サブ体験エリア3]
  サブサブ体験エリア3A‥‥‥熱帯の植物
  サブサブ体験エリア3B‥‥‥温帯の植物
  サブサブ体験エリア3C‥‥‥乾燥帯の植物
  サブサブ体験エリア3D‥‥‥高原の植物
[サブ体験エリア4]
  サブサブ体験エリア4A‥‥‥日本の自然景観
  サブサブ体験エリア4B‥‥‥日本以外のアジアの自然景観
  サブサブ体験エリア4C‥‥‥北中南米の自然景観
  サブサブ体験エリア4D‥‥‥欧州の自然景観
  サブサブ体験エリア4E‥‥‥大洋州の自然景観
  サブサブ体験エリア4F‥‥‥アフリカの自然景観
[サブ体験エリア5]
  サブサブ体験エリア5A‥‥‥古代の遺跡
  サブサブ体験エリア5B‥‥‥中世の遺跡
  サブサブ体験エリア5C‥‥‥近世の遺跡
  サブサブ体験エリア5D‥‥‥近代の遺跡
[サブ体験エリア6]
  サブサブ体験エリア6A‥‥‥日本の現代建築物
  サブサブ体験エリア6B‥‥‥日本以外のアジアの現代建築物
  サブサブ体験エリア6C‥‥‥北中南米の現代建築物
  サブサブ体験エリア6D‥‥‥欧州の現代建築物
  サブサブ体験エリア6E‥‥‥大洋州の現代建築物
  サブサブ体験エリア6F‥‥‥アフリカの現代建築物
[サブ体験エリア7]
  サブサブ体験エリア7A‥‥‥スポーツイベント
  サブサブ体験エリア7B‥‥‥演劇
  サブサブ体験エリア7C‥‥‥コンサート
  サブサブ体験エリア7D‥‥‥観光イベント
Further, according to the remote live video entertainment facility of the second to fourth aspects of the present invention, all or some of the sub-experience areas can be further divided into a plurality of sub-sub-experience areas, and the displayable targets of the user devices arranged in each sub-sub-experience area can be further limited according to their type. For example, sub-experience areas 1 to 7 of paragraph [0030] can be divided into the following sub-sub-experience areas.
[Sub-experience area 1]
 Sub-sub-experience area 1A …… land animals of Japan
 Sub-sub-experience area 1B …… land animals of Asia outside Japan
 Sub-sub-experience area 1C …… land animals of North, Central and South America
 Sub-sub-experience area 1D …… land animals of Europe
 Sub-sub-experience area 1E …… land animals of Oceania
 Sub-sub-experience area 1F …… land animals of Africa
[Sub-experience area 2]
 Sub-sub-experience area 2A …… shallow-water fish
 Sub-sub-experience area 2B …… deep-sea fish
 Sub-sub-experience area 2C …… cetaceans (whales, dolphins, orcas)
 Sub-sub-experience area 2D …… sharks
[Sub-experience area 3]
 Sub-sub-experience area 3A …… tropical plants
 Sub-sub-experience area 3B …… temperate plants
 Sub-sub-experience area 3C …… arid-zone plants
 Sub-sub-experience area 3D …… highland plants
[Sub-experience area 4]
 Sub-sub-experience area 4A …… natural scenery of Japan
 Sub-sub-experience area 4B …… natural scenery of Asia outside Japan
 Sub-sub-experience area 4C …… natural scenery of North, Central and South America
 Sub-sub-experience area 4D …… natural scenery of Europe
 Sub-sub-experience area 4E …… natural scenery of Oceania
 Sub-sub-experience area 4F …… natural scenery of Africa
[Sub-experience area 5]
 Sub-sub-experience area 5A …… ancient ruins
 Sub-sub-experience area 5B …… medieval ruins
 Sub-sub-experience area 5C …… early modern ruins
 Sub-sub-experience area 5D …… modern ruins
[Sub-experience area 6]
 Sub-sub-experience area 6A …… modern buildings of Japan
 Sub-sub-experience area 6B …… modern buildings of Asia outside Japan
 Sub-sub-experience area 6C …… modern buildings of North, Central and South America
 Sub-sub-experience area 6D …… modern buildings of Europe
 Sub-sub-experience area 6E …… modern buildings of Oceania
 Sub-sub-experience area 6F …… modern buildings of Africa
[Sub-experience area 7]
 Sub-sub-experience area 7A …… sports events
 Sub-sub-experience area 7B …… theater
 Sub-sub-experience area 7C …… concerts
 Sub-sub-experience area 7D …… tourism events
 そうすると、日本の陸上動物の映像を楽しみたいユーザはサブサブ体験エリア1Aに入場し、スポーツライブイベントの映像を楽しみたいユーザはサブサブ体験エリア7Aに入場するなどすることによって、ユーザは、ユーザ装置を使用する際に、広範囲の対象の中から「どのような種類の対象のライブ映像を楽しむべきか」を迷うことは少なくなる。また、自身が楽しみたいライブ映像を表示させるための操作の煩雑さも軽減される。 Then, a user who wants to enjoy video of Japanese land animals enters sub-sub-experience area 1A, a user who wants to enjoy video of live sports events enters sub-sub-experience area 7A, and so on; when using the user device, users seldom hesitate over what kind of target to enjoy live video of among a wide range of targets, and the operations for displaying the live video they want to enjoy become less cumbersome.
 また、特に、本発明の第2乃至第4の様態の遠隔ライブ映像娯楽施設によれば、サブ体験エリアごとに独立した入口を設けてユーザの入出場を管理するようにすることができる。その構成において、本発明の遠隔ライブ映像娯楽施設を利用するユーザに対して、ユーザが入場することができるサブ体験エリアの種類及び/又は数に応じて課金する課金方法を採用すれば、ユーザの利用状況に応じて適切に課金することができる。 Further, in particular, according to the remote live video entertainment facility of the second to fourth aspects of the present invention, an independent entrance can be provided for each sub-experience area to manage users' entry and exit. In that configuration, if a billing method is adopted that charges users of the remote live video entertainment facility of the present invention according to the type and/or number of sub-experience areas they are allowed to enter, users can be charged appropriately according to their usage patterns.
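With independent entrances, each gate could simply check the admission card against the sub-experience areas the user paid for and log the attempt, roughly as sketched below. The card fields and area identifiers are assumptions for illustration.

```python
# Hypothetical per-sub-experience-area admission control: a gate admits a
# user only if its area is on the user's admission card, and records every
# attempt (useful for billing reconciliation and entry/exit management).
def process_entry(card, gate_area, entry_log):
    """Admit or refuse a user at a sub-experience-area gate and record the attempt."""
    admitted = gate_area in card["areas"]
    entry_log.append((card["user_id"], gate_area, admitted))
    return admitted


log = []
card = {"user_id": "U001", "areas": {"sub_area_1", "sub_area_4"}}
ok = process_entry(card, "sub_area_4", log)
ng = process_entry(card, "sub_area_2", log)
```

Because each gate is independent, adding or re-pricing a sub-experience area only changes the set of areas written to the card at admission.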
 また、就中、特に、本発明の第3又は第4の様態の遠隔ライブ映像娯楽施設によれば、前記サブ体験エリアにおける内装、展示、明るさ、温度、湿度のうちの少なくとも一つを、該サブ体験エリア内に設置されたユーザ装置の表示可能対象のうちの少なくとも一つが存在する遠隔地の現況に適応させることができるため、ユーザに対して「自身が実際に『ライブ映像を楽しんでいる対象が存在している遠隔地』に滞在しているような錯覚的感覚」を与えることができる。 In particular, according to the remote live video entertainment facility of the third or fourth aspect of the present invention, at least one of the interior decoration, exhibits, brightness, temperature, and humidity of a sub-experience area can be adapted to the current conditions of a remote place where at least one of the displayable targets of the user devices installed in that sub-experience area exists, so users can be given the illusory sensation of actually staying in the remote place where the target whose live video they are enjoying exists.
 例えば、段落[0028]で例示した構成においては、各々のサブ体験エリアにおける明るさ、温度、湿度を、各々の観光地の現在時刻や天候に合わせて調整することができる。また、各々のサブ体験エリアの内部に、各々の観光地の習俗に合わせた展示物を展示することができる。 For example, in the configuration illustrated in paragraph [0028], the brightness, temperature, and humidity of each sub-experience area can be adjusted to the current local time and weather of the corresponding tourist destination, and exhibits matched to the customs of each tourist destination can be displayed inside each sub-experience area.
 また、就中、特に、本発明の第4の様態の遠隔ライブ映像娯楽施設によれば、ユーザがユーザ装置を使用している時点で実際に楽しむことができるライブ映像だけが表示されるような構成とすることができる。例えば、段落[0030]で例示した構成においては、サブ体験エリア4では、ユーザがユーザ装置を利用している時点で実際に発現している自然景観(例えば、極地でのオーロラ、火山の噴火、日食・月食、桜の開花・紅葉、など)を表示するようにすることができる。また、段落[0032]で例示した構成においては、サブサブ体験エリア7Aには、ユーザがユーザ装置を利用している時点で開催されているスポーツイベント(例えば、国内のプロ野球の試合など)だけを表示するようにすることができ、サブサブ体験エリア7Dには、ユーザがユーザ装置を利用している時点で開催されている観光イベント(例えば、祭り、花火、ロケットの発射、など)を表示するようにすることができる。 In particular, according to the remote live video entertainment facility of the fourth aspect of the present invention, the configuration can be such that only live video the user can actually enjoy at the time of using the user device is displayed. For example, in the configuration illustrated in paragraph [0030], sub-experience area 4 can be made to display only natural phenomena actually occurring while the user is using the user device (for example, polar auroras, volcanic eruptions, solar and lunar eclipses, cherry blossoms and autumn foliage). Similarly, in the configuration illustrated in paragraph [0032], sub-sub-experience area 7A can be made to display only sports events being held while the user is using the user device (for example, domestic professional baseball games), and sub-sub-experience area 7D only tourism events being held at that time (for example, festivals, fireworks, rocket launches).
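One simple way to realize this fourth-aspect behavior is a schedule filter: each target carries an availability window, and only targets whose window contains the current time are offered. The window representation (day-of-year pairs) and target names below are placeholder assumptions.

```python
# Hypothetical schedule filter: offer only targets whose event or
# phenomenon is occurring at the moment the user device is in use.
def currently_available(schedule, now):
    """Return the targets whose (start, end) window contains the current time."""
    return [name for name, (start, end) in schedule.items() if start <= now <= end]


schedule = {
    "cherry_blossoms": (85, 110),    # day-of-year windows, illustrative only
    "pro_baseball_game": (200, 200),
    "aurora": (1, 90),
}
available = currently_available(schedule, 60)
```

In a real facility the windows would come from event calendars and seasonal forecasts at each remote site, and the filter would run each time a user opens the selection menu.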
 そうすると、ユーザは、ユーザ装置のライブ映像表示手段に、時期外れのライブ映像を表示させることはなくなり、自身が楽しみたいライブ映像を表示させるための操作の煩雑さが軽減される。 In this way, the user never ends up displaying out-of-season live video on the live video display means of the user device, and the operations for displaying the live video the user wants to enjoy become less cumbersome.
 また、本発明の第5又は第6の様態の遠隔ライブ映像娯楽施設を利用するユーザに対する課金方法によれば、ユーザの利用状況に応じて適切に課金することができる。 Further, according to the billing method for users of a remote live video entertainment facility of the fifth or sixth aspect of the present invention, users can be charged appropriately according to their usage patterns.
本発明の実施例1に係る遠隔ライブ映像娯楽施設の構成及び機能を説明するためのブロック図である。FIG. 1 is a block diagram for explaining the configuration and functions of the remote live video entertainment facility according to the first embodiment of the present invention.
本発明の実施例1に係る遠隔ライブ映像娯楽施設の内部のユーザ装置に対してライブ映像を送信する現地システムの構成及び機能を説明するためのイメージ図である。FIG. 2 is an image diagram for explaining the configuration and functions of a local system that transmits live video to user devices inside the remote live video entertainment facility according to the first embodiment of the present invention.
本発明の実施例1に係る遠隔ライブ映像娯楽施設を構成するユーザ装置の機能を説明するためのイメージ図である。FIG. 3 is an image diagram for explaining the functions of a user device constituting the remote live video entertainment facility according to the first embodiment of the present invention.
本発明の実施例1に係る遠隔ライブ映像娯楽施設を構成するユーザ装置の機能を説明するためのもう一つのイメージ図である。FIG. 4 is another image diagram for explaining the functions of a user device constituting the remote live video entertainment facility according to the first embodiment of the present invention.
撮像手段の撮像範囲、視座、視線方向及び視点(撮像範囲の中心)を説明するための説明図である。FIG. 5 is an explanatory diagram for explaining the imaging range, viewing position, line-of-sight direction, and viewpoint (center of the imaging range) of the imaging means.
本発明の実施例2に係る遠隔ライブ映像娯楽施設の構成及び機能を説明するためのブロック図である。FIG. 6 is a block diagram for explaining the configuration and functions of a remote live video entertainment facility according to a second embodiment of the present invention.
本発明の実施例2に係る遠隔ライブ映像娯楽施設の内部のユーザ装置に対してライブ映像を送信する深海現地システム及び中継装置の構成及び機能を説明するためのイメージ図である。FIG. 7 is an image diagram for explaining the configuration and functions of a deep-sea local system and a relay device that transmit live video to user devices inside the remote live video entertainment facility according to the second embodiment of the present invention.
本発明の実施例2に係る遠隔ライブ映像娯楽施設の内部のユーザ装置に対してライブ映像を送信する宇宙現地システム及び中継装置の構成及び機能を説明するためのイメージ図である。FIG. 8 is an image diagram for explaining the configuration and functions of a space local system and a relay device that transmit live video to user devices inside the remote live video entertainment facility according to the second embodiment of the present invention.
 以下、図面を参照して本発明の実施形態について説明する。ただし、本発明はかかる実施形態に限定されず、その技術思想の範囲内で種々の変更が可能である。 Embodiments of the present invention will be described below with reference to the drawings. However, the present invention is not limited to these embodiments, and various modifications are possible within the scope of its technical idea.
 図1は、本発明の実施例1に係る遠隔ライブ映像娯楽施設の構成及び機能を説明するためのブロック図である。 FIG. 1 is a block diagram for explaining the configuration and functions of the remote live video entertainment facility according to the first embodiment of the present invention.
 本実施例の遠隔ライブ映像娯楽施設においては、ユーザ装置を設置する体験エリアがサブ体験エリア1~3の3つに区分されている。
 そして、サブ体験エリア1の内部には、ユーザ装置11_511~ユーザ装置1i_51i(iはサブ体験エリア1の内部に設置されるユーザ装置の総数であり、任意に設定できる。図にはユーザ装置11_511とユーザ装置12_512だけを表示している)が設置され、サブ体験エリア2の内部には、ユーザ装置21_521~ユーザ装置2j_52j(jはサブ体験エリア2の内部に設置されるユーザ装置の総数であり、任意に設定できる。図にはユーザ装置21_521とユーザ装置22_522だけを表示している)、サブ体験エリア3の内部には、ユーザ装置31_531~ユーザ装置3k_53k(kはサブ体験エリア3の内部に設置されるユーザ装置の総数であり、任意に設定できる。図にはユーザ装置31_531とユーザ装置32_532だけを表示している)が設置されている。
In the remote live video entertainment facility of this embodiment, the experience area in which the user devices are installed is divided into three sub-experience areas 1 to 3.
Inside sub-experience area 1, user devices 11_511 to 1i_51i are installed (i is the total number of user devices installed inside sub-experience area 1 and can be set arbitrarily; only user devices 11_511 and 12_512 are shown in the figure). Inside sub-experience area 2, user devices 21_521 to 2j_52j are installed (j is the total number of user devices installed inside sub-experience area 2 and can be set arbitrarily; only user devices 21_521 and 22_522 are shown). Inside sub-experience area 3, user devices 31_531 to 3k_53k are installed (k is the total number of user devices installed inside sub-experience area 3 and can be set arbitrarily; only user devices 31_531 and 32_532 are shown).
 ユーザ装置11_511~ユーザ装置3k_53kは、ユーザが着座するリクライニングシートの形態をしており、該リクライニングシートには、コントローラ及びヘッドセットが付帯しており、また、リクライニングシートの内部には通信ユニットが格納されている。ここで、コントローラは、通常の空撮ドローン用に使用されるプロポ型ジョイスティックコントローラと同様の形態であり、ユーザが操作するボタン、レバー、ダイヤルなどを備えている。また、ヘッドセットは、HMDとヘッドフォンとが一体化されている。さらに、ユーザ装置11_511~ユーザ装置3k_53kの各々には、遠隔ライブ映像娯楽施設の内部で定義された識別番号が与えられている。 Each of the user devices 11_511 to 3k_53k takes the form of a reclining seat on which the user sits; a controller and a headset are attached to the reclining seat, and a communication unit is housed inside it. The controller has a form similar to the transmitter-style joystick controllers used for ordinary aerial photography drones, with buttons, levers, dials, and the like operated by the user. The headset integrates an HMD and headphones. Further, each of the user devices 11_511 to 3k_53k is given an identification number defined within the remote live video entertainment facility.
 本実施例の遠隔ライブ映像娯楽施設には、サブ体験エリア1~3とは別にサーバエリアが設けられており、該サーバエリアの内部には施設サーバ4が設置されている。 In the remote live video entertainment facility of this embodiment, a server area is provided separately from the sub-experience areas 1 to 3, and the facility server 4 is installed inside the server area.
 施設サーバ4は、インターネット3に接続されており、同じくインターネット3に接続されている現地システムα1~αm(mは特定地域α内に設置される現地システムの総数であり、任意に設定できる。図には現地システムα1~α3だけを表示している)、現地システムβ1~βn(nは特定地域β内に設置される現地システムの総数であり、任意に設定できる。図には現地システムβ1~β3だけを表示している)、及び、現地システムγ1~γp(pは特定地域γ内に設置される現地システムの総数であり、任意に設定できる。図には現地システムγ1~γ3だけを表示している)との間で信号のやり取りを行う。 The facility server 4 is connected to the Internet 3 and exchanges signals with local systems α1 to αm (m is the total number of local systems installed in specific region α and can be set arbitrarily; only local systems α1 to α3 are shown in the figure), local systems β1 to βn (n is the total number of local systems installed in specific region β and can be set arbitrarily; only local systems β1 to β3 are shown), and local systems γ1 to γp (p is the total number of local systems installed in specific region γ and can be set arbitrarily; only local systems γ1 to γ3 are shown), which are likewise connected to the Internet 3.
 以下では、本実施例における現地システムの構成と機能を、現地システムα1で代表させて説明する。
 現地システムα1は、空撮ドローンα11_1α11~空撮ドローンα1r_1α1r(rは現地システムα1に含まれる空撮ドローンの総数であり、任意に設定できる。図1には、空撮ドローンα11_1α11~α13_1α13だけを表示している)と、現地親装置α1_2α1とを備えている。また、空撮ドローンα11_1α11~空撮ドローンα1r_1α1rの各々には、現地システムα1の内部で定義された識別番号が与えられている。
In the following, the configuration and functions of the local systems in this embodiment are described using local system α1 as a representative.
Local system α1 includes aerial photography drones α11_1α11 to α1r_1α1r (r is the total number of aerial photography drones included in local system α1 and can be set arbitrarily; only aerial photography drones α11_1α11 to α13_1α13 are shown in FIG. 1) and a local parent device α1_2α1. Each of the aerial photography drones α11_1α11 to α1r_1α1r is given an identification number defined within local system α1.
[規則91に基づく訂正 20.01.2020] 
 図2は、本発明の実施例1に係る遠隔ライブ映像娯楽施設の内部のユーザ装置に対してライブ映像を送信する現地システムα1の構成及び機能をより詳しく説明するためのイメージ図である(図には、空撮ドローンα11_1α11~空撮ドローンα16_1α16だけを表示している)。
 現地親装置α1_2α1は、送受信アンテナ2α11と現地サーバ2α12から構成され、現地サーバ2α12は、インターネット3に接続している。
[Correction 20.01.2020 under Rule 91]
FIG. 2 is an image diagram for explaining in more detail the configuration and functions of the local system α1 that transmits live video to the user devices inside the remote live video entertainment facility according to the first embodiment of the present invention (only aerial photography drones α11_1α11 to α16_1α16 are shown in the figure).
The local parent device α1_2α1 is composed of a transmitting / receiving antenna 2α11 and a local server 2α12, and the local server 2α12 is connected to the Internet 3.
 空撮ドローンα11_1α11~空撮ドローンα1r_1α1rは、マルチコプター式の無人航空機であって、各々の回転翼の回転数を制御することによって空中を浮遊させた上で、直進運動、旋回運動及び回転運動を行わせることができる。 The aerial photography drones α11_1α11 to α1r_1α1r are multicopter-type unmanned aerial vehicles; by controlling the rotational speed of each rotor, they can be made to hover in the air and to perform straight-line, turning, and rotational motion.
 また、空撮ドローンα11_1α11~空撮ドローンα1r_1α1rは、それぞれ、2つのCMOSカメラモジュールと、2つのイメージセンサーからの撮像電気信号を非圧縮の3D映像信号に変換する映像信号変換手段とを備えており、生成された3D非圧縮映像信号を無線通信で現地親装置α1_2α1に送信する。
 また、CMOSカメラモジュールは3ウェイ雲台の上に設置されており、3ウェイ雲台を3つの回転軸のまわりに回転させることで、撮像装置の前後・左右のティルト及びパンを実現できるようになっている。
Each of the aerial photography drones α11_1α11 to α1r_1α1r is equipped with two CMOS camera modules and video signal conversion means for converting the imaging electrical signals from the two image sensors into an uncompressed 3D video signal, and transmits the generated uncompressed 3D video signal to the local parent device α1_2α1 by wireless communication.
Each CMOS camera module is mounted on a three-way pan head; by rotating the pan head about its three rotation axes, forward-backward and left-right tilting and panning of the imaging device can be achieved.
 なお、本実施例では、撮像手段としてCMOSカメラモジュールを使用しているが、それ以外の撮像手段(例えばCCDカメラモジュールなど)を使用してもよい。 In this embodiment, the CMOS camera module is used as the image pickup means, but other image pickup means (for example, CCD camera module) may be used.
 さらに、空撮ドローンα11_1α11~α1r_1α1rは、前記イメージセンサーと前記映像信号変換手段とによって生成される非圧縮映像信号が担う映像情報に基づき、他の空撮ドローン、現地親装置及びそれら以外の障害物への接近を判断する接近判断手段を備えている。そして、前記飛行制御手段は、現地親装置α1_2α1から送信される飛行制御二次信号(後の段落で説明する)に加えて、前記接近判断手段が判断した結果にも基づいて、各々の回転翼の回転数を制御することにより、空撮ドローン、現地親装置及びそれら以外の障害物との接触を回避することができる。 Furthermore, the aerial photography drones α11_1α11 to α1r_1α1r are provided with approach judgment means that judges, on the basis of the video information carried by the uncompressed video signal generated by the image sensors and the video signal conversion means, whether the drone is approaching another aerial photography drone, the local parent device, or any other obstacle. The flight control means then controls the rotational speed of each rotor on the basis of the result of that judgment, in addition to the secondary flight control signal transmitted from the local parent device α1_2α1 (described in a later paragraph), so that contact with other aerial photography drones, the local parent device, and other obstacles can be avoided.
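The priority between the two control inputs can be sketched as a simple override rule: the drone follows the secondary flight control signal from the local parent device unless the vision-based approach judgment reports an obstacle inside a safety margin. The margin value, command names, and distance estimate are illustrative assumptions; real flight control would act on rotor speeds rather than symbolic commands.

```python
# Simplified sketch of combining the ground-side flight control signal
# with the drone's own approach judgment: avoidance overrides the
# requested motion when an obstacle is closer than the safety margin.
SAFETY_MARGIN_M = 3.0   # illustrative threshold


def next_command(control_command, nearest_obstacle_m):
    """Prefer the ground command unless an obstacle is inside the safety margin."""
    if nearest_obstacle_m is not None and nearest_obstacle_m < SAFETY_MARGIN_M:
        return "hold_and_back_off"   # collision avoidance takes priority
    return control_command


cmd_clear = next_command("advance", nearest_obstacle_m=12.0)
cmd_blocked = next_command("advance", nearest_obstacle_m=1.4)
```

The same rule applies whether the obstacle is another aerial photography drone, the local parent device, or terrain: proximity estimated from the video information always outranks the remote user's requested motion.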
 The aerial drones α11_1α11 to α1r_1α1r are also each equipped with a microphone, and each transmits the audio signal it picks up to the local parent device α1_2α1 by wireless communication.
 The local server 2α12 constituting the local parent device α1_2α1 receives, via the transmitting/receiving antenna 2α11, both the uncompressed 3D video signals obtained by the video-signal conversion means from the imaging electrical signals captured by the image sensors of the aerial drones α11_1α11 to α1r_1α1r and the audio signals picked up by their microphones, converts them into video and audio signals suitable for streaming delivery, and transmits them to the Internet 3. In particular, the uncompressed 3D video signal is compression-encoded and transmitted as a compressed 3D video signal.
 Of the functions and configuration of the local system α1, those related to the transmission of live video and live audio are as outlined above; those related to controlling the flight of the aerial drones are explained in later paragraphs. The local systems α2 to αm, β1 to βn, and γ1 to γp have the same functions and configuration as the local system α1, including the functions and configuration described later.
 The local systems α1 to αm, β1 to βn, and γ1 to γp are installed in different regions (in the following, the local systems α1 to αm are assumed to be installed in Japan, β1 to βn in North America, and γ1 to γp in Europe), and each local system is installed at a different tourist site within its region.
 At the remote live video entertainment facility, the following two types of admission fee are set according to the sub-experience areas that may be entered:
  Admission fee A …… entry to all of sub-experience areas 1 to 3
  Admission fee B …… entry to only one of sub-experience areas 1 to 3
 A user who pays admission fee B selects which of sub-experience areas 1 to 3 to enter.
 Meanwhile, the HMDs and headphones of the user devices 11_511 to 3k_53k installed in the remote live video entertainment facility are configured so that, depending on which sub-experience area a user device is installed in, only the live video and audio transmitted from particular local systems can be displayed and output.
 In the following description, sub-experience areas 1 to 3 (the user devices inside them) and the local systems (the regions of the tourist sites where they are installed) are assumed to be associated as follows:
  Sub-experience area 1 (user devices 11_511 to 1i_51i) …… local systems α1 to αm (Japan)
  Sub-experience area 2 (user devices 21_521 to 2j_52j) …… local systems β1 to βn (North America)
  Sub-experience area 3 (user devices 31_531 to 3k_53k) …… local systems γ1 to γp (Europe)
 A user of the remote live video entertainment facility is lent an admission card at the entrance of the facility on paying either admission fee A or admission fee B. The admission card is a contactless IC card on which information on the sub-experience areas the user may enter is recorded according to the fee paid and the user's selection.
 The user who has been lent an admission card proceeds through the access area to the entrance of the sub-experience area he or she wishes to enter. At the entrance of each sub-experience area stands an admission-management gate equipped with means for reading the information recorded on the card; the gate of a given sub-experience area opens only when a card recording permission to enter that area is held over its reading unit. In this way, each user can enter only the sub-experience area corresponding to the fee paid and the selection made.
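The gate logic just described can be sketched as follows. This is an illustrative sketch only; the data layout on the IC card is not disclosed in the patent, and the function names and area encoding are assumptions.

```python
# Illustrative sketch (assumed names and encoding): the card records the
# set of sub-experience areas the holder may enter, derived from the fee
# paid; a gate opens only for a card that lists that gate's area.

def allowed_areas(fee_type, chosen_area=None):
    """Areas recorded on the card at the facility entrance.
    Fee A grants all three areas; fee B grants the one area the user chose."""
    if fee_type == "A":
        return {1, 2, 3}
    if fee_type == "B" and chosen_area in (1, 2, 3):
        return {chosen_area}
    raise ValueError("unknown fee type or invalid area choice")

def gate_opens(card_areas, gate_area):
    """An admission-management gate opens only when the presented card
    records permission for that gate's sub-experience area."""
    return gate_area in card_areas
```

A card bought with fee B for area 2, for example, opens the area-2 gate and no other.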
 In the following, the functions and configuration of the user devices installed inside the sub-experience areas are described using the user devices 11_511 to 1i_51i inside sub-experience area 1 as representatives. As stated above, the user devices 11_511 to 1i_51i inside sub-experience area 1 are assumed to be set so that only live video and live audio from tourist sites in Japan can be output.
 A user who has entered sub-experience area 1 finds, either on his or her own or guided by an attendant of the remote live video entertainment facility, one of the user devices 11_511 to 1i_51i not currently in use by another user, and sits down on its reclining seat.
 In the following, the user is assumed to be seated at the user device 11_511, but the functions and configuration described below are common to the other user devices 12_512 to 1i_51i.
 The seat of the user device 11_511 is provided with an admission-card insertion slot that reads information from and writes information to the card. When the user inserts the admission card into the slot, the user device 11_511 starts up, and the time of start-up (t0) is recorded on the card.
 After sitting down on the reclining seat, the user adjusts the position of the controller so that it is easy to operate by hand, and adjusts the headset so that the HMD covers his or her field of view and the headphones cover the ears.
[Correction 20.01.2020 under Rule 91]
 When the headset has been positioned, an "image for selecting the type of tourist site", as illustrated in FIG. 3, first appears in the user's field of view. The user operates the controller's lever to move the cursor to the type of tourist site whose live video he or she wants to enjoy, and presses the button to select it.
 When a type of tourist site has been selected (in the following, "World Heritage" is assumed to have been selected), an "image for selecting the place or property of the tourist site", as illustrated in FIG. 4, appears in the virtual space displayed on the HMD. Here too, the user moves the cursor with the controller's lever to the place or property whose live video he or she wants to enjoy, and presses the button to select it.
 In the example above, the objects that the user devices 11_511 to 3k_53k can display are determined by which sub-experience area each user device is installed in; however, as illustrated below, the displayable objects can also be varied among user devices installed in the same sub-experience area.
 Specifically, a cheaper admission fee C is set in addition to admission fees A and B, and the admission card records which of the fees A, B, and C was paid. Each of the user devices 11_511 to 3k_53k is then provided with means for reading the information recorded on the card. When the reading means finds that the fee paid was admission fee C, the virtual space projected on the HMD of that user device shows an image with fewer choices than FIG. 3 (for example, "modern architecture" is not displayed) and an image with fewer choices than FIG. 4 (for example, only the "Horyu-ji area" and the "ancient capital of Kyoto" are displayed). In this way, the displayable objects of a user device used by a user who paid admission fee C can be limited.
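The fee-dependent menu restriction can be sketched as a lookup keyed by the fee class read from the card. This is an illustrative sketch; the actual menu contents shown in FIGS. 3 and 4 are only partially described in the text, so the lists below are invented examples in that spirit.

```python
# Illustrative sketch (assumed menus): devices read the fee class from the
# admission card and show a reduced set of destination choices for fee C.

FULL_MENU = ["natural scenery", "ruins", "World Heritage", "modern architecture"]
REDUCED_MENU = ["natural scenery", "World Heritage"]  # assumed subset for fee C

def destination_menu(fee_type):
    """Fee C cards see fewer choices; fees A and B see the full menu."""
    return REDUCED_MENU if fee_type == "C" else FULL_MENU
```

The facility server 4, which stores and switches the selection images, would hold such per-fee menu definitions.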
 The storage and switching of the image data for the above "image for selecting the type of tourist site" and "image for selecting the place or property of the tourist site" are handled by the facility server 4.
 When a place or property has been selected by the method described in paragraph [0063] (in the following, "Okinoshima" is assumed to have been selected), a signal conveying that information is transmitted to the facility server 4 via the communication unit. On receiving it, the facility server 4 assigns the user device 11_511 an IP address based on the identification number given to the device in advance. The facility server 4 then transmits, via the Internet 3, to the local server 2α12 of the local system installed near Munakata Taisha close to Okinoshima (in the following, that local system is assumed to be the local system α1), a signal requesting that live video and live audio of Okinoshima be transmitted to the user device 11_511 at the assigned IP address.
 On receiving the request signal, the local server 2α12 selects an available aerial drone belonging to the local system α1 (in the following, the selected drone is assumed to be the aerial drone α11_1α11) and assigns it an IP address based on the identification number given to the drone in advance. The local server 2α12 then transmits to the facility server 4, via the Internet 3, a signal requesting that the flight-control primary signal controlling the flight of the aerial drone α11_1α11 be transmitted to the drone at the assigned IP address.
 A communication channel is thus established between the user device 11_511 and the aerial drone α11_1α11; from this point on, the two communicate in conformity with the Internet protocol via the facility server 4, the Internet 3, the local server 2α12, and the transmitting/receiving antenna 2α11.
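The pairing sequence above can be sketched as follows. This is an illustrative sketch only: the patent says addresses are assigned "based on" pre-assigned identification numbers without giving a scheme, so the subnet layout, the modulo mapping, and the first-available selection rule are all assumptions.

```python
# Illustrative sketch (assumed address scheme): the facility server derives
# an IP address for the user device from its identification number, the
# local server does the same for the drone it selects, and the resulting
# pair of endpoints forms the communication channel.

def assign_ip(id_number, subnet="10.0.0."):
    """Derive an IP address from a device's pre-assigned identification number."""
    return subnet + str(id_number % 256)

def establish_channel(user_device_id, available_drone_ids):
    """Pick the first available drone and pair the two endpoints;
    return None when no drone is currently available."""
    if not available_drone_ids:
        return None
    drone_id = available_drone_ids[0]
    return {"user_ip": assign_ip(user_device_id),
            "drone_ip": assign_ip(drone_id, subnet="10.0.1.")}
```

Once the pair is formed, all subsequent video, audio, and control traffic flows between the two addresses over the relaying servers.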
 Over the channel established in this way, the communication unit of the user device 11_511 receives, via its built-in Internet interface and the facility server 4, the compressed 3D video signal and the audio signal streamed from the local server 2α12, performs the necessary decoding, and sends them to the HMD and the headphones. In particular, the compressed 3D video signal is decompression-decoded and sent on as an uncompressed 3D video signal.
 Based on the uncompressed 3D video signal received from the communication unit, the HMD projects video from its left and right projection units, so that the user can enjoy 3D live video of Okitsu-miya and other sights seen from above Okinoshima. Likewise, based on the audio signal received from the communication unit, the headphones emit sound from their left and right sound-emitting units, so that the user can enjoy the sound of Okinoshima's waves and wind.
 The decoded uncompressed 3D video signal and audio signal can be transmitted from the communication unit by wire as HDMI (registered trademark) or MHL (registered trademark) signals, or wirelessly as WirelessHD or WHDI signals.
 By operating the controller while watching the live video displayed on the HMD, the user can control the flight of the aerial drone α11_1α11. Specifically, the controller generates a flight-control primary signal based on the user's button, lever, and dial operations and sends it to the communication unit, which transmits the flight-control primary signal over the established communication channel to the IP address assigned to the aerial drone α11_1α11.
 The HMD, for its part, incorporates an acceleration sensor, an orientation sensor, and a control unit. Based on the signals from the acceleration sensor and the orientation sensor, the control unit determines the direction of the user's line of sight, generates a line-of-sight control primary signal from the result, and sends it to the communication unit, which transmits the line-of-sight control primary signal over the established communication channel to the IP address assigned to the aerial drone α11_1α11.
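The head-tracking step can be sketched as a mapping from head pose to a camera command. This is an illustrative sketch only; the patent does not specify the signal format, so the yaw/pitch representation and the pan head's mechanical limits are assumptions.

```python
# Illustrative sketch (assumed pose format and limits): the HMD's sensed
# head yaw and pitch are clamped to the camera pan head's mechanical range
# and emitted as a line-of-sight control command.

def gaze_signal(yaw_deg, pitch_deg, pan_range=(-170, 170), tilt_range=(-90, 90)):
    """Map the user's head orientation onto a pan/tilt command for the
    three-way pan head carrying the CMOS camera modules."""
    def clamp(v, lo, hi):
        return max(lo, min(hi, v))
    return {"pan": clamp(yaw_deg, *pan_range),
            "tilt": clamp(pitch_deg, *tilt_range)}
```

Clamping keeps a large head turn from commanding an angle beyond what the pan head can physically reach.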
 The local server 2α12 receives the flight-control primary signal and the line-of-sight control primary signal via the transmitting/receiving antenna 2α11, generates from them a flight-control secondary signal and a line-of-sight control secondary signal for controlling the aerial drone α11_1α11, and transmits these to the drone.
 Based on the received secondary signals, the aerial drone α11_1α11 controls the rotational speed of each rotor and changes the line-of-sight direction of the CMOS camera modules.
 The local server 2α12 also has an approach-judgement function that determines, from the uncompressed 3D video signal received from the aerial drone α11_1α11, whether the aerial drones are approaching one another, the local parent device, or any other obstacle. In generating the flight-control secondary signal sent to the aerial drone α11_1α11, the local server 2α12 therefore takes into account not only the flight-control primary signal received via the transmitting/receiving antenna 2α11 but also the result of this approach judgement. This allows each child device to avoid contact with other child devices, the parent device, and other obstacles.
 In this way, the user can fly the aerial drone α11_1α11 over Okinoshima and enjoy 3D live video of Okitsu-miya and other sights seen from above the island, together with live audio such as the sound of the waves and the wind. Because the user can freely change the vantage point of the imaging means by flying the drone as intended, and can change the line-of-sight direction of the imaging means to match his or her own gaze, the user can enjoy a first-person view as if landing on Okinoshima, which no one other than Shinto priests is permitted to enter.
 The meanings of "imaging range", "vantage point", "line of sight", and "viewpoint (the center of the imaging range)" as used in this specification and elsewhere are as shown in FIG. 5.
 The time (ΔT) for which a user may use a user device in each sub-experience area is fixed; at time t0 + ΔT the user device 11_511 stops automatically. Information that the device has stopped is transmitted to the local server 2α12 via the Internet 3, and on receiving it the local server 2α12 transmits to the aerial drone α11_1α11 a flight-control secondary signal that returns the drone to a predetermined location.
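The usage-time limit that underlies the billing can be sketched as follows. This is an illustrative sketch only; the value of ΔT is not given in the patent, so the 20-minute allowance and the function names are assumptions.

```python
# Illustrative sketch (assumed ΔT): the card records the start time t0
# when the user device is activated; the device runs until t0 + ΔT, and
# when it stops the local server recalls the drone to its home position.

DELTA_T_S = 20 * 60  # assumed usage allowance per session: 20 minutes

def device_active(t0_s, now_s, delta_t_s=DELTA_T_S):
    """The user device runs from activation at t0 until t0 + ΔT."""
    return t0_s <= now_s < t0_s + delta_t_s

def should_recall_drone(t0_s, now_s, delta_t_s=DELTA_T_S):
    """Once the device has stopped, the drone is sent home."""
    return now_s >= t0_s + delta_t_s
```

Recording t0 on the card rather than in the device lets any gate or device in the facility check the remaining allowance.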
 In this embodiment, sub-experience areas 1 to 3 (the user devices inside them) and the local systems (tourist sites) can be associated not only by region as above but also by the type of sight at the tourist site, for example:
  Sub-experience area 1 (user devices 11_511 to 1i_51i) …… local systems (natural scenery)
  Sub-experience area 2 (user devices 21_521 to 2j_52j) …… local systems (ruins)
  Sub-experience area 3 (user devices 31_531 to 3k_53k) …… local systems (modern architecture)
 In this embodiment the experience area housing the user devices has been described as divided into three sub-experience areas 1 to 3, but the number of divisions may be two, or four or more. For example, if the experience area is divided into nine sub-experience areas 1 to 9, the areas (the user devices inside them) can be associated with the regions and types of sight of the tourist sites where the local systems are installed as follows:
  Sub-experience area 1 …… natural scenery in Japan
  Sub-experience area 2 …… ruins in Japan
  Sub-experience area 3 …… modern architecture in Japan
  Sub-experience area 4 …… Asia outside Japan
  Sub-experience area 5 …… North America
  Sub-experience area 6 …… Latin America
  Sub-experience area 7 …… Europe
  Sub-experience area 8 …… Oceania
  Sub-experience area 9 …… Africa
 In this case, the interior decoration and exhibits of sub-experience areas 1 to 9 can be adapted to the region of the tourist sites each area corresponds to, and the brightness of the lighting and the temperature inside each area can be matched to the current brightness and temperature in that region.
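The environment matching described above can be sketched as a small mapping from a remote report to local settings. This is an illustrative sketch only; the patent does not say how remote conditions are obtained or represented, so the report format and the indoor comfort limits are assumptions.

```python
# Illustrative sketch (assumed report format and limits): conditions
# reported from the remote tourist-site region are mapped onto the
# sub-experience area's lighting and temperature, clamped to ranges
# practical indoors.

def area_environment(remote_report):
    """Map a remote site's reported brightness (0.0-1.0) and temperature
    onto local settings, clamping to indoor-safe ranges."""
    brightness = max(0.1, min(1.0, remote_report.get("brightness", 0.5)))
    temp_c = max(16.0, min(28.0, remote_report.get("temp_c", 22.0)))
    return {"lighting": brightness, "temp_c": temp_c}
```

A nighttime desert report, for instance, would dim the area's lights but hold the temperature within the clamped band.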
 FIG. 6 is a block diagram for explaining the configuration and functions of a remote live video entertainment facility according to a second embodiment of the present invention.
 In this embodiment too, as in the first embodiment, the experience area of the remote live video entertainment facility in which the user devices are installed is divided into three sub-experience areas 1 to 3. Inside sub-experience area 1 are installed the user devices 11_511 to 1i_51i (i is the total number of user devices in sub-experience area 1 and can be set arbitrarily; only the user devices 11_511 and 12_512 are shown in the figure); inside sub-experience area 2, the user devices 21_521 to 2j_52j (j is the total number in sub-experience area 2 and can be set arbitrarily; only 21_521 and 22_522 are shown); and inside sub-experience area 3, the user devices 31_531 to 3k_53k (k is the total number in sub-experience area 3 and can be set arbitrarily; only 31_531 and 32_532 are shown).
 Unlike the first embodiment, however, the remote live video entertainment facility of this embodiment has no server room separate from sub-experience areas 1 to 3; instead, the in-area servers 1_41, 2_42, and 3_43 are installed inside sub-experience areas 1, 2, and 3, respectively.
 Further, whereas in the remote live video entertainment facility of the first embodiment the HMDs and headphones of the user devices installed in each sub-experience area could output live video and live audio from a plurality of local systems, in this embodiment one or more of the sub-experience areas are configured so that the HMDs and headphones of the user devices installed inside them can display and output live video and live audio from one specific local system only.
 In the following, the experience area of the remote live video entertainment facility is divided, as in the first embodiment, into three sub-experience areas 1 to 3, which are assumed to be associated with local systems as follows:
  Sub-experience area 1 …… a plurality of local systems installed on land around the world
  Sub-experience area 2 …… a single local system installed in the deep sea (the "deep-sea local system")
  Sub-experience area 3 …… a single local system installed in outer space (the "space local system")
 FIG. 7 is a conceptual diagram for explaining the configuration and functions of the deep-sea local system and the relay devices that transmit live video to the user devices inside the remote live video entertainment facility according to the second embodiment of the present invention.
 The deep-sea local system consists of the underwater drones 1_1B1 to 6_1B6 and the deep-sea parent device 2B. The deep-sea parent device 2B is connected to the relay ship 6B1 by a communication cable, the relay ship 6B1 exchanges signals with the land relay device 6B2 by wireless communication, and the land relay device 6B2 is connected to the Internet 3. Each of the underwater drones 1_1B1 to 6_1B6 is given an identification number defined within the deep-sea local system.
 The underwater drones 1_1B1 to 6_1B6 are carried by the deep-sea parent device 2B, which travels on and under the sea, over the long distance to the deep sea; after arrival on site they are released from the parent device and can move up, down, left, and right and rotate. Because the underwater drones 1_1B1 to 6_1B6 and the deep-sea parent device 2B are connected by communication cables, however, the drones can move only within the reach of their cables.
 The configuration and functions of the underwater drones 1_1B1 to 6_1B6 are basically the same as those of the aerial drones α11_1α11 to α1r_1α1r described in the first embodiment, except that movement under the sea is effected by the rotation of screw propellers, the angle of rudders, and the intake and discharge of seawater.
 The deep-sea parent device 2B is a submersible capable of navigating on and under the sea by itself, and contains a local server (not shown) having the same functions as the local server 2α12 of the local parent device α1_2α1. Unlike the local parent device α1_2α1, however, in place of the transmitting/receiving antenna 2α11 it is provided with an interface unit 1 for the communication cables connected to the underwater drones 1_1B1 to 6_1B6 and an interface unit 2 for the communication cable connected to the relay ship 6B1.
 The relay ship 6B1 is a communication vessel anchored at the sea surface above the deep-sea parent device 2B; it exchanges video, audio, and flight-control signals with the deep-sea parent device 2B via the cable, and exchanges the same signals with the land relay device 6B2 by wireless communication. The land relay device 6B2 is a mobile base station installed on land.
 FIG. 8 is a conceptual diagram for explaining the configuration and functions of the space local system and the relay devices that transmit live video to the user devices inside the remote live video entertainment facility according to the second embodiment of the present invention.
 The space local system consists of the space child devices 1_1C1 to 6_1C6 and the space parent device 2C. The space parent device 2C exchanges signals with the space relay device 6C1 by wireless communication, the space relay device 6C1 exchanges signals with the ground relay device 6C2 by wireless communication, and the ground relay device 6C2 is connected to the Internet 3. Each of the space child devices 1_1C1 to 6_1C6 is given an identification number defined within the space local system.
 The space child devices 1_1C1 to 6_1C6 are launched from the ground carried by the space parent device 2C, which flies within and beyond the atmosphere, over the long distance into outer space; after arrival on site they are released from the parent device and can move up, down, left, and right and rotate. Because the space child devices 1_1C1 to 6_1C6 and the space parent device 2C are connected by cables, however, the child devices can move only within the reach of their cables.
The configuration and functions of the space child devices 1_1C1 to 6_1C6 are basically the same as those of the aerial drones α11_1α11 to α1r_1α1r described in Embodiment 1, except that movement in outer space is performed by jet propulsion.
The space parent device 2C, on the other hand, is a spacecraft capable of flying outside the atmosphere and contains a field server (not shown) with the same functions as the field server 2α12 in the field parent device α1_2α1. Unlike the field parent device α1_2α1, however, it has, in place of the transmitting/receiving antenna 2α11, an interface unit for connecting the communication cables leading to the space child devices 1_1C1 to 6_1C6 and a transmitting/receiving antenna for exchanging signals with the space relay device 6C1 by wireless communication. The space parent device 2C may be configured to jettison the rocket used at launch.
The space relay device 6C1 is a communication satellite located outside the atmosphere, and exchanges video signals, audio signals, and flight control signals with the space parent device 2C and the ground relay device 6C2 by wireless communication. The ground relay device 6C2 is a mobile base station installed on the ground.
In the remote live video entertainment facility of this embodiment, entrance management is performed in the same way as in the facility of Embodiment 1: a user who is lent an entrance card at the entrance of the facility proceeds through the access area and enters the sub-experience area of his or her choice through that area's entrance.
The functions and configurations of the user devices 11_511 to 1i_51i inside sub-experience area 1 of this embodiment's facility are the same as those of the user devices 11_511 to 1i_51i inside sub-experience area 1 of the facility of Embodiment 1, so a user who enters sub-experience area 1 of this embodiment's facility can enjoy the same simulated sightseeing as a user who enters sub-experience area 1 of the facility of Embodiment 1.

However, whereas a user who entered sub-experience area 1 of the facility of Embodiment 1 could enjoy only live video and live audio of tourist spots within Japan, a user who enters sub-experience area 1 of this embodiment's facility can enjoy live video and live audio of tourist spots anywhere on the Earth's surface.
In this embodiment, a pool is installed inside sub-experience area 2. A user who enters sub-experience area 2 puts on the necessary diving gear, wears on the head one of the user devices 21_521 to 2j_52j, which are waterproof headphone-integrated HMDs, and swims in the water of the pool.

In the following description, the user is assumed to wear the HMD that is user device 21_521, but the functions and configurations described below are common to the other user devices 22_522 to 2j_52j.
The HMD that is user device 21_521 has a reader/writer unit that reads and writes information contactlessly to and from the entrance card. When the user holds the card over the reader/writer unit, user device 21_521 starts up, and the time at which the user device started is recorded on the entrance card.
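The card-triggered start-up described above amounts to writing an activation timestamp back to the card when it is presented. A minimal sketch of that bookkeeping follows; all class and function names are hypothetical illustrations, not part of the specification:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class EntranceCard:
    """Contactless entrance card; holds the start-time records written to it."""
    card_id: str
    activation_times: list = field(default_factory=list)

def activate_user_device(card: EntranceCard, device_id: str,
                         now: Optional[datetime] = None) -> dict:
    """Simulate holding the card over the device's reader/writer unit:
    the device starts up and the start time is written back to the card."""
    ts = now or datetime.now(timezone.utc)
    record = {"device": device_id, "started_at": ts.isoformat()}
    card.activation_times.append(record)  # contactless write-back to the card
    return record

card = EntranceCard("card-0001")
rec = activate_user_device(card, "21_521",
                           datetime(2019, 10, 27, 9, 0, tzinfo=timezone.utc))
print(rec["started_at"])  # 2019-10-27T09:00:00+00:00
```

The recorded start times would later support the usage-based billing described in the claims.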
When user device 21_521 starts up, its communication unit sends a start-up signal to the sub-experience-area server 42. On receiving the signal, the sub-experience-area server 42 assigns an IP address to user device 21_521 based on the identification number given to the device in advance. The sub-experience-area server 42 then sends, via the Internet 3, the land relay device 6B2, and the relay ship 6B1, a signal to the field server in the deep-sea parent device 2B of the deep-sea field system, requesting that live deep-sea video and audio be transmitted to the user device 21_521 to which the IP address was assigned.
On receiving the request signal, the field server in the deep-sea parent device 2B selects an available underwater drone belonging to the deep-sea field system (in the following, the selected drone is assumed to be underwater drone 1B1) and assigns it an IP address based on the identification number given to it in advance. The field server in the deep-sea parent device 2B then sends, via the Internet 3, a signal to the sub-experience-area server 42 requesting that a primary swim control signal for controlling the swimming of underwater drone 1B1 be transmitted to the IP address assigned to the drone.
A communication channel is thus established between user device 21_521 and underwater drone 1B1. From then on, communication conforming to the Internet Protocol takes place between user device 21_521 and underwater drone 1B1 via the sub-experience-area server 42, the Internet 3, the land relay device 6B2, the relay ship 6B1, interface unit 2, the field server, and interface unit 1.
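The pairing sequence just described (device start-up, IP assignment derived from a pre-assigned identification number, and the cross-requests between the two servers) can be sketched as follows. The address scheme and function names are illustrative assumptions, not taken from the specification:

```python
def assign_ip(subnet: str, identification_number: int) -> str:
    """Derive an IP address deterministically from a pre-assigned ID number."""
    return f"{subnet}.{identification_number}"

def establish_channel(user_id_num: int, drone_id_num: int) -> dict:
    """Mirror the two request signals: the sub-experience-area server asks the
    field server to stream to the user device's address, and the field server
    asks that primary swim control signals be sent to the drone's address."""
    user_ip = assign_ip("10.42.0", user_id_num)    # assigned by sub-experience-area server
    drone_ip = assign_ip("10.99.0", drone_id_num)  # assigned by the field server
    return {
        "stream_to": user_ip,    # destination for live video/audio
        "control_to": drone_ip,  # destination for primary swim control signals
    }

channel = establish_channel(user_id_num=21, drone_id_num=1)
print(channel)  # {'stream_to': '10.42.0.21', 'control_to': '10.99.0.1'}
```

Once both endpoints know each other's addresses, ordinary Internet Protocol traffic carries all subsequent signals.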
Over the communication channel established in this way, the communication unit of user device 21_521 receives the streamed 3D compressed video signal and the audio signal from the Internet via its built-in Internet interface, performs the necessary decoding, and sends the results to the HMD and the headphones. In particular, the 3D compressed video signal is decompressed and decoded and sent on as a 3D uncompressed video signal.
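The receive-decompress-forward path of the communication unit can be illustrated with a stand-in codec; zlib is used here purely as a placeholder for the actual 3D video decompression, which the specification does not name:

```python
import zlib

def decode_and_forward(compressed_frame: bytes) -> bytes:
    """Decompress one received 'compressed video' frame and return the
    uncompressed payload that would be forwarded to the HMD."""
    return zlib.decompress(compressed_frame)

# Simulate one streamed frame arriving over the established channel.
raw_frame = b"left-eye-pixels|right-eye-pixels"
streamed = zlib.compress(raw_frame)   # what the field system would transmit
restored = decode_and_forward(streamed)
print(restored == raw_frame)  # True
```

The point is only the order of operations: compressed on the wire, decompressed in the communication unit, uncompressed at the display.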
Based on the 3D uncompressed video signal received from the communication unit, the HMD projects images from its left and right projection units, allowing the user to enjoy 3D video that includes the observation target at the remote site. Based on the audio signal received from the communication unit, the headphones emit sound from their left and right sound-emitting units, allowing the user to enjoy the sound of the remote site.
The decoded 3D uncompressed video signal and audio signal can be transmitted from the communication unit by wire as HDMI (registered trademark) or MHL (registered trademark) signals, or wirelessly as WirelessHD or WHDI signals.
The HMD that is user device 21_521 contains a control unit, a communication unit, an acceleration sensor, and an orientation sensor. Based on the signals from the acceleration sensor and the orientation sensor, the control unit determines the position and movement of the user's head and the direction of the user's gaze, generates a primary swim control signal and a primary gaze control signal from the result, and sends these signals to the communication unit. The communication unit transmits the primary swim control signal and the primary gaze control signal received from the control unit to the IP address assigned to underwater drone 1B1 over the established communication channel.
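The control unit's mapping from sensor readings to the two primary signals might be sketched like this. The thresholds and signal fields are illustrative assumptions; the specification only states that the signals are derived from head position, movement, and gaze direction:

```python
def make_primary_signals(accel: tuple, heading_deg: float) -> tuple:
    """Turn accelerometer and orientation-sensor readings into a primary swim
    control signal (where to move) and a primary gaze control signal
    (where to look)."""
    ax, ay, az = accel
    swim = {
        "forward": ax > 0.1,             # lean forward -> move forward
        "ascend": az > 0.1,              # head raised -> ascend
        "turn_deg": round(heading_deg) % 360,
    }
    gaze = {
        "yaw_deg": round(heading_deg) % 360,
        "pitch_deg": round(az * 45),     # crude head-tilt to camera-pitch mapping
    }
    return swim, gaze

swim, gaze = make_primary_signals(accel=(0.3, 0.0, 0.2), heading_deg=90.0)
print(swim["forward"], swim["ascend"], gaze["yaw_deg"])  # True True 90
```

Splitting movement from gaze lets the drone's body and its camera be steered independently, which matches the two separate signals in the text.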
The field server in the deep-sea parent device 2B receives the primary swim control signal and the primary gaze control signal via interface unit 2, generates from them a secondary swim control signal for controlling the swimming of underwater drone 1B1 and a primary gaze control signal, and transmits them to underwater drone 1B1.
Based on the received secondary swim control signal, underwater drone 1B1 controls the rotation of its screw propeller, the angle of its rudder, and the intake and discharge of seawater, while changing the gaze direction of its CMOS camera module.
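On the drone side, applying a secondary swim control signal amounts to mapping its fields onto actuator settings. A hedged sketch follows; the field names, RPM value, and rudder limits are assumptions for illustration only:

```python
def apply_secondary_signal(signal: dict) -> dict:
    """Map a secondary swim control signal onto the drone's actuators:
    screw propeller speed, rudder angle, and ballast (seawater) pumps."""
    return {
        "propeller_rpm": 1200 if signal.get("forward") else 0,
        "rudder_deg": max(-30, min(30, signal.get("turn_deg", 0))),  # clamp rudder travel
        "ballast": "discharge" if signal.get("ascend") else "hold",  # discharging water ascends
    }

state = apply_secondary_signal({"forward": True, "turn_deg": 45, "ascend": True})
print(state)  # {'propeller_rpm': 1200, 'rudder_deg': 30, 'ballast': 'discharge'}
```

Clamping the rudder command illustrates why the field server regenerates a secondary signal: raw user-derived values can be bounded to what the vehicle can safely execute.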
In this way, the user can swim underwater drone 1B1 through the deep sea and enjoy live deep-sea video as well as live deep-sea audio. Because the HMD's left and right projection units project images linked to the position of the user's head and the direction of the user's face (gaze) as determined by the control unit, the user can enjoy the video with the sensation of actually swimming in the deep sea.
In this embodiment, a pool is also installed inside sub-experience area 3. A user who enters sub-experience area 3 puts on the necessary diving gear, wears on the head one of the user devices 31_531 to 3k_53k, which are waterproof HMDs, and swims in the water of the pool.

In the following description, the user is assumed to wear the HMD that is user device 31_531, but the functions and configurations described below are common to the other user devices 32_532 to 3k_53k.
The HMD that is user device 31_531 has a reader/writer unit that reads and writes information contactlessly to and from the entrance card. When the user holds the card over the reader/writer unit, user device 31_531 starts up, and the time at which the user device started is recorded on the entrance card.
When user device 31_531 starts up, its communication unit sends a start-up signal to the sub-experience-area server 43. On receiving the signal, the sub-experience-area server 43 assigns an IP address to user device 31_531 based on the identification number given to the device in advance. The sub-experience-area server 43 then sends, via the Internet 3, the ground relay device 6C2, and the space relay device 6C1, a signal to the field server in the space parent device 2C of the space field system, requesting that live video of the Earth and other views from outer space be transmitted to the user device 31_531 to which the IP address was assigned.
On receiving the request signal, the field server in the space parent device 2C selects an available space child device belonging to the space field system (in the following, the selected device is assumed to be space child device 1C1) and assigns it an IP address based on the identification number given to it in advance. The field server in the space parent device 2C then sends, via the Internet 3, a signal to the sub-experience-area server 43 requesting that a primary swim control signal for controlling the movement of space child device 1C1 be transmitted to the IP address assigned to the device.
A communication channel is thus established between user device 31_531 and space child device 1C1. From then on, communication conforming to the Internet Protocol takes place between user device 31_531 and space child device 1C1 via the sub-experience-area server 43, the Internet 3, the ground relay device 6C2, the space relay device 6C1, the transmitting/receiving antenna, the field server, and the interface unit.
Over the communication channel established in this way, the communication unit of user device 31_531 receives the streamed 3D compressed video signal and the audio signal from the Internet via its built-in Internet interface, performs the necessary decoding, and sends the results to the HMD and the headphones. In particular, the 3D compressed video signal is decompressed and decoded and sent on as a 3D uncompressed video signal.
Based on the 3D uncompressed video signal received from the communication unit, the HMD projects images from its left and right projection units, allowing the user to enjoy 3D video that includes the observation target at the remote site.
The decoded 3D uncompressed video signal can be transmitted from the communication unit by wire as an HDMI (registered trademark) or MHL (registered trademark) signal, or wirelessly as a WirelessHD or WHDI signal.
The HMD that is user device 31_531 contains a control unit, a communication unit, an acceleration sensor, and an orientation sensor. Based on the signals from the acceleration sensor and the orientation sensor, the control unit determines the position and movement of the user's head and the direction of the user's gaze, generates a primary swim control signal and a primary gaze control signal from the result, and sends these signals to the communication unit. The communication unit transmits the primary swim control signal and the primary gaze control signal received from the control unit to the IP address assigned to space child device 1C1 over the established communication channel.
The field server in the space parent device 2C receives the primary swim control signal and the primary gaze control signal via the antenna unit, generates from them a secondary swim control signal for controlling the movement of space child device 1C1 and a primary gaze control signal, and transmits them to space child device 1C1.
Based on the received secondary swim control signal, space child device 1C1 controls its jet propulsion while changing the gaze direction of its CMOS camera module.
In this way, the user can maneuver space child device 1C1 through outer space and enjoy live video of the Earth and other views seen from space. Because the HMD's left and right projection units project images linked to the position of the user's head and the direction of the user's face (gaze) as determined by the control unit, the user can enjoy the video with the sensation of floating through outer space.
In the embodiments above, the HMD of the user device has been described as displaying live video of objects at remote locations, but it can also display recorded video based on footage of such objects, or VR (Virtual Reality) video. In particular, when live video of the target cannot be shot at the time the user is using the user device, for example because of the time difference between the location of the remote live video entertainment facility and the remote location of the target displayed on the HMD, substituting recorded video can alleviate the user's dissatisfaction to some extent.
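The fallback just described (live video when it can be shot, otherwise recorded or VR footage) reduces to a small selection rule. In this sketch, a daytime check stands in for whatever availability test a real system would use, and the hour boundaries are assumptions:

```python
def choose_source(remote_local_hour: int, has_recording: bool) -> str:
    """Pick what to show on the HMD: live video when the remote site can be
    filmed (assumed here to mean local daytime), otherwise a recording,
    otherwise VR content."""
    if 6 <= remote_local_hour < 18:  # daytime at the remote site: live is available
        return "live"
    if has_recording:
        return "recorded"
    return "vr"

print(choose_source(10, has_recording=True))   # live
print(choose_source(23, has_recording=True))   # recorded
print(choose_source(23, has_recording=False))  # vr
```

Keeping the selection on the facility side means the user device always receives a playable stream regardless of conditions at the remote site.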
The present invention can be used in the industry of constructing remote live video entertainment facilities and in the industry of providing simulated-sightseeing services using such facilities. It can also be used in the industry of manufacturing the user devices used in facilities that provide such simulated sightseeing experiences, particularly in the industry of manufacturing the live video display means of those user devices.
1α11 …… Aerial drone α11
1α12 …… Aerial drone α12
1α13 …… Aerial drone α13
1α14 …… Aerial drone α14
1α15 …… Aerial drone α15
1α16 …… Aerial drone α16
1α17 …… Aerial drone α17
1B1 …… Underwater drone 1
1B2 …… Underwater drone 2
1B3 …… Underwater drone 3
1B4 …… Underwater drone 4
1B5 …… Underwater drone 5
1B6 …… Underwater drone 6
1C1 …… Space child device 1
1C2 …… Space child device 2
1C3 …… Space child device 3
1C4 …… Space child device 4
1C5 …… Space child device 5
1C6 …… Space child device 6
2α1 …… Field parent device α1
2α11 …… Transmitting/receiving antenna
2α12 …… Field server
2B …… Deep-sea parent device
2C …… Space parent device
3 …… Internet
4 …… Facility server
41 …… Sub-experience-area server 1
42 …… Sub-experience-area server 2
43 …… Sub-experience-area server 3
511 …… User device 11
512 …… User device 12
521 …… User device 21
522 …… User device 22
531 …… User device 31
532 …… User device 32
6B1 …… Relay ship
6B2 …… Land relay device
6C1 …… Space relay device
6C2 …… Ground relay device
71 …… Imaging means
711 …… Vantage point
72 …… Gaze direction
73 …… Imaging range
731 …… Viewpoint (center of imaging range)

Claims (6)

  1. A remote live video entertainment facility in which a plurality of user devices, each equipped with live video display means for displaying live video of an object existing at a remote location, are installed,
    wherein the objects whose video can be displayed on the live video display means of a user device (hereinafter abbreviated as the "displayable objects of the user device") change according to the selection or declaration made by the user using that user device.
  2. The remote live video entertainment facility according to claim 1,
    wherein the experience area of the remote live video entertainment facility in which the user devices are installed is divided into a plurality of sub-experience areas, and
    the displayable objects of a user device are set to differ according to the sub-experience area in which that user device is installed.
  3. The remote live video entertainment facility according to claim 2,
    wherein at least one of the interior decoration, exhibits, brightness, temperature, and humidity of a sub-experience area is adapted to the current conditions of the remote location where at least one of the displayable objects of the user devices installed in that sub-experience area exists.
  4. The remote live video entertainment facility according to any one of claims 1 to 3,
    wherein the displayable objects of the user devices change with the season or the time of day.
  5. A method for billing a user of the remote live video entertainment facility according to any one of claims 1 to 4,
    wherein the user is charged according to the type and/or number of displayable objects of the user device that the user uses.
  6. A method for billing a user of the remote live video entertainment facility according to any one of claims 2 to 4,
    wherein the user is charged according to the type and/or number of sub-experience areas that the user is permitted to enter.
PCT/JP2019/042078 2018-10-28 2019-10-27 Remote live video amusement facility and method for billing user using said remote live video amusement facility WO2020090697A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019565360A JPWO2020090697A1 (en) 2018-10-28 2019-10-27 Billing method for remote live video entertainment facilities and users who use the remote live video entertainment facilities

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018-202385 2018-10-28
JP2018202385 2018-10-28
JP2019-188261 2019-10-13
JP2019188261 2019-10-13

Publications (1)

Publication Number Publication Date
WO2020090697A1 true WO2020090697A1 (en) 2020-05-07

Family

ID=70463725

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/042078 WO2020090697A1 (en) 2018-10-28 2019-10-27 Remote live video amusement facility and method for billing user using said remote live video amusement facility

Country Status (2)

Country Link
JP (1) JPWO2020090697A1 (en)
WO (1) WO2020090697A1 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005165853A (en) * 2003-12-04 2005-06-23 Shinko Electric Co Ltd Device and method for managing admission, and system for managing admission/leaving
US20150124109A1 (en) * 2013-11-05 2015-05-07 Arben Kryeziu Apparatus and method for hosting a live camera at a given geographical location
WO2017002435A1 (en) * 2015-07-01 2017-01-05 ソニー株式会社 Information processing device, information processing method, and program
JP6089256B2 (en) * 2013-06-07 2017-03-08 株式会社Dapリアライズ Live video distribution system
WO2017134778A1 (en) * 2016-02-03 2017-08-10 富士通株式会社 Wireless communication system, base station, and wireless terminal
JP2019036857A (en) * 2017-08-16 2019-03-07 株式会社Dapリアライズ Live video entertainment facility


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112261425A (en) * 2020-10-20 2021-01-22 成都中科大旗软件股份有限公司 Video live broadcast and video recording playing method and system
CN112261425B (en) * 2020-10-20 2022-07-12 成都中科大旗软件股份有限公司 Video live broadcast and video recording playing method and system
CN112804547A (en) * 2021-01-07 2021-05-14 河北交通职业技术学院 Interactive live broadcast system based on unmanned aerial vehicle VR makes a video recording
CN112804547B (en) * 2021-01-07 2022-08-23 河北交通职业技术学院 Interactive live broadcast system based on unmanned aerial vehicle VR makes a video recording
CN113438531A (en) * 2021-05-18 2021-09-24 北京达佳互联信息技术有限公司 Object display method and device, electronic equipment and storage medium
CN113438531B (en) * 2021-05-18 2023-09-05 北京达佳互联信息技术有限公司 Object display method and device, electronic equipment and storage medium
CN113329260A (en) * 2021-06-15 2021-08-31 北京沃东天骏信息技术有限公司 Live broadcast processing method and device, storage medium and electronic equipment
CN113329260B (en) * 2021-06-15 2024-04-09 北京沃东天骏信息技术有限公司 Live broadcast processing method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
JPWO2020090697A1 (en) 2021-02-15

Similar Documents

Publication Publication Date Title
WO2020090697A1 (en) Remote live video amusement facility and method for billing user using said remote live video amusement facility
JP6478170B2 (en) Remote live video transmission system
CN1783980B (en) Display apparatus, image processing apparatus and image processing method and imaging apparatus
JP4409618B2 (en) Full-field projection device and full-field image system
KR20170114458A (en) The drone built in stereo camera sensors for 3D virtual reality video or connected multi function VR device
CN110322566A (en) A kind of virtual reality traveling method
JP2019036857A (en) Live video entertainment facility
US20230362485A1 (en) Camera service system and method
WO2022220307A1 (en) Video display system, observation device, information processing method, and program
US10882617B2 (en) Aircraft based augmented and virtual reality passenger social media interaction system and related method
JP2019213207A (en) Terminal device for unmanned moving body, unmanned moving body controlled by terminal device for unmanned moving body, and communication unit for terminal device for unmanned moving body
JP2024027828A (en) display device
JP2002077899A (en) Video data delivering system

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019565360

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19878447

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19878447

Country of ref document: EP

Kind code of ref document: A1