WO2024069779A1 - Control system, control method, and recording medium - Google Patents

Control system, control method, and recording medium

Info

Publication number
WO2024069779A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
imaging device
control system
identification information
display device
Prior art date
Application number
PCT/JP2022/036066
Other languages
English (en)
Japanese (ja)
Inventor
佑樹 鶴岡
明彦 大仁田
祐史 丹羽
峰 三宅
Original Assignee
日本電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社 filed Critical 日本電気株式会社
Priority to PCT/JP2022/036066 priority Critical patent/WO2024069779A1/fr
Publication of WO2024069779A1 publication Critical patent/WO2024069779A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This disclosure relates to control systems, etc.
  • a user may have a travel experience by watching a video of a tourist spot. During an actual trip, the user may travel by vehicle.
  • Patent Document 1 describes displaying images on a head mounted display (HMD). Specifically, for example, Patent Document 1 describes that the images displayed on the HMD include characters and a virtual reality space or an augmented reality space in which the interior space and the scenery seen from the tourist vehicle are combined, and that the scenery seen from the tourist vehicle is composited onto the window of the moving body.
  • HMD: head mounted display
  • One example of the objective of this disclosure is to provide a control system or the like that improves the sense of realism when viewing video remotely.
  • the control system includes an acquisition means for acquiring data that associates imaging device identification information of an imaging device attached to a moving object with display device identification information of a user's display device, a detection means for detecting the direction in which the user wishes to view, an imaging device control means for controlling the direction of the imaging device based on the detected direction and the position of the user in a virtual reality space, and an output control means for displaying an image captured by the imaging device on the display device identified by the display device identification information associated with the imaging device identification information of the imaging device.
  • a control method acquires data that associates imaging device identification information of an imaging device attached to a moving object with display device identification information of a user's display device, detects the direction in which the user wishes to view, and, based on the detected direction and the user's position in a virtual reality space, displays an image captured by the imaging device on the display device identified by the display device identification information associated with the imaging device identification information of the imaging device.
  • a program in one aspect of the present disclosure causes a computer to execute a process of acquiring data in which imaging device identification information of an imaging device attached to a moving object and display device identification information of a user's display device are associated, detecting the direction in which the user wants to view, and displaying the image captured by the imaging device on the display device identified by the display device identification information associated with the imaging device identification information of the imaging device based on the detected direction and the user's position in virtual reality space.
  • Each program may be stored on a non-transitory computer-readable recording medium.
  • This disclosure makes it possible to improve the sense of realism when viewing video remotely.
  • FIG. 1 is a block diagram showing a configuration example of a control system according to the first embodiment.
  • FIG. 2 is a flowchart showing an operation example of the control system according to the first embodiment.
  • FIG. 3 is an explanatory diagram showing Example 1 of a display device in a control system.
  • FIG. 4 is an explanatory diagram showing Example 2 of a display device in a control system.
  • FIG. 5 is an explanatory diagram showing an example of seats of a moving object.
  • FIG. 6 is an explanatory diagram showing an example in which an imaging device is installed on a moving object.
  • FIG. 7 is an explanatory diagram showing an example in which the imaging device is attached to a robot arm.
  • FIG. 8 is a block diagram showing a configuration example of a control system according to the second embodiment.
  • FIG. 9 is an explanatory diagram showing an example of an association table.
  • FIG. 10 is an explanatory diagram (part 1) showing the correspondence between the direction of a user's face in real space and the direction of the user's face in virtual reality space.
  • FIG. 11 is an explanatory diagram (part 2) showing the correspondence between the direction of a user's face in real space and the direction of the user's face in virtual reality space.
  • FIG. 12 is an explanatory diagram showing a display example of a dome-shaped display.
  • FIG. 13 is an explanatory diagram showing an example in which the voices of other users are displayed graphically.
  • FIG. 14 is an explanatory diagram showing an example of switching from graphic output to audio output.
  • FIG. 15 is a flowchart showing an operation example of the control system according to the second embodiment.
  • FIG. 16 is an explanatory diagram showing an implementation example of a control system.
  • FIG. 17 is an explanatory diagram showing an example of a hardware configuration of a computer.
  • VR: Virtual Reality
  • FIG. 1 is a block diagram showing a configuration example of the control system according to the first embodiment.
  • the control system 10 includes an acquisition unit 101, a detection unit 102, an imaging device control unit 103, and an output control unit 104.
  • the acquisition unit 101 acquires data that associates the imaging device identification information of the imaging device attached to the mobile object with the display device identification information of the user's display device.
  • The moving object may be, for example, a vehicle such as a bus or train, an airplane, a boat, a drone, or the like, and is not particularly limited.
  • the imaging device identification information is not limited to any particular information as long as it can uniquely identify the imaging device.
  • the display device identification information is not limited to any particular information as long as it can uniquely identify the user's display device. For example, if the user and the display device are in a paired relationship, user identification information may be used instead of the display device identification information.
  • The display device is not particularly limited, and may be, for example, a general display, an HMD, a dome-shaped display, etc. However, if an HMD, a dome-shaped display, or the like is used, the sense of immersion can be improved.
  • the detection unit 102 detects the direction in which the user wants to look.
  • the direction in which the user wants to look may be the direction of the user's face, or may be a direction specified by a controller or the like.
  • Specifically, the detection unit 102 detects the direction of the face of the user of the display device identified by the display device identification information.
  • the detection unit 102 may detect the direction of the face based on data detected by an acceleration sensor, a gyro sensor, a geomagnetic sensor, etc. provided in the HMD.
  • the detection unit 102 may detect the direction of the user's face from an image captured by an imaging device different from the imaging device provided in the moving object.
  • the detection unit 102 may detect the direction of the user's gaze in more detail.
  • the detection unit 102 detects the direction the user wants to view in the virtual reality space by accepting the user's direction in the virtual reality space through the user's operation on the input device.
  • the input device may be a controller or a terminal device, and is not particularly limited.
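  • As a concrete illustration of the sensor-based face-direction detection mentioned above, the following sketch derives a horizontal facing direction (yaw) from geomagnetic-sensor readings. This is an assumption made for illustration only; the function name, axis conventions, and data format are hypothetical and not taken from this publication.

```python
import math

def heading_from_magnetometer(mx: float, my: float) -> float:
    """Approximate horizontal facing direction in degrees, measured from magnetic
    north, from magnetometer x/y components, assuming the HMD is roughly level.
    In practice, tilt compensation using the acceleration sensor would also be needed."""
    heading = math.degrees(math.atan2(my, mx))
    return heading % 360.0

# Hypothetical usage: sensor values would come from the HMD's own API.
print(heading_from_magnetometer(0.0, 25.0))  # 90.0 (which compass point this means depends on the device's axes)
```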
  • the imaging device control unit 103 controls the orientation of the imaging device installed on the moving object based on the direction the user wants to view and the user's position in the virtual reality space.
  • the output control unit 104 displays the image captured by the imaging device on a display device identified by the display device identification information associated with the imaging device identification information of the imaging device.
  • (Flowchart) FIG. 2 is a flowchart showing an operation example of the control system 10 according to the first embodiment.
  • the acquisition unit 101 acquires data in which the imaging device identification information of the imaging device and the display device identification information of the display device are associated with each other (step S101).
  • the data in which the imaging device identification information of the imaging device and the display device identification information of the display device are associated with each other may be stored in a storage unit or the like. Then, the acquisition unit 101 may acquire the data from the storage unit.
  • the detection unit 102 detects the direction in which the user wants to view (step S102).
  • the imaging device control unit 103 controls the orientation of the imaging device based on the detected orientation and the user's position in the virtual reality space (step S103).
  • the output control unit 104 displays the image captured by the imaging device on a display device identified by display device identification information associated with the imaging device identification information of the imaging device (step S104).
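  • Steps S101 to S104 above can be read as one pass of a simple control loop. The following Python sketch shows one possible structure under that reading; the class and method names, and the collaborating objects they call, are hypothetical and not taken from this publication.

```python
from dataclasses import dataclass

@dataclass
class Association:
    camera_id: str   # imaging device identification information
    display_id: str  # display device identification information

class ControlSystem:
    """Minimal sketch of the acquisition/detection/control/output flow."""

    def __init__(self, storage, tracker, camera, displays):
        self.storage = storage    # holds the camera/display association data
        self.tracker = tracker    # detects the direction the user wants to view
        self.camera = camera      # imaging device attached to the moving object
        self.displays = displays  # resolves a display ID to a display device

    def run_once(self) -> None:
        assoc: Association = self.storage.load_association()  # S101: acquire association data
        direction = self.tracker.detect_view_direction()      # S102: detect the desired direction
        self.camera.point(direction)                           # S103: control the camera orientation
        frame = self.camera.capture()
        display = self.displays.resolve(assoc.display_id)
        display.show(frame)                                    # S104: show the video on the associated display
```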
  • control system 10 controls the orientation of the imaging device installed on the moving object based on the direction the user wants to view, and displays the captured image on the user's display device. This can improve the sense of realism when viewing images remotely.
  • FIG. 3 is an explanatory diagram showing Example 1 of a display device in a control system.
  • a dome-shaped display 21 is given as an example of the display device.
  • a user is sitting in front of the dome-shaped display 21.
  • the projection device 2102 projects the image onto the screen 2101.
  • the imaging device 22 and the recording device 23 may be installed where the dome-shaped display 21 is installed.
  • the imaging device 22 and the recording device 23 are used to detect the direction of the user's face, the direction of the user's gaze, the user's movements, and the user's conversation.
  • the control system can detect the direction of the user's face, the direction of the user's gaze, and the user's movements from the image captured by the imaging device 22.
  • the control system can detect the user's conversation from the sound obtained by the recording device 23.
  • an audio output device such as a speaker that outputs the voices of other users and the voices of a moving object is installed where the dome-shaped display 21 is installed.
  • a controller 28 may be installed where the dome-shaped display 21 is installed.
  • the controller 28 is, for example, a device that accepts operations by the user.
  • the controller 28 can accept input of the direction that the user wants to see.
  • the number of imaging devices 22 and recording devices 23 is not particularly limited.
  • the imaging devices 22 may transmit captured images to a control system or the like.
  • the recording devices 23 may transmit audio to a control system or the like.
  • the dome-type display 21 is, for example, a device in which at least a portion of the screen 2101 is curved and is structured to cover the user's field of vision. Note that the dome-type display 21 does not have to be a complete dome shape such as 360 degrees, but may be missing a portion such as 180 degrees. That is, the dome-type display 21 may be a hemispherical dome shape. Also, the dome-type display 21 does not need to have a portion of the screen 2101 that is curved. In this way, the size and type of the dome-type display 21 are not particularly limited.
  • the dome-type display 21 may be a 180-degree dome-type display 21 or a 360-degree dome-type display 21.
  • the size of the dome-type display 21 may be about 1 meter to 2 meters in length, 1 meter to 2 meters in width, and 1 meter to 2 meters in height.
  • the installation location of the dome-shaped display 21 is not particularly limited.
  • the dome-shaped display 21 may be installed in the user's home, company, etc., or in a location where anyone can use it.
  • FIG. 4 is an explanatory diagram showing Example 2 of a display device in a control system.
  • an HMD 24 is given as an example of a display device. In real space, a user wears the HMD 24.
  • the HMD 24 has a function that can detect, for example, the direction of the user's face, the direction of the user's gaze, and the user's movements.
  • the HMD 24 may also have an audio output function and an audio collection function.
  • FIG. 5 is an explanatory diagram showing an example of seats in a moving object.
  • the moving object is a vehicle such as a bus or train
  • seat A is the seat for user X
  • seat B is the seat for user Y.
  • FIG. 6 is an explanatory diagram showing an example in which an imaging device is installed on a moving object.
  • In FIG. 6, the moving object 25 is a vehicle such as a bus or train, and imaging devices 26 are installed on some of the multiple seats.
  • For example, imaging devices 26 are installed at seat A of user X and at seat D.
  • FIG. 7 is an explanatory diagram showing an example in which the imaging device 26 is attached to a robot arm 27.
  • In FIG. 7, a plurality of robot arms 27 (e.g., robot arms 27a and 27b) are provided.
  • An imaging device 26 is attached to each robot arm 27. This allows the control system to control the imaging position of the imaging device 26 by controlling the robot arm 27.
  • the control system is not limited to the example in which the imaging device 26 is attached to the robot arm 27, as long as it is possible to capture an image of the direction of the user's face, a direction according to the user's line of sight, or a position according to the user's movement.
  • the moving body 25 on which the imaging device 26 is installed is exemplified as a vehicle such as a bus or train, but the moving body 25 may be one that is not designed to carry a user.
  • the imaging device 26 in the moving body 25 may capture a landscape, and an image of the captured landscape combined with an image of the vehicle in a virtual reality space may be displayed on the display device.
  • FIG. 8 is a block diagram showing an example of the configuration of a control system according to the second embodiment.
  • the control system 20 includes an acquisition unit 201, a detection unit 202, an imaging device control unit 203, an output control unit 204, an image generation unit 205, a reception unit 206, a product determination unit 207, a registration unit 208, and a settlement unit 209.
  • Compared with the control system 10 according to the first embodiment, the control system 20 further includes the image generation unit 205, the reception unit 206, the product determination unit 207, the registration unit 208, and the settlement unit 209.
  • the acquisition unit 201 has the function of the acquisition unit 101 according to the first embodiment as a basic function.
  • the detection unit 202 has the function of the detection unit 102 according to the first embodiment as a basic function.
  • the imaging device control unit 203 has the function of the imaging device control unit 103 according to the first embodiment as a basic function.
  • the output control unit 204 has the function of the output control unit 104 according to the first embodiment as a basic function.
  • control system 20 has an association table 2001, a user DB 2002, and a product DB 2003.
  • Each functional unit of the control system 20 can refer to and update various databases and tables as appropriate.
  • FIG. 9 is an explanatory diagram showing an example of the association table 2001.
  • the association table 2001 stores a camera ID (Identifier), a seat, and a display ID in association with each other.
  • the camera ID is an example of imaging device identification information that identifies the imaging device 26.
  • The seat field indicates the seat in the moving object 25 at which the imaging device identified by the camera ID is installed.
  • the display ID is an example of display device identification information that identifies a display device.
  • camera ID "C0001”, seat A, and display ID “D001” are associated with each other.
  • camera ID "C0002”, seat D, and display ID “D002” are associated with each other.
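  • A minimal in-memory representation of the association table 2001, using the example rows above; the dictionary layout and helper function below are assumptions made for illustration.

```python
# Association table 2001: camera ID -> (seat, display ID), using the example values above.
ASSOCIATION_TABLE = {
    "C0001": {"seat": "A", "display_id": "D001"},
    "C0002": {"seat": "D", "display_id": "D002"},
}

def display_for_camera(camera_id: str) -> str:
    """Return the display device identification information associated with a camera ID."""
    return ASSOCIATION_TABLE[camera_id]["display_id"]

assert display_for_camera("C0001") == "D001"
```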
  • User DB 2002 also stores user information for each user. Examples of user information include the user's name and an image of the user. The image of the user may be a photograph of the actual user or an avatar, and is not particularly limited.
  • user DB 2002 may store display device identification information and user information in association with each user.
  • user DB 2002 may separately store display device identification information, user identification information, and user information in association with each other.
  • The product DB 2003 stores product information for each product.
  • Specifically, the product DB 2003 stores product identification information and product information in association with each other. There are no particular limitations on the product identification information as long as it can identify the product.
  • Product information includes, for example, the product name, product price, product image, product features, etc., and is not particularly limited.
  • the acquisition unit 201 acquires an association table 2001 as data in which, for example, the imaging device identification information of the imaging device 26 attached to the mobile object 25 and the display device identification information of the user's display device are associated.
  • the detection unit 202 detects the direction in which the user wants to look.
  • the method for detecting the direction in which the user wants to look is as described in embodiment 1.
  • the imaging device control unit 203 controls the orientation of the imaging device 26 installed on the moving body 25 based on the direction the user wants to view and the user's position in the virtual reality space.
  • the imaging device control unit 203 may control the orientation of the imaging device 26 by controlling the robot arm 27 to which the imaging device 26 is attached.
  • Existing technology may be used as a method for controlling the robot arm 27.
  • the position of the user in the virtual reality space may be determined by the position and orientation of the user's seat in the vehicle in the virtual reality space. Therefore, when the moving body 25 is a vehicle, the imaging device control unit 203 controls the orientation of the imaging device 26 according to the orientation of the user's seat in the vehicle in the virtual reality space. This allows the user to see the scenery in the moving body 25 according to the seat.
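  • One way to read this control is that the camera's target yaw is the orientation of the user's seat in the virtual vehicle plus the direction the user wants to view relative to that seat. The sketch below illustrates that reading; the angle conventions and function name are assumptions.

```python
def camera_target_yaw(seat_yaw_deg: float, desired_view_yaw_deg: float) -> float:
    """Combine the seat orientation in the virtual reality space with the user's
    desired viewing direction (both in degrees) into a camera yaw command."""
    return (seat_yaw_deg + desired_view_yaw_deg) % 360.0

# Example: a seat facing 90 degrees combined with the user looking 45 degrees
# to their left yields a camera yaw of 45 degrees.
print(camera_target_yaw(90.0, -45.0))  # 45.0
```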
  • the output control unit 204 displays the image captured by the imaging device 26 on a display device identified by the display device identification information associated with the imaging device identification information of the imaging device 26.
  • the output control unit 204 may cause the display device to display an image obtained by combining the captured image with an image of the vehicle in the virtual reality space.
  • the image generation unit 205 generates an image obtained by combining the captured image with an image of the vehicle in the virtual reality space. Then, the output control unit 204 causes the display device to display the generated image.
  • When the moving body 25 to which the imaging device 26 is attached is a vehicle and another imaging device 26 attached to the moving body 25 appears in the captured image, the image generation unit 205 generates, from the image captured by the imaging device 26, an image in which the other imaging device 26 has been deleted.
  • The image after deletion may be, for example, an image in which no one is sitting in the seat where the other imaging device 26 is installed, or an image in which the seat where the other imaging device 26 is installed has been replaced with a specified image.
  • the specified image may be an image of a user using a display device identified by the display device identification information associated with the imaging device identification information of the other imaging device 26. There may be cases where an imaging device 26 is installed but no user is present. For this reason, the specified image may be an image of any user as if the user were actually sitting in the moving body 25, and is not particularly limited.
  • FIGS. 10 and 11 are explanatory diagrams showing the correspondence between the orientation of the user's face in real space and the orientation of the user's face in virtual reality space.
  • the orientation of the user's face in real space is used as the orientation the user wants to see.
  • the orientation of the user's face in real space and the orientation of the user's face in virtual reality space are associated with each other at the start of use.
  • In FIG. 10, the orientation of the user's face in real space is north.
  • If the imaging device 26 is a 360-degree camera, for example, the output control unit 204 displays an image corresponding to a north-facing orientation from among the images captured by the imaging device 26. As a result, an image seen when facing north, matching the orientation of the user's face in the virtual reality space, is displayed on the display device.
  • In FIG. 11, the user's face in real space faces northeast. If the imaging device 26 is a 360-degree camera, for example, the output control unit 204 displays, from among the images captured by the imaging device 26, an image that faces northeast. This causes an image that appears to face northeast, like the user's face in the virtual reality space, to be displayed on the display device.
  • the image generating unit 205 may also generate an image corresponding to the orientation of the user from the image captured by the imaging device 26. For example, if the imaging device 26 is a 360-degree camera, i.e., an omnidirectional camera, it may capture images corresponding to each orientation. In such a case, the image generating unit 205 generates an image corresponding to the orientation of the user from the omnidirectional images captured by the imaging device 26.
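  • For an omnidirectional camera that outputs an equirectangular frame, the image corresponding to the user's orientation can be produced by cropping around the column that corresponds to the user's yaw. The NumPy sketch below assumes an equirectangular layout and a fixed horizontal field of view; both are assumptions made for illustration.

```python
import numpy as np

def crop_view(equirect: np.ndarray, yaw_deg: float, fov_deg: float = 90.0) -> np.ndarray:
    """Return the horizontal slice of an equirectangular frame centred on yaw_deg.

    equirect: H x W x 3 image covering 360 degrees of yaw horizontally.
    yaw_deg:  direction the user wants to view (0 = reference direction)."""
    h, w, _ = equirect.shape
    center = int((yaw_deg % 360.0) / 360.0 * w)
    half = int(fov_deg / 360.0 * w) // 2
    # Roll the frame so the desired yaw sits at the horizontal centre,
    # which handles wrap-around at the 0/360-degree seam.
    rolled = np.roll(equirect, w // 2 - center, axis=1)
    return rolled[:, w // 2 - half : w // 2 + half]

frame = np.zeros((960, 1920, 3), dtype=np.uint8)  # dummy omnidirectional frame
print(crop_view(frame, yaw_deg=45.0).shape)  # (960, 480, 3)
```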
  • the detection unit 202 may also detect the user's line of sight.
  • the detection unit 202 may also detect the movement of the user. Specifically, for example, the detection unit 202 may detect the movement of the user's head as the movement of the user. The movement of the user's head may be movement of the head forward, backward, left or right. For example, the height of the user's line of sight may change when the user stands or sits.
  • the detection method may be detection by various sensors, or may be detection from an image captured by the imaging device 26. Also, for example, if the display device is the HMD 24, the user can move, for example, by walking. Therefore, specifically, the detection unit 202 may detect a change in the position of the user as the movement of the user.
  • The imaging device control unit 203 changes the imaging position of the imaging device 26 by controlling the robot arm 27 on which the imaging device 26 is installed, based on the movement of the user. For example, the imaging device control unit 203 controls the robot arm 27 so that the imaging device 26 captures images at a position shifted by the amount of the user's movement forward, backward, left, right, up, or down. In this way, the imaging position of the imaging device 26 changes according to the movement of the user, which enhances the user's sense of immersion.
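  • The sketch below illustrates turning a detected head movement into a new imaging position for the robot arm 27. The displacement limit and the data structures are assumptions; no particular robot-arm API is implied.

```python
from dataclasses import dataclass

@dataclass
class Offset:
    x: float  # forward/backward, in metres
    y: float  # left/right
    z: float  # up/down (for example, the user standing up or sitting down)

MAX_REACH = 0.30  # hypothetical limit keeping the camera inside the arm's workspace

def clamp(value: float, limit: float) -> float:
    return max(-limit, min(limit, value))

def arm_target_from_head_motion(current: Offset, head_delta: Offset) -> Offset:
    """Shift the camera mount by the same amount the user's head moved,
    clamped to the arm's reachable range."""
    return Offset(
        x=clamp(current.x + head_delta.x, MAX_REACH),
        y=clamp(current.y + head_delta.y, MAX_REACH),
        z=clamp(current.z + head_delta.z, MAX_REACH),
    )

print(arm_target_from_head_motion(Offset(0, 0, 0), Offset(0.05, 0.0, -0.12)))
```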
  • the output control unit 204 displays other users and the interior of the vehicle according to the seat of each user in the vehicle in the virtual reality space.
  • the image generation unit 205 generates an image that combines the image captured by the imaging device 26 with the image of other users and the interior of the vehicle according to the seat of each user in the vehicle in the virtual reality space. Then, the output control unit 204 displays the generated image on the display device.
  • the output control unit 204 may display an image of the vehicle in the virtual reality space. Specifically, for example, the image generation unit 205 generates an image in which an image captured by the imaging device 26 installed in the moving object 25 is reflected in a window of the vehicle in the virtual reality space. Then, the output control unit 204 may display the generated image on a display device.
  • FIG. 12 is an explanatory diagram showing a display example of the dome-shaped display 21.
  • an image of the inside of a vehicle is displayed on the dome-shaped display 21.
  • user Y is in a vehicle that is a moving body 25 in real space
  • user Z is not in a vehicle that is a moving body 25 in real space.
  • An imaging device 26 is installed in seat D of user Z.
  • the imaging device 26 installed in seat B of user X captures images of the scenery of seats B and D on the moving body 25 and the scenery outside the window of the moving body 25 in a direction that the user wants to see and in accordance with the position of the user's seat.
  • The image generation unit 205 generates, from the image captured by the imaging device 26 installed in seat B of user X, an image in which the imaging device 26 installed in seat D is replaced with an avatar of user Z. Then, the output control unit 204 causes the dome-shaped display 21 to display the generated image.
  • the image generating unit 205 may also generate an image that changes the image captured by the imaging device 26 installed in seat B of user X to a scene inside the vehicle according to the user's preferences.
  • ⁇ Graphical display of audio> For example, in the case of a group trip, some users may know each other, but some users may not. Group trips have the advantage of being able to share information by listening to conversations between users who are not acquainted, but they also have the disadvantage of having to listen to conversations that are not of interest to users.
  • control system 20 may allow the user to selectively hear the audio.
  • For example, the output control unit 204 causes the display device to display, as graphics, the voices of users other than those who are conversing with the user.
  • the output control unit 204 may cause the graphic to be displayed together with the voice, or may cause the graphic to be displayed with the voice muted.
  • the video generation unit 205 generates a video in which a graphic representing the voice of the other user is added to the captured video. Then, the output control unit 204 causes the generated video to be displayed on the display device.
  • Graphics refer to photographs, illustrations, figures, symbols, letters, etc.
  • Graphics may be graphics corresponding to audio.
  • Graphics corresponding to audio may differ in shape, size, color, pattern, etc., depending on the audio.
  • audio related to travel may have a different color than audio unrelated to travel.
  • the output control unit 204 may highlight graphics representing audio related to travel more than graphics representing audio unrelated to travel.
  • the pattern of audio related to travel may be different from the pattern of audio unrelated to travel.
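  • As a rough illustration of differentiating graphics by whether a voice relates to travel, the sketch below classifies a transcript with a keyword list and chooses a bubble size and pattern. The keywords, sizes, and pattern names are all assumptions made for illustration.

```python
TRAVEL_KEYWORDS = {"station", "castle", "lunch", "souvenir", "next stop"}  # hypothetical list

def bubble_style(transcript: str) -> dict:
    """Return display attributes for a speech bubble; travel-related voices are
    emphasised with a larger bubble and a distinct pattern."""
    related = any(word in transcript.lower() for word in TRAVEL_KEYWORDS)
    return {
        "size": "large" if related else "small",
        "pattern": "dots" if related else "solid",
        "muted": True,  # the voice itself is not played while it is shown as a graphic
    }

print(bubble_style("The castle we can see on the left is famous."))
# {'size': 'large', 'pattern': 'dots', 'muted': True}
```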
  • users other than the user who is having a conversation with the user may be grouped in advance, or may be grouped according to the conversation.
  • “Grouped in advance” means that users who will travel together are registered in advance before use begins, and the users who are registered to travel together are treated as one group.
  • "grouped according to the conversation” means that if user X, user Y, and user Z are having a conversation, the users involved in the same conversation are grouped into one group. In this case, the groups change periodically.
  • FIG. 13 is an explanatory diagram showing an example of a graphic display of the voice of another user.
  • a speech bubble is used as the graphic.
  • the shape of the graphic is not limited to the shape of a speech bubble, and is not particularly limited as long as it is identifiable as representing voice.
  • In FIG. 13, there are three speech bubbles. That is, in FIG. 13, there are three people speaking or three conversation groups.
  • user Z is a user who is not conversing with user X, and the voice of user Z is assumed to be travel-related.
  • the voices of the other users are assumed to be voices unrelated to travel.
  • the output control unit 204 highlights graphics representing voices related to travel more than graphics representing voices unrelated to travel.
  • the size and pattern of the graphic are used for highlighting.
  • the size of the speech bubble representing user Z's voice is larger than the sizes of the other speech bubbles.
  • The pattern of the speech bubble representing user Z's voice is a dot pattern, whereas the other speech bubbles have a solid pattern. In this way, the speech bubble representing user Z's voice is emphasized more than the other speech bubbles.
  • the color of the graphic may be used for highlighting.
  • the reception unit 206 may also receive the selection of a graphic representing a voice through a user's operation.
  • the user's operation may be, for example, an operation via an input device, an operation according to the user's hand movement, or an operation according to the user's voice, and the operation method is not particularly limited.
  • the input device is the controller 28 or a terminal device.
  • the user's hand movement can be detected, for example, from an image captured by the imaging device 22.
  • the user's voice is obtained, for example, by the recording device 23.
  • the output control unit 204 then outputs the voice represented by the graphic selected by the user. Alternatively, for example, the output control unit 204 may not display the graphic selected by the user.
  • the video generation unit 205 adds graphics other than the selected graphic to the captured video, and generates a video to which the selected graphic has not been added.
  • the output control unit 204 then displays the generated video on the display device.
  • the output control unit 204 may output the conversation behind the user as audio, and display the conversation in front of the user graphically without audio output.
  • the front of the user is the direction in which the user is facing, and the rear of the user is the direction opposite to the direction in which the user is facing.
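  • The front/behind split can be decided from the angular difference between the direction the user is facing and the direction of the voice source in the virtual reality space. The sketch below illustrates that rule under assumed angle conventions.

```python
def angular_difference(a_deg: float, b_deg: float) -> float:
    """Smallest absolute difference between two headings, in degrees (0 to 180)."""
    diff = abs(a_deg - b_deg) % 360.0
    return min(diff, 360.0 - diff)

def output_mode(user_facing_deg: float, source_deg: float) -> str:
    """Voices in front of the user are shown as graphics; voices behind are played as audio."""
    return "graphic" if angular_difference(user_facing_deg, source_deg) <= 90.0 else "audio"

print(output_mode(0.0, 170.0))  # 'audio'   (the source is behind the user)
print(output_mode(0.0, 30.0))   # 'graphic' (the source is in front of the user)
```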
  • An example of a dome-type display 21 that does not cover the entire surface of the user is a 180-degree dome-type display.
  • FIG. 14 is an explanatory diagram showing an example of switching from graphic output to audio output.
  • the output control unit 204 causes the dome-shaped display 21 to display a speech bubble representing the voice of user Z.
  • the reception unit 206 receives the selection of a graphic through an operation by user X.
  • the output control unit 204 causes the voice of user Z represented by the selected graphic to be output as audio.
  • the reception unit 206 may also receive an output format for the voice of another user.
  • the output control unit 204 outputs the voice of another user in the received format.
  • the reception unit 206 may be able to select in stages between a graphic representing the voice and the voice.
  • the reception unit 206 may also be able to select the graphic in stages.
  • the output control unit 204 then switches the output in stages.
  • the output control unit 204 may switch in stages so that more detailed content of the voice is displayed, such as a graphic that represents a specific shape in color, or a graphic that represents the voice in text, etc.
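  • One reading of this staged switching is an ordered list of output formats that the reception unit 206 steps through on each selection. The stage names and their order below are assumptions made for illustration.

```python
# Hypothetical ordering from least to most detailed output of another user's voice.
OUTPUT_STAGES = ["shape", "colored_shape", "text", "audio"]

def next_stage(current: str) -> str:
    """Advance one step toward more detailed output; stay at 'audio' once it is reached."""
    index = OUTPUT_STAGES.index(current)
    return OUTPUT_STAGES[min(index + 1, len(OUTPUT_STAGES) - 1)]

stage = "shape"
for _ in range(3):
    stage = next_stage(stage)
print(stage)  # 'audio'
```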
  • ⁇ Shopping> For example, shopping may take place while traveling in a virtual reality space.
  • the product determination unit 207 determines recommended products from the products included in the product DB 2003.
  • the products included in the product DB 2003 are products identified by product identification information stored in the product DB 2003.
  • the product determination unit 207 determines products related to travel destinations as recommended products.
  • the product determination unit 207 may determine recommended products from among products related to travel destinations based on the user's conversation.
  • the output control unit 204 presents information about recommended products to the user.
  • the output control unit 204 may cause a display device to display information about recommended products, or may notify a terminal device of the user of information about recommended products.
  • products may be sold in a format similar to in-train or in-flight sales.
  • the output control unit 204 may then display a store clerk's avatar on the display device, and the store clerk's avatar may present information about recommended products.
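  • A minimal sketch of the recommendation step: filter the product DB 2003 by the travel destination and rank the candidates by words picked up from the users' conversation. The product fields, destination tags, and scoring rule are assumptions made for illustration.

```python
PRODUCT_DB = [  # hypothetical contents of the product DB 2003
    {"id": "P001", "name": "Local sweets", "destination": "Kyoto", "keywords": {"sweets", "tea"}},
    {"id": "P002", "name": "Craft beer set", "destination": "Sapporo", "keywords": {"beer"}},
    {"id": "P003", "name": "Tea set", "destination": "Kyoto", "keywords": {"tea", "ceramics"}},
]

def recommend(destination: str, conversation: str, limit: int = 2) -> list:
    """Pick products tied to the travel destination, favouring those whose
    keywords appear in the users' conversation."""
    words = set(conversation.lower().split())
    candidates = [p for p in PRODUCT_DB if p["destination"] == destination]
    candidates.sort(key=lambda p: len(p["keywords"] & words), reverse=True)
    return candidates[:limit]

print(recommend("Kyoto", "that tea shop near the station looked nice"))
```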
  • the registration unit 208 accepts product registration through user operation.
  • the settlement unit 209 settles the registered product through user operation. Note that existing technology may be used for the product registration method and settlement method.
  • (Flowchart) FIG. 15 is a flowchart showing an operation example of the control system 20 according to the second embodiment.
  • the acquisition unit 201 acquires data in which the imaging device identification information of the imaging device 26 and the display device identification information of the display device are associated with each other (step S201). For example, the acquisition unit 201 acquires an association table 2001 as this data.
  • the detection unit 202 detects the direction in which the user wants to look (step S202).
  • The imaging device control unit 203 controls the orientation of the imaging device 26 based on the detected direction, or generates an image corresponding to the direction in which the user wants to look from the image captured by the imaging device 26 (step S203). Note that in step S203, the imaging device control unit 203 may both control the orientation of the imaging device 26 based on the direction in which the user wants to look and generate an image corresponding to that direction from the captured image.
  • the output control unit 204 displays the image captured by the imaging device 26 or the generated image on a display device identified by display device identification information associated with the imaging device identification information of the imaging device 26 (step S204).
  • the detection unit 202 determines whether the movement of the user has been detected (step S205). If the movement of the user has not been detected (step S205: No), the detection unit 202 proceeds to step S207. If the movement of the user has been detected (step S205: Yes), the imaging device control unit 203 controls the imaging position of the imaging device 26 by controlling the robot arm 27 (step S206).
  • the detection unit 202 determines whether the direction in which the user wishes to look has changed (step S207). If the direction in which the user wishes to look has changed (step S207: Yes), the imaging device control unit 203 returns to step S203. If the direction in which the user wishes to look has not changed (step S207: No), the output control unit 204 returns to step S204.
  • the flowchart can be ended at any time.
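  • Steps S201 to S207 form a loop in which the camera orientation (or the generated view) is updated when the desired direction changes and the robot arm is repositioned when user movement is detected. The sketch below is an assumed rendering of that loop; the collaborating objects and their methods are hypothetical.

```python
def run_control_loop(storage, tracker, camera, arm, output, max_iterations=1000):
    assoc = storage.load_association()              # S201: acquire association data
    direction = tracker.detect_view_direction()     # S202: detect the desired direction
    for _ in range(max_iterations):                 # the loop may be ended at any time
        camera.point(direction)                     # S203: control orientation (or generate a view)
        frame = camera.capture()
        output.show(assoc.display_id, frame)        # S204: display on the associated display
        if tracker.movement_detected():             # S205: user movement detected?
            arm.move_by(tracker.last_movement())    # S206: reposition via the robot arm
        new_direction = tracker.detect_view_direction()
        if new_direction != direction:              # S207: direction changed -> re-point next pass
            direction = new_direction
```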
  • control system 20 controls the orientation of the imaging device 26 according to the orientation of the user's seat in the vehicle in the virtual reality space. This allows the user to see images that would be seen from the seat when riding in an actual vehicle. Therefore, a better video experience can be provided to the user.
  • the control system 20 detects the user's line of sight and controls the orientation of the imaging device 26 based on the line of sight and the user's position in the virtual reality space. This allows the user to see images that would be seen from the seat when riding in an actual vehicle. This makes it possible to provide the user with a better video experience.
  • the control system 20 detects the movement of the user and, based on the user's movement, controls the robot arm 27 on which the imaging device 26 is installed, thereby changing the imaging position of the imaging device 26. This allows the image to change depending on the user's up and down movements, etc., thereby improving the sense of realism.
  • the control system 20 also generates an image based on the orientation of the user from the image captured by the imaging device 26, and displays the generated image on the display device.
  • control system 20 when an image captured by the imaging device 26 includes an imaging device other than the imaging device 26, the control system 20 generates an image from which the other imaging device is deleted. For example, the user can concentrate on the experience provided by the image, and the sense of realism can be improved.
  • the control system 20 displays other users and the interior of the vehicle according to the seat of each user in the vehicle in the virtual reality space.
  • the control system 20 may generate an image that combines an image captured by an imaging device 26 installed in the moving body 25 with an image of the interior of the vehicle in the virtual reality space, and display the generated image. In this way, the control system 20 can provide the user with an experience that is not present in the actual image by complementing it.
  • the control system 20 displays, in graphics, the voices of users other than the user who is conversing with the user. Users want to enjoy realistic images, or, in the case of travel, to enjoy a group trip on a vehicle. However, there are times when a user wants to hear what the other users are saying and times when they don't. For example, a user will want to hear information that is useful to the user. This allows, for example, the user to not be bothered by the conversations of other users as long as they don't look at the graphics. On the other hand, if the user is interested, they can just look at the graphics. In this way, the control system 20 can provide a better video experience by combining the benefits of real travel with the benefits of virtual reality space.
  • the control system 20 also outputs the sound represented by a graphic selected by the user from among the displayed graphics. For example, the user can check a conversation that interests them by audio.
  • the control system 20 does not display the graphic selected by the user. This allows the user to remove from view the graphic that represents the conversation in which the user is not interested.
  • control system 20 may output a conversation behind the user in the virtual reality space as audio, and display a conversation in front of the user in the virtual reality space as graphics.
  • the control system 20 can accept an output format for the voices of other users, and outputs the voices of other users in the accepted output format.
  • the control system 20 can select the output format in stages, in the order of color, text, and sound. This allows the output format of the conversations of other users to be changed according to the interests of the user.
  • The control system may be configured to include some of the above-described functional units and some of the above-described information.
  • each embodiment is not limited to the above-mentioned examples, and can be modified in various ways.
  • the configuration of the control system in each embodiment is not particularly limited.
  • the control system may be realized by a single device, such as a single server.
  • the single device may be called a control device, information processing device, etc., and is not particularly limited.
  • the control system in each embodiment may be realized by different devices depending on the function or data.
  • each functional unit may be configured by multiple servers and realized as a control system.
  • the control system may be realized by a database server including each DB (Database) and a server having each functional unit.
  • FIG. 16 is an explanatory diagram showing an example of a control system.
  • the control system includes, for example, an edge terminal device 31 and a server 32.
  • the edge terminal device 31, the dome-type display 21, the imaging device 22, and the sound recording device 23 are installed in the user's home or a shared space.
  • the edge terminal device 31, the dome-type display 21, the imaging device 22, the sound recording device 23, and the controller 28 are connected via a communication network.
  • the imaging device 26 and the robot arm 27 are installed on the moving body 25. Furthermore, the imaging device 26 is attached to the robot arm 27.
  • the imaging device 26, the edge terminal device 31, and the server 32 are connected via a communication network.
  • the control system 20 may also be configured as an entire system including an edge terminal device 31, a server 32, a dome-type display 21, an imaging device 22, an audio recording device 23, an imaging device 26, a robot arm 27, and a controller 28.
  • each functional unit in each embodiment is realized by an edge terminal device 31 and a server 32.
  • the server 32 may be a plurality of servers.
  • each functional unit of the control systems 10 and 20 may be realized by a plurality of devices, and the plurality of devices may be installed in different locations.
  • each piece of information and each DB may include a portion of the above-mentioned information. Furthermore, each piece of information and each DB may include information other than the above-mentioned information. Each piece of information and each DB may be divided into more detailed pieces of DB or pieces of information. In this way, the method of realizing each piece of information and each DB is not particularly limited.
  • each screen is merely an example and is not particularly limited. Buttons, lists, check boxes, information display fields, input fields, etc. (not shown) may be added to each screen. Furthermore, the background color of the screen, etc. may be changed.
  • the process of generating information to be displayed on the display device may be performed by the output control unit 104, 204. This process may also be performed by the display device.
  • Fig. 17 is an explanatory diagram showing an example of a hardware configuration of a computer.
  • a part or all of each device can be realized by using any combination of a computer 80 and a program as shown in Fig. 17.
  • the computer 80 has, for example, a processor 801, a ROM (Read Only Memory) 802, a RAM (Random Access Memory) 803, and a storage device 804.
  • the computer 80 also has a communication interface 805 and an input/output interface 806.
  • Each component is connected to the other via, for example, a bus 807. Note that the number of each component is not particularly limited, and there may be one or more of each component.
  • the processor 801 controls the entire computer 80.
  • Examples of the processor 801 include a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a GPU (Graphics Processing Unit).
  • the computer 80 has a ROM 802, a RAM 803, and a storage device 804 as a storage unit.
  • Examples of the storage device 804 include a semiconductor memory such as a flash memory, a HDD (Hard Disk Drive), and a SSD (Solid State Drive).
  • the storage device 804 stores an OS (Operating System) program, an application program, and a program according to each embodiment.
  • the ROM 802 stores an application program and a program according to each embodiment.
  • the RAM 803 is used as a work area for the processor 801.
  • the processor 801 also loads programs stored in the storage device 804, ROM 802, etc. The processor 801 then executes each process coded in the program. The processor 801 may also download various programs via the communications network NT. The processor 801 also functions as a part or all of the computer 80. The processor 801 may then execute the processes or instructions in the illustrated flowchart based on the program.
  • the communication interface 805 is connected to a communication network NT, such as a LAN (Local Area Network) or a WAN (Wide Area Network), via a wireless or wired communication line.
  • the communication network NT may be composed of multiple communication networks NT.
  • the computer 80 is connected to an external device or an external computer 80 via the communication network NT.
  • the communication interface 805 serves as an interface between the communication network NT and the inside of the computer 80.
  • the communication interface 805 also controls the input and output of data from the external device or the external computer 80.
  • the input/output interface 806 is connected to at least one of an input device, an output device, and an input/output device.
  • the connection method may be wireless or wired.
  • Examples of the input device include a keyboard, a mouse, and a microphone.
  • Examples of the output device include a display device, a lighting device, and an audio output device that outputs audio.
  • Examples of the input/output device include a touch panel display.
  • the input device, output device, and input/output device may be built into the computer 80 or may be external.
  • the hardware configuration of the computer 80 is an example.
  • the computer 80 may have some of the components shown in FIG. 17.
  • the computer 80 may have components other than those shown in FIG. 17.
  • the computer 80 may have a drive device or the like.
  • The processor 801 may read programs and data stored in a recording medium attached to the drive device or the like into the RAM 803. Examples of non-transitory tangible recording media include optical disks, flexible disks, magneto-optical disks, and USB (Universal Serial Bus) memories.
  • the computer 80 may have input devices such as a keyboard and a mouse.
  • the computer 80 may have an output device such as a display.
  • the computer 80 may also have an input device, an output device, and an input/output device.
  • the computer 80 may also have various sensors (not shown). The type of sensor is not particularly limited.
  • the computer 80 may also have an imaging device capable of capturing images and videos.
  • each device may be realized by any combination of a different computer and program for each component.
  • multiple components that each device has may be realized by any combination of a single computer and program.
  • each device may be realized by circuits for a specific application. Further, some or all of the components of each device may be realized by general-purpose circuits including a processor such as an FPGA (Field Programmable Gate Array). Further, some or all of the components of each device may be realized by a combination of circuits for a specific application and general-purpose circuits. Further, these circuits may be a single integrated circuit. Alternatively, these circuits may be divided into multiple integrated circuits. The multiple integrated circuits may be configured by being connected via a bus or the like.
  • each device may be realized by multiple computers, circuits, etc.
  • the multiple computers, circuits, etc. may be centralized or distributed.
  • control method described in each embodiment is realized by being executed by a control system. Also, for example, the control method is realized by having a computer such as a server or a terminal device execute a program prepared in advance.
  • the programs described in each embodiment are recorded on a computer-readable recording medium such as a HDD, SSD, flexible disk, optical disk, magneto-optical disk, or USB memory.
  • the programs are then executed by a computer by reading them from the recording medium.
  • the programs may also be distributed via a communications network NT.
  • each component of the control system in each embodiment described above may have their functions realized by dedicated hardware, such as a computer.
  • each component may be realized by software.
  • each component may be realized by a combination of hardware and software.
  • (Appendix 1) A control system comprising: an acquisition means for acquiring data in which imaging device identification information of an imaging device attached to a moving object and display device identification information of a display device of a user are associated with each other; a detection means for detecting a direction in which the user wants to view; an imaging device control means for controlling an orientation of the imaging device based on the detected direction and a position of the user in a virtual reality space; and an output control means for displaying an image captured by the imaging device on the display device identified by the display device identification information associated with the imaging device identification information of the imaging device.
  • (Appendix 2) The control system according to Appendix 1, wherein the imaging device control means controls the orientation of the imaging device in accordance with the orientation of the seat of the user in the vehicle in the virtual reality space.
  • (Appendix 3) The control system according to Appendix 1 or 2, wherein the direction in which the user wants to view is the orientation of the user's face, the detection means detects the orientation of the face, and the imaging device control means controls the orientation of the imaging device based on the orientation of the face and the position of the user in the virtual reality space.
  • (Appendix 4) The control system according to Appendix 3, wherein the detection means detects the line of sight of the user, and the imaging device control means controls the orientation of the imaging device based on the line of sight and the position of the user in the virtual reality space.
  • (Appendix 5) The control system according to any one of Appendices 1 to 4, wherein the detection means detects the movement of the user, and the imaging device control means changes the imaging position of the imaging device by controlling a robot arm on which the imaging device is installed, based on the movement of the user.
  • (Appendix 6) The control system according to any one of Appendices 1 to 5, further comprising an image generating means for generating, from the image captured by the imaging device, an image corresponding to the direction in which the user wants to view, wherein the output control means causes the generated image to be displayed on the display device.
  • (Appendix 7) The control system according to Appendix 6, wherein, when an imaging device other than the imaging device is shown in the image captured by the imaging device, the image generating means generates an image in which the other imaging device is deleted from the image.
  • (Appendix 11) The control system according to Appendix 9, further comprising a reception means for receiving a selection of a graphic by the user from among the graphics, wherein the output control means prevents the selected graphic from being displayed.
  • (Appendix 12) The control system wherein the output control means causes a conversation occurring behind the user in the virtual reality space to be output as audio, and causes a conversation occurring in front of the user in the virtual reality space to be displayed graphically.
  • (Appendix 13) The control system further comprising a receiving means for receiving an output format for the voice of the other user, wherein the output control means outputs the voice of the other user in the received output format.
  • (Appendix 14) A control method comprising: acquiring data in which imaging device identification information of an imaging device attached to a moving object and display device identification information of a display device of a user are associated with each other; detecting the direction in which the user wants to view; and displaying the image captured by the imaging device on the display device identified by the display device identification information associated with the imaging device identification information of the imaging device, based on the detected direction and the position of the user in a virtual reality space.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This control system includes an acquisition unit, a detection unit, an imaging device control unit, and an output control unit. The acquisition unit acquires data in which imaging device identification information of an imaging device attached to a moving body is associated with display device identification information of a user's display device. The detection unit detects a direction the user wants to view. The imaging device control unit controls the orientation of the imaging device on the basis of the detected direction and the position of the user in a virtual reality space. The output control unit displays video captured by the imaging device on the display device identified by the display device identification information associated with the imaging device identification information of the imaging device.
PCT/JP2022/036066 2022-09-28 2022-09-28 Control system, control method, and recording medium WO2024069779A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/036066 WO2024069779A1 (fr) 2022-09-28 2022-09-28 Control system, control method, and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/036066 WO2024069779A1 (fr) 2022-09-28 2022-09-28 Control system, control method, and recording medium

Publications (1)

Publication Number Publication Date
WO2024069779A1 true WO2024069779A1 (fr) 2024-04-04

Family

ID=90476729

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/036066 WO2024069779A1 (fr) 2022-09-28 2022-09-28 Control system, control method, and recording medium

Country Status (1)

Country Link
WO (1) WO2024069779A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015143872A (ja) * 2010-10-01 2015-08-06 バルコ エヌ.ブイ. 湾曲型後面投射スクリーン
WO2018100800A1 (fr) * 2016-11-29 2018-06-07 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme informatique
WO2018123074A1 (fr) * 2016-12-27 2018-07-05 公立大学法人首都大学東京 Appareil photographique
JP2019036857A (ja) * 2017-08-16 2019-03-07 株式会社Dapリアライズ ライブ映像娯楽施設
WO2019150675A1 (fr) * 2018-02-02 2019-08-08 株式会社Nttドコモ Dispositif de traitement d'informations
WO2020049768A1 (fr) * 2018-09-07 2020-03-12 オムロン株式会社 Manipulateur et robot mobile
JP2020052846A (ja) * 2018-09-27 2020-04-02 パナソニックIpマネジメント株式会社 描画システム、描画方法、及びプログラム
WO2021044473A1 (fr) * 2019-09-02 2021-03-11 ヤマハ発動機株式会社 Dispositif de commande de bras de robot à articulations multiples et dispositif bras de robot à articulations multiples

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015143872A (ja) * 2010-10-01 2015-08-06 バルコ エヌ.ブイ. 湾曲型後面投射スクリーン
WO2018100800A1 (fr) * 2016-11-29 2018-06-07 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme informatique
WO2018123074A1 (fr) * 2016-12-27 2018-07-05 公立大学法人首都大学東京 Appareil photographique
JP2019036857A (ja) * 2017-08-16 2019-03-07 株式会社Dapリアライズ ライブ映像娯楽施設
WO2019150675A1 (fr) * 2018-02-02 2019-08-08 株式会社Nttドコモ Dispositif de traitement d'informations
WO2020049768A1 (fr) * 2018-09-07 2020-03-12 オムロン株式会社 Manipulateur et robot mobile
JP2020052846A (ja) * 2018-09-27 2020-04-02 パナソニックIpマネジメント株式会社 描画システム、描画方法、及びプログラム
WO2021044473A1 (fr) * 2019-09-02 2021-03-11 ヤマハ発動機株式会社 Dispositif de commande de bras de robot à articulations multiples et dispositif bras de robot à articulations multiples

Similar Documents

Publication Publication Date Title
US11127217B2 (en) Shared environment for a remote user and vehicle occupants
US11250636B2 (en) Information processing device, information processing method, and program
US11367260B2 (en) Video synthesis device, video synthesis method and recording medium
CN111373347B (zh) 用于虚拟现实内容的提供的装置、方法和计算机程序
JPWO2012053033A1 (ja) 3次元立体表示装置および3次元立体表示処理装置
US11625858B2 (en) Video synthesis device, video synthesis method and recording medium
US11361497B2 (en) Information processing device and information processing method
CN108027936B (zh) 用于在视频内容内呈现交互式元素的方法、系统和介质
EP3276982B1 (fr) Appareil de traitement d'informations, procédé de traitement d'informations et programme
WO2013121471A1 (fr) Dispositif de génération d'image
JP2020120336A (ja) プログラム、方法、および情報処理装置
WO2024069779A1 (fr) Système de commande, procédé de commande et support d'enregistrement
WO2012053032A1 (fr) Dispositif d'affichage 3d
WO2024105870A1 (fr) Système de commande, procédé de commande et support d'enregistrement
US20230179756A1 (en) Information processing device, information processing method, and program
JP7072706B1 (ja) 表示制御装置、表示制御方法および表示制御プログラム
US20240105052A1 (en) Information management device, information management method and storage medium
WO2024034350A1 (fr) Système de dialogue en ligne vidéo et programme
JP7123222B1 (ja) 表示制御装置、表示制御方法および表示制御プログラム
JP2023184519A (ja) 情報処理システム、情報処理方法およびコンピュータプログラム
JP2024038605A (ja) 情報処理システム
JP2023000858A (ja) 表示制御装置、表示制御方法および表示制御プログラム
JP2001313957A (ja) 像配信システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22960843

Country of ref document: EP

Kind code of ref document: A1