WO2023153088A1 - Information setting device - Google Patents
Information setting device
- Publication number
- WO2023153088A1 (PCT/JP2022/047065)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual object
- information
- terminal device
- unit
- processing device
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
Definitions
- the present invention relates to an information setting device.
- Patent Literature 1 discloses a technique for setting a mask area for restricting viewing of an image displayed on a user's display. When displaying an image in which a mask area is set on a display, the user pays attention to the non-mask area other than the mask area.
- Patent Document 2 discloses a technology related to a system that renders a three-dimensional virtual object in a three-dimensional virtual space.
- a mask area and a non-mask area are set in the virtual space visually recognized by the user.
- the conventional technology limits the user's viewing by dividing the virtual space into masked areas and non-masked areas, so it was not possible to restrict viewing of a part of the virtual object.
- an object of the present invention is to provide an information setting device capable of limiting the visibility of the virtual object itself.
- An information setting device comprises an acquisition unit that acquires a three-dimensional virtual object to be displayed in a virtual space, and a setting unit that sets, for the virtual object, viewing information indicating at least one of a viewable direction and an unviewable direction.
- the visibility of the virtual object itself can be restricted.
- FIG. 1 is a diagram showing the overall configuration of an information setting system 1.
- FIG. 2 is a block diagram showing a configuration example of a terminal device 10-K.
- FIG. 3 is an explanatory diagram showing an example of a method of setting viewing information.
- FIG. 4 is an explanatory diagram showing an example of a processing method for the virtual object VO.
- FIG. 5 is a block diagram showing a configuration example of a server 20.
- FIG. 6 is a sequence diagram showing a first operation of the information setting system 1.
- FIG. 7 is a sequence diagram showing a second operation of the information setting system 1.
- Explanatory diagram showing an example of a method of setting viewing information.
- Explanatory diagram showing an example of a processing method for the virtual object VO.
- Block diagram showing a configuration example of a server 20A.
- Table showing a configuration example of a prohibited image database ND.
- Table showing a configuration example of a viewing information database VD.
- Sequence diagram showing the first operation of the information setting system 1B.
- FIG. 11 is a sequence diagram showing the second operation of the information setting system 1B.
- First Embodiment
- an information setting system 1 including a terminal device 10-K as an information setting device according to a first embodiment of the present invention will be described with reference to FIGS. 1 to 7.
- FIG. 1 shows the overall configuration of an information setting system 1.
- the information setting system 1 includes terminal devices 10-1, 10-2, ..., 10-N and a server 20. N is an integer of 1 or more.
- J is an integer of 1 or more and N or less.
- K is 1 or more and N or less, and is an integer different from J.
- the terminal devices 10-1 to 10-N have the same configuration. However, terminal devices having different configurations may be included.
- the terminal devices 10-1 to 10-N and the server 20 are communicably connected to each other via the communication network NET.
- the user UJ uses the terminal device 10-J.
- the user UK uses the terminal device 10-K.
- the server 20 distributes various data and contents to the terminal devices 10-1 to 10-N via the communication network NET.
- the user U J uses the terminal device 10-J to create a three-dimensional virtual object.
- the terminal device 10 -J outputs the three-dimensional virtual object to the server 20 .
- the terminal that generates the three-dimensional virtual object and the terminal that outputs the three-dimensional virtual object to the server 20 may be separate entities.
- the server 20 outputs the three-dimensional virtual object to the terminal device 10-K used by the user UK .
- the user UK uses the terminal device 10-K to set the visibility of the three-dimensional virtual object.
- the server 20 distributes the three-dimensional virtual object whose viewability is set by the user UK to the terminal devices 10-1 to 10-N as the content described above. As will be described later, the server 20 may automatically set the viewability instead of the user UK setting the viewability using the terminal device 10-K.
- the user UJ may be the creator of the three-dimensional virtual object, and the user UK may be the administrator of the information setting system 1 .
- the user UK may be one of a plurality of users managing the information setting system 1 as a group.
- the account of user UJ may be an account linked as a management target to the account of user UK as an administrator.
- user UK may be the "parent" account and user UJ 's account may be the "child" account.
- the user UK may be a user who is not the administrator of the information setting system 1 but who sets the visibility in order to protect the content including the three-dimensional virtual object.
- the terminal device 10-J and the terminal device 10-K may be the same terminal device.
- the terminal devices 10-1 to 10-N cause the display 14, which will be described later, to display a virtual space including the three-dimensional virtual object.
- the virtual space is, for example, a celestial space.
- the terminal devices 10-1 to 10-N are, for example, PCs (Personal Computers), or portable terminal devices such as smartphones and tablets.
- the terminal device 10-J may be a dedicated scanner, for example.
- the display 14 may be separate from the terminal devices 10-1 to 10-N; examples include XR glasses, XR goggles, or an HMD (Head Mounted Display) employing XR technology.
- FIG. 2 is a block diagram showing a configuration example of the terminal device 10-K.
- the terminal device 10 -K includes a processing device 11 , a storage device 12 , a communication device 13 , a display 14 and an input device 15 .
- Each element of the terminal device 10-K is interconnected using one or more buses for communicating information.
- the processing device 11 is a processor that controls the entire terminal device 10-K. The processing device 11 is configured using, for example, a single chip or a plurality of chips, such as a central processing unit (CPU) including interfaces with peripheral devices, arithmetic units, registers, and the like. Some or all of the functions of the processing device 11 may be implemented using hardware such as a DSP, ASIC, PLD, or FPGA. The processing device 11 executes various processes in parallel or sequentially.
- the storage device 12 is a recording medium that can be read and written by the processing device 11.
- the storage device 12 also stores a plurality of programs including the control program PR1 executed by the processing device 11 . Further, the storage device 12 further stores image information indicating an image displayed on the display 14 . Among other things, the storage device 12 stores image information used when generating the virtual object VO.
- the communication device 13 is hardware as a transmission/reception device for communicating with other devices.
- the communication device 13 is also called a network device, a network controller, a network card, a communication module, or the like, for example.
- the communication device 13 may include a connector for wired connection and an interface circuit corresponding to the connector. Further, the communication device 13 may have a wireless communication interface. Products conforming to wired LAN, IEEE1394, and USB are examples of connectors and interface circuits for wired connection. Also, as a wireless communication interface, there are products conforming to wireless LAN, Bluetooth (registered trademark), and the like.
- the display 14 is a device that displays images and character information.
- the display 14 displays various images under the control of the processing device 11 .
- various display panels such as a liquid crystal display panel and an organic EL (Electro Luminescence) display panel are preferably used as the display 14 .
- the display 14 may be separate from the terminal device 10-K.
- the display 14 may be XR glasses, XR goggles, or an HMD employing XR technology, which is connected to the terminal device 10-K and can communicate with the terminal device 10-K.
- when the display 14 is XR glasses, XR goggles, or an HMD employing XR technology connected to the terminal device 10-K, the user UK wears the display 14 on the head. The user UK visually recognizes the virtual space through the display 14 worn on the head. The same is true for user UJ .
- the input device 15 receives an operation from the user UK .
- the input device 15 includes a pointing device such as a keyboard, touch pad, touch panel, or mouse.
- the input device 15 may also serve as the display 14 .
- the processing device 11 reads the control program PR1 from the storage device 12 and executes it. As a result, the processing device 11 functions as an acquisition unit 111 , a generation unit 112 , a display control unit 113 , a setting unit 114 , an appearance processing unit 115 and an output unit 116 .
- the acquisition unit 111 acquires image information representing a virtual object from the server 20 via the communication device 13 .
- the acquisition unit 111 acquires from the server 20 image information representing a three-dimensional virtual object generated by the user UJ using the terminal device 10-J.
- the acquisition unit 111 acquires operation information indicating the content of the operation on the input device 15 by the user UK .
- the acquiring unit 111 acquires the operation information indicating the content of the operation on the input device 15 by the user UJ in the terminal device 10-J instead of the terminal device 10-K.
- in the terminal device 10-J, not the terminal device 10-K, the generation unit 112 generates a virtual object to be displayed in the virtual space, using the image information stored in the storage device 12, based on the operation information acquired by the acquisition unit 111 and indicating the content of the operation of the user UJ on the input device 15.
- the display control unit 113 causes the display 14 to display the virtual object using the image information acquired by the acquisition unit 111 .
- the display control unit 113 causes the display 14 to display a three-dimensional virtual object generated by the user UJ using the terminal device 10-J.
- the setting unit 114 sets, for the three-dimensional virtual object acquired by the acquisition unit 111, viewing information indicating at least one of a viewable direction and an unviewable direction. More specifically, the setting unit 114 sets the viewing information based on the operation information acquired by the acquisition unit 111.
- FIG. 3 is an explanatory diagram showing an example of a method of setting browsing information.
- the display 14 displays the three-dimensional virtual object VO generated by the user UJ using the terminal device 10-J.
- three-dimensional polar coordinates expressed by a distance r from the origin O and two angle parameters ⁇ and ⁇ are set.
- the virtual object VO is a doll and is composed of a head P1 and a body P2.
- a sub-object S including the virtual object VO is displayed, centering on the point included in the virtual object VO.
- a sub-object S is a virtual transparent object.
- the shape of the sub-object S containing the virtual object VO is not limited to a sphere.
- the sub-object S may be a shape other than a sphere, such as a cube, cylinder, or the like.
- the center of gravity of the virtual object VO is the origin O, and a transparent sub-object S centered on the origin O is superimposed on the virtual object VO.
- the origin O is the center of gravity of the virtual object VO is merely an example.
- the origin O may be any point as long as it is included in the virtual object VO.
- the user UK operates the input device 15 to set, for the sub-object S, at least one of the direction in which the virtual object VO can be viewed and the direction in which it cannot be viewed.
- the user UK operates the input device 15 to set, as the viewable direction or the unviewable direction, vectors orthogonal to the surface of the sub-object S and passing through the origin O, that is, normal vectors V1 to V3 to the surface of the sub-object S.
- the setting unit 114 sets these normal vectors expressed using three-dimensional polar coordinates (r, ⁇ , ⁇ ) as viewing information for the virtual object VO.
- the set browsing information is stored in the storage device 12 in a state of being associated with the image information indicating the virtual object VO.
- the normal vectors V1 to V3 are vectors that set the viewable direction and the unviewable direction with respect to the virtual object VO, so the value of "r" in the polar coordinates may be any value.
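As a concrete sketch, such viewing information can be held as unit direction vectors computed from the polar angles (θ, φ), with r ignored. The following Python snippet is illustrative only; the function name and data layout are assumptions, not part of the disclosure.

```python
import math

def polar_to_unit_vector(theta, phi):
    """Convert polar angles (theta measured from the +z axis, phi around z)
    to a unit Cartesian direction vector; r is irrelevant because the
    viewing information only encodes a direction."""
    return (
        math.sin(theta) * math.cos(phi),
        math.sin(theta) * math.sin(phi),
        math.cos(theta),
    )

# Hypothetical viewing information for one virtual object VO: each entry is
# a normal vector of the sub-object S, flagged as viewable or not.
viewing_info = [
    {"direction": polar_to_unit_vector(math.pi / 2, 0.0), "viewable": True},
    {"direction": polar_to_unit_vector(math.pi, 0.0), "viewable": False},  # bottom
]
```

Storing directions rather than (r, θ, φ) triples makes the "r may be any value" point explicit: two entries that differ only in r map to the same vector.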
- the terminal device 10-K can use the normal vectors to the surface of the sub-object S to set viewing information indicating at least one of the viewable direction and the unviewable direction of the virtual object VO. As a result, the terminal device 10-K can manage the viewing information with the sub-object S as one unit.
- the user UK may set viewing information based on the shape of part or the whole of the virtual object VO.
- the user UK may set the viewing information based on at least one of the surface condition and surface appearance of a portion or the entirety of the virtual object VO.
- the user UK can set viewing information for the virtual object VO according to the attributes of the viewers who view the virtual object VO using the terminal devices 10-1 to 10-N.
- the user UK may set viewing information for the virtual object VO depending on whether or not the attribute of the viewer is the administrator of the information setting system 1 .
- the user UK may set viewing information for the virtual object VO according to the viewer's age, sex, or belonging to a specific organization.
- the display 14 displays as many virtual objects VO as the number of attribute types of the viewer.
- the user UK sets viewing information for the sub-objects S superimposed on each of the virtual objects VO displayed for the number of attribute types.
- the setting unit 114 sets the viewing information according to the attribute of the viewer viewing the virtual object VO based on the user UK 's operation on the input device 15 .
- the terminal device 10-K can switch the viewing information indicating at least one of the viewable direction and the unviewable direction depending on, for example, the viewer's age.
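A minimal way to model attribute-dependent viewing information is a lookup table keyed by viewer attribute, consulted at display time. The attribute names and the fallback policy below are illustrative assumptions:

```python
# Hypothetical per-attribute viewing information: the setting unit keeps one
# entry per viewer attribute; directions are unit vectors from the origin O.
viewing_info_by_attribute = {
    "administrator": {"prohibited_directions": []},            # no restriction
    "adult":         {"prohibited_directions": [(0.0, 0.0, -1.0)]},
    "minor":         {"prohibited_directions": [(0.0, 0.0, -1.0),
                                                (0.0, 0.0, 1.0)]},
}

def viewing_info_for(attribute):
    """Look up the viewing information for a viewer attribute, falling back
    to the most restrictive entry for unknown attributes."""
    return viewing_info_by_attribute.get(attribute,
                                         viewing_info_by_attribute["minor"])
```

Defaulting to the most restrictive entry is one conservative design choice; a real system might instead reject unknown attributes outright.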
- the virtual object VO after the viewing information is set may overwrite the virtual object VO before the viewing information is set.
- the virtual object VO after the browse information is set may be newly created, and the virtual object VO before the browse information is set may be stored in the storage device 12 of the terminal device 10-K.
- the virtual object VO for which the browsing information has been set is output to the server 20, and then output from the server 20 to the terminal devices 10-1 to 10-N including the terminal device 10-J.
- the display control unit 113 provided in the terminal devices 10-1 to 10-N causes the display 14 to display the virtual object VO.
- the viewer can rotate the virtual object VO using the input device 15, for example.
- the viewer cannot rotate the virtual object VO to a posture in which the viewer views the virtual object VO from a viewing-disabled direction.
- an operation control unit may execute a process that makes it impossible for the viewer to view the virtual object VO from the viewing-prohibited direction in the virtual space.
- an operation control unit may execute a process in which the viewer is physically prevented from moving in the virtual space in the viewing prohibited direction set for the virtual object VO.
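One plausible way to enforce such a restriction is to compare the viewer's current direction (from the origin O of the sub-object S toward the viewer) against each viewing-prohibited normal vector and block poses that come too close. The angular threshold and the function itself are assumptions for illustration:

```python
import math

def is_view_allowed(view_dir, prohibited_normals, threshold_deg=30.0):
    """Return False when the direction from the origin O toward the viewer
    is within threshold_deg of any viewing-prohibited normal vector of the
    sub-object S; the display side can then refuse the rotation or move."""
    cos_threshold = math.cos(math.radians(threshold_deg))
    vx, vy, vz = view_dir
    norm = math.sqrt(vx * vx + vy * vy + vz * vz)
    vx, vy, vz = vx / norm, vy / norm, vz / norm
    for nx, ny, nz in prohibited_normals:
        # dot product of unit vectors = cosine of the angle between them
        if vx * nx + vy * ny + vz * nz > cos_threshold:
            return False  # too close to a prohibited direction
    return True
```

The operation control unit would call this check before applying a requested rotation or movement, discarding any input that would produce a disallowed pose.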
- the appearance processing unit 115 may perform processing to hide the appearance.
- the terminal device 10-K can limit the viewing ability of the virtual object VO itself.
- the terminal device 10-K can disable viewing only the bottom surface of the virtual object VO, for example.
- since the terminal device 10-K sets the viewing information for the virtual object VO itself, there is no need for complicated processing to change the viewer's field of view each time in accordance with the rotation of the virtual object VO.
- the appearance processing unit 115 conceals the appearance of at least a part of the area of the three-dimensional virtual object VO that is displayed when the three-dimensional virtual object VO is viewed from an unviewable direction.
- the appearance processing unit 115 also generates processing information indicating an image of the concealed area after concealment.
- FIG. 4 is an explanatory diagram showing an example of a processing method for the virtual object VO.
- the diagram of the virtual object VO shown in FIG. 4 is the virtual object VO as viewed from the direction of the normal vector V3 in FIG. 3. In FIG. 3, the normal vector V3 indicates the viewing-prohibited direction with respect to the virtual object VO.
- the user UK does not want the viewer to view the bottom surface of the torso P2 of the virtual object VO.
- the user UK uses the input device 15 to perform an operation of filling in the torso P2 of the virtual object VO shown in FIG. 4.
- the appearance processing unit 115 performs processing for concealing the area of the body P2 when viewed from the direction of the normal vector V3.
- the user UK may perform an operation of filling in not only the area of the torso P2 but also the area of the head P1.
- the appearance processing unit 115 performs processing to hide both the head P1 and the body P2 regions when viewed from the direction of the normal vector V3 in the appearance of the virtual object VO.
- the processing information indicating the image of the area after concealment is stored in the storage device 12 while being associated with the image information indicating the virtual object VO.
- the terminal device 10-K can make, for example, only the bottom surface of the virtual object VO unviewable.
- the terminal device 10-K does not need to change the masked area in the viewer's field of view each time depending on, for example, at which position in the viewer's field of view the area of the virtual object VO that should not be seen is displayed.
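The fill-in operation of the appearance processing unit 115 can be sketched as masking pixels of the rendered view from the prohibited direction. The image representation (rows of RGB tuples) and the function name are illustrative assumptions:

```python
def conceal_region(image, region, fill=(0, 0, 0)):
    """Return a new image with the given pixels filled in, mimicking the
    appearance processing unit's fill-in operation.

    image:  list of rows, each row a list of (R, G, B) tuples
    region: set of (row, col) indices to conceal
    fill:   color used to hide the region
    """
    return [
        [fill if (r, c) in region else px for c, px in enumerate(row)]
        for r, row in enumerate(image)
    ]
```

Because a fresh image is returned, the original (pre-concealment) virtual object VO can still be kept in the storage device 12, matching the option described above.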
- the output unit 116 outputs the browsing information and the processing information stored in the storage device 12 to the server 20 .
- the server 20 generates a new virtual object VO based on at least one of the image information acquired from the terminal device 10-J and the viewing information and processing information acquired from the terminal device 10-K.
- the acquisition unit 111 acquires image information indicating the new virtual object VO from the server 20.
- the output unit 116 of the terminal device 10-J, not the terminal device 10-K, outputs image information representing the virtual object VO generated by the generation unit 112 to the server 20.
- FIG. 5 is a block diagram showing a configuration example of the server 20.
- the server 20 comprises a processing device 21 , a storage device 22 , a communication device 23 , a display 24 and an input device 25 .
- the elements of server 20 are interconnected using one or more buses for communicating information.
- the processing device 21 is a processor that controls the server 20 as a whole. The processing device 21 is configured using, for example, a single chip or a plurality of chips, such as a central processing unit (CPU) including interfaces with peripheral devices, arithmetic units, registers, and the like. Some or all of the functions of the processing device 21 may be implemented using hardware such as a DSP, ASIC, PLD, or FPGA. The processing device 21 executes various processes in parallel or sequentially.
- the storage device 22 is a recording medium that can be read and written by the processing device 21 .
- the storage device 22 also stores a plurality of programs including the control program PR2 executed by the processing device 21 .
- the communication device 23 is hardware as a transmission/reception device for communicating with other devices.
- the communication device 23 is also called a network device, a network controller, a network card, a communication module, or the like, for example.
- the communication device 23 may include a connector for wired connection and an interface circuit corresponding to the connector. Further, the communication device 23 may have a wireless communication interface. Products conforming to wired LAN, IEEE1394, and USB are examples of connectors and interface circuits for wired connection. Also, as a wireless communication interface, there are products conforming to wireless LAN, Bluetooth (registered trademark), and the like.
- the display 24 is a device that displays images and character information.
- the display 24 displays various images under the control of the processing device 21 .
- various display panels such as a liquid crystal display panel and an organic EL display panel are preferably used as the display 24 .
- the input device 25 is a device that receives operations from the administrator of the information setting system 1.
- the input device 25 includes a pointing device such as a keyboard, touch pad, touch panel, or mouse.
- the input device 25 may also serve as the display 24 .
- the processing device 21 reads the control program PR2 from the storage device 22 and executes it. As a result, the processing device 21 functions as an acquisition unit 211 , a generation unit 212 and an output unit 213 .
- the acquisition unit 211 acquires image information indicating the virtual object VO generated by the user UJ from the terminal device 10-J via the communication device 23.
- the acquisition unit 211 also acquires viewing information and processing information from the terminal device 10-K via the communication device 23.
- the generating unit 212 generates a new virtual object VO, different from the virtual object VO generated by the user UJ, using the image information obtained by the obtaining unit 211 from the terminal device 10-J and at least one of the viewing information and the processing information obtained from the terminal device 10-K.
- the generation unit 212 generates a new virtual object VO that has an appearance based on the image information acquired from the terminal device 10-J and for which at least one of a viewable direction and an unviewable direction is set by the viewing information.
- the generation unit 212 generates a new virtual object VO in which at least a part of the virtual object VO having an appearance based on the image information acquired from the terminal device 10-J is hidden by the image indicated by the processing information.
- the generating unit 212 may first generate a provisional virtual object VO that has an appearance based on the image information acquired from the terminal device 10-J and for which at least one of the viewable direction and the unviewable direction is set by the viewing information. After that, the generating unit 212 generates a new virtual object VO in which at least a part of the area of the provisional virtual object VO is hidden by the image indicated by the processing information.
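The two-step flow of the generation unit 212 — apply the viewing information to obtain a provisional object, then apply the processing information — can be sketched as follows. The `VirtualObject` container and its field names are assumptions for illustration, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VirtualObject:
    image_info: str                         # placeholder for mesh/texture data
    viewing_info: Optional[dict] = None     # viewable/unviewable directions
    processing_info: Optional[dict] = None  # concealment images

def generate_new_vo(image_info, viewing_info=None, processing_info=None):
    """Mimic generation unit 212: first build a provisional VO with the
    viewing information applied, then a new VO with areas concealed."""
    provisional = VirtualObject(image_info, viewing_info=viewing_info)
    if processing_info is None:
        return provisional
    return VirtualObject(provisional.image_info,
                         viewing_info=provisional.viewing_info,
                         processing_info=processing_info)
```

Keeping the provisional object distinct from the final one mirrors the text: the viewing information and the processing information can each be applied independently or together.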
- the new virtual object VO is the virtual object VO generated by the user UJ for which the user UK has set at least one of viewing information and processing information.
- the output unit 213 outputs the image information indicating the virtual object VO generated by the user UJ , which is acquired from the terminal device 10-J via the communication device 23, to the terminal device 10-K.
- the output unit 213 also outputs the image information indicating the new virtual object VO generated by the generation unit 212 to the terminal devices 10-1 to 10-N including the terminal device 10-J via the communication device 23. .
- the output unit 213 may output the viewing information to the terminal device 10-J. In this case, the acquisition unit 211 acquires the processing information from the terminal device 10-J via the communication device 23.
- FIG. 6 is a sequence diagram showing the first operation of the information setting system 1 according to the first embodiment.
- a first operation is an operation in which the terminal device 10-K sets both viewing information and processing information for the virtual object VO.
- a first operation of the information setting system 1 will be described below with reference to FIG.
- in step S1, the processing device 11 provided in the terminal device 10-J functions as the generation unit 112.
- the processing device 11 generates a three-dimensional virtual object VO to be displayed in the virtual space based on the operation information indicating the content of the operation on the input device 15 by the user UJ.
- in step S2, the processing device 11 provided in the terminal device 10-J functions as the output unit 116.
- the processing device 11 outputs image information representing the virtual object VO generated in step S1 to the server 20.
- the processing device 21 provided in the server 20 functions as an acquisition unit 211 .
- the processing device 21 acquires image information indicating the virtual object VO generated by the terminal device 10-J from the terminal device 10-J.
- in step S3, the processing device 21 provided in the server 20 functions as the output unit 213.
- the processing device 21 outputs the image information indicating the virtual object VO generated by the terminal device 10-J, which is acquired in step S2, to the terminal device 10-K.
- the processing device 11 provided in the terminal device 10 -K functions as an acquisition unit 111 .
- the processing device 11 acquires from the server 20 image information indicating the virtual object VO generated by the terminal device 10-J.
- in step S4, the processing device 11 provided in the terminal device 10-K functions as the display control unit 113.
- the processing device 11 displays the virtual object VO on the display 14 using the image information acquired in step S3.
- in step S5, the processing device 11 provided in the terminal device 10-K functions as the setting unit 114.
- the processing device 11 sets viewing information indicating at least one of a viewable direction and an unviewable direction with respect to the virtual object VO indicated by the image information acquired in step S3.
- in step S6, the processing device 11 provided in the terminal device 10-K functions as the output unit 116.
- the processing device 11 outputs the browsing information set in step S5 to the server 20 .
- the processing device 21 provided in the server 20 functions as an acquisition unit 211 .
- the processing device 21 acquires browsing information from the terminal device 10-K.
- in step S7, the processing device 11 provided in the terminal device 10-K functions as the appearance processing unit 115.
- the processing device 11 hides at least a part of the area of the virtual object VO that is displayed when the virtual object VO is browsed from an unviewable direction.
- the processing device 11 generates processing information indicating an image of the concealed area after concealment.
- in step S8, the processing device 11 provided in the terminal device 10-K functions as the output unit 116.
- the processing device 11 outputs the processing information generated in step S7 to the server 20 .
- the processing device 21 provided in the server 20 functions as an acquisition unit 211 .
- the processing device 21 acquires processing information from the terminal device 10-K.
- in step S9, the processing device 21 provided in the server 20 functions as the generation unit 212.
- the processing device 21 generates a new virtual object VO that is different from the virtual object VO generated by the terminal device 10-J, using at least one of the image information acquired in step S2, the viewing information acquired in step S6, and the processing information acquired in step S8.
- in step S10, the processing device 21 provided in the server 20 functions as the output unit 213.
- the processing device 21 outputs the image information representing the new virtual object VO generated by the generation unit 212 to the terminal devices 10-1 to 10-N including the terminal device 10-J.
- the processing device 11 provided in the terminal device 10 -J functions as an acquisition unit 111 .
- the processing device 11 acquires image information indicating the new virtual object VO from the server 20 .
- in step S11, the processing device 11 provided in the terminal device 10-J functions as the display control unit 113.
- the processing device 11 causes the display 14 to display a new virtual object VO using the image information acquired in step S10.
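The message flow of the first operation (steps S1 through S11) can be summarized as a simple relay simulation; the step labels and payload strings below are illustrative only:

```python
# Trace of the first operation: 10-J creates the VO, the server relays it to
# 10-K, 10-K returns viewing and processing information, and the server
# builds and redistributes the new VO.
log = []

def send(label, payload):
    """Record one message of the sequence diagram and pass the payload on."""
    log.append((label, payload))
    return payload

image_info = send("S1-S2: 10-J -> server", "image info of VO")
send("S3: server -> 10-K", image_info)
viewing_info = send("S5-S6: 10-K -> server", "viewing info")
processing_info = send("S7-S8: 10-K -> server", "processing info")
new_vo = send("S9-S10: server -> 10-1..10-N",
              (image_info, viewing_info, processing_info))
```

The local display steps (S4, S7 processing, S11) are omitted; only the five network messages of the sequence diagram are traced.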
- FIG. 7 is a sequence diagram showing the second operation of the information setting system 1 according to the first embodiment.
- the second operation is an operation in which the terminal device 10-K sets viewing information for the virtual object VO, and the terminal device 10-J sets processing information for the virtual object VO for which the viewing information has been set.
- the second operation of the information setting system 1 will be described below with reference to FIG.
- in step S21, the processing device 11 provided in the terminal device 10-J functions as the generation unit 112.
- the processing device 11 generates a three-dimensional virtual object VO to be displayed in the virtual space based on the operation information indicating the content of the operation on the input device 15 by the user UJ.
- in step S22, the processing device 11 provided in the terminal device 10-J functions as the output unit 116.
- the processing device 11 outputs image information indicating the virtual object VO generated in step S21 to the server 20 .
- the processing device 21 provided in the server 20 functions as an acquisition unit 211 .
- the processing device 21 acquires image information indicating the virtual object VO generated by the terminal device 10-J from the terminal device 10-J.
- in step S23, the processing device 21 provided in the server 20 functions as the output unit 213.
- the processing device 21 outputs the image information indicating the virtual object VO generated by the terminal device 10-J, which is acquired in step S22, to the terminal device 10-K.
- the processing device 11 provided in the terminal device 10 -K functions as an acquisition unit 111 .
- the processing device 11 acquires from the server 20 image information indicating the virtual object VO generated by the terminal device 10-J.
- in step S24, the processing device 11 provided in the terminal device 10-K functions as the display control unit 113.
- the processing device 11 displays the virtual object VO on the display 14 using the image information acquired in step S23.
- in step S25, the processing device 11 provided in the terminal device 10-K functions as the setting unit 114.
- the processing device 11 sets viewing information indicating at least one of a viewable direction and an unviewable direction for the virtual object VO indicated by the image information acquired in step S23.
- in step S26, the processing device 11 provided in the terminal device 10-K functions as the output unit 116.
- the processing device 11 outputs the viewing information set in step S25 to the server 20 .
- the processing device 21 provided in the server 20 functions as an acquisition unit 211 .
- the processing device 21 acquires the viewing information from the terminal device 10-K.
- in step S27, the processing device 21 provided in the server 20 functions as the output unit 213.
- the processing device 21 outputs the viewing information acquired in step S26 to the terminal device 10-J.
- the processing device 11 provided in the terminal device 10 -J functions as an acquisition unit 111 .
- the processing device 11 acquires the viewing information from the server 20.
- in step S28, the processing device 11 provided in the terminal device 10-J functions as the display control unit 113.
- the processing device 11 uses the image information representing the virtual object VO generated in step S21 and the viewing information acquired in step S27 to cause the display 14 to display the virtual object VO to which the viewing information is set.
- in step S29, the processing device 11 provided in the terminal device 10-J functions as the appearance processing unit 115.
- the processing device 11 hides at least a part of the area of the virtual object VO that is displayed when the virtual object VO is browsed from an unviewable direction.
- the processing device 11 generates processing information indicating an image of the concealed area after concealment.
- in step S30, the processing device 11 provided in the terminal device 10-J functions as the output unit 116.
- the processing device 11 outputs the processing information generated in step S29 to the server 20 .
- the processing device 21 provided in the server 20 functions as an acquisition unit 211 .
- the processing device 21 acquires processing information from the terminal device 10-J.
- in step S31, the processing device 21 provided in the server 20 functions as the generation unit 212.
- the processing device 21 uses at least one of the image information acquired in step S22, the viewing information acquired in step S26, and the processing information acquired in step S30 to generate a new virtual object VO that is different from the virtual object VO generated by the terminal device 10-J.
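As an illustration of how the generation unit 212 might combine these inputs, the sketch below uses a hypothetical dict-of-areas representation of the virtual object; the patent does not specify the data format, and all names here are assumptions.

```python
def generate_new_vo(image_info, viewing_info=None, processing_info=None):
    """Sketch of the generation unit 212: start from the submitted
    virtual object and apply whichever optional inputs are available."""
    vo = dict(image_info)  # hypothetical: area ID -> geometry data
    if processing_info:
        # drop areas already concealed by the appearance processing unit
        for area in processing_info.get("deleted_areas", []):
            vo.pop(area, None)
    if viewing_info:
        # keep only areas whose viewing information permits viewing
        vo = {a: g for a, g in vo.items() if viewing_info.get(a, True)}
    return vo
```

Because each argument is optional, the same helper covers the cases where only the viewing information, only the processing information, or both are supplied ("at least one of" in the text above).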
- in step S32, the processing device 21 provided in the server 20 functions as the output unit 213.
- the processing device 21 outputs the image information representing the new virtual object VO generated by the generation unit 212 to the terminal devices 10-1 to 10-N including the terminal device 10-K.
- in step S33, the processing device 11 provided in the terminal device 10-K functions as the display control unit 113.
- the processing device 11 causes the display 14 to display a new virtual object VO using the image information acquired in step S32.
- the terminal device 10 -K as an information setting device includes the acquisition unit 111 and the setting unit 114 .
- the acquisition unit 111 acquires a three-dimensional virtual object VO to be displayed in the virtual space.
- the setting unit 114 sets viewing information indicating at least one of a viewable direction and an unviewable direction for the three-dimensional virtual object VO.
- since the terminal device 10-K has the above configuration, it is possible to limit the viewability of the virtual object VO itself.
- for example, the terminal device 10-K can make only the bottom surface of the virtual object VO unviewable.
- in particular, in the method of restricting the viewer's field of view, when the virtual object VO rotates, an area of the virtual object VO that should not be shown to the viewer may come into the viewer's field of view.
- because the terminal device 10-K sets the viewing information for the virtual object VO itself, even when the virtual object VO rotates in the virtual space, the viewer cannot view the virtual object VO from an unviewable direction, unlike when the viewer's field of view is simply blocked.
- in the method of masking the viewer's field of view, the mask area had to be changed each time according to the position in the viewer's field of view at which the area of the virtual object VO that should not be shown is displayed.
- because the terminal device 10-K sets the viewing information in the virtual object VO itself, such complicated processing of changing the viewer's field of view each time in accordance with the rotation of the virtual object VO is unnecessary.
- the setting unit 114 sets the above viewing information for an object S that includes the three-dimensional virtual object VO.
- since the terminal device 10-K has the above configuration, the normal vector to the surface of the object S can be used in setting the viewing information indicating at least one of the viewable direction and the unviewable direction for the virtual object VO itself. As a result, the terminal device 10-K can manage the viewing information with the object S as one unit.
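A minimal sketch of such a direction test, assuming the unviewable directions are stored as vectors on the object S and that a viewing direction within some angular tolerance of one of them is rejected (the tolerance and the exact criterion are not specified in the text, so both are assumptions):

```python
import math

def is_viewable(view_dir, unviewable_dirs, tol_deg=45.0):
    """Reject a viewing direction that lies within tol_deg of any
    direction registered as unviewable (hypothetical criterion)."""
    def unit(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)
    v = unit(view_dir)
    for d in map(unit, unviewable_dirs):
        cos_a = max(-1.0, min(1.0, sum(a * b for a, b in zip(v, d))))
        if math.degrees(math.acos(cos_a)) <= tol_deg:
            return False
    return True

# e.g. prohibit viewing the bottom surface: looking along -y is rejected
bottom = (0.0, -1.0, 0.0)
```

For example, `is_viewable((0.0, -1.0, 0.0), [bottom])` returns `False`, while a view from the side or from above is allowed.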
- the terminal device 10 -K further includes the appearance processing unit 115 .
- the appearance processing unit 115 performs, on at least a part of the area of the three-dimensional virtual object VO that is displayed when the three-dimensional virtual object VO is viewed from an unviewable direction, a process of concealing the appearance of the three-dimensional virtual object VO.
- since the terminal device 10-K has the above configuration, it is not necessary, in order to restrict the viewability of the three-dimensional virtual object VO, to change the mask area in the viewer's field of view each time according to where in the viewer's field of view the area that should not be shown is displayed.
- the setting unit 114 sets the viewing information according to the attribute of the viewer who views the three-dimensional virtual object VO.
- since the terminal device 10-K has the above configuration, it is possible to switch the viewing information indicating at least one of the viewable direction and the unviewable direction according to, for example, the age of the viewer. For example, the terminal device 10-K can increase the number of unviewable directions when the viewer is relatively young and decrease the number of unviewable directions when the viewer is relatively old.
- 2: Second Embodiment
- the configuration of an information setting system 1A including terminal devices 10A-K as information setting devices according to a second embodiment of the present invention will be described with reference to FIGS. 8 and 9.
- for the purpose of simplifying the description, among the components provided in the information setting system 1A according to the second embodiment, the same components as those of the information setting system 1 according to the first embodiment are denoted by the same reference signs, and their description is omitted.
- an information setting system 1A according to the second embodiment of the present invention differs from the information setting system 1 according to the first embodiment in that terminal devices 10A-K are provided instead of the terminal devices 10-K. Otherwise, the overall configuration of the information setting system 1A is the same as the overall configuration of the information setting system 1 according to the first embodiment shown in FIG. 1, so illustration and description thereof are omitted.
- the terminal device 10A-K includes a processing device 11A instead of the processing device 11 and a storage device 12A instead of the storage device 12.
- the processing device 11A includes a setting unit 114A instead of the setting unit 114 provided in the processing device 11, and an appearance processing unit 115A instead of the appearance processing unit 115.
- the storage device 12A stores a control program PR1A instead of the control program PR1 stored in the storage device 12.
- the configuration of the terminal device 10A-K is the same as the configuration of the terminal device 10-K according to the first embodiment shown in FIG. 2, so illustration and description thereof will be omitted.
- the setting unit 114A sets viewing information indicating whether or not viewing is possible for each of the plurality of areas that form the surface of the three-dimensional virtual object VO.
- FIG. 8 is an explanatory diagram showing an example of a method for setting browsing information.
- the display 14 displays a three-dimensional virtual object VO generated by the user UJ using the terminal device 10-J.
- the surface of the virtual object VO is divided into areas A1 to A10. More specifically, the surface of the head P1 included in the virtual object VO is divided into areas A1 to A4, and the surface of the body portion P2 included in the virtual object VO is divided into areas A5 to A10.
- the user UK operates the input device 25 to set permission/prohibition of browsing for each of the areas A1 to A10. Specifically, the user UK operates the input device 25 to designate any one of the areas A1 to A10. After that, the user UK operates the input device 25 to set viewing information indicating whether viewing is permitted or prohibited for the designated area.
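The per-area setting described above can be pictured as a simple lookup table. The area IDs A1 to A10 follow FIG. 8; everything else in this sketch (the table layout, the helper name) is a hypothetical illustration.

```python
# viewing information: area ID -> True (viewable) / False (unviewable)
viewing_info = {f"A{i}": True for i in range(1, 11)}

def set_viewable(area_id, viewable):
    """Record the permission the user chose for a designated area."""
    if area_id not in viewing_info:
        raise KeyError(f"unknown area: {area_id}")
    viewing_info[area_id] = viewable

# the user designates areas one by one and prohibits viewing them
for area in ("A2", "A4", "A6", "A8", "A10"):
    set_viewable(area, False)
```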
- the surface of the virtual object VO is divided into areas A1 to A10.
- the processing device 11A or the server 20 may automatically divide the surface of the virtual object VO.
- the user UJ or the user UK may operate the input device 25 to manually divide the surface of the virtual object VO.
- the appearance processing unit 115A performs a process of deleting, from the three-dimensional virtual object VO, at least the areas set by the setting unit 114A to be unviewable. The appearance processing unit 115A also generates processing information indicating the deleted areas.
- FIG. 9 is an explanatory diagram showing an example of a processing method for the virtual object VO.
- the setting unit 114A has set the areas A1, A3, A5, A7, and A9 to be viewable among the plurality of areas forming the surface of the virtual object VO.
- the setting unit 114A sets areas A2, A4, A6, A8, and A10 out of the plurality of areas forming the surface of the virtual object VO to be unviewable.
- the appearance processing unit 115A deletes at least areas A2, A4, A6, A8, and A10 from the virtual object VO, as shown in FIG.
- alternatively, the appearance processing unit 115A may delete only the areas A2, A4, A6, A8, and A10 from the virtual object VO.
- alternatively, the appearance processing unit 115A may leave only an outer shell of the three-dimensional virtual object VO that has the areas A1, A3, A5, A7, and A9 as its surface and has a predetermined thickness, and delete all other parts from the virtual object VO.
- because the appearance processing unit 115A deletes at least the areas set to be unviewable from the virtual object VO, the amount of data of the virtual object VO after the deletion is reduced. As a result, the processing load is reduced when the server 20 distributes the virtual object VO to the terminal devices 10-1 to 10-N.
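The deletion step can be sketched as filtering the unviewable areas out of the object's area list; the dict-of-areas mesh representation below is hypothetical.

```python
def delete_unviewable(mesh, viewing_info):
    """Return a copy of the mesh that keeps only viewable areas,
    as the appearance processing unit 115A does."""
    return {area: data for area, data in mesh.items()
            if viewing_info.get(area, False)}

mesh = {f"A{i}": [(float(i), 0.0, 0.0)] for i in range(1, 11)}
viewing_info = {f"A{i}": (i % 2 == 1) for i in range(1, 11)}  # odd areas viewable
processed = delete_unviewable(mesh, viewing_info)
```

The processed object carries half the areas of the original, which is why the distribution load on the server 20 drops.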
- the first operation of the information setting system 1A according to the second embodiment is basically the same as the first operation of the information setting system 1 according to the first embodiment, so illustration and detailed description thereof are omitted.
- however, in step S5, the setting unit 114A sets viewing information indicating whether or not viewing is permitted for each of the plurality of areas forming the surface of the virtual object VO; this point differs from the first operation of the information setting system 1 according to the first embodiment.
- likewise, in step S7, the appearance processing unit 115A performs a process of deleting at least the areas set to be unviewable from the virtual object VO; this point also differs from the first operation of the information setting system 1 according to the first embodiment.
- the second operation of the information setting system 1A according to the second embodiment is basically the same as the second operation of the information setting system 1 according to the first embodiment shown in FIG. 7, so illustration and detailed description thereof are omitted.
- however, in step S25 of the sequence diagram shown in FIG. 7, the setting unit 114A sets viewing information indicating whether or not viewing is permitted for each of the plurality of areas forming the surface of the virtual object VO; this point differs from the second operation of the information setting system 1 according to the first embodiment.
- likewise, in step S29, the appearance processing unit 115A performs a process of deleting at least the areas set to be unviewable from the virtual object VO; this point also differs from the second operation of the information setting system 1 according to the first embodiment.
- the terminal device 10A-K as the information setting device includes the acquisition unit 111 and the setting unit 114A.
- the acquisition unit 111 acquires a three-dimensional virtual object VO to be displayed in the virtual space.
- the setting unit 114A sets viewing information indicating whether or not viewing is permitted for each of a plurality of areas that form the surface of the three-dimensional virtual object VO.
- the terminal devices 10A-K can restrict the visibility of the virtual object VO itself.
- in particular, in the method of restricting the viewer's field of view, when the virtual object VO rotates, an area of the virtual object VO that should not be shown to the viewer may come into the viewer's field of view.
- because the terminal devices 10A-K set the viewing information in the virtual object VO itself, even when the virtual object VO rotates in the virtual space, the viewer cannot view the areas for which viewing is not permitted, unlike when the viewer's field of view is simply blocked.
- in the method of masking the viewer's field of view, the mask area had to be changed each time according to the position in the viewer's field of view at which the area of the virtual object VO that should not be shown is displayed.
- the terminal devices 10A-K set the viewing information in the virtual object VO itself, it is not necessary to perform complicated processing for changing the field of view of the viewer each time according to the rotation of the virtual object VO.
- the terminal devices 10A-K further include an appearance processing section 115A.
- the appearance processing unit 115A performs a process of deleting at least an area set to be unviewable from the three-dimensional virtual object VO.
- since the terminal devices 10A-K have the above configuration, the amount of data of the virtual object VO after deleting the areas set to be unviewable is reduced. As a result, the processing load is reduced when the server 20 distributes the virtual object VO to the terminal devices 10-1 to 10-N.
- 3: Third Embodiment
- the configuration of an information setting system 1B including a server 20A as an information setting device according to a third embodiment of the present invention will be described with reference to FIGS. 10 to 14.
- for the purpose of simplifying the description, among the components provided in the information setting system 1B according to the third embodiment, the same components as those of the information setting system 1 according to the first embodiment are denoted by the same reference signs, and their description is omitted.
- in the embodiments described above, the terminal device 10-K sets the viewing information.
- in the third embodiment, the server 20A sets the viewing information.
- the information setting system 1B according to the third embodiment of the present invention differs from the information setting system 1 according to the first embodiment in that the server 20A is provided instead of the server 20. Otherwise, the overall configuration of the information setting system 1B is the same as the overall configuration of the information setting system 1 according to the first embodiment shown in FIG. 1, so illustration and description thereof are omitted.
- the server 20A is an example of an information setting device.
- FIG. 10 is a block diagram showing a configuration example of the server 20A.
- the server 20A differs from the server 20 in that it has a processing device 21A instead of the processing device 21 and a storage device 22A instead of the storage device 22 .
- the storage device 22A stores the control program PR2A instead of the control program PR2 stored in the storage device 22.
- the storage device 22A also stores a display prohibited image database ND, a viewing information database VD, and a learning model LM.
- the display-prohibited image database ND is a database that stores image information indicating three-dimensional virtual objects that the server 20A is prohibited from outputting to the terminal devices 10-1 to 10-N. That is, display of the three-dimensional virtual objects indicated by the image information stored in the display-prohibited image database ND is prohibited in the virtual space displayed on the display 14 provided in each of the terminal devices 10-1 to 10-N.
- a virtual object whose display in the virtual space is prohibited is called a display prohibited object.
- FIG. 11 is a table showing a configuration example of the display prohibited image database ND.
- the display-prohibited image database ND stores file names indicating image information of display-prohibited objects NO. Note that the image information itself of the display prohibited object NO is stored in the storage device 22A.
- in the display-prohibited image database ND, for each area forming the surface of a display-prohibited object NO, a file name indicating the image information of the three-dimensional image of that area and the display position of the three-dimensional image are stored as a set.
- for example, in the example shown in FIG. 11, the surface of the display-prohibited object NO indicated by the file name "AAA.vrml" consists of a three-dimensional image denoted "area 1", a three-dimensional image denoted "area 2", . . . , and a three-dimensional image denoted "area m".
- "Area 1" is indicated by the file “A1.vrml”.
- "Area 2” is indicated by the file name "A2.vrml”.
- "Area m” is indicated by the file name "Am.vrml”.
- m is an integer of 2 or more.
- the display position of each three-dimensional image is specified using three-dimensional polar coordinates, as in the example shown in FIG. 11.
- the display position of the three-dimensional image may be specified using a coordinate system other than the three-dimensional polar coordinate system, for example, a three-dimensional rectangular coordinate system.
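For reference, a display position given in three-dimensional polar coordinates converts to rectangular coordinates as follows. The physics convention for the two angles is assumed here; the database could equally use another convention.

```python
import math

def polar_to_rect(r, theta_deg, phi_deg):
    """(radius, polar angle from +z, azimuth from +x) -> (x, y, z)."""
    t, p = math.radians(theta_deg), math.radians(phi_deg)
    return (r * math.sin(t) * math.cos(p),
            r * math.sin(t) * math.sin(p),
            r * math.cos(t))
```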
- the viewing information database VD is a database that defines, for each degree of similarity between a virtual object VO and a display-prohibited object NO, whether or not to prohibit viewing of an area forming the surface of the virtual object VO, depending on which of the plurality of areas forming the surface of the display-prohibited object NO that area corresponds to.
- the “similarity” is calculated by the calculation unit 214, which will be described later.
- FIG. 12 is a table showing a configuration example of the viewing information database VD.
- in the viewing information database VD, a file name indicating the image information of a display-prohibited object NO and a degree of similarity are stored as a pair.
- for example, suppose that both the display-prohibited object NO and the virtual object VO are full-body images of a doll, that the image information indicated by the file name "A1.vrml" indicates the head of the doll as the display-prohibited object NO, and that the image information indicated by the file name "A2.vrml" indicates the right arm of the doll as the display-prohibited object NO.
- in this case, the viewing information database VD defines that the areas of the virtual object VO corresponding to the head and the right arm of the doll cannot be viewed.
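The correspondence in FIG. 12 could be held as a mapping from a display-prohibited file name to similarity thresholds and the area files that become unviewable at or above each threshold. The thresholds 0.8 and 0.5 below are made-up values for illustration only.

```python
# viewing information database VD (hypothetical contents);
# entries per file are ordered by descending similarity threshold
VD = {
    "AAA.vrml": [
        (0.8, ["A1.vrml", "A2.vrml"]),  # very similar: head and right arm
        (0.5, ["A1.vrml"]),             # moderately similar: head only
    ],
}

def unviewable_area_files(file_name, similarity):
    """Return the area files whose viewing is prohibited at this similarity."""
    for threshold, areas in VD.get(file_name, []):
        if similarity >= threshold:
            return areas
    return []
```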
- the learning model LM is a learning model used when the setting unit 215, which will be described later, refers to the viewing information database VD and sets viewing information for the virtual object VO.
- in the above example, the setting unit 215 refers to the viewing information database VD and makes unviewable, among the plurality of areas forming the surface of the virtual object VO, the areas corresponding to the unviewable areas indicated by the file names "A1.vrml" and "A2.vrml".
- the setting unit 215 inputs the image information representing the overall image of the virtual object VO and the image information indicated by the file name "A1.vrml" to the learning model LM.
- the learning model LM outputs the image information of the area corresponding to the area indicated by the file name "A1.vrml” among the plurality of areas forming the surface of the virtual object VO. The same applies to the area indicated by the file name "A2.vrml".
- in the above example, when the image information indicated by the file name "A1.vrml" is input, the learning model LM outputs image information indicating the head of the doll as the virtual object VO.
- similarly, when the image information indicated by the file name "A2.vrml" is input, the learning model LM outputs image information indicating the right arm of the doll as the virtual object VO.
- the setting unit 215 sets viewing information indicating that viewing is not allowed for the two regions that constitute the surface of the virtual object VO, which are output from the learning model LM.
- the learning model LM is generated by learning teacher data in the learning phase.
- the teacher data used to generate the learning model LM includes a plurality of sets, each set comprising (a1) image information of a virtual object VO, (a2) image information of each area included in a display-prohibited object NO, and (b) image information indicating the area within the virtual object VO corresponding to each of those areas.
- the learning model LM is generated outside the server 20A.
- the learning model LM is preferably generated in a second server (not shown).
- the server 20A acquires the learning model LM from a second server (not shown) via the communication network NET.
- the processing device 21A further includes a calculation unit 214, a setting unit 215, and an appearance processing unit 216 in addition to the components provided in the processing device 21.
- the calculation unit 214 calculates the degree of similarity between a display-prohibited object NO and the virtual object VO. For example, the calculation unit 214 extracts a plurality of feature points from each of the display-prohibited object NO stored in the display-prohibited image database ND and the virtual object VO, and calculates the degree of similarity based on the degree to which the feature points of the two match.
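As a stand-in for the unspecified matching scheme, the similarity could be taken as the fraction of feature points the two objects share (a Jaccard index over feature sets). A real implementation would match feature descriptors approximately rather than by exact value; this is only a sketch.

```python
def feature_similarity(features_a, features_b):
    """Degree of similarity as the ratio of shared feature points."""
    a, b = set(features_a), set(features_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)
```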
- the setting unit 215 sets browsing information based on the degree of similarity calculated by the calculation unit 214 .
- the viewing information here is viewing information indicating whether or not each of the plurality of areas forming the surface of the three-dimensional virtual object VO can be viewed.
- more specifically, the setting unit 215 refers to the viewing information database VD using the file name of the display-prohibited object NO for which the calculation unit 214 calculated the similarity, together with that similarity, and extracts the unviewable areas.
- the setting unit 215 then inputs the image information indicating the virtual object VO and the image information indicating the unviewable areas extracted from the viewing information database VD to the learning model LM, and causes the learning model LM to output the corresponding areas forming the surface of the virtual object VO.
- the setting unit 215 sets viewing information indicating that viewing of the area output from the learning model LM is prohibited.
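Putting these steps together, the setting unit 215 can be sketched as the pipeline below, with the similarity calculation, the VD lookup, and the learning model LM passed in as callbacks. All names, the stub callbacks, and the region-as-string representation are assumptions for illustration.

```python
def set_viewing_info(vo_image, nd_entries, compute_similarity,
                     lookup_unviewable, match_region):
    """Sketch of the setting unit 215: mark as unviewable every region
    of the virtual object matched to a prohibited area of some entry."""
    viewing_info = {}
    for file_name, no_image in nd_entries.items():
        s = compute_similarity(vo_image, no_image)         # calculation unit 214
        for area_file in lookup_unviewable(file_name, s):  # database VD
            region = match_region(vo_image, area_file)     # learning model LM
            viewing_info[region] = False                   # viewing prohibited
    return viewing_info

info = set_viewing_info(
    "doll_vo",
    {"AAA.vrml": "doll_no"},
    lambda vo, no: 0.9,  # stubbed similarity
    lambda f, s: ["A1.vrml", "A2.vrml"] if s >= 0.8 else [],
    lambda vo, area: {"A1.vrml": "head", "A2.vrml": "right_arm"}[area],
)
```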
- because the setting unit 215 sets the viewing information based on the degree of similarity calculated by the calculation unit 214, the server 20A can set the viewing information automatically when setting it for the virtual object VO itself. That is, the operation of the user UK of the terminal device 10-K is no longer essential when setting the viewing information.
- the appearance processing unit 216 performs a process of deleting at least the area set as not viewable from the virtual object VO.
- the appearance processing unit 216 also generates processing information indicating the area to be deleted.
- FIG. 13 is a sequence diagram showing the first operation of the information setting system 1B according to the third embodiment.
- a first operation is an operation in which the server 20A sets both viewing information and processing information for the virtual object VO.
- the first operation of the information setting system 1B will be described below with reference to FIG. 13.
- in step S41, the processing device 11 provided in the terminal device 10-J functions as the generation unit 112.
- the processing device 11 generates a three-dimensional virtual object VO to be displayed in the virtual space based on the operation information indicating the content of the operation on the input device 15 by the user UJ.
- in step S42, the processing device 11 provided in the terminal device 10-J functions as the output unit 116.
- the processing device 11 outputs image information indicating the virtual object VO generated in step S41 to the server 20A.
- the processing device 21A provided in the server 20A functions as an acquisition unit 211 .
- the processing device 21A acquires image information indicating the virtual object VO generated by the terminal device 10-J from the terminal device 10-J.
- in step S43, the processing device 21A provided in the server 20A functions as the calculation unit 214.
- the processing device 21A calculates the degree of similarity between the display prohibited object NO, which is a virtual object prohibited from being displayed in the virtual space, and the three-dimensional virtual object VO indicated by the image information acquired in step S42.
- in step S44, the processing device 21A provided in the server 20A functions as the setting unit 215.
- the processing device 21A sets viewing information for the virtual object VO based on the degree of similarity calculated in step S43.
- in step S45, the processing device 21A provided in the server 20A functions as the appearance processing unit 216.
- the processing device 21A performs a process of deleting at least the areas set to be unviewable from the virtual object VO. Further, the processing device 21A generates processing information indicating the deleted areas.
- in step S46, the processing device 21A provided in the server 20A functions as the output unit 213.
- the processing device 21A outputs the image information representing the virtual object VO processed in step S45 to the terminal devices 10-1 to 10-N including the terminal device 10-J.
- the processing device 11 provided in the terminal device 10 -J functions as an acquisition unit 111 .
- the processing device 11 acquires image information from the server 20A.
- in step S48, the processing device 11 provided in the terminal device 10-J functions as the display control unit 113.
- the processing device 11 causes the display 14 to display a new virtual object VO using the image information acquired in step S46.
- FIG. 14 is a sequence diagram showing the second operation of the information setting system 1B according to the third embodiment.
- the second operation is an operation in which the server 20A sets browsing information for the virtual object VO, and the terminal device 10-J sets processing information for the virtual object VO to which the browsing information is set.
- the second operation of the information setting system 1B will be described below with reference to FIG. 14.
- in step S51, the processing device 11 provided in the terminal device 10-J functions as the generation unit 112.
- the processing device 11 generates a three-dimensional virtual object VO to be displayed in the virtual space based on the operation information indicating the content of the operation on the input device 15 by the user UJ.
- in step S52, the processing device 11 provided in the terminal device 10-J functions as the output unit 116.
- the processing device 11 outputs image information indicating the virtual object VO generated in step S51 to the server 20A.
- the processing device 21A provided in the server 20A functions as an acquisition unit 211 .
- the processing device 21A acquires image information indicating the virtual object VO generated by the terminal device 10-J from the terminal device 10-J.
- in step S53, the processing device 21A provided in the server 20A functions as the calculation unit 214.
- the processing device 21A calculates the degree of similarity between the display prohibited object NO, which is a virtual object prohibited from being displayed in the virtual space, and the three-dimensional virtual object VO indicated by the image information acquired in step S52.
- in step S54, the processing device 21A provided in the server 20A functions as the setting unit 215.
- the processing device 21A sets viewing information for the virtual object VO based on the degree of similarity calculated in step S53.
- in step S55, the processing device 21A provided in the server 20A functions as the output unit 213.
- the processing device 21A outputs the viewing information set in step S54 to the terminal device 10-J.
- the processing device 11 provided in the terminal device 10 -J functions as an acquisition unit 111 .
- the processing device 11 acquires browsing information from the server 20A.
- in step S56, the processing device 11 provided in the terminal device 10-J functions as the display control unit 113.
- the processing device 11 uses the image information representing the virtual object VO generated in step S51 and the viewing information acquired in step S55 to cause the display 14 to display the virtual object VO to which the viewing information is set.
- in step S57, the processing device 11 provided in the terminal device 10-J functions as the appearance processing unit 115.
- the processing device 11 deletes at least the areas set to be unviewable from the virtual object VO.
- the processing device 11 also generates processing information indicating the deleted areas.
- in step S58, the processing device 11 provided in the terminal device 10-J functions as the output unit 116.
- the processing device 11 outputs the processing information indicating the area to be deleted in step S57 to the server 20A.
- the processing device 21A provided in the server 20A functions as the acquisition unit 211.
- the processing device 21A acquires processing information from the terminal device 10-J.
- in step S59, the processing device 21A provided in the server 20A functions as the generation unit 212.
- the processing device 21A uses the image information acquired in step S52, the viewing information set in step S54, and the processing information acquired in step S58 to generate a new virtual object VO that is different from the virtual object VO generated by the terminal device 10-J.
- the processing device 21A provided in the server 20A functions as the output unit 213.
- the processing device 21A outputs image information representing the new virtual object VO generated by the generation unit 212 to the terminal devices 10-1 to 10-N.
- the server 20A as the information setting device includes the acquisition unit 211 and the setting unit 215.
- the acquisition unit 211 acquires a three-dimensional virtual object VO to be displayed in the virtual space.
- the setting unit 215 sets viewing information indicating whether or not viewing is permitted for each of a plurality of regions that form the surface of the three-dimensional virtual object VO.
- Since the server 20A has the above configuration, it can restrict viewing of the virtual object VO itself. In particular, with the method of restricting the viewer's field of view, when the virtual object VO rotates, an area of the virtual object VO that should not be seen may come into the viewer's field of view. Because the server 20A sets the viewing information in the virtual object VO itself, unlike when the viewer's field of view is simply blocked, the virtual object VO cannot be viewed from a direction in which viewing is prohibited, even when the virtual object VO rotates in the virtual space.
- In addition, with the method of restricting the viewer's field of view, the mask area in the viewer's field of view had to be changed each time according to the position in the viewer's field of view at which the area of the virtual object VO that should not be seen is displayed.
- In contrast, since the server 20A sets the viewing information for the virtual object VO itself, the complicated processing of changing the viewer's field of view each time in accordance with the rotation of the virtual object VO becomes unnecessary.
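The rotation-invariance argument above rests on attaching the viewing information to the object itself rather than to the viewer's field of view. A minimal sketch of this idea, with hypothetical names and surface regions reduced to labels:

```python
from dataclasses import dataclass

@dataclass
class Region:
    """One region of the plurality forming the object's surface (hypothetical)."""
    name: str
    viewable: bool = True

class VirtualObject:
    def __init__(self, regions):
        self.regions = regions

    def set_viewing_info(self, region_name, viewable):
        # Viewing information is stored on the object itself.
        for r in self.regions:
            if r.name == region_name:
                r.viewable = viewable

    def renderable_regions(self):
        # Because the flags live on the object, they follow it under any
        # rotation in the virtual space; no per-frame mask update is needed.
        return [r.name for r in self.regions if r.viewable]

vo = VirtualObject([Region("head"), Region("torso"), Region("right_arm")])
vo.set_viewing_info("head", False)
# vo.renderable_regions() == ["torso", "right_arm"], however the object rotates
```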
- the server 20A further includes the calculation unit 214.
- The calculation unit 214 calculates the degree of similarity between the display-prohibited object NO, which is a virtual object whose display in the virtual space is prohibited, and the three-dimensional virtual object VO acquired by the acquisition unit 211.
- the setting unit 215 sets the browsing information based on the degree of similarity.
- With this configuration, the server 20A can set the browsing information automatically when setting the browsing information in the virtual object VO itself. That is, when setting the browsing information, an operation by the user UK of the terminal device 10-K is no longer essential.
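The automatic setting based on the degree of similarity could, under the assumption that objects are compared via feature vectors, be sketched as follows. The cosine measure and the 0.9 threshold are illustrative choices, not taken from the disclosure:

```python
def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (illustrative measure)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def auto_set_viewing_info(vo_features, prohibited_features, threshold=0.9):
    """Mark the object unviewable when it resembles the display-prohibited
    object NO at least as much as the (hypothetical) threshold."""
    similarity = cosine_similarity(vo_features, prohibited_features)
    return {"similarity": similarity, "viewable": similarity < threshold}

# A virtual object identical to the prohibited one is set unviewable:
print(auto_set_viewing_info([1.0, 0.0], [1.0, 0.0])["viewable"])  # False
```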
- The server 20 acquires the browsing information and the processing information from the terminal device 10-K and uses them to generate a new virtual object VO.
- However, instead of the server 20, the terminal device 10-K may generate the new virtual object VO and output image information indicating the virtual object VO to the server 20.
- The server 20 may also acquire the browsing information from the terminal device 10-K and the processing information from the terminal device 10-J, and use these to generate a new virtual object VO.
- Alternatively, the terminal device 10-J may generate the new virtual object VO and output image information indicating the virtual object VO to the server 20.
- The same applies to the information setting system 1A according to the second embodiment.
- In the information setting system 1A, the setting unit 215 provided in the server 20A sets the browsing information according to the degree of similarity calculated by the calculation unit 214 provided in the server 20A. However, when the degree of similarity calculated by the calculation unit 214 is equal to or greater than the threshold, the server 20A may output the image information indicating the virtual object VO to the terminal device 10-K, and the terminal device 10-K may set the browsing information, as in the information setting system 1 according to the first embodiment. In other words, the user UK may set the browsing information using the terminal device 10-K.
- In the above description, the server 20A stores the display-prohibited image database ND, the viewing information database VD, and the learning model LM, and includes the calculation unit 214.
- In the information setting system 1 according to the first embodiment and the information setting system 1A according to the second embodiment, the terminal device 10-K may have components similar to these. That is, in the information setting system 1 according to the first embodiment and the information setting system 1A according to the second embodiment, the terminal device 10-K may calculate the degree of similarity between the virtual object VO and the display-prohibited object NO and set the browsing information according to the calculated degree of similarity.
- The setting unit 114 sets, for the object S including the three-dimensional virtual object VO, viewing information indicating at least one of a viewable direction and an unviewable direction.
- The setting unit 114A sets viewing information indicating whether or not viewing is permitted for each of the plurality of areas forming the surface of the virtual object VO.
- However, the setting unit 114 or 114A may set viewing information indicating at least one of a viewable direction and an unviewable direction for each of the plurality of regions that constitute the surface of the object S centered on a point included in the three-dimensional virtual object VO. That is, the setting unit 114 or 114A may set viewing information indicating at least one of a viewable direction and an unviewable direction for each region, using a normal vector passing through each of the plurality of regions forming the surface of the object S.
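The normal-vector criterion mentioned above can be illustrated as a simple dot-product test: under the assumption that a region is viewable only from the half-space its normal vector points into, the check reduces to the sign of one dot product. This is a sketch under that assumption, not the disclosed implementation:

```python
def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def viewable_from(normal, to_viewer):
    """A region is treated as viewable only from the half-space its normal
    vector points into; `to_viewer` runs from the region toward the viewer."""
    return dot(normal, to_viewer) > 0

# A region whose normal points along +z:
print(viewable_from((0, 0, 1), (0, 0, 5)))   # True  (viewer in front)
print(viewable_from((0, 0, 1), (0, 0, -5)))  # False (viewer behind)
```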
- the setting unit 215 uses the learning model LM and refers to the viewing information database VD to set viewing information for the virtual object VO.
- the method of setting browsing information for the virtual object VO is not limited to this.
- For example, the setting unit 215 refers to the viewing information database VD and makes unviewable the areas of the virtual object VO that correspond to the unviewable areas, indicated by terms such as "head" and "right arm," among the plurality of regions forming the surface of the display-prohibited object NO.
- More specifically, the setting unit 215 inputs image information representing the overall image of the virtual object VO and the term "head" to the learning model LM.
- The learning model LM outputs image information of the area corresponding to the "head" among the plurality of regions forming the surface of the virtual object VO.
- Based on this output, the setting unit 215 sets viewing information indicating that viewing is not possible for the two regions, the "head" and the "right arm," among the plurality of regions forming the surface of the virtual object VO.
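The mapping from a term such as "head" to regions of the virtual object VO is performed by the learning model LM; as a stand-in, a plain lookup table can illustrate how the resulting viewing information could be assembled. All names here are hypothetical:

```python
# Hypothetical stand-in for the learning model LM: a lookup from a term in
# the viewing information database VD to region names on the object surface.
TERM_TO_REGIONS = {
    "head": ["head"],
    "right arm": ["right_arm", "right_hand"],
}

def set_unviewable_by_terms(region_names, terms, mapping=TERM_TO_REGIONS):
    """Build viewing information: True = viewable, False = not viewable."""
    viewing_info = {name: True for name in region_names}
    for term in terms:
        for region in mapping.get(term, []):
            if region in viewing_info:
                viewing_info[region] = False
    return viewing_info

info = set_unviewable_by_terms(
    ["head", "torso", "right_arm", "right_hand"], ["head", "right arm"])
# "head", "right_arm", and "right_hand" become unviewable; "torso" stays viewable
```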
- Alternatively, the setting unit 215 may perform pattern recognition of an unviewable image within the two-dimensional image displayed when the three-dimensional virtual object VO is displayed on the display 14 included in the terminal device 10. After that, the setting unit 215 detects the area of that image in the two-dimensional image and sets viewing information indicating that the area cannot be viewed.
- Alternatively, the acquisition unit 211 provided in the server 20A may acquire, from the terminal device 10-K, text information indicating an area that should not be viewed, and the setting unit 215 may use the learning model LM to identify a region corresponding to the text information. After that, the setting unit 215 sets viewing information indicating that the region is not viewable.
- Alternatively, the setting unit 215 may use, for example, the learning model LM to determine what appears in the virtual object VO and, if an unviewable object appears, identify the area of the virtual object VO in which that object appears. After that, the setting unit 215 sets viewing information indicating that the area is not viewable. For example, after using the learning model LM to recognize that a cat appears in the three-dimensional virtual object VO, the setting unit 215 may set viewing information indicating that the area of the cat is not viewable.
- the setting unit 215 may set viewing information indicating that the virtual object VO itself or one or more areas included in the virtual object VO cannot be viewed, based on the tag.
- The storage devices 12 and 12A and the storage devices 22 and 22A may be constituted by, for example, ROM, RAM, optical discs (e.g., compact discs, Blu-ray discs), smart cards, flash memory devices (e.g., cards, sticks, key drives), CD-ROMs (Compact Disc-ROMs), registers, removable disks, hard disks, floppy disks, magnetic strips, databases, servers, or other suitable storage media.
- the program may be transmitted from a network via an electric communication line.
- the program may be transmitted from the communication network NET via an electric communication line.
- the information, signals, etc. described may be represented using any of a variety of different technologies.
- For example, data, instructions, commands, information, signals, bits, symbols, chips, etc. may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, light fields or photons, or any combination thereof.
- input/output information and the like may be stored in a specific location (for example, memory), or may be managed using a management table. Input/output information and the like can be overwritten, updated, or appended. The output information and the like may be deleted. The entered information and the like may be transmitted to another device.
- The determination may be made by a value represented using one bit (0 or 1), by a true/false value (Boolean: true or false), or by numerical comparison (for example, comparison with a predetermined value).
- each function illustrated in FIGS. 1 to 14 is realized by any combination of at least one of hardware and software.
- The method of realizing each functional block is not particularly limited. That is, each functional block may be implemented using one device that is physically or logically coupled, or using two or more physically or logically separated devices connected directly or indirectly (for example, by wire, wirelessly, etc.).
- a functional block may be implemented by combining software in the one device or the plurality of devices.
- software, instructions, information, etc. may be transmitted and received via a transmission medium.
- When software is transmitted from a website, server, or other remote source using at least one of wired technology (coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), etc.) and wireless technology (infrared, microwave, etc.), these wired and wireless technologies are included within the definition of transmission medium.
- The terms "system" and "network" are used interchangeably.
- Information, parameters, etc. described in this disclosure may be expressed using absolute values, using relative values from a predetermined value, or using other corresponding information.
- the terminal devices 10-1 to 10-N, 10A-J, and 10A-K, and the servers 20 and 20A may be mobile stations (MS).
- A mobile station may also be called, by those skilled in the art, a subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term.
- terms such as “mobile station”, “user terminal”, “user equipment (UE)", “terminal”, etc. may be used interchangeably.
- The terms "connected" and "coupled," and any variations thereof, mean any direct or indirect connection or coupling between two or more elements, including the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. The coupling or connection between elements may be physical, logical, or a combination thereof. For example, "connection" may be replaced with "access."
- As used in this disclosure, two elements may be considered "connected" or "coupled" to each other using at least one of one or more wires, cables, and printed electrical connections, and, as some non-limiting and non-exhaustive examples, using electromagnetic energy having wavelengths in the radio frequency, microwave, and optical (both visible and invisible) regions.
- the phrase “based on” does not mean “based only on,” unless expressly specified otherwise. In other words, the phrase “based on” means both “based only on” and “based at least on.”
- The terms "determining" and "deciding" as used in this disclosure may encompass a wide variety of actions.
- "Determining" and "deciding" may include, for example, considering that judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (for example, looking up a table, database, or other data structure) has been "determined" or "decided."
- "Determining" and "deciding" may also include considering that receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in memory) has been "determined" or "decided."
- "Determining" and "deciding" may further include considering that resolving, selecting, choosing, establishing, comparing, etc. has been "determined" or "decided."
- In other words, "determining" and "deciding" may include considering that some action has been "determined" or "decided."
- In addition, "determining (deciding)" may be replaced with "assuming," "expecting," "considering," and the like.
- The phrase "A and B are different" may mean "A and B are different from each other." It may also mean "A and B are each different from C." Terms such as "separate" and "coupled" may be interpreted in the same manner as "different."
- Notification of predetermined information is not limited to explicit notification, and may also be performed implicitly (for example, by not notifying the predetermined information).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023580099A JPWO2023153088A1 | 2022-02-14 | 2022-12-21 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022020235 | 2022-02-14 | ||
JP2022-020235 | 2022-02-14 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023153088A1 true WO2023153088A1 (ja) | 2023-08-17 |
Family
ID=87564214
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/047065 WO2023153088A1 (ja) | 2022-02-14 | 2022-12-21 | 情報設定装置 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2023153088A1
WO (1) | WO2023153088A1
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009213722A (ja) * | 2008-03-11 | 2009-09-24 | Sony Computer Entertainment Inc | ゲーム装置、ゲーム制御方法、及びゲーム制御プログラム |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170163907A1 (en) * | 2014-07-22 | 2017-06-08 | Adddon Pty Ltd | Method and system for creating a video advertisement with embedded user content |
JP7444172B2 (ja) * | 2019-09-25 | 2024-03-06 | ソニーグループ株式会社 | 情報処理装置、映像の生成方法及びプログラム |
- 2022-12-21: WO PCT/JP2022/047065 patent/WO2023153088A1/ja active Application Filing
- 2022-12-21: JP JP2023580099A patent/JPWO2023153088A1/ja active Pending
Non-Patent Citations (4)
Title |
---|
ALPHA: "[Super Smash Bros. SP] I explained how to peek into Peach, Zelda, and Palutena's pants.", ASMASHI, 26 April 2020 (2020-04-26), XP093082762, Retrieved from the Internet <URL:https://alphasmashgt.com/pants> [retrieved on 20230918] * |
ANONYMOUS: "This magazine's "FF" group. Final Fantasy VIII", WEEKLY FAMITSU, vol. 14, no. 14, 2 April 1999 (1999-04-02), pages 115 - 130, XP009548395 * |
ASHLEE: "Thread: When a female character wearing a skirt is unable to fight", ENDWALKER FINAL FANTASY XIV FORUM, 15 November 2013 (2013-11-15), XP093082763, Retrieved from the Internet <URL:https://forum.square-enix.com/ffxiv/threads/118495> [retrieved on 20230918] * |
YAHOO: "From Wednesday, April 6, 2022, Yahoo! JAPAN will be available from the European Union (EEA) and the United Kingdom. no longer able to", YAHOO JAPAN, 1 February 2022 (2022-02-01), XP093082769, Retrieved from the Internet <URL:https://detail.chiebukuro.yahoo.co.jp/qa/question_detail/q12210336687> [retrieved on 20230918] * |
Also Published As
Publication number | Publication date |
---|---|
JPWO2023153088A1 | 2023-08-17
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22926102, Country: EP, Kind code: A1
ENP | Entry into the national phase | Ref document number: 2023580099, Country: JP, Kind code: A
NENP | Non-entry into the national phase | Ref country code: DE
122 | Ep: pct application non-entry in european phase | Ref document number: 22926102, Country: EP, Kind code: A1