WO2019130991A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2019130991A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
user
virtual object
image
hmd
Prior art date
Application number
PCT/JP2018/044278
Other languages
English (en)
Japanese (ja)
Inventor
敬幸 古田
雄太 樋口
和輝 東
Original Assignee
株式会社Nttドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Nttドコモ filed Critical 株式会社Nttドコモ
Publication of WO2019130991A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics

Description

  • One aspect of the present invention relates to an information processing apparatus.
  • A technique called virtual reality (VR) is known. In VR, for example, a user object (an avatar, a character, etc.) linked to the user's motion (for example, the motion of a body part such as the head or hands) is generated in a virtual space and controlled according to the user's motion. Then, by displaying on the HMD an image showing the view seen from the user object, the user is given an experience as if he or she existed in the virtual space as the user object.
  • An object of one aspect of the present invention is to provide an information processing apparatus capable of improving the convenience of a user's virtual reality experience.
  • An information processing apparatus according to one aspect of the present invention is an information processing apparatus for providing an image of a virtual space displayed on a display device worn by a user, and comprises: an image acquisition unit that acquires a real-space image obtained by imaging the real space in the vicinity of the user; a virtual object generation unit that recognizes an object included in the real-space image and generates a virtual object corresponding to the object in the virtual space; and an image generation unit that generates the virtual-space image, showing at least part of the virtual space including the virtual object, to be displayed on the display device.
  • In this information processing apparatus, an object included in the real-space image obtained by imaging the real space near the user is generated as a virtual object in the virtual space, and a virtual-space image including that virtual object (a virtual-space image in which the virtual object appears) is generated.
  • As a result, the user wearing the display device can visually recognize, via the virtual-space image, an object present in his or her vicinity. Therefore, this information processing apparatus can improve the convenience of the user's virtual reality experience.
  • According to one aspect of the present invention, it is possible to provide an information processing apparatus capable of improving the convenience of the user's virtual reality experience.
  • FIG. 1 is a diagram showing a functional configuration of an information processing system 100 including an information processing apparatus 10 according to an embodiment of the present invention.
  • The information processing apparatus 10 is an apparatus for providing a user, via a head-mounted display (HMD) 1 (display device) worn by the user, with a virtual space in which arbitrary VR content such as a game space or a chat space unfolds. That is, the information processing apparatus 10 provides the user with a virtual reality (VR) experience through images of the virtual space displayed on the HMD 1.
  • The information processing apparatus 10 also has a function of generating, in the virtual space, a virtual object corresponding to an object existing in the real space.
  • As shown in FIG. 1, the information processing apparatus 10 includes a communication unit 11, an image acquisition unit 12, a virtual object generation unit 13, a virtual object storage unit 14, a sharing setting unit 15, an image generation unit 16, an object detection unit 17, and a virtual object update unit 18.
  • The information processing apparatus 10 is, for example, a game terminal, a personal computer, or a tablet terminal that can communicate with the HMDs 1 worn by each of a plurality of users.
  • However, the implementation form of the information processing apparatus 10 is not limited to a specific form.
  • For example, the information processing apparatus 10 may be a computer device incorporated in the same device as the HMD 1.
  • Alternatively, the information processing apparatus 10 may be a server device or the like that can communicate with each user's HMD 1 (or with each computer terminal that controls the operation of each HMD 1) via a communication line such as the Internet. Further, the information processing apparatus 10 may be physically configured as a single device or as a plurality of devices. For example, some of its functions (for example, those of the image generation unit 16) may be realized by the computer terminal provided for each HMD 1 to control its operation, and the remaining functions may be realized by a server device, so that the apparatus is configured as a distributed system.
  • The HMD 1 is a display device worn on the user's body (for example, the head).
  • The HMD 1 includes, for example, a display unit that, while worn on the user's head, displays an image (a left-eye image and a right-eye image) in front of each of the user's eyes. This allows the user to view a stereoscopic (three-dimensional) image.
  • The display unit described above may be a display integrated into a main unit worn on the user's body, such as a glasses-type or helmet-type unit, or a device detachably attached to the main unit of the HMD 1 (for example, the display of a terminal such as a smartphone mounted on the main unit) may function as the display unit.
  • The HMD 1 also includes, for example, a sensor (e.g., an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, a gyro sensor, etc.) capable of detecting the position, orientation (tilt), velocity, acceleration, and so on of the user's head (that is, of the HMD 1).
  • The HMD 1 periodically transmits the information on the motion of the user's head (position, orientation, velocity, acceleration, etc.) detected by such sensors to the information processing apparatus 10 as motion information on the user's head.
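  • As a minimal sketch, not part of the patent text, the following shows what one such periodic head-motion report from the HMD 1 to the information processing apparatus 10 could look like; the field names, units, and JSON encoding are all assumptions for illustration:

```python
import json
import time
from dataclasses import dataclass, asdict
from typing import Tuple

# Hypothetical shape of one periodic motion sample; the patent only says
# that position, orientation, velocity, and acceleration are reported.
@dataclass
class HeadMotionSample:
    timestamp: float                          # seconds since epoch (assumed)
    position: Tuple[float, float, float]      # head position (x, y, z)
    orientation: Tuple[float, float, float]   # (yaw, pitch, roll) in degrees
    velocity: Tuple[float, float, float]
    acceleration: Tuple[float, float, float]

def encode_sample(sample: HeadMotionSample) -> str:
    """Serialize one sample for transmission to the apparatus 10."""
    return json.dumps(asdict(sample))

sample = HeadMotionSample(time.time(), (0.0, 1.6, 0.0), (10.0, -5.0, 0.0),
                          (0.0, 0.0, 0.0), (0.0, 0.0, 0.0))
print(encode_sample(sample))
```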
  • The HMD 1 also includes, for example, a sensor such as an infrared camera that detects the movement of the user's eyes (for example, the position and movement of the pupil).
  • The sensor is, for example, a sensor with a known eye-tracking function, and detects the motion of each of the user's left and right eyes.
  • The HMD 1 periodically transmits the eye-motion information detected in this way to the information processing apparatus 10.
  • The HMD 1 also includes, as accessories, a microphone (not shown) for inputting the voice of the user wearing the HMD 1 and a speaker (not shown) for outputting the voices of the other users and the like.
  • The voice picked up by the microphone is transmitted to the information processing apparatus 10.
  • The speaker outputs the voices of the other users received from the information processing apparatus 10. This microphone and speaker make it possible for a plurality of users to converse (chat) with one another.
  • The microphone and the speaker may be devices integrated with the HMD 1 or devices separate from the HMD 1.
  • The HMD 1 also includes, as an accessory, a camera 2 (imaging device) for photographing the space in the vicinity of the user wearing the HMD 1 (in the present embodiment, the space in front of the user).
  • The HMD 1 and the camera 2 can communicate with each other.
  • The camera 2 may be a camera integrated with the main unit of the HMD 1, or a camera of a device detachably attached to the main unit (for example, a smartphone) may be used.
  • In the present embodiment, the camera 2 of the HMD 1 recognizes a specific area 4 on the desk 3 in front of the user 5 wearing the HMD 1, and photographs objects present on the specific area 4.
  • The specific area 4 is defined, for example, by a green-screen mat placed on the desk 3.
  • For example, a sensor that can communicate with the camera 2, or a marker that the camera 2 can recognize, may be embedded at specific positions of the mat (for example, at its center or its four corners), and the camera 2 may recognize the specific area 4 based on the positions of the sensors (or markers) grasped through communication with the sensors (or recognition of the markers).
  • Note that the camera 2 does not necessarily have to be a device attached to the HMD 1; it may be a camera (a device separate from the HMD 1) fixedly arranged at a position from which it can photograph the space including the specific area 4. Further, the camera 2 may consist of a plurality of fixed cameras that photograph the space including the specific area 4 from different angles. In this case, a three-dimensional image of an object on the specific area 4 can be obtained from the images taken at different angles by the different fixed cameras.
  • For example, the camera 2 starts capturing video of the real space including the specific area 4 in response to a user operation on a controller attached to the HMD 1 (or a controller separate from the HMD 1).
  • The video captured by the camera 2 is transmitted to the HMD 1 as needed and displayed superimposed on the virtual-space image shown on the HMD 1.
  • Here, the virtual-space image is an image of the virtual space at an angle determined based on the motion information of the head and eyes of the user wearing the HMD 1.
  • For example, the video captured by the camera 2 may be shown in a small window (a so-called wipe) placed in a corner of the virtual-space image (for example, the upper-right corner).
  • This lets the user grasp the state of the real space including the specific area 4 by checking the small window while viewing the virtual-space image and experiencing the virtual reality.
  • At this point, an object on the specific area 4 has not been generated as a virtual object. Therefore, the object on the specific area 4 cannot be handled (e.g., carried) as a thing in the virtual space and cannot be recognized by users other than the user 5.
  • The communication unit 11 transmits and receives data to and from external devices such as the HMD 1 (including its accessories such as the microphone, speaker, camera 2, and controller) via a wired or wireless communication network.
  • For example, the communication unit 11 receives from the HMD 1 the motion information of the user's head and eyes acquired by the HMD 1 as described above.
  • The communication unit 11 also transmits to the HMD 1 the image generated by the image generation unit 16 described later.
  • Through this processing, each HMD 1 displays an image of the virtual space at an angle determined based on the motion information of that user's head and eyes.
  • The communication unit 11 also receives the voice of each user input to the above-described microphone and transmits each user's voice to the other users' speakers. Through this processing, voice is shared among the users, realizing the above-described chat.
  • The image acquisition unit 12 acquires a real-space image obtained by imaging the real space near the user.
  • In the present embodiment, the image acquisition unit 12 acquires, via the communication unit 11, the image captured by the above-described camera 2 (described in detail later) as the real-space image.
  • The virtual object generation unit 13 recognizes an object included in the real-space image and generates a virtual object corresponding to the object in the virtual space.
  • In the present embodiment, the virtual object generation unit 13 generates a virtual object corresponding to an object designated by the user among the plurality of objects included in the real-space image. That is, the virtual object generation unit 13 does not indiscriminately generate virtual objects for all the objects included in the real-space image, but generates only the virtual object corresponding to the object designated by the user.
  • Through this processing, only the virtual objects the user wants are generated, and the processing load of virtual object generation (hereinafter also called "objectification") can be reduced. That is, the load on and usage of hardware resources such as processors and memory can be kept down.
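  • The following is a rough sketch, under stated assumptions, of this designated-region-only objectification: recognition runs only on the crop the user selected, not on the whole frame. `recognize_objects` is a placeholder for the "known image recognition" the text mentions, and the region format (x, y, width, height) is an assumption:

```python
import numpy as np

def recognize_objects(patch: np.ndarray) -> list:
    # Placeholder: a real system would run an image-recognition model here;
    # we simply return the cropped pixels as the extracted appearance.
    return [{"appearance": patch}]

def objectify_designated_region(image: np.ndarray, region) -> list:
    """Run objectification only on the user-designated target region,
    leaving every other object in the real-space image untouched."""
    x, y, w, h = region
    target = image[y:y + h, x:x + w]   # crop to the designated area
    return recognize_objects(target)

# Example: a dummy 480x640 RGB frame, region drawn around "object 6B".
frame = np.zeros((480, 640, 3), dtype=np.uint8)
virtual_objects = objectify_designated_region(frame, (300, 200, 150, 100))
```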
  • FIG. 2B shows a state in which two objects 6 (6A, 6B) exist on the specific area 4 in front of the user 5.
  • For example, the object 6A is a plastic bottle containing a beverage, and the object 6B is a notebook PC operated by the user 5.
  • In this state, the camera 2 acquires a real-space image including the specific area 4 and transmits it to the HMD 1, and the real-space image including the objects 6A and 6B is displayed on the HMD 1.
  • The user 5 designates, by an operation using the above-described controller or the like, a target area containing the object 6 to be objectified (here, the object 6B as an example) in the real-space image. Subsequently, the real-space image and information indicating the target area are transmitted from the HMD 1 to the information processing apparatus 10.
  • The image acquisition unit 12 acquires these pieces of information (the real-space image and the information indicating the target area) via the communication unit 11. The virtual object generation unit 13 then performs known image recognition on the target area in the real-space image, thereby extracting appearance information of the object 6B contained in the target area. As shown in FIG. 2B, the virtual object generation unit 13 generates a virtual object 8 corresponding to the object 6B based on the appearance information extracted in this way.
  • A user object 7 associated with the user 5 is arranged in the virtual space V.
  • In the present embodiment, the virtual object generation unit 13 determines the position of the virtual object 8 so that the relative position of the virtual object 8 with respect to the user object 7 in the virtual space V matches the relative position of the object 6B with respect to the user 5 in the real space.
  • This arrangement allows the user to operate on the object 6B in the real space by performing the corresponding operation (for example, carrying) on the virtual object 8 in the virtual space V via the user object 7.
  • However, the relative position of the virtual object 8 with respect to the user object 7 need not coincide with the relative position of the object 6B with respect to the user 5. That is, the virtual object generation unit 13 may generate the virtual object 8 at an arbitrary position in the virtual space (for example, a position designated by the user 5).
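  • The default placement rule above reduces to simple vector arithmetic; a minimal sketch, assuming positions are 3-D vectors in each space's own coordinate frame:

```python
import numpy as np

def place_virtual_object(user_pos_real, object_pos_real, user_object_pos_virtual):
    """Position the virtual object so that its offset from the user object 7
    equals the real object's offset from the user 5."""
    offset = (np.asarray(object_pos_real, dtype=float)
              - np.asarray(user_pos_real, dtype=float))
    return np.asarray(user_object_pos_virtual, dtype=float) + offset

# Example: object 6B is 0.4 m in front of and 0.3 m below the user's head.
print(place_virtual_object((0.0, 1.6, 0.0), (0.0, 1.3, -0.4), (10.0, 1.6, 5.0)))
# -> [10.   1.3  4.6]
```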
  • The virtual object storage unit 14 stores information on the virtual objects generated by the virtual object generation unit 13 (hereinafter, "virtual object information").
  • The virtual object information includes, for each virtual object: a virtual object ID uniquely identifying the virtual object; appearance information for drawing the virtual object; the generation time at which the virtual object was generated; a camera ID uniquely identifying the camera 2 (or its user 5, etc.) that acquired the real-space image from which the virtual object was generated; and sharing setting information indicating the users (or devices such as HMDs 1) permitted to share the virtual object.
  • The camera ID is associated with the real-space image as supplementary information, for example, when the real-space image is captured by the camera 2.
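  • One way to picture a virtual-object record with exactly the fields listed above is the sketch below; the concrete types and the dictionary-based storage are assumptions, not part of the patent text:

```python
from dataclasses import dataclass, field
from typing import Dict, Set

@dataclass
class VirtualObjectInfo:
    virtual_object_id: str   # uniquely identifies the virtual object
    appearance: bytes        # appearance information used to draw it
    generated_at: float      # generation time
    camera_id: str           # camera 2 (or its user) that captured the source image
    shared_with: Set[str] = field(default_factory=set)  # sharing setting information

# The storage unit 14 could then simply be a mapping keyed by object ID.
store: Dict[str, VirtualObjectInfo] = {}
store["vo-1"] = VirtualObjectInfo("vo-1", b"<appearance data>", 0.0, "cam-5")
```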
  • In the present embodiment, the virtual space V is a space shared by a plurality of users. That is, the virtual space V is shared at least by a first user (here, the user 5) wearing a first HMD (HMD 1, first display device) and a second user (a user different from the first user) wearing a second HMD (HMD 1, second display device).
  • The virtual space V is, for example, a chat space for business communication such as meetings among a plurality of users.
  • In such a case, the first user may not want the contents of a virtual object generated by objectification to be known to users other than specific users.
  • For example, the first user may want a virtual object corresponding to a memo or the like containing confidential information to be viewable only by users of a specific job title or higher.
  • The sharing setting unit 15 sets whether to permit sharing of a virtual object with the second user, according to the operation input received from the first user for a virtual object that the virtual object generation unit 13 generated based on the first user's designation.
  • For example, a sharing setting screen for specifying the users permitted to share the virtual object 8 is displayed on the first HMD.
  • The sharing setting screen shows, for example, information indicating the appearance and the like of the virtual object 8 targeted by the sharing setting and controls for selecting the users permitted to share it.
  • The sharing setting screen may also be a screen on which the sharing settings of a plurality of virtual objects can be made at once.
  • The user 5 designates the users permitted (or not permitted) to share the virtual object 8 by operating the above-described controller or the like on the sharing setting screen.
  • The sharing setting unit 15 acquires the setting information generated by this operation and sets the sharing setting information of the virtual object 8 based on it. Specifically, the sharing setting unit 15 accesses the virtual object information of the virtual object 8 stored in the virtual object storage unit 14 and sets or updates its sharing setting information.
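  • Continuing the hypothetical record sketch above, the set-or-update performed by the sharing setting unit 15 could look like the following (the function name and call are illustrative only):

```python
def apply_sharing_setting(store: dict, virtual_object_id: str,
                          permitted_users: set) -> None:
    """Set or update the sharing setting information of one stored virtual
    object based on the first user's setting information."""
    store[virtual_object_id].shared_with = set(permitted_users)

# Example: permit only "user-2" to share the virtual object "vo-1".
apply_sharing_setting(store, "vo-1", {"user-2"})
```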
  • The image generation unit 16 generates a virtual-space image showing at least part of the virtual space V including the virtual object 8 generated by the virtual object generation unit 13. Specifically, when the virtual object 8 falls within the virtual-space image displayed on the HMD 1 (an image at the angle determined from the motion information of the wearing user's head and eyes), the image generation unit 16 generates a virtual-space image that includes the virtual object 8.
  • When the virtual space V is shared by a plurality of users, the image generation unit 16 generates a virtual-space image for each user (for each HMD 1).
  • Here, the image generation unit 16 does not display a virtual object 8 whose sharing with the second user is not permitted in the virtual-space image shown on the second user's HMD 1 (the second HMD). That is, even if the virtual object 8 falls within the virtual-space image for the second HMD, the image generation unit 16 hides the virtual object 8 in that image.
  • Conversely, when sharing of the virtual object 8 with the second user is permitted and the virtual object 8 falls within the virtual-space image for the second HMD, the image generation unit 16 displays the virtual object 8 in that image.
  • The virtual-space image generated for each user (for each HMD 1) by the image generation unit 16 is transmitted to that user's HMD 1. Through this processing, each user sees via the HMD 1 a virtual-space image in which the virtual object 8 is shown or hidden according to the above-described sharing settings.
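  • Continuing the same sketch, the per-viewer visibility rule could be expressed as a filter run before rendering each user's image. One extra assumption here: the camera ID doubles as the owner's ID, so owners always see their own objects:

```python
def visible_objects(store: dict, viewer_id: str) -> list:
    """Virtual objects to draw in the virtual-space image for `viewer_id`.
    An unshared object is simply omitted (hidden) from the viewer's image."""
    return [rec for rec in store.values()
            if viewer_id == rec.camera_id or viewer_id in rec.shared_with]
```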
  • The object detection unit 17 detects, from a real-space image acquired by the image acquisition unit 12 after a virtual object was generated by the virtual object generation unit 13, the object corresponding to that virtual object.
  • That is, the object detection unit 17 detects the object when a subsequently acquired real-space image contains the same object as one that has already been objectified.
  • Specifically, the object detection unit 17 searches for virtual object information associated with: appearance information recognized by known image recognition (based on, e.g., the contour, color, and shape of the object) as similar, to at least a certain degree, to the appearance of an object in the newly acquired real-space image; a camera ID indicating the camera 2 that captured the newly acquired real-space image; and a generation time earlier than the time at which the newly acquired real-space image was captured.
  • When such virtual object information is extracted, the object detection unit 17 detects the object in the newly acquired real-space image as the object corresponding to the virtual object indicated by the extracted virtual object information.
  • For example, suppose that, after the virtual object 8 has been generated, the image acquisition unit 12 acquires a real-space image including the object 6B.
  • The object detection unit 17 searches for virtual object information associated with appearance information similar to the appearance of the object 6B in that real-space image, the camera ID of the camera 2 that captured the image, and a generation time earlier than the time the image was captured.
  • As a result, the virtual object information of the virtual object 8 is extracted.
  • The object detection unit 17 then detects the object 6B in the real-space image as the object corresponding to the virtual object 8.
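  • The three-condition search just described maps naturally onto a linear scan of the stored records; a minimal sketch, where the similarity function and the 0.8 threshold are placeholders for the "known image recognition" the text refers to:

```python
def find_existing_virtual_object(store: dict, appearance, camera_id: str,
                                 captured_at: float, similarity,
                                 threshold: float = 0.8):
    """Return a stored record whose appearance is sufficiently similar,
    whose camera ID matches, and whose generation time is earlier than the
    new image's capture time; None if no such record exists."""
    for rec in store.values():
        if (rec.camera_id == camera_id
                and rec.generated_at < captured_at
                and similarity(rec.appearance, appearance) >= threshold):
            return rec
    return None
```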
  • The virtual object update unit 18 updates the state of a virtual object based on the state of the corresponding object detected by the object detection unit 17.
  • In the example above, the virtual object update unit 18 updates the state of the virtual object 8 corresponding to the object 6B based on the state of the object 6B detected by the object detection unit 17.
  • The state of the object 6B in a real-space image acquired after the time the virtual object 8 was first generated may differ from the state of the object 6B at the time the virtual object 8 was generated.
  • For example, the screen of the object 6B (the notebook PC), which is part of its appearance, may differ at the later time from the screen at the time the virtual object 8 was generated.
  • In this case, the virtual object update unit 18 updates the state of the virtual object 8 corresponding to the object 6B (here, the content of its screen) to the content of the screen of the object 6B captured in the real-space image acquired at the later time. Specifically, the virtual object update unit 18 updates the appearance information in the virtual object information of the virtual object 8 stored in the virtual object storage unit 14 based on the screen content captured in the later real-space image, and changes the generation time in the virtual object information to the time of the update. Through this processing, the latest state of the object 6B in the real space can be reflected in the corresponding virtual object 8.
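  • Continuing the earlier record sketch, the update itself is small: overwrite the appearance and advance the generation time to the update time (both names are from the hypothetical sketch, not the patent):

```python
import time

def update_virtual_object(rec, new_appearance) -> None:
    """First process: replace the stored appearance with the newly captured
    one and set the generation time to the time of the update."""
    rec.appearance = new_appearance
    rec.generated_at = time.time()
```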
  • The virtual object generation unit 13, the object detection unit 17, and the virtual object update unit 18 described above may also execute the following processing.
  • When the object detection unit 17 detects, from a newly acquired real-space image, an object corresponding to an already generated virtual object, it accepts the user's selection of which to execute: a first process that updates the virtual object (that is, the processing of the virtual object update unit 18 described above) or a second process that generates a new virtual object corresponding to the object (that is, the processing of the virtual object generation unit 13 described above).
  • For example, the object detection unit 17 causes the user's HMD 1 to display a selection screen for choosing between the first process and the second process, and obtains the result of the user's selection operation (the user's selection) from the user.
  • When the user's selection indicates execution of the first process, the virtual object update unit 18 executes the first process.
  • When the user's selection indicates execution of the second process, the virtual object generation unit 13 executes the second process. With this configuration, the first process of updating an already created virtual object and the second process of creating a separate new virtual object can be switched appropriately according to the user's wishes, as sketched below.
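  • Continuing the earlier sketches (`VirtualObjectInfo` and `update_virtual_object` above), the user-selected branch could look like this; the choice strings and ID scheme are hypothetical:

```python
import time

def handle_detected_object(store: dict, existing_rec, new_appearance,
                           camera_id: str, user_choice: str) -> None:
    """Branch on the user's selection from the selection screen."""
    if user_choice == "update":           # first process (update unit 18)
        update_virtual_object(existing_rec, new_appearance)
    elif user_choice == "create_new":     # second process (generation unit 13);
        new_id = f"vo-{len(store) + 1}"   # hypothetical ID scheme
        store[new_id] = VirtualObjectInfo(new_id, new_appearance,
                                          time.time(), camera_id)
        # The new object and the existing one now coexist in the space.
```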
  • FIG. 3 is a sequence diagram showing processing until a virtual object is generated.
  • FIG. 4 is a sequence diagram showing processing from generation of a virtual object to display of a virtual space image corresponding to the sharing setting on each HMD 1.
  • FIG. 5 is a sequence diagram showing the processing performed when an object that has already been objectified is detected in a real-space image (updating the virtual object or creating a new one).
  • First, the information processing apparatus 10 generates a virtual space V shared by a plurality of users (step S1). Specifically, it generates a virtual space V in which various objects, such as the user objects associated with each user, are arranged at initial positions. Virtual-space data representing the generated virtual space V (an image of the virtual space seen from each user object) is transmitted to each user's HMD 1 (here, the first HMD and the second HMD) (step S2). Each user thereby experiences, via his or her HMD 1, virtual reality as if present in the virtual space V.
  • Subsequently, the first HMD instructs the camera 2 to start shooting in response to an operation on the controller or the like by the user 5 (first user) of the first HMD (step S3).
  • The camera 2, having received the shooting start instruction, starts shooting the real space including the specific area 4 (see FIG. 2) and captures video of the real space (step S4).
  • The video captured by the camera 2 is transmitted to the first HMD as needed (step S5) and displayed superimposed on the virtual-space image shown on the first HMD (step S6). For example, the video is shown in a small window (wipe) placed in a corner of the virtual-space image.
  • Subsequently, the first HMD instructs the camera 2 to acquire a real-space image in response to an operation on the controller or the like by the user 5 (step S7).
  • Here, the real-space image is a still image serving as the basis for extracting a virtual object.
  • The camera 2, having received the image acquisition instruction, acquires a real-space image of the real space including the specific area 4 (step S8).
  • The real-space image acquired by the camera 2 is transmitted to the first HMD (step S9) and displayed on the first HMD (step S10).
  • The first HMD acquires information indicating a target area containing the object to be objectified in the real-space image (here, an area containing the object 6B as an example) by receiving an operation on the controller or the like by the user 5 (step S11).
  • The real-space image acquired in step S9 and the information indicating the target area acquired in step S11 are then transmitted to the information processing apparatus 10 (step S12).
  • In the information processing apparatus 10, the image acquisition unit 12 acquires the real-space image and the target-area information transmitted in step S12 (step S13).
  • Subsequently, the virtual object generation unit 13 generates the virtual object 8 corresponding to the object 6B contained in the target area by executing known image recognition on the target area in the real-space image (step S14).
  • The virtual object information on the virtual object 8 is stored in the virtual object storage unit 14.
  • Subsequently, the sharing setting unit 15 transmits data such as the appearance of the virtual object 8 to the first HMD (step S15), and the above-described sharing setting screen (for example, a setting screen showing the appearance of the virtual object 8 targeted by the sharing setting) is displayed on the first HMD (step S16).
  • The first HMD (for example, a controller attached to the first HMD) acquires setting information indicating the content of the sharing setting input by the user 5 on the sharing setting screen (step S17) and transmits it to the information processing apparatus 10 (step S18).
  • The sharing setting unit 15 sets the sharing setting information of the virtual object 8 based on this setting information (step S19).
  • Subsequently, the image generation unit 16 generates a virtual-space image showing at least part of the virtual space V including the virtual object 8 generated by the virtual object generation unit 13 (step S20).
  • In the present embodiment, the image generation unit 16 generates a virtual-space image for each user (each HMD 1), transmitting the image for the first HMD to the first HMD and the image for the second HMD to the second HMD (steps S21 and S22).
  • The virtual-space images are then displayed on the first HMD and the second HMD, respectively (steps S23 and S24).
  • Here, when sharing of the virtual object 8 with the second user is not permitted, the image generation unit 16 does not display the virtual object 8 in the virtual-space image for the second HMD in step S20. That is, the image generation unit 16 generates a virtual-space image in which the virtual object 8 is hidden, and the virtual object 8 does not appear in the image displayed on the second HMD in step S24.
  • On the other hand, when sharing of the virtual object 8 with the second user is permitted, the image generation unit 16 displays the virtual object 8 in the virtual-space image for the second HMD in step S20. As a result, the virtual object 8 appears in the image displayed on the second HMD in step S24.
  • The processes of steps S31 to S36 are the same as those of steps S8 to S13, so their detailed description is omitted.
  • Following step S36, the object detection unit 17 detects the object 6B corresponding to the already generated virtual object 8 from the real-space image acquired in step S36 (step S37).
  • Subsequently, the object detection unit 17 accepts the selection of the user 5 as to which to execute: the first process of updating the virtual object 8 (the processing of the virtual object update unit 18) or the second process of generating a new virtual object corresponding to the object 6B, separate from the already generated virtual object 8 (the processing of the virtual object generation unit 13).
  • Specifically, the object detection unit 17 notifies the first HMD that the object 6B corresponding to the already generated virtual object 8 has been detected in the real-space image (step S38).
  • For example, the object detection unit 17 causes the first HMD to display a notification pop-up or the like.
  • The first HMD (its controller or the like) accepts the selection of the user 5 between the first process and the second process (step S39) and transmits the result of the selection to the information processing apparatus 10 (step S40).
  • Subsequently, the information processing apparatus 10 executes the process selected by the user 5 (step S41). Specifically, when the object detection unit 17 receives a selection indicating execution of the first process, the virtual object update unit 18 executes the first process. In this example, the virtual object update unit 18 updates the state of the virtual object 8 based on the state of the object 6B detected in the real-space image acquired in step S36.
  • When the object detection unit 17 receives a selection indicating execution of the second process, the virtual object generation unit 13 executes the second process.
  • In this example, the virtual object generation unit 13 generates a new virtual object based on the state of the object 6B detected in the real-space image acquired in step S36. In this case, the newly generated virtual object and the already generated virtual object 8 coexist in the virtual space V.
  • As described above, in the information processing apparatus 10, an object 6 included in a real-space image obtained by imaging the real space near the user 5 is generated as a virtual object 8 in the virtual space V, and a virtual-space image including the virtual object 8 (a virtual-space image in which the virtual object 8 appears) is generated.
  • The user 5 wearing the HMD 1 can thus visually recognize, via the virtual-space image, an object 6 present in his or her vicinity. Therefore, the information processing apparatus 10 can improve the convenience of the virtual reality experience of the user 5.
  • The information processing apparatus 10 further includes the object detection unit 17 and the virtual object update unit 18.
  • The object detection unit 17 detects when a real-space image including the already objectified object 6B is acquired again.
  • The virtual object 8 corresponding to the object 6B can then be updated based on the state of the object 6B included in that real-space image.
  • This lets the user grasp the latest state of the object 6B in the real space through the virtual object 8 in the virtual space V.
  • When the object detection unit 17 detects the object 6B corresponding to the virtual object 8 in an acquired real-space image, it accepts the user's selection between the first process of updating the virtual object 8 and the second process of generating a new virtual object corresponding to the object 6B.
  • When the selection indicates the first process, the virtual object update unit 18 executes it; when the selection indicates the second process, the virtual object generation unit 13 executes it. With this configuration, updating the existing virtual object 8 and generating a new virtual object can be switched appropriately according to the user's wishes.
  • The virtual object generation unit 13 generates a virtual object 8 corresponding to the object 6 designated by the user 5 (the object 6B in the example of FIG. 2) among the plurality of objects 6 included in the real-space image (the objects 6A and 6B in the example of FIG. 2).
  • With this configuration, unnecessary objectification processing can be omitted, reducing the processing load on the processor and suppressing growth in the amount of memory consumed by unneeded virtual objects.
  • The virtual space V is a space shared at least by the first user wearing the first HMD and the second user wearing the second HMD, and the information processing apparatus 10 includes the sharing setting unit 15 described above.
  • The image generation unit 16 does not display a virtual object whose sharing with the second user is not permitted in the virtual-space image shown on the second HMD. With this configuration, by making the sharing settings described above for each virtual object, a specific virtual object (for example, one corresponding to a document containing confidential information) can be made viewable only by users of a specific job title or higher. This enables business communication such as meetings via the virtual space V to proceed more smoothly.
  • Each functional block may be realized by one physically and/or logically coupled device, or by two or more physically and/or logically separated devices connected directly and/or indirectly (for example, by wire and/or wirelessly).
  • The information processing apparatus 10 in the above embodiment may function as a computer that performs the processing described in the above embodiment.
  • FIG. 6 is a diagram showing an example of the hardware configuration of the information processing apparatus 10 according to the present embodiment.
  • The above-described information processing apparatus 10 may be physically configured as a computer apparatus including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like.
  • In the following description, the term "device" can be read as a circuit, a unit, or the like.
  • The hardware configuration of the information processing apparatus 10 may include one or more of each of the devices shown in FIG. 6, or may omit some of the devices.
  • Each function in the information processing apparatus 10 is realized by loading predetermined software (programs) onto hardware such as the processor 1001 and the memory 1002, and by the processor 1001 performing operations and controlling communication by the communication device 1004 and the reading and/or writing of data in the memory 1002 and the storage 1003.
  • The processor 1001 controls the entire computer by, for example, running an operating system.
  • The processor 1001 may be configured as a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, registers, and the like.
  • The processor 1001 reads programs (program code), software modules, and/or data from the storage 1003 and/or the communication device 1004 into the memory 1002 and executes various processes according to them.
  • As the programs, programs that cause a computer to execute at least part of the operations described in the above embodiment are used.
  • For example, the virtual object generation unit 13 of the information processing apparatus 10 may be realized by a control program that is stored in the memory 1002 and runs on the processor 1001, and the other functional blocks shown in FIG. 1 may be realized in the same way.
  • The various processes described above were explained as being executed by a single processor 1001, but they may be executed simultaneously or sequentially by two or more processors 1001.
  • The processor 1001 may be implemented by one or more chips.
  • The programs may be transmitted from a network via a telecommunication line.
  • The memory 1002 is a computer-readable recording medium and may be configured by, for example, at least one of ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), and RAM (Random Access Memory).
  • The memory 1002 may also be called a register, a cache, a main memory (main storage device), or the like.
  • The memory 1002 can store executable programs (program code), software modules, and the like for carrying out the information processing method according to the above embodiment (for example, the procedures shown in the sequence diagrams of FIGS. 3 to 5).
  • The storage 1003 is a computer-readable recording medium and may be, for example, at least one of an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, and the like.
  • The storage 1003 may be called an auxiliary storage device.
  • The above-described recording medium may be, for example, a database including the memory 1002 and/or the storage 1003, a server, or another appropriate medium.
  • The communication device 1004 is hardware (a transmitting/receiving device) for communication between computers via a wired and/or wireless network, and is also called, for example, a network device, a network controller, a network card, or a communication module.
  • The input device 1005 is an input device that receives input from the outside (for example, a keyboard, a mouse, a microphone, a switch, a button, or a sensor).
  • The output device 1006 is an output device that produces output to the outside (for example, a display, a speaker, or an LED lamp).
  • The input device 1005 and the output device 1006 may be integrated (for example, as a touch panel).
  • The devices such as the processor 1001 and the memory 1002 are connected by a bus 1007 for communicating information.
  • The bus 1007 may be configured as a single bus, or different buses may be used between different pairs of devices.
  • The information processing apparatus 10 may include hardware such as a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a field-programmable gate array (FPGA), and some or all of the functional blocks may be realized by that hardware.
  • For example, the processor 1001 may be implemented by at least one of these types of hardware.
  • Input and output information and the like may be stored in a specific place (for example, a memory) or managed in a management table. Input and output information can be overwritten, updated, or appended; output information may be deleted; and input information may be transmitted to another device.
  • Determinations may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by comparison of numerical values (for example, comparison with a predetermined value).
  • Regardless of whether it is called software, firmware, middleware, microcode, hardware description language, or another name, software should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, and the like.
  • Software, instructions, and the like may be transmitted and received via a transmission medium. For example, when software is transmitted from a website, a server, or another remote source using wired technology such as coaxial cable, optical fiber cable, twisted pair, or digital subscriber line (DSL) and/or wireless technology such as infrared, radio, or microwave, these wired and/or wireless technologies are included within the definition of a transmission medium.
  • Data, instructions, commands, information, signals, bits, symbols, chips, and the like mentioned in this description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or photons, or any combination of these.
  • Information, parameters, and the like described in this specification may be represented by absolute values, by values relative to a predetermined value, or by other corresponding information.
  • The phrase "based on" does not mean "based only on," unless expressly stated otherwise. In other words, the phrase "based on" means both "based only on" and "based at least on."
  • The term "determining" as used herein may encompass a wide variety of operations. "Determining" may include, for example, judging, calculating, computing, processing, deriving, investigating, and looking up (for example, searching a table, a database, or another data structure), as well as ascertaining. "Determining" may also include receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, and accessing (for example, accessing data in a memory). Furthermore, "determining" may include resolving, selecting, choosing, establishing, comparing, and the like. That is, "determining" may include regarding some operation as having been "determined."
  • Reference signs: 1 ... HMD (display device), 5 ... user, 6, 6A, 6B ... object, 7 ... user object, 8 ... virtual object, 10 ... information processing apparatus, 12 ... image acquisition unit, 13 ... virtual object generation unit

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to one embodiment, the present invention relates to an information processing device (10) for providing an image of a virtual space (V) to be displayed on a head-mounted display (HMD) (1) worn by a user. The information processing device (10) comprises: an image acquisition unit (12) for acquiring a real-space image obtained by imaging a real space in the vicinity of the user; a virtual object generation unit (13) for recognizing an object (6) included in the real-space image and generating, in the virtual space (V), a virtual object (8) corresponding to the object (6); and an image generation unit (16) for generating a virtual-space image to be displayed on the HMD (1), showing at least part of the virtual space (V) including the virtual object (8).
PCT/JP2018/044278 2017-12-26 2018-11-30 Dispositif de traitement d'informations WO2019130991A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017249034A JP2021043476A (ja) 2017-12-26 2017-12-26 情報処理装置
JP2017-249034 2017-12-26

Publications (1)

Publication Number Publication Date
WO2019130991A1 true WO2019130991A1 (fr) 2019-07-04

Family

ID=67063503

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/044278 WO2019130991A1 (fr) 2017-12-26 2018-11-30 Dispositif de traitement d'informations

Country Status (2)

Country Link
JP (1) JP2021043476A (fr)
WO (1) WO2019130991A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021068195A (ja) * 2019-10-24 2021-04-30 克己 横道 情報処理システム、情報処理方法およびプログラム
JP2021162876A (ja) * 2020-03-30 2021-10-11 日産自動車株式会社 画像生成システム、画像生成装置及び画像生成方法
JP2022032540A (ja) * 2020-08-12 2022-02-25 武志 小畠 赤外線調査解析診断装置

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2022269888A1 (fr) * 2021-06-25 2022-12-29
WO2024047720A1 (fr) * 2022-08-30 2024-03-07 京セラ株式会社 Procédé de partage d'image virtuelle et système de partage d'image virtuelle

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014197348A (ja) * 2013-03-29 2014-10-16 キヤノン株式会社 サーバ装置、情報処理方法及びプログラム
WO2015111283A1 (fr) * 2014-01-23 2015-07-30 ソニー株式会社 Dispositif d'affichage d'images et procédé d'affichage d'images

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014197348A (ja) * 2013-03-29 2014-10-16 キヤノン株式会社 サーバ装置、情報処理方法及びプログラム
WO2015111283A1 (fr) * 2014-01-23 2015-07-30 ソニー株式会社 Dispositif d'affichage d'images et procédé d'affichage d'images

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021068195A (ja) * 2019-10-24 2021-04-30 克己 横道 情報処理システム、情報処理方法およびプログラム
JP7023005B2 (ja) 2019-10-24 2022-02-21 克己 横道 情報処理システム、情報処理方法およびプログラム
JP2021162876A (ja) * 2020-03-30 2021-10-11 日産自動車株式会社 画像生成システム、画像生成装置及び画像生成方法
JP7413122B2 (ja) 2020-03-30 2024-01-15 日産自動車株式会社 画像生成システム、画像生成装置及び画像生成方法
JP2022032540A (ja) * 2020-08-12 2022-02-25 武志 小畠 赤外線調査解析診断装置
JP7298921B2 (ja) 2020-08-12 2023-06-27 株式会社赤外線高精度技術利用機構 赤外線調査解析診断装置

Also Published As

Publication number Publication date
JP2021043476A (ja) 2021-03-18

Similar Documents

Publication Publication Date Title
WO2019130991A1 (fr) Dispositif de traitement d'informations
US11366516B2 (en) Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device
WO2021135601A1 (fr) Procédé et appareil de photographie auxiliaire, équipement terminal, et support d'enregistrement
US10254847B2 (en) Device interaction with spatially aware gestures
US11282481B2 (en) Information processing device
KR20160145976A (ko) 영상 공유 방법 및 이를 수행하는 전자 장치
CN110136228B (zh) 虚拟角色的面部替换方法、装置、终端及存储介质
CN111432245B (zh) 多媒体信息的播放控制方法、装置、设备及存储介质
EP3772217A1 (fr) Appareil de commande de sortie, terminal d'affichage, système de commande à distance, procédé de commande et support d'enregistrement
WO2022057435A1 (fr) Système de réponse à une question basée sur une recherche, et support de stockage
CN111259183A (zh) 图像识图方法、装置、电子设备和介质
CN113613028A (zh) 直播数据处理方法、装置、终端、服务器及存储介质
CN112988789A (zh) 医学数据查询方法、装置及终端
CN114143280A (zh) 会话显示方法、装置、电子设备及存储介质
JP7094759B2 (ja) システム、情報処理方法及びプログラム
CN112948690A (zh) 搜索方法、装置、设备及存储介质
EP4035353A1 (fr) Appareil, système de traitement d'image, système de communication, procédé de réglage, procédé de traitement d'image et support d'enregistrement
WO2023037812A1 (fr) Système de prise en charge de dialogue en ligne
JP7267105B2 (ja) 情報処理装置及びプログラム
WO2023079875A1 (fr) Dispositif de traitement d'informations
WO2023026634A1 (fr) Dispositif de commande d'affichage
WO2023149379A1 (fr) Dispositif de traitement d'informations
JP2024075801A (ja) 表示制御装置
JP2023181639A (ja) 情報処理装置
JP2022069212A (ja) 制御装置、プログラム、及びシステム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18896643

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18896643

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP