WO2024075565A1 - Display control device and display control method - Google Patents

Display control device and display control method

Info

Publication number
WO2024075565A1
Authority
WO
WIPO (PCT)
Prior art keywords
display control
control device
user
controller
detection unit
Prior art date
Application number
PCT/JP2023/034684
Other languages
French (fr)
Japanese (ja)
Inventor
毅 石川
京二郎 永野
洋史 湯浅
郁男 山野
Original Assignee
Sony Group Corporation
Priority date
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2024075565A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Definitions

  • This disclosure relates to a display control device and a display control method for controlling operations on virtual objects.
  • VR: Virtual Reality
  • AR: Augmented Reality
  • UI: User Interface
  • 6DoF: Six Degrees of Freedom
  • Patent Document 1 discloses a mechanism for assigning a physical controller to a virtual controller.
  • However, controllers for air operation, which are primarily used in VR and AR, are currently difficult to use for creating or editing 3D models in applications that require high input accuracy (such as CAD).
  • this disclosure proposes a display control device and a display control method that can improve input accuracy.
  • a display control device includes an acquisition unit that acquires position and orientation information in a three-dimensional space from a first controller, and a display control unit that displays a virtual coordinate plane in the three-dimensional space on which two-dimensional coordinates can be specified based on the position and orientation information in the three-dimensional space.
  • FIG. 1 is a diagram showing an overview of a display control process according to the embodiment.
  • FIG. 2 is a diagram (1) showing an example of the display control process according to the embodiment.
  • FIG. 3 is a diagram (2) showing an example of the display control process according to the embodiment.
  • FIG. 4 is a diagram illustrating an example of the configuration of the display control device according to the embodiment.
  • FIG. 5 is a sequence diagram showing a flow of the display control process according to the embodiment.
  • FIG. 6A is a diagram for explaining a device operation according to a first modified example.
  • FIG. 6B is a flowchart showing the flow of the display control process according to the first modified example.
  • FIG. 7 is a diagram (1) for explaining the display control process according to a second modified example.
  • FIG. 8 is a diagram (2) for explaining the display control process according to the second modified example.
  • FIG. 9 is a diagram (3) for explaining the display control process according to the second modified example.
  • FIG. 10 is a diagram (1) for explaining the display control process according to a third modified example.
  • FIG. 11 is a diagram (2) for explaining the display control process according to the third modified example.
  • FIG. 12 is a diagram (3) for explaining the display control process according to the third modified example.
  • FIG. 13 is a diagram (4) for explaining the display control process according to the third modified example.
  • FIG. 14 is a diagram (5) for explaining the display control process according to the third modified example.
  • FIG. 15 is a diagram (6) for explaining the display control process according to the third modified example.
  • FIG. 16 is a diagram (7) for explaining the display control process according to the third modified example.
  • FIG. 17 is a diagram (8) for explaining the display control process according to the third modified example.
  • FIG. 18 is a diagram for explaining a display control process according to a fourth modified example.
  • FIG. 19 is a hardware configuration diagram illustrating an example of a computer that realizes the functions of the display control device.
  • 1. Embodiment
      1-1. Overview of display control processing according to embodiment
      1-2. Configuration of display control device according to embodiment
      1-3. Processing procedure according to embodiment
      1-4. Modifications
        1-4-1. First modification: Operation of only one device
        1-4-2. Second modification: Adjustment of operation feel
        1-4-3. Third modification: Processing of superimposition on real space
        1-4-4. Fourth modification: Deformation of surface based on pointer position
    2. Other embodiments
    3. Effects of display control device according to the present disclosure
    4. Hardware configuration
  • Fig. 1 is a diagram showing an overview of the display control process according to the embodiment.
  • Fig. 1 shows components of a display control system 1 that executes the display control process according to the embodiment.
  • the display control system 1 includes a display control device 100, a three-dimensional device 20, and a planar device 30.
  • the display control device 100 is an information processing terminal for implementing VR and AR technologies.
  • the display control device 100 is a wearable display that is worn on the head of a user 10.
  • the display control device 100 is, for example, a head mounted display (HMD) or AR glasses.
  • the display control device 100 has a closed or transmissive display unit (display).
  • the display control device 100 displays a virtual space expressed by CG (Computer Graphics) or the like.
  • the display control device 100 displays a virtual object expressed by CG or the like on the display unit by superimposing it on real space.
  • the display control device 100 displays an object 70 as an example of a virtual object. That is, in the example of FIG. 1, the object 70 and the space viewed by the user 10 are assumed to be a virtual space configured by CG or the like. Note that, as will be described later, the space viewed by the user 10 may be a so-called AR space in which a virtual object is superimposed on real space.
  • the display control device 100 may have a configuration for outputting a predetermined output signal in addition to the display unit.
  • the display control device 100 may have a speaker or the like for outputting sound.
  • the 3D device 20 is an example of an input device according to an embodiment.
  • the 3D device 20 is operated by the user 10 and is used to input various information to the display control device 100.
  • the 3D device 20 is a VR controller capable of inputting so-called 6DoF information.
  • the 3D device 20 is equipped with sensors such as an inertial sensor, an acceleration sensor, and a gravity sensor, and detects position and orientation information of the device itself. The 3D device 20 then transmits the detected position and orientation information of the device itself to the display control device 100.
  • the three-dimensional device 20 may be a controller held by the user 10 in one hand as shown in FIG. 1, or may be a pen-shaped controller.
  • the three-dimensional device 20 is not limited to the example, and may be any device capable of acquiring position information in real space.
  • the three-dimensional device 20 may be an air mouse, a digital camera, a smartphone, or the like.
  • as long as the display control device 100 can capture the position and orientation information of the three-dimensional device 20, the three-dimensional device 20 does not need to have a sensor of its own.
  • the three-dimensional device 20 may be a specified object, a human face, a finger, or the like, equipped with a marker that can be recognized by the display control device 100 or a specified external device (such as a video camera installed in real space).
  • the planar device 30 is an example of an input device according to an embodiment.
  • the planar device 30 is operated by the user 10 and is used to input various information to the display control device 100.
  • the planar device 30 is, for example, a mouse controller or a trackball mouse that can input two-dimensional coordinate information.
  • the planar device 30 is equipped with a sensor such as an optical sensor, and detects coordinate information (two-dimensional information) on a plane based on the operation of the planar device. The planar device 30 then transmits the detected coordinate information to the display control device 100.
  • the display control device 100 solves this problem by the following configuration. That is, the display control device 100 acquires position and orientation information in three-dimensional space from the first controller, and displays a virtual coordinate plane in three-dimensional space on which two-dimensional coordinates can be specified based on the position and orientation information in the three-dimensional space. Furthermore, the display control device 100 detects coordinates on the virtual coordinate plane from the second controller, thereby enabling the user 10 to accurately select a desired position in space.
  • the first controller is a three-dimensional device 20 that the user 10 holds in one hand.
  • the second controller is the planar device 30 that the user 10 holds in the hand other than the hand holding the first controller (e.g., the dominant hand of the user 10).
  • the display control device 100 combines two different devices to enable the user 10 to select any position. Specifically, the display control device 100 displays a virtual coordinate plane (plane 50 shown in FIG. 1) in space, and uses the three-dimensional device 20 as an input means for moving the plane 50. The display control device 100 also displays a pointer 60 on the plane 50, and uses the planar device 30 as an input means for moving the pointer 60 two-dimensionally. This allows the user 10 to perform highly accurate pointing operations in virtual space by inputting using the planar device 30, which can specify the precise position of the pointer 60, after displaying the plane 50 at any location in space (for example, near the object 70).
  • the display control device 100 displays a surface 50 in a virtual space.
  • a virtual handle is set on the surface 50, and by operating the three-dimensional device 20 held in the hand, the user 10 can obtain feedback as if he or she were grasping the handle and moving the surface 50.
  • the user 10 places the surface 50 near the object 70.
  • the user 10 then operates the planar device 30 placed on the table to specify the position of the pointer 60 on the surface 50.
  • the coordinates on the surface 50 and the spatial coordinates in the virtual space are associated with each other.
  • the display control device 100 can, for example, obtain an intersection point between the extension line of the position of the pointer 60 and the outline (3D model) of the object 70 based on the position and orientation (angle) of the surface 50.
  • the display control device 100 obtains spatial coordinates 65, which is an arbitrary point on the object 70, as the position corresponding to the pointer 60.
  • the user 10 can indicate a specific position on the object 70 by specifying the position of the pointer 60 on the surface 50. This allows the user 10 to select a point, edge, face, etc. that he or she wants to edit from the object 70 with high accuracy.
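  • As a concrete illustration of how a pointer position on the surface 50 could be mapped to a point on the object 70, the following minimal Python sketch converts the 2D coordinates from the planar device 30 into world coordinates using the pose of the surface 50 and then intersects an extension line with a stand-in 3D model (a sphere). All names and the sphere stand-in are illustrative assumptions, not the patent's implementation.

        import numpy as np

        def pointer_to_world(plane_origin, plane_axes, uv):
            # plane_axes columns: u axis, v axis, normal of surface 50 (from the
            # 6DoF pose of the three-dimensional device 20); uv comes from the
            # planar device 30.
            return plane_origin + uv[0] * plane_axes[:, 0] + uv[1] * plane_axes[:, 1]

        def intersect_sphere(origin, direction, center, radius):
            # Toy stand-in for intersecting the extension line with the 3D model.
            oc = origin - center
            b = np.dot(oc, direction)
            disc = b * b - (np.dot(oc, oc) - radius ** 2)
            if disc < 0:
                return None                      # the line misses the object
            t = -b - np.sqrt(disc)
            return origin + t * direction if t >= 0 else None

        plane_origin = np.array([0.0, 1.0, 0.5])
        plane_axes = np.eye(3)                   # u = x, v = y, normal = z
        pointer_world = pointer_to_world(plane_origin, plane_axes, np.array([0.1, 0.05]))
        hit = intersect_sphere(pointer_world, plane_axes[:, 2],
                               center=np.array([0.1, 1.05, 1.5]), radius=0.3)
        print("pointer 60 in space:", pointer_world, "point on object 70:", hit)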
  • FIG. 2 is a diagram (1) showing an example of the display control process according to the embodiment.
  • the example in FIG. 2 shows a state in which the user 10 operates the three-dimensional device 20 (step S1) and moves the surface 50 in a direction toward the same object 70 as in FIG. 1.
  • the user 10 can more accurately specify a specific position of the object 70 by moving the surface 50 to a position where it is almost in contact with the object 70.
  • the example in FIG. 2 shows a situation in which the user 10 moves the surface 50, and then operates the planar device 30 to select a specific edge 66 of the object 70.
  • FIG. 3 is a diagram (2) showing an example of the display control process according to the embodiment.
  • the example shown in the upper part of FIG. 3 shows a state in which the user 10 operates the three-dimensional device 20 to move the surface 50 in a direction toward the object 70, similar to FIG. 2.
  • the position of the surface 50 is fixed by the operation of the user 10.
  • the user 10 can switch the operation target to the object 70 and operate the object 70 so as to move it closer to the surface 50.
  • the user 10 can also move the surface 50 and the object 70 relatively closer to each other, thereby enabling accurate input.
  • the display control process displays a surface 50 in space, allowing the three-dimensional device 20 and the two-dimensional device 30 to be used together to specify a position in space, thereby improving operability and input accuracy.
  • each device in FIG. 1 conceptually represents a function in the display control system 1, and may take various forms depending on the embodiment.
  • the display control device 100 may be composed of two or more devices each having a different function, which will be described later.
  • the display control device 100 may be a device in which the display unit and the information processing unit are separately configured.
  • the information processing unit of the display control device 100 may be any information processing device, such as a server or a PC (Personal Computer).
  • Fig. 4 is a diagram showing an example of the configuration of the display control device 100 according to an embodiment.
  • the display control device 100 has a communication unit 110, a storage unit 120, a control unit 130, a sensor unit 140, and a display unit 150.
  • the display control device 100 may also have an input unit (such as an operation button or a touch panel) that accepts various operations from a user 10 who operates the display control device 100.
  • the communication unit 110 is realized, for example, by a NIC (Network Interface Card) or the like.
  • the communication unit 110 is connected to a network N by wire or wirelessly, and transmits and receives information to and from the three-dimensional device 20 and the planar device 30 via the network N.
  • the network N is realized, for example, by the Internet, or by a wireless communication standard or method such as Bluetooth (registered trademark), Wi-Fi (registered trademark), UWB (Ultra Wide Band), or LPWA (Low Power Wide Area).
  • the storage unit 120 is realized, for example, by a semiconductor memory element such as a random access memory (RAM) or a flash memory, or a storage device such as a hard disk or an optical disk.
  • the storage unit 120 stores various information related to the display control process according to the embodiment.
  • the storage unit 120 stores a 3D model of a virtual object to be displayed on the display unit 150.
  • the sensor unit 140 is a sensor that detects various types of environmental information.
  • the sensor unit 140 includes an outward-facing camera that captures images of the outside of the display control device 100 and an inward-facing camera that captures images of the user 10.
  • the sensor unit 140 functions as a recognition camera for recognizing the space in front of the user's eyes.
  • the sensor unit 140 recognizes the spatial position of the three-dimensional device 20 by photographing the three-dimensional device 20, whose spatial position has been calibrated in advance, and acquires position and orientation information of the three-dimensional device 20.
  • the sensor unit 140 also recognizes a subject (e.g., a real object located in real space) located in front of the display control device 100.
  • the sensor unit 140 acquires an image of the subject located in front of the user and, when the outward-facing camera is configured as a stereo camera, can calculate the distance from the display control device 100 (in other words, the position of the user's viewpoint) to the subject based on the parallax between the captured images.
  • the sensor unit 140 may detect the distance in real space using a depth sensor capable of detecting the distance to a real object.
  • the sensor unit 140 may have a function of detecting various information related to the user's motion, such as the orientation, inclination, motion, and speed of the user's body. Specifically, the sensor unit 140 detects information related to the user's motion, such as information related to the user's head and posture, the motion of the user's head and body (acceleration and angular velocity), the direction of the field of view, and the speed of the viewpoint movement. For example, the sensor unit 140 functions as various motion sensors such as a three-axis acceleration sensor, a gyro sensor, and a speed sensor, and detects information related to the user's motion.
  • the sensor unit 140 detects a change in at least one of the position and posture of the user's head by detecting the yaw, pitch, and roll components of the motion of the user's head.
  • the sensor unit 140 does not necessarily need to be provided in the display control device 100, and may be, for example, an external sensor connected to the display control device 100 by wire or wirelessly.
  • the display unit 150 displays various information output from the control unit 130.
  • the display unit 150 is a display that outputs video to the user 10.
  • the display unit 150 may also include an audio output unit (such as a speaker) that outputs audio.
  • the control unit 130 is realized, for example, by a CPU (Central Processing Unit), MPU (Micro Processing Unit), GPU, etc., executing a program stored inside the display control device 100 using a RAM or the like as a working area.
  • the control unit 130 is also a controller, and may be realized, for example, by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array).
  • ASIC Application Specific Integrated Circuit
  • FPGA Field Programmable Gate Array
  • control unit 130 has an acquisition unit 131, a first detection unit 132, a second detection unit 133, and a display control unit 134.
  • the acquisition unit 131 acquires various types of information. For example, the acquisition unit 131 acquires position and orientation information in three-dimensional space from the three-dimensional device 20, which is the first controller. The acquisition unit 131 also acquires position information, which is information indicating a position on the surface 50, from the planar device 30, which is the second controller.
  • the first detection unit 132 detects the movement of the surface 50 in three-dimensional space based on the input from the three-dimensional device 20. Specifically, the first detection unit 132 detects (calculates) the spatial position and orientation information of the three-dimensional device 20 by combining the image of the three-dimensional device 20 captured by the sensor unit 140 and the position and orientation information transmitted from the three-dimensional device 20. That is, the position and orientation information of the three-dimensional device 20 is obtained by merging the image information captured by the outward-facing camera of the sensor unit 140 and the IMU (Inertial Measurement Unit) information provided in the three-dimensional device 20.
  • the input from the three-dimensional device 20 refers to the position and orientation information (or its variation) measured by the IMU or the like as a result of the three-dimensional device 20 being operated in space, which is input to the display control device 100 via the communication unit 110.
  • the first detection unit 132 detects the movement and position/orientation information of the surface 50 by detecting the fluctuation value of the position/orientation information obtained based on the result of the operation of the three-dimensional device 20 by the user 10 holding a handle virtually attached to the surface 50.
  • when the three-dimensional device 20 is provided with infrared LEDs, the first detection unit 132 can detect the position and posture of the three-dimensional device 20 by observing the light emitted by the infrared LEDs with the outward-facing camera of the sensor unit 140 and combining the result with the IMU information.
  • the first detection unit 132 can also detect the position and posture of the three-dimensional device 20 by using hand tracking technology, in which the hand of the user 10 is captured by the camera of the sensor unit 140 or a camera installed in the room and the position and posture of the hand are estimated.
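  • The merging of camera-based recognition and IMU information described above is not specified in detail in the publication; as one hedged illustration, a simple complementary filter could blend the drifting IMU prediction with the absolute camera estimate (the class and parameter names below are hypothetical).

        import numpy as np

        class PoseFusion:
            def __init__(self, blend=0.98):
                self.position = np.zeros(3)   # fused position of the three-dimensional device 20
                self.blend = blend            # weight kept by the IMU prediction

            def predict_from_imu(self, velocity, dt):
                # Fast dead-reckoning step from IMU-derived velocity (drifts over time).
                self.position = self.position + velocity * dt

            def correct_from_camera(self, camera_position):
                # Slower absolute correction from the outward-facing camera.
                self.position = self.blend * self.position + (1.0 - self.blend) * camera_position

        fusion = PoseFusion()
        for step in range(100):
            fusion.predict_from_imu(velocity=np.array([0.01, 0.0, 0.0]), dt=1 / 60)
            if step % 10 == 0:                # camera updates arrive less frequently
                fusion.correct_from_camera(np.array([step / 6000.0, 0.0, 0.0]))
        print("fused position:", fusion.position)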
  • the first detection unit 132 may also detect the relative movement of the surface 50, rather than the movement of the surface 50 itself. In other words, the first detection unit 132 may detect the movement of an object whose position can be specified using the surface 50, based on an input from the three-dimensional device 20. In this case, the first detection unit 132 receives the designation of the object that the user 10 wishes to move, and performs processing to move the object closer to or farther away from the surface 50 in accordance with the operation of the user 10.
  • the second detection unit 133 detects a position specified on the surface 50 (for example, the position of a pointer indicating the specified position) based on an input from the planar device 30, which is the second controller. Note that since the surface 50 is given position and orientation information (such as the inclination of the plane) in space based on the operation of the three-dimensional device 20, the pointer located on the surface 50 also includes not only a two-dimensional position (the coordinates of the surface 50) but also position and orientation information corresponding to the surface 50.
  • the second detection unit 133 also detects a selected part of the object based on the position specified on the surface 50. For example, the second detection unit 133 detects a part of the object, such as a point, edge, or surface, that corresponds to the position indicated by the pointer, based on the specification by the user 10 or initial settings.
  • the second detection unit 133 may detect the part selected on the object based on the relationship between the angle of the surface 50 and the position of the pointer, and the position of the object. Specifically, the second detection unit 133 detects the point on the extension line of the pointer, which is determined from the angle of the surface 50 and the position of the pointer, that intersects with the object as the part designated by the user 10.
  • the second detection unit 133 may detect, as the part specified by the user 10, a point where an extension of a line normal to the surface 50, starting from the pointer position, intersects with the object.
  • the second detection unit 133 may detect, as the part specified by the user 10, a point where an extension of a line connecting the viewpoint position of the user 10 and the pointer position intersects with the object.
  • the second detection unit 133 may detect the part selected on the object based on the position where a line extending horizontally from the specified pointer position on the surface 50 intersects with the object, regardless of the angle of the surface 50.
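  • For illustration, the three ways of constructing the extension line from the pointer described above (along the normal of the surface 50, along the line from the user's viewpoint through the pointer, or horizontally regardless of the plane's angle) could be organized as in the following sketch; the mode names and vectors are assumptions.

        import numpy as np

        def normalize(v):
            return v / np.linalg.norm(v)

        def pointer_ray(mode, pointer_pos, plane_normal, viewpoint=None):
            """Return the origin and direction of the line used to pick a part of the object."""
            if mode == "plane_normal":
                return pointer_pos, normalize(plane_normal)
            if mode == "viewpoint":
                return pointer_pos, normalize(pointer_pos - viewpoint)
            if mode == "horizontal":
                return pointer_pos, np.array([0.0, 0.0, 1.0])   # assumed horizontal forward axis
            raise ValueError(mode)

        pointer = np.array([0.1, 1.0, 0.5])
        normal = normalize(np.array([0.0, 0.2, 1.0]))
        eye = np.array([0.0, 1.6, -0.3])
        for mode in ("plane_normal", "viewpoint", "horizontal"):
            print(mode, pointer_ray(mode, pointer, normal, viewpoint=eye)[1])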
  • the display control unit 134 controls the display unit 150 to display the information output from the control unit 130.
  • the display control unit 134 outputs the virtual space image rendered as video content to the output destination device.
  • the output destination device is not limited to a head-mounted display such as the display unit 150, but may also be an external display, a smartphone, a television, or other video output device.
  • the display control unit 134 displays a surface 50, which is a virtual coordinate plane on which two-dimensional coordinates can be specified, in three-dimensional space based on the position and orientation information of the three-dimensional device 20 in three-dimensional space.
  • the display control unit 134 also controls the display of a pointer on the surface 50 based on the position information on the surface 50 input from the planar device 30. When a part of an object is selected based on the pointer position, the display control unit 134 also clearly indicates the selected position.
  • Fig. 5 is a sequence diagram showing a flow of a display control process according to the embodiment.
  • the 3D device 20 acquires position and orientation information of its own device using an IMU or the like (step S11). Then, the 3D device 20 transmits the acquired position and orientation information to the display control device 100 (step S12).
  • the display control device 100 captures an image of the 3D device 20 using an outward-facing camera or the like (step S21). Then, the display control device 100 estimates the position and orientation of the 3D device 20 (step S22).
  • the display control device 100 combines the position and orientation information acquired from the three-dimensional device 20 and the position and orientation information acquired by photographing to identify the position and orientation of the three-dimensional device 20 (step S23).
  • the display control device 100 detects the position and orientation of the surface 50 in the virtual space based on the position and orientation information of the three-dimensional device 20 (step S24).
  • the display control device 100 displays the surface 50 on the display unit 150 based on the detected position and orientation of the surface 50.
  • the planar device 30 acquires planar coordinate information for identifying the position on the surface 50 (step S31). The planar device 30 then transmits the acquired planar coordinate information to the display control device 100 (step S32).
  • the display control device 100 detects the pointer position on the surface 50 based on the information acquired from the planar device 30 (step S25). Then, the display control device 100 detects the position specified by the user 10 within the object based on the position and orientation of the pointer (step S26). Note that the orientation information of the pointer in this case is calculated based on, for example, the angle of the surface 50.
  • the process according to the embodiment may be modified in various ways.
  • the user 10 operates the surface 50 and the pointer using both the three-dimensional device 20 and the planar device 30.
  • the user 10 may execute the information process according to the embodiment using either one of the devices.
  • the display control device 100 can perform the display control processing according to the present disclosure even if the first controller (three-dimensional device 20) and the second controller (planar device 30) are the same device.
  • the first detection unit 132 detects the same device as the first controller when the same device is located at a predetermined distance from the plane in real space.
  • the second detection unit 133 detects the same device as the second controller when the same device is located within a predetermined distance from the plane in real space.
  • the first detection unit 132 when the first detection unit 132 receives a designation from a user 10 of the same device to use the same device as a first controller, it may detect the same device as a first controller. Furthermore, when the second detection unit 133 receives a designation from a user 10 of the same device to use the same device as a second controller, it may detect the same device as a second controller.
  • Figure 6A is a diagram for explaining device operations related to the first modified example.
  • the user 10 operates the three-dimensional device 20 and the planar device 30, as in the embodiment.
  • the user 10 operates only the planar device 30.
  • the display control device 100 recognizes this operation.
  • the display control device 100 recognizes the planar device 30 as a device that performs the same operation as the three-dimensional device 20.
  • that is, the display control device 100 recognizes the movement of the device as input for moving the surface 50.
  • the display control device 100 recognizes the planar device 30 as a device for operating a pointer on the surface 50. This process allows the user 10 to realize the processing according to the embodiment using only one device.
  • the display control device 100 can also handle the three-dimensional device 20 in the same way as the planar device 30. For example, when the three-dimensional device 20 is moved near the tabletop, the display control device 100 recognizes such movement and handles the three-dimensional device 20 as the planar device 30. More specifically, when the three-dimensional device 20 is located near the tabletop, the display control device 100 can invalidate the input of position and orientation from the three-dimensional device 20 and handle it as a planar device 30 that inputs position information by touching the desk like a touch pen.
  • the display control device 100 may explicitly accept a switching operation from the user 10.
  • for example, if the three-dimensional device 20 has a function capable of identifying a two-dimensional position (such as a trackball), the user 10 can flexibly perform the above switching.
  • the display control device 100 may treat the three-dimensional device 20 as the planar device 30 while the HMD is flipped up. This is because the user 10 cannot view the display stereoscopically while the HMD is flipped up.
  • FIG. 6B is a flowchart showing the flow of the display control process according to the first modified example.
  • the display control device 100 determines whether the device that moves the surface 50 and the device that performs the pointer operation are separate devices (step S41). If they are separate devices (step S41; Yes), the display control device 100 sets each device to operate according to its original settings, as in the embodiment.
  • if they are not separate devices (step S41; No), the display control device 100 determines whether the device is recognized as being on a tabletop (an arbitrary plane in three-dimensional space) (step S42). If the device is on a tabletop (step S42; Yes), the display control device 100 treats the device as the planar device 30.
  • if the device is not on a tabletop (step S42; No), the display control device 100 determines whether or not the user 10 has designated the device as the planar device 30 (step S43). If the user 10 has designated the device as the planar device 30 (step S43; Yes), the display control device 100 treats the device as the planar device 30.
  • if the user 10 has not designated the device as the planar device 30 (step S43; No), the display control device 100 determines whether the HMD is flipped up (step S44).
  • if the HMD is not flipped up (step S44; No), the display control device 100 treats the device as the three-dimensional device 20. On the other hand, if the HMD is flipped up (step S44; Yes), the display control device 100 treats the device as the planar device 30.
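  • For illustration, the branch structure of steps S41 to S44 might be summarized as in the following sketch; the boolean inputs and return labels are assumptions used only to show the order of the checks.

        def classify_device(separate_devices, on_tabletop, user_says_planar, hmd_flipped_up):
            if separate_devices:        # step S41: two devices keep their original roles
                return "original roles"
            if on_tabletop:             # step S42: the device rests on a plane
                return "planar device 30"
            if user_says_planar:        # step S43: explicit designation by the user 10
                return "planar device 30"
            if hmd_flipped_up:          # step S44: no stereoscopic view is available
                return "planar device 30"
            return "three-dimensional device 20"

        print(classify_device(False, True, False, False))   # -> planar device 30
        print(classify_device(False, False, False, False))  # -> three-dimensional device 20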
  • FIG. 7 is a diagram (1) for explaining the display control processing relating to the second modified example.
  • the user 10 grasps the handle of the surface 50 with his/her hand, as in the embodiment, and moves the hand by a distance 200 to move the surface 50.
  • in some cases, the user 10 desires to move the surface 50 farther than the distance 205 that the user 10 actually moves his/her hand.
  • the display control device 100 provides a guide to the user 10 by, for example, displaying a ray in the virtual space from the hand toward the surface 50.
  • the user 10 can display the guide by explicitly selecting a display mode that has been set in advance by the display control device 100, such as a remote operation mode.
  • the display control device 100 may move the surface 50 a distance longer than the actual distance the hand of the user 10 moves, using only the movement of the hand, even if the user 10 does not hold the surface 50 with his/her hand.
  • the display control device 100 may move the surface 50 by treating it as if the user 10 were holding it with their hand, based on a guide extended from the hand.
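  • One simple way to realize such amplified movement is to scale the hand displacement by a gain so that the surface 50 travels farther than the hand itself; the gain value and names in this sketch are assumptions.

        import numpy as np

        def move_surface_remotely(surface_pos, hand_delta, gain=3.0):
            # Move surface 50 by the hand displacement multiplied by an amplification gain.
            return surface_pos + gain * np.asarray(hand_delta)

        surface = np.array([0.0, 1.0, 2.0])
        hand_delta = np.array([0.05, 0.0, 0.10])            # the hand moves only a few centimeters
        print(move_surface_remotely(surface, hand_delta))   # the surface moves three times as far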
  • the display control device 100 may perform display control to move the object instead of the surface 50, as shown in FIG. 3.
  • the display control device 100 may also control not only the surface 50 but also objects so that they can be remotely operated. This will be explained using FIG. 8.
  • FIG. 8 is a diagram (2) for explaining the display control process according to the second modified example.
  • the display control device 100 displays a guide 210 for grasping the object 70 through the surface 50.
  • the user 10 can grasp or select the object 70 via the guide 210. This allows the user 10 to perform operations such as grasping or moving an object 70 that is placed in a position that is out of reach in the virtual space.
  • the display control device 100 may switch the pointer operation mode, for example, to emit a ray (guide) from the pointer on the surface in a specific direction, so that a part of the object 70 can be selected.
  • the specific direction is, for example, a direction that can be determined based on the viewpoint position and pointer position of the user 10, such as a normal direction to the surface 50 or a direction connecting the viewpoint position of the user 10 and the pointer.
  • alternatively, when the object 70 and the surface 50 are not in contact with each other, the display control device 100 may move the pointer on the surface to the part where such a guide contacts the object 70.
  • the display control device 100 may also fix the position of the surface 50 so that the surface 50 does not move to a position in the virtual space that cannot be grasped.
  • the display control unit 134 may display the surface 50 in three-dimensional space based on the viewpoint position of the user 10, regardless of the position and orientation information of the three-dimensional device 20 in three-dimensional space. This point will be explained using FIG. 9.
  • FIG. 9 is a diagram (3) for explaining the display control process related to the second modified example.
  • the display control device 100 controls so that the surface 50 is displayed at a fixed distance 215 from the display control device 100.
  • the surface 50 moves in conjunction with the movement of the display control device 100. This allows the user 10 to move the surface 50 for selecting an arbitrary position of the object 70 by, for example, directing his or her line of sight toward the object 70 without using the three-dimensional device 20. Note that if the surface 50 always moves in conjunction with the display control device 100, it becomes difficult to keep the surface 50 in a fixed position, so the display control device 100 may turn the automatic movement of the surface 50 on or off in response to a request from the user 10.
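  • A minimal sketch of keeping the surface 50 at a fixed distance 215 in front of the display control device 100 (so that it follows the viewpoint rather than the three-dimensional device 20); the vector utilities and the distance value are illustrative assumptions.

        import numpy as np

        def head_locked_surface(hmd_position, hmd_forward, distance=0.6):
            # Place surface 50 on the gaze axis and orient it back toward the user.
            forward = hmd_forward / np.linalg.norm(hmd_forward)
            surface_center = hmd_position + distance * forward
            surface_normal = -forward
            return surface_center, surface_normal

        center, normal = head_locked_surface(np.array([0.0, 1.6, 0.0]), np.array([0.0, -0.1, 1.0]))
        print("surface 50 center:", center, "normal:", normal)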
  • the display control device 100 may not only change the position and orientation of the surface 50, but also change the size.
  • the display control device 100 changes the size of the surface 50 in conjunction with an operation such as the user 10 rotating the mouse wheel.
  • the display control device 100 may also change the color of the surface 50.
  • the display control device 100 may also perform processing such as cutting out a cross section of an object in virtual space, or hiding any object that is even slightly within the area in front of the surface 50, depending on the position where the surface 50 is installed. In other words, the display control device 100 may change the display content of an object depending on the position of the surface 50. This allows the display control device 100 to improve the visibility of the object.
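  • For illustration, hiding any object that lies even partly in front of the surface 50 could be realized with a signed-distance test against the plane, as in this sketch; the bounding-sphere scene representation is an assumption.

        import numpy as np

        def visible_objects(objects, plane_point, plane_normal):
            # Keep only objects whose bounding spheres are entirely behind surface 50.
            n = plane_normal / np.linalg.norm(plane_normal)   # normal points toward the user
            kept = []
            for name, center, radius in objects:
                signed_dist = np.dot(center - plane_point, n)
                if signed_dist + radius <= 0.0:               # fully behind the plane
                    kept.append(name)
            return kept

        scene = [("object 70", np.array([0.0, 1.0, 1.5]), 0.3),
                 ("clutter", np.array([0.0, 1.0, 0.2]), 0.3)]
        print(visible_objects(scene, plane_point=np.array([0.0, 1.0, 0.8]),
                              plane_normal=np.array([0.0, 0.0, -1.0])))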
  • the display control device 100 may also superimpose the surface 50 on the real space, as in AR display.
  • the display control device 100 may snap the surface 50 to a surface or edge of an object in the real world so that the user can easily adjust its position. That is, when the display control unit 134 displays the surface 50 in the three-dimensional space of the real space and the surface 50 approaches an arbitrary plane in the real space within a predetermined distance based on an input from the three-dimensional device 20, the first detection unit 132 may detect the surface 50 at a position where it is attached (snapped) to the plane, so that the surface 50 is displayed superimposed on the plane.
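  • A hypothetical sketch of this snap behavior: when the surface 50 comes within an assumed threshold of a real plane (a desk top or a display frame), its center is projected onto that plane and its normal is aligned with it.

        import numpy as np

        SNAP_DISTANCE = 0.05   # assumed threshold in meters

        def maybe_snap(surface_center, surface_normal, plane_point, plane_normal):
            n = plane_normal / np.linalg.norm(plane_normal)
            offset = np.dot(surface_center - plane_point, n)
            if abs(offset) <= SNAP_DISTANCE:
                snapped_center = surface_center - offset * n   # project onto the real plane
                return snapped_center, n, True
            return surface_center, surface_normal, False

        center, normal, snapped = maybe_snap(np.array([0.30, 0.76, 0.10]),
                                             np.array([0.1, 0.9, 0.0]),
                                             plane_point=np.array([0.0, 0.74, 0.0]),
                                             plane_normal=np.array([0.0, 1.0, 0.0]))
        print("snapped:", snapped, "new center:", center)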
  • FIG. 10 is a diagram (1) for explaining the display control process according to the third modified example.
  • FIG. 10 shows a state in which a surface 50 is superimposed on real space.
  • the example in FIG. 10 shows a state in which a surface 50 with a handle 220 is displayed as if it is attached to the screen of a real display 225.
  • the display control device 100 displays a guide 230 along the plane (the outer frame of the display 225 in this example).
  • the display control device 100 recognizes that the display 225 and the surface 50 have come within a predetermined distance, and snaps the surface 50 to the display 225 along the guide 230.
  • the user 10 can use the surface 50 that is tightly attached when selecting an object or the like projected on the display 225, and can perform selection operations with precision even in the AR space.
  • the display control device 100 may attach the surface 50 to a side surface adjacent to the screen, rather than to the front surface of the display 225. This allows the user 10 to use the display 225 as a normal monitor, while simultaneously performing operations on the surface 50 that is virtually displayed next to it.
  • FIG. 11 is a diagram (2) for explaining the display control process according to the third modified example.
  • the display control device 100 recognizes the desk 240 as a flat surface to which the surface 50 is attached. In other words, when the user 10 grasps the handle 220 and brings the surface 50 closer to the desk 240, the display control device 100 recognizes that the top surface of the desk 240 and the surface 50 have come within a predetermined distance, and snaps the surface 50 to the top surface of the desk 240.
  • FIG. 12 is a diagram (3) for explaining the display control processing related to the third modified example.
  • in FIG. 12, a touch pen 300, which is a virtual pointer for specifying coordinates on the surface 50, is displayed.
  • the touch pen 300 is displayed, for example, when the user 10 brings the 3D device 20 close to a tabletop, and is operated based on the IMU information of the 3D device 20, etc.
  • the virtual object 305 is located behind the surface 50, i.e., below the desk surface, so if the actual movements are used, the desk surface will get in the way and the virtual object 305 cannot be selected with the touch pen 300 using the 3D device 20. For this reason, the display control device 100 makes adjustments regarding the selection operation.
  • FIG. 13 is a diagram (4) for explaining the display control process related to the third modified example.
  • the left part of FIG. 13 shows an example in which the display control device 100 moves the touch pen 300 by the same amount as the actual movement of the 3D device 20.
  • in this case, the touch pen 300 stops when the 3D device 20 collides with the desk surface, so the user 10 does not get the sensation of touching the virtual object 305.
  • when the display control device 100 recognizes that the virtual object 305 is located behind the surface 50, it dynamically changes the length of the touch pen 300, as shown on the right side of FIG. 13. Specifically, the display control device 100 adjusts the length of the touch pen 300 so that the 3D device 20 actually collides with the desk surface when the touch pen 300 comes into contact with the virtual object 305. This allows the display control device 100 to provide accurate force feedback to the user 10, and to perform accurate display control processing in which the touch pen does not penetrate the virtual object 305.
  • FIG. 14 shows an example of the display of the touch pen 300.
  • FIG. 14 is a diagram (5) for explaining the display control process according to the third modified example.
  • the display control device 100 determines the length of the touch pen 300 based on the distance to the virtual object 305 that the user 10 is trying to select. Specifically, the display control device 100 sets the distance 310 between the desk surface and the 3D device 20 to be the same as the distance 315 between the tip of the touch pen 300 and the virtual object 305.
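  • The length adjustment can be illustrated with a small worked example; the coordinate convention (depth measured downward from the desk surface) and the names are assumptions, but they show why setting the pen length equal to the depth of the virtual object 305 makes distance 310 equal distance 315.

        def adjusted_pen_length(object_depth_below_desk):
            # Length of touch pen 300 that makes the device-to-desk distance equal
            # to the pen-tip-to-object distance.
            return object_depth_below_desk

        def distances(device_height_above_desk, pen_length, object_depth_below_desk):
            dist_310 = device_height_above_desk                 # 3D device 20 to desk surface
            tip_depth = pen_length - device_height_above_desk   # pen tip depth below the desk
            dist_315 = object_depth_below_desk - tip_depth      # pen tip to virtual object 305
            return dist_310, dist_315

        length = adjusted_pen_length(object_depth_below_desk=0.20)
        print(distances(device_height_above_desk=0.10, pen_length=length,
                        object_depth_below_desk=0.20))          # both distances are 0.10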
  • the user 10 can also change the size of the surface 50 depending on the situation at the work location. That is, the first detection unit 132 may control the position and size of the surface 50 virtually displayed in real space based on the operation of the three-dimensional device 20 by the user 10. This point will be explained using FIG. 15.
  • FIG. 15 is a diagram (6) for explaining the display control process according to the third modified example.
  • the user 10 performs work using a surface 50 that is attached to a desk 240.
  • the user 10 grasps the handle 220 and performs an action such as spreading the surface 50, thereby expanding the surface 50 so that it covers the flat surface of the desk 240.
  • the display control device 100 expands or reduces the size of the surface 50 in response to the operation of the user 10. This allows the user 10 to effectively use the flat surface in real space on which the surface 50 is attached, thereby achieving better operability.
  • the display control device 100 may control the designated position to allow the user 10 to make an appropriate selection, for example by extending the designated position.
  • the second detection unit 133 may control the operation of the pointer to extend the selection position using the pointer and allow the user 10 to virtually operate the object. This point will be explained using FIG. 16.
  • FIG. 16 is a diagram (7) for explaining the display control process related to the third modified example.
  • a surface 50 is displayed at a 45 degree angle with respect to the display 225.
  • a camera 320, which is a virtual object, is displayed behind the surface 50. In this case, the user 10 cannot directly operate the camera 320 because it does not exist on the surface 50.
  • the display control device 100 extends a ray from the pointer 330 operated by the user 10, and displays a guide 335 indicating the extension line and a pointer 340 at the intersection of the guide 335 and the camera 320. This allows the user 10 to operate the camera 320 using the extended pointer 340, even if the user 10 cannot actually move the pointer 330 to a position that touches the camera 320.
  • the display control device 100 can display the surface 50 at any angle in the AR display. This will be explained using FIG. 17.
  • FIG. 17 is a diagram (8) for explaining the display control process according to the third modified example.
  • the display control device 100 may display the surface 50 at a right angle (90 degrees) to the plane in space (the screen of the display 225 in this example), or at an angle of 45 degrees or 0 degrees (i.e., the desk surface). In other words, the display control device 100 can accept any angle from the user 10, and therefore can create a work area that meets the needs of the user 10.
  • the display control device 100 may display the surface 50 at a position offset a predetermined distance from the object.
  • the user 10 can perform selection operations on the surface 50 offset from the object, or can perform tasks such as sketching the object, thereby providing a work environment with excellent visibility.
  • the display control device 100 may change the size of the surface 50 itself in response to an operation by the user 10. That is, the second detection unit 133 may perform control so as to change the position or size of the surface 50 based on a position (e.g., a display position of a pointer) designated on the surface 50 by the user 10. This point will be described with reference to Fig. 18.
  • Fig. 18 is a diagram for explaining a display control process according to the fourth modified example.
  • the pointer 350 is displayed at the right edge of the surface 50.
  • the display control device 100 may display the surface 50 and the pointer 350 in a fixed manner, as shown in the lower left part of FIG. 18.
  • the display control device 100 may also move the surface 50 to the right when the pointer 350 reaches the right edge, as shown in the lower center part of FIG. 18.
  • the display control device 100 may also enlarge the display of the surface 50 when the pointer 350 reaches the right edge, as shown in the lower right part of FIG. 18.
  • the display control device 100 may switch between these display control processes at the selection of the user 10. This allows the user 10 to automatically change the work area in accordance with the movement of the pointer 350, thereby obtaining better operability.
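  • For illustration, switching among the three behaviors of FIG. 18 when the pointer 350 reaches the right edge could be sketched as follows; the mode names and step size are assumptions.

        def on_pointer_at_edge(mode, surface_x, surface_width, step=0.1):
            if mode == "fixed":        # lower left of FIG. 18: keep surface 50 and pointer as they are
                return surface_x, surface_width
            if mode == "move":         # lower center: slide surface 50 to the right
                return surface_x + step, surface_width
            if mode == "enlarge":      # lower right: widen surface 50
                return surface_x, surface_width + step
            raise ValueError(mode)

        for mode in ("fixed", "move", "enlarge"):
            print(mode, on_pointer_at_edge(mode, surface_x=0.0, surface_width=0.4))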
  • each component of each device shown in the figure is a functional concept, and does not necessarily have to be physically configured as shown in the figure.
  • the specific form of distribution and integration of each device is not limited to that shown in the figure, and all or part of them can be functionally or physically distributed and integrated in any unit depending on various loads, usage conditions, etc.
  • the display control device according to the present disclosure includes an acquisition unit (the acquisition unit 131 in the embodiment) and a display control unit (the display control unit 134 in the embodiment).
  • the acquisition unit acquires position and orientation information in a three-dimensional space from a first controller (the three-dimensional device 20 in the embodiment).
  • the display control unit displays a virtual coordinate plane (the surface 50 in the embodiment) on which two-dimensional coordinates can be specified in the three-dimensional space based on the position and orientation information in the three-dimensional space.
  • the display control device can improve input accuracy, such as pointing to a specific location on an object in space, by displaying a plane on which two-dimensional coordinates can be specified in three-dimensional space.
  • the display control device also includes a first detection unit (first detection unit 132 in this embodiment) that detects movement in the three-dimensional space of the virtual coordinate plane based on an input from the first controller.
  • first detection unit detects movement of an object whose position can be specified using the virtual coordinate plane based on an input from the first controller.
  • the display control device allows the user to freely move the surface using a VR controller or the like. This allows the user to perform input operations with a high degree of freedom.
  • the display control device also detects a position specified on the virtual coordinate plane based on an input from a second controller (in this embodiment, the planar device 30). For example, the second detection unit detects a part selected on the object based on a position specified on the virtual coordinate plane. Specifically, the second detection unit detects a part selected on the object based on the angle of the virtual coordinate plane and a position specified by the user on the virtual coordinate plane.
  • the display control device obtains input from the user using a planar device such as a mouse, and can therefore receive more accurate input than a VR controller, etc.
  • the second detection unit may also detect, as the selected part of the object, a point where an extension of a line normal to the virtual coordinate plane starting from the specified position intersects with the object, or a point where an extension of a line connecting the user's viewpoint position and the specified position intersects with the object.
  • the display control device can arbitrarily switch the behavior that determines the location selected by the pointer on the virtual coordinate plane. This allows the user to avoid stressful behavior, such as the wrong location being selected due to the pointer position, and to achieve the operability that they desire.
  • the first controller and the second controller may be the same device.
  • the first detection unit may detect the same device as the first controller when the same device is located at a position away from a plane in real space by a predetermined distance.
  • the second detection unit may detect the same device as the second controller when the same device is located within a predetermined distance from a plane in real space.
  • the first detection unit may detect the same device as the first controller when a designation to use the same device as the first controller is received from a user of the same device.
  • the second detection unit may detect the same device as the second controller when a designation to use the same device as the second controller is received from a user of the same device.
  • the display control device can realize the display control processing according to the present disclosure in a single device, and can provide users with information processing that achieves high input accuracy even in a variety of equipment environments.
  • the display control unit also displays a virtual coordinate plane in three-dimensional space based on the user's viewpoint position, regardless of the position and orientation information of the first controller in three-dimensional space.
  • the display control device can reduce the number of operation steps required by the user by displaying a fixed surface, thereby realizing information processing that places less of a burden on the user.
  • the display control unit also displays a virtual coordinate plane in the three-dimensional space of the real space.
  • the first detection unit detects the virtual coordinate plane at a position adsorbed to the plane based on an input from the first controller so that the virtual coordinate plane is displayed superimposed on the plane.
  • the display control device can achieve a behavior like snapping the surface to a plane in real space. This allows the user to easily set the surface at a location that is convenient to use, such as a tabletop, thereby improving work efficiency.
  • the display control unit also virtually displays an object and a pointer for selecting the object in a three-dimensional space in the real space.
  • the second detection unit controls the size or contact point of the pointer so that the user can virtually touch the object using the pointer.
  • the display control device can provide the user with an operational feel that matches the sensation of real space by adjusting the size of the pointer and the contact point (i.e., the feedback sensation) to suit the real space.
  • the first detection unit also controls the position and size of the virtual coordinate plane virtually displayed in real space based on the user's operation of the first controller.
  • the display control device can change the size of the virtual coordinate plane according to the user's wishes, allowing the user to freely adjust the work area, thereby improving operability.
  • the second detection unit also extends the position selected by the pointer and controls the operation of the pointer so that the user can virtually operate the object.
  • the display control device can improve user operability by performing processing to eliminate inconveniences in the virtual space, such as objects being located far away.
  • the second detection unit also controls the change of the position or size of the virtual coordinate plane based on a position specified on the virtual coordinate plane by the user.
  • the display control device can freely change the size of the virtual coordinate plane depending on the position of the pointer, providing the user with a comfortable working environment.
  • Fig. 19 is a hardware configuration diagram showing an example of a computer 1000 that realizes the functions of the display control device 100.
  • the computer 1000 has a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, a HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600.
  • Each unit of the computer 1000 is connected by a bus 1050.
  • the CPU 1100 operates based on the programs stored in the ROM 1300 or the HDD 1400 and controls each component. For example, the CPU 1100 loads the programs stored in the ROM 1300 or the HDD 1400 into the RAM 1200 and executes processes corresponding to the various programs.
  • the ROM 1300 stores boot programs such as the Basic Input Output System (BIOS) that is executed by the CPU 1100 when the computer 1000 starts up, as well as programs that depend on the hardware of the computer 1000.
  • HDD 1400 is a computer-readable recording medium that non-temporarily records programs executed by CPU 1100 and data used by such programs.
  • HDD 1400 is a recording medium that records a display control program related to the present disclosure, which is an example of program data 1450.
  • the communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (e.g., the Internet).
  • the CPU 1100 receives data from other devices and transmits data generated by the CPU 1100 to other devices via the communication interface 1500.
  • the input/output interface 1600 is an interface for connecting the input/output device 1650 and the computer 1000.
  • the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600.
  • the CPU 1100 also transmits data to an output device such as a display, a speaker, or a printer via the input/output interface 1600.
  • the input/output interface 1600 may also function as a media interface that reads programs and the like recorded on a specific recording medium. Examples of media include optical recording media such as DVDs (Digital Versatile Discs) and PDs (Phase change rewritable Disks), magneto-optical recording media such as MOs (Magneto-Optical Disks), tape media, magnetic recording media, and semiconductor memories.
  • the CPU 1100 of the computer 1000 executes a display control program loaded onto the RAM 1200 to realize the functions of the control unit 130, etc.
  • the display control program according to the present disclosure and data in the storage unit 120 are stored in the HDD 1400.
  • the CPU 1100 reads and executes the program data 1450 from the HDD 1400, but as another example, the CPU 1100 may obtain these programs from other devices via the external network 1550.
  • the present technology can also be configured as follows.
  • (1) A display control device comprising: an acquisition unit that acquires position and orientation information in a three-dimensional space from a first controller; and a display control unit that displays, in the three-dimensional space, a virtual coordinate plane on which two-dimensional coordinates can be specified, based on the position and orientation information in the three-dimensional space.
  • (2) The display control device according to (1), further comprising a first detection unit that detects a movement of the virtual coordinate plane in the three-dimensional space based on an input by the first controller.
  • (3) The display control device according to (2), wherein the first detection unit detects a movement of an object whose position can be specified using the virtual coordinate plane, based on an input by the first controller.
  • (4) The display control device according to (2) or (3), further comprising a second detection unit that detects a position designated on the virtual coordinate plane based on an input from a second controller.
  • (5) The display control device according to (4), wherein the second detection unit detects a selected portion of an object based on the position designated on the virtual coordinate plane.
  • (6) The display control device according to (5), wherein the second detection unit detects the portion selected in the object based on an angle of the virtual coordinate plane and the position designated by a user on the virtual coordinate plane.
  • (7) The display control device wherein the second detection unit detects, as the selected portion of the object, a point where an extension line extending from the designated position in a direction normal to the virtual coordinate plane intersects with the object, or a point where an extension line connecting the user's viewpoint and the designated position intersects with the object.
  • (8) The display control device wherein the first controller and the second controller are the same device.
  • (9) The display control device according to (8), wherein the first detection unit detects the same device as the first controller when the same device is located at a position away from a plane in real space by a predetermined distance, and the second detection unit detects the same device as the second controller when the same device is located within the predetermined distance from the plane in real space.
  • (10) The display control device according to (8), wherein the first detection unit detects the same device as the first controller when a designation to use the same device as the first controller is received from a user of the same device, and the second detection unit detects the same device as the second controller when a designation to use the same device as the second controller is received from the user of the same device.
  • (11) The display control device according to any one of (2) to (10), wherein the display control unit displays the virtual coordinate plane in the three-dimensional space based on a viewpoint position of a user, regardless of the position and orientation information of the first controller in the three-dimensional space.
  • (12) The display control device according to any one of (5) to (7), wherein the display control unit displays the virtual coordinate plane in the three-dimensional space of a real space, and the first detection unit, when the virtual coordinate plane approaches within a predetermined distance of an arbitrary plane in the real space based on an input by the first controller, detects the virtual coordinate plane at a position where it is attracted to the arbitrary plane so that the virtual coordinate plane is displayed superimposed on the arbitrary plane.
  • (13) The display control device according to (12), wherein the display control unit virtually displays the object and a pointer for selecting the object in the three-dimensional space in the real space, and the second detection unit controls a size or a contact point of the pointer so that the user can virtually touch the object using the pointer.
  • (14) The display control device according to any one of (4) to (7), wherein the first detection unit controls a position and a size of the virtual coordinate plane virtually displayed in the real space based on an operation of the first controller by the user, and the second detection unit extends the position selected by the pointer and controls an operation of the pointer so that the user can virtually operate the object, and controls the virtual coordinate plane so as to change a position or a size of the virtual coordinate plane based on a position designated on the virtual coordinate plane by the user.
  • (15) A display control method comprising: acquiring, by a computer, position and orientation information in a three-dimensional space from a first controller; and displaying, in the three-dimensional space, a virtual coordinate plane on which two-dimensional coordinates can be specified, based on the position and orientation information in the three-dimensional space.
  • REFERENCE SIGNS LIST
    1 Display control system
    10 User
    20 Three-dimensional device
    30 Planar device
    50 Surface
    100 Display control device
    110 Communication unit
    120 Storage unit
    130 Control unit
    131 Acquisition unit
    132 First detection unit
    133 Second detection unit
    134 Display control unit
    140 Sensor unit
    150 Display unit


Abstract

A display control device in one embodiment according to the present disclosure comprises: an acquisition unit that acquires, from a first controller, position/attitude information in a three-dimensional space; and a display control unit which, on the basis of the position/attitude information in the three-dimensional space, displays, in the three-dimensional space, a virtual coordinate plane on which two-dimensional coordinates can be designated. Moreover, the display control device may further comprise a first detection unit that, on the basis of input by means of the first controller, detects movement of the virtual coordinate plane in the three-dimensional space.

Description

Display control device and display control method

This disclosure relates to a display control device and a display control method for controlling operations on virtual objects.

With the development of video display technology, it is now possible to use devices for VR (Virtual Reality) and AR (Augmented Reality) to create virtual spaces, overlay content from virtual spaces onto real space, and view virtual objects in 3D as if they were real objects.

In relation to this technology, many UIs (User Interfaces) have been proposed that enable users to create and edit 3D models corresponding to virtual objects while viewing them in stereoscopic vision, by holding a controller to move the model in the air, or to place and move points and lines in the air (6DoF (Degree of Freedom) operations). As an example, a technology has been proposed that improves user operability by devising a mechanism for assigning a physical controller to a virtual controller (for example, Patent Document 1).

Patent No. 6859422

However, there is room for further improvement in the operations for selecting and editing virtual objects in 3D space.

For example, when a user operates a controller by lifting their hand in the air, the physical strain is greater than when using a conventional tabletop controller (such as a mouse), which can lead to problems such as hand tremors and reduced input accuracy. For this reason, controllers for air operation, which are primarily used in VR and AR, are currently difficult to use for creating or editing 3D models in applications that require high input accuracy (such as CAD).

Therefore, this disclosure proposes a display control device and a display control method that can improve input accuracy.

In order to solve the above problem, a display control device according to one embodiment of the present disclosure includes an acquisition unit that acquires position and orientation information in a three-dimensional space from a first controller, and a display control unit that displays a virtual coordinate plane in the three-dimensional space on which two-dimensional coordinates can be specified based on the position and orientation information in the three-dimensional space.
Fig. 1 is a diagram showing an overview of the display control process according to the embodiment.
Fig. 2 is a diagram (1) showing an example of the display control process according to the embodiment.
Fig. 3 is a diagram (2) showing an example of the display control process according to the embodiment.
Fig. 4 is a diagram showing an example of the configuration of the display control device according to the embodiment.
Fig. 5 is a sequence diagram showing the flow of the display control process according to the embodiment.
Fig. 6A is a diagram for explaining a device operation according to the first modified example.
Fig. 6B is a flowchart showing the flow of the display control process according to the first modified example.
Fig. 7 is a diagram (1) for explaining the display control process according to the second modified example.
Fig. 8 is a diagram (2) for explaining the display control process according to the second modified example.
Fig. 9 is a diagram (3) for explaining the display control process according to the second modified example.
Fig. 10 is a diagram (1) for explaining the display control process according to the third modified example.
Fig. 11 is a diagram (2) for explaining the display control process according to the third modified example.
Fig. 12 is a diagram (3) for explaining the display control process according to the third modified example.
Fig. 13 is a diagram (4) for explaining the display control process according to the third modified example.
Fig. 14 is a diagram (5) for explaining the display control process according to the third modified example.
Fig. 15 is a diagram (6) for explaining the display control process according to the third modified example.
Fig. 16 is a diagram (7) for explaining the display control process according to the third modified example.
Fig. 17 is a diagram (8) for explaining the display control process according to the third modified example.
Fig. 18 is a diagram for explaining the display control process according to the fourth modified example.
Fig. 19 is a hardware configuration diagram showing an example of a computer that realizes the functions of the display control device.
The following describes the embodiments in detail with reference to the drawings. Note that in each of the following embodiments, the same parts are designated by the same reference numerals, and duplicate descriptions will be omitted.
The present disclosure will be described in the following order.
1. Embodiment
 1-1. Overview of display control processing according to the embodiment
 1-2. Configuration of the display control device according to the embodiment
 1-3. Processing procedure according to the embodiment
 1-4. Modifications
  1-4-1. First modification: Operation of only one device
  1-4-2. Second modification: Adjustment of operation feel
  1-4-3. Third modification: Superimposition on real space
  1-4-4. Fourth modification: Deformation of the surface based on the pointer position
2. Other embodiments
3. Effects of the display control device according to the present disclosure
4. Hardware configuration
1. EMBODIMENTS
(1-1. Overview of Display Control Process According to the Embodiment)
An example of the display control process according to the embodiment will be described with reference to Fig. 1. Fig. 1 is a diagram showing an overview of the display control process according to the embodiment. Fig. 1 shows the components of a display control system 1 that executes the display control process according to the embodiment.
As shown in FIG. 1, the display control system 1 includes a display control device 100, a three-dimensional device 20, and a planar device 30.

The display control device 100 is an information processing terminal for implementing VR and AR technologies. In the embodiment, the display control device 100 is a wearable display that is worn on the head of a user 10. Specifically, the display control device 100 is, for example, a head mounted display (HMD) or AR glasses.

The display control device 100 has a closed or transmissive display unit (display). For example, the display control device 100 displays a virtual space expressed by CG (Computer Graphics) or the like. Alternatively, the display control device 100 displays a virtual object expressed by CG or the like on the display unit by superimposing it on real space. In the example of FIG. 1, the display control device 100 displays an object 70 as an example of a virtual object. That is, in the example of FIG. 1, the object 70 and the space viewed by the user 10 are assumed to be a virtual space configured by CG or the like. Note that, as will be described later, the space viewed by the user 10 may be a so-called AR space in which a virtual object is superimposed on real space. In addition to the display unit, the display control device 100 may have a configuration for outputting a predetermined output signal. For example, the display control device 100 may have a speaker or the like for outputting sound.

The three-dimensional device 20 is an example of an input device according to the embodiment. In the embodiment, the three-dimensional device 20 is operated by the user 10 and is used to input various information to the display control device 100. The three-dimensional device 20 is a VR controller capable of inputting so-called 6DoF information. Specifically, the three-dimensional device 20 is equipped with sensors such as an inertial sensor, an acceleration sensor, and a gravity sensor, and detects position and orientation information of the device itself. The three-dimensional device 20 then transmits the detected position and orientation information of the device itself to the display control device 100.

The three-dimensional device 20 may be a controller held by the user 10 in one hand as shown in FIG. 1, or may be a pen-shaped controller. The three-dimensional device 20 is not limited to these examples, and may be any device capable of acquiring position information in real space. For example, the three-dimensional device 20 may be an air mouse, a digital camera, a smartphone, or the like. In addition, if the display control device 100 can capture the position and orientation information of the three-dimensional device 20, the three-dimensional device 20 does not need to have a sensor of its own. For example, the three-dimensional device 20 may be a predetermined object equipped with a marker, a human face, a finger, or the like that can be recognized by the display control device 100 or a predetermined external device (such as a video camera installed in real space).

The planar device 30 is an example of an input device according to the embodiment. In the embodiment, the planar device 30 is operated by the user 10 and is used to input various information to the display control device 100. The planar device 30 is, for example, a mouse controller or a trackball mouse that can input two-dimensional coordinate information. Specifically, the planar device 30 is equipped with a sensor such as an optical sensor, and detects coordinate information (two-dimensional information) on a plane based on the operation of the planar device. The planar device 30 then transmits the detected coordinate information to the display control device 100.

Here, the process when the user 10 specifies an arbitrary position of an object in a VR or AR space will be explained. Conventionally, a controller such as the three-dimensional device 20 has been used to select a position in a VR or AR space. However, operating a controller in the air places a greater physical strain than operating a controller on a conventional tabletop, which can lead to problems such as hand tremors and reduced input accuracy.

As a solution to these problems, technology has been proposed that allows the use of highly accurate devices such as a mouse in three-dimensional space. However, planar devices such as a mouse are generally used to input coordinate information on a specific plane, making it difficult to point to an arbitrary position in three-dimensional space. In response to this, processing has been explored that uses a mouse in combination with a controller that can input 6DoF information, such as a VR controller. However, operating two different pointing devices simultaneously would break down the conventional UI that was operated with a single pointer, so these multiple devices are usually implemented to operate exclusively. For this reason, technology that improves accuracy and operability when pointing in three-dimensional space has been desired.
The display control device 100 according to the present disclosure solves this problem by the following configuration. That is, the display control device 100 acquires position and orientation information in three-dimensional space from the first controller, and displays, in the three-dimensional space, a virtual coordinate plane on which two-dimensional coordinates can be specified, based on the position and orientation information in the three-dimensional space. Furthermore, the display control device 100 detects coordinates on the virtual coordinate plane from the second controller, thereby enabling the user 10 to accurately select a desired position in space. In this case, the first controller is the three-dimensional device 20 that the user 10 holds in one hand. Also, the second controller is the planar device 30 that the user 10 operates with the hand other than the hand holding the first controller (for example, the dominant hand of the user 10).

In other words, the display control device 100 combines two different devices to enable the user 10 to select an arbitrary position. Specifically, the display control device 100 displays a virtual coordinate plane (the surface 50 shown in FIG. 1) in space, and uses the three-dimensional device 20 as an input means for moving the surface 50. The display control device 100 also displays a pointer 60 on the surface 50, and uses the planar device 30 as an input means for moving the pointer 60 two-dimensionally. This allows the user 10 to perform highly accurate pointing operations in virtual space by first displaying the surface 50 at an arbitrary location in space (for example, near the object 70) and then performing input using the planar device 30, which can specify the precise position of the pointer 60.

The above display control process will be described with reference to FIG. 1. In the example shown in FIG. 1, the display control device 100 displays the surface 50 in a virtual space. A virtual handle is set on the surface 50, and by operating the three-dimensional device 20 while gripping the handle, the user 10 can obtain feedback as if he or she were gripping the handle with his or her own hand and moving the surface 50.

The user 10 places the surface 50 near the object 70. The user 10 then operates the planar device 30 placed on the table to specify the position of the pointer 60 on the surface 50. The coordinates on the surface 50 and the spatial coordinates in the virtual space are associated with each other. For this reason, the display control device 100 can, for example, obtain an intersection point between the extension line of the position of the pointer 60 and the outline (3D model) of the object 70 based on the position and orientation (angle) of the surface 50. In the example of FIG. 1, the display control device 100 obtains the spatial coordinate 65, which is an arbitrary point on the object 70, as the position corresponding to the pointer 60. In other words, the user 10 can indicate a specific position on the object 70 by specifying the position of the pointer 60 on the surface 50. This allows the user 10 to select a point, edge, face, or the like that he or she wants to edit on the object 70 with high accuracy.
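As a rough sketch of this mapping (illustrative only; the function names, the sphere used as a stand-in for the object's 3D model, and the numeric values are assumptions, not part of the disclosure), the fragment below converts a 2D pointer coordinate on the surface 50 into a world-space point using the pose of the surface, then casts a ray along the surface normal and intersects it with the object.

```python
import numpy as np

def pointer_to_world(plane_origin, plane_rot, uv):
    """Map a 2D pointer coordinate (u, v) on the virtual plane to a world-space point.
    plane_rot is a 3x3 rotation whose columns are the plane's u axis, v axis and normal."""
    u_axis, v_axis, normal = plane_rot[:, 0], plane_rot[:, 1], plane_rot[:, 2]
    return plane_origin + uv[0] * u_axis + uv[1] * v_axis, normal

def intersect_ray_sphere(origin, direction, center, radius):
    """Nearest intersection of a ray with a sphere, or None.
    (A real system would intersect the object's mesh instead of a sphere.)"""
    oc = origin - center
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - np.sqrt(disc)
    return origin + t * direction if t >= 0.0 else None

# Pointer at (2 cm, -1 cm) on a plane facing +z; object centered 30 cm behind the plane.
plane_origin = np.array([0.0, 1.0, 0.5])
plane_rot = np.eye(3)
point_on_plane, normal = pointer_to_world(plane_origin, plane_rot, (0.02, -0.01))
selected_point = intersect_ray_sphere(point_on_plane, normal,
                                      center=np.array([0.0, 1.0, 0.8]), radius=0.1)
```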
Next, another display example will be shown using FIG. 2. FIG. 2 is a diagram (1) showing an example of the display control process according to the embodiment. The example in FIG. 2 shows a state in which the user 10 operates the three-dimensional device 20 (step S1) and moves the surface 50 in a direction toward the same object 70 as in FIG. 1.

In this way, the user 10 can more accurately specify a specific position of the object 70 by moving the surface 50 to a position where it is almost in contact with the object 70. The example in FIG. 2 shows a situation in which the user 10 moves the surface 50 and then operates the planar device 30 to select a specific edge 66 of the object 70.

Next, another display example will be shown using FIG. 3. FIG. 3 is a diagram (2) showing an example of the display control process according to the embodiment. The example shown in the upper part of FIG. 3 shows a state in which the user 10 operates the three-dimensional device 20 to move the surface 50 in a direction toward the object 70, similar to FIG. 2.

On the other hand, in the example shown in the lower part of FIG. 3, the position of the surface 50 is fixed by the operation of the user 10. In this case, the user 10 can switch the operation target to the object 70 and operate the object 70 so as to move it closer to the surface 50. By using such a method, the user 10 can also move the surface 50 and the object 70 relatively closer to each other, thereby enabling accurate input.

As described above with reference to Figures 1 to 3, the display control process according to the embodiment displays the surface 50 in space, allowing the three-dimensional device 20 and the planar device 30 to be used together to specify a position in space, thereby improving operability and input accuracy.

Note that each device in FIG. 1 conceptually represents a function in the display control system 1, and may take various forms depending on the embodiment. For example, the display control device 100 may be composed of two or more devices each having a different function, which will be described later. Specifically, the display control device 100 may be a device in which the display unit and the information processing unit are separately configured. In this case, the information processing unit of the display control device 100 may be any information processing device, such as a server or a PC (Personal Computer).
(1-2. Configuration of the display control device according to the embodiment)
Next, a description will be given of the configuration of the display control device 100. Fig. 4 is a diagram showing an example of the configuration of the display control device 100 according to the embodiment.
As shown in FIG. 4, the display control device 100 has a communication unit 110, a storage unit 120, a control unit 130, a sensor unit 140, and a display unit 150. The display control device 100 may also have an input unit (such as an operation button or a touch panel) that accepts various operations from the user 10 who operates the display control device 100.

The communication unit 110 is realized, for example, by a NIC (Network Interface Card) or a network interface controller. The communication unit 110 is connected to a network N by wire or wirelessly, and transmits and receives information to and from the three-dimensional device 20, the planar device 30, and the like via the network N. The network N is realized, for example, by a wireless communication standard or method such as Bluetooth (registered trademark), the Internet, Wi-Fi (registered trademark), UWB (Ultra Wide Band), or LPWA (Low Power Wide Area).

The storage unit 120 is realized, for example, by a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or an optical disk.

The storage unit 120 stores various information related to the display control process according to the embodiment. For example, the storage unit 120 stores a 3D model of a virtual object to be displayed on the display unit 150.

The sensor unit 140 is a sensor that detects various types of environmental information. For example, the sensor unit 140 includes an outward-facing camera that captures images of the outside of the display control device 100 and an inward-facing camera that captures images of the user 10.

For example, the sensor unit 140 functions as a recognition camera for recognizing the space in front of the user's eyes. For example, the sensor unit 140 recognizes the spatial position of the three-dimensional device 20 by photographing the three-dimensional device 20, whose spatial position has been calibrated in advance, and acquires position and orientation information of the three-dimensional device 20.

The sensor unit 140 also recognizes a subject (for example, a real object located in real space) located in front of the display control device 100. In this case, the sensor unit 140 acquires an image of the subject located in front of the user, and can calculate the distance from the display control device 100 (in other words, the position of the user's viewpoint) to the subject based on the parallax between images captured by the stereo camera. Alternatively, the sensor unit 140 may detect the distance in real space using a depth sensor capable of detecting the distance to a real object.
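For reference, the parallax-based distance estimate mentioned above follows the standard pinhole-stereo relation; the sketch below uses made-up example values for the focal length and baseline, which are not parameters given in this disclosure.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Classic stereo relation: distance Z = f * B / d (pinhole camera model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# e.g. f = 600 px, baseline = 6 cm, disparity = 12 px  ->  subject is 3.0 m away
distance_m = depth_from_disparity(600.0, 0.06, 12.0)
```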
In addition to the function as a recognition camera, the sensor unit 140 may have a function of detecting various information related to the user's motion, such as the orientation, inclination, motion, and movement speed of the user's body. Specifically, the sensor unit 140 detects, as information related to the user's motion, information related to the user's head and posture, the motion of the user's head and body (acceleration and angular velocity), the direction of the field of view, the speed of viewpoint movement, and the like. For example, the sensor unit 140 functions as various motion sensors such as a three-axis acceleration sensor, a gyro sensor, and a speed sensor, and detects information related to the user's motion. More specifically, the sensor unit 140 detects a change in at least one of the position and posture of the user's head by detecting the yaw, pitch, and roll components of the motion of the user's head. Note that the sensor unit 140 does not necessarily need to be provided in the display control device 100, and may be, for example, an external sensor connected to the display control device 100 by wire or wirelessly.

The display unit 150 displays various information output from the control unit 130. For example, the display unit 150 is a display that outputs video to the user 10. The display unit 150 may also include an audio output unit (such as a speaker) that outputs audio.

The control unit 130 is realized, for example, by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU, or the like executing a program stored inside the display control device 100 using a RAM or the like as a working area. The control unit 130 is also a controller, and may be realized, for example, by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).

As shown in FIG. 4, the control unit 130 has an acquisition unit 131, a first detection unit 132, a second detection unit 133, and a display control unit 134.

The acquisition unit 131 acquires various types of information. For example, the acquisition unit 131 acquires position and orientation information in three-dimensional space from the three-dimensional device 20, which is the first controller. The acquisition unit 131 also acquires position information, which is information indicating a position on the surface 50, from the planar device 30, which is the second controller.
The first detection unit 132 detects the movement of the surface 50 in three-dimensional space based on the input from the three-dimensional device 20. Specifically, the first detection unit 132 detects (calculates) the spatial position and orientation information of the three-dimensional device 20 by combining the image of the three-dimensional device 20 captured by the sensor unit 140 and the position and orientation information transmitted from the three-dimensional device 20. That is, the position and orientation information of the three-dimensional device 20 is obtained by merging the image information obtained by photographing the three-dimensional device 20 with the outward-facing camera of the sensor unit 140 and the IMU (Inertial Measurement Unit) information provided by the three-dimensional device 20. Note that the input from the three-dimensional device 20 refers to the position and orientation information (or its variation) measured by the IMU or the like as a result of the three-dimensional device 20 being operated in space being input to the display control device 100 via the communication unit 110.
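The publication does not specify how the camera-based estimate and the IMU information are merged; as one hedged illustration, a simple position-only complementary filter such as the sketch below (the class name and blend factor are assumptions) integrates the IMU between camera observations and pulls the estimate back toward each camera-based pose to cancel drift.

```python
import numpy as np

class PoseFusion:
    """Minimal position-only complementary filter (illustrative, not the disclosed method)."""

    def __init__(self, blend=0.1):
        self.position = np.zeros(3)
        self.velocity = np.zeros(3)
        self.blend = blend  # weight given to each camera correction

    def predict(self, accel_world, dt):
        # Dead-reckon with IMU acceleration (assumed gravity-compensated, in the world frame).
        self.velocity += np.asarray(accel_world) * dt
        self.position += self.velocity * dt

    def correct(self, camera_position):
        # Blend the integrated estimate toward the camera-based observation.
        self.position += self.blend * (np.asarray(camera_position) - self.position)
```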
For example, the first detection unit 132 detects the movement and the position and orientation information of the surface 50 by detecting the variation in the position and orientation information obtained as a result of the user 10, gripping the handle virtually attached to the surface 50, operating the three-dimensional device 20.

Note that, in addition to the above processing, various known technologies may be used to detect the position of the three-dimensional device 20. For example, if infrared LEDs (Infrared Light Emitting Diodes) are embedded in the three-dimensional device 20, the first detection unit 132 can detect the position and posture of the three-dimensional device 20 by observing the lit infrared LEDs with the outward-facing camera of the sensor unit 140 and combining this with the IMU information. In addition, the first detection unit 132 can also detect the position and posture of the three-dimensional device 20 by using hand tracking technology, that is, by photographing the hand of the user 10 with the camera of the sensor unit 140 or a camera installed in the room and estimating the position and posture of the hand.

The first detection unit 132 may also detect the relative movement of the surface 50, rather than the movement of the surface 50 itself. In other words, the first detection unit 132 may detect the movement of an object whose position can be specified using the surface 50, based on an input from the three-dimensional device 20. In this case, the first detection unit 132 receives the designation of the object that the user 10 wishes to move, and performs processing to move the object closer to or farther away from the surface 50 in accordance with the operation of the user 10.
The second detection unit 133 detects a position specified on the surface 50 (for example, the position of a pointer indicating the specified position) based on an input from the planar device 30, which is the second controller. Note that since the surface 50 is given position and orientation information (such as the inclination of the plane) in space based on the operation of the three-dimensional device 20, the pointer located on the surface 50 also has not only a two-dimensional position (the coordinates on the surface 50) but also position and orientation information corresponding to the surface 50.

The second detection unit 133 also detects a selected part of the object based on the position specified on the surface 50. For example, the second detection unit 133 detects a part of the object, such as a point, edge, or face, that corresponds to the position indicated by the pointer, based on the specification by the user 10 or initial settings.

At this time, the second detection unit 133 may detect the part selected on the object based on the relationship between the angle of the surface 50 and the position of the pointer on one hand and the position of the object on the other. Specifically, the second detection unit 133 detects, as the part designated by the user 10, the point where the extension line of the pointer, determined from the angle of the surface 50 and the position of the pointer, intersects with the object.

For example, the second detection unit 133 may detect, as the part specified by the user 10, a point where an extension line starting from the pointer position in the direction normal to the surface 50 intersects with the object. Alternatively, the second detection unit 133 may detect, as the part specified by the user 10, a point where the extension of the line connecting the viewpoint position of the user 10 and the pointer position intersects with the object. Alternatively, the second detection unit 133 may detect the part selected on the object based on the position where a line extending horizontally from the specified pointer position on the surface 50 intersects with the object, regardless of the angle of the surface 50.
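The three candidate ray directions described above can be written compactly; the helper below is a hypothetical sketch that only derives the ray, leaving the intersection with the object's model to whichever picking routine is used.

```python
import numpy as np

def selection_ray(pointer_world, plane_normal, eye_position=None, mode="normal"):
    """Return (origin, unit direction) of the selection ray starting at the pointer.
    mode: 'normal'     - along the normal of the surface 50,
          'eye'        - along the line from the user's viewpoint through the pointer,
          'horizontal' - horizontal ray that ignores the tilt of the surface 50."""
    if mode == "normal":
        d = np.asarray(plane_normal, dtype=float)
    elif mode == "eye":
        d = np.asarray(pointer_world, dtype=float) - np.asarray(eye_position, dtype=float)
    elif mode == "horizontal":
        d = np.array([plane_normal[0], 0.0, plane_normal[2]])
    else:
        raise ValueError(f"unknown mode: {mode}")
    norm = np.linalg.norm(d)
    if norm == 0.0:
        raise ValueError("degenerate ray direction")
    return np.asarray(pointer_world, dtype=float), d / norm
```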
The display control unit 134 controls the display unit 150 to display the information output from the control unit 130. In other words, the display control unit 134 outputs the virtual space image rendered as video content to the output destination device. Note that the output destination device is not limited to a head-mounted display such as the display unit 150, and may also be a video output device such as an external display, a smartphone, or a television.

For example, the display control unit 134 displays the surface 50, which is a virtual coordinate plane on which two-dimensional coordinates can be specified, in three-dimensional space based on the position and orientation information of the three-dimensional device 20 in the three-dimensional space.

The display control unit 134 also controls the display of the pointer on the surface 50 based on the position information on the surface 50 input from the planar device 30. In addition, when a part of the object is selected based on the pointer position, the display control unit 134 clearly indicates the selected position.
(1-3. Processing Procedure According to the Embodiment)
Next, the procedure of the process according to the embodiment will be described with reference to Fig. 5. Fig. 5 is a sequence diagram showing the flow of the display control process according to the embodiment.
As shown in FIG. 5, the three-dimensional device 20 acquires position and orientation information of its own device using an IMU or the like (step S11). Then, the three-dimensional device 20 transmits the acquired position and orientation information to the display control device 100 (step S12).

The display control device 100 captures an image of the three-dimensional device 20 using the outward-facing camera or the like (step S21). Then, the display control device 100 estimates the position and orientation of the three-dimensional device 20 (step S22).

Then, the display control device 100 combines the position and orientation information acquired from the three-dimensional device 20 and the position and orientation information obtained by the imaging to identify the position and orientation of the three-dimensional device 20 (step S23). The display control device 100 detects the position and orientation of the surface 50 in the virtual space based on the position and orientation information of the three-dimensional device 20 (step S24). The display control device 100 then displays the surface 50 on the display unit 150 based on the detected position and orientation of the surface 50.

Once the position of the surface 50 is determined, the planar device 30 acquires planar coordinate information for identifying a position on the surface 50 (step S31). The planar device 30 then transmits the acquired planar coordinate information to the display control device 100 (step S32).

The display control device 100 detects the pointer position on the surface 50 based on the information acquired from the planar device 30 (step S25). Then, the display control device 100 detects the position specified by the user 10 within the object based on the position and orientation of the pointer (step S26). Note that the orientation information of the pointer in this case is obtained based on, for example, the angle of the surface 50.
(1-4. Modified Examples)
(1-4-1. First Modification: Operation of Only One Device)
The process according to the embodiment may be modified in various ways. For example, in the embodiment described above, the user 10 operates the surface 50 and the pointer using both the three-dimensional device 20 and the planar device 30. However, the user 10 can also execute the information processing according to the embodiment using only one of the devices.
In other words, the display control device 100 can perform the display control processing according to the present disclosure even if the first controller (the three-dimensional device 20) and the second controller (the planar device 30) are the same device. In this case, the first detection unit 132 detects the same device as the first controller when the same device is located at a position away from a plane in real space by a predetermined distance. Furthermore, the second detection unit 133 detects the same device as the second controller when the same device is located within the predetermined distance from the plane in real space.
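A minimal sketch of this distance-based role switch is shown below; the threshold value and the way the height above the desk is obtained are assumptions for illustration only.

```python
def controller_role(height_above_desk_m, threshold_m=0.05):
    """Treat the single device as the 6DoF first controller when it is lifted away from
    the desk plane, and as the planar second controller while it stays near the desk."""
    return "first controller" if height_above_desk_m > threshold_m else "second controller"

# lifted 20 cm above the desk -> used to move the surface 50
assert controller_role(0.20) == "first controller"
# resting on the desk -> used to move the pointer 60
assert controller_role(0.01) == "second controller"
```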
Alternatively, when the first detection unit 132 receives a designation from the user 10 of the same device to use the same device as the first controller, it may detect the same device as the first controller. Furthermore, when the second detection unit 133 receives a designation from the user 10 of the same device to use the same device as the second controller, it may detect the same device as the second controller.

The above process will be explained using Figures 6A and 6B. Figure 6A is a diagram for explaining device operations according to the first modified example.

In the left part of FIG. 6A, the user 10 operates the three-dimensional device 20 and the planar device 30, as in the embodiment.

On the other hand, in the right part of FIG. 6A, the user 10 operates only the planar device 30. In this case, when the planar device 30 is lifted from the table, the display control device 100 recognizes this operation. Then, while the planar device 30 is in the air, the display control device 100 recognizes the planar device 30 as a device that performs the same operations as the three-dimensional device 20. Specifically, while the planar device 30 is in the air, the display control device 100 recognizes the movement of the device as that of a device for moving the surface 50.

When the user 10 places the planar device 30 back on the table, the display control device 100 recognizes the planar device 30 as a device for operating the pointer on the surface 50. Through this process, the user 10 can realize the processing according to the embodiment with only one device.

Note that the display control device 100 can also handle the three-dimensional device 20 in the same way as the planar device 30. For example, when the three-dimensional device 20 is moved near the tabletop, the display control device 100 recognizes such movement and handles the three-dimensional device 20 as the planar device 30. More specifically, when the three-dimensional device 20 is located near the tabletop, the display control device 100 can invalidate the input of position and orientation from the three-dimensional device 20 and handle it as a planar device 30 that inputs position information by touching the desk like a touch pen.
Note that, when the display control device 100 performs the processing according to the embodiment with only one device, the display control device 100 may, in addition to the above processing, accept an explicit switching operation from the user 10. For example, if the three-dimensional device 20 has a function capable of specifying a two-dimensional position (such as a trackball), the user 10 can flexibly perform the above switching.

In addition, when the display unit 150 of the display control device 100 can be temporarily flipped up out of the way (a mechanism referred to as flip-up or the like), the display control device 100 may treat the three-dimensional device 20 as the planar device 30 while the display unit is flipped up. This is because the user 10 cannot view stereoscopically while the display unit is flipped up.

The flow of the above process will be explained using FIG. 6B. FIG. 6B is a flowchart showing the flow of the display control process according to the first modified example.
The display control device 100 determines whether the device for operating the surface 50 and the device for operating the pointer are separate devices (step S41). If they are separate devices (step S41; Yes), the display control device 100 assigns each device the behavior according to its original settings, as in the embodiment.

If the devices for operating the surface 50 and the pointer are not separate devices (step S41; No), the display control device 100 determines whether the device is recognized on a tabletop (an arbitrary plane in three-dimensional space) (step S42). If the device is on the tabletop (step S42; Yes), the display control device 100 treats the device as the planar device 30.

On the other hand, if the device is not on the tabletop (step S42; No), the display control device 100 determines whether or not the user 10 has designated the device as the planar device 30 (step S43). If the user 10 has designated the device as the planar device 30 (step S43; Yes), the display control device 100 treats the device as the planar device 30.

On the other hand, if the device is not designated as the planar device 30 (step S43; No), the display control device 100 determines whether the HMD is flipped up (step S44).

If the HMD is not flipped up (step S44; No), the display control device 100 treats the device as the three-dimensional device 20. On the other hand, if the HMD is flipped up (step S44; Yes), the display control device 100 treats the device as the planar device 30.
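Putting the branches of Fig. 6B together, the decision flow might be sketched as follows (function and flag names are invented for illustration; only the branching order comes from the flowchart).

```python
def decide_device_role(separate_devices, on_desk, designated_planar, hmd_flipped_up):
    """Mirrors the flow of Fig. 6B and returns how the shared device should be treated."""
    if separate_devices:          # step S41: Yes -> keep the original per-device roles
        return "original assignment"
    if on_desk:                   # step S42: Yes
        return "planar device 30"
    if designated_planar:         # step S43: Yes
        return "planar device 30"
    if hmd_flipped_up:            # step S44: Yes
        return "planar device 30"
    return "three-dimensional device 20"
```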
(1-4-2. Second Modification: Adjustment of Operation Feel)
In the above embodiment, an example has been described in which the user 10 moves the surface 50 by operating the three-dimensional device 20. For example, the surface 50 is provided with a virtual handle, and the user 10 can move the surface 50 as if it were a real object by operating the three-dimensional device 20 and gripping the handle. However, in a virtual space, a situation is also conceivable in which the user 10 wants to move the surface 50 farther than the distance the user 10 can actually reach with his or her hand. Alternatively, a situation is also conceivable in which the surface 50 is placed farther away than the user 10 can reach, so that the handle cannot be gripped.
 このような状況の処理について、図7を用いて説明する。図7は、第2の変形例に係る表示制御処理を説明するための図(1)である。図7の左部では、ユーザ10は、実施形態と同様、手で面50の持ち手を把持し、距離200だけ手を動かすことで、面50を移動させる。 The processing for such a situation will be explained using FIG. 7. FIG. 7 is a diagram (1) for explaining the display control processing relating to the second modified example. In the left part of FIG. 7, the user 10 grasps the handle of the surface 50 with his/her hand, as in the embodiment, and moves the hand by a distance of 200 to move the surface 50.
 一方、図7の右部では、ユーザ10は、実際に手を移動する距離205よりも、より遠くに面50を移動させることを所望する。この場合、表示制御装置100は、仮想空間上に手から面50に向かう方向に光線(レイ:Ray)を表示するなどして、ユーザ10に対してガイドを提供する。例えば、ユーザ10は、遠隔操作モードなど、予め表示制御装置100によって設定された表示モードを明示的に選択するなどして、ガイドを表示させることができる。この場合、表示制御装置100は、ユーザ10が面50の持ち手を把持せずとも、ユーザ10の手の動きのみで、実際の手の移動距離よりも長い距離だけ面50を移動させるようにしてもよい。 On the other hand, in the right part of FIG. 7, the user 10 desires to move the surface 50 farther than the distance 205 that the user 10 actually moves his/her hand. In this case, the display control device 100 provides a guide to the user 10 by, for example, displaying a ray in the virtual space from the hand toward the surface 50. For example, the user 10 can display the guide by explicitly selecting a display mode that has been set in advance by the display control device 100, such as a remote operation mode. In this case, the display control device 100 may move the surface 50 a distance longer than the actual distance the user's 10 hand moves, using only the movement of the user's 10 hand, even if the user 10 does not hold the surface 50 with his/her hand.
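A minimal sketch of the amplified movement described above: while the assumed remote-operation mode is active, the hand's displacement is multiplied by a gain before being applied to the surface 50, so the surface can travel farther than the hand actually moves. The gain value of 3.0 and all names are assumptions for illustration.

    import numpy as np

    def move_surface(surface_pos, hand_delta, remote_mode, gain=3.0):
        """Apply the hand movement to the surface 50.

        In the normal mode the surface follows the hand 1:1; in the assumed
        remote-operation mode the displacement is amplified by `gain` so the
        surface can be moved farther than the hand actually travels.
        """
        scale = gain if remote_mode else 1.0
        return np.asarray(surface_pos, float) + scale * np.asarray(hand_delta, float)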
 また、表示制御装置100は、面50が遠くに設置されている場合、手から延長されたガイドに基づいて、あたかもユーザ10が持ち手を把持しているかのように取り扱い、面50を移動させてもよい。 In addition, when the surface 50 is placed far away, the display control device 100 may move the surface 50 by treating it as if the user 10 were holding it with their hand, based on a guide extended from the hand.
 また、表示制御装置100は、ユーザ10が面50の持ち手ではない空間上の領域を把持した場合、図3に示したように、面50ではなくオブジェクトを移動させるよう表示制御してもよい。 Furthermore, when the user 10 grasps an area in space that is not the handle of the surface 50, the display control device 100 may perform display control to move the object instead of the surface 50, as shown in FIG. 3.
 また、表示制御装置100は、面50のみならず、オブジェクトを遠隔操作できるよう制御してもよい。この点について、図8を用いて説明する。図8は、第2の変形例に係る表示制御処理を説明するための図(2)である。 The display control device 100 may also control not only the surface 50 but also objects so that they can be remotely operated. This will be explained using FIG. 8. FIG. 8 is a diagram (2) for explaining the display control process according to the second modified example.
 図8の例では、表示制御装置100は、面50を突き抜けて、オブジェクト70を把持するためのガイド210を表示する。ユーザ10は、ガイド210を介してオブジェクト70を把持したり選択したりできる。これにより、ユーザ10は、仮想空間において手が届かない位置に設置されたオブジェクト70に対しても、把持や移動といった操作を行うことができる。 In the example of FIG. 8, the display control device 100 displays a guide 210 for grasping the object 70 through the surface 50. The user 10 can grasp or select the object 70 via the guide 210. This allows the user 10 to perform operations such as grasping or moving an object 70 that is placed in a position that is out of reach in the virtual space.
 なお、ユーザ10がオブジェクト70の一部を選択することを所望する場合に、オブジェクト70と面50とが接触していると、接触しているオブジェクト70の一部を選択できるが、接触していない部位は選択できない事態が発生しうる。この場合、表示制御装置100は、ポインタの動作モードを切り替えるなどして、面上のポインタから特定方向に光線(ガイド)を出し、オブジェクト70の一部を選択できるようにしてもよい。特定方向とは、例えば、面50に対する法線方向や、ユーザ10の視点位置とポインタを結んだ方向等、ユーザ10の視点位置とポインタ位置に基づき決定可能な方向である。また、表示制御装置100は、ユーザ10がオブジェクト70の一部を選択しやすくするため、オブジェクト70と面50が接触していない状態から、接触した際には、接触した部位に面上のポインタを移動させてもよい。 Note that when the user 10 wishes to select a part of the object 70, if the object 70 and the surface 50 are in contact, the part of the object 70 that is in contact can be selected, but a situation may occur in which a part that is not in contact cannot be selected. In this case, the display control device 100 may switch the pointer operation mode, for example, to emit a ray (guide) from the pointer on the surface in a specific direction, so that a part of the object 70 can be selected. The specific direction is, for example, a direction that can be determined based on the viewpoint position and pointer position of the user 10, such as a normal direction to the surface 50 or a direction connecting the viewpoint position of the user 10 and the pointer. Furthermore, in order to make it easier for the user 10 to select a part of the object 70, the display control device 100 may move the pointer on the surface to the part that is in contact when the object 70 and the surface 50 are not in contact with each other.
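The switchable pointer mode described above can be sketched as a ray cast from the on-surface pointer, either along the normal of the surface 50 or along the line from the user's viewpoint through the pointer. In the sketch below, intersect_object is an assumed ray-casting helper supplied by the rendering engine and is not part of the disclosure.

    import numpy as np

    def pick_direction(mode, surface_normal, eye_pos, pointer_pos):
        """Direction of the selection ray emitted from the on-surface pointer."""
        if mode == "normal":                       # normal direction of the surface 50
            d = np.asarray(surface_normal, float)
        else:                                      # line from the viewpoint through the pointer
            d = np.asarray(pointer_pos, float) - np.asarray(eye_pos, float)
        return d / np.linalg.norm(d)

    def pick_point(pointer_pos, direction, intersect_object):
        """First intersection of the ray with the object, or None.

        `intersect_object(origin, direction)` is an assumed ray-cast helper.
        """
        return intersect_object(np.asarray(pointer_pos, float), direction)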
 また、表示制御装置100は、仮想空間上で面50が把持できない位置に移動しないよう、面50の位置を固定してもよい。すなわち、表示制御部134は、立体用デバイス20の3次元空間内の位置姿勢情報にかかわらず、ユーザ10の視点位置に基づいて、面50を3次元空間上に表示してもよい。この点について、図9を用いて説明する。図9は、第2の変形例に係る表示制御処理を説明するための図(3)である。 The display control device 100 may also fix the position of the surface 50 so that the surface 50 does not move to a position in the virtual space that cannot be grasped. In other words, the display control unit 134 may display the surface 50 in three-dimensional space based on the viewpoint position of the user 10, regardless of the position and orientation information of the three-dimensional device 20 in three-dimensional space. This point will be explained using FIG. 9. FIG. 9 is a diagram (3) for explaining the display control process related to the second modified example.
 図9の例では、表示制御装置100は、表示制御装置100から一定の距離215に面50を表示するよう制御する。この場合、面50は、表示制御装置100の移動に連動して移動する。これにより、ユーザ10は、立体用デバイス20を用いなくても、自身の視線をオブジェクト70に向けること等により、オブジェクト70の任意の位置を選択するための面50を移動させることができる。なお、常に表示制御装置100と連動して面50が移動すると、面50を一定の位置にとどめるのが困難となるため、表示制御装置100は、ユーザ10の要求に応じて面50の自動移動をオンもしくはオフにしてもよい。 In the example of FIG. 9, the display control device 100 controls so that the surface 50 is displayed at a fixed distance 215 from the display control device 100. In this case, the surface 50 moves in conjunction with the movement of the display control device 100. This allows the user 10 to move the surface 50 for selecting an arbitrary position of the object 70 by, for example, directing his or her line of sight toward the object 70 without using the stereoscopic device 20. Note that if the surface 50 always moves in conjunction with the display control device 100, it becomes difficult to keep the surface 50 in a fixed position, so the display control device 100 may turn the automatic movement of the surface 50 on or off in response to a request from the user 10.
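A minimal sketch of the head-locked placement described above, assuming the pose of the display control device 100 is available as a position and a forward vector: the surface 50 is re-placed at a fixed distance in front of the device each frame, and the update is skipped when automatic movement is turned off. The 0.6 m default stands in for the distance 215 and is an assumed value.

    import numpy as np

    def update_surface_pose(hmd_pos, hmd_forward, current_pos, distance=0.6, auto_follow=True):
        """Keep the surface 50 at a fixed distance in front of the display control device 100.

        When auto_follow is off the surface keeps its current position, so
        the user can pin it in place.
        """
        if not auto_follow:
            return np.asarray(current_pos, float)
        forward = np.asarray(hmd_forward, float)
        forward = forward / np.linalg.norm(forward)
        return np.asarray(hmd_pos, float) + distance * forward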
 なお、表示制御装置100は、面50の位置姿勢を変更するだけでなく、大きさを変更してもよい。例えば、表示制御装置100は、ユーザ10がマウスホイールを回転させた操作等に連動して、面50の大きさを変更する。また、表示制御装置100は、面50の色を変更してもよい。 Note that the display control device 100 may not only change the position and orientation of the surface 50, but also change the size. For example, the display control device 100 changes the size of the surface 50 in conjunction with an operation such as the user 10 rotating the mouse wheel. The display control device 100 may also change the color of the surface 50.
 また、表示制御装置100は、面50が設置された位置に応じて、仮想空間のオブジェクトの断面を切り出す処理や、面50の手前側の領域に少しでも入っているオブジェクトを非表示とする処理等を行ってもよい。すなわち、表示制御装置100は、面50の位置に応じて、オブジェクトの表示内容を変更してもよい。これにより、表示制御装置100は、オブジェクトの視認性を向上させることができる。 The display control device 100 may also perform processing such as cutting out a cross section of an object in virtual space, or hiding any object that is even slightly within the area in front of the surface 50, depending on the position where the surface 50 is installed. In other words, the display control device 100 may change the display content of an object depending on the position of the surface 50. This allows the display control device 100 to improve the visibility of the object.
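The hiding behavior described above can be sketched as a plane-side test: any object whose bounding points fall even partly on the near side of the surface 50 is made invisible. The bounding_points and visible attributes, and the convention that the surface normal points toward the viewer, are assumptions for illustration.

    import numpy as np

    def near_side_of(surface_point, surface_normal, point):
        """True if the point lies on the viewer-side (front) of the surface 50."""
        n = np.asarray(surface_normal, float)
        return float(np.dot(np.asarray(point, float) - surface_point, n)) > 0.0

    def update_visibility(objects, surface_point, surface_normal):
        """Hide any object that is even partly in front of the surface 50."""
        for obj in objects:
            obj.visible = not any(
                near_side_of(surface_point, surface_normal, p) for p in obj.bounding_points
            )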
(1-4-3.第3の変形例:実空間に重畳する処理)
 また、表示制御装置100は、AR表示のように、面50を実空間上に重畳してもよい。この場合、表示制御装置100は、面50を実世界の物体の面や辺にスナップして、ユーザが位置調整しやすいよう動きを調整してもよい。すなわち、表示制御部134が実空間の3次元空間上に面50を表示するとき、第1検出部132は、立体用デバイス20による入力に基づき、実空間上の任意の平面に面50が所定距離が接近した場合に、当該平面に面50が重畳して表示されるよう、当該平面に吸着(スナップ)された位置で面50を検出してもよい。
(1-4-3. Third Modification: Processing for Superimposing on Real Space)
The display control device 100 may also superimpose the surface 50 on the real space, as in AR display. In this case, the display control device 100 may snap the surface 50 to a surface or edge of an object in the real world, and adjust the movement so that the user can easily adjust the position. That is, when the display control unit 134 displays the surface 50 in a three-dimensional space in the real space, the first detection unit 132 may detect the surface 50 at a position where it is attached (snapped) to an arbitrary plane in the real space based on an input from the stereoscopic device 20, so that the surface 50 is displayed superimposed on the plane when the surface 50 approaches the plane by a predetermined distance.
 この点について、図10以下を用いて説明する。図10は、第3の変形例に係る表示制御処理を説明するための図(1)である。 This point will be explained using FIG. 10 and subsequent figures. FIG. 10 is a diagram (1) for explaining the display control process according to the third modified example.
 図10は、現実空間に面50が重畳された様子を示す。具体的には、図10の例では、現実のディスプレイ225の画面に、持ち手220が付与された面50が貼り付くように表示されている様子を示す。なお、面50がスナップされるような平面を認識した場合、表示制御装置100は、平面(この例ではディスプレイ225の外枠)に沿ってガイド230を表示する。 FIG. 10 shows a state in which a surface 50 is superimposed on real space. Specifically, the example in FIG. 10 shows a state in which a surface 50 with a handle 220 is displayed as if it is attached to the screen of a real display 225. When a plane to which the surface 50 can be snapped is recognized, the display control device 100 displays a guide 230 along the plane (the outer frame of the display 225 in this example).
 例えば、ユーザ10が持ち手220を把持して面50をディスプレイ225に近づけていくと、表示制御装置100は、ディスプレイ225と面50とが所定距離以内に近付いたことを認識し、ガイド230に沿って面50をディスプレイ225にスナップさせる。 For example, when the user 10 grasps the handle 220 and brings the surface 50 closer to the display 225, the display control device 100 recognizes that the display 225 and the surface 50 have come within a predetermined distance, and snaps the surface 50 to the display 225 along the guide 230.
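A minimal sketch of the snapping behavior, assuming the recognized real-world plane is given by a point and a normal: when the surface 50 comes within a threshold distance of the plane, its position is projected onto the plane and its orientation is aligned with it. The 5 cm threshold is an assumed value; the disclosure only requires a predetermined distance.

    import numpy as np

    def maybe_snap(surface_pos, surface_normal, plane_point, plane_normal, threshold=0.05):
        """Snap the surface 50 onto a recognized real-world plane when close enough.

        Returns the (possibly snapped) position and normal of the surface.
        """
        p = np.asarray(surface_pos, float)
        q = np.asarray(plane_point, float)
        n = np.asarray(plane_normal, float)
        n = n / np.linalg.norm(n)
        dist = abs(float(np.dot(p - q, n)))        # distance from the surface to the plane
        if dist <= threshold:
            snapped_pos = p - np.dot(p - q, n) * n # project the surface onto the plane
            return snapped_pos, n                  # align the surface normal with the plane
        return p, np.asarray(surface_normal, float)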
 これにより、ユーザ10は、ディスプレイ225に投影されるオブジェクト等を選択する際に、ぴったりと貼り付いた面50を利用することができるので、AR空間においても、精度よく選択操作等を行うことができる。なお、表示制御装置100は、ディスプレイ225の前面ではなく、画面に隣接した側面に面50を貼り付けてもよい。これにより、ユーザ10は、ディスプレイ225を通常のモニターとして利用するとともに、その横に仮想表示させた面50上での操作等を並行して行うことができる。 As a result, the user 10 can use the surface 50 that is tightly attached when selecting an object or the like projected on the display 225, and can perform selection operations with precision even in the AR space. Note that the display control device 100 may attach the surface 50 to a side surface adjacent to the screen, rather than to the front surface of the display 225. This allows the user 10 to use the display 225 as a normal monitor, while simultaneously performing operations on the surface 50 that is virtually displayed next to it.
 他の例について、図11を用いて説明する。図11は、第3の変形例に係る表示制御処理を説明するための図(2)である。 Another example will be described with reference to FIG. 11. FIG. 11 is a diagram (2) for explaining the display control process according to the third modified example.
 図11の例では、表示制御装置100は、机240を面50が貼り付く平面として認識する。すなわち、ユーザ10が持ち手220を把持して面50を机240に近づけていくと、表示制御装置100は、机240の上面と面50とが所定距離以内に近付いたことを認識し、面50を机240の上面にスナップさせる。 In the example of FIG. 11, the display control device 100 recognizes the desk 240 as a flat surface to which the surface 50 is attached. In other words, when the user 10 grasps the handle 220 and brings the surface 50 closer to the desk 240, the display control device 100 recognizes that the top surface of the desk 240 and the surface 50 have come within a predetermined distance, and snaps the surface 50 to the top surface of the desk 240.
 これにより、ユーザ10は、ディスプレイ225を通常のモニターとして利用するとともに、ディスプレイ225が載せられた机240を、他の作業を行う別の領域として活用することができる。 This allows the user 10 to use the display 225 as a normal monitor, while also utilizing the desk 240 on which the display 225 is placed as a separate area for performing other tasks.
 さらに、卓上に配置された面50における情報処理について、図12を用いて説明する。図12は、第3の変形例に係る表示制御処理を説明するための図(3)である。 Furthermore, information processing on surface 50 placed on a table will be explained using FIG. 12. FIG. 12 is a diagram (3) for explaining the display control processing related to the third modified example.
 図12の例では、ユーザ10が、面50の奥に表示された仮想オブジェクト305を、面50における座標指定のための仮想ポインタであるタッチペン300を利用して選択する様子を示す。なお、タッチペン300は、例えば、立体用デバイス20をユーザ10が卓上に近づけた際に表示され、立体用デバイス20のIMU情報等に基づいて操作される。 12 shows the user 10 selecting a virtual object 305 displayed behind the surface 50 using a touch pen 300, which is a virtual pointer for specifying coordinates on the surface 50. The touch pen 300 is displayed, for example, when the user 10 brings the 3D device 20 close to a tabletop, and is operated based on the IMU information of the 3D device 20, etc.
 なお、図12の例の場合、仮想オブジェクト305が面50の奥、すなわち、机面の下に位置するため、現実の動きのままでは、机面が邪魔して、立体用デバイス20を用いてタッチペン300で仮想オブジェクト305を選択することができない。このため、表示制御装置100は、選択操作に関する調整を行う。 In the example of FIG. 12, the virtual object 305 is located behind the surface 50, i.e., below the desk surface, so if the actual movements are used, the desk surface will get in the way and the virtual object 305 cannot be selected with the touch pen 300 using the 3D device 20. For this reason, the display control device 100 makes adjustments regarding the selection operation.
 具体的には、表示制御部134が実空間の3次元空間上に、オブジェクトおよびオブジェクトを選択するためのポインタ(この例ではタッチペン300)を仮想表示するとき、第2検出部133は、ポインタを用いてオブジェクトにユーザ10が仮想的に接触可能なよう、ポインタの大きさもしくは接触点を制御してもよい。この点について、図13を用いて説明する。図13は、第3の変形例に係る表示制御処理を説明するための図(4)である。 Specifically, when the display control unit 134 virtually displays an object and a pointer for selecting the object (in this example, the touch pen 300) in a three-dimensional space in the real space, the second detection unit 133 may control the size or contact point of the pointer so that the user 10 can virtually touch the object using the pointer. This point will be explained using FIG. 13. FIG. 13 is a diagram (4) for explaining the display control process related to the third modified example.
 図13の左部では、表示制御装置100が、立体用デバイス20の現実の動きと同じ移動量だけ、タッチペン300を移動させている例を示す。この場合、タッチペン300は、机面に衝突してしまい、ユーザ10が仮想オブジェクト305に触れた感覚が得られない。 The left part of FIG. 13 shows an example in which the display control device 100 moves the touch pen 300 by the same amount as the actual movement of the 3D device 20. In this case, the touch pen 300 collides with the desk surface, and the user 10 does not get the sensation of touching the virtual object 305.
 このため、表示制御装置100は、仮想オブジェクト305が面50よりも奥側にあると認識する場合、図13の右側のように、動的にタッチペン300の長さを変える。具体的には、表示制御装置100は、仮想オブジェクト305にタッチペン300が接触した際に、実際に机面に立体用デバイス20が衝突するよう調整する。これにより、表示制御装置100は、ユーザ10に正確な力覚フィードバックを与えることができ、かつ、仮想オブジェクト305を突き抜けない正確な表示制御処理を行うことができる。 For this reason, when the display control device 100 recognizes that the virtual object 305 is located behind the surface 50, it dynamically changes the length of the touch pen 300, as shown on the right side of FIG. 13. Specifically, the display control device 100 adjusts the pen length so that the 3D device 20 actually strikes the desk surface at the moment the touch pen 300 comes into contact with the virtual object 305. This allows the display control device 100 to provide accurate force feedback to the user 10, and to perform accurate display control processing that does not penetrate the virtual object 305.
 図14に、タッチペン300の表示の例を示す。図14は、第3の変形例に係る表示制御処理を説明するための図(5)である。 FIG. 14 shows an example of the display of the touch pen 300. FIG. 14 is a diagram (5) for explaining the display control process according to the third modified example.
 例えば、表示制御装置100は、タッチペン300の長さを、ユーザ10が選択しようとしている仮想オブジェクト305との距離に基づき確定する。具体的には、表示制御装置100は、机面と立体用デバイス20との距離310が、タッチペン300の先と仮想オブジェクト305までの距離315と同一となるよう設定する。 For example, the display control device 100 determines the length of the touch pen 300 based on the distance to the virtual object 305 that the user 10 is trying to select. Specifically, the display control device 100 sets the distance 310 between the desk surface and the 3D device 20 to be the same as the distance 315 between the tip of the touch pen 300 and the virtual object 305.
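The length adjustment can be sketched as follows: the pen length is set to the depth of the virtual object 305 below the desk plane, which keeps the desk-to-device distance 310 equal to the pen-tip-to-object distance 315 throughout the approach, so the real collision with the desk and the virtual contact with the object coincide. The axis convention and function names are assumptions.

    import numpy as np

    UP = np.array([0.0, 0.0, 1.0])   # assumed world up axis (normal of the desk surface)

    def pen_length(desk_height, object_pos, up=UP):
        """Length of the virtual touch pen 300.

        Set to the depth of the virtual object 305 below the desk plane, so
        the pen tip reaches the object exactly when the real device 20
        reaches the desk (distance 310 equals distance 315).
        """
        depth = desk_height - float(np.dot(np.asarray(object_pos, float), up))
        return max(depth, 0.0)

    def pen_tip(device_pos, length, up=UP):
        """The pen extends from the handheld device toward the desk surface."""
        return np.asarray(device_pos, float) - length * up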
 また、ユーザ10は、作業場所の状況に応じて、面50の大きさを変化させることもできる。すなわち、第1検出部132は、ユーザ10による立体用デバイス20の操作に基づき、実空間上に仮想表示した面50の位置およびサイズを制御してもよい。この点について、図15を用いて説明する。図15は、第3の変形例に係る表示制御処理を説明するための図(6)である。 The user 10 can also change the size of the surface 50 depending on the situation at the work location. That is, the first detection unit 132 may control the position and size of the surface 50 virtually displayed in real space based on the operation of the three-dimensional device 20 by the user 10. This point will be explained using FIG. 15. FIG. 15 is a diagram (6) for explaining the display control process according to the third modified example.
 図15に示す例では、ユーザ10は、机240上に貼り付けられている面50を用いて作業を行う。このとき、ユーザ10は、持ち手220を把持して面50を拡げるような動作を行うことで、机240の平面上を覆うように、面50を拡張できる。すなわち、表示制御装置100は、ユーザ10の操作に伴い、面50の大きさを拡張もしくは縮小する。これにより、ユーザ10は、面50が貼り付けられた実空間上の平面を有効に利用できるので、よりよい操作性を得ることができる。 In the example shown in FIG. 15, the user 10 performs work using a surface 50 that is attached to a desk 240. At this time, the user 10 grasps the handle 220 and performs an action such as spreading the surface 50, thereby expanding the surface 50 so that it covers the flat surface of the desk 240. In other words, the display control device 100 expands or reduces the size of the surface 50 in response to the operation of the user 10. This allows the user 10 to effectively use the flat surface in real space on which the surface 50 is attached, thereby achieving better operability.
 また、ユーザ10が仮想オブジェクトを選択する際、表示制御装置100は、指定位置を延伸する等して、適切な選択を行うことができるよう制御してもよい。すなわち、第2検出部133は、ポインタによる選択位置を延伸させ、オブジェクトをユーザ10が仮想的に操作可能なよう、当該ポインタの操作を制御してもよい。この点について、図16を用いて説明する。図16は、第3の変形例に係る表示制御処理を説明するための図(7)である。 Furthermore, when the user 10 selects a virtual object, the display control device 100 may control the designated position to allow the user 10 to make an appropriate selection, for example by extending the designated position. In other words, the second detection unit 133 may control the operation of the pointer to extend the selection position using the pointer and allow the user 10 to virtually operate the object. This point will be explained using FIG. 16. FIG. 16 is a diagram (7) for explaining the display control process related to the third modified example.
 図16の例では、面50がディスプレイ225に対して45度の角度で表示されている。また、面50の奥側には、仮想オブジェクトであるカメラ320が表示されている。この場合、ユーザ10は、カメラ320が面50上に存在しないため、直接には操作することができない。 In the example of FIG. 16, a surface 50 is displayed at a 45 degree angle with respect to the display 225. In addition, a camera 320, which is a virtual object, is displayed behind the surface 50. In this case, the user 10 cannot directly operate the camera 320 because it does not exist on the surface 50.
 このとき、表示制御装置100は、ユーザ10が表示したポインタ330から光線を延伸して、延伸した線を示すガイド335、および、ガイド335とカメラ320の交点であるポインタ340を表示する。これにより、ユーザ10は、実際にはカメラ320に触れる位置にポインタ330を移動できなくても、延伸されたポインタ340を用いて、カメラ320を操作することができる。 At this time, the display control device 100 extends a ray from the pointer 330 displayed by the user 10 and displays a guide 335 indicating the extended line, and a pointer 340 which is the intersection of the guide 335 and the camera 320. This allows the user 10 to operate the camera 320 using the extended pointer 340, even if the user 10 cannot actually move the pointer 330 to a position that touches the camera 320.
 表示制御装置100は、AR表示において、面50を任意の角度で表示できる。この点について、図17を用いて説明する。図17は、第3の変形例に係る表示制御処理を説明するための図(8)である。 The display control device 100 can display the surface 50 at any angle in the AR display. This will be explained using FIG. 17. FIG. 17 is a diagram (8) for explaining the display control process according to the third modified example.
 図17の例では、ユーザ10は、持ち手220を把持し、面50を移動させる。この場合、空間上の平面(この例ではディスプレイ225の画面)に対して、表示制御装置100は、ディスプレイ225の画面に対して直角(90度)に面50を表示してもよいし、斜め45度の角度や、0度(すなわち机面)の角度に面50を表示してもよい。すなわち、表示制御装置100は、任意の角度をユーザ10から受け付けることができるので、ユーザ10の所望に沿った作業領域を構築することができる。 In the example of FIG. 17, the user 10 grasps the handle 220 and moves the surface 50. In this case, the display control device 100 may display the surface 50 at a right angle (90 degrees) to the plane in space (the screen of the display 225 in this example), or at an angle of 45 degrees or 0 degrees (i.e., the desk surface). In other words, the display control device 100 can accept any angle from the user 10, and therefore can create a work area that meets the needs of the user 10.
 なお、表示制御装置100は、ユーザ10が仮想オブジェクトの設計や編集を所望する場合、当該対象物に対して所定距離だけオフセットした位置に面50を表示するようにしてもよい。この場合、ユーザ10は、対象物からオフセットされた面50上での選択操作や、あるいは、対象物をスケッチする等の作業を行うことができるため、視認性に優れた作業環境を得ることができる。 When the user 10 wishes to design or edit a virtual object, the display control device 100 may display the surface 50 at a position offset a predetermined distance from the object. In this case, the user 10 can perform selection operations on the surface 50 offset from the object, or can perform tasks such as sketching the object, thereby providing a work environment with excellent visibility.
(1-4-4.第4の変形例:ポインタ位置に基づく面の変形)
 表示制御装置100は、ユーザ10の操作に応じて、面50そのものの大きさを変化してもよい。すなわち、第2検出部133は、ユーザ10による面50上で指定された位置(例えば、ポインタの表示位置)に基づいて、面50の位置もしくは大きさを変更するよう制御してもよい。この点について、図18を用いて説明する。図18は、第4の変形例に係る表示制御処理を説明するための図である。
(1-4-4. Fourth Modification: Deformation of Surface Based on Pointer Position)
The display control device 100 may change the size of the surface 50 itself in response to an operation by the user 10. That is, the second detection unit 133 may perform control so as to change the position or size of the surface 50 based on a position (e.g., a display position of a pointer) designated on the surface 50 by the user 10. This point will be described with reference to Fig. 18. Fig. 18 is a diagram for explaining a display control process according to the fourth modified example.
 図18の上部の例では、面50に対してポインタ350が右端に表示されている。この場合、表示制御装置100は、図18の下左部のように、面50やポインタ350を固定表示してもよい。また、表示制御装置100は、図18の下中央部のように、ポインタ350が右端に到達したタイミングで、面50を右側に移動させてもよい。また、表示制御装置100は、図18の下右部のように、ポインタ350が右端に到達したタイミングで、面50を拡大表示してもよい。表示制御装置100は、ユーザ10の選択により、これらの表示制御処理を切り替えてもよい。これにより、ユーザ10は、ポインタ350の移動に合わせて作業領域を自動的に変更できるので、よりよい操作性を得ることができる。 In the example at the top of FIG. 18, the pointer 350 is displayed at the right edge of the surface 50. In this case, the display control device 100 may display the surface 50 and the pointer 350 in a fixed manner, as shown in the lower left part of FIG. 18. The display control device 100 may also move the surface 50 to the right when the pointer 350 reaches the right edge, as shown in the lower center part of FIG. 18. The display control device 100 may also enlarge the display of the surface 50 when the pointer 350 reaches the right edge, as shown in the lower right part of FIG. 18. The display control device 100 may switch between these display control processes at the selection of the user 10. This allows the user 10 to automatically change the work area in accordance with the movement of the pointer 350, thereby obtaining better operability.
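A minimal sketch of the three behaviors in Fig. 18, assuming the pointer position is expressed in the surface's own normalized 2D coordinates and the surface exposes center, width, and height attributes (all names are illustrative): depending on the selected mode, reaching an edge either leaves the surface fixed, pans it toward the edge, or enlarges it.

    def on_pointer_update(pointer_uv, surface, mode, pan_step=0.1, zoom_step=1.2):
        """React when the pointer reaches the edge of the surface 50.

        `pointer_uv` is the pointer position in the surface's 2D coordinates,
        normalized to [0, 1].  The modes mirror the lower part of Fig. 18.
        """
        u, v = pointer_uv
        at_edge = u <= 0.0 or u >= 1.0 or v <= 0.0 or v >= 1.0
        if not at_edge or mode == "fixed":
            return surface                           # keep the surface and pointer as-is
        if mode == "pan":                            # slide the surface toward the edge
            du = pan_step if u >= 1.0 else (-pan_step if u <= 0.0 else 0.0)
            dv = pan_step if v >= 1.0 else (-pan_step if v <= 0.0 else 0.0)
            surface.center = (surface.center[0] + du * surface.width,
                              surface.center[1] + dv * surface.height)
        elif mode == "zoom":                         # enlarge the surface in place
            surface.width *= zoom_step
            surface.height *= zoom_step
        return surface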
(2.その他の実施形態)
 上述した各実施形態に係る処理は、上記各実施形態以外にも種々の異なる形態にて実施されてよい。
2. Other Embodiments
The processing according to each of the above-described embodiments may be implemented in various different forms other than the above-described embodiments.
 また、上記各実施形態において説明した各処理のうち、自動的に行われるものとして説明した処理の全部または一部を手動的に行うこともでき、あるいは、手動的に行われるものとして説明した処理の全部または一部を公知の方法で自動的に行うこともできる。この他、上記文書中や図面中で示した処理手順、具体的名称、各種のデータやパラメータを含む情報については、特記する場合を除いて任意に変更することができる。例えば、各図に示した各種情報は、図示した情報に限られない。 Furthermore, among the processes described in each of the above embodiments, all or part of the processes described as being performed automatically can be performed manually, or all or part of the processes described as being performed manually can be performed automatically using known methods. In addition, the information including the processing procedures, specific names, various data and parameters shown in the above documents and drawings can be changed as desired unless otherwise specified. For example, the various information shown in each drawing is not limited to the information shown in the drawings.
 また、図示した各装置の各構成要素は機能概念的なものであり、必ずしも物理的に図示の如く構成されていることを要しない。すなわち、各装置の分散・統合の具体的形態は図示のものに限られず、その全部または一部を、各種の負荷や使用状況などに応じて、任意の単位で機能的または物理的に分散・統合して構成することができる。 Furthermore, each component of each device shown in the figure is a functional concept, and does not necessarily have to be physically configured as shown in the figure. In other words, the specific form of distribution and integration of each device is not limited to that shown in the figure, and all or part of them can be functionally or physically distributed and integrated in any unit depending on various loads, usage conditions, etc.
 また、上述してきた各実施形態および変形例は、処理内容を矛盾させない範囲で適宜組み合わせることが可能である。 Furthermore, the above-mentioned embodiments and variations can be combined as appropriate to the extent that they do not cause any contradictions in the processing content.
 また、本明細書に記載された効果はあくまで例示であって限定されるものでは無く、他の効果があってもよい。 Furthermore, the effects described in this specification are merely examples and are not limiting, and other effects may also be present.
(3.本開示に係る表示制御装置の効果)
 上述のように、本開示に係る表示制御装置(実施形態では表示制御装置100)は、取得部(実施形態では取得部131)と、表示制御部(実施形態では表示制御部134)とを備える。取得部は、第1のコントローラ(実施形態では立体用デバイス20)から、3次元空間内の位置姿勢情報を取得する。表示制御部は、3次元空間内の位置姿勢情報に基づいて、2次元座標を指定可能な仮想座標平面(実施形態では面50)を3次元空間上に表示する。
(3. Effects of the display control device according to the present disclosure)
As described above, the display control device according to the present disclosure (the display control device 100 in the embodiment) includes an acquisition unit (the acquisition unit 131 in the embodiment) and a display control unit (the display control unit 134 in the embodiment). The acquisition unit acquires position and orientation information in a three-dimensional space from a first controller (the three-dimensional device 20 in the embodiment). The display control unit displays a virtual coordinate plane (the surface 50 in the embodiment) on which two-dimensional coordinates can be specified in the three-dimensional space based on the position and orientation information in the three-dimensional space.
 このように、本開示に係る表示制御装置は、3次元空間上に、さらに2次元座標を指定可能な平面を表示することにより、空間における物体の特定箇所へのポインティングなど、入力精度を向上させることができる。 In this way, the display control device according to the present disclosure can improve input accuracy, such as pointing to a specific location on an object in space, by displaying a plane on which two-dimensional coordinates can be specified in three-dimensional space.
 また、表示制御装置は、第1のコントローラによる入力に基づき、仮想座標平面の3次元空間上の移動を検出する第1検出部(実施形態では第1検出部132)を備える。例えば、第1検出部は、第1のコントローラによる入力に基づき、仮想座標平面を用いて位置指定が可能なオブジェクトの移動を検出する。 The display control device also includes a first detection unit (first detection unit 132 in this embodiment) that detects movement in the three-dimensional space of the virtual coordinate plane based on an input from the first controller. For example, the first detection unit detects movement of an object whose position can be specified using the virtual coordinate plane based on an input from the first controller.
 このように、表示制御装置は、VRコントローラ等を用いて、ユーザが面を自在に移動可能なよう制御する。これにより、ユーザは、自由度の高い入力操作を行うことができる。 In this way, the display control device uses a VR controller or the like to allow the user to move freely around the surface. This allows the user to perform input operations with a high degree of freedom.
 また、表示制御装置は、第2のコントローラ(実施形態では平面用デバイス30)による入力に基づき、仮想座標平面上で指定された位置を検出する第2検出部(実施形態では第2検出部133)を備える。例えば、第2検出部は、仮想座標平面上で指定された位置に基づき、オブジェクトにおいて選択された部位を検出する。具体的には、第2検出部は、仮想座標平面の角度および仮想座標平面上でユーザから指定された位置に基づいて、オブジェクトにおいて選択された部位を検出する。 The display control device also includes a second detection unit (the second detection unit 133 in the embodiment) that detects a position specified on the virtual coordinate plane based on an input from a second controller (the planar device 30 in the embodiment). For example, the second detection unit detects the part selected on the object based on the position specified on the virtual coordinate plane. Specifically, the second detection unit detects the selected part of the object based on the angle of the virtual coordinate plane and the position specified by the user on the virtual coordinate plane.
 このように、表示制御装置は、マウス等の平面用デバイスを用いたユーザからの入力を取得するので、VRコントローラ等と比較して、正確な入力を受け付けることができる。 In this way, the display control device obtains input from the user using a planar device such as a mouse, and can therefore receive more accurate input than a VR controller, etc.
 また、第2検出部は、指定された位置を始点として仮想座標平面に対して法線方向の延長線上においてオブジェクトと交差する点、もしくは、ユーザの視点位置と指定された位置とを結んだ線の延長線上においてオブジェクトと交差する点を、オブジェクトにおいて選択された部位として検出してもよい。 The second detection unit may also detect, as the selected part of the object, a point where an extension of a line normal to the virtual coordinate plane starting from the specified position intersects with the object, or a point where an extension of a line connecting the user's viewpoint position and the specified position intersects with the object.
 このように、表示制御装置は、面上のポインタによって選択される箇所を決定する挙動を任意に切り替えることができる。これにより、ユーザは、ポインタ位置によって誤った箇所が選択されてしまう等、ユーザにとってストレスとなる挙動を回避でき、自身の所望する操作性を実現することができる。 In this way, the display control device can arbitrarily switch the behavior that determines the location selected by the pointer on the screen. This allows the user to avoid stressful behavior, such as the wrong location being selected due to the pointer position, and achieve the operability that they desire.
 また、表示制御装置が実行する表示制御処理において、第1のコントローラと第2のコントローラとは同一の装置であってもよい。この場合、第1検出部は、同一の装置が実空間上の平面から所定距離を離れた位置に所在する場合に、当該同一の装置を第1のコントローラとして検出してもよい。また、第2検出部は、同一の装置が実空間上の平面から所定距離以内に所在する場合に、当該同一の装置を第2のコントローラとして検出してもよい。あるいは、第1検出部は、同一の装置のユーザから、当該同一の装置を第1のコントローラとして使用する旨の指定を受け付けた場合に、当該同一の装置を当該第1のコントローラとして検出してもよい。第2検出部は、同一の装置のユーザから、当該同一の装置を第2のコントローラとして使用する旨の指定を受け付けた場合に、当該同一の装置を当該第2のコントローラとして検出してもよい。 In addition, in the display control process executed by the display control device, the first controller and the second controller may be the same device. In this case, the first detection unit may detect the same device as the first controller when the same device is located at a position away from a plane in real space by a predetermined distance. Furthermore, the second detection unit may detect the same device as the second controller when the same device is located within a predetermined distance from a plane in real space. Alternatively, the first detection unit may detect the same device as the first controller when a designation to use the same device as the first controller is received from a user of the same device. The second detection unit may detect the same device as the second controller when a designation to use the same device as the second controller is received from a user of the same device.
 このように、表示制御装置は、1つの装置で本開示に係る表示制御処理を実現できるので、様々な設備環境下においても高い入力精度を実現する情報処理をユーザに提供できる。 In this way, the display control device can realize the display control processing according to the present disclosure in a single device, and can provide users with information processing that achieves high input accuracy even in a variety of equipment environments.
 また、表示制御部は、第1のコントローラの3次元空間内の位置姿勢情報にかかわらず、ユーザの視点位置に基づいて、仮想座標平面を3次元空間上に表示する。 The display control unit also displays a virtual coordinate plane in three-dimensional space based on the user's viewpoint position, regardless of the position and orientation information of the first controller in three-dimensional space.
 このように、表示制御装置は、面を固定して表示することで、ユーザの操作手順を減らすことができるので、ユーザにとって負担の少ない情報処理を実現できる。 In this way, the display control device can reduce the number of operation steps required by the user by displaying a fixed surface, thereby realizing information processing that places less of a burden on the user.
 また、表示制御部は、実空間の3次元空間上に仮想座標平面を表示する。第1検出部は、第1のコントローラによる入力に基づき、実空間上の任意の平面に仮想座標平面が所定距離が接近した場合に、当該平面に重畳して表示されるよう、当該平面に吸着された位置で仮想座標平面を検出する。 The display control unit also displays a virtual coordinate plane in the three-dimensional space of the real space. When the virtual coordinate plane approaches an arbitrary plane in the real space by a predetermined distance, the first detection unit detects the virtual coordinate plane at a position adsorbed to the plane based on an input from the first controller so that the virtual coordinate plane is displayed superimposed on the plane.
 このように、表示制御装置は、実空間の平面に面をスナップするような挙動を実現できる。これにより、ユーザは、卓上の平面用デバイスを利用しやすい場所を面として簡易に設定できるので、作業効率を向上することができる。 In this way, the display control device can achieve a behavior that is like snapping a surface to a plane in real space. This allows the user to easily set a location on a tabletop device that is convenient for use as a surface, thereby improving work efficiency.
 また、表示制御部は、実空間の3次元空間上に、オブジェクトおよび当該オブジェクトを選択するためのポインタを仮想表示する。第2検出部は、ポインタを用いてオブジェクトにユーザが仮想的に接触可能なよう、当該ポインタの大きさもしくは接触点を制御する。 The display control unit also virtually displays an object and a pointer for selecting the object in a three-dimensional space in the real space. The second detection unit controls the size or contact point of the pointer so that the user can virtually touch the object using the pointer.
 このように、表示制御装置は、ポインタの大きさや接触点(すなわち、フィードバック感覚)を実空間に合わせて調整することで、実空間の感覚に即した操作感をユーザに提供できる。 In this way, the display control device can provide the user with an operational feel that matches the sensation of real space by adjusting the size of the pointer and the contact point (i.e., the feedback sensation) to suit the real space.
 また、第1検出部は、ユーザによる第1のコントローラの操作に基づき、実空間上に仮想表示した仮想座標平面の位置およびサイズを制御する。 The first detection unit also controls the position and size of the virtual coordinate plane virtually displayed in real space based on the user's operation of the first controller.
 このように、表示制御装置は、ユーザの所望に合わせて面の大きさを可変することで、ユーザが自在に作業領域を調整することを可能とするので、操作性を向上させることができる。 In this way, the display control device can change the size of the screen according to the user's desires, allowing the user to freely adjust the work area, thereby improving operability.
 また、第2検出部は、ポインタによる選択位置を延伸させ、オブジェクトをユーザが仮想的に操作可能なよう、当該ポインタの操作を制御する。 The second detection unit also extends the position selected by the pointer and controls the operation of the pointer so that the user can virtually operate the object.
 このように、表示制御装置は、オブジェクトが遠方に位置するなど仮想空間の不都合を解消する処理を行うことで、ユーザの操作性を向上させることができる。 In this way, the display control device can improve user operability by performing processing to eliminate inconveniences in the virtual space, such as objects being located far away.
 また、第2検出部は、ユーザによる仮想座標平面上で指定された位置に基づいて、当該仮想座標平面の位置もしくは大きさを変更するよう制御する。 The second detection unit also controls the change of the position or size of the virtual coordinate plane based on a position specified on the virtual coordinate plane by the user.
 このように、表示制御装置は、ポインタの位置に応じて自在に面の大きさを可変することで、快適な作業環境をユーザに提供することができる。 In this way, the display control device can freely change the size of the screen depending on the position of the pointer, providing the user with a comfortable working environment.
(4.ハードウェア構成)
 上述してきた各実施形態に係る表示制御装置100等の情報機器は、例えば図19に示すような構成のコンピュータ1000によって実現される。以下、表示制御装置100を例に挙げて説明する。図19は、表示制御装置100の機能を実現するコンピュータ1000の一例を示すハードウェア構成図である。コンピュータ1000は、CPU1100、RAM1200、ROM(Read Only Memory)1300、HDD(Hard Disk Drive)1400、通信インターフェイス1500、および入出力インターフェイス1600を有する。コンピュータ1000の各部は、バス1050によって接続される。
(4. Hardware Configuration)
Information devices such as the display control device 100 according to each embodiment described above are realized by a computer 1000 having a configuration as shown in Fig. 19, for example. The display control device 100 will be described below as an example. Fig. 19 is a hardware configuration diagram showing an example of a computer 1000 that realizes the functions of the display control device 100. The computer 1000 has a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, a HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600. Each unit of the computer 1000 is connected by a bus 1050.
 CPU1100は、ROM1300またはHDD1400に格納されたプログラムに基づいて動作し、各部の制御を行う。例えば、CPU1100は、ROM1300またはHDD1400に格納されたプログラムをRAM1200に展開し、各種プログラムに対応した処理を実行する。 The CPU 1100 operates based on the programs stored in the ROM 1300 or the HDD 1400 and controls each component. For example, the CPU 1100 loads the programs stored in the ROM 1300 or the HDD 1400 into the RAM 1200 and executes processes corresponding to the various programs.
 ROM1300は、コンピュータ1000の起動時にCPU1100によって実行されるBIOS(Basic Input Output System)等のブートプログラムや、コンピュータ1000のハードウェアに依存するプログラム等を格納する。 The ROM 1300 stores boot programs such as the Basic Input Output System (BIOS) that is executed by the CPU 1100 when the computer 1000 starts up, as well as programs that depend on the hardware of the computer 1000.
 HDD1400は、CPU1100によって実行されるプログラム、および、かかるプログラムによって使用されるデータ等を非一時的に記録する、コンピュータが読み取り可能な記録媒体である。具体的には、HDD1400は、プログラムデータ1450の一例である、本開示に係る表示制御プログラムを記録する記録媒体である。 HDD 1400 is a computer-readable recording medium that non-temporarily records programs executed by CPU 1100 and data used by such programs. Specifically, HDD 1400 is a recording medium that records a display control program related to the present disclosure, which is an example of program data 1450.
 通信インターフェイス1500は、コンピュータ1000が外部ネットワーク1550(例えばインターネット)と接続するためのインターフェイスである。例えば、CPU1100は、通信インターフェイス1500を介して、他の機器からデータを受信したり、CPU1100が生成したデータを他の機器へ送信したりする。 The communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (e.g., the Internet). For example, the CPU 1100 receives data from other devices and transmits data generated by the CPU 1100 to other devices via the communication interface 1500.
 入出力インターフェイス1600は、入出力デバイス1650とコンピュータ1000とを接続するためのインターフェイスである。例えば、CPU1100は、入出力インターフェイス1600を介して、キーボードやマウス等の入力デバイスからデータを受信する。また、CPU1100は、入出力インターフェイス1600を介して、ディスプレイやスピーカやプリンタ等の出力デバイスにデータを送信する。また、入出力インターフェイス1600は、所定の記録媒体(メディア)に記録されたプログラム等を読み取るメディアインターフェイスとして機能してもよい。メディアとは、例えばDVD(Digital Versatile Disc)、PD(Phase change rewritable Disk)等の光学記録媒体、MO(Magneto-Optical disk)等の光磁気記録媒体、テープ媒体、磁気記録媒体、または半導体メモリ等である。 The input/output interface 1600 is an interface for connecting the input/output device 1650 and the computer 1000. For example, the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600. The CPU 1100 also transmits data to an output device such as a display, speaker, or printer via the input/output interface 1600. The input/output interface 1600 may also function as a media interface that reads programs and the like recorded on a specific recording medium. Examples of media include optical recording media such as DVDs (Digital Versatile Discs) and PDs (Phase change rewritable Disks), magneto-optical recording media such as MOs (Magneto-Optical Disks), tape media, magnetic recording media, and semiconductor memories.
 例えば、コンピュータ1000が実施形態に係る表示制御装置100として機能する場合、コンピュータ1000のCPU1100は、RAM1200上にロードされた表示制御プログラムを実行することにより、制御部130等の機能を実現する。また、HDD1400には、本開示に係る表示制御プログラムや、記憶部120内のデータが格納される。なお、CPU1100は、プログラムデータ1450をHDD1400から読み取って実行するが、他の例として、外部ネットワーク1550を介して、他の装置からこれらのプログラムを取得してもよい。 For example, when the computer 1000 functions as the display control device 100 according to the embodiment, the CPU 1100 of the computer 1000 executes a display control program loaded onto the RAM 1200 to realize the functions of the control unit 130, etc. Also, the display control program according to the present disclosure and data in the storage unit 120 are stored in the HDD 1400. The CPU 1100 reads and executes the program data 1450 from the HDD 1400, but as another example, the CPU 1100 may obtain these programs from other devices via the external network 1550.
 なお、本技術は以下のような構成も取ることができる。
(1)
 第1のコントローラから、3次元空間内の位置姿勢情報を取得する取得部と、
 前記3次元空間内の位置姿勢情報に基づいて、2次元座標を指定可能な仮想座標平面を前記3次元空間上に表示する表示制御部と、
 を備える表示制御装置。
(2)
 前記第1のコントローラによる入力に基づき、前記仮想座標平面の3次元空間上の移動を検出する第1検出部、
 をさらに備える前記(1)に記載の表示制御装置。
(3)
 前記第1検出部は、
 第1のコントローラによる入力に基づき、前記仮想座標平面を用いて位置指定が可能なオブジェクトの移動を検出する、
 前記(2)に記載の表示制御装置。
(4)
 第2のコントローラによる入力に基づき、前記仮想座標平面上で指定された位置を検出する第2検出部、
 をさらに備える前記(2)または(3)に記載の表示制御装置。
(5)
 前記第2検出部は、
 前記仮想座標平面上で指定された位置に基づき、オブジェクトにおいて選択された部位を検出する、
 前記(4)に記載の表示制御装置。
(6)
 前記第2検出部は、
 前記仮想座標平面の角度および前記仮想座標平面上でユーザから指定された位置に基づいて、前記オブジェクトにおいて選択された部位を検出する、
 前記(5)に記載の表示制御装置。
(7)
 前記第2検出部は、
 前記指定された位置を始点として前記仮想座標平面に対して法線方向の延長線上において前記オブジェクトと交差する点、もしくは、ユーザの視点位置と前記指定された位置とを結んだ線の延長線上において前記オブジェクトと交差する点を、前記オブジェクトにおいて選択された部位として検出する、
 前記(6)に記載の表示制御装置。
(8)
 前記第1のコントローラと前記第2のコントローラとが同一の装置である、
 前記(4)~(7)のいずれか一つに記載の表示制御装置。
(9)
 前記第1検出部は、
 前記同一の装置が実空間上の平面から所定距離を離れた位置に所在する場合に、当該同一の装置を前記第1のコントローラとして検出し、
 前記第2検出部は、
 前記同一の装置が実空間上の平面から所定距離以内に所在する場合に、当該同一の装置を前記第2のコントローラとして検出する、
 前記(8)に記載の表示制御装置。
(10)
 前記第1検出部は、
 前記同一の装置のユーザから、当該同一の装置を前記第1のコントローラとして使用する旨の指定を受け付けた場合に、当該同一の装置を当該第1のコントローラとして検出し、
 前記第2検出部は、
 前記同一の装置のユーザから、当該同一の装置を前記第2のコントローラとして使用する旨の指定を受け付けた場合に、当該同一の装置を当該第2のコントローラとして検出する、
 前記(8)に記載の表示制御装置。
(11)
 前記表示制御部は、
 前記第1のコントローラの3次元空間内の位置姿勢情報にかかわらず、ユーザの視点位置に基づいて、前記仮想座標平面を前記3次元空間上に表示する、
 前記(2)~(10)のいずれか一つに記載の表示制御装置。
(12)
 前記表示制御部は、
 実空間の前記3次元空間上に前記仮想座標平面を表示し、
 前記第1検出部は、
 前記第1のコントローラによる入力に基づき、実空間上の任意の平面に前記仮想座標平面が所定距離が接近した場合に、当該平面に重畳して表示されるよう、当該平面に吸着された位置で前記仮想座標平面を検出する、
 前記(5)~(7)のいずれか一つに記載の表示制御装置。
(13)
 前記表示制御部は、
 実空間の前記3次元空間上に、前記オブジェクトおよび当該オブジェクトを選択するためのポインタを仮想表示し、
 前記第2検出部は、
 前記ポインタを用いて前記オブジェクトにユーザが仮想的に接触可能なよう、当該ポインタの大きさもしくは接触点を制御する、
 前記(12)に記載の表示制御装置。
(14)
 前記第1検出部は、
 ユーザによる前記第1のコントローラの操作に基づき、実空間上に仮想表示した前記仮想座標平面の位置およびサイズを制御する、
 前記(12)または(13)に記載の表示制御装置。
(15)
 前記第2検出部は、
 前記ポインタによる選択位置を延伸させ、前記オブジェクトをユーザが仮想的に操作可能なよう、当該ポインタの操作を制御する、
 前記(13)に記載の表示制御装置。
(16)
 前記第2検出部は、
 ユーザによる前記仮想座標平面上で指定された位置に基づいて、当該仮想座標平面の位置もしくは大きさを変更するよう制御する、
 前記(4)~(7)のいずれか一つに記載の表示制御装置。
(17)
 コンピュータが、
 第1のコントローラから、3次元空間内の位置姿勢情報を取得し、
 前記3次元空間内の位置姿勢情報に基づいて、2次元座標を指定可能な仮想座標平面を前記3次元空間上に表示する、
 ことを含む表示制御方法。
The present technology can also be configured as follows.
(1)
an acquisition unit that acquires position and orientation information in a three-dimensional space from the first controller;
a display control unit that displays, in the three-dimensional space, a virtual coordinate plane on which two-dimensional coordinates can be specified, based on position and orientation information in the three-dimensional space;
A display control device comprising:
(2)
a first detection unit that detects a movement in a three-dimensional space of the virtual coordinate plane based on an input by the first controller;
The display control device according to (1) further comprising:
(3)
The first detection unit is
detecting a movement of an object whose position can be specified using the virtual coordinate plane based on an input by a first controller;
The display control device according to (2).
(4)
a second detection unit that detects a position designated on the virtual coordinate plane based on an input from a second controller;
The display control device according to (2) or (3), further comprising:
(5)
The second detection unit is
detecting a selected portion of an object based on a position specified on the virtual coordinate plane;
The display control device according to (4).
(6)
The second detection unit is
detecting a part selected in the object based on an angle of the virtual coordinate plane and a position designated by a user on the virtual coordinate plane;
The display control device according to (5) above.
(7)
The second detection unit is
a point where an extension line extending from the specified position in a normal direction to the virtual coordinate plane intersects with the object, or a point where an extension line connecting the user's viewpoint and the specified position intersects with the object, is detected as the selected part of the object;
The display control device according to (6) above.
(8)
the first controller and the second controller are the same device;
The display control device according to any one of (4) to (7).
(9)
The first detection unit is
When the same device is located at a position away from a plane in real space by a predetermined distance, the same device is detected as the first controller;
The second detection unit is
When the same device is located within a predetermined distance from a plane in real space, the same device is detected as the second controller.
The display control device according to (8).
(10)
The first detection unit is
when receiving a designation from a user of the same device to use the same device as the first controller, detecting the same device as the first controller;
The second detection unit is
when a designation to use the same device as the second controller is received from a user of the same device, the same device is detected as the second controller;
The display control device according to (8).
(11)
The display control unit is
displaying the virtual coordinate plane in the three-dimensional space based on a viewpoint position of a user, regardless of position and orientation information of the first controller in the three-dimensional space;
The display control device according to any one of (2) to (10).
(12)
The display control unit is
Displaying the virtual coordinate plane on the three-dimensional space of a real space;
The first detection unit is
when the virtual coordinate plane approaches an arbitrary plane in real space by a predetermined distance based on an input by the first controller, the virtual coordinate plane is detected at a position where the virtual coordinate plane is attracted to the arbitrary plane so that the virtual coordinate plane is displayed superimposed on the arbitrary plane.
A display control device according to any one of (5) to (7).
(13)
The display control unit is
virtually displaying the object and a pointer for selecting the object in the three-dimensional space in the real space;
The second detection unit is
controlling a size or a contact point of the pointer so that the user can virtually touch the object using the pointer;
The display control device according to (12).
(14)
The first detection unit is
controlling a position and a size of the virtual coordinate plane virtually displayed in a real space based on an operation of the first controller by a user;
The display control device according to (12) or (13).
(15)
The second detection unit is
extending a position selected by the pointer and controlling an operation of the pointer so that the user can virtually operate the object;
The display control device according to (13).
(16)
The second detection unit is
and controlling the virtual coordinate plane so as to change a position or a size of the virtual coordinate plane based on a position designated on the virtual coordinate plane by a user.
The display control device according to any one of (4) to (7).
(17)
The computer
acquiring position and orientation information in a three-dimensional space from a first controller;
displaying a virtual coordinate plane in the three-dimensional space on the basis of the position and orientation information in the three-dimensional space, on which two-dimensional coordinates can be specified;
A display control method comprising:
 1   表示制御システム
 10  ユーザ
 20  立体用デバイス
 30  平面用デバイス
 50  面
 100 表示制御装置
 110 通信部
 120 記憶部
 130 制御部
 131 取得部
 132 第1検出部
 133 第2検出部
 134 表示制御部
 140 センサ部
 150 表示部
REFERENCE SIGNS LIST 1 Display control system 10 User 20 Three-dimensional device 30 Planar device 50 Surface 100 Display control device 110 Communication unit 120 Storage unit 130 Control unit 131 Acquisition unit 132 First detection unit 133 Second detection unit 134 Display control unit 140 Sensor unit 150 Display unit

Claims (17)

  1.  第1のコントローラから、3次元空間内の位置姿勢情報を取得する取得部と、
     前記3次元空間内の位置姿勢情報に基づいて、2次元座標を指定可能な仮想座標平面を前記3次元空間上に表示する表示制御部と、
     を備える表示制御装置。
    an acquisition unit that acquires position and orientation information in a three-dimensional space from the first controller;
    a display control unit that displays, in the three-dimensional space, a virtual coordinate plane on which two-dimensional coordinates can be specified, based on position and orientation information in the three-dimensional space;
    A display control device comprising:
  2.  前記第1のコントローラによる入力に基づき、前記仮想座標平面の3次元空間上の移動を検出する第1検出部、
     をさらに備える請求項1に記載の表示制御装置。
    a first detection unit that detects a movement in a three-dimensional space of the virtual coordinate plane based on an input by the first controller;
    The display control device according to claim 1 , further comprising:
  3.  前記第1検出部は、
     第1のコントローラによる入力に基づき、前記仮想座標平面を用いて位置指定が可能なオブジェクトの移動を検出する、
     請求項2に記載の表示制御装置。
    The first detection unit is
    detecting a movement of an object whose position can be specified using the virtual coordinate plane based on an input by a first controller;
    The display control device according to claim 2 .
  4.  第2のコントローラによる入力に基づき、前記仮想座標平面上で指定された位置を検出する第2検出部、
     をさらに備える請求項2に記載の表示制御装置。
    a second detection unit that detects a position designated on the virtual coordinate plane based on an input from a second controller;
    The display control device according to claim 2 , further comprising:
  5.  前記第2検出部は、
     前記仮想座標平面上で指定された位置に基づき、オブジェクトにおいて選択された部位を検出する、
     請求項4に記載の表示制御装置。
    The second detection unit is
    detecting a selected portion of an object based on a position specified on the virtual coordinate plane;
    The display control device according to claim 4.
  6.  前記第2検出部は、
     前記仮想座標平面の角度および前記仮想座標平面上でユーザから指定された位置に基づいて、前記オブジェクトにおいて選択された部位を検出する、
     請求項5に記載の表示制御装置。
    The second detection unit is
    detecting a part selected in the object based on an angle of the virtual coordinate plane and a position designated by a user on the virtual coordinate plane;
    The display control device according to claim 5 .
  7.  前記第2検出部は、
     前記指定された位置を始点として前記仮想座標平面に対して法線方向の延長線上において前記オブジェクトと交差する点、もしくは、ユーザの視点位置と前記指定された位置とを結んだ線の延長線上において前記オブジェクトと交差する点を、前記オブジェクトにおいて選択された部位として検出する、
     請求項6に記載の表示制御装置。
    The second detection unit is
    a point where an extension line extending from the specified position in a normal direction to the virtual coordinate plane intersects with the object, or a point where an extension line connecting the user's viewpoint and the specified position intersects with the object, is detected as the selected part of the object;
    The display control device according to claim 6.
  8.  前記第1のコントローラと前記第2のコントローラとが同一の装置である、
     請求項4に記載の表示制御装置。
    the first controller and the second controller are the same device;
    The display control device according to claim 4.
  9.  前記第1検出部は、
     前記同一の装置が実空間上の平面から所定距離を離れた位置に所在する場合に、当該同一の装置を前記第1のコントローラとして検出し、
     前記第2検出部は、
     前記同一の装置が実空間上の平面から所定距離以内に所在する場合に、当該同一の装置を前記第2のコントローラとして検出する、
     請求項8に記載の表示制御装置。
    The first detection unit is
    When the same device is located at a position away from a plane in real space by a predetermined distance, the same device is detected as the first controller;
    The second detection unit is
    When the same device is located within a predetermined distance from a plane in real space, the same device is detected as the second controller.
    The display control device according to claim 8.
  10.  前記第1検出部は、
     前記同一の装置のユーザから、当該同一の装置を前記第1のコントローラとして使用する旨の指定を受け付けた場合に、当該同一の装置を当該第1のコントローラとして検出し、
     前記第2検出部は、
     前記同一の装置のユーザから、当該同一の装置を前記第2のコントローラとして使用する旨の指定を受け付けた場合に、当該同一の装置を当該第2のコントローラとして検出する、
     請求項8に記載の表示制御装置。
    The first detection unit is
    when receiving a designation from a user of the same device to use the same device as the first controller, detecting the same device as the first controller;
    The second detection unit is
    when a designation to use the same device as the second controller is received from a user of the same device, the same device is detected as the second controller;
    The display control device according to claim 8.
  11.  前記表示制御部は、
     前記第1のコントローラの3次元空間内の位置姿勢情報にかかわらず、ユーザの視点位置に基づいて、前記仮想座標平面を前記3次元空間上に表示する、
     請求項2に記載の表示制御装置。
    The display control unit is
    displaying the virtual coordinate plane in the three-dimensional space based on a viewpoint position of a user, regardless of position and orientation information of the first controller in the three-dimensional space;
    The display control device according to claim 2 .
  12.  前記表示制御部は、
     実空間の前記3次元空間上に前記仮想座標平面を表示し、
     前記第1検出部は、
     前記第1のコントローラによる入力に基づき、実空間上の任意の平面に前記仮想座標平面が所定距離が接近した場合に、当該平面に重畳して表示されるよう、当該平面に吸着された位置で前記仮想座標平面を検出する、
     請求項5に記載の表示制御装置。
    The display control unit is
    Displaying the virtual coordinate plane on the three-dimensional space of a real space;
    The first detection unit is
    when the virtual coordinate plane approaches an arbitrary plane in real space by a predetermined distance based on an input by the first controller, the virtual coordinate plane is detected at a position where the virtual coordinate plane is attracted to the arbitrary plane so that the virtual coordinate plane is displayed superimposed on the arbitrary plane.
    The display control device according to claim 5 .
  13.  前記表示制御部は、
     実空間の前記3次元空間上に、前記オブジェクトおよび当該オブジェクトを選択するためのポインタを仮想表示し、
     前記第2検出部は、
     前記ポインタを用いて前記オブジェクトにユーザが仮想的に接触可能なよう、当該ポインタの大きさもしくは接触点を制御する、
     請求項12に記載の表示制御装置。
    The display control unit is
    virtually displaying the object and a pointer for selecting the object in the three-dimensional space in the real space;
    The second detection unit is
    controlling a size or a contact point of the pointer so that the user can virtually touch the object using the pointer;
    The display control device according to claim 12.
  14.  前記第1検出部は、
     ユーザによる前記第1のコントローラの操作に基づき、実空間上に仮想表示した前記仮想座標平面の位置およびサイズを制御する、
     請求項12に記載の表示制御装置。
    The first detection unit is
    controlling a position and a size of the virtual coordinate plane virtually displayed in a real space based on an operation of the first controller by a user;
    The display control device according to claim 12.
  15.  前記第2検出部は、
     前記ポインタによる選択位置を延伸させ、前記オブジェクトをユーザが仮想的に操作可能なよう、当該ポインタの操作を制御する、
     請求項13に記載の表示制御装置。
    The second detection unit is
    extending a position selected by the pointer and controlling an operation of the pointer so that the user can virtually operate the object;
    The display control device according to claim 13.
  16.  前記第2検出部は、
     ユーザによる前記仮想座標平面上で指定された位置に基づいて、当該仮想座標平面の位置もしくは大きさを変更するよう制御する、
     請求項5に記載の表示制御装置。
    The second detection unit is
    and controlling the virtual coordinate plane so as to change a position or a size of the virtual coordinate plane based on a position designated on the virtual coordinate plane by a user.
    The display control device according to claim 5 .
  17.  コンピュータが、
     第1のコントローラから、3次元空間内の位置姿勢情報を取得し、
     前記3次元空間内の位置姿勢情報に基づいて、2次元座標を指定可能な仮想座標平面を前記3次元空間上に表示する、
     ことを含む表示制御方法。
    The computer
    acquiring position and orientation information in a three-dimensional space from a first controller;
    displaying a virtual coordinate plane in the three-dimensional space on the basis of the position and orientation information in the three-dimensional space, on which two-dimensional coordinates can be specified;
    A display control method comprising:
PCT/JP2023/034684 2022-10-04 2023-09-25 Display control device and display control method WO2024075565A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022160170 2022-10-04
JP2022-160170 2022-10-04

Publications (1)

Publication Number Publication Date
WO2024075565A1 true WO2024075565A1 (en) 2024-04-11

Family

ID=90608193

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/034684 WO2024075565A1 (en) 2022-10-04 2023-09-25 Display control device and display control method

Country Status (1)

Country Link
WO (1) WO2024075565A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003248843A (en) * 2002-12-24 2003-09-05 Fujitsu Ltd Editing control system for object in virtual three- dimensional space
JP2017027206A (en) * 2015-07-17 2017-02-02 キヤノン株式会社 Information processing apparatus, virtual object operation method, computer program, and storage medium
JP2018077876A (en) * 2013-04-02 2018-05-17 ソニー株式会社 Information processing apparatus, information processing method, and program
JP2020513625A (en) * 2016-12-05 2020-05-14 グーグル エルエルシー Generation of virtual annotation planes by gestures in augmented reality and / or virtual reality environments

Similar Documents

Publication Publication Date Title
US20220121344A1 (en) Methods for interacting with virtual controls and/or an affordance for moving virtual objects in virtual environments
EP3564788B1 (en) Three-dimensional user input
JP6057396B2 (en) 3D user interface device and 3D operation processing method
CN116324680A (en) Method for manipulating objects in an environment
US9292184B2 (en) Indirect 3D scene positioning control
WO2017136125A2 (en) Object motion tracking with remote device
EP3262505B1 (en) Interactive system control apparatus and method
JP6601402B2 (en) Control device, control method and program
JP6959703B2 (en) Virtual reality headset stand
WO2019065846A1 (en) Program, information processing method, information processing system, head mounted display device, and information processing device
US20220291744A1 (en) Display processing device, display processing method, and recording medium
JP2022184958A (en) animation production system
CN113498531A (en) Head-mounted information processing device and head-mounted display system
CN109102571A (en) A kind of control method of virtual image, device, equipment and its storage medium
WO2024075565A1 (en) Display control device and display control method
JP2022153476A (en) Animation creation system
WO2023021757A1 (en) Information processing device, information processing method, and program
WO2023095519A1 (en) Display control device, display control method, and program
CN117716327A (en) Method and apparatus for managing interactions of a user interface with physical objects
US20230162450A1 (en) Connecting Spatially Distinct Settings
WO2023286316A1 (en) Input device, system, and control method
WO2023281819A1 (en) Information processing device for determining retention of object
CN117616365A (en) Method and apparatus for dynamically selecting an operating modality of an object
CN116724284A (en) Method and device for processing user input of multiple devices
CN115004132A (en) Information processing apparatus, information processing system, and information processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23874693

Country of ref document: EP

Kind code of ref document: A1