WO2023026798A1 - Display control device - Google Patents
Display control device
- Publication number
- WO2023026798A1 (PCT/JP2022/029702)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- virtual space
- user
- information
- displays
- Prior art date
Classifications
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
- G06F3/013—Eye tracking input arrangements
- G06F3/1454—Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
Definitions
- The present invention relates to a display control device that controls display on a display.
- A transmissive head-mounted display allows a user to see the real world while also referring to information shown on the display.
- On such a transmissive head-mounted display, information arranged in a virtual space is displayed according to a line of sight in the virtual space, and that line of sight is moved according to the orientation of the head-mounted display or the like. With this control, the user wearing the head-mounted display can refer to virtual-space information according to the orientation of his or her head while still seeing the real world.
- A display may also be shared with another user in the above manner of display. For example, common information can be displayed by sharing a virtual space between head-mounted displays. In that case, when one user moves information displayed in the virtual space, the display position of that information also moves on the other user's display.
- However, the movement of information by another user may hinder the user. If the user of the other head-mounted display is looking at something in the real world rather than at the display, information moved in front of that user interferes with the user's actions.
- An embodiment of the present invention has been made in view of the above, and aims to provide a display control device capable of appropriately displaying information in a virtual space shared among a plurality of displays.
- A display control device according to an embodiment of the present invention controls display on a display that shows display information, commonly arranged in a virtual space shared with another display, according to the line of sight of each display in the virtual space. The device includes: a detection unit that detects the state of a user wearing the other display; a view information acquisition unit that acquires view information of the virtual space corresponding to the line of sight of the virtual space on the other display; and a control unit that controls operations related to display on the display according to the state of the user detected by the detection unit and the view information of the virtual space acquired by the view information acquisition unit.
- In this display control device, operations related to the display are controlled according to the state of the user wearing the other display and the view information of the virtual space. The display can therefore be controlled so that what appears on the other display does not interfere with its user. Accordingly, when common virtual-space information is displayed across a plurality of displays, the display can be performed appropriately.
- FIG. 1 is a diagram showing the functional configuration of a display that is a display control device according to an embodiment of the present invention.
- FIG. 2 is a diagram showing an example of a virtual space related to display on a display, display information arranged in the virtual space, and a displayed area.
- FIG. 3 is a diagram illustrating sharing of the virtual space and display information between displays.
- FIG. 4 is a diagram showing an example of display control on a display.
- FIG. 5 is a diagram showing examples of a region related to control.
- FIG. 6 is a flowchart showing processing executed by a display that is a display control device according to an embodiment of the present invention.
- FIG. 7 is a diagram showing the hardware configuration of a display that is a display control device according to an embodiment of the present invention.
- FIG. 1 shows a display 10, which is a display control device according to the present embodiment. The display 10 controls display on the display 10 itself.
- Specifically, the display 10 displays display information arranged in a virtual space shared with another display 20, according to the line of sight of each display in the virtual space.
- The displays 10 and 20 display display information arranged in a virtual three-dimensional space according to the line of sight in that virtual space.
- Display information is the information displayed on the displays 10 and 20, for example content such as text, images, and video.
- The display information may be a scheduler, a browser, or the like whose displayed content can change, or communication such as a chat between the users of the displays 10 and 20.
- The display information may be anything other than the above, as long as it can be arranged in the virtual space and displayed according to the line of sight in the virtual space.
- The displays 10 and 20 are, for example, displays that present virtual content (display information) using AR (Augmented Reality) or the like; they are transmissive displays worn over the user's eyes.
- The displays 10 and 20 may be glasses-type head-mounted displays, that is, see-through glasses (smart glasses, AR glasses).
- The displays 10 and 20 are used by different users. The display 10 itself need not be worn by a user.
- The displays 10 and 20 display the display information in the area that is visible, from a reference position of the line of sight in the virtual space, in the direction of the line of sight. That is, the line of sight in the virtual space consists of a reference position and a direction.
- The line of sight in the virtual space can be moved. Moving it changes which display information is visible in the virtual space, so the display on the displays 10 and 20 changes accordingly.
- The line of sight in the virtual space is moved according to a preset method, with a preset degree of freedom, for example three degrees of freedom (3DoF). Under 3DoF, the reference position of the line of sight is fixed, and only the direction of the line of sight can be rotated around the X, Y, and Z axes of the virtual space.
- The line of sight in the virtual space may also be moved with degrees of freedom and methods other than the above.
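As a rough illustration of the 3DoF scheme just described, the following Python sketch derives a view direction from yaw and pitch angles while the reference position stays fixed. The function name and axis convention are assumptions made for illustration; the patent does not specify any implementation.

```python
import math

# Under 3DoF the reference position of the line of sight never moves;
# only the direction of the line of sight rotates.
REFERENCE_POSITION = (0.0, 0.0, 0.0)  # e.g. the center of the sphere 100

def view_direction(yaw: float, pitch: float) -> tuple[float, float, float]:
    """Unit view-direction vector from yaw/pitch in radians.

    Roll is omitted because it rotates the image around the view axis
    without changing the direction vector itself.
    """
    x = math.cos(pitch) * math.sin(yaw)
    y = math.sin(pitch)
    z = math.cos(pitch) * math.cos(yaw)
    return (x, y, z)
```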
- For example, as shown in FIG. 2, a sphere (omnidirectional sphere) 100 is set in the virtual space, and display information 200 is arranged along the sphere 100.
- Two-dimensional display information 200 is arranged along tangent planes of the sphere 100, facing the center of the sphere 100. That is, the display information 200 is arranged so as to be visible from the center of the sphere 100, which serves as the reference position of the line of sight in the virtual space.
- The displays 10 and 20 display the display information 200 in the area 300 of the sphere 100 that is visible from the reference position of the line of sight in the direction of the line of sight in the virtual space. The shape of the area 300 matches the display screens of the displays 10 and 20, for example a rectangle as shown in FIG. 2. Since the viewing angles of the displays 10 and 20 (the angles over which the virtual space can be displayed) are limited, only a portion of the sphere 100 is displayed at a time. In the example shown in FIG. 2, changing the direction of the line of sight in the virtual space moves the area 300 over the sphere 100, and the display on the displays 10 and 20 changes accordingly.
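A minimal sketch of the visibility test behind the area 300: a panel on the sphere is treated as visible when the angle between the view direction and the panel's direction from the center falls within the display's viewing angle. The single angular threshold is an assumed simplification of a proper rectangular-frustum test.

```python
import math

def is_visible(panel_dir: tuple[float, float, float],
               view_dir: tuple[float, float, float],
               half_fov_rad: float) -> bool:
    """True if a panel on the sphere 100 falls inside the displayed area.

    panel_dir: unit vector from the sphere center to the panel.
    view_dir: unit vector of the line of sight in the virtual space.
    half_fov_rad: half of the display's (limited) viewing angle.
    """
    dot = sum(p * v for p, v in zip(panel_dir, view_dir))
    dot = max(-1.0, min(1.0, dot))  # guard against rounding drift
    return math.acos(dot) <= half_fov_rad
```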
- In this way, the displays 10 and 20 can be used as omnidirectional displays.
- The display on the displays 10 and 20 may be performed by an application. The direction of the line of sight when the application is activated may be a preset default direction, or may differ for each of the displays 10 and 20.
- The displays 10 and 20 may be provided with sensors that detect their own movement and orientation, for example an acceleration sensor that detects acceleration and a gyro sensor that detects angular velocity. Using these sensors, the displays 10 and 20 may detect their own movement and orientation and move the line of sight in the virtual space accordingly; that is, the line of sight may be moved according to the direction of the head (face) of the user wearing the display 10 or 20.
- The line of sight in the virtual space may also be moved according to detection by sensors other than the above, or by a user operation on the displays 10 and 20 rather than by sensors.
- The display on the displays 10 and 20 using the line of sight in the virtual space may be performed in the same manner as conventional display by AR or the like.
- Information about the virtual space may be held and processed on either of the displays 10 and 20, or on a cloud or the like to which the displays 10 and 20 are connected.
- The display on the display 10 and the display on the other display 20 use display information commonly arranged in a common virtual space. That is, the virtual space and the display information are shared among the displays 10 and 20. The plurality of displays 10 and 20 that share them are set in advance.
- The lines of sight in the virtual space are independent for each of the displays 10 and 20. When the lines of sight differ (for example, under 3DoF, when their directions differ), the displays 10 and 20 show different views according to their respective lines of sight; when the lines of sight coincide, the same display appears on both.
- An example in which two displays 10 and 20 share a virtual space is shown here, but three or more displays may share a virtual space.
- Operations related to display, for example user operations on the display information, can be performed on the displays 10 and 20.
- The displays 10 and 20 accept a user operation to move the display information 200 in the virtual space and move it accordingly. Since the arrangement of the display information 200 in the virtual space is common as described above, when the display information 200 is moved on one of the displays 10 and 20, it also moves on the other, as shown on the left side of FIG. 3. Operations other than the above may also be performed, such as enlarging or shrinking display information, entering text into display information, or rotating the sphere 100.
- An operation may be performed using, for example, an information processing device such as a smartphone carried by the user wearing the display 10 or 20. In that case, the display and the smartphone are connected in advance so that they can exchange information, and the operation is reflected on the displays 10 and 20. The display may show a laser pointer at the position at which the user points the smartphone.
- The above displays and operations on the displays 10 and 20 may be performed in the same manner as in conventional methods.
- Conventional displays having the above functions can be used as the displays 10 and 20. In addition, the displays 10 and 20 have a communication function, which is used to share the virtual space and display information described above and to realize the functions according to the present embodiment. Some of the functions described above and some of the functions according to the present embodiment described later may instead be provided by an information processing device (for example, a smartphone) connected to the display device (for example, the see-through glasses described above). That is, each of the displays 10 and 20 according to the present embodiment may be realized as a combination of a display device and an information processing device.
- When the displays 10 and 20 are used as omnidirectional displays as described above, displaying images, videos, browsers, and the like as the display information makes it possible to build a workspace dedicated to the user wearing the display.
- By sharing the virtual space and display information among the plurality of displays 10 and 20 as described above, display information can be shared simultaneously with remote users, as if multiple people were inside the same omnidirectional display.
- Since the displays 10 and 20 are see-through glasses, the displayed information can be used while doing other work in the real space.
- As shown in FIG. 1, the display 10 includes a display unit 11, a detection unit 12, a view information acquisition unit 13, and a control unit 14.
- The other display 20, which shares the virtual space and display information with the display 10, has functions corresponding to the functions of the display 10 according to the present embodiment. The functions of the other display 20 are described below together with those of the display 10.
- The display 10 may have functions other than those described above that conventional display devices, such as conventional see-through glasses, have. The display 10 may also have the functions of the other display 20 according to this embodiment, and the other display 20 may likewise have the functions of the display 10 according to this embodiment.
- The display unit 11 is a functional unit that performs display on the display 10. The display unit 11 receives display information to be displayed on the display 10 and displays it; it may read display information stored in the display 10 or receive display information from outside.
- The display unit 11 displays the display information arranged in the virtual space according to the line of sight in the virtual space, and shares the virtual space and the display information with the other display 20.
- The sharing is performed, for example, by exchanging with the other display 20 the display information and information indicating the position in the virtual space where the display information is arranged. The virtual space and display information may also be shared not directly between the displays 10 and 20 but via a server with which the displays 10 and 20 can communicate, or by some other method. Other information exchanged between the displays 10 and 20 may likewise be transmitted and received directly or via a server or the like.
- The display unit 11 accepts operations related to display and executes them. An operation related to display is, for example, an operation that moves display information in the virtual space, though it may be another operation. Such an operation is performed by the user of the display 10 and is accepted, for example, as the user's input operation on the display 10.
- The display unit 11 transmits information indicating an accepted operation to the other display 20; the other display 20 receives it and executes the operation so that the virtual space and display information remain shared with the display 10. Conversely, the display unit 11 receives from the other display 20 information indicating operations related to display performed there, and executes them so that the shared state is maintained, as sketched below.
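A hedged sketch of this operation relay: an accepted move is applied to the local virtual space and forwarded so the other display can replay it. The message shape, the JSON transport, and the names are illustrative assumptions, not the patent's protocol.

```python
import json

VIRTUAL_SPACE: dict[str, tuple] = {}  # display-info id -> position on the sphere

def apply_move(info_id: str, position) -> None:
    """Update the local arrangement of display information."""
    VIRTUAL_SPACE[info_id] = tuple(position)

def handle_local_move(info_id: str, position, sock) -> None:
    """Apply a move locally, then mirror it to the other display."""
    apply_move(info_id, position)
    sock.sendall(json.dumps({"op": "move", "id": info_id,
                             "position": list(position)}).encode())

def handle_remote_message(raw: bytes) -> None:
    """Replay an operation received from the other display."""
    msg = json.loads(raw)
    if msg["op"] == "move":
        apply_move(msg["id"], msg["position"])
```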
- The above functions of the display unit 11 may be the same as conventional functions. Operations related to display by the display unit 11 are subject to control by the control unit 14.
- The detection unit 12 is a functional unit that detects the state of the user wearing the other display 20.
- As that state, the detection unit 12 may detect the user's line of sight in the physical space, more specifically the focal length of the line of sight and its direction. The detection unit 12 may also detect the user's movement state.
- The detection unit 12 receives information related to detection from the other display 20 and performs the detection based on it. The other display 20 is provided with sensors for detecting the state of its wearer, and these sensors are used for the detection.
- For example, the other display 20 is provided with a camera capable of imaging the user's eyes, and transmits the images of the user's eyes captured by the camera to the display 10.
- The detection unit 12 receives the images from the other display 20 and detects from them the user's line of sight in the real space, more specifically the focal length and direction of the line of sight. For instance, the detection unit 12 detects the movement of the user's eyeballs from video of the eyes and derives the focal length and direction of the line of sight from that movement.
- The detection unit 12 detects, for example, the distance from the eyeball to the focal point as the focal length, and, as the direction of the line of sight, the position of the line of sight in the physical space on the display screen of the other display 20 (the intersection of the line of sight with the display screen, for example as coordinates on the screen). The detection of the line of sight in the physical space from images may be performed in the same manner as in conventional methods. Alternatively, the other display 20 may itself detect the user's line of sight in the real space from the images and transmit the detection result to the display 10, and the detection unit 12 obtains the line of sight from that result.
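The patent leaves gaze detection to conventional techniques. One such technique estimates the focal distance from vergence, the angle between the two eyes' gaze rays; the small sketch below applies that textbook geometry and is an assumed example, not the patent's method.

```python
import math

def focal_length_from_vergence(ipd_m: float, vergence_rad: float) -> float:
    """Approximate focal distance in meters from the vergence angle.

    ipd_m: interpupillary distance in meters (about 0.063 for adults).
    vergence_rad: angle between the left and right gaze rays.
    Two eyes converging on a point on the midline satisfy
    tan(vergence / 2) = (ipd / 2) / distance.
    """
    if vergence_rad <= 0.0:
        return float("inf")  # parallel gaze: focused at infinity
    return (ipd_m / 2.0) / math.tan(vergence_rad / 2.0)
```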
- The detection unit 12 may detect whether the user is walking as the user's movement state.
- In this case, the other display 20 is provided with, for example, an acceleration sensor, and transmits the information obtained by the acceleration sensor to the display 10. The detection unit 12 receives this information and detects from it whether the user is walking; this detection from acceleration may be performed in the same manner as in conventional methods.
- Alternatively, the other display 20 may have a positioning function such as GPS (Global Positioning System), and the detection unit 12 may detect, from the information indicating the position of the other display 20 obtained by that function, whether the user is moving (for example, walking) or stationary. This detection may also be performed in the same manner as in conventional methods.
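As a minimal sketch of one conventional approach, walking can be guessed from the spread of the acceleration magnitude over a short window; the window length and threshold below are illustrative assumptions.

```python
import statistics

def is_walking(accel_samples: list[tuple[float, float, float]],
               threshold_mps2: float = 0.5) -> bool:
    """Guess whether the wearer is walking from recent accelerometer data.

    accel_samples: (x, y, z) readings in m/s^2 over a short window.
    Walking produces periodic fluctuations of the magnitude, while
    standing still keeps it close to gravity, so the spread stays small.
    """
    magnitudes = [(x * x + y * y + z * z) ** 0.5 for x, y, z in accel_samples]
    if len(magnitudes) < 2:
        return False
    return statistics.pstdev(magnitudes) > threshold_mps2
```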
- The detection unit 12 detects the state of the user continuously, for example at regular intervals, and outputs information indicating the detected state to the control unit 14 at each detection.
- The detection unit 12 may detect at least one of the user states described above, may detect the state of the user by methods other than those described above, and may detect user states other than the above as long as they are useful for the control by the control unit 14 described later.
- The view information acquisition unit 13 is a functional unit that acquires view information of the virtual space corresponding to the line of sight of the virtual space on the other display 20. The view information acquisition unit 13 receives and acquires this view information from the other display 20.
- The other display 20 transmits to the display 10, as the view information of the virtual space, information indicating the range displayed on its display screen, that is, the area 300 of the sphere 100 visible from the reference position of the line of sight in the direction of the line of sight of the virtual space shown in FIG. 2. Alternatively, the other display 20 transmits information from which that range can be calculated, for example the reference position and direction of the line of sight in the virtual space. Under 3DoF, the reference position of the line of sight is fixed in the virtual space and therefore need not be included in the view information.
- Like the detection by the detection unit 12, the view information acquisition unit 13 acquires the view information of the virtual space continuously, for example at regular intervals, and outputs it to the control unit 14 each time it is acquired.
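A small sketch of what the periodically transmitted view information might look like under 3DoF, where the direction alone suffices; the field names and interval are assumptions.

```python
import json

def send_view_info(sock, yaw: float, pitch: float, roll: float) -> None:
    """Report this display's virtual-space line of sight to the peer.

    Called at a regular interval, alongside the user-state detection.
    The reference position is omitted because it is fixed under 3DoF.
    """
    sock.sendall(json.dumps({
        "type": "view_info",
        "direction": {"yaw": yaw, "pitch": pitch, "roll": roll},
    }).encode())
```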
- The control unit 14 is a functional unit that controls operations related to display on the display 10 according to the state of the user detected by the detection unit 12 and the view information of the virtual space acquired by the view information acquisition unit 13.
- The control unit 14 may determine whether to perform control according to the focal length detected by the detection unit 12, and may set the region subject to control according to the real-space gaze direction detected by the detection unit 12.
- The control unit 14 may also set a region subject to control according to the view information of the virtual space acquired by the view information acquisition unit 13, and control operations related to the display of the display 10 that relate to the set region. For example, the control unit 14 may perform control that prohibits operations changing the display of display information in the set region.
- The control by the control unit 14 serves, for example, to prevent operations on the display 10 from interfering with the user wearing the other display 20. Since the other display 20 is transmissive, its wearer can do things other than look at it; for example, the user can read a bus timetable or walk while the display 20 is showing content.
- The other display 20 displays the area 300 according to its own line of sight in the virtual space. Suppose that an operation on the display 10 moves display information 200 into an area at the same position as that area 300. The display information 200 then also moves into the area 300 on the other display 20, and display information that had not been shown there until now appears.
- The newly displayed display information 200 can obstruct the view of the user of the other display 20 and interfere with walking. That user is also endangered by being unable to see the surroundings, which is especially hazardous while crossing a pedestrian crossing. Even when the user of the other display 20 is not walking but is looking at the real space rather than at the display 20, newly displayed display information 200 can hinder the user's actions. In short, when the user of the other display 20 is engaged in a real-world activity such as work rather than looking at the other display 20, operations by the user of the display 10 can get in the way of that activity.
- The control by the control unit 14 prevents the operations of the user of the display 10 from interfering with the actions of the user of the other display 20 in this way. The control need not be performed only for this purpose and may be performed for other purposes as well.
- Specifically, the control unit 14 performs control as follows. The control unit 14 receives information indicating the state of the user of the other display 20 from the detection unit 12, and the view information of the virtual space from the view information acquisition unit 13.
- Based on pre-stored criteria, the control unit 14 determines from the information received from the detection unit 12 whether to control operations related to the display of the display 10, that is, whether to control the operations of the user of the display 10. For example, if the information received from the detection unit 12 indicates the focal length in the real space, the control unit 14 makes this determination based on whether the focal length is within a preset range. This range is, for example, a range in which the user wearing the other display 20 can be judged to be focused on the display screen of the other display 20, that is, to be looking at it.
- If the focal length is within that range, the control unit 14 determines not to control operations related to the display of the display 10, on the assumption that the user is looking at the display screen of the other display 20; an operation on the display 10 is then considered not to interfere with the user.
- If the information received from the detection unit 12 indicates that the user is walking, the control unit 14 determines to control operations related to the display of the display 10; if it indicates that the user is not walking, the control unit 14 determines not to perform control. When the control unit 14 uses a plurality of criteria, it ultimately determines to perform control if any one criterion calls for control, even if other criteria do not. A sketch of this decision follows.
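A hedged sketch of this decision logic: a focal length inside the screen-focus range vetoes control for that criterion, walking demands it, and any single criterion voting for control wins. The range and the default values are illustrative assumptions.

```python
def should_control(focal_length_m: float | None,
                   walking: bool | None,
                   screen_focus_range: tuple[float, float] = (0.3, 1.0)) -> bool:
    """Decide whether operations on display 10 should be restricted.

    Any criterion that calls for control makes the final answer True,
    matching the rule that control wins when the criteria disagree.
    """
    votes = []
    if focal_length_m is not None:
        lo, hi = screen_focus_range
        # Inside the range the user is judged to be looking at the
        # display screen of display 20, so no control is needed.
        votes.append(not (lo <= focal_length_m <= hi))
    if walking is not None:
        votes.append(walking)
    return any(votes)
```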
- When the control unit 14 determines to control the user's operations related to the display of the display 10, it sets a region 400 subject to control according to the view information of the virtual space received from the view information acquisition unit 13. The control unit 14 notifies the display unit 11 of the set region 400 and performs control according to the region 400.
- The control prohibits, for example, operations related to display on the display 10 that relate to the set region 400; for example, it prohibits an operation that moves display information 200 from outside the set region 400 into the region 400. The operations subject to control are not limited to this and may include moving display information 200 from the set region 400 to the outside, as well as enlarging, shrinking, or moving display information within the set region 400.
- The control may also prohibit only part of an operation. For example, for an operation that moves display information, the control may allow the display information to move only slightly rather than completely following the operation.
- The control unit 14 sets the region 400 subject to control as follows, based on the area 300 displayed on the other display 20 as indicated by the view information of the virtual space acquired by the view information acquisition unit 13.
- As shown in FIG. 5A, the control unit 14 may set as the region 400 a region 410 of the virtual space at the same position as the area 300 displayed on the other display 20 according to its line of sight in the virtual space. The position of the area 300 is indicated by the view information of the virtual space.
- Alternatively, the control unit 14 may set the region 400 so as to include the region 410 of the virtual space at the same position as the area 300 on the other display 20 and, in addition, a region 420 that extends the region 410 in one direction. The direction in which the region 420 is provided is, for example, the direction in the virtual space corresponding to the vertically downward direction in the real space; the direction and size of the region 420 are set in advance.
- The control unit 14 may also set the region 400 so as to include the region 410 and, in addition, a region 430 around the region 410. The region 430 is, for example, a rectangular region that contains the rectangular region 410 and is larger than it. The position of the region 430 relative to the region 410 and the size of the region 430 are set in advance.
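A minimal sketch of assembling the region 400 from the mirrored region 410, a downward extension 420, and a surrounding margin 430, treating regions as rectangles in a (yaw, pitch) parameterization of the sphere. The rectangle representation and the default sizes are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned region in (yaw, pitch) degrees on the sphere 100."""
    left: float
    bottom: float
    right: float
    top: float

def control_region(area_300: Rect,
                   extend_down_deg: float = 20.0,
                   margin_deg: float = 5.0) -> Rect:
    """Region 400: region 410 (same position as the area 300),
    grown downward (region 420) and outward on all sides (region 430)."""
    return Rect(
        left=area_300.left - margin_deg,
        bottom=area_300.bottom - extend_down_deg - margin_deg,
        right=area_300.right + margin_deg,
        top=area_300.top + margin_deg,
    )
```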
- The display unit 11 receives the control from the control unit 14 and processes operations according to it. On accepting an operation, the display unit 11 identifies whether it is an operation subject to control: for example, it determines whether the accepted operation moves display information into the region 400 set by the control unit 14. If so, the display unit 11 identifies the operation as one subject to control and prohibits it; otherwise, the display unit 11 performs the operation without prohibiting it.
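The check by the display unit 11 might look like the following, reusing the Rect sketch above; the destination test is an assumed simplification of identifying a controlled operation.

```python
def contains(region: Rect, yaw: float, pitch: float) -> bool:
    """True if a (yaw, pitch) point lies inside the region."""
    return (region.left <= yaw <= region.right and
            region.bottom <= pitch <= region.top)

def execute_move(region_400: Rect | None, dest_yaw: float, dest_pitch: float,
                 do_move) -> bool:
    """Run a move operation unless its destination is in the controlled region."""
    if region_400 is not None and contains(region_400, dest_yaw, dest_pitch):
        return False  # prohibited: it would appear in the other user's view
    do_move(dest_yaw, dest_pitch)
    return True
```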
- The control unit 14 may display the region 400 on the display 10 so that the user of the display 10 can recognize the region subject to control, for example by coloring the frame of the region 400 or the entire region.
- The control need not prohibit an operation outright and may instead change the display related to the operation. For example, the display information involved in the controlled operation may be hidden, made translucent, or shown only as a frame, or it may be displayed at a preset position on the display 10, for example at a corner of the field of view. Control other than the above may be performed as long as it serves the purpose described above.
- The control unit 14 may also use the information received from the detection unit 12, based on pre-stored criteria, when setting the region 400 subject to control. For example, if the information received from the detection unit 12 indicates the gaze direction in the real space, the control unit 14 also sets the region 400 from the gaze direction.
- Specifically, the control unit 14 identifies the position on the sphere 100 in the virtual space corresponding to the gaze position on the display screen, which represents the gaze direction; this identification may be performed by conventional methods. The control unit 14 then calculates a region along the sphere 100 based on the identified position, for example a preset circular or rectangular region centered on it.
- The control unit 14 sets the final region 400 as the overlap of the calculated region and the region 400 set according to the view information of the virtual space as described above. By setting the region 400 in this way, the region 400 can be adapted to the real-space line of sight of the user of the other display 20.
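A small sketch of that final overlap step, again assuming the rectangular parameterization from the earlier sketches.

```python
def intersect(a: Rect, b: Rect) -> Rect | None:
    """Overlap of the gaze-derived region and the view-derived region 400."""
    left, right = max(a.left, b.left), min(a.right, b.right)
    bottom, top = max(a.bottom, b.bottom), min(a.top, b.top)
    if left > right or bottom > top:
        return None  # no overlap: nothing to control
    return Rect(left, bottom, right, top)
```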
- The above are the functions of the display 10 according to the present embodiment.
- Next, the processing executed by the display 10 according to the present embodiment is described along the flowchart of FIG. 6. First, the detection unit 12 detects the state of the user wearing the other display 20 (S01); this detection is based on information transmitted from the other display 20. Next, the view information acquisition unit 13 receives and acquires the view information of the virtual space corresponding to the line of sight of the virtual space on the other display 20 (S02). Then, according to the user's state and the view information of the virtual space, the control unit 14 controls operations on the display 10 performed through the display unit 11 (S03). These steps (S01 to S03) are repeated while the display unit 11 and the other display 20 are displaying. This concludes the processing executed by the display 10 according to the present embodiment.
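The S01 to S03 loop could be organized as below, with the three functional units stubbed behind simple interfaces; all names are illustrative.

```python
def display_loop(detector, view_source, controller, displaying) -> None:
    """Repeat S01-S03 while both displays are showing content."""
    while displaying():
        user_state = detector.detect()           # S01: other user's state
        view_info = view_source.acquire()        # S02: other display's view
        controller.apply(user_state, view_info)  # S03: restrict operations
```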
- As described above, in the present embodiment, operations related to the display of the display 10 are controlled according to the state of the user wearing the other display 20 and the view information of the virtual space.
- The display can therefore be controlled so that the display produced on the other display 20 by operations on the display 10 does not interfere with its user. Accordingly, when common virtual-space information is displayed between the plurality of displays 10 and 20, appropriate display is possible.
- As in the present embodiment, the line of sight of the user in the physical space may be detected as the state of the user wearing the other display 20.
- The focal length of the user's line of sight in the real space may be detected and used to determine whether to perform control. With this configuration, control can be performed when the user is not looking at the display screen of the other display 20, preventing the display of the other display 20 from hindering or endangering the user.
- The direction of the user's line of sight in the real space may be detected and the region 400 subject to control set according to it. With this configuration, control can be limited to the region 400 actually needed given the direction of the user's line of sight, and the display on the display 10 can be performed more appropriately.
- The user's movement state, more specifically whether the user is walking, may be detected. With this configuration, control can be performed while the user is walking, preventing the display of the other display 20 from creating danger during walking.
- The detected state of the user wearing the other display 20 is not limited to the above; other states may be detected and used as long as they are useful for controlling the display, for example whether the user is cooking, looking at a sign, or having a conversation.
- As in the present embodiment, the region 400 subject to control may be set according to the view information of the virtual space, and operations related to the display of the display 10 that relate to the set region 400 may be controlled. The control may prohibit operations that change the display of display information in the set region 400. With these configurations, the display on the display 10 can be performed appropriately and reliably according to the state of the user of the other display 20. Control other than the above may be performed as long as it keeps the display on the display 10 appropriate.
- The detection unit 12 may detect the state of the user's heartbeat or perspiration in addition to, or instead of, the information described above; specifically, it may detect the user's heart rate, perspiration amount, or both. In this case, a sensor for detecting the heart rate or a sensor for detecting the perspiration amount is attached to the user, and the display 10 acquires the information indicating the user's heart rate or perspiration amount from the sensor via the other display 20. Conventional sensors can be used as these sensors.
- As described above, the control unit 14 determines, based on pre-stored criteria, whether to control operations related to the display of the display 10 from the information received from the detection unit 12. For example, the control unit 14 determines to perform control when the heart rate or perspiration amount indicated by the information received from the detection unit 12 is equal to or greater than a preset threshold, or equal to or less than a preset threshold. With this configuration, appropriate display can be performed depending on the heart rate or perspiration of the user wearing the other display 20.
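The biometric threshold test might be as simple as the following; both thresholds are placeholders, since the text only says they are preset, and the greater-or-equal direction is one of the two comparisons the text allows.

```python
def should_control_biometric(heart_rate_bpm: float | None,
                             sweat_rate: float | None,
                             hr_threshold: float = 100.0,
                             sweat_threshold: float = 1.0) -> bool:
    """Control when heart rate or perspiration crosses its preset threshold.

    The text allows either >= or <= comparisons; >= is used here.
    """
    if heart_rate_bpm is not None and heart_rate_bpm >= hr_threshold:
        return True
    if sweat_rate is not None and sweat_rate >= sweat_threshold:
        return True
    return False
```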
- In the embodiment described above, the display control device is the display 10, which itself has a display function, but the display control device need not have a display function.
- The display control device may be an apparatus (or system) that is connected to a display which shows display information commonly arranged in a virtual space shared with other displays according to the line of sight of each display in the virtual space (that is, a display including the display unit 11), that controls display on that display, and that includes the detection unit 12, the view information acquisition unit 13, and the control unit 14 described above.
- Each functional block may be realized using one device that is physically or logically coupled, or using two or more devices that are physically or logically separated and connected directly or indirectly (for example, by wire or wirelessly). A functional block may also be realized by combining software with the one device or the plurality of devices.
- Functions include judging, determining, deciding, calculating, computing, processing, deriving, investigating, searching, confirming, receiving, transmitting, outputting, accessing, resolving, selecting, choosing, establishing, comparing, assuming, expecting, regarding, broadcasting, notifying, communicating, forwarding, configuring, reconfiguring, allocating, mapping, assigning, and the like, but are not limited to these.
- For example, a functional block (component) responsible for transmission is called a transmitting unit or a transmitter. In any case, as described above, the implementation method is not particularly limited.
- For example, the display 10 according to an embodiment of the present disclosure may function as a computer that performs the information processing of the present disclosure. FIG. 7 is a diagram showing an example of the hardware configuration of the display 10 according to an embodiment of the present disclosure. The display 10 described above may be physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like. The hardware configuration of the other display 20 may be the same as described here.
- In the following description, the term "apparatus" can be read as a circuit, a device, a unit, or the like. The hardware configuration of the display 10 may include one or more of each device shown in the figure, or may be configured without some of the devices.
- Each function of the display 10 is realized by loading predetermined software (programs) onto hardware such as the processor 1001 and the memory 1002, with the processor 1001 performing computation and controlling communication by the communication device 1004 and at least one of the reading and writing of data in the memory 1002 and the storage 1003.
- The processor 1001, for example, runs an operating system and controls the entire computer. The processor 1001 may be configured as a central processing unit (CPU) including interfaces with peripheral devices, a control device, an arithmetic device, registers, and the like. Each function of the display 10 described above may be realized by the processor 1001.
- The processor 1001 reads programs (program code), software modules, data, and the like from at least one of the storage 1003 and the communication device 1004 into the memory 1002 and executes various processes according to them. As the programs, programs that cause a computer to execute at least part of the operations described in the above embodiment are used. For example, each function of the display 10 may be realized by a control program stored in the memory 1002 and run on the processor 1001. The processor 1001 may be implemented by one or more chips. The programs may be transmitted from a network via an electric communication line.
- The memory 1002 is a computer-readable recording medium and may be composed of at least one of, for example, ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), and RAM (Random Access Memory). The memory 1002 may also be called a register, a cache, a main memory (main storage device), or the like. The memory 1002 can store executable programs (program code), software modules, and the like for carrying out the information processing according to an embodiment of the present disclosure.
- The storage 1003 is a computer-readable recording medium and may be, for example, at least one of an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disc, a magneto-optical disc (for example, a compact disc, a digital versatile disc, or a Blu-ray disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy disk, a magnetic strip, and the like. The storage 1003 may also be called an auxiliary storage device. The storage medium included in the display 10 may be, for example, a database, a server, or another appropriate medium including at least one of the memory 1002 and the storage 1003.
- the communication device 1004 is hardware (transmitting/receiving device) for communicating between computers via at least one of a wired network and a wireless network, and is also called a network device, a network controller, a network card, a communication module, or the like.
- the input device 1005 is an input device (for example, keyboard, mouse, microphone, switch, button, sensor, etc.) that receives input from the outside.
- the output device 1006 is an output device (eg, display, speaker, LED lamp, etc.) that outputs to the outside. Note that the input device 1005 and the output device 1006 may be integrated (for example, a touch panel).
- Each device such as the processor 1001 and the memory 1002 is connected by a bus 1007 for communicating information.
- the bus 1007 may be configured using a single bus, or may be configured using different buses between devices.
- The display 10 may include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), and an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by this hardware. For example, the processor 1001 may be implemented using at least one of these pieces of hardware.
- Input and output information and the like may be stored in a specific location (for example, a memory) or managed using a management table. Input and output information and the like can be overwritten, updated, or appended; output information and the like may be deleted, and input information and the like may be transmitted to another device.
- A determination may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by a numerical comparison (for example, comparison with a predetermined value).
- Notification of predetermined information is not limited to being performed explicitly and may be performed implicitly (for example, by not notifying the predetermined information).
- Software, whether referred to as software, firmware, middleware, microcode, hardware description language, or by another name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, and the like.
- Software, instructions, information, and the like may be transmitted and received via a transmission medium. For example, when software is sent from a website, server, or other remote source using at least one of wired technologies (coaxial cable, optical fiber cable, twisted pair, digital subscriber line (DSL), etc.) and wireless technologies (infrared, microwave, etc.), these wired and wireless technologies are included within the definition of a transmission medium.
- The terms "system" and "network" used in this disclosure are used interchangeably.
- Information, parameters, and the like described in the present disclosure may be expressed using absolute values, using relative values from a predetermined value, or using other corresponding information.
- The terms "determining" and "deciding" used in this disclosure may encompass a wide variety of actions. "Determining" and "deciding" can include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (for example, looking up a table, a database, or another data structure), and ascertaining as having "determined" or "decided".
- "Determining" and "deciding" can also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in a memory) as having "determined" or "decided".
- "Determining" and "deciding" can further include regarding resolving, selecting, choosing, establishing, comparing, and the like as having "determined" or "decided". That is, "determining" and "deciding" can include regarding some action as having "determined" or "decided". "Determining (deciding)" may also be read as "assuming", "expecting", "considering", and the like.
- The terms "connected" and "coupled", and any variations thereof, mean any direct or indirect connection or coupling between two or more elements, and can include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. The coupling or connection between elements may be physical, logical, or a combination thereof; for example, "connection" may be read as "access".
- As used in this disclosure, two elements can be considered "connected" or "coupled" to each other by using at least one of one or more wires, cables, and printed electrical connections and, as some non-limiting and non-exhaustive examples, by using electromagnetic energy having wavelengths in the radio frequency, microwave, and optical (both visible and invisible) regions.
- Any reference to elements using designations such as "first" and "second" as used in this disclosure does not generally limit the quantity or order of those elements. These designations may be used in this disclosure as a convenient way of distinguishing between two or more elements. Thus, references to first and second elements do not mean that only two elements may be employed or that the first element must precede the second element in some way.
- The phrase "A and B are different" as used in this disclosure may mean "A and B are different from each other", and may also mean "A and B are each different from C". Terms such as "separated" and "coupled" may be interpreted in the same way as "different".
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Claims (8)
1. A display control device that controls display on a display that displays display information, commonly arranged in a virtual space shared with another display, according to a line of sight in the virtual space for each display, the display control device comprising:
a detection unit that detects a state of a user wearing the other display;
a view information acquisition unit that acquires view information of the virtual space corresponding to the line of sight of the virtual space on the other display; and
a control unit that controls operations related to display on the display according to the state of the user detected by the detection unit and the view information of the virtual space acquired by the view information acquisition unit.
2. The display control device according to claim 1, wherein the detection unit detects, as the state of the user wearing the other display, the user's line of sight in a real space.
3. The display control device according to claim 2, wherein the detection unit detects, as the state of the user wearing the other display, a focal length of the user's line of sight in the real space, and the control unit determines whether to perform the control according to the focal length detected by the detection unit.
4. The display control device according to claim 2 or 3, wherein the detection unit detects, as the state of the user wearing the other display, a direction of the user's line of sight in the real space, and the control unit sets a region related to the control according to the direction of the line of sight in the real space detected by the detection unit.
5. The display control device according to any one of claims 1 to 4, wherein the detection unit detects, as the state of the user wearing the other display, a movement state of the user.
6. The display control device according to any one of claims 1 to 5, wherein the control unit sets a region related to the control according to the view information of the virtual space acquired by the view information acquisition unit, and controls operations related to display on the display that relate to the set region.
7. The display control device according to claim 6, wherein the control unit performs control to prohibit an operation that changes display of display information in the set region.
8. The display control device according to any one of claims 1 to 7, wherein the detection unit detects, as the state of the user wearing the other display, a state related to the user's heartbeat or perspiration.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/294,690 US20240289080A1 (en) | 2021-08-23 | 2022-08-02 | Display control device |
JP2023543777A JPWO2023026798A1 (ja) | 2021-08-23 | 2022-08-02 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-135589 | 2021-08-23 | ||
JP2021135589 | 2021-08-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023026798A1 true WO2023026798A1 (ja) | 2023-03-02 |
Family
ID=85323118
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/029702 WO2023026798A1 (ja) | 2021-08-23 | 2022-08-02 | 表示制御装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240289080A1 (ja) |
JP (1) | JPWO2023026798A1 (ja) |
WO (1) | WO2023026798A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2016082411A * | 2014-10-17 | 2016-05-16 | The University of Electro-Communications | Head-mounted display, image display method, and program |
- JP2019028638A * | 2017-07-28 | 2019-02-21 | Konica Minolta, Inc. | Head-mounted display and video display method therefor |
- WO2019220729A1 * | 2018-05-16 | 2019-11-21 | Sony Corporation | Information processing device, information processing method, and recording medium |
- JP2020102232A * | 2014-07-31 | 2020-07-02 | Canon Marketing Japan Inc. | Information processing system, control method therefor, and program; and information processing device, control method therefor, and program |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI493227B (zh) * | 2014-03-07 | 2015-07-21 | Bion Inc | A head-mounted display device that displays motion information |
- KR20160064978A (ko) | 2014-11-28 | 2016-06-08 | Semiconductor Energy Laboratory Co., Ltd. | Display device, module, display system, and electronic device |
2022
- 2022-08-02 JP JP2023543777A patent/JPWO2023026798A1/ja active Pending
- 2022-08-02 US US18/294,690 patent/US20240289080A1/en active Pending
- 2022-08-02 WO PCT/JP2022/029702 patent/WO2023026798A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JPWO2023026798A1 (ja) | 2023-03-02 |
US20240289080A1 (en) | 2024-08-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11017603B2 (en) | Method and system for user interaction | |
US20200168177A1 (en) | Electronic device, augmented reality device for providing augmented reality service, and method of operating same | |
EP3144775B1 (en) | Information processing system and information processing method | |
US20200335065A1 (en) | Information processing device | |
WO2019130991A1 (ja) | 情報処理装置 | |
US20180314326A1 (en) | Virtual space position designation method, system for executing the method and non-transitory computer readable medium | |
KR20150082843A (ko) | 헤드 마운티드 디스플레이 및 그 제어 방법 | |
CN111309142A (zh) | 用于切换显示设备的输入模态的方法和设备 | |
- WO2023026798A1 (ja) | Display control device |
US20220083145A1 (en) | Information display apparatus using line of sight and gestures | |
- WO2022201739A1 (ja) | Display control device |
- WO2022201936A1 (ja) | Display control device |
- WO2023026700A1 (ja) | Display control device |
- WO2022190735A1 (ja) | Display control device |
- WO2022202065A1 (ja) | Display control device |
- JP6999822B2 (ja) | Terminal device and control method for terminal device |
- WO2023026628A1 (ja) | Display control device |
- WO2023047865A1 (ja) | Virtual space providing device |
- JP7267105B2 (ja) | Information processing device and program |
- JPWO2020031490A1 (ja) | Terminal device and control method for terminal device |
- WO2023007927A1 (ja) | Virtual space providing device |
- JP2024089241A (ja) | Display control device |
- WO2023223750A1 (ja) | Display device |
- WO2023218751A1 (ja) | Display control device |
- JP7139395B2 (ja) | Control device, program, and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
- 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 22861089; Country of ref document: EP; Kind code of ref document: A1 |
- WWE | WIPO information: entry into national phase | Ref document number: 2023543777; Country of ref document: JP |
- WWE | WIPO information: entry into national phase | Ref document number: 18294690; Country of ref document: US |
- NENP | Non-entry into the national phase | Ref country code: DE |
- 122 | EP: PCT application non-entry in European phase | Ref document number: 22861089; Country of ref document: EP; Kind code of ref document: A1 |