WO2024048195A1 - Appareil de commande d'affichage - Google Patents

Appareil de commande d'affichage (Display control device)

Info

Publication number
WO2024048195A1
WO2024048195A1 (PCT/JP2023/028418)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
display control
glass
user
glass device
Prior art date
Application number
PCT/JP2023/028418
Other languages
English (en)
Japanese (ja)
Inventor
有希 中村
康夫 森永
充宏 後藤
達哉 西▲崎▼
貴哉 長谷川
Original Assignee
株式会社NTTドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社NTTドコモ
Publication of WO2024048195A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus

Definitions

  • The present disclosure relates to a display control device.
  • An example of an HMD (Head Mounted Display) device is a glass device such as AR (Augmented Reality) glasses.
  • In AR glasses, an image of a virtual object that does not exist in real space is displayed superimposed on the real space.
  • By selecting a virtual object, the user of the glass device can acquire information corresponding to the selected virtual object.
  • An operation for selecting a virtual object is called a ray operation.
  • The ray operation includes an aim operation, in which a virtual indicator such as a cursor is positioned on the virtual object to be selected, and a determination operation, in which the selection is confirmed.
  • An example of a mode for realizing the ray operation in a glass device is a mode in which a smartphone connected to the glass device plays the role of an operation device such as a 3D mouse.
  • However, the method of having a smartphone take on the role of an operation device has a problem in that one of the user's hands is occupied by the ray operation.
  • In addition, the user has to take out the smartphone and check the UI screen every time the user performs a ray operation, and erroneous operations such as pressing the wrong button are likely to occur.
  • Patent Document 1 discloses a technique that utilizes a user's line of sight for ray operation on a glass device.
  • A technology that uses the user's line of sight for the ray operation is also referred to as HPG (Head Position Gaze).
  • HPG has a problem in that the user needs to keep staring at the virtual object he or she wishes to select for several seconds, resulting in a waiting time of several seconds before the selection is confirmed. HPG also has a problem in that erroneous selections may occur frequently, such as a virtual object that the user unconsciously continues to look at being selected as an operation target. To deal with these problems, it is conceivable to provide the glass device with a dedicated button for the determination operation, but this increases the development cost and manufacturing cost of the glass device.
  • A display control device according to the present disclosure includes a display control unit and a determination unit.
  • The display control unit displays, on the glass device, one or more virtual objects included in the field of view of the glass device in a three-dimensional space. Further, the display control unit displays, on the glass device, a virtual indicator for selecting any one of the one or more virtual objects, based on the line of sight of a user of the glass device.
  • The determination unit determines, as an operation target, the virtual object pointed to by the virtual indicator at the time when an input operation is performed on a wearable terminal worn on the user's body.
  • According to the present disclosure, the virtual object to be manipulated can be determined quickly and accurately compared to a mode in which "the action of staring at a virtual object for several seconds" is interpreted as a "determination operation." Further, according to the present disclosure, there is no need to provide a dedicated button for the determination operation on the glass device, so the development cost and manufacturing cost of the glass device do not increase.
  • FIG. 1 is a block diagram showing a configuration example of a display system 1A according to a first embodiment of the present disclosure.
  • FIG. 2 is a diagram showing an example of the arrangement of virtual objects in a region R.
  • FIG. 3 is a diagram showing an example of the projection image G1.
  • FIG. 4 is a diagram showing an example of the image G2 that appears in the eyes of the user U wearing the glass device 20.
  • FIG. 5 is a diagram illustrating an example of updating the indicated position of the virtual indicator VC.
  • FIG. 6 is a block diagram showing a configuration example of the mobile device 10A.
  • FIG. 7 is a flowchart showing the flow of a display control method executed by the processing device 140 of the mobile device 10A according to the program PRA.
  • FIG. 8 is a block diagram showing a configuration example of a display system 1B according to a second embodiment of the present disclosure.
  • FIG. 9 is a block diagram illustrating a configuration example of a display system 1C according to a third embodiment of the present disclosure.
  • FIG. 10 is a block diagram showing a configuration example of the mobile device 10B.
  • FIG. 11 is a flowchart showing the flow of a display control method executed by the processing device 140 of the mobile device 10B according to the program PRB.
  • FIG. 12 is a diagram for explaining the operation of the mobile device 10B.
  • FIG. 13 is a diagram for explaining the operation of the mobile device 10B.
  • FIG. 14 is a block diagram illustrating a configuration example of a display system 1D according to a fourth embodiment of the present disclosure.
  • FIG. 1 is a block diagram showing a configuration example of a display system 1A according to a first embodiment of the present disclosure.
  • The display system 1A includes a mobile device 10A, a glass device 20, and a wearable terminal 30A.
  • The glass device 20 is an HMD device worn on the head of the user U.
  • The glass device 20 displays an image of a virtual object that does not exist in real space without blocking the field of view of the user U who wears the glass device 20 on his or her head. Since the glass device 20 does not block the user U's field of view, the image of the virtual object appears in the user U's eyes superimposed on the real space.
  • Although the glass device 20 in this embodiment is AR glasses, it may also be VR (Virtual Reality) glasses.
  • The mobile device 10A is, for example, a smartphone.
  • The mobile device 10A is not limited to a smartphone, and may be, for example, a tablet or a notebook personal computer.
  • The mobile device 10A is worn on the user U's body, similarly to the glass device 20.
  • For example, the mobile device 10A is attached to the user U's body by hanging it from the user U's neck using a strap or the like.
  • The glass device 20 is connected to the mobile device 10A by wire.
  • The wearable terminal 30A is wirelessly connected to the mobile device 10A.
  • The mobile device 10A communicates with the management device 40 via the communication line NW.
  • The management device 40 is a server device that provides a content distribution service in AR.
  • In this content distribution service, a plurality of virtual objects are arranged within a spherical area of a predetermined radius centered on the user U, and information is provided according to the virtual object selected by the user.
  • The spherical area in which the plurality of virtual objects are arranged moves as the user U moves. Therefore, the user U can experience moving in the real space while being surrounded by the plurality of virtual objects.
  • The management device 40 stores virtual object information and position information in advance.
  • The virtual object information represents an image of a virtual object placed within the spherical area centered on the user U.
  • The position information indicates the placement position of the virtual object within the area.
  • The spherical area centered on the user U is an example of a three-dimensional space in the present disclosure.
  • The position information in this embodiment is information indicating a position in a three-dimensional coordinate system whose origin is the center of the spherical region R.
  • The mobile device 10A displays, on the glass device 20, an image that the user U sees through the glass device 20 within the spherical area, based on the virtual object information and position information received from the management device 40.
  • Hereinafter, the image that the user U sees through the glass device 20 within the spherical area is referred to as an image corresponding to the field of view of the glass device 20.
  • The image corresponding to the field of view of the glass device 20 is determined according to the direction of the optical axis of the glass device 20.
  • The optical axis of the glass device 20 refers to the optical axis of a lens that guides image light representing the real space to the user U's eyes.
  • The direction of the optical axis of the glass device 20 is an example of the direction of the glass device 20 in the present disclosure.
  • FIG. 2 is an overhead view of the spherical region R in which the virtual objects VOB1 to VOB6 are arranged, viewed from the vertical axis direction.
  • Point O in FIG. 2 corresponds to the position of the user U wearing the glass device 20.
  • Arrow S in FIG. 2 represents the direction of the optical axis of the glass device 20 in the global coordinate system.
  • The direction of the optical axis of the glass device 20 in the global coordinate system can be specified, for example, according to the output of a sensor provided in the glass device 20. Alternatively, if the glass device 20 has an imaging function, the direction may be specified by using an AR self-position recognition service that uses an image captured by the imaging function.
  • The field of view of the glass device 20 means the range of a predetermined angle θ within the spherical region.
  • The predetermined angle θ is an angle bisected by a line extending in the direction along the optical axis of the glass device 20. Note that the predetermined angle θ is actually a solid angle.
  • In FIG. 2, the fan-shaped OAB corresponds to the field of view of the glass device 20.
  • The field of view of the glass device 20 may include images of one or more virtual objects.
  • The image corresponding to the field of view of the glass device 20 is a projection image obtained by projecting the first image onto a plane orthogonal to the arrow S.
  • The first image is the image that is visually recognized when the first portion of the region R is viewed from the point O in the direction of the arrow S.
  • The first portion of the region R is the range of the predetermined angle θ within the region R.
  • The predetermined angle θ for the region R is an angle bisected by the arrow S.
  • The mobile device 10A generates the projection image based on the virtual object information and position information received from the management device 40 and the direction of the optical axis of the glass device 20.
  • The mobile device 10A displays the generated projection image on the glass device 20.
  • In the example shown in FIG. 2, a virtual object VOB1 and a virtual object VOB2 are arranged within the fan-shaped OAB. Therefore, as shown in FIG. 3, the mobile device 10A generates a projection image G1 in which images of the virtual object VOB1 and the virtual object VOB2 are arranged.
  • The mobile device 10A displays the projection image G1 on the glass device 20. Note that the portion of the projection image G1 that does not correspond to either the virtual object VOB1 or the virtual object VOB2 is transparent.
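  • The geometry just described (a field of view defined by a predetermined angle θ bisected by the optical-axis direction S, and a projection of the visible objects onto a plane orthogonal to S) can be pictured with a short sketch. The Python code below is a minimal, hypothetical illustration and is not part of the disclosure; the function names and the simple orthographic projection (a stand-in for the projection described above, which assumes the optical axis is not vertical) are assumptions made only for the example.

```python
import numpy as np

def in_field_of_view(obj_pos, user_pos, axis_dir, theta_rad):
    """Return True if a virtual object lies within the predetermined
    angle theta (a solid angle) bisected by the optical-axis direction."""
    to_obj = np.asarray(obj_pos, dtype=float) - np.asarray(user_pos, dtype=float)
    axis = np.asarray(axis_dir, dtype=float)
    axis = axis / np.linalg.norm(axis)
    dist = np.linalg.norm(to_obj)
    if dist == 0.0:
        return False
    # Angle between the optical axis (arrow S) and the direction to the object.
    angle = np.arccos(np.clip(np.dot(to_obj / dist, axis), -1.0, 1.0))
    return angle <= theta_rad / 2.0

def project_to_plane(obj_pos, user_pos, axis_dir):
    """Project a point onto the plane orthogonal to arrow S, returning 2D
    coordinates in that plane (a simple orthographic projection)."""
    axis = np.asarray(axis_dir, dtype=float)
    axis = axis / np.linalg.norm(axis)
    up = np.array([0.0, 0.0, 1.0])          # assumes the optical axis is not vertical
    u = np.cross(up, axis)
    u = u / np.linalg.norm(u)
    v = np.cross(axis, u)
    rel = np.asarray(obj_pos, dtype=float) - np.asarray(user_pos, dtype=float)
    return float(np.dot(rel, u)), float(np.dot(rel, v))

# Example: decide which objects would appear in the projection image G1.
user_o = np.zeros(3)                          # point O, centre of region R
arrow_s = np.array([1.0, 0.0, 0.0])           # optical-axis direction
objects = {"VOB1": [3.0, 0.5, 0.0], "VOB2": [2.5, -0.8, 0.2], "VOB3": [-2.0, 1.0, 0.0]}
visible = {name: project_to_plane(p, user_o, arrow_s)
           for name, p in objects.items()
           if in_field_of_view(p, user_o, arrow_s, np.radians(90.0))}
print(visible)   # only objects inside the fan-shaped OAB are projected
```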
  • The mobile device 10A is an example of a display control device in the present disclosure.
  • The virtual object VOB1 in this embodiment is associated with video data. By determining the virtual object VOB1 as the operation target, the user U can view the video represented by the video data. Furthermore, the virtual object VOB2 in this embodiment is associated with character string data representing a news text regarding matters such as politics, economics, or sports. The user U can display the text of the news on the glass device 20 by determining the virtual object VOB2 as the operation target.
  • The mobile device 10A displays the virtual indicator VC on the glass device 20, superimposed on the projection image G1.
  • The virtual indicator VC is used to cause the user U to select one of the virtual objects displayed on the glass device 20 as the operation target.
  • As a result, the image G2 shown in FIG. 4 appears in the eyes of the user U.
  • The wearable terminal 30A is a device for operating the virtual indicator VC displayed on the glass device 20.
  • The virtual indicator VC in this embodiment includes a starting point SP located at a predetermined point within the projection image G1 and an end point EP.
  • The virtual indicator VC is an arrow-shaped line segment extending from the starting point SP to the end point EP.
  • The starting point SP in this embodiment is the midpoint of the lower side of the four sides that form the periphery of the projection image G1, that is, the midpoint of the lower of the two sides perpendicular to the vertical axis.
  • The position of the midpoint of the lower side of the projection image G1 corresponds to the position of the center of the region R, that is, the position closest to the position of the user U.
  • In this embodiment, the mobile device 10A displays a virtual object VOBP corresponding to the user U at the position of the starting point SP, as shown in FIG. 4, but the display of the virtual object VOBP may be omitted.
  • Immediately after the start of use of the content distribution service, the mobile device 10A sets the end point EP of the virtual indicator VC to the center OP of the projection image G1. Thereafter, the mobile device 10A updates the position of the end point EP of the virtual indicator VC according to the position and orientation of the wearable terminal 30A in the global coordinate system. That is, the position of the end point EP of the virtual indicator VC corresponds to the position indicated by the virtual indicator VC.
  • Hereinafter, the indicated position of the virtual indicator VC immediately after the start of use of the content distribution service is referred to as the initial indicated position.
  • The reason why the center OP of the projection image G1 is set as the initial indicated position is that the center OP corresponds to the center of the field of view of the user U wearing the glass device 20, and the line of sight of the user U is often located at the center of that field of view. That is, the initial indicated position is set to the center OP in order to display the virtual indicator VC in accordance with the user U's line of sight. Note that if the glass device 20 has a function of detecting the position of the user U's line of sight within the display area, the mobile device 10A may use the position detected by that function as the initial indicated position.
  • The wearable terminal 30A is worn on the body of the user of the glass device 20.
  • For example, the wearable terminal 30A is attached to a finger or arm of the user of the glass device 20.
  • The wearable terminal 30A is, for example, a 3D mouse, a ring mouse, a smart ring, or a smart watch.
  • The wearable terminal 30A includes a sensor 300.
  • The sensor 300 detects changes in the position and orientation of the sensor 300 in the global coordinate system according to the movement of the finger or arm on which the wearable terminal 30A is worn.
  • The sensor 300 is, for example, a three-axis acceleration sensor and an angular velocity sensor.
  • When the user U moves the finger or arm on which the wearable terminal 30A is worn forward and backward, left and right, or up and down, the position and orientation of the wearable terminal 30A in the global coordinate system change, and data corresponding to the change is output from the sensor 300. The wearable terminal 30A transmits the data output from the sensor 300 to the mobile device 10A as first operation content data. Although details will be described later, the mobile device 10A updates the position indicated by the virtual indicator VC in accordance with the first operation content data received from the wearable terminal 30A.
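  • As a rough sketch of the data flow just described, the wearable terminal could integrate its accelerometer and gyroscope readings into a change in position and orientation and forward that change to the mobile device as the first operation content data. The code below is an assumption made for illustration only; the field names and the naive integration scheme are not specified by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class FirstOperationContentData:
    # Change of the wearable terminal in the global coordinate system
    # since the previous report.
    dx: float
    dy: float
    dz: float
    d_yaw: float
    d_pitch: float
    d_roll: float

def make_report(accel, gyro, velocity, dt):
    """Integrate one IMU sample (3-axis acceleration [m/s^2] and angular
    velocity [rad/s]) into a small displacement and rotation over dt seconds.
    A real device would also compensate gravity and filter drift."""
    vx, vy, vz = (velocity[i] + accel[i] * dt for i in range(3))
    dx, dy, dz = vx * dt, vy * dt, vz * dt
    d_yaw, d_pitch, d_roll = (g * dt for g in gyro)
    return (vx, vy, vz), FirstOperationContentData(dx, dy, dz, d_yaw, d_pitch, d_roll)

# Example: one 10 ms sample while the finger moves to the left.
velocity, report = make_report(accel=(-0.3, 0.0, 0.0), gyro=(0.0, 0.0, 0.0),
                               velocity=(0.0, 0.0, 0.0), dt=0.01)
print(report)   # this record would be sent to the mobile device 10A
```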
  • FIG. 5 is a diagram showing an example of the image G2 that appears in the eyes of the user U after the wearable terminal 30A is moved from right to left.
  • In this embodiment, the aim operation is realized by moving the wearable terminal 30A forward and backward, left and right, or up and down. Therefore, compared to a mode in which the aim operation is realized by moving the mobile device 10A forward and backward, left and right, or up and down, it is possible to perform the aim operation with high precision while reducing the frequency of erroneous operations.
  • The wearable terminal 30A also includes an input device 310 that accepts input operations using the user's fingers.
  • The input device 310 includes a plurality of operators including a first operator, a second operator, and a third operator.
  • The first operator corresponds to a start operation that starts the use of the content distribution service.
  • The second operator corresponds to a determination operation that confirms the selection of a virtual object.
  • The third operator corresponds to an end operation that ends the use of the content distribution service.
  • The user U can perform the start operation, the determination operation, or the end operation by pressing the corresponding operator.
  • In this embodiment, the input device 310 is provided with a plurality of operators each corresponding to one of a plurality of types of input operations; however, the input device 310 may instead be provided with a single operator, and each input operation may be distinguished by, for example, a long press or by pressing the operator twice in succession.
  • The wearable terminal 30A transmits second operation content data indicating the content of the input operation performed on the input device 310 to the mobile device 10A.
  • The second operation content data indicating that the first operator has been pressed indicates a start operation.
  • The second operation content data indicating that the second operator has been pressed indicates a determination operation.
  • The second operation content data indicating that the third operator has been pressed indicates an end operation.
  • The mobile device 10A executes processing according to the content of the input operation indicated by the second operation content data received from the wearable terminal 30A.
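  • The mapping from the second operation content data to the start, determination, and end operations can be pictured as a small dispatch table. This is only an illustrative sketch; the enumeration names, the handler function, and the assumption that the data carries the index of the pressed operator are not part of the disclosure.

```python
from enum import Enum, auto

class Operation(Enum):
    START = auto()       # first operator: start using the content distribution service
    DETERMINE = auto()   # second operator: confirm the selection of a virtual object
    END = auto()         # third operator: stop using the content distribution service

# Second operation content data is assumed here to carry the index of the
# pressed operator (1, 2, or 3).
OPERATOR_TO_OPERATION = {1: Operation.START, 2: Operation.DETERMINE, 3: Operation.END}

def handle_second_operation_content_data(pressed_operator: int) -> Operation:
    """Translate the pressed operator into the corresponding input operation."""
    try:
        return OPERATOR_TO_OPERATION[pressed_operator]
    except KeyError:
        raise ValueError(f"unknown operator: {pressed_operator}")

print(handle_second_operation_content_data(2))  # Operation.DETERMINE
```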
  • FIG. 6 is a block diagram illustrating a configuration example of a mobile device 10A according to an embodiment of the present disclosure.
  • The mobile device 10A includes a first communication device 100, a second communication device 110, a third communication device 120, a storage device 130, a processing device 140, and a bus 150.
  • The first communication device 100, the second communication device 110, the third communication device 120, the storage device 130, and the processing device 140 are interconnected by the bus 150, which mediates data exchange.
  • The bus 150 may be configured using a single bus, or may be configured using different buses between the elements.
  • The first communication device 100 is hardware (a transmitting/receiving device) for communicating with the management device 40 via the communication line NW.
  • The first communication device 100 is also called, for example, a network device, a network controller, a network card, or a communication module.
  • The first communication device 100 acquires the virtual object information and the position information by communicating with the management device 40 under the control of the processing device 140.
  • The second communication device 110 is a communication module for communicating with the glass device 20 by wire.
  • The second communication device 110 supplies data received from the glass device 20 to the processing device 140. Further, the second communication device 110 transmits image data provided from the processing device 140 to the glass device 20.
  • The third communication device 120 is a communication module for wirelessly communicating with the wearable terminal 30A.
  • The third communication device 120 supplies the first operation content data and the second operation content data received from the wearable terminal 30A to the processing device 140.
  • The glass device 20 may be modified to communicate with the third communication device 120 wirelessly, and the second communication device 110 may be omitted in this mode.
  • Similarly, the wearable terminal 30A may be modified to communicate with the second communication device 110 by wire, and the third communication device 120 may be omitted in this mode.
  • The storage device 130 is a recording medium that can be read by the processing device 140.
  • The storage device 130 may be configured of at least one of a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), a RAM (Random Access Memory), and the like.
  • A program PRA is stored in the storage device 130 in advance. Although detailed illustration is omitted in FIG. 6, the program PRA contains a transformation matrix for converting changes in position and orientation in the global coordinate system into changes in position and orientation in a two-dimensional coordinate system that defines positions within the display area of the glass device 20.
  • The processing device 140 includes one or more CPUs (Central Processing Units). Upon receiving the second operation content data indicating the start operation from the wearable terminal 30A, the processing device 140 reads the program PRA from the storage device 130 and starts executing the program PRA. The processing device 140 operating according to the program PRA functions as a display control unit 140a and a determination unit 140b shown in FIG. 6. Further, the processing device 140 operating according to the program PRA ends the execution of the program PRA upon receiving the second operation content data indicating the end operation from the wearable terminal 30A.
  • The display control unit 140a and the determination unit 140b shown in FIG. 6 are software modules realized by operating a computer such as a CPU according to software such as a program.
  • The functions for which the display control unit 140a and the determination unit 140b are respectively responsible are as follows.
  • The display control unit 140a acquires the virtual object information and the position information by communicating with the management device 40 using the first communication device 100. Then, the display control unit 140a specifies the direction of the optical axis of the glass device 20 by communicating with the glass device 20 using the second communication device 110, and displays, on the glass device 20, images of the one or more virtual objects included in the field of view of the glass device 20. More specifically, the display control unit 140a generates the above-mentioned projection image for the range of the predetermined angle θ within the region R. The predetermined angle θ is an angle bisected by a line extending in the direction along the optical axis of the glass device 20 in the global coordinate system. Then, the display control unit 140a displays the projection image on the glass device 20 by transmitting image data representing the projection image to the glass device 20 using the second communication device 110.
  • The display control unit 140a sets the center of the display area of the glass device 20, that is, the center of the above-mentioned projection image, as the initial indicated position, and displays the virtual indicator VC on the glass device 20, superimposed on the above-mentioned projection image. Then, each time the display control unit 140a receives the first operation content data via the third communication device 120, the display control unit 140a converts the change in position and orientation indicated by the first operation content data, using the above-described transformation matrix, into a change in position and orientation within the field of view of the glass device 20, that is, a change in position and orientation within the display area of the glass device 20. The display control unit 140a then updates the indicated position of the virtual indicator VC by adding the change in position and orientation within the display area of the glass device 20 to the current indicated position.
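  • A minimal sketch of the update rule just described follows: a change in position and orientation in the global coordinate system is mapped, through a fixed transformation matrix, to a two-dimensional displacement in the display area, which is then added to the current indicated position. The matrix values, the linear mapping, and the clamping to the display area are assumptions made for illustration, not values taken from the disclosure.

```python
import numpy as np

# Hypothetical transformation matrix contained in the program PRA: it maps a
# 6-element change (dx, dy, dz, d_yaw, d_pitch, d_roll) in the global
# coordinate system to a (du, dv) change in display-area coordinates.
TRANSFORM = np.array([
    [800.0, 0.0, 0.0, 0.0, 0.0, 0.0],   # horizontal pixels per metre moved right
    [0.0, 0.0, -800.0, 0.0, 0.0, 0.0],  # vertical pixels per metre moved up
])

DISPLAY_W, DISPLAY_H = 1280, 720

def update_indicated_position(current_uv, global_delta):
    """Convert a global-coordinate change into a display-area change and add it
    to the current indicated position of the virtual indicator VC."""
    du, dv = TRANSFORM @ np.asarray(global_delta, dtype=float)
    u = min(max(current_uv[0] + du, 0.0), DISPLAY_W)
    v = min(max(current_uv[1] + dv, 0.0), DISPLAY_H)
    return (u, v)

# Initial indicated position: the centre of the display area (centre OP of G1).
indicated = (DISPLAY_W / 2, DISPLAY_H / 2)
# Wearable terminal moved 5 cm to the left -> indicator moves left on the display.
indicated = update_indicated_position(indicated, (-0.05, 0.0, 0.0, 0.0, 0.0, 0.0))
print(indicated)
```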
  • The determination unit 140b determines, as the operation target, the virtual object indicated by the virtual indicator VC at the time when the determination operation is performed on the wearable terminal 30A.
  • In other words, the virtual object determined as the operation target is the virtual object indicated by the virtual indicator VC at that time. Note that if there is no virtual object indicated by the virtual indicator VC, that is, if the indicated position of the virtual indicator VC does not lie within the image of any virtual object, the determination unit 140b does not determine an operation target.
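  • The behaviour of the determination unit 140b can be sketched as a simple hit test: at the moment the determination operation arrives, find the virtual object whose image contains the indicated position, and do nothing if there is none. The rectangle-based hit test below is an assumption made for the example; the disclosure does not prescribe how the object images are represented.

```python
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

@dataclass
class ProjectedObject:
    name: str
    bbox: Tuple[float, float, float, float]  # (left, top, right, bottom) in display coords

def determine_operation_target(indicated: Tuple[float, float],
                               objects: Sequence[ProjectedObject]) -> Optional[ProjectedObject]:
    """Return the virtual object whose image contains the indicated position of
    the virtual indicator VC, or None if no object is pointed to."""
    u, v = indicated
    for obj in objects:
        left, top, right, bottom = obj.bbox
        if left <= u <= right and top <= v <= bottom:
            return obj
    return None  # no operation target is determined

objects_in_g1 = [ProjectedObject("VOB1", (100, 200, 300, 400)),
                 ProjectedObject("VOB2", (500, 200, 700, 400))]
target = determine_operation_target((600, 360), objects_in_g1)
print(target.name if target else "no target")  # VOB2
```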
  • The processing device 140 operating according to the program PRA executes the display control method shown in FIG. 7. As shown in FIG. 7, this display control method includes the processes of step SA100 to step SA170. The contents of the processes of step SA100 to step SA170 are as follows.
  • In step SA100, the processing device 140 functions as the display control unit 140a and executes an acquisition process.
  • In the acquisition process, the processing device 140 acquires the virtual object information and the position information for each of the plurality of virtual objects to be placed within the spherical region R by communicating with the management device 40.
  • In step SA110, the processing device 140 functions as the display control unit 140a and executes a display control process.
  • In the display control process, the processing device 140 specifies the direction of the optical axis of the glass device 20 and displays, on the glass device 20, images of the one or more virtual objects included in the field of view of the glass device 20.
  • More specifically, the processing device 140 displays, on the glass device 20, a projection image of the range of the predetermined angle θ within the region R.
  • The predetermined angle θ is an angle bisected by a line extending in the direction along the optical axis of the glass device 20 in the global coordinate system.
  • Further, the processing device 140 sets the center of the projection image as the initial indicated position, and displays the virtual indicator VC on the glass device 20 superimposed on the projection image.
  • In step SA120, the processing device 140 determines whether a movement operation has been performed.
  • The movement operation refers to an operation that moves the position indicated by the virtual indicator VC.
  • In step SA120, when the processing device 140 receives the first operation content data from the wearable terminal 30A via the third communication device 120, it determines that a movement operation has been performed. If the determination result in step SA120 is "Yes", the processing device 140 executes the process of step SA130 and then executes the process of step SA120 again. On the other hand, if the determination result in step SA120 is "No", the processing device 140 executes the process of step SA140.
  • In step SA130, the processing device 140 functions as the display control unit 140a and executes a movement process.
  • In the movement process, the processing device 140 updates the indicated position of the virtual indicator VC according to the change in the position and orientation of the wearable terminal 30A indicated by the first operation content data.
  • In step SA140, the processing device 140 determines whether a determination operation has been performed.
  • The determination operation in this embodiment is pressing the second operator.
  • In step SA140, the processing device 140 determines that the determination operation has been performed when the second operation content data indicating that the second operator has been pressed is received from the wearable terminal 30A via the third communication device 120. If the determination result in step SA140 is "Yes", the processing device 140 executes the process of step SA120 again after executing the process of step SA150. If the determination result in step SA140 is "No", the processing device 140 executes the process of step SA160.
  • In step SA150, the processing device 140 functions as the determination unit 140b and executes a determination process.
  • In the determination process, the processing device 140 determines, as the operation target, the virtual object indicated by the virtual indicator VC at the time when the determination operation is performed on the wearable terminal 30A, and executes processing corresponding to the virtual object determined as the operation target. For example, when the virtual object VOB1 is determined as the operation target, the processing device 140 plays back the video associated with the virtual object VOB1.
  • In step SA160, the processing device 140 determines whether the field of view of the glass device 20 has changed. In step SA160, the processing device 140 determines that the field of view of the glass device 20 has changed when the direction of the optical axis of the glass device 20 in the global coordinate system has changed. If the determination result in step SA160 is "Yes", the processing device 140 executes the process of step SA110 again. If the determination result in step SA160 is "No", the processing device 140 executes the process of step SA170.
  • In step SA170, the processing device 140 determines whether an end operation has been performed.
  • The end operation in this embodiment is pressing the third operator.
  • In step SA170, when the processing device 140 receives the second operation content data indicating that the third operator has been pressed from the wearable terminal 30A via the third communication device 120, the processing device 140 determines that an end operation has been performed. If the determination result in step SA170 is "Yes", the processing device 140 ends this display control method. If the determination result in step SA170 is "No", the processing device 140 executes the process of step SA120 again.
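  • Read as code, the flow of steps SA100 to SA170 is a simple event loop. The sketch below is a schematic rendering under assumed helper objects and functions (receive_operation, acquire_objects, and so on); it is not taken from the disclosure and only mirrors the branching order of FIG. 7.

```python
def display_control_method(glass, wearable, management):
    """Schematic rendering of steps SA100 to SA170 of the first embodiment."""
    objects = management.acquire_objects()                     # SA100: acquisition process
    indicator = glass.show_projection_and_indicator(objects)   # SA110: display control process
    while True:
        event = wearable.receive_operation(timeout=0.02)       # first/second operation content data
        if event and event.kind == "move":                     # SA120 -> SA130: movement process
            indicator.move_by(event.display_delta)
            continue
        if event and event.kind == "determine":                # SA140 -> SA150: determination process
            target = indicator.hit_test(objects)
            if target is not None:
                target.execute()                               # e.g. play the video of VOB1
            continue
        if glass.field_of_view_changed():                      # SA160: redraw for the new view
            indicator = glass.show_projection_and_indicator(objects)  # back to SA110
            continue
        if event and event.kind == "end":                      # SA170: end operation
            break
```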
  • As described above, in this embodiment the aim operation is performed by moving the wearable terminal 30A, serving as the operation device, forward and backward, left and right, or up and down. Therefore, compared to a mode in which the aim operation is performed using the mobile device 10A as the operation device, it is possible to perform an accurate aim operation while reducing the frequency of erroneous operations.
  • Also in this embodiment, the virtual object pointed to by the virtual indicator VC at the time when the determination operation is performed by an input operation on the input device 310 is determined as the operation target. Therefore, the virtual object to be manipulated can be determined more quickly and accurately than in a case where "an action of staring at a virtual object for several seconds" is interpreted as a "determination operation."
  • FIG. 8 is a block diagram showing a configuration example of a display system 1B according to a second embodiment of the present disclosure.
  • In FIG. 8, the same components as in FIG. 1 are given the same reference numerals.
  • The configuration of the display system 1B differs from the configuration of the display system 1A in that a wearable terminal 30B is provided instead of the wearable terminal 30A.
  • The configuration of the wearable terminal 30B differs from that of the wearable terminal 30A in that a sound collection device 320 is provided instead of the input device 310.
  • The sound collection device 320 is, for example, a microphone, and collects the user's voice.
  • The wearable terminal 30B transmits audio data representing the waveform of the voice collected by the sound collection device 320 to the mobile device 10A as second operation content data.
  • In this embodiment, a first keyword is assigned to the start operation,
  • a second keyword is assigned to the determination operation,
  • and a third keyword is assigned to the end operation.
  • The user U performs each input operation by speaking the keyword assigned to that input operation.
  • The first keyword, the second keyword, and the third keyword may be arbitrarily set by the user U.
  • The second operation content data indicating the uttered voice of the first keyword indicates a start operation.
  • The second operation content data indicating the uttered voice of the second keyword indicates a determination operation.
  • The second operation content data indicating the uttered voice of the third keyword indicates an end operation. That is, this embodiment differs from the first embodiment in that the start operation, the determination operation, and the end operation are performed by voice input.
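  • In this second embodiment the dispatch keys on recognized keywords instead of operator presses. The following sketch assumes that speech recognition has already converted the audio data into text; the keyword strings and the recognize helper are illustrative assumptions only and are not defined by the disclosure.

```python
from typing import Callable, Optional

# Hypothetical user-set keywords for the three input operations.
KEYWORD_TO_OPERATION = {
    "start": "start_operation",           # first keyword
    "select": "determination_operation",  # second keyword
    "finish": "end_operation",            # third keyword
}

def operation_from_audio(audio_data: bytes,
                         recognize: Callable[[bytes], str]) -> Optional[str]:
    """Map second operation content data (an audio waveform) to an input
    operation by running it through a speech recognizer and looking the
    resulting text up in the keyword table."""
    text = recognize(audio_data).strip().lower()
    return KEYWORD_TO_OPERATION.get(text)

# Example with a stand-in recognizer.
fake_recognize = lambda _audio: "Select"
print(operation_from_audio(b"...pcm samples...", fake_recognize))  # determination_operation
```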
  • Also in this embodiment, the aim operation is performed by using the wearable terminal 30B as the operation device. Therefore, compared to a mode in which the aim operation is performed using the mobile device 10A as the operation device, it is possible to perform an accurate aim operation while reducing the frequency of erroneous operations. Also in this embodiment, the virtual object pointed to by the virtual indicator at the time when the determination operation is performed by voice input to the sound collection device 320 is determined as the operation target. Therefore, similarly to the first embodiment, it is possible to quickly and accurately determine the virtual object to be manipulated.
  • FIG. 9 is a block diagram showing a configuration example of a display system 1C according to a third embodiment of the present disclosure.
  • In FIG. 9, the same components as in FIG. 1 are given the same reference numerals.
  • The configuration of the display system 1C differs from that of the display system 1A in that a wearable terminal 30C is provided in place of the wearable terminal 30A, and a mobile device 10B is provided in place of the mobile device 10A.
  • FIG. 10 is a block diagram showing a configuration example of the mobile device 10B.
  • In FIG. 10, the same components as in FIG. 6 are given the same reference numerals.
  • The configuration of the mobile device 10B differs from the configuration of the mobile device 10A in that a program PRB is stored in the storage device 130 instead of the program PRA.
  • The processing device 140 reads the program PRB from the storage device 130 and starts executing the program PRB upon receiving the second operation content data indicating the start operation from the wearable terminal.
  • The program PRB differs from the program PRA in that the processing device 140 operating according to the program PRB functions as a display control unit 140c and the determination unit 140b.
  • The display control unit 140c is the same as the display control unit 140a in that it acquires the virtual object information and the position information from the management device 40, displays an image corresponding to the field of view of the glass device 20 on the glass device 20, and displays the virtual indicator VC on the glass device 20 with the center of the display area of the glass device 20 set as the initial indicated position.
  • However, the display control unit 140c differs from the display control unit 140a in that the indicated position of the virtual indicator VC is fixed at the initial indicated position.
  • The fact that the indicated position of the virtual indicator VC is fixed at the center of the display area of the glass device 20 means that the position in the region R indicated by the virtual indicator VC changes in response to a change in the field of view of the glass device 20 (a change in the orientation of the glass device 20).
  • Because the display control unit 140c changes the indicated position of the virtual indicator VC in the region R according to a change in the field of view of the glass device 20 (a change in the orientation of the glass device 20), display of the virtual indicator VC based on the line of sight of the user U wearing the glass device 20 on his or her head is realized.
  • The program PRB also differs from the program PRA in that it causes the processing device 140 to execute the display control method shown in the flowchart of FIG. 11. In FIG. 11, the same processes as those in FIG. 7 are given the same reference numerals.
  • The display control method according to the present embodiment differs from the display control method in the first embodiment in that it does not include the processes of step SA120 and step SA130, and in that step SA140 is executed following step SA110.
  • In this embodiment, the position in the region R indicated by the virtual indicator VC changes according to a change in the field of view of the glass device 20 (a change in the orientation of the glass device 20). Therefore, an action by the user U to change the field of view, such as moving his or her head (an action that changes the orientation of the glass device 20), also serves as the movement operation. Therefore, in this embodiment, the processes of step SA120 and step SA130 are unnecessary.
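  • In this embodiment the aim operation collapses into the redraw itself: the indicated position never moves within the display area, so only the projection has to be regenerated when the head, and hence the glass device 20, turns. A minimal sketch, using the same assumed helper objects as the earlier event-loop sketch, follows; it is not taken from the disclosure and only mirrors the branching order of FIG. 11.

```python
def display_control_method_prb(glass, wearable, management):
    """Schematic rendering of the flowchart of FIG. 11 (program PRB): the
    indicated position stays at the centre of the display area, so there is
    no movement process; head motion alone serves as the aim operation."""
    objects = management.acquire_objects()                     # SA100
    projection = glass.show_projection(objects)                # SA110
    centre = (glass.display_width / 2, glass.display_height / 2)
    glass.show_indicator_at(centre)                            # indicated position is fixed
    while True:
        event = wearable.receive_operation(timeout=0.02)
        if event and event.kind == "determine":                # SA140 -> SA150
            target = projection.hit_test(centre, objects)      # object under the display centre
            if target is not None:
                target.execute()
            continue
        if glass.field_of_view_changed():                      # SA160: re-project for new orientation
            projection = glass.show_projection(objects)        # back to SA110
            glass.show_indicator_at(centre)
            continue
        if event and event.kind == "end":                      # SA170
            break
```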
  • Suppose, for example, that the user U rotates his or her head from right to left so that the center of the display area of the glass device 20 overlaps with the image of the virtual object VOB1.
  • In this case, the field of view of the glass device 20 changes as shown in FIG. 12.
  • In FIG. 12, the fan-shaped OAB corresponds to the field of view of the glass device 20, as in FIG. 2.
  • In FIG. 12, the virtual object VOB1 is located at the center of the field of view.
  • In this embodiment, the indicated position of the virtual indicator VC is fixed at the center of the display area of the glass device 20. Therefore, the image G2 shown in FIG. 13 is displayed on the glass device 20.
  • As described above, in this embodiment the aim operation is realized by moving the head on which the glass device 20 is worn. Therefore, compared to a mode in which the aim operation is performed by using the mobile device 10B as the operation device, it is possible to perform an accurate aim operation while reducing the frequency of erroneous operations.
  • Also in this embodiment, the virtual object pointed to by the virtual indicator VC at the time when the determination operation is performed by an input operation on the input device 310 is determined as the operation target, as in the first embodiment. Therefore, the virtual object to be manipulated can be determined more quickly and accurately than in a case where "an action of staring at a virtual object for several seconds" is interpreted as a "determination operation."
  • Note that the display control unit 140c may update the position of the virtual object in the region R so that, even if the orientation of the glass device 20 changes while the determination operation continues, that is, while the second operator continues to be pressed, the display position on the glass device 20 of the virtual object determined as the operation target is fixed. According to this aspect, while the user U continues to perform the determination operation, the virtual object determined as the operation target appears to move in the same direction as the movement of the head, and drag and drop of the virtual object is thereby realized.
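  • The drag-and-drop variant described in the preceding paragraph can be sketched as follows: while the determination operation is held, the placement of the selected object in the region R is re-solved so that its display position stays where it was when the object was grabbed. The unproject helper and the surrounding objects are placeholder assumptions; the disclosure only states that the position in the region R is updated so that the display position is fixed.

```python
def drag_selected_object(glass, region, selected, grab_display_pos, is_determining):
    """While the determination operation continues (e.g. the second operator is
    held down), keep the selected virtual object at a fixed display position by
    moving it within the region R as the orientation of the glass device changes."""
    while is_determining():
        if glass.field_of_view_changed():
            # Placeholder: map the fixed display position back to a point in the
            # region R for the current orientation of the glass device 20.
            new_region_pos = glass.unproject(grab_display_pos, depth=selected.distance)
            region.move_object(selected, new_region_pos)
            glass.redraw(region)
    # On release, the object is dropped at its current position in the region R.
    return selected.position
```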
  • FIG. 14 is a block diagram showing a configuration example of a display system 1D according to a fourth embodiment of the present disclosure.
  • In FIG. 14, the same components as in FIG. 8 are given the same reference numerals.
  • The configuration of the display system 1D differs from that of the display system 1B in that a wearable terminal 30D is provided in place of the wearable terminal 30B, and a mobile device 10B is provided in place of the mobile device 10A.
  • The configuration of the wearable terminal 30D differs from that of the wearable terminal 30B in that it does not include the sensor 300.
  • Also in this embodiment, the aim operation is realized by moving the head on which the glass device 20 is worn. Therefore, compared to a mode in which the aim operation is performed by using the mobile device 10B as the operation device, it is possible to perform an accurate aim operation while reducing the frequency of erroneous operations. Also in this embodiment, the virtual object pointed to by the virtual indicator at the time when the determination operation is performed by voice input to the sound collection device 320 is determined as the operation target, as in the second embodiment. Therefore, the virtual object to be manipulated can be determined quickly and accurately.
  • Also in this embodiment, the display control unit 140c may update the position of the virtual object in the region R so that the display position, on the glass device 20, of the virtual object determined as the operation target is fixed.
  • E Modifications
  • The present disclosure is not limited to the embodiments illustrated above. Specific aspects of modification are as follows. Two or more aspects arbitrarily selected from the examples below may be combined.
  • E-1 Modification 1
  • The virtual indicator VC in each of the above embodiments is a line segment having an immovable starting point at a predetermined position within the display area of the glass device 20 and an end point serving as the indicated position.
  • However, the virtual indicator VC in the present disclosure may be a figure such as a block arrow displayed at the indicated position, as shown in FIG. 15, similar to a so-called mouse pointer.
  • FIG. 15 is a diagram showing an example of the image G2 that appears in the eyes of the user U when a block arrow figure is used as the virtual indicator VC.
  • Alternatively, the virtual indicator VC may be a figure surrounding one of a plurality of virtual buttons.
  • FIG. 16 is a diagram showing an example of the image G2 that appears in the eyes of the user U when a rectangle surrounding a virtual button is used as the virtual indicator VC.
  • In this case, the initial indicated position of the virtual indicator VC may be the position of any one of the plurality of virtual buttons included in the projection image.
  • The user U may arbitrarily set the display mode of the virtual indicator VC. Further, when the virtual objects arranged in the region R are hierarchical, the mobile device 10A or 10B may switch the display mode of the virtual indicator VC according to the hierarchy of the virtual objects displayed on the glass device 20. In the following, when there is no need to distinguish between the mobile device 10A and the mobile device 10B, they are referred to as the mobile device 10. For example, when virtual objects corresponding to major items such as videos or news are displayed on the glass device 20, the mobile device 10 displays the virtual indicator VC shown in FIG. 4 or FIG. 15 on the glass device 20.
  • When virtual objects corresponding to lower-level items are displayed on the glass device 20, the mobile device 10 displays the virtual indicator VC shown in FIG. 16 on the glass device 20.
  • E-2 Modification 2
  • The wearable terminal 30A in the first embodiment includes the sensor 300, which detects changes in the position and orientation of the sensor 300.
  • The mobile device 10A detects a change in the position and orientation of the wearable terminal 30A in the global coordinate system based on the output of the sensor 300, and updates the indicated position of the virtual indicator VC according to the change in the position and orientation of the wearable terminal 30A.
  • However, if the glass device 20 is equipped with a function of detecting the position of the user U's line of sight within the display area, the mobile device 10A may update the indicated position of the virtual indicator VC according to the position of the user U's line of sight detected by that function.
  • Even if the indicated position of the virtual indicator VC is updated according to the position of the user U's line of sight, as long as the determination operation is accepted by an input operation on the input device 310, the virtual object to be manipulated can be determined quickly and accurately, as in the first embodiment. Similarly, even if the second embodiment is modified so that the indicated position of the virtual indicator VC is updated according to the user's line of sight, as long as the determination operation is accepted by voice input, the virtual object to be manipulated can be determined quickly and accurately.
  • E-3 Modification 3
  • In the first embodiment, drag and drop of the virtual object determined as the operation target may be realized by moving the wearable terminal 30A or moving the line of sight while the determination operation continues, that is, while the second operator continues to be pressed.
  • Similarly, in the second embodiment, drag and drop of the virtual object determined as the operation target may be realized by moving the wearable terminal 30B or moving the line of sight while the user continues to utter the second keyword.
  • By notifying the management device 40 of the position of the virtual object in the three-dimensional space after the drag and drop, the position information stored in the management device 40, that is, the placement position of the virtual object in the region R, may be updated. Further, a plurality of users U may share one region R, and a change in the display position of a virtual object made by one user U may be reflected for all other users U.
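  • Sharing the result of a drag and drop could look like the following: the mobile device reports the new placement to the management device 40, which updates its stored position information and makes the change visible to the other users U sharing the region R. The endpoint path and payload below are invented for the example; the disclosure does not define a concrete protocol.

```python
import json
import urllib.request

def notify_new_placement(management_url: str, object_id: str, position_xyz) -> None:
    """Report the post-drag-and-drop position of a virtual object so that the
    management device 40 can update its position information for the region R.
    The '/objects/<id>/position' endpoint is a hypothetical example."""
    body = json.dumps({"position": list(position_xyz)}).encode("utf-8")
    req = urllib.request.Request(
        url=f"{management_url}/objects/{object_id}/position",
        data=body,
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()  # other users sharing region R would then receive the update from the server

# notify_new_placement("https://example.invalid/ar", "VOB1", (2.0, 1.0, 0.5))
```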
  • In each of the above embodiments, the program PRA is stored in the storage device 130 of the mobile device 10A, but the program PRA may also be manufactured or sold separately.
  • In this case, the program PRA may be provided to the purchaser, for example, by distributing a computer-readable recording medium such as a flash ROM on which the program PRA is written, or by downloading the program PRA via a telecommunications line.
  • Similarly, the program PRB may be manufactured or sold separately.
  • The display control unit 140a and the determination unit 140b in the first embodiment and the second embodiment are both software modules.
  • However, one or both of the display control unit 140a and the determination unit 140b may be a hardware module.
  • Examples of the hardware module include a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), and an FPGA (Field Programmable Gate Array). Even if one or both of the display control unit 140a and the determination unit 140b is a hardware module, the same effects as in the first embodiment or the second embodiment are achieved.
  • Similarly, one or both of the display control unit 140c and the determination unit 140b in the third embodiment and the fourth embodiment may be a hardware module.
  • In each of the above embodiments, the start operation, the determination operation, and the end operation are performed on the wearable terminal worn on the user U's body, but the start operation and the end operation may be performed on the mobile device 10.
  • That is, the input operation performed on the wearable terminal may be only the determination operation.
  • In each of the above embodiments, the storage device 130 is exemplified as a ROM, a RAM, and the like, but the storage device 130 may also be a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory device (for example, a card, a stick, or a key drive), a CD-ROM (Compact Disc-ROM), a register, a removable disk, a hard disk, a floppy disk, a magnetic strip, a database, a server, or any other suitable storage medium.
  • The information, signals, and the like described in the present disclosure may be represented using any of a variety of different technologies.
  • For example, data, instructions, commands, information, signals, bits, symbols, chips, and the like may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination of these.
  • Input and output information and the like may be stored in a specific location (for example, a memory) or may be managed using a management table. Input and output information and the like may be overwritten, updated, or additionally written. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
  • A determination may be made based on a value represented by one bit (0 or 1), may be made based on a truth value (Boolean: true or false), or may be made by a numerical comparison (for example, a comparison with a predetermined value).
  • Each function illustrated in FIG. 6 or FIG. 10 is realized by an arbitrary combination of at least one of hardware and software.
  • The method for realizing each functional block is not particularly limited. That is, each functional block may be realized using one device that is physically or logically coupled, or may be realized by directly or indirectly connecting (for example, by wire or wirelessly) two or more physically or logically separated devices and using these plural devices.
  • A functional block may also be realized by combining software with the one device or the plurality of devices.
  • The programs exemplified in the embodiments described above, whether called software, firmware, middleware, microcode, hardware description language, or any other name, should be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, threads of execution, procedures, functions, and the like.
  • Software, instructions, information, and the like may be transmitted and received via a transmission medium.
  • For example, when software is transmitted from a website, a server, or another remote source using wired technology (coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), etc.) and/or wireless technology (infrared, microwave, etc.), these wired and/or wireless technologies are included within the definition of a transmission medium.
  • The information, parameters, and the like described in the present disclosure may be expressed using absolute values, may be expressed using relative values from a predetermined value, or may be expressed using other corresponding information.
  • The mobile device includes a mobile station (MS).
  • A mobile station may also be referred to by a person skilled in the art as a subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term. Further, in the present disclosure, terms such as "mobile station," "user terminal," "user equipment (UE)," and "terminal" may be used interchangeably.
  • The terms "connected" and "coupled," and any variations thereof, refer to any direct or indirect connection or coupling between two or more elements, and may include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other.
  • The coupling or connection between elements may be physical, logical, or a combination thereof.
  • For example, "connection" may be replaced with "access."
  • As used in the present disclosure, two elements may be considered to be "connected" or "coupled" to each other by using one or more electrical wires, cables, and/or printed electrical connections, and, as some non-limiting and non-exhaustive examples, by using electromagnetic energy having wavelengths in the radio frequency domain, the microwave domain, and the optical (both visible and invisible) domain.
  • The terms "determining" and "deciding" used in the present disclosure may encompass a wide variety of operations.
  • "Determining" and "deciding" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (for example, searching in a table, a database, or another data structure), or ascertaining as having been "determined" or "decided."
  • "Determining" and "deciding" may also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in a memory) as having been "determined" or "decided."
  • "Determining" and "deciding" may also include regarding resolving, selecting, choosing, establishing, comparing, and the like as having been "determined" or "decided."
  • In other words, "determining" and "deciding" may include regarding some action as having been "determined" or "decided."
  • "Determining (deciding)" may also be read as "assuming," "expecting," "considering," and the like.
  • As described above, a display control device according to the present disclosure includes a display control unit and a determination unit.
  • The display control unit displays, on a glass device, one or more virtual objects included in the field of view of the glass device in a three-dimensional space, and displays, on the glass device, a virtual indicator for selecting any one of the one or more virtual objects, based on the line of sight of a user of the glass device.
  • The determination unit determines, as an operation target, the virtual object pointed to by the virtual indicator at the time when an input operation is performed on a wearable terminal worn on the user's body.
  • According to this display control device, the virtual object pointed to by the virtual indicator at the time when the input operation is performed on the wearable terminal is determined as the operation target. Therefore, the virtual object to be manipulated can be determined more quickly and accurately than in a case where "an action of staring at a virtual object for several seconds" is interpreted as a "determination operation."
  • The display control unit in the display control device according to a second aspect (an example of the first aspect) of the present disclosure may change the position indicated by the virtual indicator according to a change in the orientation of the glass device.
  • According to this aspect, the user wearing the glass device on his or her head can update the indicated position of the virtual indicator by changing the direction of the head.
  • In a further aspect, the display control unit may update the position in the three-dimensional space of the virtual object determined as the operation target, according to a change in the orientation of the glass device, so that the display position on the glass device of the virtual object determined as the operation target is fixed while the input operation continues.
  • According to this aspect, a user wearing the glass device on his or her head can drag and drop the virtual object determined as the operation target by moving his or her head while continuing the input operation.
  • In another aspect, the wearable terminal includes an input device that accepts input by the user's fingers, and the determination unit may determine whether the input operation has been performed based on data indicating an input to the input device.
  • According to this aspect, the user can determine the virtual object to be operated by input with his or her fingers.
  • In yet another aspect, the wearable terminal includes a sound collection device, and the determination unit may determine whether the input operation has been performed based on data indicating a voice input to the sound collection device.
  • According to this aspect, the user can determine the virtual object to be operated by voice input.
  • A display system according to the present disclosure includes a glass device, a display control device that displays, on the glass device, one or more virtual objects included in the field of view of the glass device in a three-dimensional space, and a wearable terminal that is connected to the display control device and worn on the body of a user of the glass device.
  • The display control device displays, on the glass device, a virtual indicator for selecting any one of the one or more virtual objects, based on the user's line of sight. Further, the display control device determines, as an operation target, the virtual object pointed to by the virtual indicator at the time when an input operation on the wearable terminal is performed.
  • According to this display system, the virtual object to be manipulated can be determined quickly and accurately.
  • 1A, 1B, 1C, 1D...display system; 10A, 10B...mobile device; 100...first communication device; 110...second communication device; 120...third communication device; 130...storage device; 140...processing device; 140a...display control unit; 140b...determination unit; 150...bus; PRA, PRB...program; 20...glass device; 30A, 30B, 30C, 30D...wearable terminal; 300...sensor; 310...input device; 320...sound collection device; 40...management device; NW...communication line.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This display control apparatus comprises a display control unit and a determination unit. The display control unit displays, on a glass device, one or more virtual objects included in the field of view of the glass device in a three-dimensional space, and displays, on the glass device, a virtual indicator for selecting any one of the virtual object(s), based on the line of sight of a user of the glass device. The determination unit determines, as an operation target, a virtual object indicated by the virtual indicator at the time at which an input operation is performed on a wearable terminal attached to the user's body.
PCT/JP2023/028418 2022-09-02 2023-08-03 Appareil de commande d'affichage WO2024048195A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-139770 2022-09-02
JP2022139770 2022-09-02

Publications (1)

Publication Number Publication Date
WO2024048195A1 true WO2024048195A1 (fr) 2024-03-07

Family

ID=90099184

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/028418 WO2024048195A1 (fr) 2022-09-02 2023-08-03 Appareil de commande d'affichage

Country Status (1)

Country Link
WO (1) WO2024048195A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018061667A (ja) * 2016-10-12 2018-04-19 株式会社カプコン ゲームプログラム及びゲーム装置
WO2020017261A1 (fr) * 2018-07-20 2020-01-23 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP2021060627A (ja) * 2018-02-07 2021-04-15 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
JP2022020686A (ja) * 2017-08-15 2022-02-01 株式会社コロプラ 情報処理方法、プログラム、およびコンピュータ
JP7126583B1 (ja) * 2021-03-31 2022-08-26 株式会社コーエーテクモゲームス ゲームプログラム、記録媒体、ゲーム処理方法


Similar Documents

Publication Publication Date Title
US11017257B2 (en) Information processing device, information processing method, and program
US20200335065A1 (en) Information processing device
US10621766B2 (en) Character input method and device using a background image portion as a control region
WO2019130991A1 (fr) Dispositif de traitement d'informations
US11763489B2 (en) Body and hand correlation method and apparatus, device, and storage medium
WO2016147498A1 (fr) Dispositif et procédé de traitement d'informations et programme
WO2024048195A1 (fr) Appareil de commande d'affichage
US11836978B2 (en) Related information output device
WO2020039703A1 (fr) Dispositif d'entrée
JP6999822B2 (ja) 端末装置および端末装置の制御方法
JP7365501B2 (ja) 情報処理装置
WO2023210195A1 (fr) Système d'identification
JP7344307B2 (ja) 情報処理装置
WO2022201739A1 (fr) Dispositif de commande d'affichage
US20240103625A1 (en) Interaction method and apparatus, electronic device, storage medium, and computer program product
WO2022201936A1 (fr) Dispositif de commande d'affichage
WO2023026798A1 (fr) Dispositif de commande d'affichage
WO2023223750A1 (fr) Dispositif d'affichage
WO2022091760A1 (fr) Dispositif d'exploitation
JP2024075800A (ja) 表示制御装置
WO2023199627A1 (fr) Dispositif de gestion d'image de guide
KR102539045B1 (ko) 착용형 증강현실 장치를 위한 대시보드 제어 장치 및 대시보드 제어 방법
WO2023218751A1 (fr) Dispositif de commande d'affichage
US11928264B2 (en) Fixed user interface navigation
WO2022190735A1 (fr) Dispositif de commande d'affichage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23859964

Country of ref document: EP

Kind code of ref document: A1