WO2016170717A1 - Wearable display, information processing system, and control method - Google Patents


Info

Publication number
WO2016170717A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
user
unit
display object
object candidate
Prior art date
Application number
PCT/JP2016/000826
Other languages
French (fr)
Japanese (ja)
Inventor
Hirotaka Ishikawa
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2016170717A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/64: Constructional details of receivers, e.g. cabinets or dust covers

Definitions

  • the present technology relates to a wearable display capable of presenting, by tactile sensation, information related to an image that can be displayed in a display field of view, an information processing system including the same, and a control method.
  • Patent Document 1 describes a see-through type head-mounted display (HMD: Head Mounted Display) capable of superimposing and displaying an image related to an object existing in a real space.
  • AR technology can be applied to, for example, the see-through type head mounted display described above, or to an image obtained by imaging the real space.
  • the display field of view that the user can visually recognize is limited, and the area within the display field of view that the user can watch closely is also limited. Patent Document 2 therefore describes an augmented reality system in which, when an image or the like included in mail employing AR technology (AR mail) is displayed superimposed on the corresponding physical object in the real space captured by the camera function of the receiving device, the user is guided toward the physical object by a vibrator, audio output, or the like.
  • however, Patent Document 2 does not disclose a specific method for guiding the user toward the physical object using a vibrator.
  • an object of the present technology is to provide a wearable display that allows the user to grasp, by tactile feedback, information regarding an image to which AR technology is applied and the position of an object in the real space related to the image, as well as an information processing system using the same and a control method.
  • a wearable display includes a display unit, a mounting unit, and a tactile sense presentation unit.
  • the display unit presents an AR (Augmented Reality) display in the user's field of view.
  • the attachment unit is configured to be attachable to the user.
  • the tactile sense presentation unit is provided on the mounting unit.
  • when there is a predetermined change in an AR display object candidate that can be presented as the AR display object, the tactile sense presentation unit presents, by tactile sensation using a tactile feedback pattern, information related to the relative position of the AR display object candidate viewed from the user.
  • information related to the relative position of the AR display candidate viewed from the user can be presented to the user by a tactile sense using the tactile feedback pattern.
  • the “AR display object” includes at least an image (AR object) to which the AR technology is applied, and includes an object in the real space related to the image in addition to the AR object. Also good.
  • the “AR display object candidate” is prepared so as to be displayed as an AR display object in the field of view.
  • the haptic feedback pattern may be generated based on information related to a relative position of the AR display object candidate viewed from the user.
  • the information regarding the relative position of the AR display object candidate viewed from the user may include information related to at least one of the direction of the AR display object candidate with respect to the center direction of the user's visual field and the distance between the AR display object candidate and the user.
  • thereby, the user can grasp the direction of the AR display object candidate with reference to the center direction of the visual field and the distance between the AR display object candidate and the user.
  • the tactile sense presentation unit may include a plurality of vibrators arranged at different positions of the mounting unit, and each of the plurality of vibrators may vibrate according to a vibration pattern defined based on the information on the relative position and the arrangement of the plurality of vibrators.
  • the relative position of the AR display object candidate viewed from the user and the vibration pattern of the vibrator can be linked, and the user can intuitively grasp the relative position.
  • the mounting portion may include a first mounting member mounted on the right side of the user and a second mounting member mounted on the left side of the user.
  • the plurality of vibrators may include a first vibrator disposed on the first mounting member and a second vibrator disposed on the second mounting member. The first vibrator vibrates with a stronger intensity than the second vibrator when the direction of the AR display object candidate viewed from the user corresponds to the right side of the user, and with a weaker intensity than the second vibrator when the direction corresponds to the left side of the user.
  • the direction of the AR display object candidate viewed from the user can be expressed by the vibration intensity of the vibrator, and the direction can be more intuitively understood by the user.
  • the plurality of vibrators may include a vibrator group arranged in a predetermined direction.
  • the plurality of vibrators included in the vibrator group may vibrate in order along the predetermined direction when the direction of the AR display object candidate viewed from the user corresponds to the predetermined direction.
  • the direction of the AR display object candidate based on the user can be expressed in the vibration order of the vibrator, and the direction can be more intuitively understood by the user.
  • the mounting part may include a modern part (temple tip) worn on the user's ear, and a support part connected to the modern part and supporting the display part.
  • the tactile sense presentation unit may be disposed in the modern part.
  • since the tactile sense presentation unit is disposed in the modern part, away from the display unit, the influence on the display unit due to the driving of the tactile sense presentation unit can be suppressed.
  • the modern part may have higher rigidity than the support part.
  • the tactile sense presentation unit may present, by tactile sensation, information regarding the relative position viewed from the user for a newly acquired AR display object candidate.
  • the tactile sense presentation unit may present, by tactile sensation, information regarding the relative position viewed from the user for an AR display object candidate whose display mode has changed.
  • the tactile sense presentation unit may present, by tactile sensation, information regarding the relative position viewed from the user for an AR display object candidate in which a predetermined change has occurred in the distance to the user.
  • the AR display object candidate may be associated with a position outside the field of view.
  • the information on the relative position can be grasped by tactile feedback.
  • the display unit may present, in the visual field, an auxiliary display that supplements the information regarding the relative position of the AR display object viewed from the user presented by the tactile feedback pattern.
  • An information processing system includes a wearable display and a control unit.
  • the wearable display includes a display unit, a mounting unit, and a tactile sense presentation unit.
  • the display unit presents an AR (Augmented Reality) display in the user's field of view.
  • the attachment unit is configured to be attachable to the user.
  • the tactile sense presentation unit is provided on the mounting unit.
  • the tactile sense presenting unit presents information related to the relative position of the AR display object candidate viewed from the user to the user by a tactile feedback pattern.
  • the control unit generates the tactile feedback pattern and outputs the tactile feedback pattern to the tactile sense presentation unit when there is a predetermined change in an AR display object candidate that can be presented as the AR display object.
  • a control method according to an embodiment of the present technology is a method of controlling a wearable display that includes a display unit that presents an AR display object in a user's field of view, a mounting unit that can be mounted on the user, and a tactile sense presentation unit provided on the mounting unit. When there is a predetermined change in an AR display object candidate that can be presented as the AR display object, information related to the relative position of the AR display object candidate viewed from the user is presented by tactile sensation using a tactile feedback pattern.
  • as described above, according to the present technology, it is possible to allow the user to grasp, by tactile feedback, information regarding an image to which AR technology is applied and the position of an object in the real space related to that image.
  • the effects described here are not necessarily limited, and may be any of the effects described in the present disclosure.
  • FIG. 1 is a schematic diagram illustrating the configuration of an AR system (information processing system) according to a first embodiment of the present technology. FIG. 2 is a diagram showing an example of the visual field presented by the HMD (wearable display) of the AR system. FIG. 3 is a block diagram showing the configuration of the AR system. FIG. 4 is a perspective view showing the appearance of the HMD. FIG. 5 is a schematic side view of the HMD. FIG. 6 is a block diagram showing the functional configuration of the control unit of the AR system. FIG. 7 is a schematic diagram showing the cylindrical coordinates employed in the AR system.
  • FIG. 10 is a schematic diagram illustrating a generation example of a haptic feedback pattern by the control unit according to Modification 1-1 of the embodiment.
  • FIG. 10 is a schematic diagram illustrating an example of generation of a haptic feedback pattern by the control unit according to Modification 1-2 of the embodiment. Also shown are a schematic side view of an HMD according to Modification 1-3 of the embodiment and a schematic top view of an HMD according to Modification 1-4 of the embodiment.
  • FIG. 1 is a schematic diagram illustrating a configuration of an AR system (information processing system) according to the first embodiment of the present technology.
  • the AR system 100 includes a wearable display 10, a control unit 20 that controls the wearable display 10, a portable information terminal 30, and an AR server 40.
  • the wearable display 10 is a head mounted display (HMD) in the present embodiment, and is hereinafter referred to as an HMD 10.
  • the HMD 10 in FIG. 1 is shown schematically.
  • the AR server 40 is a server device on the Internet 50.
  • the AR server 40 stores information related to the AR object.
  • the portable information terminal 30 is typically a smartphone, but may be configured by an information processing apparatus such as a mobile phone, a tablet terminal, a personal computer (PC), a tablet PC, or a PDA (Personal Digital Assistant).
  • the portable information terminal 30 can acquire the current position of the user by a GPS (Global Positioning System) function.
  • the portable information terminal 30 is connected to the AR server 40 via the Internet 50, and can acquire information related to the AR display processing and the like from the AR server 40.
  • the portable information terminal 30 is connected to the control unit 20 by a short-range wireless communication system such as Bluetooth (registered trademark), and can transmit information relating to the AR display processing and the like, as well as information on the user's current position, to the control unit 20.
  • the HMD 10 is configured as a glasses-shaped see-through display.
  • the control unit 20 is a device for controlling the HMD 10 and controls the operation of the HMD 10 based on an input operation by the user.
  • the control unit 20 is connected to the HMD 10 via a cable corresponding to a predetermined standard, executes processing based on information acquired from the portable information terminal 30, and outputs a processing result to the HMD 10.
  • the HMD 10 can provide, to the user wearing it via the see-through display, an AR display object in which an image associated with an object existing in the real space is superimposed on that object.
  • an image related to an object is referred to as an AR object or simply an object.
  • FIG. 2 is a diagram illustrating an example of a visual field presented by the HMD 10.
  • the AR display object S includes an object A existing in the real space and an object B related thereto.
  • the object A is determined according to the type of application program that executes processing in the AR system 100 (control unit 20).
  • the object A may be, for example, a building, a person, a tourist spot, a dangerous spot, or the like that the user may want to watch.
  • the object B is an image that displays information related to the object A, and may be an image including characters, patterns, or an animation image.
  • the object may be a two-dimensional image or a three-dimensional image.
  • the shape of the object may be a rectangle, a circle, or other geometric shape, and can be set as appropriate depending on the type of the object.
  • a user wearing the HMD 10 can acquire information related to the object A using the object B.
  • AR display object candidates are prepared at predetermined positions in a region other than the visual field V. These AR display object candidates can be displayed when the visual field changes due to the movement of the user or a change in the posture of the display unit 11.
  • FIG. 3 is a block diagram showing the configuration of the AR system 100. Hereinafter, each element of the AR system 100 will be described with reference to FIG.
  • the AR server 40 includes a CPU 401, a memory 402, and a network communication unit 403. Although not shown, the AR server 40 may have a configuration such as an input device, a display device, and a speaker as necessary.
  • the CPU 401 controls the overall operation of the AR server 40.
  • the memory 402 includes a ROM (Read Only Memory), a RAM (Random Access Memory), and a nonvolatile memory such as an HDD (Hard Disk Drive) or a flash memory (SSD: Solid State Drive), and stores programs for executing control by the CPU 401, various parameters, and other necessary data.
  • the network communication unit 403 communicates with the portable information terminal 30 via the Internet 50.
  • the communication method is not particularly limited, and may be wired communication using a NIC (Network Interface Card) for Ethernet (registered trademark), wireless LAN communication (IEEE 802.11 or the like) such as WiFi (Wireless Fidelity), or wireless communication using a 3G or 4G network for mobile communication.
  • the memory 402 also holds an AR database 404.
  • the AR database 404 stores information such as object image information, target object attributes, target object position, and object position for each AR display object candidate.
  • the position information of the object is typically information on the absolute position (latitude, longitude, etc.) acquired from the portable information terminal 30, but it may also be expressed, for example, as coordinates on a cylindrical coordinate system centered on the user wearing the HMD 10.
  • information on new AR display objects is additionally registered as appropriate by the portable information terminal 30, other portable information terminals, information processing apparatuses, and the like connected to the AR server 40 via the Internet 50.
  • the portable information terminal 30 includes a CPU 301, a memory 302, a network communication unit 303, a short-range communication unit 304, a GPS communication unit 305, a display unit 306 equipped with a touch panel, and an internal power source 307.
  • the CPU 301 controls the operation of the mobile information terminal 30 as a whole.
  • the memory 302 includes a ROM, a RAM, a non-volatile memory, and the like.
  • the network communication unit 303 communicates with the AR server 40 or the like using a wireless LAN (IEEE802.11 or the like) such as WiFi (Wireless Fidelity) or a 3G or 4G network for mobile communication.
  • the portable information terminal 30 downloads information on the AR display object to be transmitted to the control unit 20 from the AR server 40 via the network communication unit 303 and stores it in the memory 302.
  • the short-range communication unit 304 communicates with the control unit 20 and other portable information terminals using a short-range communication system such as Bluetooth (registered trademark) or infrared communication.
  • the GPS communication unit 305 acquires the current position of the user carrying the portable information terminal 30 by receiving a signal from a GPS satellite.
  • the display unit 306 includes, for example, an LCD (Liquid Crystal Display) or an OELD (Organic ElectroLuminescence Display), and displays various menus, GUIs of applications, and the like.
  • the display unit 306 is equipped with a touch panel and can accept a user's touch operation.
  • the internal power supply 307 supplies power necessary for driving the portable information terminal 30.
  • the control unit 20 includes a CPU 201, a memory 202, a communication unit 203, an input operation unit 204, and an internal power source 205.
  • the CPU 201 controls the operation of the entire control unit 20.
  • the memory 202 includes a ROM, a RAM, and the like, and stores a program for executing control of the control unit 20 by the CPU 201, various parameters, information on an AR display object, and other necessary data.
  • the communication unit 203 constitutes an interface for short-range communication with the portable information terminal 30.
  • the input operation unit 204 is for controlling an image displayed on the HMD 10 by a user operation.
  • the input operation unit 204 may be configured with a mechanical switch or a touch sensor.
  • the internal power supply 205 supplies power necessary for driving the HMD 10.
  • FIG. 4 is a perspective view showing the appearance of the HMD 10, and FIG. 5 is a schematic side view of the HMD 10.
  • the HMD 10 includes a display unit 11, a mounting unit 12, a tactile sense presentation unit 13, and a detection unit 14.
  • the display unit 11 presents an AR display object in the user's field of view.
  • the AR display object includes an AR object and an object in the real space related to the AR object.
  • the display unit 11 includes first and second display surfaces (display surfaces) 111R and 111L, and first and second image generation units (image generation units) 112R and 112L.
  • the first and second display surfaces 111R and 111L are formed of transparent optical elements that can provide the real space (external field of view) to the right eye and left eye of the user U, respectively.
  • the first and second image generation units 112R and 112L are configured to be able to generate images to be presented to the user U via the first and second display surfaces 111R and 111L, respectively.
  • the display unit 11 configured as described above is configured to be able to provide a user with a visual field in which a predetermined image (or virtual image) is superimposed on the real space via the display surfaces 111R and 111L.
  • the display unit 11 may further include a frame 113 that supports the first and second display surfaces 111R and 111L and the first and second image generation units 112R and 112L.
  • the frame 113 is formed of a lightweight material with relatively high rigidity, for example, a metal material such as Mg (magnesium) or Al (aluminum).
  • thereby, the relative arrangement of the first and second display surfaces 111R and 111L with respect to the first and second image generation units 112R and 112L can be fixed, and a clear image can be provided.
  • the display unit 11 is supported by the mounting unit 12.
  • the mounting unit 12 is configured to be mountable by a user.
  • the mounting unit 12 supports the display surfaces 111R and 111L and the image generation units 112R and 112L, and is mounted on the user's head so that the display surfaces 111L and 111R face the right and left eyes of the user, respectively.
  • the mounting unit 12 includes a first mounting member 121R mounted on the right side of the user, a second mounting member 121L mounted on the left side of the user, and a support unit 122.
  • the first and second mounting members (mounting members) 121R and 121L are mounted on the user's temporal region and are configured in the shape of temples of glasses.
  • the support portion 122 is configured along the upper portions of the display surfaces 111R and 111L, and connects the first and second mounting members 121R and 121L.
  • the support part 122 supports the display part 11 by being connected to the center part of the frame 113.
  • the first and second mounting members 121R and 121L include first and second modern portions (modern portions) 123R and 123L formed in a region extending from the tip portion to the central portion, and first and second connection portions 124R and 124L that connect the first and second modern portions 123R and 123L to the support portion 122.
  • the first and second modern parts (modern parts) 123R and 123L have a shape that can be locked to, for example, the upper part of the user's auricle, and are worn on the user's ears. Further, the first and second modern portions 123R and 123L may have covers 123Ra and 123La formed of a highly elastic material such as silicone resin or rubber resin, as shown in the figures. Thereby, the burden on the user can be reduced.
  • the support portion 122 and the first and second connection portions 124R and 124L are formed of a material such as a relatively elastic resin or metal as a whole.
  • on the other hand, the first and second modern parts 123R and 123L may have higher rigidity than the support part 122.
  • the rigidity may be adjusted by the material or by the shape. This makes it difficult for the driving of the tactile sense presentation unit 13, described later, to be transmitted to the display unit 11.
  • the connection portions 124R and 124L may have the same rigidity as the modern portions 123R and 123L, or may have the same rigidity as the support portion 122.
  • the tactile sense providing unit 13 is provided in the mounting unit 12 and presents the user with information related to the relative position of the AR display object candidate viewed from the user by a tactile feedback pattern.
  • the tactile sense presentation unit 13 includes a plurality of vibrators 131 arranged at different positions on the mounting unit 12.
  • the plurality of vibrators 131 include a first vibrator 131R disposed on the first mounting member 121R and a second vibrator 131L disposed on the second mounting member 121L.
  • the first and second vibrators (vibrators) 131R and 131L are disposed in the first and second modern portions 123R and 123L, respectively.
  • Each vibrator 131R, 131L is constituted by a vibration motor in this embodiment.
  • the vibration motor generates vibration when an unbalanced weight attached to the rotation shaft of the motor rotates around the rotation shaft.
  • the vibrators 131R and 131L are arranged so that the extending direction of each rotation shaft matches the extending direction of the mounting members 121R and 121L, and so that the weights rotate in mutually different directions around the rotation shafts. Thereby, even when the vibrators 131R and 131L vibrate, the balance between the first and second mounting members 121R and 121L can be easily maintained, and the influence of vibration on the display unit 11 can be suppressed.
  • when there is a predetermined change in an AR display object candidate, the tactile sense presentation unit 13 configured as described above presents information on the relative position of the AR display object candidate viewed from the user by tactile sensation using a tactile feedback pattern. Details will be described later.
  • the detection unit 14 can detect a change in posture of the display unit 11.
  • the detection unit 14 is configured to detect posture changes around the X, Y, and Z axes, respectively.
  • the detection unit 14 can be configured by a motion sensor such as an angular velocity sensor or an acceleration sensor, or a combination thereof.
  • the detection unit 14 may be configured by a sensor unit in which each of the angular velocity sensor and the acceleration sensor is arranged in the three-axis directions, or the sensor to be used may be different depending on each axis.
  • for example, an integrated value of the output of the angular velocity sensor can be used to obtain the posture change of the display unit 11, including the direction and the amount of the change.
  • a geomagnetic sensor may be employed for detecting the orientation of the display unit 11 around the vertical axis (Z axis).
  • the geomagnetic sensor and the motion sensor may be combined. Thereby, it is possible to detect a change in orientation or posture with high accuracy.
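  • As an illustration, a minimal sketch of this posture tracking, assuming 3-axis angular velocity in degrees per second and an optional geomagnetic yaw correction via a complementary filter (the function names and the filter constant are illustrative, not from the patent):

```python
import numpy as np

def update_posture(prev_angles, gyro_rates, dt, mag_yaw=None, alpha=0.98):
    """Integrate 3-axis angular velocity [deg/s] over dt [s] to track the
    posture (roll, pitch, yaw) of the display unit 11 in degrees."""
    angles = prev_angles + gyro_rates * dt  # integrated value: direction and amount of change
    if mag_yaw is not None:
        # Correct gyro drift around the vertical (Z) axis with the absolute
        # azimuth from a geomagnetic sensor (complementary filter).
        angles[2] = alpha * angles[2] + (1.0 - alpha) * mag_yaw
    return angles
```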
  • the detection unit 14 is disposed at an appropriate position on the display unit 11.
  • the position of the detection unit 14 is not particularly limited.
  • the detection unit 14 may be arranged in one of the image generation units 112R and 112L, or may be arranged in a part of the mounting unit 12.
  • the operation of the HMD 10 configured as described above is controlled by the control unit 20 executing predetermined processing.
  • the functional configuration of the control unit 20 will be described.
  • FIG. 6 is a block diagram showing a functional configuration of the control unit 20.
  • the control unit 20 includes an AR display processing unit 21 that executes an AR display process and a haptic feedback processing unit 22 that executes a haptic feedback process.
  • the AR display processing unit 21 includes a visual field setting unit 211, an AR information management unit 212, and a display control unit 213.
  • the visual field setting unit 211 sets a visual field range based on the attitude of the display unit 11 calculated from the detection result of the detection unit 14.
  • the visual field setting unit 211 is executed by the CPU 201 of the control unit 20.
  • virtual cylindrical coordinates C0 surrounding the user U with the vertical axis Az as the center are used.
  • FIG. 7 is a schematic diagram illustrating the cylindrical coordinates C0 and the field of view V.
  • the cylindrical coordinate C0 is a coordinate system that defines a position on a virtual circumferential surface arranged at a distance (radius) R from the vertical axis Az.
  • the user U (display unit 11) is arranged on the vertical axis Az.
  • the cylindrical coordinates C0 have a coordinate axis (θ) in the circumferential direction representing an angle around the vertical axis with the north direction as 0°, and a coordinate axis (h) in the height direction representing the height with reference to the horizontal line of sight Lh of the user U. The coordinate axis (θ) takes the eastward direction as positive, and the coordinate axis (h) takes the depression angle as positive and the elevation angle as negative.
  • the radius R and height H of the cylindrical coordinates C0 can be arbitrarily set.
  • the position of the user U that defines the vertical axis Az is defined by the position of the user U acquired by the portable information terminal 30.
  • the visual field V is arranged on a virtual peripheral surface in which the cylindrical coordinates C0 are set.
  • the visual field setting unit 211 calculates the posture change of the display unit 11 based on the output of the detection unit 14, and determines which region on the cylindrical coordinate C0 the user's U visual field V belongs to.
  • the region to which the visual field V belongs in the cylindrical coordinate C0 is defined by the ranges of ⁇ and h.
  • the visual field V moves on the cylindrical coordinates C0 due to the change in the posture of the user U, and the moving direction and moving amount are calculated based on the output of the detection unit 14.
  • FIG. 8 is a development view of the cylindrical coordinates C0 indicating the visual field V on the cylindrical coordinates C0.
  • the symbol Oc in the figure indicates the origin of the cylindrical coordinates C0.
  • the visual field V has a substantially rectangular shape.
  • the circumferential range of the visual field V is expressed by the following formula (3):
θv1 ≤ θv ≤ θv2 … (3)
  • the range of the visual field V in the height direction is expressed by the following formula (4):
hv1 ≤ hv ≤ hv2 … (4)
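  • As an illustration, a minimal sketch of the range test implied by formulas (3) and (4), assuming angles in degrees normalized to [0, 360) and allowing the visual field to span the 0°/360° seam (names are illustrative):

```python
def in_visual_field(theta, h, theta_v1, theta_v2, h_v1, h_v2):
    """True if a point (theta, h) on the cylindrical coordinates C0 lies
    inside the visual field V, per formulas (3) and (4)."""
    if theta_v1 <= theta_v2:
        in_theta = theta_v1 <= theta <= theta_v2           # formula (3)
    else:
        in_theta = theta >= theta_v1 or theta <= theta_v2  # V spans the seam
    return in_theta and (h_v1 <= h <= h_v2)                # formula (4)
```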
  • the AR information management unit 212 acquires information about a predetermined AR display object candidate from the portable information terminal 30 and manages the acquired information about the AR display object candidate.
  • the AR information management unit 212 is executed by the CPU 201 and the memory 202 of the control unit 20.
  • the AR display object candidate information includes, for example, position information of the target object, image information of the object, and position information of the object.
  • the AR display object candidate refers to a set of an object that can be arranged on the cylindrical coordinates C0 and an object in the real space related to that object. For example, the set of object A1 and object B1 is defined as AR display object candidate S1, the set of object A2 and object B2 as AR display object candidate S2, the set of object A3 and object B3 as AR display object candidate S3, and the set of object A4 and object B4 as AR display object candidate S4.
  • among the AR display object candidates, those displayed in the visual field V are referred to as "AR display objects".
  • the objects A1 to A4 exist in the real space, and typically exist at positions farther from the user U than the distance R.
  • the position information of the objects A1 to A4 may include information on the distance from the user U to the objects A1 to A4, or the positions (latitude, longitude, etc.) of the objects A1 to A4 in the real space. Information may be included.
  • the positions (coordinates) of the objects B1 to B4 are respectively associated with the intersection positions of the user's eye line L watching the objects A1 to A4 and the cylindrical coordinates C0. That is, the position information of the objects B1 to B4 includes, for example, coordinate information on the cylindrical coordinates C0. In the illustrated example, the center position of each of the objects B1 to B4 is made coincident with the intersection position.
  • however, the present technology is not limited to this; a part of the periphery of the object (for example, a part of the four corners) may be made to coincide with the intersection position.
  • the coordinate positions of the objects B1 to B4 may be associated with any position away from the intersection position.
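  • As an illustration, a minimal sketch of how the θ coordinate of an object B could be derived from the user's and target object's positions, assuming θ is a compass bearing (north = 0°, east positive) and using a standard great-circle bearing formula (this helper is a common geodesy formula, not taken from the patent):

```python
import math

def bearing_deg(user_lat, user_lon, obj_lat, obj_lon):
    """Compass bearing from the user toward object A (north = 0 deg, east positive)."""
    phi1, phi2 = math.radians(user_lat), math.radians(obj_lat)
    dlon = math.radians(obj_lon - user_lon)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

# Object B would then be drawn where the gaze line toward A crosses the
# virtual cylinder of radius R: theta_b = bearing_deg(...), with h_b taken
# from the vertical angle toward A (e.g. h_b = R * tan(depression_angle)).
```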
  • the AR information management unit 212 acquires AR display object candidate information corresponding to the current position of the user acquired in advance from the portable information terminal 30 as information about the AR display object candidate.
  • the information acquisition timing by the AR information management unit 212 can be, for example, a timing at which it is determined that the current position of the user has changed by a predetermined distance or more, or a timing of a notification from the AR server 40 regarding a state change of an AR display object candidate.
  • the AR information management unit 212 can acquire AR display object information related to an object existing at a predetermined distance from the user's current position as AR display object information corresponding to the current position.
  • the AR information management unit 212 may also delete unnecessary AR display object candidate information based on the current position of the user.
  • the AR information management unit 212 supplies the acquired AR display object candidate information to the display control unit 213.
  • the display control unit 213 is configured to execute a process of displaying (drawing), in the visual field V, an object on the cylindrical coordinates C0 corresponding to the orientation of the display unit 11, based on the output of the detection unit 14 (that is, the processing result of the visual field setting unit 211).
  • the display control unit 213 is executed by the CPU 201 of the control unit 20.
  • FIG. 9 is a developed view of the cylindrical coordinates C0 conceptually showing the relationship between the visual field V on the cylindrical coordinates C0 and the objects B1 to B4. For example, as shown in the figure, when the current orientation of the visual field V overlaps the display areas of the objects B1 and B2 on the cylindrical coordinates C0, images corresponding to the overlapping areas B10 and B20 are displayed in the visual field V.
  • the display control unit 213 typically changes the display positions of the objects B1 and B2 within the visual field V following the change in the orientation or posture of the display unit 11. This control is continued as long as at least a part of each object B1, B2 exists in the visual field V.
  • the display control unit 213 may set an xy coordinate (local coordinate) having one point belonging to the field of view V as an origin, and execute a calculation for converting the cylindrical coordinate C0 to the local coordinate.
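  • As an illustration, a minimal sketch of such a cylindrical-to-local conversion, assuming a local xy origin at the top-left corner of the visual field V and linear scaling to pixels (the origin choice and names are illustrative assumptions):

```python
def cylinder_to_local(theta_b, h_b, theta_v1, theta_v2, h_v1, h_v2,
                      width_px, height_px):
    """Map a position (theta_b, h_b) on C0 into xy pixel coordinates local to
    the visual field V (origin assumed at V's top-left corner)."""
    x = (theta_b - theta_v1) / (theta_v2 - theta_v1) * width_px
    # Depression angle is positive in this coordinate system, so larger h
    # maps further down the screen.
    y = (h_b - h_v1) / (h_v2 - h_v1) * height_px
    return x, y
```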
  • since the display area (visual field V) of the HMD 10 is narrower than the visual field of the human eye, the objects displayed in the visual field V are limited. For example, since the objects B3 and B4 shown in FIG. 9 are outside the visual field, the user cannot see these objects at all. Thus, useful information may not be sufficiently provided by the presentation in the visual field V alone. Therefore, in the present embodiment, the position information of an AR display object candidate outside the visual field can be grasped by the user through tactile feedback.
  • the tactile feedback processing unit 22 includes an AR information management unit 221, a tactile processing determination unit 222, and a feedback generation unit 223.
  • the tactile feedback pattern is generated based on information on the relative position of the AR display object candidate viewed from the user.
  • the AR information management unit 221 acquires information on the AR display object candidate corresponding to the current position of the user from the portable information terminal 30, and manages the acquired information on the AR display object candidate.
  • the AR information management unit 221 is executed by the CPU 201 and the memory 202 of the control unit 20.
  • the AR information management unit 221 may be executed by the AR information management unit 212 of the AR display processing unit 21.
  • the AR display object candidate information includes, for example, position information of the target object, image information of the object, and position information of the object.
  • the position information of the objects A1 to A4 may include information about the distance from the user U to the objects A1 to A4, or may include information about the positions of the objects A1 to A4 in the real space.
  • the position information of the objects B1 to B4 may include coordinate information on the cylindrical coordinates C0.
  • the AR information management unit 221 acquires information on an AR display object candidate corresponding to the current position of the user acquired in advance from the portable information terminal 30 as information on the AR display object candidate.
  • the information acquisition timing by the AR information management unit 221 can be, for example, a timing when it is determined that the current position of the user has changed by a predetermined distance or more.
  • the AR information management unit 221 may also delete unnecessary AR display object candidate information based on the current position of the user.
  • the AR information management unit 221 supplies the acquired AR display object candidate information to the tactile sense processing determination unit 222.
  • the haptic process determination unit 222 determines whether there is an AR display object candidate to be subjected to the haptic feedback process among the AR display object candidates acquired by the AR information management unit 221.
  • the tactile processing determination unit 222 is executed by the CPU 201 of the control unit 20.
  • the criterion for a tactile feedback processing target can be, for example, whether there is a newly acquired AR display object candidate or whether there is an AR display object candidate whose display mode has changed.
  • examples of a “newly acquired” AR display object candidate include an AR display object candidate newly registered in the AR database 404, and an AR display object candidate newly acquired when the distance between the user and the target object becomes equal to or less than a predetermined distance due to the user's movement.
  • Examples of the AR display object candidate “changed in the display mode” include an AR display object candidate in which the display mode of the image of the object registered in the database 404 of the AR server 40 has changed.
  • the tactile processing determination unit 222 supplies the determination result to the feedback generation unit 223.
  • the feedback generation unit 223 generates a tactile feedback pattern for tactilely presenting information regarding the relative position, viewed from the user, of an AR display object candidate determined to be a target of the tactile feedback process.
  • the feedback generation unit 223 is executed by the CPU 201 of the control unit 20.
  • the information regarding the relative position of the AR display object candidate viewed from the user is, for example, at least one of the direction of the AR display object candidate based on the center direction of the user's visual field and the distance between the AR display object candidate and the user. Contains information related to one.
  • the information related to the direction of the AR display object candidate with reference to the center direction of the user's visual field may include, for example, information on the rotation angle of the object with reference to the center direction of the visual field (the front direction of the user), and information such as the height of the object with reference to the height of the visual field.
  • the tactile feedback pattern is a vibration pattern of the first vibrator 131R and the second vibrator 131L, and includes patterns of the vibration intensity and vibration timing of each of the first vibrator 131R and the second vibrator 131L.
  • as the vibration pattern, for example, a pattern is applied in which the first vibrator 131R vibrates with a stronger intensity than the second vibrator 131L when the direction of the AR display object candidate with reference to the center direction of the user's visual field corresponds to the right side of the user, and with a weaker intensity than the second vibrator 131L when the direction corresponds to the left side of the user.
  • the AR display object candidate that is the target of the tactile feedback process may be associated with a position outside the field of view.
  • FIG. 10A is a flowchart showing a flow of processing for acquiring information on AR display object candidates in the control unit 20.
  • the control unit 20 executes an AR display candidate information acquisition process when the current position of the user has changed by, for example, 50 m or more.
  • the processing from ST101 to ST104 is executed by the portable information terminal 30.
  • the processes of ST201 to ST203 are executed by the AR server 40, and the process of ST301 is executed by the control unit 20.
  • first, the CPU 301 of the portable information terminal 30 determines whether or not an acquisition condition for AR display object candidate information is satisfied (ST101). Specifically, the CPU 301 may determine whether or not the user's current position has changed by 50 m or more based on the current position information acquired from the GPS communication unit 305. In addition to, or in place of, the above, the CPU 301 may determine whether or not a notification regarding a state change of an AR display object candidate has been received from the AR server 40. When it is determined that the acquisition condition is satisfied (Yes in ST101), the network communication unit 303 transmits the current position acquired from the GPS communication unit 305 to the AR server 40 based on the control of the CPU 301 (ST102).
  • the network communication unit 403 of the AR server 40 receives the current position transmitted from the portable information terminal 30 based on the control of the CPU 401 (ST201). Subsequently, the CPU 401 acquires information on AR display object candidates corresponding to the acquired current position from the AR database 404 (ST202). Then, based on the control of the CPU 401, the network communication unit 403 transmits the acquired information on the AR display object candidates to the portable information terminal 30 (ST203).
  • the network communication unit 303 of the portable information terminal 30 acquires information on the AR display object candidate transmitted from the AR server 40 based on the control of the CPU 301 (ST103). Subsequently, the short-range communication unit 304 transmits information on the acquired AR display object candidate to the control unit 20 based on the control of the CPU 301 (ST104).
  • the communication unit 203 of the control unit 20 receives the information of the AR display object candidate transmitted from the portable information terminal 30 based on the control of the CPU 201, and the CPU 201 stores the information in the memory 202 (ST301). Thereby, the AR information management units 212 and 221 of the control unit 20 can acquire information on AR display object candidates.
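  • As an illustration, a minimal sketch of the acquisition condition checked in ST101, using the 50 m threshold from the text and a standard haversine distance between GPS fixes (helper names are illustrative):

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def moved_distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance between two GPS fixes, in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def should_fetch_candidates(prev_fix, current_fix, threshold_m=50.0):
    """ST101: request AR display object candidate information from the AR
    server once the user has moved by the threshold distance or more."""
    return moved_distance_m(*prev_fix, *current_fix) >= threshold_m
```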
  • the control unit 20 cooperates with the HMD 10 to execute a drawing process at a predetermined drawing timing. The outline of the drawing process will be described below.
  • FIG. 10B is a flowchart showing the flow of the drawing process.
  • the processing of ST401 to ST404 is executed by the HMD 10. Further, the processing of ST501 to ST504 is executed by the control unit 20.
  • first, the HMD 10 determines whether or not the drawing timing at a rate of 1/30 sec has arrived (ST401). When it is determined that the drawing timing has arrived (Yes in ST401), the HMD 10 outputs the detection result of the detection unit 14 to the control unit 20 (ST402). The visual field setting unit 211 of the control unit 20 acquires the detection result output from the detection unit 14 (ST501).
  • the AR information management units 212 and 221 of the control unit 20 that has acquired the detection result acquire information on AR display object candidates from the memory 202 (ST502). Then, the control unit 20 executes the AR display process (ST503) and the tactile feedback process (ST504) using the CPU 201 and the memory 202.
  • the processing results of the AR display process (ST503) and the tactile feedback process (ST504) are transmitted to the HMD 10, respectively.
  • the display unit 11 of the HMD 10 presents a visual field including a predetermined AR display object based on the processing result of the AR display process (ST403), and the tactile sense presentation unit 13 presents the relative position information of the object by tactile sensation based on the processing result of the tactile feedback process (ST404).
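  • As an illustration, a minimal sketch of this drawing cycle (ST401 to ST404 and ST501 to ST504), assuming a 1/30 sec tick and placeholder objects for the HMD 10 and control unit 20 (all method names are illustrative):

```python
import time

FRAME_PERIOD_S = 1.0 / 30.0  # drawing timing checked in ST401

def drawing_loop(hmd, control_unit):
    deadline = time.monotonic()
    while True:
        deadline += FRAME_PERIOD_S
        detection = hmd.read_detection_unit()                 # ST402 / ST501
        candidates = control_unit.load_candidates()           # ST502 (memory 202)
        view = control_unit.ar_display_process(detection, candidates)            # ST503
        pattern = control_unit.tactile_feedback_process(detection, candidates)   # ST504
        hmd.present_view(view)                                # ST403
        hmd.present_tactile(pattern)                          # ST404
        time.sleep(max(0.0, deadline - time.monotonic()))
```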
  • FIG. 11 is a flowchart for explaining an operation example of the control unit.
  • FIG. 11A is a flowchart of the AR display process (ST503), and FIG. 11B is a flowchart of the tactile feedback process (ST504).
  • in the AR display process, first, the visual field setting unit 211 sets the range of the visual field V represented by formulas (3) and (4) based on the attitude of the display unit 11 calculated from the detection result of the detection unit 14 (see FIGS. 7 and 8) (ST601). Subsequently, the display control unit 213 determines whether there is an object to be displayed in the visual field V based on the information on the AR display object candidates managed by the AR information management unit 212 (ST602). If it is determined that such an object exists (Yes in ST602), the display control unit 213 executes a process of displaying the object overlapping the visual field V (ST603). The control unit 20 thus ends the AR display process.
  • in the tactile feedback process, first, the haptic process determination unit 222 determines whether there is an AR display object candidate to be subjected to the tactile feedback process among the AR display object candidates acquired by the AR information management unit 221 (ST701). Specifically, the haptic process determination unit 222 determines whether or not there is a newly acquired AR display object candidate among the AR display object candidates acquired by the AR information management unit 221.
  • the newly acquired AR display object candidate is, for example, an AR display object candidate acquired by the AR information management unit 221 in the current process, which was not acquired by the AR information management unit 221 in the previous process.
  • the feedback generation unit 223 generates a tactile feedback pattern for each AR display object candidate determined to be a target as follows (ST702).
  • FIG. 12 is a schematic diagram illustrating a generation example of a haptic feedback pattern.
  • first, the feedback generation unit 223 calculates the rotation angle (coordinate in the circumferential direction) θ′b [°] of the object B included in the AR display object candidate determined as the target, with reference to the front direction of the user U (ST702-1).
  • the coordinate θv0 in the front direction of the user U is expressed by the following formula (5), in view of the circumferential range of the visual field V shown by formula (3):
θv0 [°] = (θv1 + θv2) / 2 … (5)
  • the area on the right side of the user U is represented by coordinates from θv0 [°] up to (θv0 + 180) [°], and the area on the left side of the user U by coordinates from (θv0 + 180) [°] up to (θv0 + 360) [°].
  • the relative rotation angle θ′b of the object B is expressed by the following formula (6), using the coordinate θb of the object on the cylindrical coordinates C0 and the coordinate θv0 in the front direction of the user U:
θ′b = θb − θv0 … (6)
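  • As an illustration, a minimal sketch of formulas (5) and (6), normalizing θ′b to [0°, 360°) so that, per the convention above, the right side of the user corresponds to [0°, 180°] and the left side to [180°, 360°) (names are illustrative):

```python
def relative_rotation_deg(theta_b, theta_v1, theta_v2):
    """Rotation angle of object B with the user's front direction as 0 deg."""
    theta_v0 = (theta_v1 + theta_v2) / 2.0   # formula (5): front direction
    return (theta_b - theta_v0) % 360.0      # formula (6), wrapped to [0, 360)
```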
  • next, the feedback generation unit 223 refers to, for example, a lookup table stored in advance in the memory 202, and determines the region to which θ′b belongs (ST702-2).
  • first to fifth regions R1 to R5 are defined so as to surround the user U.
  • FIG. 12 shows an example in which θ′b belongs to the first region R1.
  • the range of θ′b belonging to the first region R1 is expressed by the following formula (10).
  • the ranges of θ′b belonging to the visual field V and to the second to fifth regions R2 to R5 are shown in Table 1 below.
  • the feedback generation unit 223 refers to the lookup table stored in advance in the memory 202, and determines the vibration pattern of each vibrator 131R, 131L according to the region to which θ′b belongs (ST702-3).
  • FIG. 12 and Table 1 show the relationship, stored as a lookup table, between each of the first to fifth regions R1 to R5 and the intensity of each vibrator 131R, 131L. "R" indicates the first vibrator 131R disposed on the right side, and "L" indicates the second vibrator 131L disposed on the left side. The numerical values corresponding to "R" and "L" in FIG. 12 indicate the vibration intensities of the vibrators 131R and 131L: 0 indicates no vibration, the vibration intensity increases as the numerical value increases, and 10 indicates the maximum intensity.
  • the control unit 20 ends the tactile feedback generation process.
  • the first region R1 corresponds to the right front of the user U and substantially coincides with the movable range of the right arm of the user U, which can be recognized as the user's right side. When θ′b belongs to the first region R1, the first vibrator 131R arranged on the right side vibrates at the maximum intensity, and the second vibrator 131L arranged on the left side does not vibrate. Thereby, the user perceives a large vibration on the right side of the head and can intuitively recognize that a new object exists on his or her right side.
  • the second region R2 corresponds to the right rear of the user U. In this case, the first vibrator 131R vibrates with about 80% of the maximum intensity, and the second vibrator 131L vibrates with about 20% of the maximum intensity. Thereby, the user perceives a large vibration on the right side of the head and a slight vibration on the left side, and can recognize that a new object exists in a direction shifted slightly rearward from the right side (the first region R1).
  • the third region R3 corresponds to the area behind the user U, that is, the user's back side. In this case, both vibrators 131R and 131L vibrate with substantially the same intensity. Thereby, the user U can recognize that a new object exists in an area that is biased neither to the right nor to the left, that is, behind the user.
  • in the fourth and fifth regions R4 and R5 as well, the user can recognize the presence and direction of a new object by vibration.
  • on the other hand, none of the vibrators 131R and 131L vibrates for an AR display object candidate included in the visual field V. That is, an AR display object candidate to be subjected to the tactile feedback process is associated with a position outside the visual field V.
  • in this way, the direction of an AR display object candidate not included in the visual field V can be recognized by the user through tactile feedback.
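  • As an illustration, a minimal sketch of the lookup-table steps ST702-2 and ST702-3, with the intensities for V and R1 to R3 taken from the description above and the left-side regions R4 and R5 filled in by symmetry; since formula (10) and Table 1 are not reproduced here, the region boundaries and the R3 value are assumptions:

```python
# (right vibrator 131R, left vibrator 131L) intensities on the 0..10 scale.
INTENSITY = {
    "V":  (0, 0),    # candidate inside the visual field: no vibration
    "R1": (10, 0),   # right front: maximum on the right, none on the left
    "R2": (8, 2),    # right rear: about 80% / 20% of maximum, per the text
    "R3": (5, 5),    # behind: substantially equal intensity (value assumed)
    "R4": (2, 8),    # left rear, assumed symmetric to R2
    "R5": (0, 10),   # left front, assumed symmetric to R1
}

def region_of(theta_rel, fov_half_deg=30.0):
    """Assumed region boundaries for a relative angle theta_rel in [0, 360)."""
    if theta_rel < fov_half_deg or theta_rel >= 360.0 - fov_half_deg:
        return "V"   # ST702-2: the candidate overlaps the visual field
    bounds = [(100.0, "R1"), (160.0, "R2"), (200.0, "R3"),
              (260.0, "R4"), (360.0, "R5")]
    return next(name for upper, name in bounds if theta_rel < upper)

def vibration_pattern(theta_rel):
    return INTENSITY[region_of(theta_rel)]   # ST702-3: look up the intensities
```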
  • the control unit 20 can generate a haptic feedback pattern capable of presenting the direction of the AR display object candidate based on the center direction of the user's visual field. Thereby, even when the user cannot visually recognize the object, the user can accurately grasp the relative position of the AR display object candidate based on the tactile feedback pattern.
  • the tactile sense providing unit 13 can present the information about the newly acquired AR display object candidate by tactile sense. Thereby, new information can be presented to the user together with the position information, and the object can be used effectively.
  • further, by linking the arrangement of the plurality of vibrators 131R and 131L provided on the left and right with the relative position of the AR display object candidate and adjusting the vibration intensity accordingly, the direction of the AR display object candidate can be grasped intuitively.
  • the tactile sense presentation unit 13 is provided not on the display unit 11 but on the mounting unit 12, and in particular, by being arranged in the modern parts 123R and 123L spaced apart from the display unit 11, vibration is less easily transmitted to the display unit 11. Thereby, displacement of the display surfaces 111R and 111L due to tactile feedback can be suppressed, and a clear, easily visible image can be presented even during vibration. Moreover, this effect can be further enhanced by increasing the rigidity of the modern parts 123R and 123L.
  • next, application examples of the AR system 100 (control unit 20) of the present embodiment will be described.
  • the AR system 100 can be applied to an application program (hereinafter also referred to as an application) that presents a position of a predetermined person in real time.
  • in this case, the target object is a person such as a family member or friend registered in advance by the user, and the object can be information (such as a name) specifying the person.
  • when the AR server 40 determines that a person (target object) exists within a predetermined distance (for example, 1 km) from the user, the AR server 40 acquires information on the person and the corresponding object as information on an AR display object candidate. Then, the tactile processing determination unit 222 of the control unit 20 determines whether or not the AR display object candidate acquired via the portable information terminal 30 is newly acquired.
  • when it is determined that the candidate is newly acquired, the feedback generation unit 223 generates a tactile feedback pattern based on the position information of the person and the object. Thereby, even when the object is not displayed, the user can grasp the approach of the person and the position information of the object, and can move the field of view toward the object. Further, when the distance between the user and the person changes and they come still closer, and the AR server 40 determines that the person exists within another predetermined distance (for example, 200 m) from the user, the information on the person and the object is once again acquired as new AR display object candidate information.
  • then, the tactile processing determination unit 222 of the control unit 20 determines whether or not the AR display object candidate acquired via the portable information terminal 30 is newly acquired.
  • the feedback generation unit 223 When it is determined that the information is newly acquired, the feedback generation unit 223 generates a tactile feedback pattern based on the position information of the person or the object. Thereby, the tactile sense providing unit 13 can present information related to the relative position of the AR display object candidate viewed from the user with a tactile sensation for the AR display object candidate in which a predetermined change has occurred in the distance from the user. Therefore, even when the object is not displayed, the user can grasp the further approach of the person and the position information of the object, and can also meet the person by moving the field of view toward the object.
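The acquisition-and-feedback flow just described can be sketched minimally as follows. This is an illustration, not the patented implementation; the class and function names (ARCandidate, TactileProcessingDeterminer, generate_pattern) and the intensity formula are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ARCandidate:
    candidate_id: str     # e.g. a person registered in advance by the user
    acquisition_key: str  # changes when the server re-acquires the candidate
                          # at a closer distance threshold (1 km, 200 m, ...)
    bearing_deg: float    # direction relative to the center of the user's view
    distance_m: float     # distance between the candidate and the user

class TactileProcessingDeterminer:
    """Sketch of unit 222: decides whether a candidate is newly acquired."""
    def __init__(self):
        self._seen: set[tuple[str, str]] = set()

    def is_newly_acquired(self, c: ARCandidate) -> bool:
        key = (c.candidate_id, c.acquisition_key)
        if key in self._seen:
            return False
        self._seen.add(key)
        return True

def generate_pattern(c: ARCandidate) -> dict:
    """Sketch of unit 223: derive left/right vibration intensities (0..1)
    from the candidate's bearing (positive = to the user's right)."""
    right = max(0.0, min(1.0, 0.5 + c.bearing_deg / 180.0))
    return {"right_intensity": right, "left_intensity": 1.0 - right}

def on_candidate_received(det: TactileProcessingDeterminer, c: ARCandidate):
    # Only a newly acquired candidate (including one re-acquired at a
    # closer threshold) triggers tactile feedback.
    if det.is_newly_acquired(c):
        return generate_pattern(c)  # would be output to the tactile unit 13
    return None
```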
  • The AR system 100 can be applied to an application that notifies the user of a dangerous location, such as a fire site, an accident site, or an accident-prone spot.
  • In this case, the target object is the dangerous location, and the object can be information associated with the dangerous location (its position, or information calling attention to it).
  • When the AR server 40 determines that the dangerous location (target object) exists within a predetermined distance (for example, 10 km) from the user, it acquires the information on the dangerous location and the object as AR display object candidate information. The tactile processing determination unit 222 of the control unit 20 then determines whether the AR display object candidate acquired via the portable information terminal 30 is newly acquired.
  • When it is determined that the information is newly acquired, the feedback generation unit 223 generates a tactile feedback pattern based on the position information of the dangerous location and the object. Even when the object is not displayed, the user can thereby grasp the approach of the dangerous location and the position information of the object, and can take avoidance actions such as moving the field of view toward the object or escaping in the direction opposite to the dangerous location.
  • The distance at which the dangerous location is notified may be changed depending on the type of danger and on the user's mode of movement (walking or riding in a car). The vibration intensity of the tactile feedback pattern may also be changed according to the degree of risk, and a tactile feedback pattern may be generated even while the object is displayed. This makes the danger notification more reliable (see the sketch below).
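A minimal sketch of the variation just described follows; the danger categories, radii, and the risk-to-intensity mapping are all assumptions introduced for illustration.

```python
# Hypothetical notification radii (in meters), varied by danger type
# and by the user's mode of movement.
NOTIFY_RADIUS_M = {
    ("fire", "walking"): 2_000,
    ("fire", "driving"): 10_000,
    ("accident_prone", "walking"): 500,
    ("accident_prone", "driving"): 3_000,
}

def should_notify(danger_type: str, activity: str, distance_m: float) -> bool:
    """Notify when the user is within the radius for this danger/activity."""
    return distance_m <= NOTIFY_RADIUS_M.get((danger_type, activity), 1_000)

def vibration_intensity(risk_level: int) -> float:
    """Map a risk level (1..5, hypothetical scale) to a 0..1 intensity."""
    return min(1.0, 0.2 * risk_level)

print(should_notify("fire", "driving", 8_000))  # True
print(vibration_intensity(4))                   # 0.8
```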
  • The AR system 100 can be applied to a navigation application that provides route guidance or to a town guide application that provides local information.
  • In this case, the target object is, for example, a tourist spot, a station, a bus stop, a store, a gas station, a parking lot, an evacuation site, a congested section, or a construction site, and the object is information attached to the corresponding site (its name, position information, etc.).
  • When the AR server 40 determines that the corresponding site (target object) exists within a predetermined distance (for example, 10 km) from the user, it acquires the information on the site and the object as AR display object candidate information.
  • The tactile processing determination unit 222 of the control unit 20 determines whether the AR display object candidate acquired via the portable information terminal 30 is newly acquired; when it is, the feedback generation unit 223 generates a tactile feedback pattern based on the position information of the site and the object. Alternatively, the tactile processing determination unit 222 may determine whether the display mode of an already acquired AR display object candidate has changed, and the feedback generation unit 223 may generate the tactile feedback pattern when it has. The tactile presentation unit 13 can thereby present, by tactile sensation, information about an AR display object candidate whose display mode has changed.
  • The display mode can change when, for example, the congestion status of the corresponding site, which is part of the display content of the object, changes. Even when the object is not displayed, the user can thereby grasp the approach of the site, changes in the content of the object, and the position information of the object, and can move the field of view toward the object (see the sketch below).
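A minimal sketch of the display-mode-change determination, assuming a hypothetical display-content string that summarizes what the object currently shows (such as a congestion status):

```python
class DisplayModeWatcher:
    """Remembers the last display content of each acquired candidate and
    reports candidates whose display mode has changed."""
    def __init__(self):
        self._last_content: dict[str, str] = {}

    def has_display_mode_changed(self, candidate_id: str, content: str) -> bool:
        previous = self._last_content.get(candidate_id)
        self._last_content[candidate_id] = content
        return previous is not None and previous != content

# Usage: a congestion update triggers tactile feedback for an already
# acquired candidate even though nothing new was acquired.
watcher = DisplayModeWatcher()
print(watcher.has_display_mode_changed("station_A", "congestion: low"))   # False
print(watcher.has_display_mode_changed("station_A", "congestion: high"))  # True
```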
  • the AR system 100 can be applied to a game application associated with the real world.
  • In this game application, the AR display object need not include a real-space target object and may consist only of objects such as game characters and items.
  • When the AR server 40 determines that the set position of an object exists within a predetermined distance (for example, several hundred meters to several kilometers) from the user, it acquires the information on the object as AR display object candidate information.
  • The tactile processing determination unit 222 of the control unit 20 determines whether the AR display object candidate information acquired via the portable information terminal 30 is newly acquired.
  • When it is determined that the information is newly acquired, the feedback generation unit 223 generates a tactile feedback pattern based on the position information of the object.
  • The tactile processing determination unit 222 may also determine whether the display mode of an already acquired object has changed, and the feedback generation unit 223 may generate a tactile feedback pattern when it is determined that the display mode has changed.
  • the display mode can change, for example, when the content of an item that is an object changes or when the battle situation displayed as an object changes.
  • A tactile feedback pattern can also be generated when an enemy character that is an object performs an attacking action, or when another user connected via the Internet 50 is newly registered. Even when the object is not displayed, the user can thereby grasp the approach of the object, changes in its content, and its position information, and can move the field of view toward the object.
  • The generation of the tactile feedback pattern is not limited to the example shown in FIG. 12 and Table 1; the regions to which the rotation angle θ′b of the object with respect to the front direction of the user U belongs may be set differently. For example, as shown in FIG. 13, the regions to which θ′b can belong may consist only of the first and second regions R1 and R2 in addition to the visual field V. In this case, the first region R1 is expressed by equation (11) and the second region R2 by equation (12).
  • The intensities of the vibrators 131R and 131L are likewise not limited to the examples shown in FIG. 12 and Table 1, and can be adjusted as appropriate (see the sketch below).
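As one illustration of a region-based direction-to-intensity mapping, here is a minimal sketch; the region boundaries and the intensity values are assumptions (the actual boundaries are given by equations (11) and (12) and by Table 1):

```python
def vibrator_intensities(theta_b_deg: float, half_fov_deg: float = 20.0):
    """Map the object's rotation angle θ'b (degrees, positive to the right
    of the user's front direction) to (right, left) vibrator intensities.
    The boundaries and the 0-10 intensity scale are assumed values."""
    t = theta_b_deg
    if abs(t) <= half_fov_deg:   # inside the visual field V
        return (0, 0)            # no directional feedback in this sketch
    if t > 0:                    # first region R1: object to the user's right
        return (8, 2)            # right vibrator stronger than left
    return (2, 8)                # second region R2: object to the user's left

print(vibrator_intensities(90.0))    # (8, 2)
print(vibrator_intensities(-120.0))  # (2, 8)
```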
  • The information related to the direction of the AR display object candidate with respect to the center direction of the user's visual field may also include information on the height of the object with respect to the height of the center of the visual field.
  • In that case, the regions to which the relative position of the object belongs may be set according to the value h in the height direction.
  • For example, the first and second vibrators 131R and 131L can each be vibrated with a specific vibration pattern at an intensity of 5.
  • tactile feedback may be generated for objects included in the field of view V.
  • the presence and position information of the object can be provided to the user by tactile feedback.
  • the feedback generation unit 223 can generate a haptic feedback pattern based on information related to the distance between the AR display object candidate and the user.
  • FIG. 14 is a schematic diagram illustrating a generation example of a haptic feedback pattern according to the present modification.
  • The feedback generation unit 223 determines the region to which the target object of the AR display object candidate belongs, based on the distance between the target object and the user U. As shown in the figure, in this modification a first region R11 farthest from the user U, a second region R12 closer than the first region R11, and a third region R13 closer than the second region R12 are set. The range of distances from the user U covered by each region can be set as appropriate. The feedback generation unit 223 then refers to a lookup table stored in advance in the memory 202 and determines the vibration pattern of the vibrator 131 according to the determined region.
  • the graph in FIG. 14 is a graph showing an example of a vibration pattern when the position of the object corresponds to each of the first to third regions R11 to R13, with the horizontal axis representing time and the vertical axis representing vibration intensity.
  • the vibrator 131 oscillates intermittently, for example, at a predetermined pitch.
  • In the first region R11, the length of one cycle (referred to as the vibration pitch) is t1 and the continuous vibration time in each cycle is w1; in the second region R12, the vibration pitch is t2 and the continuous vibration time is w2; and in the third region R13, the vibration pitch is t3 and the continuous vibration time is w3.
  • The relationships among t1 to t3 and among w1 to w3 are expressed by equations (13) and (14), respectively (see the sketch below).
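A minimal sketch of this distance-dependent lookup follows. Since equations (13) and (14) are not reproduced here, the sketch assumes that the pitch shortens as the user approaches (t1 > t2 > t3, with equal on-times); the concrete distances and timings are invented.

```python
# Hypothetical (vibration pitch t, continuous vibration time w) pairs in
# seconds per region, assuming faster pulsing when the candidate is closer.
REGION_PATTERNS = [
    (300.0, (1.2, 0.2)),  # farther than 300 m -> first region R11: (t1, w1)
    (100.0, (0.8, 0.2)),  # 100 m to 300 m     -> second region R12: (t2, w2)
    (0.0,   (0.4, 0.2)),  # closer than 100 m  -> third region R13: (t3, w3)
]

def vibration_pattern(distance_m: float) -> tuple[float, float]:
    """Look up the (pitch, on-time) for the region that the candidate's
    distance from the user falls into."""
    for lower_bound, pattern in REGION_PATTERNS:
        if distance_m >= lower_bound:
            return pattern
    return REGION_PATTERNS[-1][1]

print(vibration_pattern(500.0))  # (1.2, 0.2): slow pulsing, far away
print(vibration_pattern(50.0))   # (0.4, 0.2): fast pulsing, close by
```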
  • the vibration pattern of this modification is not limited to the above example as long as the vibration pattern is set according to the distance between the user U and the object.
  • the vibrator 131 may vibrate in a pulse shape instead of continuously.
  • the number of pulses that are continuously generated may increase as the distance between the user U and the object decreases.
  • the strength of the vibrator 131 may be changed instead of changing the time during which the vibrator 131 continuously vibrates.
  • the vibrator 131 may vibrate at different pitches instead of vibrating at equal intervals.
  • The first and second vibrators 131R and 131L may vibrate with the same vibration intensity, or with intensities set according to θ′b [°] as described with reference to FIG. 12 and Table 1.
  • In the latter case, the feedback generation unit 223 can generate the tactile feedback pattern based on information related to both the direction of the AR display object candidate with respect to the center direction of the user's visual field and the distance between the AR display object candidate and the user.
  • FIG. 15 is a schematic side view of the HMD 10 of the present modification.
  • the tactile sense presentation unit 13 includes a plurality of vibrators 131a, 131b, and 131c arranged on the second mounting member 121L.
  • the plurality of vibrators 131a, 131b, and 131c are all disposed in the modern portion 123L of the second mounting member 121L.
  • the plurality of vibrators 131a, 131b, and 131c constitute a vibrator group 132 that is arranged side by side in a predetermined direction.
  • the predetermined direction is, for example, the extending direction of the modern portion 123L, and can be, for example, the Z-axis direction.
  • The vibrators 131a, 131b, and 131c included in the vibrator group 132 can vibrate sequentially along the predetermined direction when the direction of the AR display object candidate with respect to the center direction of the user's visual field corresponds to that direction.
  • the vibrators 131a, 131b, and 131c have substantially the same vibration pitch and continuous vibration time, but have different vibration timings.
  • For example, the vibrators vibrate in the order 131a, 131b, 131c, from the front toward the rear in the Z-axis direction. The user thereby perceives the vibration traveling from front to rear in the Z-axis direction and can intuitively recognize that the object is behind (in the Z-axis direction).
  • The tactile presentation unit 13 may include a vibrator group on the first mounting member 121R on the right side, in addition to the vibrator group 132 arranged on the second mounting member 121L on the left side.
  • the first and second vibrators 131R and 131L described in the above embodiment may constitute a vibrator group.
  • In this case, the vibrator group is arranged in the left-right direction (X-axis direction); for example, when the object is to the user's right, the vibrator 131L and the vibrator 131R may vibrate in that order (see the sketch below).
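A minimal sketch of such sequential driving follows, with assumed timing values; the point is only the staggered start times along the axis that points toward the candidate.

```python
def sequential_schedule(vibrator_ids, pitch_s=0.6, on_time_s=0.15,
                        stagger_s=0.1, cycles=3):
    """Return (start_time, vibrator_id, duration) triples for a group of
    vibrators fired one after another along their physical arrangement,
    so the user perceives the vibration travelling toward the object.
    vibrator_ids is ordered from the side away from the object toward it."""
    events = []
    for c in range(cycles):
        for i, vid in enumerate(vibrator_ids):
            events.append((c * pitch_s + i * stagger_s, vid, on_time_s))
    return events

# Object behind the user: fire 131a -> 131b -> 131c (front to rear in Z).
for start, vid, dur in sequential_schedule(["131a", "131b", "131c"]):
    print(f"t={start:.2f}s: vibrate {vid} for {dur}s")
```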
  • FIG. 16 is a top view schematically showing the HMD 10A of the present modification.
  • the mounting portion 12A does not include the first and second mounting members 121R and 121L but includes a band-shaped mounting member 121A.
  • The ends of the mounting member 121A are connected to the right end and the left end of the display unit 11, respectively.
  • the mounting member 121A is mounted on the user's head from one temporal region to the occipital region and the other temporal region.
  • the mounting portion 12A may have an earpiece (not shown).
  • the tactile sense providing unit 13A includes a plurality of vibrators 131Aa, 131Ab, 131Ac, 131Ad, 131Ae, and 131Af arranged on the mounting member 121A.
  • the plurality of vibrators 131Aa to 131Af are arranged along the longitudinal direction of the mounting member 121A, for example, at a portion of the mounting member 121A that is mounted from the user's side to the back of the head. These vibrators 131Aa to 131Af can provide position information of AR display object candidates to the user by presenting various vibration patterns.
  • The vibrators 131Aa to 131Af constitute, for example, a first vibrator group 132Aa and a second vibrator group 132Ab, each arranged side by side in the X-axis and Z-axis directions.
  • The first vibrator group 132Aa includes the vibrators 131Aa, 131Ab, and 131Ac.
  • The second vibrator group 132Ab includes the vibrators 131Ad, 131Ae, and 131Af.
  • According to the direction of the AR display object candidate, the first vibrator group 132Aa and/or the second vibrator group 132Ab can vibrate in order from the front toward the rear in the Z-axis direction.
  • The first vibrator group 132Aa can vibrate in order from left to right in the X-axis direction.
  • The second vibrator group 132Ab can vibrate in order from right to left in the X-axis direction.
  • the tactile sense providing unit 13 is not limited to the configuration having the vibrators 131R and 131L configured by a vibration motor.
  • The tactile presentation unit 13 may instead include, for example, a linear actuator whose weight is reciprocally driven along a single axis, an actuator made of a "smart material" such as a piezoelectric material, an electroactive polymer, or a shape-memory alloy, a macro fiber composite actuator, an electrostatic actuator, or an electrotactile actuator.
  • The tactile presentation unit 13 may also include, for example, a device that uses electrostatic friction (ESF) or ultrasonic surface friction (USF), a device that generates acoustic radiation pressure with an ultrasonic haptic transducer, a device that uses a haptic substrate or a flexible or deformable surface, or a projection-type tactile presentation device, such as one that blows a puff of air using an air jet.
  • FIG. 17 is a schematic top view showing the HMD 10B of this modification.
  • the tactile sense providing unit 13B of the HMD 10B includes a first linear actuator 133R and a second linear actuator 133L.
  • The first and second linear actuators 133R and 133L are disposed in the first and second modern portions 123R and 123L, respectively, and may be arranged, for example, in the regions of the first and second modern portions 123R and 123L that contact the upper part of the pinna.
  • the first and second linear actuators 133R and 133L include weights 134R and 134L that reciprocate in predetermined axial directions, respectively.
  • the arrows in FIG. 17 indicate the driving directions of the weights 134R and 134L.
  • For example, when the AR display object candidate S is to the right of the user U, the weight 134R is driven forward in the Z-axis direction and the weight 134L is driven backward in the Z-axis direction.
  • FIG. 18 is a graph for explaining an example of driving the weight 134R shown in FIG. 17, in which the horizontal axis represents time, and the vertical axis represents the front position in the Z-axis direction.
  • the weight 134R moves forward in the Z-axis direction at a relatively fast speed, reaches a predetermined position, and then moves backward in the Z-axis direction at a relatively slow speed.
  • The linear actuator 133L has a similar drive profile, although the movement direction of its weight is opposite to that of the linear actuator 133R. With such a drive profile, the user gets the sensation of being tapped by someone from the front right. The HMD 10B can therefore prompt the user to move the field of view to the right, presenting a tactile feedback pattern linked to the action toward which it wants to guide the user (see the sketch below).
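A minimal sketch of such an asymmetric drive profile (a fast stroke one way, a slow return) follows; the timing and position values are invented, and the asymmetry is what makes the tap feel directional.

```python
def weight_position(t: float, period: float = 0.5, fast_fraction: float = 0.2) -> float:
    """Position of the weight along its axis (0 = rest, 1 = full stroke)
    for a profile like FIG. 18: a fast forward stroke followed by a slow
    return. period and fast_fraction are assumed values."""
    phase = (t % period) / period
    if phase < fast_fraction:        # fast forward stroke
        return phase / fast_fraction
    # slow return stroke over the remainder of the cycle
    return 1.0 - (phase - fast_fraction) / (1.0 - fast_fraction)

# Sample the first cycle: the sharp rise is the perceptible "tap".
print([round(weight_position(0.05 * i), 2) for i in range(10)])
# [0.0, 0.5, 1.0, 0.88, 0.75, 0.62, 0.5, 0.38, 0.25, 0.12]
```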
  • The tactile presentation unit 13B may have only one linear actuator. Even in such a configuration, the tactile presentation unit 13B can present to the user a tactile feedback pattern in which the driving direction of the weight is linked to the direction of the AR display object candidate with respect to the user.
  • The display unit 11 of the HMD 10 may be configured to present, in the visual field, an auxiliary display that supplements the information related to the relative position of the AR display object candidate as viewed from the user presented by the tactile feedback pattern.
  • the same components as those in the first embodiment are denoted by the same reference numerals, and the description thereof is omitted.
  • FIGS. 19 and 20 are diagrams illustrating examples of auxiliary display.
  • FIG. 19 shows an example in which the AR display object candidate exists on the right side of the user.
  • the auxiliary display D1 moves from left to right along the upper end of the visual field V when presenting tactile feedback.
  • FIG. 20 shows an example in which the AR display object candidate exists above the visual field.
  • the auxiliary display D2 moves from below to above along the right end of the visual field V when presenting tactile feedback.
  • Since the auxiliary displays D1 and D2 move along the edges of the visual field V, their influence on other displays in the visual field V can be reduced.
  • The shapes of the auxiliary displays D1 and D2 may be rectangular, circular, or other geometric shapes, or they may be other animations or illustrations (see the sketch below).
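A minimal sketch of the edge animation just described, choosing which edge the auxiliary display slides along from the candidate's direction; the coordinate convention (top-left origin, normalized 0..1) and the step count are assumptions.

```python
def auxiliary_display_path(direction: str, steps: int = 20):
    """Yield (x, y) positions, in normalized view coordinates with (0, 0)
    at the top left, for an auxiliary display sliding along the edge of
    the visual field toward the AR display object candidate."""
    for i in range(steps):
        t = i / (steps - 1)
        if direction == "right":  # like D1: left to right along the top edge
            yield (t, 0.0)
        elif direction == "up":   # like D2: bottom to top along the right edge
            yield (1.0, 1.0 - t)
        # ...other directions would be handled analogously...

# Candidate to the right of the user: marker sweeps along the top edge.
print(list(auxiliary_display_path("right", steps=5)))
# [(0.0, 0.0), (0.25, 0.0), (0.5, 0.0), (0.75, 0.0), (1.0, 0.0)]
```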
  • FIG. 21 is a block diagram showing a functional configuration of the control unit 20C according to the present embodiment.
  • The control unit 20C includes an AR display processing unit 21 and a tactile feedback processing unit 22, as in the first embodiment, and further includes an auxiliary display processing unit 23.
  • The auxiliary display processing unit 23 includes an auxiliary display control unit 231 that executes processing for supplementing the information related to the relative position of the AR display object candidate as viewed from the user presented by the tactile feedback pattern.
  • the auxiliary display control unit 231 is executed by the CPU 201 of the control unit 20.
  • the auxiliary display control unit 231 is supplied with the determination result from the tactile processing determination unit 222 and is also supplied with information regarding the relative position of the AR display object candidate viewed from the user from the AR information management unit 221.
  • the auxiliary display control unit 231 executes auxiliary display drawing processing and animation processing based on these pieces of information.
  • the user can more easily grasp the position information of the AR display object candidate.
  • The display mode of the auxiliary display is not limited to the examples shown in FIGS. 19 and 20.
  • For example, as shown in FIGS. 22 and 23, the auxiliary displays D3 and D4 may display the position of the AR display object candidate, the contents of the object, and the like. The position information and contents of the AR display object candidate can thereby be grasped in more detail.
  • The auxiliary display D5 is arranged in a partial region of the visual field V (in the illustrated example, a region along the lower edge) and may be a schematic development view of the cylindrical coordinates C0 surrounding the user.
  • the auxiliary display D5 has, for example, a region DV corresponding to the developed view of the visual field V at the center, and regions DR and DL on the right and left sides thereof.
  • the center of the auxiliary display D5 (position in the area DV) corresponds to the front direction of the user. Regions DR and DL correspond to the right and left development views outside the field of view.
  • The auxiliary display D5 has a sub-object SB corresponding to the object of the AR display object candidate.
  • The sub-object SB is arranged at the position in the auxiliary display D5 corresponding to the circumferential direction of the AR display object candidate with respect to the user. The user can thereby accurately grasp the position information of the AR display object candidate (see the sketch below).
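A minimal sketch of mapping a candidate's azimuth onto such a developed strip, assuming the strip spans the full 360° around the user with the user's front direction at its center:

```python
def sub_object_x(azimuth_deg: float, strip_width_px: int = 600) -> int:
    """Horizontal pixel position of the sub-object SB inside the developed
    strip D5. azimuth_deg is the candidate's direction relative to the
    user's front direction (0 = straight ahead, positive = to the right)."""
    wrapped = (azimuth_deg + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)
    return int((wrapped + 180.0) / 360.0 * strip_width_px)

print(sub_object_x(0))     # 300: center of the strip (region DV, front)
print(sub_object_x(90))    # 450: right of center (region DR)
print(sub_object_x(-170))  # 16: far left of the strip (region DL)
```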
  • In the above embodiments, the AR system has been described as including the AR server, the portable information terminal, the control unit, and the HMD, but the present technology is not limited to this.
  • For example, the AR system may omit the portable information terminal, with the AR server and the control unit configured to communicate directly.
  • the control unit may be configured integrally with the HMD, or may be configured by a portable information terminal.
  • The control unit may also be constituted by a plurality of apparatuses, such as the AR server and a portable information terminal.
  • the wearable display may be configured to include a mounting unit mounted on the user's head or the like, a display unit not supported by the mounting unit, and a tactile sense providing unit provided in the mounting unit.
  • the display unit may be, for example, a contact lens type display device arranged in the user's eye.
  • the display unit may be a portable information terminal such as a smartphone or a display device configured to be attached to spectacles worn by the user.
  • The mounting unit may have a configuration that hangs on the user's ear, or a headphone-like configuration. Either configuration allows the user to perceive the tactile sensation with relatively high accuracy without the mounting unit blocking the user's visual field.
  • necessary AR display object candidate information is acquired from the AR server each time the user's current position changes or the display unit posture changes, but the present invention is not limited to this.
  • For example, when the control unit (HMD) or the portable information terminal is activated, the AR display object candidate information necessary for the AR display processing and the tactile feedback processing may be collectively acquired from the AR server and stored in the memory.
  • the present technology can also be applied to, for example, a wearable display that can be attached to a wrist, an arm, a neck, or the like.
  • a see-through type (transmission type) HMD has been described.
  • the present technology can also be applied to a non-transmission type HMD.
  • a predetermined object according to the present technology may be displayed in an external field of view captured by a camera attached to the display unit.
  • The AR display object has been described as including a predetermined target object existing in the real space and an object containing information related to it, but the present technology is not limited to this, and the AR display object need not represent a real-space target object.
  • In that case, the object is associated with a predetermined position in the real space, and the control unit may be configured to calculate the distance between the user and the AR display object using this position information.
  • The present technology may also be configured as follows.
  • (1) A wearable display including: a display unit that presents an AR (Augmented Reality) display object in a user's field of view; a mounting unit that can be worn by the user; and a tactile presentation unit that is provided on the mounting unit and that, when a predetermined change occurs in an AR display object candidate that can be presented as the AR display object, presents, by tactile sensation using a tactile feedback pattern, information related to the relative position of the AR display object candidate as viewed from the user.
  • (2) The wearable display according to (1) above, in which the tactile feedback pattern is generated based on information related to the relative position of the AR display object candidate as viewed from the user.
  • (3) The wearable display according to (1) or (2) above, in which the information related to the relative position of the AR display object candidate as viewed from the user includes information related to at least one of the direction of the AR display object candidate with respect to the center direction of the user's visual field and the distance between the AR display object candidate and the user.
  • (4) The wearable display according to any one of (1) to (3) above, in which the tactile presentation unit includes a plurality of vibrators arranged at different positions of the mounting unit, and each of the plurality of vibrators vibrates according to a vibration pattern defined based on the information on the relative position and the arrangement of the plurality of vibrators.
  • (5) The wearable display according to (4) above, in which the mounting unit includes a first mounting member worn on the right side of the user and a second mounting member worn on the left side of the user; the plurality of vibrators include a first vibrator arranged on the first mounting member and a second vibrator arranged on the second mounting member; and the first vibrator vibrates with a stronger intensity than the second vibrator when the direction of the AR display object candidate as viewed from the user corresponds to the user's right, and vibrates with a weaker intensity than the second vibrator when that direction corresponds to the user's left.
  • (6) The wearable display according to (4) or (5) above, in which the plurality of vibrators include a vibrator group arranged side by side in a predetermined direction, and the vibrators included in the vibrator group vibrate in order along the predetermined direction when the direction of the AR display object candidate as viewed from the user corresponds to the predetermined direction.
  • (7) The wearable display according to any one of (1) to (6) above, in which the mounting unit includes a modern part worn on the user's ear and a support part connected to the modern part and supporting the display unit, and the tactile presentation unit is arranged in the modern part.
  • (8) The wearable display according to (7) above, in which the modern part has a higher rigidity than the support part.
  • The tactile presentation unit may present, by tactile sensation, information related to the relative position of the AR display object as viewed from the user for an AR display object candidate whose display mode has changed.
  • The tactile presentation unit may present, by tactile sensation, information related to the relative position of the AR display object as viewed from the user for an AR display object candidate whose distance from the user has undergone a predetermined change.
  • An information processing system including: a wearable display having a display unit that presents an AR (Augmented Reality) display object in a user's field of view, a mounting unit that can be worn by the user, and a tactile presentation unit that is provided on the mounting unit and presents, by tactile sensation, information related to the relative position of an AR display object candidate as viewed from the user; and a control unit that generates a tactile feedback pattern and outputs the tactile feedback pattern to the tactile presentation unit when a predetermined change occurs in an AR display object candidate that can be presented as the AR display object.
  • (16) A control method for a wearable display including a display unit that presents an AR (Augmented Reality) display object in a user's field of view, a mounting unit that can be worn by the user, and a tactile presentation unit provided on the mounting unit, the method including: presenting, by tactile sensation using a tactile feedback pattern, information related to the relative position of an AR display object candidate as viewed from the user when a predetermined change occurs in the AR display object candidate that can be presented as the AR display object.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided is a wearable display, etc. allowing a user to know, through haptic feedback, an image to which an Augmented Reality (AR) technology is applied and information about the position of an object in a real space relating to the image. The wearable display according to an embodiment of the present technology comprises a display part, a wearable part, and a haptics presentation part. The display part presents an AR displayed object in the visual field of a user. The wearable part is configured to support the display part and to be wearable by the user. The haptics presentation part is provided on the wearable part. When a predetermined change occurs in an AR displayed object candidate which may be presented as the AR displayed object, the haptics presentation part presents, by a haptic feedback pattern and through haptics, information about the relative position of the AR displayed object candidate viewed from the user.

Description

Wearable display, information processing system, and control method
The present technology relates to a wearable display capable of presenting, by tactile sensation, information related to an image that can be displayed in its display field of view, an information processing system including the wearable display, and a control method.

A technology called augmented reality (AR), which adds a corresponding image to a real space or to an image of the real space, is known. For example, Patent Document 1 describes a see-through head-mounted display (HMD: Head Mounted Display) capable of displaying an image superimposed on a related object existing in the real space.

AR technology can be applied to, for example, a see-through head-mounted display as described above, or to a captured image of the real space. In either case, the display field of view that the user can visually recognize is limited, and so is the region within that field of view that the user can watch closely.

Patent Document 2 therefore describes an augmented reality system in which, when an image or the like contained in mail employing AR technology (AR mail) is displayed superimposed on the corresponding physical object in the real space captured by the camera function of the receiving device, the user is guided toward the physical object by a vibrator, audio output, or the like.

Patent Document 1: International Publication No. 2014/128810
Patent Document 2: JP 2012-165276 A

However, Patent Document 2 does not disclose a specific method for guiding the user toward the physical object by means of a vibrator.

In view of the above circumstances, an object of the present technology is to provide a wearable display that allows the user to grasp, through tactile feedback, information related to an image to which AR technology is applied and to the position of a real-space object related to that image, as well as an information processing system using the wearable display and a control method.
To achieve the above object, a wearable display according to an embodiment of the present technology includes a display unit, a mounting unit, and a tactile presentation unit.
The display unit presents an AR (Augmented Reality) display object in the user's field of view.
The mounting unit is configured to be wearable by the user.
The tactile presentation unit is provided on the mounting unit.
When a predetermined change occurs in an AR display object candidate that can be presented as the AR display object, the tactile presentation unit presents, by tactile sensation using a tactile feedback pattern, information related to the relative position of the AR display object candidate as viewed from the user.

According to this wearable display, the tactile feedback pattern allows information related to the relative position of the AR display object candidate as viewed from the user to be presented to the user by tactile sensation.

Here, the "AR display object" includes at least an image to which AR technology is applied (an AR object), and may further include a real-space object related to that image. An "AR display object candidate" is one that is prepared so that it can be displayed as an AR display object in the field of view.
The tactile feedback pattern may be generated based on information related to the relative position of the AR display object candidate as viewed from the user.

For example, the information related to the relative position of the AR display object candidate as viewed from the user may include information related to at least one of the direction of the AR display object candidate with respect to the center direction of the user's field of view and the distance between the AR display object candidate and the user.

This allows the user to grasp, for example, the direction of the AR display object candidate with respect to the center direction of the user's field of view, or the distance between the AR display object candidate and the user.

The tactile presentation unit may include a plurality of vibrators arranged at different positions of the mounting unit, and each of the plurality of vibrators may vibrate according to a vibration pattern defined based on the information on the relative position and the arrangement of the plurality of vibrators.

This links the relative position of the AR display object candidate as viewed from the user with the vibration patterns of the vibrators, allowing the user to grasp that relative position intuitively.
More specifically, the mounting unit may include a first mounting member worn on the user's right side and a second mounting member worn on the user's left side, and the plurality of vibrators may include a first vibrator arranged on the first mounting member and a second vibrator arranged on the second mounting member. The first vibrator vibrates with a stronger intensity than the second vibrator when the direction of the AR display object candidate as viewed from the user corresponds to the user's right, and vibrates with a weaker intensity than the second vibrator when that direction corresponds to the user's left.

This expresses the direction of the AR display object candidate as viewed from the user by the vibration intensities of the vibrators, allowing the user to grasp the direction more intuitively.

Alternatively, the plurality of vibrators may include a vibrator group arranged side by side in a predetermined direction, and the vibrators included in the group may vibrate in order along the predetermined direction when the direction of the AR display object candidate as viewed from the user corresponds to that direction.

This expresses the direction of the AR display object candidate with respect to the user by the order in which the vibrators vibrate, again allowing the user to grasp the direction more intuitively.
The mounting unit may also include a modern part worn on the user's ear and a support part connected to the modern part and supporting the display unit, with the tactile presentation unit arranged in the modern part.

Since the tactile presentation unit is then arranged in the modern part away from the display unit, the influence of driving the tactile presentation unit on the display unit can be suppressed.

In this case, the modern part may have a higher rigidity than the support part.

This suppresses deformation of the modern part caused by driving the tactile presentation unit, and more effectively suppresses the influence of that driving on the display unit.

The tactile presentation unit may present, by tactile sensation, information related to the relative position of the AR display object as viewed from the user for a newly acquired AR display object candidate. Alternatively, it may present such information for an AR display object candidate whose display mode has changed.

Either way, more information can be provided to the user via the AR display object candidates.

The tactile presentation unit may also present, by tactile sensation, information related to the relative position of the AR display object as viewed from the user for an AR display object candidate whose distance from the user has undergone a predetermined change.

This allows the user to grasp that a change has occurred in the distance to the AR display object candidate.

Alternatively, the AR display object candidate may be associated with a position outside the field of view.

In that case, even when the user cannot visually recognize the AR display object candidate, information on its relative position can be grasped through tactile feedback.

The display unit may also present, in the field of view, an auxiliary display that supplements the information related to the relative position of the AR display object as viewed from the user presented by the tactile feedback pattern.

This allows the user to grasp the information related to the relative position of the AR display object candidate more accurately.
An information processing system according to another embodiment of the present technology includes a wearable display and a control unit.
The wearable display includes a display unit, a mounting unit, and a tactile presentation unit.
The display unit presents an AR (Augmented Reality) display object in the user's field of view.
The mounting unit is configured to be wearable by the user.
The tactile presentation unit is provided on the mounting unit and presents to the user, by tactile sensation using a tactile feedback pattern, information related to the relative position of the AR display object candidate as viewed from the user.
The control unit generates the tactile feedback pattern and outputs it to the tactile presentation unit when a predetermined change occurs in an AR display object candidate that can be presented as the AR display object.
A control method according to still another embodiment of the present technology is a method of controlling a wearable display that includes a display unit that presents an AR display object in the user's field of view, a mounting unit that can be worn by the user, and a tactile presentation unit provided on the mounting unit.
When a predetermined change occurs in an AR display object candidate that can be presented as the AR display object, information related to the relative position of the AR display object candidate as viewed from the user is presented by tactile sensation using a tactile feedback pattern.

As described above, according to the present technology, the user can grasp, through tactile feedback, information related to an image to which AR technology is applied and to the position of a real-space object related to that image.
Note that the effects described here are not necessarily limited, and may be any of the effects described in the present disclosure.
FIG. 1 is a schematic diagram showing the configuration of an AR system (information processing system) according to a first embodiment of the present technology.
FIG. 2 is a diagram showing an example of the field of view presented by the HMD (wearable display) of the AR system.
FIG. 3 is a block diagram showing the configuration of the AR system.
FIG. 4 is a perspective view showing the appearance of the HMD.
FIG. 5 is a schematic side view of the HMD.
FIG. 6 is a block diagram showing the functional configuration of the control unit of the AR system.
FIG. 7 is a schematic diagram showing the cylindrical coordinates employed in the AR system.
FIG. 8 is a development view showing the field of view presented on the HMD.
FIG. 9 is a development view conceptually showing the relationship between the field of view presented on the HMD and objects.
FIG. 10 is a flowchart showing the operation of the AR system as a whole.
FIG. 11 is a flowchart explaining an operation example of the control unit.
FIG. 12 is a schematic diagram showing a generation example of a tactile feedback pattern by the control unit.
FIG. 13 is a schematic diagram showing a generation example of a tactile feedback pattern by the control unit according to Modification 1-1 of the embodiment.
FIG. 14 is a schematic diagram showing a generation example of a tactile feedback pattern by the control unit according to Modification 1-2 of the embodiment.
FIG. 15 is a schematic side view of the HMD of Modification 1-3 of the embodiment.
FIG. 16 is a schematic top view of the HMD of Modification 1-4 of the embodiment.
FIG. 17 is a schematic top view of the HMD of Modification 1-5 of the embodiment.
FIG. 18 is a graph explaining a drive example of the tactile presentation unit of the HMD shown in FIG. 17.
FIG. 19 is a diagram showing an example of the auxiliary display presented by the HMD of the AR system according to a second embodiment of the present technology.
FIG. 20 is a diagram showing another example of the auxiliary display presented by the HMD.
FIG. 21 is a block diagram showing the functional configuration of the control unit of the AR system.
FIG. 22 is a diagram showing another example of the auxiliary display presented by the HMD.
FIG. 23 is a diagram showing another example of the auxiliary display presented by the HMD.
Hereinafter, embodiments of the present technology will be described with reference to the drawings.
<First Embodiment>
[Schematic configuration of the AR system]
FIG. 1 is a schematic diagram showing the configuration of an AR system (information processing system) according to the first embodiment of the present technology. As shown in the figure, the AR system 100 includes a wearable display 10, a control unit 20 that controls the wearable display 10, a portable information terminal 30, and an AR server 40. In the present embodiment, the wearable display 10 is a head-mounted display (HMD: Head Mounted Display) and is hereinafter referred to as the HMD 10. Note that FIG. 1 shows the shape of the HMD 10 schematically.
The AR server 40 is a server device on the Internet 50. The AR server 40 stores information related to AR objects.
The portable information terminal 30 is typically a smartphone, but may be an information processing apparatus such as a mobile phone, a tablet terminal, a personal computer (PC: Personal Computer), a tablet PC, or a PDA (Personal Digital Assistant).
The portable information terminal 30 can acquire the user's current position using a GPS (Global Positioning System) function. It is also connected to the AR server 40 via the Internet 50 and can acquire information related to AR display processing and the like from the AR server 40. Further, the portable information terminal 30 is connected to the control unit 20 by a short-range wireless communication system such as Bluetooth (registered trademark) and can transmit to the control unit 20 information related to AR display processing and the like and information on the user's current position.
The HMD 10 is configured as a glasses-shaped see-through display.
The control unit 20 is a device for controlling the HMD 10, and controls the operation of the HMD 10 based on input operations by the user. The control unit 20 is connected to the HMD 10 by a cable conforming to a predetermined standard, executes processing based on information acquired from the portable information terminal 30, and outputs the processing results to the HMD 10.
The HMD 10 can thereby provide a user wearing it, via the see-through display, with an AR display object in which an image related to a target object existing in the real space is superimposed on that target object. Hereinafter, an image related to a target object is referred to as an AR object, or simply an object.
FIG. 2 is a diagram showing an example of the field of view presented by the HMD 10.
The field of view V shown in the figure includes an AR display object S. The AR display object S has a target object A existing in the real space and an object B related to it. The target object A is determined according to the type of application program executing processing in the AR system 100 (control unit 20), and may be, for example, a building, a person, a tourist spot, or a dangerous location that the user should be made to watch. The object B is an image displaying information related to the target object A, and may be an image including characters, pictures, and the like, or an animation image. The object may be a two-dimensional image or a three-dimensional image. Furthermore, the shape of the object may be rectangular, circular, or another geometric shape, and can be set as appropriate according to the type of object.
A user wearing the HMD 10 can acquire information related to the target object A from the object B.
Meanwhile, in the present embodiment, AR display object candidates are prepared at predetermined positions in regions outside the field of view V. These AR display object candidates can come to be displayed as the field of view changes with the user's movement or with changes in the posture of the display unit 11.
FIG. 3 is a block diagram showing the configuration of the AR system 100.
Each element of the AR system 100 will be described below with reference to FIG. 3.
[Configuration of the AR server]
As shown in FIG. 3, the AR server 40 includes a CPU 401, a memory 402, and a network communication unit 403. Although not shown, the AR server 40 may also include an input device, a display device, a speaker, and the like as necessary.
The CPU 401 controls the overall operation of the AR server 40.
The memory 402 includes a ROM (Read Only Memory), a RAM (Random Access Memory), and a nonvolatile memory such as an HDD (Hard Disk Drive) or flash memory (SSD: Solid State Drive), and stores programs for the CPU 401 to control the AR server 40, various parameters, and other necessary data.
The network communication unit 403 communicates with the portable information terminal 30 via the Internet 50. The communication method is not particularly limited, and may be wired communication using an Ethernet (registered trademark) NIC (Network Interface Card), or wireless communication using a wireless LAN (IEEE 802.11 or the like) such as WiFi (Wireless Fidelity) or a 3G or 4G mobile communication network.
The memory 402 also holds an AR database 404.
The AR database 404 stores, for each AR display object candidate, information such as the image information of the object, the attributes of the target object, the position of the target object, and the position of the object. The position information of an object is typically absolute position information (latitude, longitude, etc.) acquired from the portable information terminal 30, but may also be expressed, for example, as coordinates on a cylindrical coordinate system surrounding and centered on the user wearing the HMD 10.
Information on new AR display objects is additionally registered in the AR database 404 as appropriate by the portable information terminal 30, by other portable information terminals, and by information processing apparatuses connected to the AR server 40 via the Internet 50.
 [携帯情報端末の構成]
 図3に示すように、携帯情報端末30は、CPU301と、メモリ302と、ネットワーク通信部303と、近距離通信部304と、GPS通信部305と、タッチパネルが搭載された表示部306と、内部電源307とを有する。
 CPU301は、携帯情報端末30全体の動作を制御する。
 メモリ302は、ROM及びRAM、不揮発性メモリ等を有し、CPU301による携帯情報端末30の制御を実行するためのアプリケーションプログラムや各種パラメータ、制御ユニット20へ送信されるAR表示物の情報、その他必要なデータを記憶する。
 ネットワーク通信部303は、WiFi(Wireless Fidelity)等の無線LAN(IEEE802.11等)や移動通信用の3Gや4Gのネットワークを用いて、ARサーバ40等と通信する。携帯情報端末30は、ネットワーク通信部303を介してARサーバ40から、制御ユニット20へ送信するべきAR表示物の情報をダウンロードし、メモリ302へ格納する。
 近距離通信部304は、Bluetooth(登録商標)、赤外線通信等の近距離通信システムを用いて、制御ユニット20や他の携帯情報端末と通信する。
 GPS通信部305は、GPS衛星からの信号を受信することで、携帯情報端末30を携帯するユーザの現在位置を取得する。
 表示部306は、例えばLCD(Liquid Crystal Display)やOELD(Organic ElectroLuminescence Display)で構成され、各種メニューやアプリケーションのGUI等を表示する。典型的には、表示部306は、タッチパネルを搭載しており、ユーザのタッチ操作を受け付け可能である。
 内部電源307は、携帯情報端末30の駆動に必要な電力を供給する。
[Configuration of portable information terminal]
As shown in FIG. 3, the portable information terminal 30 includes a CPU 301, a memory 302, a network communication unit 303, a short-range communication unit 304, a GPS communication unit 305, a display unit 306 on which a touch panel is mounted, And a power source 307.
The CPU 301 controls the operation of the mobile information terminal 30 as a whole.
The memory 302 includes a ROM, a RAM, a non-volatile memory, and the like. An application program and various parameters for executing control of the portable information terminal 30 by the CPU 301, information on an AR display object transmitted to the control unit 20, and other necessary Memorize data.
The network communication unit 303 communicates with the AR server 40 and the like using a wireless LAN (IEEE 802.11 or the like) such as WiFi (Wireless Fidelity) or a 3G or 4G mobile communication network. The portable information terminal 30 downloads information on AR display objects to be transmitted to the control unit 20 from the AR server 40 via the network communication unit 303, and stores it in the memory 302.
The short-range communication unit 304 communicates with the control unit 20 and other portable information terminals using a short-range communication system such as Bluetooth (registered trademark) or infrared communication.
The GPS communication unit 305 acquires the current position of the user carrying the portable information terminal 30 by receiving a signal from a GPS satellite.
The display unit 306 includes, for example, an LCD (Liquid Crystal Display) or an OELD (Organic ElectroLuminescence Display), and displays various menus, GUIs of applications, and the like. Typically, the display unit 306 is equipped with a touch panel and can accept a user's touch operation.
The internal power supply 307 supplies power necessary for driving the portable information terminal 30.
[Control unit configuration]
As shown in FIG. 3, the control unit 20 includes a CPU 201, a memory 202, a communication unit 203, an input operation unit 204, and an internal power source 205.
The CPU 201 controls the operation of the entire control unit 20. The memory 202 includes a ROM, a RAM, and the like, and stores a program for executing control of the control unit 20 by the CPU 201, various parameters, information on an AR display object, and other necessary data. The communication unit 203 constitutes an interface for short-range communication with the portable information terminal 30.
The input operation unit 204 is for controlling an image displayed on the HMD 10 by a user operation. The input operation unit 204 may be configured with a mechanical switch or a touch sensor.
The internal power supply 205 supplies power necessary for driving the HMD 10.
[Configuration of HMD]
FIG. 4 is a perspective view showing the appearance of the HMD 10, and FIG. 5 is a schematic side view of the HMD 10.
As shown in FIGS. 3 and 4, the HMD 10 includes a display unit 11, a mounting unit 12, a tactile sense presentation unit 13, and a detection unit 14.
(Display section)
The display unit 11 presents an AR display object in the user's field of view. In this embodiment, the AR display object includes an AR object and a target object in the real space related to it. The display unit 11 includes first and second display surfaces (display surfaces) 111R and 111L, and first and second image generation units (image generation units) 112R and 112L.
The first and second display surfaces 111R and 111L are formed of optical elements having transparency that can provide the real space (external field of view) to the right eye and the left eye of the user U, respectively. The first and second image generation units 112R and 112L are configured to be able to generate images to be presented to the user U via the first and second display surfaces 111R and 111L, respectively.
The display unit 11 configured as described above can provide the user with a field of view in which a predetermined image (or virtual image) is superimposed on the real space via the display surfaces 111R and 111L.
The display unit 11 may further include a frame 113 that supports the first and second display surfaces 111R and 111L and the first and second image generation units 112R and 112L. The frame 113 is formed of a lightweight material of relatively high rigidity, for example, a metal material such as Mg (magnesium) or Al (aluminum). The frame 113 fixes the relative arrangement of the first and second display surfaces 111R and 111L with respect to the first and second image generation units 112R and 112L, so that a clear image can be provided.
In the present embodiment, the display unit 11 is supported by the mounting unit 12.
(Mounting part)
The mounting unit 12 is configured to be worn by the user.
The mounting unit 12 supports the display surfaces 111R and 111L and the image generation units 112R and 112L, and is mounted on the user's head so that the display surfaces 111R and 111L face the user's right and left eyes, respectively.
For example, the mounting unit 12 includes a first mounting member 121R mounted on the right side of the user, a second mounting member 121L mounted on the left side of the user, and a support unit 122.
In the present embodiment, the first and second mounting members (mounting members) 121R and 121L are mounted on the user's temporal region and are configured in the shape of temples of glasses.
The support portion 122 is configured along the upper portions of the display surfaces 111R and 111L, and connects the first and second mounting members 121R and 121L. The support part 122 supports the display part 11 by being connected to the center part of the frame 113.
The first and second mounting members 121R and 121L include first and second modern portions (modern portions) 123R and 123L formed in a region extending from the tip portion to the central portion, and first and second connection portions 124R and 124L formed between the modern portions 123R and 123L and the support portion 122.
The first and second modern portions (modern portions) 123R and 123L have, for example, a shape that can be hooked on the upper parts of the user's auricles, and are worn on the user's ears. The first and second modern portions 123R and 123L may also have covers 123Ra and 123La formed of a highly elastic material such as a silicone resin or a rubber resin, as shown in FIGS. 4 and 5. This can reduce the burden on the user.
The support portion 122 and the first and second connection portions 124R and 124L are formed of a relatively elastic material such as a resin or a metal as a whole. On the other hand, the first and second modern portions 123R and 123L may have higher rigidity than the support portion 122. The rigidity may be adjusted by the material or by the shape. This makes it difficult for the drive of the tactile sense presentation unit 13, described later, to be transmitted to the display unit 11.
The connection portions 124R and 124L may have the same rigidity as the modern portions 123R and 123L, or the same rigidity as the support portion 122.
(Tactile presentation part)
The tactile sense presentation unit 13 is provided on the mounting unit 12 and presents to the user, by a tactile feedback pattern, information on the relative position of an AR display object candidate as viewed from the user.
The tactile sense presentation unit 13 includes a plurality of vibrators 131 arranged at different positions on the mounting unit 12. In the present embodiment, the plurality of vibrators 131 include a first vibrator 131R disposed on the first mounting member 121R and a second vibrator 131L disposed on the second mounting member 121L. The first and second vibrators (vibrators) 131R and 131L are disposed in the first and second modern portions 123R and 123L, respectively.
In this embodiment, each of the vibrators 131R and 131L is constituted by a vibration motor. The vibration motor generates vibration by rotating an unbalanced weight attached to the rotation shaft of the motor around the shaft. For example, the vibrators 131R and 131L are arranged such that the extending directions of their rotation shafts coincide with the extending directions of the mounting members 121R and 121L, and such that the rotation directions of the weights around the shafts are opposite to each other. This makes it easier to maintain the balance of the first and second mounting members 121R and 121L even while the vibrators 131R and 131L vibrate, and suppresses the influence of the vibration on the display unit 11.
When a predetermined change occurs in an AR display object candidate that can be presented as an AR display object, the tactile sense presentation unit 13 presents, by a tactile feedback pattern, information on the relative position of the AR display object candidate as viewed from the user. Details will be described later.
(Detection unit)
The detection unit 14 can detect a change in posture of the display unit 11. In the present embodiment, the detection unit 14 is configured to detect posture changes around the X, Y, and Z axes, respectively.
The detection unit 14 can be constituted by a motion sensor such as an angular velocity sensor or an acceleration sensor, or a combination thereof. In this case, the detection unit 14 may be a sensor unit in which an angular velocity sensor and an acceleration sensor are each arranged along three axes, or different sensors may be used for different axes. The posture change of the display unit 11, the direction of the change, the amount of the change, and the like can be obtained, for example, from the integrated value of the output of the angular velocity sensor.
Further, a geomagnetic sensor may be employed to detect the orientation of the display unit 11 around the vertical axis (Z axis). Alternatively, the geomagnetic sensor may be combined with the motion sensor described above. This enables highly accurate detection of orientation or posture changes.
In order to obtain the direction of an AR display object relative to the user, the direction (azimuth) of the user's field of view must be known, so the accuracy can be improved by using a geomagnetic sensor.
The detection unit 14 is disposed at an appropriate position on the display unit 11. The position of the detection unit 14 is not particularly limited. For example, the detection unit 14 may be arranged in one of the image generation units 112R and 112L, or may be arranged in a part of the mounting unit 12.
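As a hedged sketch of the integration mentioned above, the following assumes the angular velocity sensor reports yaw-rate samples in degrees per second at a fixed sampling interval; this interface is an assumption for illustration, not the sensor API of the detection unit 14.

```python
def integrate_posture_change(yaw_rates_deg_s, dt_s):
    """Estimate the change in azimuth of the display unit 11 (degrees)
    by integrating angular velocity samples about the vertical axis."""
    return sum(rate * dt_s for rate in yaw_rates_deg_s)

# e.g. 10 samples at 100 Hz while the head turns right at 30 deg/s
print(integrate_posture_change([30.0] * 10, 0.01))  # -> 3.0 degrees
```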
The operation of the HMD 10 configured as described above is controlled by predetermined processing executed by the control unit 20. Hereinafter, the functional configuration of the control unit 20 will be described.
[Functional configuration of control unit]
FIG. 6 is a block diagram showing a functional configuration of the control unit 20.
The control unit 20 includes an AR display processing unit 21 that executes an AR display process and a haptic feedback processing unit 22 that executes a haptic feedback process.
(AR display processing unit)
The AR display processing unit 21 includes a visual field setting unit 211, an AR information management unit 212, and a display control unit 213.
The visual field setting unit 211 sets the range of the visual field based on the posture of the display unit 11 calculated from the detection result of the detection unit 14. The visual field setting unit 211 is executed by the CPU 201 of the control unit 20. In this embodiment, virtual cylindrical coordinates C0 that surround the user U about the vertical axis Az are used for setting the range of the visual field by the visual field setting unit 211.
FIG. 7 is a schematic diagram illustrating the cylindrical coordinates C0 and the field of view V.
As shown in the figure, the cylindrical coordinates C0 form a coordinate system that defines positions on a virtual circumferential surface arranged at a distance (radius) R from the vertical axis Az. The user U (display unit 11) is located on the vertical axis Az. The cylindrical coordinates C0 have a circumferential coordinate axis (θ) representing the angle around the vertical axis with the north direction as 0°, and a height coordinate axis (h) representing the vertical angle with reference to the horizontal line of sight Lh of the user U. The coordinate axis (θ) takes the eastward direction as positive, and the coordinate axis (h) takes the depression angle as positive and the elevation angle as negative. The radius R and the height H of the cylindrical coordinates C0 can be set arbitrarily. The position of the user U that defines the vertical axis Az is determined by the position of the user U acquired by the portable information terminal 30.
As shown in the figure, the visual field V is arranged on the virtual circumferential surface on which the cylindrical coordinates C0 are set.
The visual field setting unit 211 calculates the posture change of the display unit 11 based on the output of the detection unit 14, and determines to which region on the cylindrical coordinates C0 the visual field V of the user U belongs. The region to which the visual field V belongs on the cylindrical coordinates C0 is defined by the ranges of θ and h. The visual field V moves on the cylindrical coordinates C0 as the posture of the user U changes, and its moving direction and amount are calculated based on the output of the detection unit 14.
FIG. 8 is a development view of the cylindrical coordinates C0 indicating the visual field V on the cylindrical coordinates C0. The symbol Oc in the figure indicates the origin of the cylindrical coordinates C0.
The visual field V has a substantially rectangular shape. For example, the circumferential range of the visual field V is expressed by the following formula (3).
θv1 ≦ θv ≦ θv2 (3)
On the other hand, the range of the visual field V in the height direction is represented by the following formula (4).
hv1 ≦ hv ≦ hv2 (4)
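A minimal sketch of the containment test implied by expressions (3) and (4) follows; the modulo-360 handling of a field of view straddling the north (0 degree) direction is an assumption added for completeness, as the text does not specify it.

```python
def in_visual_field(theta, h, theta_v1, theta_v2, h_v1, h_v2):
    """Return True if the point (theta, h) on the cylindrical coordinates C0
    lies within the visual field V of expressions (3) and (4)."""
    theta %= 360.0
    if theta_v1 <= theta_v2:
        in_theta = theta_v1 <= theta <= theta_v2
    else:  # assumed handling when V straddles the 0-degree (north) direction
        in_theta = theta >= theta_v1 or theta <= theta_v2
    return in_theta and h_v1 <= h <= h_v2

print(in_visual_field(20.0, 5.0, 350.0, 30.0, -10.0, 10.0))  # -> True
```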
The AR information management unit 212 acquires information about a predetermined AR display object candidate from the portable information terminal 30 and manages the acquired information about the AR display object candidate. The AR information management unit 212 is executed by the CPU 201 and the memory 202 of the control unit 20.
The AR display object candidate information includes, for example, the position information of the object, the image information of the object, and the position information of the target object.
With reference to FIG. 7 again, position information of the AR display object candidate will be described.
In this embodiment, an AR display object candidate refers to a set of an object that can be arranged on the cylindrical coordinates C0 and a target object in the real space related to it. In FIG. 7, the target object A1 and the object B1 are defined as AR display object candidate S1, the target object A2 and the object B2 as AR display object candidate S2, the target object A3 and the object B3 as AR display object candidate S3, and the target object A4 and the object B4 as AR display object candidate S4. Of these AR display object candidates S1 to S4, those displayed in the visual field V are referred to as "AR display objects".
The target objects A1 to A4 exist in the real space, typically at positions farther from the user U than the radius R. Accordingly, the position information of the target objects A1 to A4 may include information on the distance from the user U to the target objects A1 to A4, or information on their positions (latitude, longitude, and the like) in the real space.
On the other hand, the positions (coordinates) of the objects B1 to B4 are respectively associated with the positions at which the line of sight L of the user watching the target objects A1 to A4 intersects the cylindrical coordinates C0. That is, the position information of the objects B1 to B4 includes, for example, coordinate information on the cylindrical coordinates C0.
In the illustrated example, the center position of each of the objects B1 to B4 coincides with the intersection position. However, the arrangement is not limited to this, and a part of the periphery of the object (for example, a part of the four corners) may coincide with the intersection position. Alternatively, the coordinate positions of the objects B1 to B4 may be associated with arbitrary positions away from the intersection position.
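The following sketch illustrates one way the intersection just described could be computed, assuming the target object's offset from the user is already known as planar east/north/up components in meters; this simplification (a locally flat ground and a straight line of sight) is an assumption for illustration only.

```python
import math

def object_coordinates(east_m, north_m, up_m):
    """Map a target object at offset (east, north, up) from the user's eye
    to (theta, h) on the cylindrical coordinates C0: theta is the angle
    around the vertical axis (north = 0 deg, eastward positive), and h is
    the vertical angle of the line of sight (depression angle positive)."""
    theta = math.degrees(math.atan2(east_m, north_m)) % 360.0
    horizontal_m = math.hypot(east_m, north_m)
    h = math.degrees(math.atan2(-up_m, horizontal_m))  # below eye line -> positive
    return theta, h

# a target 100 m to the east and 20 m below the user's eye line
print(object_coordinates(100.0, 0.0, -20.0))  # -> (90.0, ~11.3)
```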
The AR information management unit 212 acquires, as the AR display object candidate information, information on AR display object candidates corresponding to the current position of the user acquired in advance from the portable information terminal 30. The information acquisition timing of the AR information management unit 212 may be, for example, the timing at which the current position of the user is determined to have changed by a predetermined distance or more, or the timing at which a notification regarding a state change of an AR display object candidate is determined to have been received from the AR server 40.
For example, the AR information management unit 212 can acquire, as AR display object information corresponding to the current position, AR display object information related to target objects existing within a predetermined distance of the user's current position. The AR information management unit 212 may also delete unnecessary AR display object candidate information based on the current position of the user.
The AR information management unit 212 supplies the acquired AR display object candidate information to the display control unit 213.
The display control unit 213 is configured to execute processing of displaying (drawing), in the visual field V, the objects on the cylindrical coordinates C0 that correspond to the orientation of the display unit 11, based on the output of the detection unit 14 (that is, the processing result of the visual field setting unit 211). The display control unit 213 is executed by the CPU 201 of the control unit 20.
FIG. 9 is a developed view of the cylindrical coordinates C0 conceptually showing the relationship between the visual field V on the cylindrical coordinates C0 and the objects B1 to B4.
For example, as shown in the figure, when the current orientation of the visual field V overlaps the display areas of the objects B1 and B2 on the cylindrical coordinates C0, images corresponding to the overlapping areas B10 and B20 are displayed in the visual field V. The display control unit 213 typically changes the display positions of the objects B1 and B2 within the visual field V following changes in the orientation or posture of the display unit 11. This control is continued as long as at least a part of each of the objects B1 and B2 exists in the visual field V.
For example, the display control unit 213 may set xy coordinates (local coordinates) with one point belonging to the visual field V as the origin, and execute a calculation for converting the cylindrical coordinates C0 into the local coordinates.
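As a hedged sketch of such a conversion, the following linearly maps the ranges of expressions (3) and (4) onto a pixel grid; the resolution parameters are illustrative assumptions, and wrap-around of the circumferential axis is ignored for brevity.

```python
def cylindrical_to_local(theta, h, theta_v1, theta_v2, h_v1, h_v2,
                         width_px=1280, height_px=720):
    """Convert (theta, h) on the cylindrical coordinates C0 into xy local
    (screen) coordinates of the visual field V by linear interpolation."""
    x = (theta - theta_v1) / (theta_v2 - theta_v1) * width_px
    y = (h - h_v1) / (h_v2 - h_v1) * height_px
    return x, y

# an object at the center of the field of view maps to the screen center
print(cylindrical_to_local(15.0, 0.0, 0.0, 30.0, -10.0, 10.0))  # -> (640.0, 360.0)
```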
Since the display area (visual field V) of the HMD 10 is narrower than the field of view of the human eye, the objects displayed in the visual field V are limited. For example, since the objects B3 and B4 shown in FIG. 9 are outside the visual field, the user cannot see these objects at all. Thus, presentation of the visual field V alone may not provide sufficient useful information.
Therefore, according to the present embodiment, the position information of AR display object candidates outside the visual field can be conveyed to the user by tactile feedback.
(Tactile feedback processing unit)
The tactile feedback processing unit 22 includes an AR information management unit 221, a tactile processing determination unit 222, and a feedback generation unit 223. In the present embodiment, the tactile feedback pattern is generated based on information on the relative position of the AR display object candidate viewed from the user.
The AR information management unit 221 acquires information on AR display object candidates corresponding to the current position of the user from the portable information terminal 30, and manages the acquired information. The AR information management unit 221 is executed by the CPU 201 and the memory 202 of the control unit 20. The functions of the AR information management unit 221 may also be performed by the AR information management unit 212 of the AR display processing unit 21.
The AR display object candidate information includes, for example, the position information of the object, the image information of the object, and the position information of the target object. The position information of the target objects A1 to A4 may include information on the distance from the user U to the target objects A1 to A4, or information on their positions in the real space. The position information of the objects B1 to B4 may include coordinate information on the cylindrical coordinates C0.
The AR information management unit 221 acquires, as the AR display object candidate information, information on AR display object candidates corresponding to the current position of the user acquired in advance from the portable information terminal 30. The information acquisition timing of the AR information management unit 221 can be, for example, the timing at which the current position of the user is determined to have changed by a predetermined distance or more.
The AR information management unit 221 may also delete unnecessary AR display object candidate information based on the current position of the user.
The AR information management unit 221 supplies the acquired AR display object candidate information to the tactile processing determination unit 222.
The tactile processing determination unit 222 determines whether, among the AR display object candidates acquired by the AR information management unit 221, there is an AR display object candidate to be subjected to the tactile feedback processing. The tactile processing determination unit 222 is executed by the CPU 201 of the control unit 20.
As the criteria for determining the target of the tactile feedback processing, for example, whether there is a newly acquired AR display object candidate, or whether there is an AR display object candidate whose display mode has changed, can be applied. Examples of a "newly acquired" AR display object candidate include a candidate newly acquired because it was newly registered in the AR database 404, and a candidate newly acquired because the distance between the user and the target object has become equal to or less than a predetermined distance due to the user's movement. Examples of an AR display object candidate "whose display mode has changed" include a candidate for which the display mode of the object image registered in the AR database 404 of the AR server 40 has changed.
The tactile processing determination unit 222 supplies the determination result to the feedback generation unit 223.
The feedback generation unit 223 generates a tactile feedback pattern for tactilely presenting information on the relative position, as viewed from the user, of an AR display object candidate determined to be a target of the tactile feedback processing. The feedback generation unit 223 is executed by the CPU 201 of the control unit 20.
The information on the relative position of the AR display object candidate as viewed from the user includes, for example, information related to at least one of the direction of the AR display object candidate with reference to the center direction of the user's visual field and the distance between the AR display object candidate and the user. The information related to the direction of the AR display object candidate with reference to the center direction of the visual field may include, for example, information on the rotation angle of the object with reference to the center direction of the visual field (the front direction of the user), or the height of the object with reference to the height of the center of the visual field.
In the present embodiment, the tactile feedback pattern is a vibration pattern of the first vibrator 131R and the second vibrator 131L, and includes patterns of vibration intensity, vibration timing, and the like for each of the vibrators. As the vibration pattern, for example, when the direction of the AR display object candidate with reference to the center direction of the user's visual field corresponds to the user's right, the first vibrator 131R is vibrated more strongly than the second vibrator 131L; when the direction corresponds to the user's left, the first vibrator 131R is vibrated more weakly than the second vibrator 131L.
Moreover, the AR display object candidate that is the target of the tactile feedback process may be associated with a position outside the field of view.
[Operation example of AR system]
(AR display object candidate information acquisition process)
FIG. 10A is a flowchart showing the flow of the process by which the control unit 20 acquires AR display object candidate information. In this operation example, the control unit 20 executes the acquisition process of AR display object candidate information when, for example, the current position of the user has changed by 50 m or more.
In the figure, the processing from ST101 to ST104 is executed by the portable information terminal 30. Further, the processes of ST201 to ST203 are executed by the AR server 40, and the process of ST301 is executed by the control unit 20.
First, the CPU 301 of the portable information terminal 30 determines whether or not an acquisition condition for information of an AR display object candidate is satisfied (ST101). Specifically, the CPU 301 may determine whether or not the user's current position has changed by 50 m or more based on the current position information acquired from the GPS communication unit 305. In addition to the above, or in place of the above, the CPU 301 may determine whether or not a notification regarding the state change of the AR display object candidate has been received from the AR server 40.
When it is determined that the acquisition condition for the AR display object candidate information is satisfied (Yes in ST101), the network communication unit 303, under the control of the CPU 301, transmits the current position acquired from the GPS communication unit 305 to the AR server 40 (ST102).
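A minimal sketch of the ST101 condition follows. The 50 m threshold is taken from this operation example, while the haversine great-circle formula is an assumed way of comparing two GPS positions, since the determination method itself is not specified.

```python
import math

def moved_at_least(prev_pos, curr_pos, threshold_m=50.0):
    """Return True if the current (latitude, longitude) position differs
    from the previously reported one by at least threshold_m meters."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*prev_pos, *curr_pos))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    distance_m = 2 * 6371000.0 * math.asin(math.sqrt(a))  # mean earth radius
    return distance_m >= threshold_m

print(moved_at_least((35.6586, 139.7454), (35.6591, 139.7454)))  # ~56 m -> True
```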
The network communication unit 403 of the AR server 40 receives the current position transmitted from the portable information terminal 30 under the control of the CPU 401 (ST201). Subsequently, the CPU 401 acquires information on AR display object candidates corresponding to the acquired current position from the AR database 404 (ST202). Then, under the control of the CPU 401, the network communication unit 403 transmits the acquired AR display object candidate information to the portable information terminal 30 (ST203).
The network communication unit 303 of the portable information terminal 30 acquires the AR display object candidate information transmitted from the AR server 40 under the control of the CPU 301 (ST103). Subsequently, under the control of the CPU 301, the short-range communication unit 304 transmits the acquired AR display object candidate information to the control unit 20 (ST104).
Finally, the communication unit 203 of the control unit 20 receives the information of the AR display object candidate transmitted from the portable information terminal 30 based on the control of the CPU 201, and the CPU 201 stores the information in the memory 202 (ST301).
Thereby, the AR information management units 212 and 221 of the control unit 20 can acquire information on AR display object candidates.
On the other hand, the control unit 20 cooperates with the HMD 10 to execute a drawing process at a predetermined drawing timing. The outline of the drawing process will be described below.
(Drawing process)
FIG. 10B is a flowchart showing the flow of the drawing process.
In the figure, the processing of ST401 to ST404 is executed by the HMD 10. Further, the processing of ST501 to ST504 is executed by the control unit 20.
First, the HMD 10 determines whether or not the drawing timing of 1/30 sec has been reached (ST401).
When it is determined that the drawing timing has come (Yes in ST401), the HMD 10 outputs the detection result of the detection unit 14 to the control unit 20 (ST402).
The visual field setting unit 211 of the control unit 20 then acquires the detection result output from the detection unit 14 (ST501).
Having acquired the detection result, the AR information management units 212 and 221 of the control unit 20 acquire the AR display object candidate information from the memory 202 (ST502).
The control unit 20 then executes the AR display process (ST503) and the tactile feedback process (ST504) through cooperation of the CPU 201 and the memory 202. The processing results of the AR display process (ST503) and the tactile feedback process (ST504) are each transmitted to the HMD 10.
The display unit 11 of the HMD 10 then presents a visual field including predetermined AR display objects based on the processing result of the AR display process (ST403), and the tactile sense presentation unit 13 presents information on the relative positions of objects by touch based on the processing result of the tactile feedback process (ST404).
[Operation example of control unit]
FIG. 11 is a flowchart for explaining an operation example of the control unit: FIG. 11A is a flowchart of the AR display process (ST503), and FIG. 11B is a flowchart of the tactile feedback process (ST504).
(AR display processing)
Referring to FIG. 11A, first, the visual field setting unit 211 sets the range of the visual field V expressed by expressions (3) and (4), based on the posture of the display unit 11 calculated from the detection result of the detection unit 14 (see FIGS. 7 and 8) (ST601).
Subsequently, the display control unit 213 determines whether there is an object to be displayed in the visual field V, based on the AR display object candidate information managed by the AR information management unit 212 (see FIG. 7) (ST602). If it is determined that there is such an object (Yes in ST602), the display control unit 213 executes processing of displaying the object overlapping the visual field V (ST603).
As described above, the control unit 20 ends the AR display process.
(Tactile feedback processing)
Subsequently, referring to FIG. 11B, the tactile processing determination unit 222 determines whether, among the AR display object candidates acquired by the AR information management unit 221, there is an AR display object candidate to be subjected to the tactile feedback processing (ST701).
Specifically, the tactile processing determination unit 222 determines whether there is a newly acquired AR display object candidate among the AR display object candidates acquired by the AR information management unit 221. A newly acquired AR display object candidate is, for example, an AR display object candidate that was not acquired by the AR information management unit 221 in the previous process but has been acquired in the current process.
When it is determined that there is a newly acquired AR display object candidate, the feedback generation unit 223 generates a tactile feedback pattern for each AR display object candidate determined to be a target, as follows (ST702).
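A minimal sketch of this ST701 determination follows, assuming each candidate carries an identifier so that the current and previous acquisitions can be compared as sets; the identifier scheme is an illustrative assumption.

```python
def newly_acquired(previous_ids, current_ids):
    """Return identifiers of AR display object candidates acquired in the
    current process but not in the previous one (ST701)."""
    return set(current_ids) - set(previous_ids)

print(newly_acquired({"S1", "S2"}, {"S1", "S2", "S3"}))  # -> {'S3'}
```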
(Tactile feedback pattern generation example)
FIG. 12 is a schematic diagram illustrating a generation example of a tactile feedback pattern.
The feedback generation unit 223 first calculates the rotation angle (circumferential coordinate) θ'b [°] of the object B included in the AR display object candidate determined to be a target, with reference to the front direction of the user U (ST702-1).
The coordinate θv0 in the front direction of the user U is represented by the following formula (5) in view of the range in the circumferential direction of the visual field V shown by the formula (3).
θv0 [°] = (θv1 + θv2) / 2 (5)
On the cylindrical coordinates C0, the area on the right side of the user U is represented by coordinates from θv0 [°] to (θv0 + 180) [°], and the area on the left side of the user U by coordinates from (θv0 + 180) [°] to (θv0 + 360) [°].
The relative rotation angle θ′b of the object B is expressed by the following equation (6) using the object coordinate θb in the cylindrical coordinate C0 and the coordinate θv0 in the front direction of the user U.
θ′b = θb−θv0 (6)
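A sketch of the ST702-1 computation under expressions (5) and (6) follows; the normalization of the result into [0, 360) degrees, so that 0 to 180 corresponds to the user's right and 180 to 360 to the left, is an assumption consistent with the convention described above.

```python
def relative_rotation_angle(theta_b, theta_v1, theta_v2):
    """Compute theta'_b, the circumferential coordinate of object B relative
    to the front direction of the user U, per expressions (5) and (6)."""
    theta_v0 = (theta_v1 + theta_v2) / 2.0   # expression (5)
    return (theta_b - theta_v0) % 360.0      # expression (6), assumed mod 360

print(relative_rotation_angle(75.0, 0.0, 30.0))  # -> 60.0 (to the user's right)
```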
Subsequently, the feedback generation unit 223 refers to, for example, a lookup table stored in advance in the memory 202, and determines the region to which θ'b belongs (ST702-2).
As shown in FIG. 12, in this operation example, first to fifth regions R1 to R5 are defined so as to surround the user U. FIG. 12 shows an example in which θ′b belongs to the first region R1. The range of θ′b belonging to the first region R1 is expressed by the following formula (10).
θ′v2 <θ′b ≦ 90 ° (10)
In the equation, θ′v2 is a relative coordinate of the right end portion of the visual field V with the front direction of the user U as a reference (0 °), and satisfies θ′v2 = θv2−θv0.
Similarly, the range of θ′b belonging to the visual field V and the second to fifth regions R2 to R5 is shown in Table 1 below.
In the table, θ′v1 is a relative coordinate of the left end portion of the field of view V with the front direction of the user U as a reference (0 °), and satisfies θ′v1 = θv1−θv0 similarly to θ′v2.
[Table 1: ranges of θ'b for the visual field V and the first to fifth regions R1 to R5, together with the corresponding vibration intensities of the vibrators 131R and 131L]
Subsequently, the feedback generation unit 223 refers to a lookup table stored in advance in the memory 202 and determines the vibration pattern of the vibrators 131R and 131L according to the region to which θ'b belongs (ST702-3).
FIG. 12 and Table 1 show the relationship between each of the first to fifth regions R1 to R5 stored in the lookup table and the intensities of the vibrators 131R and 131L.
In FIG. 12, "R" indicates the first vibrator 131R arranged on the right side, and "L" indicates the second vibrator 131L arranged on the left side. The numerical values corresponding to "R" and "L" in FIG. 12 indicate the vibration intensities of the vibrators 131R and 131L. For the vibration intensity values shown in the figure and the table, 0 means no vibration, the vibration intensity increases as the value increases, and 10 is the maximum intensity.
As described above, the control unit 20 ends the tactile feedback generation process.
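The following sketch imitates the lookups performed in ST702-2 and ST702-3. The intensity pairs for the visual field V, R1, and R2 follow the values described below and in Table 1; the pairs for R3 to R5 and all region boundaries other than expression (10) are symmetric assumptions standing in for the stored table.

```python
# (right vibrator 131R, left vibrator 131L) intensities on the 0-10 scale
VIBRATION_PATTERN = {
    "V":  (0, 0),    # candidate inside the visual field: no vibration
    "R1": (10, 0),   # right front
    "R2": (8, 2),    # right rear
    "R3": (5, 5),    # rear: substantially equal intensities (assumed values)
    "R4": (2, 8),    # left rear (assumed, mirroring R2)
    "R5": (0, 10),   # left front (assumed, mirroring R1)
}

def region_of(theta_rel, theta_v1_rel, theta_v2_rel):
    """Classify theta'_b in [0, 360) (right: 0-180, left: 180-360) into the
    visual field V or the regions R1-R5; theta_v1_rel is theta'_v1 mod 360.
    Boundaries other than expression (10) are illustrative assumptions."""
    if theta_rel <= theta_v2_rel or theta_rel >= theta_v1_rel:
        return "V"
    if theta_rel <= 90:
        return "R1"    # expression (10)
    if theta_rel <= 160:
        return "R2"
    if theta_rel <= 200:
        return "R3"
    if theta_rel <= 270:
        return "R4"
    return "R5"

print(VIBRATION_PATTERN[region_of(60.0, 345.0, 15.0)])  # R1 -> (10, 0)
```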
(Operation of this operation example)
Referring to FIG. 12 and Table 1, for example, the first region R1 corresponds to the right front of the user U and substantially coincides with the movable range of the user U's right arm, which the user can recognize as his or her own right. In this case, the first vibrator 131R arranged on the right side vibrates at the maximum intensity, and the second vibrator 131L arranged on the left side does not vibrate. The user thus perceives a large vibration on the right side of the head and can intuitively recognize that a new object exists on his or her right.
Similarly, the second region R2 corresponds to the right rear of the user U. In this case, the first vibrator 131R vibrates at about 80% of the maximum intensity, and the second vibrator 131L at about 20% of the maximum intensity. The user thus perceives a large vibration on the right side of the head and a slight vibration on the left side, and can recognize that a new object exists in a direction rotated slightly further to the left than his or her right (the first region R1), that is, to the right rear.
Similarly, the third region R3 corresponds to the area behind the user U, that is, the user's back. In this case, both vibrators 131R and 131L vibrate at substantially the same intensity. The user U can thus recognize that a new object exists in a region biased neither to the right nor to the left, that is, behind the user.
Similarly, when a new object exists in the fourth region R4 or the fifth region R5 corresponding to the left side of the user U, the vibration allows the user to recognize the presence and direction of the new object.
As shown in FIG. 12 and Table 1, neither of the vibrators 131R and 131L vibrates for an AR display object candidate included in the visual field V. That is, the AR display object candidates to be subjected to the tactile feedback processing are associated with positions outside the visual field V. Thus, according to this operation example, the user can grasp, by tactile feedback, the directions of AR display object candidates not included in the visual field V.
[Operational effects of this embodiment]
As described above, according to the present embodiment, the control unit 20 can generate a tactile feedback pattern capable of presenting the direction of an AR display object candidate with reference to the center direction of the user's visual field. Thus, even when the user cannot visually recognize the object, the user can accurately grasp the relative position of the AR display object candidate based on the tactile feedback pattern.
In particular, according to the present embodiment, the tactile sense presentation unit 13 can present information about a newly acquired AR display object candidate by touch. Thereby, new information can be presented to the user together with its position information, and the objects can be used effectively.
Furthermore, according to the present embodiment, by linking the arrangement of the plurality of vibrators 131R and 131L provided on the left and right with the relative position of the AR display object candidate and adjusting the vibration intensities accordingly, the direction of the AR display object candidate can be grasped intuitively.
Furthermore, according to the present embodiment, the tactile sense presentation unit 13 is provided not on the display unit 11 but on the mounting unit 12, and in particular is arranged in the modern portions 123R and 123L spaced apart from the display unit 11, so that vibration is not easily transmitted to the display unit 11. This suppresses displacement of the display surfaces 111R and 111L due to the tactile feedback, and a clear, easily viewable image can be presented even during vibration.
Moreover, the above effects can be further enhanced by increasing the rigidity of the modern portions 123R and 123L.
[AR system application example]
An application example of the AR system 100 (control unit 20) of the present embodiment will be described.
(Application example 1)
The AR system 100 can be applied to an application program (hereinafter also abbreviated as an app) that presents the position of a predetermined person in real time. In this case, the target object is a person, such as a family member or friend, registered in advance by the user, and the object can be information identifying that person (a name or the like).
For example, when the AR server 40 determines that a person (target object) exists within a predetermined distance (for example, 1 km) of the user, it acquires the information on the person and the object as AR display object candidate information. The tactile processing determination unit 222 of the control unit 20 then determines whether the AR display object candidate acquired via the portable information terminal 30 is newly acquired. When it is determined to be newly acquired, the feedback generation unit 223 generates a tactile feedback pattern based on the position information of the person and the object.
Thereby, even when the object is not displayed, the user can grasp the approach of the person and the position information of the object, and can move the visual field toward the object.
Thereafter, when the distance between the user and the person changes so that they come still closer, and the AR server 40 determines that the person exists within another predetermined distance (for example, 200 m) of the user, the AR server 40 newly acquires the information on the person and the object again as AR display object candidate information. The tactile processing determination unit 222 of the control unit 20 then determines whether the AR display object candidate acquired via the portable information terminal 30 is newly acquired. When it is determined to be newly acquired, the feedback generation unit 223 generates a tactile feedback pattern based on the position information of the person and the object.
Thereby, the tactile sense presentation unit 13 can tactilely present, for an AR display object candidate whose distance from the user has undergone a predetermined change, information on the relative position of the AR display object candidate as viewed from the user. Therefore, even when the object is not displayed, the user can grasp the person's further approach and the position information of the object, and can even meet the person by moving the visual field toward the object.
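A hedged sketch of this two-stage notification follows; the 1 km and 200 m thresholds are taken from the text, while the helper for detecting a newly crossed threshold is an illustrative assumption about how the AR server 40 might stage the re-acquisition.

```python
def crossed_thresholds(prev_distance_m, curr_distance_m,
                       thresholds=(1000.0, 200.0)):
    """Return the notification distances (m) the person has newly come
    within since the previous determination."""
    return [t for t in thresholds if curr_distance_m <= t < prev_distance_m]

print(crossed_thresholds(1500.0, 800.0))  # -> [1000.0] first notification
print(crossed_thresholds(800.0, 150.0))   # -> [200.0]  second notification
```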
(Application example 2)
The AR system 100 can be applied to an app that notifies the user of dangerous locations such as fire sites, accident sites, and accident-prone spots. In this case, the target object is the dangerous location, and the object can be information associated with the dangerous location (its position, cautionary information, and the like).
For example, when the AR server 40 determines that a dangerous location (target object) exists within a predetermined distance (for example, 10 km) of the user, it acquires the information on the dangerous location and the object as AR display object candidate information. The tactile processing determination unit 222 of the control unit 20 then determines whether the AR display object candidate acquired via the portable information terminal 30 is newly acquired. When it is determined to be newly acquired, the feedback generation unit 223 generates a tactile feedback pattern based on the position information of the dangerous location and the object.
Thereby, even when the object is not displayed, the user can grasp the approach of the dangerous place and the position information of the object, can move the field of view toward the object, escape in the direction opposite to the dangerous place, etc. You can take avoidance actions.
The distance informing the dangerous location may be changed depending on the type of danger and the user's action (whether walking or riding a car). Further, the vibration intensity of the tactile feedback pattern may be changed according to the degree of risk. Further, even when an object is displayed, a haptic feedback pattern may be generated. This makes it possible to avoid danger without fail.
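A minimal sketch of how such thresholds and intensities might be parameterized follows; every concrete value and the function name below are illustrative assumptions, not values from the specification.

```python
# Illustrative lookup: notification radius (m) and vibration intensity (1-10)
# per danger type and user activity. All values are assumptions.
NOTIFY_PARAMS = {
    ("fire", "walking"): (2_000, 7),
    ("fire", "driving"): (10_000, 9),
    ("accident_prone", "walking"): (500, 4),
    ("accident_prone", "driving"): (3_000, 6),
}

def notification_params(danger_type: str, activity: str) -> tuple[int, int]:
    """Return (radius_m, intensity) for a danger type and user activity."""
    return NOTIFY_PARAMS.get((danger_type, activity), (1_000, 5))

radius_m, intensity = notification_params("fire", "driving")
print(radius_m, intensity)  # -> 10000 9
```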
(Application example 3)
The AR system 100 can be applied to a navigation application that provides route guidance, or to a town guide application that provides local information. In this case, the target objects are tourist spots, stations, bus stops, stores, gas stations, parking lots, evacuation sites, congested sections, construction sites, and the like, and the objects can be information attached to these locations (spot names, position information, and so on).
For example, when the AR server 40 determines that such a location (a target object) exists within a predetermined distance (for example, 10 km) from the user, the AR server 40 acquires information on the location and the associated object as AR display object candidate information. The tactile processing determination unit 222 of the control unit 20 then determines whether the AR display object candidate acquired via the portable information terminal 30 is newly acquired. When it is determined to be newly acquired, the feedback generation unit 223 generates a tactile feedback pattern based on the position information of the location and the object.
Alternatively, the tactile processing determination unit 222 may determine whether the display mode of an already acquired AR display object candidate has changed, and when it is determined to have changed, the feedback generation unit 223 may generate a tactile feedback pattern. Thereby, the tactile sense providing unit 13 can present information about an AR display object candidate whose display mode has changed by tactile sensation. The display mode can change when, for example, the congestion status of the location, which is part of the display content of the object, changes.
Thereby, even when the object is not displayed, the user can grasp the approach of the location, changes in the content of the object, and the position information of the object, and can move the field of view toward the object.
(Application example 4)
The AR system 100 can be applied to a game application tied to the real world. In this case, for example, the AR display object has no target object and consists only of objects such as game characters and items.
For example, when the AR server 40 determines that the position set for an object exists within a predetermined distance (for example, several hundred meters to several kilometers) from the user, the AR server 40 acquires the information on that object as AR display object candidate information. The tactile processing determination unit 222 of the control unit 20 then determines whether the AR display object candidate information acquired via the portable information terminal 30 is newly acquired. When it is determined to be newly acquired, the feedback generation unit 223 generates a tactile feedback pattern based on the position information of the object.
Thereby, the user can grasp that enemy characters or items exist in the vicinity and can approach them.
Alternatively, the tactile processing determination unit 222 may determine whether the display mode of an already acquired object has changed, and when it is determined to have changed, the feedback generation unit 223 may generate a tactile feedback pattern. The display mode can change, for example, when the content of an item displayed as an object changes, or when a battle situation displayed as an object changes. A tactile feedback pattern can also be generated when an enemy character displayed as an object performs an attacking action, or when another user connected via the Internet 50 is newly registered.
Thereby, even when the object is not displayed, the user can grasp the approach of the relevant location, changes in the content of the object, and the position information of the object, and can move the field of view toward the object.
[Modification 1-1]
The generation of the tactile feedback pattern is not limited to the example shown in FIG. 12 and Table 1; the regions to which the rotation angle θ'b of the object, referenced to the front direction of the user U, belongs may instead be set as follows.
For example, as shown in FIG. 13, the regions to which θ'b can belong may consist only of the visual field V and first and second regions R1 and R2. In this case, the first region R1 is expressed by the following expression (11), and the second region R2 by the following expression (12).
θ'v2 < θ'b ≤ 180° …(11)
180° < θ'b < θ'v1 …(12)
When the object is in the first region R1, the object is to the user's right, so the first vibrator 131R vibrates and the second vibrator 131L does not. Conversely, when the object is in the second region R2, the object is to the user's left, so the second vibrator 131L vibrates and the first vibrator 131R does not.
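A minimal sketch of this region test follows; the function name and the concrete field-of-view edge angles θ'v1 and θ'v2 are assumptions for illustration.

```python
# Field-of-view edge angles, measured clockwise from the user's front
# direction in [0, 360). The concrete values are illustrative assumptions.
THETA_V2 = 30.0    # right edge of the visual field V
THETA_V1 = 330.0   # left edge of the visual field V

def select_vibrator(theta_b: float) -> str:
    """Classify the object's angle per expressions (11) and (12):
    R1 (right side) drives vibrator 131R, R2 (left side) drives 131L."""
    theta_b %= 360.0
    if THETA_V2 < theta_b <= 180.0:   # expression (11): region R1
        return "131R"
    if 180.0 < theta_b < THETA_V1:    # expression (12): region R2
        return "131L"
    return "in_visual_field"          # object lies inside V

assert select_vibrator(90.0) == "131R"   # object to the right
assert select_vibrator(270.0) == "131L"  # object to the left
```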
The intensities of the vibrators 131R and 131L are also not limited to the example shown in FIG. 12 and Table 1, and can be adjusted as appropriate.
Alternatively, the information related to the direction of the AR display object candidate referenced to the center direction of the user's visual field may include information on the height of the object referenced to the height of the center of the visual field. In this case, the region to which the relative position of the object belongs may be a region set with respect to the value of h in the height direction.
For example, when AR display object candidate objects exist above and below the visual field V, the first and second vibrators 131R and 131L can each be vibrated at an intensity of 5 with a specific vibration pattern.
Further, tactile feedback may also be generated for objects included in the visual field V.
Thereby, when there is an object in the visual field that the user has difficulty noticing, the presence and position information of that object can be provided to the user through tactile feedback.
[Modification 1-2]
In the embodiment above, the tactile feedback pattern is generated based on, for example, information on the direction of the AR display object candidate referenced to the center direction of the user's visual field, but the present technology is not limited to this.
For example, the feedback generation unit 223 can generate a tactile feedback pattern based on information related to the distance between the AR display object candidate and the user.
FIG. 14 is a schematic diagram showing an example of tactile feedback pattern generation according to this modification.
The feedback generation unit 223 determines the region to which the target object of the AR display object candidate belongs, based on the distance between the target object and the user U.
As shown in the figure, in this modification a first region R11 farthest from the user U, a second region R12 closer than the first region R11, and a third region R13 closer than the second region R12 are set. The distance of each region from the user U can be set as appropriate.
Subsequently, the feedback generation unit 223 refers to a lookup table stored in advance in the memory 202 and determines the vibration pattern of the vibrator 131 according to the determined region.
The graphs in FIG. 14 show example vibration patterns for when the position of the target object corresponds to each of the first to third regions R11 to R13; the horizontal axis is time and the vertical axis is vibration intensity.
As these graphs show, the vibrator 131 vibrates intermittently, for example at a predetermined pitch. Referring to the graph for the first region R11, let the length of one cycle in the first region R11 (referred to as the vibration pitch) be t1, and the continuous vibration time in each cycle be w1. Similarly, let the vibration pitch in the second region R12 be t2 and the continuous vibration time in each cycle be w2, and let the vibration pitch in the third region R13 be t3 and the continuous vibration time in each cycle be w3. The relationships between t1 to t3 and between w1 to w3 are expressed by the following expressions (13) and (14), respectively.
t1 > t2 > t3 …(13)
w1 < w2 < w3 …(14)
That is, according to this vibration pattern, as the distance between the user U and the target object decreases, the continuous vibration time becomes longer and the vibration pitch becomes shorter. Thereby, even when the object is not displayed, the user can grasp the distance to the target object based on the vibration pitch and the continuous vibration time.
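A minimal sketch of this distance-to-pattern lookup follows, assuming illustrative region boundaries and timing values; none of the concrete numbers appear in the specification, they only need to satisfy expressions (13) and (14).

```python
# (region boundary in meters, vibration pitch t in s, continuous time w in s).
# Values are illustrative; they satisfy t1 > t2 > t3 and w1 < w2 < w3.
DISTANCE_PATTERNS = [
    (100.0, 0.4, 0.30),          # R13: nearest region  -> t3, w3
    (500.0, 0.8, 0.20),          # R12: middle region   -> t2, w2
    (float("inf"), 1.6, 0.10),   # R11: farthest region -> t1, w1
]

def vibration_pattern(distance_m: float) -> tuple[float, float]:
    """Return (pitch_s, continuous_time_s) for the region containing distance_m."""
    for boundary, pitch, duration in DISTANCE_PATTERNS:
        if distance_m <= boundary:
            return pitch, duration
    raise ValueError("unreachable")

assert vibration_pattern(50.0) == (0.4, 0.30)     # close: short pitch, long buzz
assert vibration_pattern(2_000.0) == (1.6, 0.10)  # far: long pitch, short buzz
```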
The vibration pattern of this modification is not limited to the example above, as long as the pattern is set according to the distance between the user U and the target object.
For example, instead of vibrating continuously, the vibrator 131 may vibrate in pulses. In this case, the number of consecutively generated pulses in the pattern may increase as the distance between the user U and the target object decreases.
Alternatively, instead of varying the time during which the vibrator 131 vibrates continuously, the intensity of the vibrator 131 may be varied.
The vibrator 131 may also vibrate at varying pitches rather than at an even pitch.
Furthermore, the first and second vibrators 131R and 131L may vibrate at the same vibration intensity, or may vibrate at intensities set with respect to θ'b [°] as described with reference to FIG. 12 and Table 1. Thereby, the feedback generation unit 223 can generate a tactile feedback pattern based on information related to both the direction of the AR display object candidate referenced to the center direction of the user's visual field and the distance between the AR display object candidate and the user.
[Modification 1-3]
FIG. 15 is a schematic side view of the HMD 10 according to this modification.
As shown in the figure, the tactile sense providing unit 13 has a plurality of vibrators 131a, 131b, and 131c arranged on the second mounting member 121L. The vibrators 131a, 131b, and 131c are all arranged on the modern portion 123L of the second mounting member 121L.
The vibrators 131a, 131b, and 131c constitute a vibrator group 132 arranged side by side in a predetermined direction. The predetermined direction is, for example, the extending direction of the modern portion 123L, and can be, for example, the Z-axis direction.
When the direction of the AR display object candidate referenced to the center direction of the user's visual field corresponds to the predetermined direction, the vibrators 131a, 131b, and 131c included in the vibrator group 132 can vibrate in order along that direction. A specific example is described below using the graphs in FIG. 15. In this example, the target object is assumed to be behind the user.
As the graphs show, the vibrators 131a, 131b, and 131c each have substantially the same vibration pitch and continuous vibration time, but their vibration timings differ. Specifically, they vibrate in order from the vibrator 131a rearward in the Z-axis direction: first 131a, then 131b, then 131c. Thereby, the user perceives the vibration traveling from front to rear in the Z-axis direction and can intuitively recognize that the target object is behind them (in the Z-axis direction).
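A minimal sketch of such a sequential drive follows, assuming a hypothetical `drive` callback and illustrative timing offsets.

```python
import time

# Vibrators ordered front-to-rear along the Z axis on the modern portion 123L.
GROUP_132 = ["131a", "131b", "131c"]

def drive(vibrator_id: str, duration_s: float) -> None:
    # Placeholder for the actual actuator command (assumption).
    print(f"vibrate {vibrator_id} for {duration_s:.2f} s")

def sweep_rearward(offset_s: float = 0.15, duration_s: float = 0.10) -> None:
    """Vibrate 131a, 131b, 131c in order so the user perceives the
    vibration traveling front-to-rear, i.e. 'the object is behind you'."""
    for vibrator_id in GROUP_132:
        drive(vibrator_id, duration_s)
        time.sleep(offset_s)  # stagger the onsets; pitch and width stay equal

sweep_rearward()
```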
In this modification, the arrangement of the plurality of vibrators is not limited to the above. For example, in addition to the vibrator group 132 arranged on the left second mounting member 121L, the tactile sense providing unit 13 may also have a vibrator group on the right first mounting member 121R. The first and second vibrators 131R and 131L described in the embodiment above may also constitute a vibrator group. In that case, the vibrator group is arranged side by side in the left-right direction (the X-axis direction); for example, when the target object corresponds to the user's right, the vibrators may vibrate in the order 131L, then 131R.
[Modification 1-4]
The configuration of the mounting portion 12 of the HMD 10 is not limited to the example above.
FIG. 16 is a top view schematically showing an HMD 10A according to this modification.
As shown in the figure, the mounting portion 12A does not have the first and second mounting members 121R and 121L, but instead has a band-shaped mounting member 121A. The ends of the mounting member 121A are connected to the right and left ends of the display unit 11, respectively. The mounting member 121A is worn on the user's head from one temporal region, across the occipital region, to the other temporal region. The mounting portion 12A may also have earpieces or the like (not shown).
The tactile sense providing unit 13A has a plurality of vibrators 131Aa, 131Ab, 131Ac, 131Ad, 131Ae, and 131Af arranged on the mounting member 121A. The vibrators 131Aa to 131Af are arranged along the longitudinal direction of the mounting member 121A, for example on the portions of the mounting member 121A worn from the user's temporal regions to the occipital region.
By presenting various vibration patterns, the vibrators 131Aa to 131Af can provide the user with position information of AR display object candidates.
The vibrators 131Aa to 131Af form, for example, a first vibrator group 132Aa arranged along the X-axis and Z-axis directions, and a second vibrator group 132Ab likewise arranged along the X-axis and Z-axis directions. The first vibrator group 132Aa includes the vibrators 131Aa, 131Ab, and 131Ac. The second vibrator group 132Ab includes the vibrators 131Ad, 131Ae, and 131Af.
For example, when the target object is located behind the user, the first vibrator group 132Aa and/or the second vibrator group 132Ab vibrate in order from front to rear in the Z-axis direction.
Alternatively, when the target object is located to the user's right, the first vibrator group 132Aa vibrates in order from left to right in the X-axis direction. Similarly, when the target object is located to the user's left, the second vibrator group 132Ab vibrates in order from right to left in the X-axis direction.
Thereby, as in Modification 1-3, the user can be made to perceive the position information of the AR display object intuitively.
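A minimal sketch of this direction-to-group mapping follows, reusing the staggered-drive idea from the previous sketch; the group memberships follow the text, while the orderings within each sweep are assumptions consistent with FIG. 16.

```python
# Vibrator groups on the band-shaped mounting member 121A (memberships per
# the text; the physical ordering within each group is an assumption).
GROUP_132AA = ["131Aa", "131Ab", "131Ac"]  # worn on the user's right side
GROUP_132AB = ["131Ad", "131Ae", "131Af"]  # worn on the user's left side

def firing_order(direction: str) -> list[list[str]]:
    """Return per-group vibrator firing orders for a target direction.
    Each inner list is driven as a staggered sweep (cf. Modification 1-3)."""
    if direction == "behind":   # sweep front-to-rear along the Z axis
        return [GROUP_132AA, GROUP_132AB]  # either or both groups may sweep
    if direction == "right":    # group 132Aa sweeps left-to-right (X axis)
        return [GROUP_132AA]
    if direction == "left":     # group 132Ab sweeps right-to-left (X axis)
        return [GROUP_132AB]
    return []

print(firing_order("right"))  # -> [['131Aa', '131Ab', '131Ac']]
```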
[Modification 1-5]
The tactile sense providing unit 13 is not limited to a configuration having the vibrators 131R and 131L composed of vibration motors.
The tactile sense providing unit 13 may instead have, for example, a linear actuator in which a weight is driven back and forth along one axis; a "smart material" such as a piezoelectric material, an electroactive polymer, or a shape-memory alloy; a macro fiber composite actuator; an electrostatic actuator; an electrotactile actuator; or another kind of actuator. Furthermore, the tactile sense providing unit 13 may have, for example, a device using electrostatic friction ("ESF") or ultrasonic surface friction ("USF"), a device that produces acoustic radiation pressure with an ultrasonic haptic transducer, a device using a haptic substrate and a flexible or deformable surface, or a projected tactile presentation device such as one that blows a puff of air using an air jet.
FIG. 17 is a schematic top view showing an HMD 10B according to this modification.
The tactile sense providing unit 13B of the HMD 10B has a first linear actuator 133R and a second linear actuator 133L. The first and second linear actuators 133R and 133L are arranged on the first and second modern portions 123R and 123L, respectively, and may be arranged, for example, in the regions of the first and second modern portions 123R and 123L that contact the upper part of the auricle.
The first and second linear actuators 133R and 133L include weights 134R and 134L that are driven back and forth along predetermined axes, respectively. The arrows in FIG. 17 indicate the driving directions of the weights 134R and 134L. In the example shown in FIG. 17, the AR display object candidate S corresponds to the user U's right; the weight 134R is driven forward in the Z-axis direction, and the weight 134L is driven rearward in the Z-axis direction.
FIG. 18 is a graph explaining an example of driving the weight 134R shown in FIG. 17; the horizontal axis is time, and the vertical axis is position toward the front in the Z-axis direction.
As shown in the figure, at the start of the tactile feedback presentation the weight 134R first moves forward in the Z-axis direction at a relatively high speed, and after reaching a predetermined position, returns rearward in the Z-axis direction at a relatively slow speed. The linear actuator 133L has a similar drive profile, although its weight moves in the opposite direction to that of the linear actuator 133R.
With such a drive profile, the user can get the sensation of being tapped by someone from the front right. The HMD 10B can therefore prompt the user to move the field of view to the right, and can present the user with a tactile feedback pattern linked to the behavior it wants to induce.
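A minimal sketch of such an asymmetric drive profile as a position-versus-time function follows; the concrete timings and stroke length are illustrative assumptions based on FIG. 18.

```python
def weight_position(t_s: float, stroke_mm: float = 2.0,
                    t_out_s: float = 0.05, t_back_s: float = 0.40) -> float:
    """Position of weight 134R along +Z over one feedback cycle:
    a fast outward stroke followed by a slow return, which is what
    makes the cue feel like a directional 'tap' rather than a buzz."""
    if t_s < 0.0:
        return 0.0
    if t_s < t_out_s:                      # fast forward stroke
        return stroke_mm * (t_s / t_out_s)
    if t_s < t_out_s + t_back_s:           # slow return stroke
        return stroke_mm * (1.0 - (t_s - t_out_s) / t_back_s)
    return 0.0

# Sample the profile; the peak sits early in the cycle.
samples = [round(weight_position(t / 100.0), 2) for t in range(0, 50, 5)]
print(samples)
```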
Alternatively, the tactile sense providing unit 13B may have a single linear actuator. Even with such a configuration, the tactile sense providing unit 13B can present the user with a tactile feedback pattern that links the driving direction of the weight to the direction of the AR display object candidate referenced to the user.
<Second Embodiment>
The display unit 11 of the HMD 10 may be configured to present, in the visual field, an auxiliary display that supplements the information on the relative position of the AR display object candidate as viewed from the user presented by the tactile feedback pattern.
In this embodiment, configurations similar to those of the first embodiment are given the same reference signs, and their description is omitted.
FIGS. 19 and 20 are diagrams showing presentation examples of the auxiliary display.
FIG. 19 shows an example in which the AR display object candidate exists to the user's right. In this case, the auxiliary display D1 moves from left to right along the upper edge of the visual field V while the tactile feedback is presented.
FIG. 20 shows an example in which the AR display object candidate exists above the visual field. In this case, the auxiliary display D2 moves from bottom to top along the right edge of the visual field V while the tactile feedback is presented.
As shown in FIGS. 19 and 20, because the auxiliary displays D1 and D2 move along the edges of the visual field V, their influence on other displays within the visual field V can be reduced.
The shapes of the auxiliary displays D1 and D2 may be rectangles, circles, or other geometric shapes, or may be animations, illustrations, and the like.
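A minimal sketch of this edge animation as a parametric position function follows; the coordinate convention, frame size, and names are assumptions.

```python
def auxiliary_display_pos(direction: str, progress: float,
                          width: int = 1280, height: int = 720) -> tuple[int, int]:
    """Position of the auxiliary display marker at progress in [0, 1].
    'right' slides D1 left-to-right along the top edge (FIG. 19);
    'above' slides D2 bottom-to-top along the right edge (FIG. 20)."""
    progress = max(0.0, min(1.0, progress))
    if direction == "right":
        return int(progress * width), 0               # top edge, moving right
    if direction == "above":
        return width, int((1.0 - progress) * height)  # right edge, moving up
    raise ValueError(f"unsupported direction: {direction}")

# Advance the marker each frame while tactile feedback is being presented.
print([auxiliary_display_pos("right", p / 4) for p in range(5)])
```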
FIG. 21 is a block diagram showing the functional configuration of a control unit 20C according to this embodiment.
Like the first embodiment, the control unit 20C has the AR display processing unit 21 and the tactile feedback processing unit 22, and further has an auxiliary display processing unit 23.
The auxiliary display processing unit 23 has an auxiliary display control unit 231 that executes processing to supplement the information on the relative position of the AR display object candidate as viewed from the user presented by the tactile feedback pattern. The auxiliary display control unit 231 is executed by the CPU 201 of the control unit 20.
The auxiliary display control unit 231 is supplied with the determination result from the tactile processing determination unit 222, and with the information on the relative position of the AR display object candidate as viewed from the user from the AR information management unit 221. Based on this information, the auxiliary display control unit 231 executes drawing processing and animation processing for the auxiliary display.
According to this embodiment, combining tactile feedback with the auxiliary display allows the user to grasp the position information of the AR display object candidate more easily.
[Modification 2-1]
The display mode of the auxiliary display is not limited to the examples shown in FIGS. 19 and 20.
For example, as shown in FIG. 22, auxiliary displays D3 and D4 may display the position of the AR display object candidate, the content of its object, and the like.
Thereby, the position information and content of the AR display object candidate can be grasped in more detail.
Alternatively, as shown in FIG. 23, an auxiliary display D5 may be arranged in a partial region of the visual field V (in the example shown, a region along the lower edge) and be a schematic developed view of the cylindrical coordinates C0 surrounding the user. In this case, the auxiliary display D5 has, for example, a region DV corresponding to the developed view of the visual field V at its center, and regions DR and DL to its right and left. The center of the auxiliary display D5 (a position within the region DV) corresponds to the user's front direction. The regions DR and DL correspond to the developed views to the right and left outside the visual field. The auxiliary display D5 has a sub-object SB corresponding to the object of the AR display object candidate. The sub-object SB is arranged at the position in the auxiliary display D5 corresponding to the circumferential direction of the AR display object candidate referenced to the user.
Thereby, the user can be made to grasp the position information of the AR display object candidate more accurately.
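A minimal sketch of placing the sub-object SB within the developed view D5 from the candidate's circumferential angle follows; the pixel width and the angle convention are assumptions.

```python
def sub_object_x(theta_deg: float, d5_width_px: int = 640) -> int:
    """Map a circumferential angle (0 deg = user's front, increasing
    clockwise) to a horizontal pixel position in the strip D5, whose
    center corresponds to the front direction."""
    # Normalize to [-180, 180) so the front maps to the strip's center.
    theta = (theta_deg + 180.0) % 360.0 - 180.0
    return int((theta + 180.0) / 360.0 * d5_width_px)

assert sub_object_x(0.0) == 320    # straight ahead -> center of D5
assert sub_object_x(90.0) == 480   # to the right   -> right of center
assert sub_object_x(-90.0) == 160  # to the left    -> left of center
```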
Although embodiments of the present technology have been described above, the present technology is not limited to the embodiments described above, and it goes without saying that various modifications can be made without departing from the gist of the present technology.
In the embodiments above, the AR system has been described as having the AR server, the portable information terminal, the control unit, and the HMD, but the present technology is not limited to this.
For example, the AR system may be configured without the portable information terminal, with the AR server and the control unit communicating directly.
The control unit may be configured integrally with the HMD, or may be configured by the portable information terminal. Alternatively, the control unit may be configured by a plurality of devices, for example the AR server and the portable information terminal.
The wearable display may also be configured with a mounting portion worn on the user's head or the like, a display unit not supported by the mounting portion, and a tactile sense providing unit provided on the mounting portion.
In this case, the display unit may be, for example, a contact-lens-type display device placed on the user's eye. Alternatively, the display unit may be a portable information terminal such as a smartphone, or a display device configured to be attached to eyeglasses worn by the user.
The mounting portion may be, for example, a configuration hooked over the user's ear or a headphone-type configuration. With such a mounting portion, the tactile sensation can be perceived with relatively high accuracy without blocking the user's field of view.
Furthermore, in the operation examples above, the necessary AR display object candidate information is acquired from the AR server each time the user's current position or the attitude of the display unit changes, but the present technology is not limited to this.
For example, the control unit (HMD) or the portable information terminal may, at startup, collectively acquire from the AR server the AR display object candidate information necessary for the AR display processing and the tactile feedback processing, and hold it in memory.
For example, in the embodiments above, application of the present technology to an HMD has been described, but the present technology is also applicable to wearable displays that can be worn on, for example, the wrist, arm, or neck.
The embodiments above described application to a see-through (transmissive) HMD, but the present technology is also applicable to non-transmissive HMDs. In that case, a predetermined object according to the present technology may be displayed over an external field of view captured by a camera attached to the display unit.
Furthermore, in the embodiments above, the AR display object was described as including a predetermined target object existing in the real space and an object containing information related to it; however, the present technology is not limited to this, and the AR display object need not have a target object. In that case, as shown in Application example 4 above, the object is associated with a predetermined position in the real space, and the control unit may be configured to calculate the distance between the user and the AR display object using this position information.
Note that the present technology may also take the following configurations.

(1) A wearable display including:
a display unit that presents an AR (Augmented Reality) display object in a user's visual field;
a mounting portion that can be worn by the user; and
a tactile sense providing unit, provided on the mounting portion, that, when a predetermined change occurs in an AR display object candidate capable of being presented as the AR display object, presents information on the relative position of the AR display object candidate as viewed from the user by tactile sensation using a tactile feedback pattern.

(2) The wearable display according to (1) above, in which
the tactile feedback pattern is generated based on the information on the relative position of the AR display object candidate as viewed from the user.

(3) The wearable display according to (1) or (2) above, in which
the information on the relative position of the AR display object candidate as viewed from the user includes information related to at least one of the direction of the AR display object candidate referenced to the center direction of the user's visual field and the distance between the AR display object candidate and the user.

(4) The wearable display according to any one of (1) to (3) above, in which
the tactile sense providing unit includes a plurality of vibrators arranged at different positions on the mounting portion, and
each of the plurality of vibrators vibrates with a vibration pattern defined based on the information on the relative position and the arrangement of the plurality of vibrators.

(5) The wearable display according to (4) above, in which
the mounting portion includes
a first mounting member worn on the user's right side, and
a second mounting member worn on the user's left side,
the plurality of vibrators include
a first vibrator arranged on the first mounting member, and
a second vibrator arranged on the second mounting member, and
the first vibrator vibrates at a stronger intensity than the second vibrator when the direction of the AR display object candidate as viewed from the user corresponds to the user's right, and vibrates at a weaker intensity than the second vibrator when the direction of the AR display object candidate as viewed from the user corresponds to the user's left.

(6) The wearable display according to (4) or (5) above, in which
the plurality of vibrators have a vibrator group arranged side by side in a predetermined direction, and
the vibrators included in the vibrator group vibrate in order along the predetermined direction when the direction of the AR display object candidate as viewed from the user corresponds to the predetermined direction.

(7) The wearable display according to any one of (1) to (6) above, in which
the mounting portion includes
a modern portion worn on the user's ear, and
a support portion connected to the modern portion and supporting the display unit, and
the tactile sense providing unit is arranged on the modern portion.

(8) The wearable display according to (7) above, in which
the modern portion has a higher rigidity than the support portion.

(9) The wearable display according to any one of (1) to (8) above, in which
the tactile sense providing unit presents, by tactile sensation, information on the relative position, as viewed from the user, of a newly acquired AR display object candidate.

(10) The wearable display according to any one of (1) to (9) above, in which
the tactile sense providing unit presents, by tactile sensation, information on the relative position of the AR display object as viewed from the user, for an AR display object candidate whose display mode has changed.

(11) The wearable display according to any one of (1) to (10) above, in which
the tactile sense providing unit presents, by tactile sensation, information on the relative position of the AR display object as viewed from the user, for an AR display object candidate whose distance from the user has undergone a predetermined change.

(12) The wearable display according to any one of (1) to (11) above, in which
the AR display object candidate is associated with a position outside the visual field.

(13) The wearable display according to any one of (1) to (12) above, in which
the display unit presents, in the visual field, an auxiliary display that supplements the information on the relative position of the AR display object as viewed from the user presented by the tactile feedback pattern.

(14) The wearable display according to any one of (1) to (13) above, in which
the display unit is supported by the mounting portion.

(15) An information processing system including:
a wearable display having a display unit that presents an AR display object in a user's visual field, a mounting portion that can be worn by the user, and a tactile sense providing unit, provided on the mounting portion, that presents to the user, by tactile sensation using a tactile feedback pattern, information on the relative position of an AR display object candidate as viewed from the user; and
a control unit that, when a predetermined change occurs in an AR display object candidate capable of being presented as the AR display object, generates a tactile feedback pattern and outputs the tactile feedback pattern to the tactile sense providing unit.

(16) A method of controlling a wearable display including a display unit that presents an AR (Augmented Reality) display object in a user's visual field, a mounting portion that can be worn by the user, and a tactile sense providing unit provided on the mounting portion, the method including:
when a predetermined change occurs in an AR display object candidate capable of being presented as the AR display object, presenting information on the relative position of the AR display object candidate as viewed from the user by tactile sensation using a tactile feedback pattern.
100 … information processing system (AR system)
10, 10A, 10B … wearable display (HMD)
11 … display unit
12, 12A … mounting portion
121R … first mounting member (mounting member)
121L … second mounting member (mounting member)
122 … support portion
123R, 123L … modern portion
13, 13A … tactile sense providing unit
131 … plurality of vibrators
131R … first vibrator (vibrator)
131L … second vibrator (vibrator)
132 … vibrator group
20, 20C … control unit

Claims (16)

1. A wearable display comprising:
a display unit that presents an AR (Augmented Reality) display object in a user's visual field;
a mounting portion that can be worn by the user; and
a tactile sense providing unit, provided on the mounting portion, that, when a predetermined change occurs in an AR display object candidate capable of being presented as the AR display object, presents information on the relative position of the AR display object candidate as viewed from the user by tactile sensation using a tactile feedback pattern.

2. The wearable display according to claim 1, wherein
the tactile feedback pattern is generated based on the information on the relative position of the AR display object candidate as viewed from the user.

3. The wearable display according to claim 1, wherein
the information on the relative position of the AR display object candidate as viewed from the user includes information related to at least one of a direction of the AR display object candidate referenced to a center direction of the user's visual field and a distance between the AR display object candidate and the user.

4. The wearable display according to claim 1, wherein
the tactile sense providing unit includes a plurality of vibrators arranged at different positions on the mounting portion, and
each of the plurality of vibrators vibrates with a vibration pattern defined based on the information on the relative position and an arrangement of the plurality of vibrators.

5. The wearable display according to claim 4, wherein
the mounting portion includes
a first mounting member worn on the user's right side, and
a second mounting member worn on the user's left side,
the plurality of vibrators include
a first vibrator arranged on the first mounting member, and
a second vibrator arranged on the second mounting member, and
the first vibrator vibrates at a stronger intensity than the second vibrator when the direction of the AR display object candidate as viewed from the user corresponds to the user's right, and vibrates at a weaker intensity than the second vibrator when the direction of the AR display object candidate as viewed from the user corresponds to the user's left.

6. The wearable display according to claim 4, wherein
the plurality of vibrators have a vibrator group arranged side by side in a predetermined direction, and
the vibrators included in the vibrator group vibrate in order along the predetermined direction when the direction of the AR display object candidate as viewed from the user corresponds to the predetermined direction.

7. The wearable display according to claim 1, wherein
the mounting portion includes
a modern portion worn on the user's ear, and
a support portion connected to the modern portion and supporting the display unit, and
the tactile sense providing unit is arranged on the modern portion.

8. The wearable display according to claim 7, wherein
the modern portion has a higher rigidity than the support portion.

9. The wearable display according to claim 1, wherein
the tactile sense providing unit presents, by tactile sensation, information on the relative position, as viewed from the user, of a newly acquired AR display object candidate.

10. The wearable display according to claim 1, wherein
the tactile sense providing unit presents, by tactile sensation, information on the relative position, as viewed from the user, of an AR display object candidate whose display mode has changed.

11. The wearable display according to claim 1, wherein
the tactile sense providing unit presents, by tactile sensation, information on the relative position, as viewed from the user, of an AR display object candidate whose distance from the user has undergone a predetermined change.

12. The wearable display according to claim 1, wherein
the AR display object candidate is associated with a position outside the visual field.

13. The wearable display according to claim 1, wherein
the display unit presents, in the visual field, an auxiliary display that supplements the information on the relative position of the AR display object candidate as viewed from the user presented by the tactile feedback pattern.

14. The wearable display according to claim 1, wherein
the display unit is supported by the mounting portion.

15. An information processing system comprising:
a wearable display having a display unit that presents an AR display object in a user's visual field, a mounting portion that can be worn by the user, and a tactile sense providing unit, provided on the mounting portion, that presents to the user, by tactile sensation using a tactile feedback pattern, information on the relative position of an AR display object candidate as viewed from the user; and
a control unit that, when a predetermined change occurs in an AR display object candidate capable of being presented as the AR display object, generates a tactile feedback pattern and outputs the tactile feedback pattern to the tactile sense providing unit.

16. A method of controlling a wearable display comprising a display unit that presents an AR display object in a user's visual field, a mounting portion that can be worn by the user, and a tactile sense providing unit provided on the mounting portion, the method comprising:
when a predetermined change occurs in an AR display object candidate capable of being presented as the AR display object, presenting information on the relative position of the AR display object candidate as viewed from the user by tactile sensation using a tactile feedback pattern.
PCT/JP2016/000826 2015-04-23 2016-02-17 Wearable display, information processing system, and control method WO2016170717A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015088499 2015-04-23
JP2015-088499 2015-04-23

Publications (1)

Publication Number Publication Date
WO2016170717A1 true WO2016170717A1 (en) 2016-10-27

Family

ID=57143021

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/000826 WO2016170717A1 (en) 2015-04-23 2016-02-17 Wearable display, information processing system, and control method

Country Status (1)

Country Link
WO (1) WO2016170717A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100238161A1 (en) * 2009-03-19 2010-09-23 Kenneth Varga Computer-aided system for 360º heads up display of safety/mission critical data
JP2014194767A (en) * 2013-03-15 2014-10-09 Immersion Corp Wearable haptic device

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018082363A (en) * 2016-11-18 2018-05-24 セイコーエプソン株式会社 Head-mounted display device and method for controlling the same, and computer program
US11726557B2 (en) 2017-03-21 2023-08-15 Interdigital Vc Holdings, Inc. Method and system for the detection and augmentation of tactile interactions in augmented reality
CN110582741A (en) * 2017-03-21 2019-12-17 Pcms控股公司 Method and system for haptic interaction detection and augmentation in augmented reality
CN110582741B (en) * 2017-03-21 2024-04-02 交互数字Vc控股公司 Method and system for haptic interaction detection and augmentation in augmented reality
JP2018182535A (en) * 2017-04-13 2018-11-15 日本電信電話株式会社 Aerial image projection apparatus and aerial image projection method
JP2020524831A (en) * 2017-05-30 2020-08-20 マジック リープ, インコーポレイテッドMagic Leap,Inc. Power supply assembly with fan assembly for electronic devices
JP7480236B2 (en) 2017-05-30 2024-05-09 マジック リープ, インコーポレイテッド Power supply assembly with fan assembly for an electronic device - Patents.com
CN110710014A (en) * 2017-05-30 2020-01-17 奇跃公司 Power supply assembly with fan assembly for electronic device
US11797065B2 (en) 2017-05-30 2023-10-24 Magic Leap, Inc. Power supply assembly with fan assembly for electronic device
JP7319927B2 (en) 2017-05-30 2023-08-02 マジック リープ, インコーポレイテッド Power supply assembly with fan assembly for electronic devices
JP2020526065A (en) * 2017-06-13 2020-08-27 ビーハプティクス インコーポレイテッド Head mounted display
US11131856B2 (en) 2017-06-13 2021-09-28 Bhaptics Inc. Head-mounted display
JP2020526107A (en) * 2017-09-14 2020-08-27 アップル インコーポレイテッドApple Inc. Face seal for head mounted display
US11774768B2 (en) 2017-09-14 2023-10-03 Apple Inc. Face seal for head-mounted display
JP7480388B2 (en) 2019-03-06 2024-05-09 Maxell, Ltd. Head-mounted information processing device
US11786811B2 (en) 2021-04-22 2023-10-17 Korea University Of Technology And Education Industry-University Cooperation Foundation Visual and tactile AR medium based on augmented reality
US20220339534A1 (en) 2021-04-22 2022-10-27 Korea University Of Technology And Education Industry-University Cooperation Foundation Visual and tactile AR medium based on augmented reality
EP4080330A1 (en) * 2021-04-22 2022-10-26 Korea University of Technology and Education Industry-University Cooperation Foundation Visual and tactile medium based on augmented reality
WO2023106063A1 (en) * 2021-12-06 2023-06-15 JVCKenwood Corporation Notification device, notification method, and program

Similar Documents

Publication Publication Date Title
WO2016170717A1 (en) Wearable display, information processing system, and control method
JP7268692B2 (en) Information processing device, control method and program
US10740973B2 (en) Ultrasonic collision management in virtual, augmented, and mixed reality (xR) applications
US10809530B2 (en) Information processing apparatus and information processing method
US9599821B2 (en) Virtual reality system allowing immersion in virtual space to consist with actual movement in actual space
US11037532B2 (en) Information processing apparatus and information processing method
US10115235B2 (en) Method for controlling head mounted display, and system for implementing the method
WO2014204330A1 (en) Methods and systems for determining 6DOF location and orientation of head-mounted display and associated user movements
US10579109B2 (en) Control device and control method
JP2012078224A (en) Image generation system, program, and information storage medium
JP2018200557A (en) Program to provide virtual space, information processing device to execute the same and method for providing virtual space
US11086392B1 (en) Devices, systems, and methods for virtual representation of user interface devices
JP2018072992A (en) Information processing method and apparatus, and program for causing a computer to execute the information processing method
JP2018036720A (en) Virtual space observation system, method and program
JP2015026286A (en) Display device, display system and control method of display device
JP2022048144A (en) Pseudo force device
JP6495398B2 (en) Method and program for providing virtual space, and information processing apparatus for executing the program
KR20170142161A (en) Center-of-gravity moving force device
EP3253283A1 (en) Fan-driven force device
JP6927797B2 (en) Methods, programs and computers for providing virtual space to users via head-mounted devices
JP6457446B2 (en) Method and apparatus for supporting communication in virtual space, and program for causing computer to execute the method
US20230252691A1 (en) Passthrough window object locator in an artificial reality system
JP6118444B1 (en) Information processing method and program for causing computer to execute information processing method
JP2018049629A (en) Method and device for supporting input in virtual space and program for causing computer to execute the method
JP2018094086A (en) Information processing device and image formation method

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 16782747
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: PCT application non-entry in European phase
    Ref document number: 16782747
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: JP