WO2016021252A1 - Information processing apparatus, information processing method, and image display system - Google Patents
- Publication number
- WO2016021252A1 (PCT/JP2015/062934)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- orientation
- display device
- image display
- information processing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0325—Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0132—Head-up displays characterised by optical features comprising binocular systems
- G02B2027/0134—Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0464—Positioning
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2356/00—Detection of the display position w.r.t. other display screens
Definitions
- The technology disclosed in this specification relates to an information processing apparatus and an information processing method for providing screen information to be displayed on an image display apparatus that is worn on a user's head or face and used for viewing images, and to an image display system.
- An image display device that is worn on the head or face and used for viewing images, that is, a head-mounted display, is known.
- In a head-mounted display, for example, an image display unit is arranged for each of the left and right eyes, and an enlarged virtual image of the display image is formed by a virtual image optical system, allowing the user to observe a realistic image.
- The head-mounted display is equipped with high-resolution display panels, made of, for example, liquid crystal or organic EL (Electro-Luminescence) elements, as the display units for the left and right eyes.
- Head-mounted displays can be classified into transmissive and light-shielding types.
- A transmissive head-mounted display allows the wearer to observe the surrounding scenery even while it is worn on the head and displaying an image (see, for example, Patent Document 1).
- This allows the user to avoid dangers such as collisions with obstacles.
- A light-shielding head-mounted display, on the other hand, is configured to directly cover the wearer's eyes when worn on the head, which increases the sense of immersion when viewing images. The display screen is enlarged and projected by a virtual image optical system so that the user observes it as an enlarged virtual image with an appropriate angle of view, and multi-channel audio is reproduced through headphones, recreating the sense of presence of a movie theater (see, for example, Patent Document 2).
- Among light-shielding head-mounted displays, a video see-through type is also known: a built-in camera photographs the scene in front of the wearer, and the captured external image is displayed so that the surrounding scenery can be observed through it (see, for example, Patent Documents 3 and 4).
- In contrast to video see-through, the former transmissive head-mounted display is called optical see-through, or simply see-through.
- A head-mounted display restricts the vision and hearing of the user who wears it, whether light-shielding or transmissive. In particular, a user wearing a light-shielding head-mounted display has their view completely blocked and cannot perform any operations on the outside world, such as answering a phone call or entering information on a computer screen.
- The video see-through head-mounted display described above can display an image of the user's field of view captured by the camera, so the user can operate an object in the field of view through the displayed image.
- A portable terminal system has also been proposed that comprises a mobile terminal including a display device that performs display according to display information, and a head-mounted display that displays a virtual screen in a display area within the wearer's field of view based on display information acquired from the mobile terminal (see, for example, Patent Document 5).
- However, such a portable terminal system presupposes that the head-mounted display is equipped with a camera for photographing the outside world. For a user wearing a head-mounted display without a camera, it is difficult to perform operations on the outside world without removing the display from the head and completely interrupting viewing.
- An object of the technology disclosed in this specification is to provide an excellent information processing apparatus, information processing method, and image display system that can suitably provide screen information to be displayed on an image display device that is worn on a user's head or face and used for viewing images.
- The first aspect of the technology disclosed in this specification is an information processing apparatus comprising: an imaging unit; a position and orientation calculation unit that calculates the position and orientation of an external image display device in a first reference coordinate system of the information processing apparatus, based on an image of the image display device captured by the imaging unit; a position and orientation conversion unit that converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device; and an embedding position calculation unit that calculates, based on the position and orientation of the information processing apparatus in the second reference coordinate system, a position at which screen information related to the information processing apparatus is embedded in the screen of the image display device.
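The conversion step at the heart of this aspect is a rigid-transform inversion: if the terminal's camera estimates the display's pose (R, t) in the terminal's frame (the first reference coordinate system), the terminal's pose in the display's frame (the second reference coordinate system) is (Rᵀ, −Rᵀt), and an embedding position follows by projecting that position onto the display's screen. A minimal sketch in pure Python; the function names, focal length, and screen parameters are illustrative, not taken from the specification:

```python
def transpose(R):
    """Transpose a 3x3 matrix given as a list of rows."""
    return [[R[j][i] for j in range(3)] for i in range(3)]

def mat_vec(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def invert_pose(R, t):
    """Invert a rigid transform: if x_terminal = R x_display + t, then
    x_display = R^T x_terminal - R^T t."""
    Rt = transpose(R)
    return Rt, [-c for c in mat_vec(Rt, t)]

def embed_position(p, f=1000.0, cx=640.0, cy=360.0):
    """Project a 3D point in the display's frame onto its screen with a
    simple pinhole model (f, cx, cy are made-up screen parameters)."""
    x, y, z = p
    return (cx + f * x / z, cy + f * y / z)

# Display pose measured in the terminal's frame: terminal and display face
# each other (180-degree rotation about the vertical axis), 0.5 m apart.
R = [[-1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, -1.0]]
t = [0.0, 0.0, 0.5]

# Terminal pose in the display's frame, then its on-screen embedding point.
R_inv, t_inv = invert_pose(R, t)     # t_inv == [0.0, 0.0, 0.5]
u, v = embed_position(t_inv)         # screen centre: (640.0, 360.0)
```

In this facing-each-other configuration the terminal projects to the screen centre; in general, as the user moves the terminal, the embedding point tracks where the terminal sits in the display wearer's view.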
- The second aspect of the technology disclosed in this specification is an information processing apparatus comprising: an imaging unit; a display unit; a position and orientation calculation unit that calculates the position and orientation of an external image display device in a first reference coordinate system of the information processing apparatus, based on an image of the image display device captured by the imaging unit; a position and orientation conversion unit that converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device; and an embedding position calculation unit that calculates, based on the position and orientation of the information processing apparatus in the second reference coordinate system, a position at which screen information of the display unit is embedded in the screen of the image display device.
- In a further aspect, the information processing apparatus further comprises a communication unit that communicates with the image display apparatus, and is configured to transmit the screen information of the display unit and the embedding position information to the image display apparatus via the communication unit.
- In another aspect, the position and orientation calculation unit of the information processing apparatus calculates the position and orientation of the image display device in the first reference coordinate system based on reference indices mounted on the image display device side.
- In another aspect, the information processing apparatus further comprises a reference index arrangement information acquisition unit that acquires information about the relationship between the attitude of the image display apparatus and the attitudes of the reference indices.
- In another aspect, the reference index arrangement information acquisition unit of the information processing apparatus is preloaded with design information about the relationship between the attitude of the image display apparatus and the attitudes of the reference indices, and the position and orientation calculation unit calculates, based on this design information, the position and orientation of the image display device in the first reference coordinate system from an image captured by the imaging unit while the information processing apparatus is held in a predetermined direction with respect to the image display device.
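When design information such as the physical spacing of the reference indices is preloaded, even a single image measurement constrains the display's pose. For instance, under a pinhole camera model the apparent pixel separation of two indices of known physical spacing yields the camera-to-display distance, valid when the pair is roughly fronto-parallel to the camera. A rough sketch; the focal length and spacing are made-up values:

```python
def distance_from_known_spacing(pixel_dist, real_dist_m, focal_px):
    """Pinhole model: pixel_dist / focal_px == real_dist_m / distance,
    hence distance == focal_px * real_dist_m / pixel_dist.
    Assumes the two indices lie roughly fronto-parallel to the camera."""
    return focal_px * real_dist_m / pixel_dist

# Two reference indices 12 cm apart on the display body appear 240 px
# apart in an image taken with a 1000 px focal length (illustrative).
d = distance_from_known_spacing(pixel_dist=240.0, real_dist_m=0.12,
                                focal_px=1000.0)   # d == 0.5 metres
```

A full 6-degree-of-freedom pose, as the specification requires, would use four or more such known index positions and a perspective-n-point solver rather than this single-distance estimate.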
- In another aspect, the reference index arrangement information acquisition unit of the information processing apparatus is configured to receive information transmitted from the reference indices.
- In another aspect, the reference index arrangement information acquisition unit of the information processing apparatus identifies each reference index based on at least one of color, texture, and shape, and is configured to acquire information about the axes of the reference coordinate system of the image display apparatus.
- In another aspect, the reference index arrangement information acquisition unit of the information processing apparatus is configured to receive information encoded in a blinking pattern from the reference indices.
- In another aspect, the reference index arrangement information acquisition unit of the information processing apparatus queries a database with the information obtained by decoding the blinking pattern, and acquires object coordinate information, reference index information, or image display device information.
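One way the blinking-pattern scheme could work: sample each reference index's brightness over consecutive frames, threshold the samples into bits, and use the decoded value as a search key into a database that returns object coordinates, reference index information, or image display device information. A toy sketch; the bit protocol and the database contents are invented for illustration:

```python
def decode_blink_pattern(samples, threshold=0.5):
    """Threshold per-frame brightness samples into bits (MSB first)
    and pack them into an integer search key."""
    value = 0
    for s in samples:
        value = (value << 1) | (1 if s >= threshold else 0)
    return value

# Hypothetical database mapping decoded keys to reference-index records:
# the display model and the index's object coordinates in metres.
INDEX_DB = {
    0b1011: {"model": "HMD-100", "index": "A", "coords_m": (0.06, 0.02, 0.0)},
    0b1101: {"model": "HMD-100", "index": "B", "coords_m": (-0.06, 0.02, 0.0)},
}

samples = [0.9, 0.1, 0.8, 0.7]        # bright, dark, bright, bright -> 1011
record = INDEX_DB[decode_blink_pattern(samples)]
```

A real implementation would add framing and error detection to the blink protocol; the point here is only the decode-then-lookup flow the text describes.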
- In another aspect, the position and orientation calculation unit of the information processing apparatus calculates the position and orientation of the image display device in the first reference coordinate system using Visual SLAM.
- In another aspect, the information processing apparatus stores the position and orientation of the image display device calculated by the position and orientation calculation unit in a reference state in which the information processing apparatus is held in an initial attitude, and the position and orientation conversion unit calculates the change in attitude from the reference state and, on that basis, converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in the second reference coordinate system.
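The reference-state scheme above can be sketched as: record the pose measured while the terminal is held in its initial attitude, then express each later measurement as a change relative to that stored pose. For brevity this sketch tracks only a single yaw axis plus a 3D position; all names and numbers are illustrative:

```python
class ReferenceState:
    """Stores the pose (yaw in degrees, position in metres) measured in
    the reference state and reports later changes relative to it.
    Simplified to one rotation axis for illustration."""

    def __init__(self, yaw_deg, pos):
        self.yaw0 = yaw_deg
        self.pos0 = pos

    def delta(self, yaw_deg, pos):
        # Wrap the yaw difference into [-180, 180) so that, e.g.,
        # 350 deg -> 10 deg reads as +20 deg, not -340 deg.
        dyaw = (yaw_deg - self.yaw0 + 180.0) % 360.0 - 180.0
        dpos = tuple(b - a for a, b in zip(self.pos0, pos))
        return dyaw, dpos

# Pose captured while the terminal was held in its initial attitude.
ref = ReferenceState(yaw_deg=350.0, pos=(0.0, 0.0, 0.5))

# A later measurement: rotated 20 degrees and moved 5 cm sideways.
dyaw, dpos = ref.delta(yaw_deg=10.0, pos=(0.05, 0.0, 0.5))
```

The conversion unit would then apply this delta to the stored pose to re-express the terminal's position in the display's reference coordinate system.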
- The thirteenth aspect of the technology disclosed in this specification is an information processing method comprising: a position and orientation calculation step of calculating the position and orientation of an image display device in a first reference coordinate system of an information terminal, based on an image of the image display device captured by an imaging unit mounted on the information terminal; a position and orientation conversion step of converting the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information terminal in a second reference coordinate system of the image display device; and an embedding position calculation step of calculating, based on the position and orientation of the information terminal in the second reference coordinate system, a position at which the screen information of the information terminal is embedded in the screen of the image display device.
- The fourteenth aspect of the technology disclosed in this specification is an image display system comprising: an image display device fixed to an observer's head or face; and an information processing apparatus that has an imaging unit and a display unit, calculates the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus based on an image of the external image display device captured by the imaging unit, converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device, and calculates a position at which screen information of the display unit is embedded in the screen of the image display device.
- The term "system" here refers to a logical collection of a plurality of devices (or functional modules that realize specific functions), and it does not matter whether each device or functional module is housed in a single enclosure.
- According to the technology disclosed in this specification, an excellent information processing apparatus, information processing method, and image display system can be provided that can suitably provide screen information to be displayed on an image display device that is worn on a user's head or face and used for viewing images.
- FIG. 1 is a diagram showing a user wearing a head-mounted display 100 on the head as viewed from the front.
- FIG. 2 is a diagram illustrating a state in which a user wearing the head mounted display 100 illustrated in FIG. 1 is viewed from above.
- FIG. 3 is a diagram showing an example of the internal configuration of the head mounted display 100.
- FIG. 4 is a diagram showing how the mutual positional relationship and attitude of the head-mounted display 100 and the information terminal are grasped based on an image captured by the outer camera 312 of the head-mounted display 100, and the information terminal's image is superimposed on the display.
- FIG. 5 is a diagram showing a processing procedure for realizing the screen display shown in FIG. 4.
- FIG. 6 is a diagram showing how the positional relationship and attitude of the head-mounted display 100 and the information terminal are grasped based on an image captured by the information terminal's camera, and the information terminal's image is superimposed on the display.
- FIG. 7 is a diagram showing a processing procedure for realizing the screen display shown in FIG. 6.
- FIG. 8 is a diagram showing how the head-mounted display 100, with an AR marker 801 mounted on it, is photographed by the camera of an information terminal held with its principal axis facing forward.
- FIG. 9 is a diagram showing how the head-mounted display 100, with reference indices 901A to 901D of known shapes mounted on it, is photographed by the camera of an information terminal held with its principal axis facing forward.
- FIG. 10 is a diagram showing how the head-mounted display 100, with four or more reference indices 1001A to 1001D of mutually different shapes mounted on it, is photographed by the camera of an information terminal held with its principal axis facing forward.
- FIG. 11 is a diagram showing how the head-mounted display 100, with blinking reference indices 1101A to 1101D mounted on it, is photographed by the camera of an information terminal held with its principal axis facing forward.
- FIG. 12 is a diagram showing a processing procedure for embedding the screen information of the information terminal in the screen of the head mounted display 100 based on the object coordinates received from the reference index.
- FIG. 13 is a diagram showing a processing procedure for embedding the screen information of the information terminal in the screen of the head mounted display 100 based on the search key received from the reference index.
- FIG. 14 is a diagram showing a processing procedure for initializing the position of the information terminal.
- FIG. 15 is a diagram showing a processing procedure for embedding the screen information of the information terminal in the screen of the head-mounted display 100 using the position of the information terminal initialized by the processing procedure shown in FIG. 14.
- FIG. 16 is a diagram illustrating a configuration example of an information terminal.
- FIG. 1 shows a front view of a user wearing a head-mounted display 100 to which the technology disclosed in this specification is applied.
- The head-mounted display 100 directly covers the user's eyes when worn on the head or face, giving a sense of immersion to the user viewing an image. Further, since the display image cannot be seen from the outside (that is, by other people), privacy is easily protected when displaying information. Unlike a see-through type, a user wearing the head-mounted display 100 cannot directly view the real-world scenery. If an outer camera 312 that captures the scenery in the direction of the user's line of sight is installed, the captured image can be displayed so that the user indirectly views the real-world scenery (that is, the scenery is displayed by video see-through). However, as will be described later, the technology disclosed in this specification presupposes neither the outer camera 312 nor a video see-through display.
- a display panel (not shown in FIG. 1) for the user to observe is disposed at a position facing the left and right eyes inside the head mounted display 100 main body.
- The display panel comprises a microdisplay such as an organic EL element or a liquid crystal display, or a laser-scanning display such as a direct retinal projection display.
- Microphones 103L and 103R are installed near the left and right ends of the main body of the head-mounted display 100. By arranging the microphones symmetrically on the left and right and recognizing only the voice localized at the center (the user's own voice), that voice can be separated from ambient noise and other people's voices, preventing malfunctions during voice operation, for example.
- a touch panel 315 that allows a user to perform touch input using a fingertip or the like is disposed near the back of the display panel on the front surface of the head mounted display 100 main body.
- In the illustrated example, a pair of left and right touch panels 315 is provided, but a single touch panel or three or more touch panels 315 may be provided.
- FIG. 2 shows a state in which the user wearing the head mounted display 100 shown in FIG. 1 is viewed from above.
- The illustrated head-mounted display 100 has left-eye and right-eye display panels 104L and 104R on the side facing the user's face.
- The display panels 104L and 104R comprise a microdisplay such as an organic EL element or a liquid crystal display, or a laser-scanning display such as a direct retinal projection display.
- the display images on the display panels 104L and 104R are observed by the user as enlarged virtual images by passing through the virtual image optical units 101L and 101R.
- an eye width adjustment mechanism 105 is provided between the display panel for the right eye and the display panel for the left eye.
- FIG. 3 shows an internal configuration example of the head mounted display 100. Hereinafter, each part will be described.
- the control unit 301 includes a ROM (Read Only Memory) 301A and a RAM (Random Access Memory) 301B.
- the ROM 301A stores program codes executed by the control unit 301 and various data.
- The control unit 301 executes a program loaded into the RAM 301B to control image display and the overall operation of the head-mounted display 100.
- Examples of programs stored in the ROM 301A include a display control program that plays back moving images on the screen or displays image information captured from an external device (such as a smartphone) on the screen.
- identification information unique to the head-mounted display 100, including a model number and a serial number;
- authentication information for authenticating a user of the head-mounted display 100, such as user attribute information including a personal identification number, a password, and biometric information;
- design information of the head-mounted display 100, such as the shapes and arrangement of the reference indices mounted on the head-mounted display 100.
- The input interface (IF) unit 302 includes one or more operators (none shown), such as keys, buttons, and switches, with which the user performs input operations; it accepts user instructions via the operators and outputs them to the control unit 301. It also accepts user instructions consisting of remote control commands received by the remote control reception unit 303 and outputs them to the control unit 301.
- When the user performs a touch operation with a fingertip or the like on the touch panel 315 disposed on the outside of the main body of the head-mounted display 100, the input interface (IF) unit 302 inputs the coordinate data of the touched fingertip position and the like to the control unit 301.
- The touch panel 315 is disposed on the front surface of the main body of the head-mounted display 100, directly behind the display image of the display unit 309 (the enlarged virtual image observed through the virtual image optical unit 310) (see FIG. 1), so the user can perform touch operations as if touching the display image with a fingertip.
- The state information acquisition unit 304 is a functional module that acquires state information of the main body of the head-mounted display 100 or of the user wearing it.
- The state information acquisition unit 304 may itself be equipped with various sensors for detecting state information, or it may acquire the state information via a communication unit 305 (described later) from an external device equipped with some or all of those sensors (for example, a smartphone, wristwatch, or other multifunction terminal worn by the user).
- The state information acquisition unit 304 acquires, for example, information on the position and posture of the user's head, or information on the posture, in order to track the user's head movement.
- To this end, the state information acquisition unit 304 includes, for example, a sensor capable of detecting a total of nine axes: a three-axis gyro sensor, a three-axis acceleration sensor, and a three-axis geomagnetic sensor.
- the state information acquisition unit 304 may use any one or more sensors such as a GPS (Global Positioning System) sensor, a Doppler sensor, an infrared sensor, and a radio wave intensity sensor.
- Further, in order to acquire position and orientation information, the state information acquisition unit 304 may use in combination information provided by various infrastructures, such as mobile phone base station information and PlaceEngine (registered trademark) information (field-strength measurement information from wireless LAN access points).
- Here, the state acquisition unit 304 for tracking head movement is built into the head-mounted display 100, but it may instead be configured as an accessory part externally attached to the head-mounted display 100.
- In the latter case, the externally attached state acquisition unit 304 transmits the head posture information to the main body of the head-mounted display 100 via wireless communication such as Bluetooth (registered trademark) communication, or via a high-speed wired interface such as USB (Universal Serial Bus).
- the state information acquisition unit 304 may include, for example, the user's work state (head mounted display) as the state information of the user wearing the head mounted display 100. 100 or not), user's action state (moving state such as stationary, walking, running, etc.) gesture by hand or fingertip, opening / closing state of eyelids, gaze direction, pupil size), mental state (user displays display image) The degree of excitement, excitement, wakefulness, emotion, emotion, etc.), and physiological state, such as whether they are immersed or concentrated during observation.
- To acquire such state information from the user, the state information acquisition unit 304 may be provided with various sensors (not shown), such as an outer camera 312, a wearing sensor such as a mechanical switch, an inner camera that captures the user's face, a gyro sensor, an acceleration sensor, a speed sensor, a pressure sensor, a temperature sensor that detects body temperature or air temperature, a sweat sensor, a pulse sensor, a myoelectric sensor, an electro-oculogram sensor, an electroencephalogram sensor, an exhalation sensor, and a gas/ion concentration sensor.
- The state information acquisition unit 304 can also detect whether or not the head-mounted display 100 is worn by using a wearing sensor that, when the head-mounted display 100 is mounted on the user's head, detects the mounting in conjunction with the movement of a part that contacts the forehead (see, for example, Patent Document 6).
- the environment information acquisition unit 316 is a functional module that acquires information related to the environment surrounding the head mounted display 100 main body or the user wearing the head mounted display 100.
- Environmental information here includes sound, airflow, temperature, atmospheric pressure, atmosphere (smoke, dense fog), electromagnetic waves (ultraviolet rays, blue light, radio waves), heat rays (infrared rays), radiation, and the like to which the head-mounted display 100 or the user is exposed.
- The environment sensors may include the above-described microphone and the outer camera 312. Alternatively, the environment information acquisition unit 316 may obtain environmental information from an external device equipped with some or all of these sensors.
- the environment information may be acquired via a communication unit 305 (described later) from a smartphone, a wristwatch, or other multi-function terminal worn by the user.
- the outer camera 312 is disposed, for example, in the approximate center of the front surface of the head mounted display 100 (see FIG. 2), and can capture a surrounding image. However, as will be described later, the technology disclosed in this specification does not assume the installation or use of the outer camera 312.
- the communication unit 305 performs communication processing with an external device (not shown), modulation / demodulation of communication signals, and encoding / decoding processing.
- Examples of the external device include a content playback device (a Blu-ray disc or DVD player) that supplies viewing content when the user uses the head-mounted display 100, a streaming server, and an information terminal such as a smartphone or personal computer used by the user.
- The control unit 301 transmits data to the external device from the communication unit 305.
- the configuration of the communication unit 305 is arbitrary.
- the communication unit 305 can be configured according to a communication method used for transmission / reception operations with an external device that is a communication partner.
- the communication method may be either wired or wireless.
- Communication standards mentioned here include MHL (Mobile High-definition Link), USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), Wi-Fi (registered trademark), Bluetooth (registered trademark) communication, BLE (Bluetooth (registered trademark) Low Energy) communication, ultra-low-power wireless communication such as ANT, and mesh networks standardized by IEEE 802.11s.
- The communication unit 305 may also be a cellular radio transceiver that operates according to a cellular communication standard.
- the storage unit 306 is a large-capacity storage device configured by an SSD (Solid State Drive) or the like.
- The storage unit 306 stores application programs executed by the control unit 301 and various data. For example, content that the user views on the head-mounted display 100 is stored in the storage unit 306.
- the image processing unit 307 further performs signal processing such as image quality correction on the image signal output from the control unit 301 and converts the image signal to a resolution that matches the screen of the display unit 309.
- the display driving unit 308 sequentially selects the pixels of the display unit 309 for each row and performs line sequential scanning, and supplies a pixel signal based on the image signal subjected to signal processing.
- the display unit 309 includes a display panel configured with a micro display such as an organic EL element or a liquid crystal display, or a laser scanning display such as a direct retina display.
- the virtual image optical unit 310 enlarges and projects the display image of the display unit 309 and causes the user to observe it as an enlarged virtual image.
- Display images output from the display unit 309 include commercial content (the virtual world) supplied from a content playback device (a Blu-ray disc or DVD player) or a streaming server, and images captured by the outer camera 312 (real-world images such as the user's field of view).
- image information sent from an information terminal such as a smartphone or a personal computer may be displayed in a superimposed manner on a content reproduction screen.
- The audio processing unit 313 performs signal processing such as sound quality correction and amplification on the audio signal output from the control unit 301, as well as processing of the input audio signal. The audio input/output unit 314 then outputs the processed audio to the outside and inputs audio from the microphone (described above).
- A user wearing the head-mounted display 100, whose field of view is completely blocked while viewing content, may want to operate another information terminal such as a smartphone, for example to check the incoming history of mail or telephone calls, or to find out the current time.
- If the head-mounted display 100 takes in the screen information of an external information terminal and either switches between the original content playback screen and the external terminal's screen on the display unit 309, or superimposes the external terminal's screen on the original screen, the user can check the contents of the external information terminal's screen without removing the head-mounted display 100 from the head.
- Superimposed display here includes displaying a small separate screen in picture-in-picture format, blending the two screens at a predetermined ratio, and the like.
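Both superimposition styles can be expressed by a single compositing routine. The NumPy sketch below is illustrative only (the frame sizes, inset position, and blend ratio are arbitrary example values): alpha = 1.0 yields a plain picture-in-picture inset, and alpha < 1.0 blends the two screens at a fixed ratio.

```python
import numpy as np

def composite(content, terminal, top_left, alpha=1.0):
    """Embed a terminal-screen image into a content frame.

    alpha = 1.0 gives a plain picture-in-picture inset;
    alpha < 1.0 blends the two screens at a fixed ratio.
    """
    out = content.astype(np.float32)          # work in float to blend cleanly
    y, x = top_left
    h, w = terminal.shape[:2]
    out[y:y + h, x:x + w] = alpha * terminal + (1.0 - alpha) * out[y:y + h, x:x + w]
    return out.astype(np.uint8)

# Illustrative sizes: a 1280x720 content frame with a 320x200 terminal inset.
content = np.zeros((720, 1280, 3), dtype=np.uint8)
terminal = np.full((200, 320, 3), 255, dtype=np.uint8)
frame = composite(content, terminal, top_left=(260, 480), alpha=0.7)
```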
- However, if the position of the information terminal's screen displayed on the display unit 309 is not linked at all to the position of the actual information terminal's screen, the operation lacks reality, and it becomes extremely difficult for the user to operate the actual information terminal's screen while looking at the terminal screen displayed on the head-mounted display 100.
- Therefore, the outer camera 312 photographs the information terminal held by the user (reference number 401). If the captured image of the information terminal is displayed on the content playback screen of the head-mounted display 100 at the same position and orientation as the actual information terminal in the user's field of view (reference number 402), the user can easily operate the actual information terminal's screen while viewing the terminal screen displayed on the head-mounted display 100 as a video see-through image. Specifically, if the mutual position and orientation of the head-mounted display 100 and the information terminal are grasped from the image captured by the outer camera 312, an image of the information terminal can be superimposed on the content playback screen of the head-mounted display 100 at the same position and orientation as the actual information terminal in the user's field of view.
- FIG. 5 shows a processing procedure for realizing the screen display shown in FIG.
- the viewing content is displayed on the display unit 309 (F501).
- photographing is performed using the outer camera 312 (F511). Then, an object recognition process is performed on the captured image to detect an information terminal (F512). The position and orientation of the information terminal are calculated (F513).
- Note that the reference coordinate system of the display unit 309 that displays the information terminal on the head-mounted display 100 (hereinafter simply referred to as the reference coordinate system of the head-mounted display 100) is, strictly speaking, different from the reference coordinate system of the outer camera 312 that images the information terminal. Therefore, in F513, the position and orientation of the information terminal detected in the camera reference coordinate system must be converted into the position and orientation in the reference coordinate system of the head-mounted display 100.
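Assuming the camera-to-display extrinsic calibration is fixed and known (the numeric offsets below are purely illustrative, not values from this specification), the conversion in F513 amounts to one homogeneous-matrix multiplication:

```python
import numpy as np

def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Fixed, factory-calibrated extrinsic: camera pose in the HMD reference
# frame (values are illustrative only).
T_hmd_cam = pose_to_matrix(np.eye(3), [0.0, 0.05, 0.08])

# Terminal pose detected in the camera coordinate system (F513).
T_cam_term = pose_to_matrix(np.eye(3), [0.1, -0.2, 0.5])

# Terminal pose expressed in the HMD reference coordinate system.
T_hmd_term = T_hmd_cam @ T_cam_term
```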
- a visual reference index may be assigned to the information terminal.
- a known pattern displayed on the screen of the information terminal can be used as a reference index.
- the position and orientation of the information terminal may be calculated without a marker (described later).
- the position where the screen information of the information terminal is embedded in the viewing content screen is calculated (F514).
- When the head-mounted display 100 acquires screen information from the information terminal (F521), it embeds the information in the content playback screen at the embedding position calculated in F514 and generates an output screen (F530).
- As a result, the information terminal's screen appears on the content playback screen of the head-mounted display 100 at the same position and orientation as the actual information terminal in the user's field of view, so the user can check the terminal screen and perform input operations without losing the sense of reality while continuing to watch the content.
- However, this method requires the outer camera 312 in order to correct the display position and orientation of the information terminal using visual information; in other words, it cannot be applied to products on which the outer camera 312 is not mounted.
- The above assumes a head-mounted display 100 equipped with the outer camera 312. In practice, however, an outer camera is not always provided as standard equipment, and a considerable number of products without one are already in widespread use.
- A head-mounted display 100 without an outer camera has no means of grasping the mutual position and orientation of the display main body and the information terminal to be operated, and therefore cannot display the information terminal on its content playback screen at the same position and orientation as the actual information terminal in the user's field of view.
- FIG. 7 shows a processing procedure for realizing the screen display shown in FIG.
- the viewing content is displayed on the display unit 309 (F701).
- In parallel, photographing is performed using the camera provided as standard equipment on the information terminal (F711). It is assumed that the user wearing the head-mounted display 100 holds the information terminal in hand with the camera directed toward the front of the head-mounted display 100. Object recognition processing is then performed on the captured image to detect the head-mounted display 100 (F712). Next, the position and orientation of the head-mounted display 100 in the coordinate system based on the information terminal (the camera coordinate system) are calculated (F713).
- The position and orientation of the head-mounted display 100 in the reference coordinate system of the information terminal are then converted into the position and orientation of the information terminal in the reference coordinate system of the head-mounted display 100 (F714), and based on the calculated position and orientation information, the position at which the screen information of the information terminal is to be embedded in the viewing content screen is calculated (F715).
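The conversion in F714 is the inversion of a rigid transform: the pose of the head-mounted display in the terminal's frame, inverted, gives the pose of the terminal in the head-mounted display's frame. A minimal NumPy sketch, with illustrative numbers (an HMD 0.5 m in front of the terminal camera, the two devices facing each other):

```python
import numpy as np

def invert_pose(T):
    """Invert a rigid transform: if T is the pose of frame B seen from
    frame A, the inverse is the pose of frame A seen from frame B."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# HMD pose measured in the terminal's reference frame (illustrative values:
# 0.5 m ahead, rotated 180 degrees about Y so the devices face each other).
Ry = np.array([[-1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, -1.0]])
T_term_hmd = np.eye(4)
T_term_hmd[:3, :3] = Ry
T_term_hmd[:3, 3] = [0.0, 0.0, 0.5]

# Terminal pose in the HMD reference frame (F714).
T_hmd_term = invert_pose(T_term_hmd)
```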
- the processing (F712 to F715) after the processing of the captured image may be executed by either the information terminal or the head-mounted display 100, or may be executed by an external device such as a cloud computer.
- When the head-mounted display 100 obtains the screen information from the information terminal (F721), it embeds the information in the content playback screen at the embedding position calculated in F715 and generates an output screen (F730).
- In this way, the screen of the information terminal is displayed within the content playback screen (virtual-world image) of the head-mounted display 100 at the same position and orientation as the actual information terminal in the user's field of view.
- a user whose view is completely blocked can check the screen of the information terminal and perform an input operation without losing reality while continuing to watch the content.
- FIG. 16 shows a configuration example of an information terminal that can realize the processing shown in FIG.
- the illustrated information terminal is configured by connecting a display unit 1620, an audio processing unit 1630, a communication unit 1640, a storage unit 1650, a camera unit 1660, a sensor unit 1670, and the like to the control unit 1610.
- The control unit 1610 includes a CPU (Central Processing Unit) 1611, a ROM 1612, a RAM 1613, and the like.
- the ROM 1612 stores program codes executed by the CPU 1611 and information essential to the information terminal.
- The CPU 1611 loads program code from the ROM 1612 or the storage unit 1650 into the RAM 1613 and executes it.
- Programs executed by the CPU 1611 include an operating system such as Android or iOS, an application that transfers screen information to an external image display device (such as the head-mounted display 100), and various other application programs that run under the execution environment provided by the operating system.
- The display unit 1620 includes a display panel 1621 made of liquid crystal elements, organic EL elements, or the like, and a transparent touch panel 1623 attached to the upper surface of the display panel 1621.
- The display panel 1621 is connected to the control unit 1610 via the display interface 1622 and displays and outputs the image information generated by the control unit 1610.
- the touch panel 1623 is connected to the control unit 1610 via the touch interface 1624, and outputs to the control unit 1610 coordinate information that the user has operated on the display panel 1621 with a fingertip.
- user operations such as tap, long press, flick, and swipe are detected based on the input coordinate information, and processing corresponding to the user operation is started.
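As a rough illustration of how such gestures might be distinguished from touch-down/touch-up coordinate events, here is a hypothetical classifier; the timing and distance thresholds are invented for the example and do not come from this specification:

```python
def classify_touch(t_down, t_up, p_down, p_up,
                   long_press_s=0.5, move_px=30):
    """Classify a single touch from its down/up timestamps (seconds)
    and coordinates (pixels). All thresholds are illustrative."""
    dt = t_up - t_down
    dx = p_up[0] - p_down[0]
    dy = p_up[1] - p_down[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist < move_px:
        # little movement: distinguish tap from long press by duration
        return "long_press" if dt >= long_press_s else "tap"
    # significant movement: fast motion is a flick, slow motion a swipe
    return "flick" if dist / dt > 1000 else "swipe"
```

A production touch pipeline would of course also handle multi-touch, movement histories, and cancellation, which this sketch omits.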
- the audio processing unit 1630 includes an audio output unit 1631 such as a speaker, an audio input unit 1632 such as a microphone, and an audio codec (CODEC) 1633 that encodes and decodes an input / output audio signal.
- the audio processing unit 1630 may further include an output terminal 1634 for outputting an audio signal to headphones (not shown).
- the communication unit 1640 performs an information communication process between the application executed by the control unit 1610 and the external device.
- Examples of the external device mentioned here include a television receiver, an information terminal (not shown) handled by another user, and a server on the Internet.
- the communication unit 1640 is equipped with a physical layer module such as Wi-Fi, NFC (Near Field Communication), or Bluetooth (registered trademark) according to a communication medium to be used, and a communication signal transmitted / received via the physical layer module. Modulation / demodulation processing and encoding / decoding processing are performed.
- screen information displayed on the display unit 1620 may be transferred to an external image display device (such as the head-mounted display 100) via the communication unit 1640.
- the storage unit 1650 includes, for example, a large-capacity storage device such as an SSD (Solid State Drive) or an HDD (Hard Disc Drive).
- application programs and contents downloaded via the communication unit 1640, image data such as still images and moving images captured by the camera unit 1660, and the like are stored in the storage unit 1650.
- The camera unit 1660 includes an image sensor 1661, such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, that photoelectrically converts light captured through a lens (not shown), and an AFE (Analog Front End) processing unit 1662 that performs noise removal and digitization on the detection signal of the image sensor 1661 to generate image data. The generated image data is output from the camera interface 1663 to the control unit 1610.
- The sensor unit 1670 includes a GPS (Global Positioning System) sensor for acquiring position information of the information terminal, and a gyro sensor, an acceleration sensor, and the like for detecting the orientation of the terminal body and forces acting on it.
- the mutual positional relationship and posture between the head mounted display 100 and the information terminal can be more accurately grasped based on the image taken from the camera mounted on the information terminal.
- one or more reference indicators such as an AR (Augmented Reality) marker may be mounted on the head-mounted display 100 side.
- For example, depending on the type of reference index, detailed information about the head-mounted display 100 (product information such as a model number or serial number) can be acquired. Using this detailed information further improves the accuracy with which the mutual position and orientation of the head-mounted display 100 and the information terminal are recognized.
- the following (1) and (2) can be exemplified as a method of using the reference index.
- (1) The direction and unit length of the principal axis are specified from reference indices arranged according to a fixed rule, and coordinate-axis alignment and position measurement are performed precisely.
- (2) Markerless, that is, no reference index is mounted on the head-mounted display 100. In this case, absolute position and orientation information cannot be acquired; only the displacement of the relative position and orientation from the initial position at the start of processing can be measured. Therefore, when photographing the head-mounted display 100 with the camera mounted on the information terminal, the user may hold the information terminal at a predetermined initial position and send a reset signal notifying the information terminal of the initial position.
- The predetermined initial position referred to here is, for example, directly in front of the user wearing the head-mounted display 100.
- Examples of methods for sending the reset signal include tapping the touch panel screen of the information terminal and operating a switch on the head-mounted display 100 side. The relative displacement measured thereafter is reflected in the drawing on the screen of the head-mounted display 100.
- In the markerless case, position and orientation calculation may be performed using Visual SLAM (Simultaneous Localization and Mapping).
- Visual SLAM is a technology that can simultaneously perform camera self-position estimation and map creation in an unknown environment.
- An example of Visual SLAM is the integrated augmented reality technology SmartAR (trademark of Sony Corporation). With the markerless approach, there is no need to attach a reference index to the head-mounted display 100, so the method can be applied to any product.
- As described above, the reference coordinate system of the head-mounted display 100 is, strictly speaking, different from the reference coordinate system of the outer camera 312, so in F513 the position and orientation of the information terminal detected in the camera reference coordinate system must be converted into those in the reference coordinate system of the head-mounted display 100. However, the position of the outer camera 312 relative to the head-mounted display 100 is fixed and known, so this conversion is straightforward.
- the camera position with respect to the information terminal is fixed, that is, known. Therefore, it is easy to convert the position and orientation of the head mounted display 100 taken by the camera into the position and orientation of the head mounted display 100 in the reference coordinate system of the information terminal.
- To perform the position and orientation conversion processing in F714 more strictly, a means of transmitting the reference index arrangement information, that is, information on the positional and orientation relationship between the head-mounted display 100 and its reference indices, to the information terminal is required.
- the following (1) to (3) are illustrated as reference index arrangement information transmission means.
- (1) Make the design information of the head-mounted display 100 known to the information terminal. (2) Transmit the necessary information from the reference indices themselves to the information terminal. (3) Convey the information via an initialization process performed by the user.
- In the first transmission means, the design information of the head-mounted display 100 is made known information of the information terminal in advance. For example, a dedicated application for the information terminal, such as an application that calculates the position and orientation of the head-mounted display 100 from images captured by the camera, is preloaded with information corresponding to the model number of the head-mounted display 100.
- In this case, all the shapes and arrangements of the reference indices mounted on the head-mounted display 100 are designed in advance; for example, the principal axis of the head-mounted display 100 is defined and the reference indices are arranged so as to align with it. This transmission means allows a degree of freedom in the shape of the reference indices.
- FIG. 8 shows a state in which an AR marker 801 is mounted on the head-mounted display 100 and the head-mounted display 100 is photographed by the information terminal's camera held in front along the principal axis.
- the arrangement information for pasting the AR marker is known.
- the relative position and orientation of the information terminal can be calculated by observing the position and orientation (shape) of the AR marker on the captured image.
- FIG. 9 shows a state in which reference indices 901A to 901D of known shapes are mounted on the head-mounted display 100 and the head-mounted display 100 is photographed by the information terminal's camera held in front along the principal axis.
- the arrangement information of each reference index 901A-D is assumed to be known.
- the relative position and orientation of the information terminal can be calculated by observing the position of each reference index 901A-D on the captured image.
- The method of photographing three or more points whose three-dimensional coordinates are known and determining the position and orientation of the camera from the position of each point on the image, that is, of calculating the mapping from a known three-dimensional coordinate system to the two-dimensional coordinates defined on the camera image, is an established technique known as 3D-2D registration in the field of image processing. To obtain the six variables corresponding to the position and orientation, known as the camera's external parameters, at least three observation points are required; in practice, estimation is usually performed by the least squares method using more observation points (the camera's internal parameters are assumed to be known).
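With known internal parameters, three points suffice in principle (the P3P problem). The sketch below instead uses the Direct Linear Transform (DLT), a classical 3D-2D registration method that estimates the full 3x4 projection matrix from six or more correspondences by least squares via SVD; the synthetic data and camera parameters are illustrative only, not values from this specification.

```python
import numpy as np

def dlt_projection(X, x):
    """Estimate the 3x4 camera projection matrix P from n >= 6 point
    correspondences (X: n x 3 world points, x: n x 2 image points)
    by the Direct Linear Transform."""
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        Xh = [Xw, Yw, Zw, 1.0]
        A.append([0.0] * 4 + [-c for c in Xh] + [v * c for c in Xh])
        A.append(Xh + [0.0] * 4 + [-u * c for c in Xh])
    # The least-squares solution is the right singular vector belonging
    # to the smallest singular value of the stacked constraint matrix.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)

# Synthetic check: project known 3D points with a known P, then recover it.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (8, 3)) + [0.0, 0.0, 4.0]    # points in front of the camera
P_true = np.hstack([np.eye(3), [[0.1], [0.2], [0.0]]])  # internal parameters assumed known (K = I)
xh = (P_true @ np.vstack([X.T, np.ones(8)])).T
x = xh[:, :2] / xh[:, 2:]

P_est = dlt_projection(X, x)
P_est /= P_est[2, 2]   # remove the arbitrary scale (and sign) of the DLT solution
```

With noise-free correspondences the recovered matrix matches the true projection up to numerical precision; with real detections, more points and the least-squares averaging above reduce the effect of measurement noise.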
- With this transmission means, the camera on the information terminal side can recognize the relative position and orientation of the information terminal, and there is no need to use the outer camera 312 on the head-mounted display 100 side.
- Each of the reference indices 1001A to 1001D can be identified based on its color, texture, or shape as it appears in the captured image. For example, connecting reference indices 1001B and 1001D gives the reference horizontal line, and the combination of reference indices 1001A and 1001B, or of 1001C and 1001D, represents a reference vertical line.
- FIG. 11 shows a state in which blinking reference indices 1101A to 1101D are mounted on the head-mounted display 100 and the head-mounted display 100 is photographed by the information terminal's camera held in front along the principal axis.
- Each of the reference indices 1101A to 1101D can encode into its blinking pattern the information to be transmitted, such as the position of the index in the reference coordinate system of the head-mounted display 100, the model number of the head-mounted display 100 on which it is installed, or a search key for a database storing other necessary data; the information terminal obtains the information by decoding the blinking pattern. With any of these transmission means, the camera on the information terminal side can recognize the relative position and orientation of the information terminal, and there is no need to use the outer camera 312 on the head-mounted display 100 side.
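One possible encoding is a simple framed bit pattern. The scheme below (preamble, bit width, payload layout) is a hypothetical sketch for illustration, not the encoding defined in this specification:

```python
def encode_blink(payload, bits=8):
    """Encode an integer payload (e.g. a database search key) as an
    on/off blinking pattern, framed by a fixed start sequence."""
    start = [1, 1, 0]   # preamble marking the start of a frame
    data = [(payload >> i) & 1 for i in range(bits - 1, -1, -1)]
    return start + data

def decode_blink(pattern, bits=8):
    """Recover the payload from an observed blink pattern."""
    assert pattern[:3] == [1, 1, 0], "preamble not found"
    value = 0
    for bit in pattern[3:3 + bits]:
        value = (value << 1) | bit
    return value
```

A real optical channel would additionally need clock recovery and error detection, which this sketch omits.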
- FIG. 12 shows a processing procedure for embedding the screen information of the information terminal in the screen of the head mount display 100 based on the object coordinates received from the reference index.
- From the detected position and orientation of the reference indices, the position and orientation of the head-mounted display 100 in the coordinate system based on the information terminal (the camera coordinate system) are calculated (F1213).
- This calculation is 3D-2D registration (described above) or an equivalent process. That is, by associating the two-dimensional positions of the reference indices on the camera image detected in F1212 with the three-dimensional positions of those indices in the reference coordinate system of the head-mounted display 100 received in F1241, the position and orientation of the HMD coordinate system viewed from the camera coordinate system (or, equivalently, of the camera coordinate system viewed from the HMD coordinate system) can be calculated. Here, the position and orientation of the HMD viewed from the camera coordinate system are calculated.
- The position and orientation of the head-mounted display 100 in the reference coordinate system of the information terminal are then converted into the position and orientation of the information terminal in the reference coordinate system of the head-mounted display 100 (F1214), and based on the calculated position and orientation, the position at which the screen information of the information terminal is to be embedded is calculated (F1215). The processing after the captured image is obtained (F1212 to F1215) may be executed by either the information terminal or the head-mounted display 100, or by an external device such as a cloud computer.
- When the head-mounted display 100 acquires the screen information from the information terminal (F1221), it embeds the information in the content playback screen at the embedding position calculated in F1215 and generates an output screen (F1230).
- the screen of the information terminal is displayed in the content playback screen (virtual world image) of the head-mounted display 100 at the same position and orientation as the real information terminal in the user's field of view.
- a user whose view is completely obstructed can check the screen of the information terminal and perform an input operation without losing reality while continuing to watch the content.
- FIG. 13 shows a processing procedure for embedding the screen information of the information terminal in the screen of the head mounted display 100 based on the search key received from the reference index.
- The database is queried with the search key (F1342), and the coordinate information of the reference indices, information on the head-mounted display 100, and other data are acquired. Then, from the detected position and orientation of the reference indices, the position and orientation of the head-mounted display 100 in the coordinate system based on the information terminal (the camera coordinate system) are calculated (F1313).
- The position and orientation of the head-mounted display 100 in the reference coordinate system of the information terminal are converted into the position and orientation of the information terminal in the reference coordinate system of the head-mounted display 100 (F1314), and based on the calculated position and orientation and the position of the viewing content, the position at which the screen information of the information terminal is to be embedded is calculated (F1315). The processing after the captured image is obtained (F1312 to F1315) may be executed by either the information terminal or the head-mounted display 100, or by an external device such as a cloud computer.
- When the head-mounted display 100 acquires screen information from the information terminal (F1321), it embeds the information in the content playback screen at the embedding position calculated in F1315 and generates an output screen (F1330).
- the screen of the information terminal is displayed in the content playback screen (virtual world image) of the head-mounted display 100 at the same position and orientation as the real information terminal in the user's field of view.
- a user whose view is completely obstructed can check the screen of the information terminal and perform an input operation without losing reality while continuing to watch the content.
- The user wearing the head-mounted display 100 holds the information terminal in a predetermined initial posture, such as directly in front of the eyes, and resets (initializes) it. When the information terminal side recognizes the head-mounted display 100 captured by its camera, by applying Visual SLAM technology or the like, the position and orientation of the information terminal are initialized.
- FIG. 14 shows a processing procedure for initializing the position of the information terminal.
- the user who wears the head mounted display 100 holds the information terminal in an initial posture such as the front of his eyes.
- the head mount display 100 is photographed by the camera of the information terminal (F1401), and the head mount display 100 is detected from the photographed image (F1402).
- the position and orientation of the head mounted display 100 in the coordinate system (or camera coordinate system) with the information terminal as a reference are calculated (F1403).
- The calculated position and orientation of the head-mounted display 100 are stored as its position and orientation in the reference state (F1410).
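The initialization procedure above (F1401 to F1410) amounts to caching the pose computed while the terminal is held in the agreed initial posture; later measurements can then be expressed as changes from that reference state. A hedged sketch, with class and method names invented for illustration:

```python
import numpy as np

class PoseInitializer:
    """Caches the HMD pose measured while the terminal is held in the
    initial posture (F1410), then reports pose changes relative to that
    reference state. Class and method names are invented for illustration."""

    def __init__(self):
        self.R_ref = None
        self.t_ref = None

    def reset(self, R, t):
        # F1410: store the pose computed in F1403 as the reference state.
        self.R_ref = np.asarray(R, dtype=float)
        self.t_ref = np.asarray(t, dtype=float)

    def change_from_reference(self, R, t):
        # Rotation and translation of the current pose relative to the reference.
        dR = self.R_ref.T @ np.asarray(R, dtype=float)
        dt = self.R_ref.T @ (np.asarray(t, dtype=float) - self.t_ref)
        return dR, dt

init = PoseInitializer()
init.reset(np.eye(3), [0.0, 0.0, 0.5])      # terminal held in front of the eyes
dR, dt = init.change_from_reference(np.eye(3), [0.0, 0.1, 0.5])
assert np.allclose(dR, np.eye(3))            # orientation unchanged
assert np.allclose(dt, [0.0, 0.1, 0.0])      # moved 0.1 m along y
```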
- FIG. 15 shows a processing procedure for embedding the screen information of the information terminal in the screen of the head-mounted display 100 using the position of the information terminal initialized by the processing procedure shown in FIG. 14.
- the viewing content is displayed on the display unit 309 (F1501).
- On the information terminal side, an image is captured with the camera (F1511). It is assumed here that the user wearing the head-mounted display 100 holds the information terminal in hand with the camera directed toward the front of the head-mounted display 100. Object recognition processing is then performed on the captured image to detect the head-mounted display 100 (F1512), and the position and orientation of the head-mounted display 100 in the information terminal's reference coordinate system (or the camera coordinate system) are calculated (F1513).
- Note that the processing after the captured image is obtained (F1512 to F1516) may be executed by either the information terminal or the head-mounted display 100, or by an external device such as a cloud computer.
- When the head-mounted display 100 acquires screen information from the information terminal (F1521), it embeds that information in the content reproduction screen at the embedding position calculated in F1516 and generates an output screen (F1530).
- As a result, the screen of the information terminal is displayed within the content playback screen (the virtual-world image) of the head-mounted display 100 at the same position and orientation as the real information terminal in the user's field of view.
- Thus, a user whose field of view is completely obstructed can check the screen of the information terminal and perform input operations, without losing the sense of reality, while continuing to watch the content.
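The screen-generation step (F1330/F1530) can be sketched as a simple image composite: the terminal's screen image is pasted into the rendered content frame at the computed embedding position. The sketch below uses plain NumPy arrays as frames and ignores the perspective warp a full implementation would apply for the terminal's orientation:

```python
import numpy as np

def embed_screen(content, terminal_screen, top_left):
    """Paste the terminal's screen image into the content frame at the
    computed embedding position (a pure overwrite; a real renderer would
    also warp the image by the terminal's orientation)."""
    out = content.copy()
    y, x = top_left
    h, w = terminal_screen.shape[:2]
    out[y:y + h, x:x + w] = terminal_screen
    return out

content = np.zeros((720, 1280, 3), dtype=np.uint8)        # content playback frame
terminal = np.full((200, 120, 3), 255, dtype=np.uint8)    # terminal screen information
frame = embed_screen(content, terminal, top_left=(100, 400))
assert frame[150, 450].tolist() == [255, 255, 255]        # inside the embedded screen
assert frame[0, 0].tolist() == [0, 0, 0]                  # content elsewhere untouched
```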
- The technology disclosed in this specification can be suitably applied to an immersive head-mounted display that is not equipped with a camera.
- It can likewise be applied to a video see-through head-mounted display equipped with a camera and to an optical see-through type; that is, it can be applied similarly to various types of image display devices other than head-mounted displays.
- (1) An information processing apparatus comprising: an imaging unit; a position and orientation calculation unit that calculates, based on an image of an external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus; a position and orientation conversion unit that converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device; and an embedding position calculation unit that calculates, based on the position and orientation of the information processing apparatus in the second reference coordinate system, a position at which screen information related to the information processing apparatus is embedded within the screen of the image display device.
- (2) An information processing apparatus comprising: an imaging unit; a display unit; a position and orientation calculation unit that calculates, based on an image of an external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus; a position and orientation conversion unit that converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device; and an embedding position calculation unit that calculates, based on the position and orientation of the information processing apparatus in the second reference coordinate system, a position at which the screen information of the display unit is embedded with respect to the screen of the image display device.
- (3) The information processing apparatus according to (2) above, further comprising a communication unit that communicates with the image display device, wherein the screen information of the display unit and the information on the embedding position are transmitted to the image display device via the communication unit.
- (4) The information processing apparatus according to either (2) or (3) above, wherein the position and orientation calculation unit calculates the position and orientation of the image display device in the first reference coordinate system based on a reference index mounted on the image display device side.
- (5) The information processing apparatus according to (4) above, further comprising a reference index arrangement information acquisition unit that acquires information on the relationship between the orientation of the image display device and the orientation of the reference index.
- (6) The information processing apparatus according to (5) above, wherein the reference index arrangement information acquisition unit preloads design information on the relationship between the orientation of the image display device and the orientation of the reference index, and the position and orientation calculation unit calculates the position and orientation of the image display device in the first reference coordinate system, based on the design information, from an image captured by the imaging unit when the information processing apparatus is arranged in a predetermined direction with respect to the image display device.
- (7) The information processing apparatus according to (5) above, wherein the reference index arrangement information acquisition unit receives information from the reference index.
- (8) The information processing apparatus according to (7) above, wherein three or more reference indices whose arrangement information is known are arranged on the image display device side, and the reference index arrangement information acquisition unit identifies each reference index based on at least one of color, texture, and shape, and acquires information on the axes of the reference coordinate system of the image display device.
- (9) The information processing apparatus according to (7) above, wherein the reference index arrangement information acquisition unit receives information encoded in a blinking pattern from the reference index.
- (10) The information processing apparatus according to (9) above, wherein the reference index arrangement information acquisition unit queries a database with the information obtained by decoding the blinking pattern, and acquires coordinate information of the reference index, information on the reference index, or information on the image display device.
- (11) The information processing apparatus according to either (2) or (3) above, wherein the position and orientation calculation unit calculates the position and orientation of the image display device in the first reference coordinate system using Visual SLAM.
- (12) The information processing apparatus according to (11) above, wherein the position and orientation of the image display device calculated by the position and orientation calculation unit in a reference state, in which the information processing apparatus is held in an initial posture, is stored, and the position and orientation conversion unit calculates the change in the orientation of the image display device from the reference state and converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in the second reference coordinate system.
- (13) An information processing method comprising: a position and orientation calculation step of calculating, based on an image of another image display device captured by an imaging unit mounted on an information terminal, the position and orientation of the image display device in a first reference coordinate system of the information terminal; a position and orientation conversion step of converting the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information terminal in a second reference coordinate system of the image display device; and an embedding position calculation step of calculating, based on the position and orientation of the information terminal in the second reference coordinate system, a position at which the screen information of the information terminal is embedded with respect to the screen of the image display device.
- (14) An image display system comprising: a display device fixed to the head or face of an observer; and an information processing apparatus that includes an imaging unit and a display unit, calculates, based on an image of the external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus, converts that position and orientation into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device, and calculates a position at which the screen information of the display unit is embedded with respect to the screen of the image display device.
- DESCRIPTION OF SYMBOLS: 100 … Head-mounted display; 101L, 101R … Virtual image optical units; 103L, 103R … Microphones; 104L, 104R … Display panels; 105 … Eye width adjustment mechanism; 301 … Control unit; 301A … ROM; 301B … RAM; 302 … Input operation unit; 303 … Remote control receiving unit; 304 … Status information acquisition unit; 305 … Communication unit; 306 … Storage unit; 307 … Image processing unit; 308 … Display drive unit; 309 … Display unit; 310 … Virtual image optical unit; 312 … Outer camera; 313 … Audio processing unit; 314 … Audio input/output unit; 315 … Touch panel; 316 … Environmental information acquisition unit
- 1610 … Control unit; 1611 … CPU; 1612 … ROM; 1613 … RAM; 1620 … Display unit; 1621 … Display panel; 1622 … Display interface; 1623 … Touch panel; 1624 … Touch interface; 1630 … Audio processing unit; 1631 … Audio output unit; 1632 … Audio input unit; 1634 … Output terminal; 1640 … Communication unit; 1650 … Storage unit; 1660 … Camera unit; 1661 … Image sensor; 1662 … AFE processing unit; 1663 … Camera interface; 1670 … Sensor unit
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
An information processing apparatus comprising: an imaging unit; a position and orientation calculation unit that calculates, based on an image of an external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus; a position and orientation conversion unit that converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device; and an embedding position calculation unit that calculates, based on the position and orientation of the information processing apparatus in the second reference coordinate system, a position at which screen information related to the information processing apparatus is embedded within the screen of the image display device.
An information processing apparatus comprising: an imaging unit; a display unit; a position and orientation calculation unit that calculates, based on an image of an external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus; a position and orientation conversion unit that converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device; and an embedding position calculation unit that calculates, based on the position and orientation of the information processing apparatus in the second reference coordinate system, a position at which the screen information of the display unit is embedded with respect to the screen of the image display device.
An information processing method comprising: a position and orientation calculation step of calculating, based on an image of another image display device captured by an imaging unit mounted on an information terminal, the position and orientation of the image display device in a first reference coordinate system of the information terminal; a position and orientation conversion step of converting the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information terminal in a second reference coordinate system of the image display device; and an embedding position calculation step of calculating, based on the position and orientation of the information terminal in the second reference coordinate system, a position at which the screen information of the information terminal is embedded with respect to the screen of the image display device.
An image display system comprising: a display device fixed to the head or face of an observer; and an information processing apparatus that includes an imaging unit and a display unit, calculates, based on an image of the external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus, converts that position and orientation into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device, and calculates a position at which the screen information of the display unit is embedded with respect to the screen of the image display device.
(2) Transmit the necessary information from the reference indices to the information terminal.
(3) Go through an initialization process performed by the user.
For example, a dedicated application for the information terminal (such as an application that calculates the position and orientation of the head-mounted display 100 from the camera's captured image), or information corresponding to the model number of the head-mounted display 100, is preloaded onto the information terminal. In this case, it is assumed that the shape, arrangement, and other properties of the reference indices mounted on the head-mounted display 100 are all designed in advance. In addition, a principal axis or the like of the head-mounted display 100 is defined, and it is agreed, or standardized, that the reference indices are arranged so as to align with it. This transmission method allows freedom in the shape of the reference indices.
FIG. 10 shows three or more reference indices 1001A to 1001D of known shapes and differing colors mounted on the head-mounted display 100, which is photographed by the camera of an information terminal held forward of the principal axis. The indices 1001A to 1001D differ in color, texture, and shape, and their combination conveys to the information terminal the axes of the reference coordinate system of the head-mounted display 100. Their arrangement information is assumed to be known. On the information terminal side, each of the indices 1001A to 1001D can be identified from its color, texture, or shape as captured in the image. For example, the line connecting indices 1001B and 1001D serves as the reference horizontal line, and the combination of 1001A and 1001B, or of 1001C and 1001D, represents the reference vertical line.
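The axis recovery described above can be sketched as follows: once the four markers are identified by color, the line from 1001B to 1001D gives a horizontal axis, the direction from 1001B to 1001A a vertical one, and their cross product completes an orthonormal frame. The marker coordinates below are hypothetical, not taken from the patent:

```python
import numpy as np

def axes_from_markers(p_a, p_b, p_c, p_d):
    """Build an orthonormal frame for the display from four identified
    markers: B-D defines the horizontal axis and A-B the vertical
    direction (illustrative layout, not the patent's)."""
    x = p_d - p_b
    x = x / np.linalg.norm(x)            # reference horizontal line
    v = p_a - p_b                        # rough vertical direction
    y = v - (v @ x) * x                  # remove the horizontal component
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)                   # forward axis completes the frame
    return np.stack([x, y, z], axis=1)

# Hypothetical marker layout on the front of the display (meters).
A = np.array([-0.07,  0.03, 0.0])
B = np.array([-0.07, -0.03, 0.0])
C = np.array([ 0.07,  0.03, 0.0])
D = np.array([ 0.07, -0.03, 0.0])
R = axes_from_markers(A, B, C, D)
assert np.allclose(R.T @ R, np.eye(3))   # the frame is orthonormal
```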
Information to be transmitted, such as the position information of each index, the model number of the head-mounted display 100 on which the indices are installed, or a search key into a database storing other necessary data, is encoded into a blinking pattern, and the information terminal decodes the blinking pattern to obtain the information. With any of these transmission methods, the relative position and orientation of the information terminal can be recognized with the camera on the information terminal side, and there is no need to use the outer camera 312 of the head-mounted display 100.
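The blinking-pattern transmission can be sketched as simple on-off keying: the terminal samples the marker's brightness once per bit period, thresholds each sample to a bit, and reassembles the bits into the transmitted value (a model number or a database search key). The sample values and bit layout below are illustrative only:

```python
def decode_blink_pattern(samples, threshold=0.5):
    """On-off-keying decode: one brightness sample per bit period is
    thresholded to a bit, and the bits form the transmitted value
    (simplified illustration; no framing or error correction)."""
    bits = [1 if s > threshold else 0 for s in samples]
    value = 0
    for b in bits:
        value = (value << 1) | b
    return bits, value

# Brightness samples observed for one marker over eight bit periods.
samples = [0.9, 0.1, 0.8, 0.85, 0.05, 0.1, 0.95, 0.9]
bits, key = decode_blink_pattern(samples)
assert bits == [1, 0, 1, 1, 0, 0, 1, 1]
assert key == 0b10110011  # == 179
```

The decoded value would then be used, as the paragraph above describes, to query a database for the index coordinates or the display's model information.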
The user wearing the head-mounted display 100 holds the information terminal in a predetermined initial posture, such as directly in front of the eyes, and resets (initializes) it. On the information terminal side, Visual SLAM or a similar technique is applied; when the head-mounted display 100 captured by the camera is recognized, the position of the information terminal is initialized with respect to that orientation.
(1) An information processing apparatus comprising: an imaging unit; a position and orientation calculation unit that calculates, based on an image of an external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus; a position and orientation conversion unit that converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device; and an embedding position calculation unit that calculates, based on the position and orientation of the information processing apparatus in the second reference coordinate system, a position at which screen information related to the information processing apparatus is embedded within the screen of the image display device.
(2) An information processing apparatus comprising: an imaging unit; a display unit; a position and orientation calculation unit that calculates, based on an image of an external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus; a position and orientation conversion unit that converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device; and an embedding position calculation unit that calculates, based on the position and orientation of the information processing apparatus in the second reference coordinate system, a position at which the screen information of the display unit is embedded with respect to the screen of the image display device.
(3) The information processing apparatus according to (2) above, further comprising a communication unit that communicates with the image display device, wherein the screen information of the display unit and the information on the embedding position are transmitted to the image display device via the communication unit.
(4) The information processing apparatus according to either (2) or (3) above, wherein the position and orientation calculation unit calculates the position and orientation of the image display device in the first reference coordinate system based on a reference index mounted on the image display device side.
(5) The information processing apparatus according to (4) above, further comprising a reference index arrangement information acquisition unit that acquires information on the relationship between the orientation of the image display device and the orientation of the reference index.
(6) The information processing apparatus according to (5) above, wherein the reference index arrangement information acquisition unit preloads design information on the relationship between the orientation of the image display device and the orientation of the reference index, and the position and orientation calculation unit calculates the position and orientation of the image display device in the first reference coordinate system, based on the design information, from an image captured by the imaging unit when the information processing apparatus is arranged in a predetermined direction with respect to the image display device.
(7) The information processing apparatus according to (5) above, wherein the reference index arrangement information acquisition unit receives information from the reference index.
(8) The information processing apparatus according to (7) above, wherein three or more reference indices whose arrangement information is known are arranged on the image display device side, and the reference index arrangement information acquisition unit identifies each reference index based on at least one of color, texture, and shape, and acquires information on the axes of the reference coordinate system of the image display device.
(9) The information processing apparatus according to (7) above, wherein the reference index arrangement information acquisition unit receives information encoded in a blinking pattern from the reference index.
(10) The information processing apparatus according to (9) above, wherein the reference index arrangement information acquisition unit queries a database with the information obtained by decoding the blinking pattern, and acquires coordinate information of the reference index, information on the reference index, or information on the image display device.
(11) The information processing apparatus according to either (2) or (3) above, wherein the position and orientation calculation unit calculates the position and orientation of the image display device in the first reference coordinate system using Visual SLAM.
(12) The information processing apparatus according to (11) above, wherein the position and orientation of the image display device calculated by the position and orientation calculation unit in a reference state, in which the information processing apparatus is held in an initial posture, is stored, and the position and orientation conversion unit calculates the change in the orientation of the image display device from the reference state and converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in the second reference coordinate system.
(13) An information processing method comprising: a position and orientation calculation step of calculating, based on an image of another image display device captured by an imaging unit mounted on an information terminal, the position and orientation of the image display device in a first reference coordinate system of the information terminal; a position and orientation conversion step of converting the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information terminal in a second reference coordinate system of the image display device; and an embedding position calculation step of calculating, based on the position and orientation of the information terminal in the second reference coordinate system, a position at which the screen information of the information terminal is embedded with respect to the screen of the image display device.
(14) An image display system comprising: a display device fixed to the head or face of an observer; and an information processing apparatus that includes an imaging unit and a display unit, calculates, based on an image of the external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus, converts that position and orientation into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device, and calculates a position at which the screen information of the display unit is embedded with respect to the screen of the image display device.
101L, 101R … Virtual image optical units
103L, 103R … Microphones, 104L, 104R … Display panels
105 … Eye width adjustment mechanism
301 … Control unit, 301A … ROM, 301B … RAM
302 … Input operation unit, 303 … Remote control receiving unit
304 … Status information acquisition unit, 305 … Communication unit, 306 … Storage unit
307 … Image processing unit, 308 … Display drive unit
309 … Display unit, 310 … Virtual image optical unit, 312 … Outer camera
313 … Audio processing unit, 314 … Audio input/output unit
315 … Touch panel, 316 … Environmental information acquisition unit
1610 … Control unit, 1611 … CPU, 1612 … ROM, 1613 … RAM
1620 … Display unit, 1621 … Display panel, 1622 … Display interface
1623 … Touch panel, 1624 … Touch interface
1630 … Audio processing unit, 1631 … Audio output unit, 1632 … Audio input unit
1634 … Output terminal
1640 … Communication unit, 1650 … Storage unit, 1660 … Camera unit
1661 … Image sensor, 1662 … AFE processing unit
1663 … Camera interface, 1670 … Sensor unit
Claims (14)
- An information processing apparatus comprising: an imaging unit; a position and orientation calculation unit that calculates, based on an image of an external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus; a position and orientation conversion unit that converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device; and an embedding position calculation unit that calculates, based on the position and orientation of the information processing apparatus in the second reference coordinate system, a position at which screen information related to the information processing apparatus is embedded within the screen of the image display device.
- An information processing apparatus comprising: an imaging unit; a display unit; a position and orientation calculation unit that calculates, based on an image of an external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus; a position and orientation conversion unit that converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device; and an embedding position calculation unit that calculates, based on the position and orientation of the information processing apparatus in the second reference coordinate system, a position at which the screen information of the display unit is embedded with respect to the screen of the image display device.
- The information processing apparatus according to claim 2, further comprising a communication unit that communicates with the image display device, wherein the screen information of the display unit and the information on the embedding position are transmitted to the image display device via the communication unit.
- The information processing apparatus according to claim 2 or 3, wherein the position and orientation calculation unit calculates the position and orientation of the image display device in the first reference coordinate system based on a reference index mounted on the image display device side.
- The information processing apparatus according to claim 4, further comprising a reference index arrangement information acquisition unit that acquires information on the relationship between the orientation of the image display device and the orientation of the reference index.
- The information processing apparatus according to claim 5, wherein the reference index arrangement information acquisition unit preloads design information on the relationship between the orientation of the image display device and the orientation of the reference index, and the position and orientation calculation unit calculates the position and orientation of the image display device in the first reference coordinate system, based on the design information, from an image captured by the imaging unit when the information processing apparatus is arranged in a predetermined direction with respect to the image display device.
- The information processing apparatus according to claim 5, wherein the reference index arrangement information acquisition unit receives information from the reference index.
- The information processing apparatus according to claim 7, wherein three or more reference indices whose arrangement information is known are arranged on the image display device side, and the reference index arrangement information acquisition unit identifies each reference index based on at least one of color, texture, and shape, and acquires information on the axes of the reference coordinate system of the image display device.
- The information processing apparatus according to claim 7, wherein the reference index arrangement information acquisition unit receives information encoded in a blinking pattern from the reference index.
- The information processing apparatus according to claim 9, wherein the reference index arrangement information acquisition unit queries a database with the information obtained by decoding the blinking pattern, and acquires coordinate information of the reference index, information on the reference index, or information on the image display device.
- The information processing apparatus according to claim 2 or 3, wherein the position and orientation calculation unit calculates the position and orientation of the image display device in the first reference coordinate system using Visual SLAM (Simultaneous Localization and Mapping).
- The information processing apparatus according to claim 11, wherein the position and orientation of the image display device calculated by the position and orientation calculation unit in a reference state, in which the information processing apparatus is held in an initial posture, is stored, and the position and orientation conversion unit calculates the change in the orientation of the image display device from the reference state and converts the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information processing apparatus in the second reference coordinate system.
- An information processing method comprising: a position and orientation calculation step of calculating, based on an image of another image display device captured by an imaging unit mounted on an information terminal, the position and orientation of the image display device in a first reference coordinate system of the information terminal; a position and orientation conversion step of converting the position and orientation of the image display device in the first reference coordinate system into the position and orientation of the information terminal in a second reference coordinate system of the image display device; and an embedding position calculation step of calculating, based on the position and orientation of the information terminal in the second reference coordinate system, a position at which the screen information of the information terminal is embedded with respect to the screen of the image display device.
- An image display system comprising: a display device fixed to the head or face of an observer; and an information processing apparatus that includes an imaging unit and a display unit, calculates, based on an image of the external image display device captured by the imaging unit, the position and orientation of the image display device in a first reference coordinate system of the information processing apparatus, converts that position and orientation into the position and orientation of the information processing apparatus in a second reference coordinate system of the image display device, and calculates a position at which the screen information of the display unit is embedded with respect to the screen of the image display device.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/328,089 US10719950B2 (en) | 2014-08-05 | 2015-04-30 | Head mount display (HMD) operated with mobile device for transforming reference coordinate systems for providing screen information |
JP2016539870A JP6525010B2 (ja) | 2014-08-05 | 2015-04-30 | 情報処理装置及び情報処理方法、並びに画像表示システム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-159169 | 2014-08-05 | ||
JP2014159169 | 2014-08-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016021252A1 true WO2016021252A1 (ja) | 2016-02-11 |
Family
ID=55263535
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/062934 WO2016021252A1 (ja) | 2014-08-05 | 2015-04-30 | 情報処理装置及び情報処理方法、並びに画像表示システム |
Country Status (3)
Country | Link |
---|---|
US (1) | US10719950B2 (ja) |
JP (1) | JP6525010B2 (ja) |
WO (1) | WO2016021252A1 (ja) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017032870A (ja) * | 2015-08-04 | 2017-02-09 | 富士通株式会社 | 画像投影装置及び画像表示システム |
JP2017220032A (ja) * | 2016-06-07 | 2017-12-14 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理方法、およびコンピュータプログラム |
JP2019074362A (ja) * | 2017-10-13 | 2019-05-16 | 任天堂株式会社 | 姿勢位置推測システム、姿勢位置推測方法、および姿勢位置推測装置 |
JP2019113693A (ja) * | 2017-12-22 | 2019-07-11 | セイコーエプソン株式会社 | 表示システム、表示システムの制御方法、表示装置、及び、表示装置の制御方法 |
CN110060202A (zh) * | 2019-04-19 | 2019-07-26 | 湖北亿咖通科技有限公司 | 一种单目slam算法的初始化方法及系统 |
CN110537208A (zh) * | 2017-05-04 | 2019-12-03 | 索尼互动娱乐欧洲有限公司 | 头戴式显示器和方法 |
JP2020501245A (ja) * | 2016-11-25 | 2020-01-16 | センサリクス アーゲー | 着用可能な動作追跡システム |
JP2020507221A (ja) * | 2017-02-03 | 2020-03-05 | ベステル エレクトロニク サナイー ベ ティカレト エー.エス. | Hmdを用いたビデオ会議の改良された方法およびシステム |
WO2021095537A1 (ja) * | 2019-11-12 | 2021-05-20 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、並びにプログラム |
JPWO2021131023A1 (ja) * | 2019-12-27 | 2021-07-01 | ||
CN113467602A (zh) * | 2020-03-31 | 2021-10-01 | 中国移动通信集团浙江有限公司 | Vr显示方法及系统 |
CN114827338A (zh) * | 2021-01-29 | 2022-07-29 | 北京外号信息技术有限公司 | 用于在设备的显示媒介上呈现虚拟对象的方法和电子装置 |
WO2022230350A1 (ja) * | 2021-04-28 | 2022-11-03 | ソニーグループ株式会社 | 情報処理装置、情報処理方法およびプログラム |
WO2022244052A1 (ja) * | 2021-05-17 | 2022-11-24 | マクセル株式会社 | ヘッドマウントディスプレイ装置 |
WO2022269753A1 (ja) | 2021-06-22 | 2022-12-29 | マクセル株式会社 | 情報処理システム、情報処理装置及び画像表示装置 |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2998849A4 (en) * | 2013-05-15 | 2017-01-25 | Sony Corporation | Display control device, display control method, and recording medium |
CN104898829A (zh) * | 2015-04-17 | 2015-09-09 | 杭州豚鼠科技有限公司 | 体感交互系统 |
WO2016194844A1 (ja) * | 2015-05-29 | 2016-12-08 | 京セラ株式会社 | ウェアラブル装置 |
JP6367166B2 (ja) * | 2015-09-01 | 2018-08-01 | 株式会社東芝 | 電子機器及び方法 |
JP2018101019A (ja) * | 2016-12-19 | 2018-06-28 | セイコーエプソン株式会社 | 表示装置及び表示装置の制御方法 |
US10835809B2 (en) | 2017-08-26 | 2020-11-17 | Kristina Contreras | Auditorium efficient tracking in auditory augmented reality |
WO2019044003A1 (ja) * | 2017-09-04 | 2019-03-07 | 株式会社ワコム | 空間位置指示システム |
US10776475B2 (en) * | 2017-11-13 | 2020-09-15 | International Business Machines Corporation | Secure password input in electronic devices |
CN109992100B (zh) * | 2017-12-30 | 2022-11-29 | 深圳多哚新技术有限责任公司 | 一种头戴显示系统及其显示方法 |
JP7238456B2 (ja) * | 2019-02-21 | 2023-03-14 | セイコーエプソン株式会社 | 表示システム、情報処理装置の制御プログラム、及び情報処理装置の制御方法 |
US11153488B2 (en) * | 2019-09-26 | 2021-10-19 | United States Of America, As Represented By The Secretary Of The Army | Variable latency and frame rate camera |
US11169600B1 (en) | 2019-12-06 | 2021-11-09 | Snap Inc. | Virtual object display interface between a wearable device and a mobile device |
TWI800856B (zh) * | 2021-06-25 | 2023-05-01 | 宏碁股份有限公司 | 擴增實境系統及其操作方法 |
US12093470B2 (en) * | 2021-08-31 | 2024-09-17 | Htc Corporation | Virtual image display system and calibration method for pointing direction of controller thereof |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010276792A (ja) * | 2009-05-27 | 2010-12-09 | Ntt Docomo Inc | 画像表示システム、画像表示方法、及び携帯端末 |
JP2011186856A (ja) * | 2010-03-09 | 2011-09-22 | Nec Corp | ヘッドマウントディスプレイを外部表示装置として使用する携帯端末 |
JP2013175208A (ja) * | 2013-04-05 | 2013-09-05 | Nintendo Co Ltd | 情報処理装置および情報処理プログラム |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI222039B (en) * | 2000-06-26 | 2004-10-11 | Iwane Lab Ltd | Information conversion system |
JP2005038321A (ja) | 2003-07-18 | 2005-02-10 | Canon Inc | ヘッドマウントディスプレイ装置 |
JP4218952B2 (ja) * | 2003-09-30 | 2009-02-04 | キヤノン株式会社 | データ変換方法及び装置 |
JP4642538B2 (ja) * | 2005-04-20 | 2011-03-02 | キヤノン株式会社 | 画像処理方法および画像処理装置 |
JP5329480B2 (ja) | 2010-05-18 | 2013-10-30 | 富士フイルム株式会社 | ヘッドマウントディスプレイ装置 |
JP5434848B2 (ja) | 2010-08-18 | 2014-03-05 | ソニー株式会社 | 表示装置 |
JP2012141461A (ja) | 2010-12-29 | 2012-07-26 | Sony Corp | ヘッド・マウント・ディスプレイ |
JP2012186660A (ja) | 2011-03-06 | 2012-09-27 | Sony Corp | ヘッド・マウント・ディスプレイ |
US9973848B2 (en) * | 2011-06-21 | 2018-05-15 | Amazon Technologies, Inc. | Signal-enhancing beamforming in an augmented reality environment |
WO2013027628A1 (ja) * | 2011-08-24 | 2013-02-28 | ソニー株式会社 | 情報処理装置、情報処理方法及びプログラム |
JP5538483B2 (ja) * | 2012-06-29 | 2014-07-02 | 株式会社ソニー・コンピュータエンタテインメント | 映像処理装置、映像処理方法、および映像処理システム |
KR101861380B1 (ko) * | 2012-07-16 | 2018-05-28 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | 헤드 마운트 디스플레이를 이용한 컨텐츠 출력 방법 및 이를 위한 헤드 마운트 디스플레이 |
KR102063076B1 (ko) * | 2013-07-10 | 2020-01-07 | 엘지전자 주식회사 | 모바일 디바이스, 헤드 마운트 디스플레이 및 제어 방법 |
US9759918B2 (en) * | 2014-05-01 | 2017-09-12 | Microsoft Technology Licensing, Llc | 3D mapping with flexible camera rig |
US9659411B2 (en) * | 2015-01-14 | 2017-05-23 | Oculus Vr, Llc | Passive locators for a virtual reality headset |
WO2017104869A1 (ko) * | 2015-12-17 | 2017-06-22 | 주식회사 룩시드랩스 | 아이 브레인 인터페이스(ebi) 장치 및 그 제어 방법 |
-
2015
- 2015-04-30 JP JP2016539870A patent/JP6525010B2/ja active Active
- 2015-04-30 WO PCT/JP2015/062934 patent/WO2016021252A1/ja active Application Filing
- 2015-04-30 US US15/328,089 patent/US10719950B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010276792A (ja) * | 2009-05-27 | 2010-12-09 | Ntt Docomo Inc | 画像表示システム、画像表示方法、及び携帯端末 |
JP2011186856A (ja) * | 2010-03-09 | 2011-09-22 | Nec Corp | ヘッドマウントディスプレイを外部表示装置として使用する携帯端末 |
JP2013175208A (ja) * | 2013-04-05 | 2013-09-05 | Nintendo Co Ltd | 情報処理装置および情報処理プログラム |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017032870A (ja) * | 2015-08-04 | 2017-02-09 | 富士通株式会社 | 画像投影装置及び画像表示システム |
JP2017220032A (ja) * | 2016-06-07 | 2017-12-14 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理方法、およびコンピュータプログラム |
JP2020501245A (ja) * | 2016-11-25 | 2020-01-16 | センサリクス アーゲー | 着用可能な動作追跡システム |
JP7162898B2 (ja) | 2016-11-25 | 2022-10-31 | センサリクス アーゲー | 着用可能な動作追跡システム |
JP2020507221A (ja) * | 2017-02-03 | 2020-03-05 | ベステル エレクトロニク サナイー ベ ティカレト エー.エス. | Hmdを用いたビデオ会議の改良された方法およびシステム |
EP3619685B1 (en) * | 2017-05-04 | 2024-01-24 | Sony Interactive Entertainment Inc. | Head mounted display and method |
CN110537208B (zh) * | 2017-05-04 | 2024-01-09 | 索尼互动娱乐股份有限公司 | 头戴式显示器和方法 |
US11590415B2 (en) | 2017-05-04 | 2023-02-28 | Sony Interactive Entertainment Inc. | Head mounted display and method |
CN110537208A (zh) * | 2017-05-04 | 2019-12-03 | 索尼互动娱乐欧洲有限公司 | 头戴式显示器和方法 |
JP2020520494A (ja) * | 2017-05-04 | 2020-07-09 | ソニー インタラクティブ エンタテインメント ヨーロッパ リミテッド | ヘッドマウントディスプレイおよび方法 |
JP7191853B2 (ja) | 2017-05-04 | 2022-12-19 | 株式会社ソニー・インタラクティブエンタテインメント | ヘッドマウントディスプレイおよび方法 |
US11272152B2 (en) | 2017-10-13 | 2022-03-08 | Nintendo Co., Ltd. | Orientation and/or position estimation system, orientation and/or position estimation method, and orientation and/or position estimation apparatus |
JP7027109B2 (ja) | 2017-10-13 | 2022-03-01 | 任天堂株式会社 | 姿勢位置推測システム、姿勢位置推測方法、および姿勢位置推測装置 |
JP2019074362A (ja) * | 2017-10-13 | 2019-05-16 | 任天堂株式会社 | 姿勢位置推測システム、姿勢位置推測方法、および姿勢位置推測装置 |
JP7035510B2 (ja) | 2017-12-22 | 2022-03-15 | セイコーエプソン株式会社 | 表示システム、及び、表示システムの制御方法 |
JP2019113693A (ja) * | 2017-12-22 | 2019-07-11 | セイコーエプソン株式会社 | 表示システム、表示システムの制御方法、表示装置、及び、表示装置の制御方法 |
CN110060202A (zh) * | 2019-04-19 | 2019-07-26 | 湖北亿咖通科技有限公司 | 一种单目slam算法的初始化方法及系统 |
CN110060202B (zh) * | 2019-04-19 | 2021-06-08 | 湖北亿咖通科技有限公司 | 一种单目slam算法的初始化方法及系统 |
WO2021095537A1 (ja) * | 2019-11-12 | 2021-05-20 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、並びにプログラム |
US11954269B2 (en) | 2019-11-12 | 2024-04-09 | Sony Group Corporation | Information processing apparatus, information processing method, and program for generating location data |
JP7376616B2 (ja) | 2019-12-27 | 2023-11-08 | マクセル株式会社 | ヘッドマウント型情報出力装置 |
JPWO2021131023A1 (ja) * | 2019-12-27 | 2021-07-01 | ||
CN113467602A (zh) * | 2020-03-31 | 2021-10-01 | 中国移动通信集团浙江有限公司 | Vr显示方法及系统 |
CN113467602B (zh) * | 2020-03-31 | 2024-03-19 | 中国移动通信集团浙江有限公司 | Vr显示方法及系统 |
CN114827338A (zh) * | 2021-01-29 | 2022-07-29 | 北京外号信息技术有限公司 | 用于在设备的显示媒介上呈现虚拟对象的方法和电子装置 |
WO2022230350A1 (ja) * | 2021-04-28 | 2022-11-03 | ソニーグループ株式会社 | 情報処理装置、情報処理方法およびプログラム |
WO2022244052A1 (ja) * | 2021-05-17 | 2022-11-24 | マクセル株式会社 | ヘッドマウントディスプレイ装置 |
WO2022269753A1 (ja) | 2021-06-22 | 2022-12-29 | マクセル株式会社 | 情報処理システム、情報処理装置及び画像表示装置 |
Also Published As
Publication number | Publication date |
---|---|
US10719950B2 (en) | 2020-07-21 |
JP6525010B2 (ja) | 2019-06-05 |
US20170206673A1 (en) | 2017-07-20 |
JPWO2016021252A1 (ja) | 2017-05-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6525010B2 (ja) | 情報処理装置及び情報処理方法、並びに画像表示システム | |
WO2016013269A1 (ja) | 画像表示装置及び画像表示方法、並びにコンピューター・プログラム | |
CN111052046B (zh) | 使用现实界面访问外部设备的功能 | |
WO2015111283A1 (ja) | 画像表示装置及び画像表示方法 | |
KR102233223B1 (ko) | 화상 표시 장치 및 화상 표시 방법, 화상 출력 장치 및 화상 출력 방법과, 화상 표시 시스템 | |
JP6217747B2 (ja) | 情報処理装置及び情報処理方法 | |
JP6428268B2 (ja) | 画像表示装置及び画像表示方法、並びに画像表示システム | |
KR102184272B1 (ko) | 글래스 타입 단말기 및 이의 제어방법 | |
JP6340301B2 (ja) | ヘッドマウントディスプレイ、携帯情報端末、画像処理装置、表示制御プログラム、表示制御方法、及び表示システム | |
US11042038B2 (en) | Display control apparatus and display control method | |
CN111630478B (zh) | 高速交错双眼跟踪系统 | |
WO2017094606A1 (ja) | 表示制御装置及び表示制御方法 | |
JP6822410B2 (ja) | 情報処理システム及び情報処理方法 | |
KR20160001178A (ko) | 글래스 타입 단말기 및 이의 제어방법 | |
US10564801B2 (en) | Method for communicating via virtual space and information processing apparatus for executing the method | |
US20210160150A1 (en) | Information processing device, information processing method, and computer program | |
US20210063746A1 (en) | Information processing apparatus, information processing method, and program | |
KR101784095B1 (ko) | 복수의 영상 데이터를 이용하는 헤드 마운트 디스플레이 장치 및 복수의 영상 데이터를 송수신하기 위한 시스템 | |
US20200348749A1 (en) | Information processing apparatus, information processing method, and program | |
US11589001B2 (en) | Information processing apparatus, information processing method, and program | |
US12088781B2 (en) | Hyper-connected and synchronized AR glasses | |
US20210065435A1 (en) | Data processing | |
WO2018216327A1 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
JP6705929B2 (ja) | 表示制御装置及び表示制御方法 | |
JP2021068296A (ja) | 情報処理装置、ヘッドマウントディスプレイ、およびユーザ操作処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15829711 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016539870 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15328089 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15829711 Country of ref document: EP Kind code of ref document: A1 |