WO2021075113A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program Download PDF

Info

Publication number
WO2021075113A1
WO2021075113A1 (PCT/JP2020/027831, JP2020027831W)
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
calibration
real object
contact operation
plane
Prior art date
Application number
PCT/JP2020/027831
Other languages
French (fr)
Japanese (ja)
Inventor
周藤 泰広
Original Assignee
ソニー株式会社
Priority date
Filing date
Publication date
Application filed by ソニー株式会社
Publication of WO2021075113A1 publication Critical patent/WO2021075113A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02Viewing or reading apparatus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position

Definitions

  • This disclosure relates to information processing devices, information processing methods and programs.
  • There is known a technology related to augmented reality (AR (Augmented Reality)) using a so-called optical see-through type display that superimposes a virtual display object on a real object that the user is viewing. In such AR technology, an operation is performed using a parameter that converts the position of a real object into a position on the display. It has been proposed to calibrate this parameter in order to accurately superimpose a virtual object on the position of the real object (see, for example, Non-Patent Document 1 below).
  • The present disclosure has been made in view of the above points, and one of its purposes is to provide an information processing device, an information processing method, and a program capable of calibrating the parameter by a simple method that minimizes the load on the user.
  • The present disclosure is, for example, an information processing device having a calibration processing unit that performs a calibration process of setting a parameter that converts the position at which a contact operation is performed on a real object into a position on the display on which objects are displayed, based on the detection result of a contact operation on each of at least three objects superimposed and displayed on the real object.
  • The present disclosure is also, for example, an information processing method in which a calibration processing unit performs a calibration process of setting a parameter that converts the position at which a contact operation is performed on a real object into a position on the display on which objects are displayed, based on the detection result of a contact operation on each of at least three objects superimposed and displayed on the real object.
  • The present disclosure is further, for example, a program that causes a computer to execute an information processing method in which a calibration processing unit performs a calibration process of setting a parameter that converts the position at which a contact operation is performed on a real object into a position on the display on which objects are displayed, based on the detection result of a contact operation on each of at least three objects superimposed and displayed on the real object.
  • FIG. 1 is a diagram showing an external example of the information processing apparatus according to the present embodiment.
  • FIG. 2 is a diagram referred to when explaining an internal configuration example of the information processing apparatus according to the present embodiment.
  • FIG. 3 is a diagram referred to when explaining the function of the information processing apparatus according to the present embodiment.
  • FIG. 4 is a diagram referred to when explaining an example of the object according to the present embodiment.
  • FIG. 5 is a flowchart for explaining the flow of the calibration process performed by the information processing apparatus according to the present embodiment.
  • FIG. 1 is a diagram showing an external example of the information processing device 1 according to the present embodiment.
  • the information processing device 1 has a left image display unit 3A and a right image display unit 3B located in front of the eyes, similarly to ordinary eyeglasses.
  • the left image display unit 3A and the right image display unit 3B are configured as, for example, a see-through type display unit (optical see-through display).
  • the left image display unit 3A and the right image display unit 3B are appropriately collectively referred to as the display unit 3.
  • the information processing device 1 has a frame 5 for holding the left image display unit 3A and the right image display unit 3B.
  • the frame 5 is made of the same materials that make up ordinary eyeglasses, such as metals, alloys, plastics, and combinations thereof.
  • the frame 5 is equipped with batteries, various sensors, speakers, and the like.
  • an outward-facing camera 11 for recognizing an object existing in the user's line-of-sight direction is mounted at a predetermined position of the frame 5.
  • the camera 11 is configured as, for example, a stereo camera having a camera 11A and a camera 11B, and the camera 11 can acquire information on the depth.
  • A depth sensor (for example, ToF (Time of Flight) or LiDAR (Light Detection and Ranging)) for acquiring depth information may also be mounted on the frame 5.
  • the frame 5 is equipped with a camera 12 facing inward (the side facing the user's eyes).
  • the camera 12 has a camera 12A for the left eye and a camera 12B for the right eye.
  • FIG. 2 is a diagram for explaining an example of internal configuration of the information processing device 1 according to the present embodiment.
  • the information processing device 1 includes, for example, a signal processing unit 21, a sensor unit 22, a calibration processing unit 23, an AR superimposition processing unit 24, and a drawing processing unit 25.
  • the signal processing unit 21 comprehensively controls the entire information processing device 1.
  • the signal processing unit 21 performs known processing on the sensing data supplied from the sensor unit 22.
  • the signal processing unit 21 detects a contact operation with respect to each of at least three objects superimposed and displayed on the real object based on the sensing data supplied from the sensor unit 22. The specific contents of other processing performed by the signal processing unit 21 will be described later.
  • the sensor unit 22 is a general term for the sensors included in the information processing device 1.
  • the sensor unit 22 includes the above-mentioned cameras 11 and 12. As described above, the sensor unit 22 may include a depth sensor for acquiring depth information.
  • the calibration processing unit 23 performs the calibration process.
  • the calibration process is a process of setting a parameter that converts the position at which the contact operation is performed on the real object into a position on the display, based on the detection result of the contact operation on each of at least three objects superimposed and displayed on the real object. The specific contents of the calibration process will be described later.
  • the AR superimposition processing unit 24 calculates and sets the position (specifically, the coordinates) at which the object is superposed on the display unit 3.
  • the AR superimposition processing unit 24 supplies the calculation result to the drawing processing unit 25.
  • the drawing processing unit 25 performs drawing processing in which a virtual object is superimposed and displayed on the position (drawing position) of the display unit 3 supplied from the AR superimposition processing unit 24. By the processing by the drawing processing unit 25, the object is drawn (superimposed display) in the real space visually recognized through the display unit 3.
  • the memory unit 26 is a general term for the memory included in the information processing device 1.
  • the memory unit 26 includes, for example, a ROM (Read Only Memory) in which a program executed by the information processing device 1 is stored and a RAM (Random Access Memory) used as a work memory.
  • the memory unit 26 may be built in the information processing device 1 or may be detachable. Specific examples of the memory unit 26 include a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, an optical magnetic storage device, and a storage device combining these. An area for storing parameters to be calibrated, which will be described later, is set in the memory unit 26.
  • the configuration of the information processing device 1 described above is an example, and the information processing device 1 may have another configuration.
  • the information processing device 1 may have a communication unit for communicating with another device, a speaker for reproducing voice, a sensor for grasping its own position, and the like.
  • FIG. 3 is a functional block diagram for explaining an example of the functions of the information processing device 1.
  • Functions of the information processing device 1 include, for example, a plane detection process, a pointer generation process, a drawing position calculation process, a display drawing process, a touch detection process, a touch corresponding point saving process, and a calibration process.
  • planes such as a wall surface and a surface on a table existing around the information processing device 1 are detected.
  • a plane on the real object is detected based on the depth of the real object on which the contact operation is performed.
  • the plane detection process is performed based on the image obtained by the camera 11 and the output of the depth sensor. More specifically, a plane in which the difference in distance in the depth direction detected based on the image captured by the camera 11 is within a certain range is detected as a plane.
  • the flat surface does not have to be flat in a strict sense, and may have irregularities within a certain range.
  • the plane detection process gives a plane equation that defines the detected plane. An object described later is drawn on the plane detected by the plane detection process.
  • another recognition process other than depth detection such as an image recognition process for a two-dimensional image captured by the camera 11, may be used.
  • a plurality of planes can be detected by the plane detection process. In the present embodiment, among the plurality of detected planes, the plane closest to the information processing device 1 (more specifically, to the user using it) is set as the plane on which the objects are superimposed.
  • In the pointer generation process, for example, three pointer coordinates are set on the plane defined by the plane equation obtained by the plane detection process. Then, a circle centered on each pointer coordinate is defined. The set pointer coordinates are used in the drawing position calculation process.
  • In Equation 1, (u, v) are coordinates on the display unit 3, (x, y, z) are coordinates in the real space (specifically, the pointer coordinates), and A is a matrix of internal parameters of the display unit 3 (parameters such as the angle of view and focal length), which are known values set at the time of shipment of the information processing apparatus 1.
  • [R|T] in Equation 1 represents the posture (Rotation) and position (Translation) of the imaging unit (the camera 11 in the present embodiment). This [R|T] is the parameter to be calibrated, referred to below as the calibration parameter [R|T].
  • The calibration parameter [R|T] used in the calculation performed in the drawing position calculation process is a preset default calibration parameter.
  • In the drawing position calculation process, three pointer coordinates (u, v) corresponding to the three pointer coordinates (x, y, z) set in the pointer generation process are obtained.
  • Since a circle is set in the pointer generation process, a calculation using an ellipse equation for converting the circle into an ellipse is performed in order to project the circle onto the plane.
  • The calculation results of the drawing position calculation process are used for the display drawing process. Further, the calculation results (the three pointer coordinates (u, v)) are saved (stored) in the memory unit 26 by the touch corresponding point saving process.
  • In the drawing process, an ellipse centered on the pointer coordinates (u, v) is drawn at the depth position on the plane (on the same plane) on the real object detected by the plane detection process.
  • For example, three elliptical virtual markers (an example of three objects), each centered on one of the three pointer coordinates, are drawn on a table surface, which is an example of a plane (see FIG. 4).
  • The user recognizes the three ellipses displayed on the display unit 3.
  • The shape of the object is not limited to an ellipse and can be any shape, such as a rectangle or a polygon. Depending on the shape of the object, the calculation using the ellipse equation described above may be omitted. Further, in order to clearly distinguish individual objects, the color and shape of each object may differ.
  • After the objects are drawn, a notification is given prompting the user to perform a contact operation by touching the vicinity of the center of each object with a finger.
  • Such notification may be performed by drawing characters on the display unit 3, or may be performed by reproducing voice.
  • the user performs a contact operation near the center of the object.
  • the contact operation performed by the user is detected.
  • the contact operation is performed, for example, by the fingertip of the user (see FIG. 4). It should be noted that the contact operation may be performed using a pen or the like instead of the user's fingertip.
  • For example, the depth near the center of a given object and the depth of the user's fingertip are detected, and when the difference becomes equal to or less than a predetermined value, it is detected that a contact operation has been performed on that object. It may also be detected that a contact operation has been performed when, based on the images captured by the camera 11, the movement of the user's fingertip remains substantially stationary for a certain period of time (for example, several seconds). Further, the contact operation may be detected by another known method.
  • the touch detection process detects touch points (x, y, z) that are positions corresponding to the user's contact operation.
  • the three touch points are detected by the touch detection process.
  • the detected three touch points are saved in the memory unit 26 by the touch point saving process. That is, in the memory unit 26, three sets of pointer coordinates (u, v) and touch points (x, y, z) obtained by the drawing position calculation process are stored.
  • In the calibration process, a calculation based on Equation 1 is performed using each of the three pairs of pointer coordinates (u, v) and touch points (x, y, z) stored in the memory unit 26, whereby the calibration parameter [R|T] corresponding to the user's current usage environment is calculated.
  • In other words, in the calibration process, the calibration parameter [R|T], which is the parameter for converting the position of the real object into a position on the display unit 3, is appropriately set based on the detection results of the contact operations described above.
  • The set calibration parameter [R|T] is stored in the memory unit 26.
  • In the subsequent AR processing of the information processing apparatus 1, the calibration parameter [R|T] obtained by the calibration process is used.
  • the plane detection process, the pointer generation process, the touch detection process, and the touch corresponding point saving process are performed by the signal processing unit 21. Therefore, in the present embodiment, the signal processing unit 21 functions as a contact operation detection unit that detects a contact operation and a plane detection unit that detects a plane on a real object. Further, in the present embodiment, the calibration process is performed by the calibration processing unit 23. Further, in the present embodiment, the drawing position calculation process is performed by the AR superimposition processing unit 24. Therefore, in the present embodiment, the AR superimposition processing unit 24 functions as a drawing position calculation processing unit. Further, in the present embodiment, the drawing process is performed by the drawing processing unit 25.
  • In step ST11, the calibration process is started.
  • the calibration process is performed, for example, at the start of use of the information processing device 1. Specifically, it is detected that the frame 5 of the information processing device 1 is hung on the user's ear by a pressure sensor or the like. When such detection is made, the calibration process described below is performed.
  • the calibration process may be performed when a predetermined operation (for example, an operation of turning on the power) is performed on the information processing device 1. Further, when it is detected by the inner camera 12 that the deviation of the relative relationship between the display unit 3 and the eyes (for example, the pupil) becomes more than a predetermined value, the calibration process may be performed. Further, since it is assumed that the position where the information processing apparatus 1 is used changes even during use, the calibration process may be periodically performed even after the start of use. Then, the process proceeds to step ST12.
  • In step ST12, the user is instructed to approach a flat surface such as a table or a wall. Such an instruction is given by displaying characters or by reproducing audio. Then, the process proceeds to step ST13.
  • In step ST13, the plane is detected by the plane detection process. For example, a contact operation is performed by the user. Then, the depth of the position on the real object, such as a table, on which the contact operation was performed is detected, and the plane on the real object is detected based on the detected depth. The user may also be instructed in advance to approach a plane already detected by the plane detection process. Then, the process proceeds to step ST14.
  • In step ST14, a circle centered on each of the three pointer coordinates (x, y, z) is set by the pointer generation process. Then, the drawing position calculation process performs a calculation on each of the three pointer coordinates (x, y, z) using the default calibration parameters, so that three pointer coordinates (u, v) are obtained. Three elliptical objects, each centered on one of the three pointer coordinates (u, v), are drawn at the depth positions on the plane detected by the plane detection process. Then, a notification is given to the user prompting a contact operation near the center of each drawn object. Then, the process proceeds to step ST15.
  • In step ST15, the touch point corresponding to the contact operation by the user is detected by the touch detection process. Then, the process proceeds to step ST16.
  • In step ST16, it is determined whether or not the number of pairs of pointer coordinates (u, v) and touch points (x, y, z) is sufficient. In this example, since three objects are drawn, three touch points (x, y, z) are detected; therefore, it is determined whether or not the number of pairs is three or more. Such a determination is made, for example, by the signal processing unit 21.
  • If the determination result in step ST16 is No, the process returns to step ST15, and the touch point detection process is performed again. At this time, objects that have not yet been touched may be emphasized, or objects that have already been touched may be erased, so that the user can easily recognize which objects have not yet been touched.
  • In step ST17, the calibration parameter [R|T] is calculated by the calibration process, and the calibration parameter [R|T] after setting is saved in the memory unit 26 and used in subsequent processing.
  • As described above, according to the present embodiment, the calibration parameter can be set appropriately. Further, in the present embodiment, it is not necessary to prepare a specific object or the like for the calibration process. Further, since the user only needs to perform a contact operation on the objects drawn on the plane, the calibration process can be performed easily without imposing an excessive burden on the user. Further, since the user obtains a tactile sensation by touching the real object, the touch point is more stable than when an operation is performed at a position in space where no real object exists. By superimposing the objects on a real object whose position is easy for the user to specify, the three-dimensional position in the real space is detected stably, and the accuracy of the calibration process is improved.
  • Further, the constraint condition that the touch points exist on the plane can be used, so the calculation related to the calibration process can be simplified. Moreover, since the calibration process can be performed as long as there is a flat surface in the surroundings, locational restrictions on performing the calibration process can be eliminated.
  • The information processing device 1 is not limited to the eyeglass type, and may be a HUD (Head Up Display) worn on the head, a helmet-type display device, or a head-up-display type device using the windshield of a car. Further, the device does not necessarily have to be worn on the human body and may be a portable device.
  • a part of the functions of the information processing device 1 according to the above-described embodiment may be performed by a device on the cloud.
  • the device on the cloud can be an information processing device.
  • The present disclosure can also be realized by devices, methods, programs, systems, and the like. For example, by making a program that performs the functions described in the above embodiment downloadable, and by having a device that does not have those functions download and install the program, it becomes possible to perform the control described in the embodiment on that device.
  • the present disclosure can also be realized by a server that distributes such a program.
  • the items described in each embodiment and modification can be combined as appropriate.
  • the present disclosure may also adopt the following configuration.
  • (1) An information processing device having a calibration processing unit that performs a calibration process of setting a parameter for converting a position at which a contact operation is performed on a real object into a position on a display on which objects are displayed, based on a detection result of the contact operation on each of at least three objects superimposed and displayed on the real object.
  • (2) The information processing apparatus according to (1), which has a contact operation detection unit that detects a position on the real object on which the contact operation has been performed.
  • (3) The information processing device according to (2), wherein the contact operation detection unit detects the depth of the position on the real object on which the contact operation is performed.
  • It has a plane detection unit that detects a plane on the real object based on the depth of the real object on which the contact operation is performed.
  • the information processing apparatus according to (3), wherein the three objects are superimposed and displayed at the position of the depth on the plane on the real object.
  • the information processing apparatus according to any one of (1) to (5), wherein the parameter is a parameter including a posture and a position of a photographing unit that photographs the real object.
  • the information processing apparatus which has the photographing unit.
  • The information processing apparatus according to (4) or (5), wherein the plane detection unit sets, among a plurality of detected planes, the plane located closest as the plane on which the objects are superimposed and displayed.
  • the information processing apparatus performs the calibration processing at the start of use of the information processing apparatus.
  • the calibration processing unit performs the calibration processing when the deviation between the positions of the display and the eyes is equal to or greater than a predetermined value.
  • the information processing apparatus which has a drawing position calculation processing unit for calculating each drawing position of the three objects on the display.
  • the information processing apparatus which has a drawing processing unit that performs a process of displaying the object at the drawing position set by the drawing position calculation processing unit.
  • An information processing method in which a calibration processing unit performs a calibration process of setting a parameter for converting a position at which a contact operation is performed on a real object into a position on a display on which objects are displayed, based on a detection result of the contact operation on each of at least three objects superimposed and displayed on the real object.
  • A program that causes a computer to execute an information processing method in which a calibration processing unit performs a calibration process of setting a parameter for converting a position at which a contact operation is performed on a real object into a position on a display on which objects are displayed, based on a detection result of the contact operation on each of at least three objects superimposed and displayed on the real object.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

According to the present invention, an information processing device includes a calibration processing unit which performs, on the basis of detection results of contact operations on each of at least three objects that are superimposed and displayed on an actual object, a calibration process for setting a parameter for converting a position at which a contact operation is performed on the actual object into a position on a display for displaying the objects. FIG. 2

Description

Information processing device, information processing method, and program
This disclosure relates to an information processing device, an information processing method, and a program.
There is known a technology related to augmented reality (AR (Augmented Reality)) using a so-called optical see-through type display that superimposes a virtual display object on a real object that the user is viewing. In such AR technology, an operation is performed using a parameter that converts the position of a real object into a position on the display. It has been proposed to calibrate this parameter in order to accurately superimpose a virtual object on the position of the real object (see, for example, Non-Patent Document 1 below).
In such fields, it is desirable that calibration be performed as easily as possible.
The present disclosure has been made in view of the above points, and one of its purposes is to provide an information processing device, an information processing method, and a program capable of calibrating the parameter by a simple method that minimizes the load on the user.
The present disclosure is, for example, an information processing device having a calibration processing unit that performs a calibration process of setting a parameter that converts the position at which a contact operation is performed on a real object into a position on the display on which objects are displayed, based on the detection result of a contact operation on each of at least three objects superimposed and displayed on the real object.
The present disclosure is also, for example, an information processing method in which a calibration processing unit performs a calibration process of setting a parameter that converts the position at which a contact operation is performed on a real object into a position on the display on which objects are displayed, based on the detection result of a contact operation on each of at least three objects superimposed and displayed on the real object.
The present disclosure is further, for example, a program that causes a computer to execute an information processing method in which a calibration processing unit performs a calibration process of setting a parameter that converts the position at which a contact operation is performed on a real object into a position on the display on which objects are displayed, based on the detection result of a contact operation on each of at least three objects superimposed and displayed on the real object.
FIG. 1 is a diagram showing an example of the appearance of the information processing device according to the present embodiment. FIG. 2 is a diagram referred to when explaining an example of the internal configuration of the information processing device according to the present embodiment. FIG. 3 is a diagram referred to when explaining the functions of the information processing device according to the present embodiment. FIG. 4 is a diagram referred to when explaining an example of the objects according to the present embodiment. FIG. 5 is a flowchart for explaining the flow of the calibration process performed by the information processing device according to the present embodiment.
Hereinafter, embodiments and the like of the present disclosure will be described with reference to the drawings. The description is given in the following order.
<Problems to be considered in this embodiment>
<One embodiment>
<Modification examples>
The embodiments and the like described below are preferred specific examples of the present disclosure, and the contents of the present disclosure are not limited to these embodiments and the like.
<Problems to be considered in this embodiment>
First, to facilitate understanding of the present disclosure, the problems to be considered in the present disclosure will be described. In AR technology using an optical see-through type display, superimposition of a virtual object on a real object is realized by calibrating between the camera and the display. The camera and the display are each represented and handled by their individual internal parameters and by external parameters describing their relative relationship. Since the viewpoint position with respect to the display differs slightly for each user, individual calibration is required. In addition, because the worn device can shift on the face and can shift again later, superimposition accuracy can be improved by performing calibration when the user views the optical see-through type display. It is desirable that such calibration be performed by a method that places as little burden as possible on the user, and that it can be performed with as few restrictions on location as possible. The embodiments of the present disclosure are described in detail below with these points in mind.
<One embodiment>
[Example of the appearance of the information processing device]
In the present embodiment, a glasses-type wearable device is described as an example of the information processing device (information processing device 1). FIG. 1 is a diagram showing an example of the appearance of the information processing device 1 according to the present embodiment. The information processing device 1 has a left image display unit 3A and a right image display unit 3B located in front of the eyes, similarly to ordinary eyeglasses. The left image display unit 3A and the right image display unit 3B are configured as, for example, see-through display units (optical see-through displays). When it is not necessary to distinguish between the left and right display units, the left image display unit 3A and the right image display unit 3B are collectively referred to as the display unit 3 as appropriate.
The information processing device 1 has a frame 5 for holding the left image display unit 3A and the right image display unit 3B. The frame 5 is made of the same materials as ordinary eyeglasses, such as metals, alloys, plastics, and combinations thereof.
The frame 5 is equipped with a battery, various sensors, speakers, and the like. For example, an outward-facing camera 11 for recognizing an object existing in the user's line-of-sight direction is mounted at a predetermined position on the frame 5. The camera 11 is configured as, for example, a stereo camera having a camera 11A and a camera 11B, and the camera 11 can acquire information on depth. A depth sensor (for example, ToF (Time of Flight) or LiDAR (Light Detection and Ranging)) for acquiring depth information may also be mounted on the frame 5. Further, the frame 5 is equipped with an inward-facing camera 12 (facing the user's eyes). The camera 12 has a camera 12A for the left eye and a camera 12B for the right eye.
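As general background (not specified in this disclosure), depth from a stereo pair such as the cameras 11A and 11B is typically recovered from the disparity between matched pixels. The sketch below assumes rectified images, a known focal length in pixels, and a known baseline; all numeric values are illustrative.

```python
import numpy as np

def depth_from_disparity(disparity_px: np.ndarray,
                         focal_length_px: float,
                         baseline_m: float) -> np.ndarray:
    """Convert a disparity map (pixels) from a rectified stereo pair into depth (meters).

    Standard stereo relation: depth = f * B / disparity. Zero or negative
    disparity means the point is unmatched or at infinity, so it becomes +inf.
    """
    disparity = np.asarray(disparity_px, dtype=np.float64)
    with np.errstate(divide="ignore"):
        depth = focal_length_px * baseline_m / disparity
    depth[disparity <= 0] = np.inf
    return depth

# Example: a 60 px disparity with f = 600 px and a 6 cm baseline -> 0.6 m.
print(depth_from_disparity(np.array([[60.0]]), 600.0, 0.06))
```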
[Example of the internal configuration of the information processing device]
FIG. 2 is a diagram for explaining an example of the internal configuration of the information processing device 1 according to the present embodiment. The information processing device 1 includes, for example, a signal processing unit 21, a sensor unit 22, a calibration processing unit 23, an AR superimposition processing unit 24, and a drawing processing unit 25.
The signal processing unit 21 comprehensively controls the entire information processing device 1. The signal processing unit 21 also performs known processing on the sensing data supplied from the sensor unit 22. For example, the signal processing unit 21 detects a contact operation on each of at least three objects superimposed and displayed on a real object, based on the sensing data supplied from the sensor unit 22. The specific contents of the other processing performed by the signal processing unit 21 will be described later.
The sensor unit 22 is a general term for the sensors included in the information processing device 1. The sensor unit 22 includes the above-mentioned cameras 11 and 12. As described above, the sensor unit 22 may also include a depth sensor for acquiring depth information.
The calibration processing unit 23 performs the calibration process. The calibration process is a process of setting a parameter that converts the position at which a contact operation is performed on a real object into a position on the display, based on the detection result of the contact operation on each of at least three objects superimposed and displayed on the real object. The specific contents of the calibration process will be described later.
The AR superimposition processing unit 24 calculates and sets the position (specifically, the coordinates) at which an object is superimposed on the display unit 3. The AR superimposition processing unit 24 supplies the calculation result to the drawing processing unit 25.
The drawing processing unit 25 performs a drawing process in which a virtual object is superimposed and displayed at the position (drawing position) on the display unit 3 supplied from the AR superimposition processing unit 24. Through the processing by the drawing processing unit 25, the object is drawn (superimposed and displayed) in the real space visually recognized through the display unit 3.
The memory unit 26 is a general term for the memory included in the information processing device 1. The memory unit 26 includes, for example, a ROM (Read Only Memory) in which a program executed by the information processing device 1 is stored and a RAM (Random Access Memory) used as work memory. The memory unit 26 may be built into the information processing device 1 or may be detachable. Specific examples of the memory unit 26 include a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and a storage device combining these. An area for storing the parameters to be calibrated, which will be described later, is set in the memory unit 26.
The configuration of the information processing device 1 described above is an example, and the information processing device 1 may have other configurations. For example, the information processing device 1 may have a communication unit for communicating with other devices, a speaker for reproducing audio, a sensor for grasping its own position, and the like.
[Functions of the information processing device]
FIG. 3 is a functional block diagram for explaining an example of the functions of the information processing device 1. The functions of the information processing device 1 include, for example, a plane detection process, a pointer generation process, a drawing position calculation process, a display drawing process, a touch detection process, a touch corresponding point saving process, and a calibration process.
In the plane detection process, planes such as a wall surface or the surface of a table existing around the information processing device 1 are detected. For example, a plane on the real object is detected based on the depth of the real object on which the contact operation is performed. The plane detection process is performed based on the images obtained by the camera 11 and the output of the depth sensor. More specifically, a surface for which the difference in distance in the depth direction, detected based on the images captured by the camera 11, is within a certain range is detected as a plane. The plane does not have to be flat in a strict sense and may have irregularities within a certain range. The plane detection process yields a plane equation that defines the detected plane. Objects described later are drawn on the plane detected by the plane detection process. For the plane detection process, a recognition process other than depth detection, such as an image recognition process on a two-dimensional image captured by the camera 11, may also be used.
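As one possible realization of this step (a sketch under assumptions, not the patent's own implementation), the plane equation a·x + b·y + c·z + d = 0 can be fitted to depth points by least squares; the sampling of points, the outlier handling, and the "within a certain range" threshold below are illustrative choices.

```python
import numpy as np

def fit_plane(points_xyz: np.ndarray, max_deviation_m: float = 0.01):
    """Least-squares fit of a plane a*x + b*y + c*z + d = 0 to Nx3 depth points.

    Returns (unit normal, d, inlier_mask); points farther than max_deviation_m
    from the fitted plane are flagged as non-planar (e.g. clutter on the table).
    """
    centroid = points_xyz.mean(axis=0)
    # The plane normal is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(points_xyz - centroid)
    normal = vt[-1]
    d = -float(normal @ centroid)
    distances = np.abs(points_xyz @ normal + d)
    return normal, d, distances <= max_deviation_m

# Example: noisy depth points around the plane z = 1.0 (a table ~1 m in front of the camera).
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-0.5, 0.5, 200),
                       rng.uniform(-0.5, 0.5, 200),
                       1.0 + rng.normal(0, 0.002, 200)])
normal, d, inliers = fit_plane(pts)
print(normal, d, inliers.mean())
```

When several planes are detected, the closest one (for a unit normal, the smallest |d|, i.e. the smallest distance from the device) would be chosen as the plane on which the objects are superimposed, as described in the next paragraph.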
A plurality of planes can be detected by the plane detection process. In the present embodiment, among the plurality of detected planes, the plane located closest to the information processing device 1 (more specifically, to the user using the information processing device 1) is set as the plane on which the objects are superimposed.
In the pointer generation process, for example, three pointer coordinates are set on the plane defined by the plane equation obtained by the plane detection process. Then, a circle centered on each pointer coordinate is defined. The set pointer coordinates are used in the drawing position calculation process.
In the drawing position calculation process, an operation of converting each pointer coordinate into coordinates (u, v) on the display unit 3 is performed. Specifically, the drawing position calculation process performs a calculation based on Equation 1 below.
[Equation 1]
(u, v, 1)^T ∝ A · [R|T] · (x, y, z, 1)^T
In Equation 1, (u, v) are coordinates on the display unit 3, and (x, y, z) are coordinates in the real space, specifically the pointer coordinates. A is a matrix of internal parameters of the display unit 3 (parameters such as the angle of view and the focal length), which are known values set at the time of shipment of the information processing device 1. [R|T] in Equation 1 represents the posture (Rotation) and position (Translation) of the imaging unit (the camera 11 in the present embodiment). This [R|T] is the parameter to be calibrated. In the following description, [R|T] is referred to as the calibration parameter [R|T] as appropriate.
The calibration parameter [R|T] used in the calculation performed in the drawing position calculation process is a preset default calibration parameter. By the drawing position calculation process, three pointer coordinates (u, v) corresponding to the three pointer coordinates (x, y, z) set in the pointer generation process are obtained. In the present embodiment, since a circle is set in the pointer generation process, a calculation using an ellipse equation for converting the circle into an ellipse is performed in order to project the circle onto the plane. The calculation result of the drawing position calculation process is used for the display drawing process. Further, the calculation results (the three pointer coordinates (u, v)) of the drawing position calculation process are saved (stored) in the memory unit 26 by the touch corresponding point saving process.
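A minimal sketch of the conversion in Equation 1, assuming A is a 3x3 intrinsic matrix and [R|T] a 3x4 pose; the numeric values (focal length, principal point, the identity/zero default pose, and the pointer positions) are illustrative, not values from the disclosure. A circle around each pointer coordinate can be projected the same way, point by point, to obtain the ellipse that is drawn on the display.

```python
import numpy as np

def project_to_display(points_xyz: np.ndarray,
                       A: np.ndarray,
                       R: np.ndarray,
                       T: np.ndarray) -> np.ndarray:
    """Equation 1: map real-space points (x, y, z) to display coordinates (u, v).

    A    : 3x3 internal-parameter matrix of the display unit 3 (known at shipment).
    R, T : 3x3 rotation and 3-vector translation, i.e. the calibration
           parameter [R|T] (here the default value used before calibration).
    """
    Rt = np.hstack([R, T.reshape(3, 1)])                             # 3x4 [R|T]
    homog = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])   # Nx4 homogeneous points
    proj = (A @ Rt @ homog.T).T                                      # Nx3 homogeneous (u, v, w)
    return proj[:, :2] / proj[:, 2:3]                                # divide out the scale

# Three pointer coordinates (x, y, z) on the detected plane (illustrative values).
pointers = np.array([[-0.1, 0.0, 0.8], [0.1, 0.0, 0.8], [0.0, 0.1, 0.8]])
A_default = np.array([[800.0, 0.0, 640.0], [0.0, 800.0, 360.0], [0.0, 0.0, 1.0]])
uv = project_to_display(pointers, A_default, np.eye(3), np.zeros(3))
print(uv)  # the three drawing positions (u, v) on the display unit 3
```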
In the drawing process, an ellipse centered on the pointer coordinates (u, v) is drawn at the depth position on the plane (on the same plane) on the real object detected by the plane detection process. For example, as shown in FIG. 4, three elliptical virtual markers (an example of three objects), each centered on one of the three pointer coordinates, are drawn on a table surface, which is an example of a plane. The user recognizes the three ellipses displayed on the display unit 3. The shape of the object is not limited to an ellipse and can be any shape, such as a rectangle or a polygon. Depending on the shape of the object, the calculation using the ellipse equation described above may be omitted. Further, in order to clearly distinguish the individual objects, the color and shape of each object may differ.
After the objects are drawn by the drawing process, a notification is given to the user, prompting a contact operation of touching the vicinity of the center of each object with a finger. Such a notification may be performed by drawing characters on the display unit 3 or by reproducing audio. In response to the notification, the user performs a contact operation near the center of each object.
In the touch detection process, the contact operation performed by the user is detected. The contact operation is performed, for example, with the user's fingertip (see FIG. 4). The contact operation may also be performed using a pen or the like instead of the user's fingertip.
As an example, the depth near the center of a given object and the depth of the user's fingertip are detected, and when the difference becomes equal to or less than a predetermined value, it is determined that a contact operation has been performed on that object. It may also be determined that a contact operation has been performed when it is detected, based on the images captured by the camera 11, that the movement of the user's fingertip has remained substantially stationary for a certain period of time (for example, several seconds). The contact operation may also be detected by another known method.
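A sketch of this detection rule under assumed thresholds (the disclosure gives no numeric values): a touch is declared when the fingertip depth matches the marker-center depth within a tolerance, or when the fingertip has stayed essentially still for a few seconds.

```python
from collections import deque
import numpy as np

class TouchDetector:
    """Declares a touch when the fingertip depth reaches the marker depth, or when
    the fingertip stays nearly still for `still_seconds` (all thresholds assumed)."""

    def __init__(self, depth_tol_m=0.01, still_tol_m=0.005, still_seconds=2.0, fps=30):
        self.depth_tol_m = depth_tol_m
        self.still_tol_m = still_tol_m
        self.history = deque(maxlen=int(still_seconds * fps))  # recent fingertip positions

    def update(self, fingertip_xyz, marker_center_depth_m) -> bool:
        self.history.append(np.asarray(fingertip_xyz, dtype=float))
        # Rule 1: fingertip depth is within tolerance of the object's depth.
        if abs(self.history[-1][2] - marker_center_depth_m) <= self.depth_tol_m:
            return True
        # Rule 2: fingertip has barely moved over the whole observation window.
        if len(self.history) == self.history.maxlen:
            spread = np.ptp(np.stack(self.history), axis=0)
            if np.all(spread <= self.still_tol_m):
                return True
        return False

detector = TouchDetector()
print(detector.update([0.02, 0.05, 0.801], marker_center_depth_m=0.80))  # True: depths match
```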
By the touch detection process, a touch point (x, y, z), which is the position corresponding to the user's contact operation, is detected. In the present embodiment, since a contact operation is performed on each of the three objects, three touch points are detected by the touch detection process. The three detected touch points are saved in the memory unit 26 by the touch corresponding point saving process. That is, three pairs of the pointer coordinates (u, v) obtained by the drawing position calculation process and the touch points (x, y, z) are stored in the memory unit 26.
In the calibration process, a calculation based on Equation 1 is performed using each of the three pairs of pointer coordinates (u, v) and touch points (x, y, z) stored in the memory unit 26, whereby the calibration parameter [R|T] corresponding to the current user's usage environment is calculated. In other words, in the calibration process, the calibration parameter [R|T], which is the parameter for converting the position of the real object into a position on the display unit 3, is appropriately set based on the detection results of the contact operations described above. The set calibration parameter [R|T] is stored in the memory unit 26. In the subsequent AR processing of the information processing device 1, the calibration parameter [R|T] obtained by the calibration process is used.
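The disclosure does not spell out the solver, so the following is only one hedged way to realize this step: treat [R|T] as six unknowns (a rotation vector plus a translation) and minimize the reprojection error of Equation 1 over the stored (u, v) / (x, y, z) pairs, starting from the default parameter. With exactly three pairs this is the minimal configuration mentioned below; more pairs simply over-determine the fit. The plane constraint noted elsewhere in the disclosure (the touch points lying on the detected plane) could further simplify or stabilize the calculation, but is not used in this generic sketch.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reprojection_residual(params, A, touch_xyz, pointer_uv):
    """Residual of Equation 1 for a candidate [R|T] packed as (rotation vector, T)."""
    R = Rotation.from_rotvec(params[:3]).as_matrix()
    T = params[3:]
    proj = (A @ (touch_xyz @ R.T + T).T).T          # Nx3 homogeneous (u, v, w)
    uv = proj[:, :2] / proj[:, 2:3]
    return (uv - pointer_uv).ravel()

def calibrate_rt(A, touch_xyz, pointer_uv, rt0=np.zeros(6)):
    """Estimate the calibration parameter [R|T] from at least three correspondences."""
    result = least_squares(reprojection_residual, rt0, args=(A, touch_xyz, pointer_uv))
    R = Rotation.from_rotvec(result.x[:3]).as_matrix()
    return R, result.x[3:]

# Illustrative data: default A, a small "true" pose offset, and the three stored pairs.
A = np.array([[800.0, 0.0, 640.0], [0.0, 800.0, 360.0], [0.0, 0.0, 1.0]])
touch_xyz = np.array([[-0.1, 0.0, 0.8], [0.1, 0.0, 0.8], [0.0, 0.1, 0.8]])
true_rt = np.array([0.0, 0.02, 0.0, 0.005, 0.0, 0.0])  # slight rotation and shift
pointer_uv = reprojection_residual(true_rt, A, touch_xyz, np.zeros((3, 2))).reshape(3, 2)
R_est, T_est = calibrate_rt(A, touch_xyz, pointer_uv)
print(R_est, T_est)  # recovers the pose corresponding to the user's viewing geometry
```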
In the present embodiment, in order to correct the relative relationship (rotation and translation) between the camera and the display unit 3 (display), at least three correspondences (three pairs) between coordinates in the real space and coordinates on the display unit 3 are required; therefore, three objects are drawn on the plane (at the depth positions on the plane). Of course, the number of objects drawn on the plane may be four or more instead of three.
In the present embodiment, among the plurality of functions described above, the plane detection process, the pointer generation process, the touch detection process, and the touch corresponding point saving process are performed by the signal processing unit 21. Therefore, in the present embodiment, the signal processing unit 21 functions as a contact operation detection unit that detects a contact operation and as a plane detection unit that detects a plane on a real object. The calibration process is performed by the calibration processing unit 23. The drawing position calculation process is performed by the AR superimposition processing unit 24; therefore, the AR superimposition processing unit 24 functions as a drawing position calculation processing unit. The drawing process is performed by the drawing processing unit 25.
[Calibration process flow]
 Next, the flow of the calibration process performed by the information processing device 1 will be described with reference to the flowchart of FIG. 5.
 In step ST11, the calibration process is started. The calibration process is performed, for example, when use of the information processing device 1 begins. Specifically, a pressure sensor or the like detects that the frame 5 of the information processing device 1 has been placed over the user's ears, and when this is detected, the calibration process described below is performed. The calibration process may also be performed when a predetermined operation (for example, an operation of turning on the power) is performed on the information processing device 1, or when the inner camera 12 detects that the deviation in the relative relationship between the display unit 3 and the eye (for example, the pupil) has become equal to or greater than a predetermined amount. Furthermore, since the wearing position of the information processing device 1 may change during use, the calibration process may also be performed periodically after use has started. The process then proceeds to step ST12.
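 Purely as an illustration of how these triggers might be combined, the sketch below wraps them in a single predicate; every accessor and the threshold name are hypothetical placeholders, not part of this document.

def should_run_calibration(device):
    # All accessors below stand in for the sensors and events described above.
    if device.frame_just_worn():            # pressure sensor: frame placed over the ears
        return True
    if device.power_just_turned_on():       # predetermined operation such as power-on
        return True
    if device.eye_display_offset() >= device.offset_threshold:
        return True                         # inner camera 12: display/pupil deviation too large
    return device.periodic_timer_expired()  # periodic re-calibration during use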
 In step ST12, the user is instructed to approach a flat surface such as a table or a wall. The instruction is given by displaying text or by playing back audio. The process then proceeds to step ST13.
 In step ST13, a plane is detected by the plane detection process. For example, the user performs a contact operation, the depth of the contacted position on a real object such as a table is detected, and the plane on the real object is detected based on the detected depth. Alternatively, the user may be instructed in advance to approach a plane already detected by the plane detection process. The process then proceeds to step ST14.
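 As an illustrative sketch of this plane detection step, a plane can be fitted by least squares to the depth points sampled around the contacted position; the function and variable names here are assumptions introduced for illustration.

import numpy as np

def fit_plane(points_xyz):
    # points_xyz: (N, 3) depth points sampled around the contacted position, N >= 3
    centroid = points_xyz.mean(axis=0)
    # The singular vector for the smallest singular value of the centered points is the plane normal.
    _, _, vt = np.linalg.svd(points_xyz - centroid)
    normal = vt[-1]
    d = -normal @ centroid                   # plane equation: normal . p + d = 0
    return normal, d

def on_plane(point, normal, d, tol=1e-2):
    # Distance test that can be used to keep drawn objects and touch points on the detected plane.
    return abs(normal @ point + d) < tol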
 In step ST14, the pointer generation process sets a circle centered on each of three pointer coordinates (x, y, z). The drawing position calculation process then applies the default calibration parameters to each of the three pointer coordinates (x, y, z) to obtain three sets of pointer coordinates (u, v). Three elliptical objects, each centered on one of the three pointer coordinates (u, v), are drawn at the depth position of the plane detected by the plane detection process, and the user is notified to perform a contact operation near the center of each drawn object. The process then proceeds to step ST15.
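 Continuing the earlier sketch, placing the three markers amounts to projecting the chosen on-plane pointer coordinates through the default parameters; project_point, the default parameter values, and the drawing callback are all assumed for illustration.

import numpy as np

def project_point(K, R, T, xyz):
    # Same (assumed) projection model as in the calibration sketch above.
    p = K @ (R @ np.asarray(xyz) + T)
    return p[:2] / p[2]                      # (u, v) on the display

def place_calibration_markers(K, R_default, T_default, pointer_xyz, draw_ellipse):
    # pointer_xyz: three (x, y, z) points chosen on the detected plane
    for xyz in pointer_xyz:
        u, v = project_point(K, R_default, T_default, xyz)
        draw_ellipse(u, v)                   # draw the elliptical object at the default position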
 In step ST15, the touch point corresponding to the user's contact operation is detected by the touch point detection process. The process then proceeds to step ST16.
 In step ST16, it is determined whether the number of pairs of pointer coordinates (u, v) and touch points (x, y, z) is sufficient. In this example, three objects are drawn, so three touch points (x, y, z) are to be detected. Accordingly, in step ST16 it is determined whether three or more pairs of pointer coordinates (u, v) and touch points (x, y, z) have been obtained. This determination is made, for example, by the signal processing unit 21.
 If the determination result in step ST16 is No, the process returns to step ST15 and the touch point detection process is performed again. At this time, objects that have not yet been touched may be highlighted, or objects that have already been touched may be erased, so that the user can easily recognize which objects still require a contact operation.
 If the determination result in step ST16 is Yes, the process proceeds to step ST17. In step ST17, the calibration process is performed using the correspondences between the three pairs of pointer coordinates (u, v) and touch points (x, y, z), whereby the calibration parameter [R|T] is set. The set calibration parameter [R|T] is saved in the memory unit 26 and is used until it is next updated. The process then proceeds to step ST18, and the calibration process ends.
[Effects obtained by this embodiment]
 According to the present embodiment, the calibration parameters can be set appropriately. In addition, there is no need to prepare a specific object or the like dedicated to the calibration process.
 Furthermore, since the user only needs to perform contact operations on the objects drawn on the plane, the calibration process can be carried out easily without placing an excessive burden on the user. Because the user obtains a tactile sensation by touching a real object, the touch points are more stable than they would be if the user had to operate on a position in space where no real object exists. Since the objects are displayed superimposed on a real object whose position is easy for the user to identify, the three-dimensional positions in real space are detected stably, which improves the accuracy of the calibration process.
 Moreover, when the objects are drawn on the same plane of a surrounding real object, the constraint that the touch points lie on that plane can be exploited, which simplifies the calculations involved in the calibration process.
 In addition, the calibration process can be performed wherever a flat surface is available, which largely removes restrictions on where it can be carried out.
<Modification examples>
 Although one embodiment of the present disclosure has been described above in detail, the content of the present disclosure is not limited to that embodiment, and various modifications based on the technical idea of the present disclosure are possible. Modification examples are described below.
 The information processing device 1 is not limited to the eyeglass type, and may be a head-mounted HUD (Head-Up Display), a helmet-type display device, or a vehicle windshield (head-up display) type display device. Furthermore, it does not necessarily have to be worn on the human body and may be a portable device.
 Part of the functions of the information processing device 1 according to the embodiment described above may be performed by a device on the cloud. In that case, the device on the cloud can serve as the information processing device.
 The present disclosure can also be realized as a device, a method, a program, a system, and the like. For example, a program that implements the functions described in the embodiment above may be made available for download, and a device that does not have those functions can download and install the program to perform the control described in the embodiment. The present disclosure can also be realized as a server that distributes such a program. The matters described in the embodiment and in the modification examples can be combined as appropriate.
 The content of the present disclosure is not to be construed as being limited by the effects exemplified in this disclosure.
The present disclosure may also adopt the following configurations.
(1)
 An information processing device including:
 a calibration processing unit that performs a calibration process of setting, based on detection results of contact operations on each of at least three objects displayed superimposed on a real object, a parameter for converting a position on the real object at which a contact operation was performed into a position on a display that displays the objects.
(2)
 The information processing device according to (1), further including a contact operation detection unit that detects the position on the real object at which the contact operation was performed.
(3)
 The information processing device according to (2), in which the contact operation detection unit detects the depth of the position on the real object at which the contact operation was performed.
(4)
 The information processing device according to (3), further including a plane detection unit that detects a plane on the real object based on the depth of the real object on which the contact operation is performed, in which the three objects are displayed superimposed at the depth position of the plane on the real object.
(5)
 The information processing device according to (4), in which the three objects are displayed superimposed on the same detected plane on the real object.
(6)
 The information processing device according to any one of (1) to (5), in which the parameter includes the posture and the position of a photographing unit that photographs the real object.
(7)
 The information processing device according to (6), further including the photographing unit.
(8)
 The information processing device according to (4) or (5), in which the plane detection unit sets, among a plurality of detected planes, the plane located nearest as the plane on which the objects are displayed superimposed.
(9)
 The information processing device according to any one of (1) to (8), in which the calibration processing unit performs the calibration process when use of the information processing device starts.
(10)
 The information processing device according to any one of (1) to (8), in which the calibration processing unit performs the calibration process when the deviation between the positions of the display and the eye is equal to or greater than a predetermined amount.
(11)
 The information processing device according to any one of (1) to (10), further including a drawing position calculation processing unit that calculates the respective drawing positions of the three objects on the display.
(12)
 The information processing device according to (11), further including a drawing processing unit that performs a process of displaying the objects at the drawing positions set by the drawing position calculation processing unit.
(13)
 An information processing method in which a calibration processing unit performs a calibration process of setting, based on detection results of contact operations on each of at least three objects displayed superimposed on a real object, a parameter for converting a position on the real object at which a contact operation was performed into a position on a display that displays the objects.
(14)
 A program that causes a computer to execute an information processing method in which a calibration processing unit performs a calibration process of setting, based on detection results of contact operations on each of at least three objects displayed superimposed on a real object, a parameter for converting a position on the real object at which a contact operation was performed into a position on a display that displays the objects.
1 ... Information processing device
3 ... Display unit
11 ... Camera
21 ... Signal processing unit
22 ... Sensor unit
23 ... Calibration processing unit
24 ... AR superimposition processing unit
25 ... Drawing processing unit

Claims (14)

  1.  An information processing device comprising:
      a calibration processing unit that performs a calibration process of setting, based on detection results of contact operations on each of at least three objects displayed superimposed on a real object, a parameter for converting a position on the real object at which a contact operation was performed into a position on a display that displays the objects.
  2.  The information processing device according to claim 1, further comprising a contact operation detection unit that detects the position on the real object at which the contact operation was performed.
  3.  The information processing device according to claim 2, wherein the contact operation detection unit detects the depth of the position on the real object at which the contact operation was performed.
  4.  The information processing device according to claim 3, further comprising a plane detection unit that detects a plane on the real object based on the depth of the real object on which the contact operation is performed,
      wherein the three objects are displayed superimposed at the depth position of the plane on the real object.
  5.  The information processing device according to claim 4, wherein the three objects are displayed superimposed on the same detected plane on the real object.
  6.  The information processing device according to claim 1, wherein the parameter includes the posture and the position of a photographing unit that photographs the real object.
  7.  The information processing device according to claim 6, further comprising the photographing unit.
  8.  The information processing device according to claim 4, wherein the plane detection unit sets, among a plurality of detected planes, the plane located nearest as the plane on which the objects are displayed superimposed.
  9.  The information processing device according to claim 1, wherein the calibration processing unit performs the calibration process when use of the information processing device starts.
  10.  The information processing device according to claim 1, wherein the calibration processing unit performs the calibration process when the deviation between the positions of the display and the eye is equal to or greater than a predetermined amount.
  11.  The information processing device according to claim 1, further comprising a drawing position calculation processing unit that calculates the respective drawing positions of the three objects on the display.
  12.  The information processing device according to claim 11, further comprising a drawing processing unit that performs a process of displaying the objects at the drawing positions set by the drawing position calculation processing unit.
  13.  An information processing method comprising: performing, by a calibration processing unit, a calibration process of setting, based on detection results of contact operations on each of at least three objects displayed superimposed on a real object, a parameter for converting a position on the real object at which a contact operation was performed into a position on a display that displays the objects.
  14.  A program that causes a computer to execute an information processing method comprising: performing, by a calibration processing unit, a calibration process of setting, based on detection results of contact operations on each of at least three objects displayed superimposed on a real object, a parameter for converting a position on the real object at which a contact operation was performed into a position on a display that displays the objects.
PCT/JP2020/027831 2019-10-15 2020-07-17 Information processing device, information processing method, and program WO2021075113A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019188845A JP2021063922A (en) 2019-10-15 2019-10-15 Information processing device, information processing method, and program
JP2019-188845 2019-10-15

Publications (1)

Publication Number Publication Date
WO2021075113A1 true WO2021075113A1 (en) 2021-04-22

Family

ID=75486206

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/027831 WO2021075113A1 (en) 2019-10-15 2020-07-17 Information processing device, information processing method, and program

Country Status (2)

Country Link
JP (1) JP2021063922A (en)
WO (1) WO2021075113A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003269913A (en) * 2002-03-19 2003-09-25 Canon Inc Device and method for calibrating sensor, program, and storage medium
JP2010102215A (en) * 2008-10-27 2010-05-06 Sony Computer Entertainment Inc Display device, image processing method and computer program
US20120035934A1 (en) * 2010-08-06 2012-02-09 Dynavox Systems Llc Speech generation device with a projected display and optical inputs
US20140078176A1 (en) * 2012-09-14 2014-03-20 Lg Electronics Inc. Apparatus and method of providing user interface on head mounted display and head mounted display thereof
JP2016018213A (en) * 2014-07-10 2016-02-01 セイコーエプソン株式会社 Hmd calibration with direct geometric modeling
US20160080732A1 (en) * 2014-09-17 2016-03-17 Qualcomm Incorporated Optical see-through display calibration
JP2016218547A (en) * 2015-05-15 2016-12-22 セイコーエプソン株式会社 Head mounted display device, method for controlling the same and computer program

Also Published As

Publication number Publication date
JP2021063922A (en) 2021-04-22

Similar Documents

Publication Publication Date Title
US11507336B2 (en) Augmented reality display sharing
CN107111370B (en) Virtual representation of real world objects
US10942024B2 (en) Information processing apparatus, information processing method, and recording medium
JP6622395B2 (en) Method and apparatus for adjusting virtual reality images
US10269139B2 (en) Computer program, head-mounted display device, and calibration method
US10623721B2 (en) Methods and systems for multiple access to a single hardware data stream
US10037614B2 (en) Minimizing variations in camera height to estimate distance to objects
US11755122B2 (en) Hand gesture-based emojis
JP6456347B2 (en) INSITU generation of plane-specific feature targets
US10277814B2 (en) Display control method and system for executing the display control method
US11189054B2 (en) Localization and mapping using images from multiple devices
US20210110562A1 (en) Planar surface detection
JP2017138973A (en) Method and program for providing virtual space
WO2019142560A1 (en) Information processing device for guiding gaze
US9600938B1 (en) 3D augmented reality with comfortable 3D viewing
KR102310994B1 (en) Computing apparatus and method for providing 3-dimensional interaction
US20200211275A1 (en) Information processing device, information processing method, and recording medium
JP6446465B2 (en) I / O device, I / O program, and I / O method
WO2021075113A1 (en) Information processing device, information processing method, and program
JP2018109940A (en) Information processing method and program for causing computer to execute the same
WO2021166717A1 (en) Display control device, display control method, and recording medium
US20240005536A1 (en) Perspective Correction of User Input Objects
JP6205047B1 (en) Information processing method and program for causing computer to execute information processing method
CN117642775A (en) Information processing apparatus for determining holding of object
JP2018097477A (en) Display control method and program making display control method thereof executed by computer

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20875787

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20875787

Country of ref document: EP

Kind code of ref document: A1