WO2021235193A1 - Information processing system, information processing method, and program - Google Patents

Information processing system, information processing method, and program

Info

Publication number
WO2021235193A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
content
local coordinate
glass
information processing
Prior art date
Application number
PCT/JP2021/016735
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
保乃花 尾崎
健太郎 井田
拓也 池田
Original Assignee
ソニーグループ株式会社 (Sony Group Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社 (Sony Group Corporation)
Priority to JP2022524352A
Publication of WO2021235193A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • The present invention relates to an information processing system, an information processing method, and a program.
  • AR (Augmented Reality) technology, which augments the real environment with a computer, is attracting attention.
  • AR technology is realized by devices such as AR glasses, head-mounted displays, and projectors that superimpose and display content in real space.
  • the position of the content superimposed on the real space changes depending on the position and posture of the device. If an error occurs in the detection result of the sensor that detects the position and orientation of the device, the coordinate system (origin position and coordinate axis positions) set for the device will deviate. When a plurality of devices share one AR space, an error occurs in the relative positions between the coordinate systems set for the devices, and the positions of the content displayed on the devices shift. It is therefore desirable to correct the coordinate system of each device to align the position of the content. However, if the position of the content is corrected while the content is being operated, the operating subject of the content may feel uncomfortable.
  • according to the present disclosure, there is provided an information processing system having: an error information detection unit that detects information on an error occurring in the relative positions of a plurality of local coordinate systems; and a correction unit that, in response to the start of an operation on content, corrects, based on the error information, the relative positions of all other local coordinate systems with respect to the specific local coordinate system, among the plurality of local coordinate systems, used to generate the content that the operating subject of the content recognizes as the operation target.
  • Further, according to the present disclosure, an information processing method in which the information processing of the information processing system is executed by a computer, and a program for realizing the information processing of the information processing system on a computer, are provided.
  • FIG. 1 is a schematic diagram of the information processing system 1 of the first embodiment.
  • the information processing system 1 includes, for example, a processing device 10, a plurality of devices 20, and a storage device 30.
  • the processing device 10 generates the content CT displayed by the plurality of devices 20.
  • the content CT is superimposed and displayed on the real space RS by the device 20.
  • the plurality of devices 20 include, for example, an AR glass 20A, an AR glass 20B, and a projector 20C.
  • the AR glass 20A and the AR glass 20B are attached to the heads of the users U, and the content CT is superimposed and displayed on the real space RS in the field of view of each user U.
  • the projector 20C is fixed at a predetermined position in the real space RS and projects the content CT onto the screen SCR provided in the real space RS.
  • the screen SCR is, for example, a table.
  • FIG. 2 is a diagram showing an example of a coordinate system set in the real space RS.
  • a local coordinate system LC is set for each of the plurality of devices 20.
  • the local coordinate system LC has an origin and coordinate axes defined for each device 20.
  • the origin position and the position of the coordinate axis of the local coordinate system LC are calculated based on information such as the internal parameters for each device 20 included in the registration data 32, the installation position, and the posture at the installation position.
  • the processing device 10 calculates the coordinates of the content CT displayed on the device 20 by using the local coordinate system LC set on the device 20.
  • the content CT displayed by the AR glass 20A and the AR glass 20B will be referred to as a glass content GC.
  • the content CT displayed by the projector 20C is referred to as PJ content PJC.
  • the local coordinate system LC set in the AR glass 20A is referred to as a glass coordinate system LCA.
  • the local coordinate system LC set in the AR glass 20B is referred to as a glass coordinate system LCB.
  • the local coordinate system LC set in the projector 20C is referred to as a PJ coordinate system LCC.
  • the reference numeral WC indicates a world coordinate system.
  • the world coordinate system WC is a coordinate system that defines the entire real space RS.
  • the world coordinate system WC is defined, for example, based on the results of a pre-performed real-space RS scan.
  • the PJ coordinate system LCC is defined based on the detection result of a sensor fixed at a predetermined position in the real space RS.
  • the glass coordinate system LCA is estimated from the detection result of the sensor mounted on the AR glass 20A.
  • the glass coordinate system LCB is estimated from the detection result of the sensor mounted on the AR glass 20B.
  • the world coordinate system WC is the coordinate system that best conforms to the real world. Therefore, the correction of the positional deviation generated in the content CT of each device 20 may be performed with reference to the world coordinate system WC.
  • in the present embodiment, however, the world coordinate system WC is not used; the PJ coordinate system LCC is used as the reference local coordinate system, and a correction method that matches the relative positions of the glass coordinate system LCA and the glass coordinate system LCB to it is used. This is because the sensor of the projector 20C, installed at a fixed position, is considered to have a smaller detection error over long-term use than the sensors of the AR glass 20A and the AR glass 20B, which move with the user U.
  • chess is played by the first user U1 wearing the AR glass 20A and the second user U2 wearing the AR glass 20B.
  • the AR glass 20A and the AR glass 20B display a virtual object representing a chess piece as a glass content GC.
  • the projector 20C displays a virtual object representing a chess board as PJ content PJC.
  • the glass content GC is displayed according to the squares of the PJ content PJC.
  • Each device 20 has one or more sensors. Each device 20 outputs the sensor data detected by one or more sensors to the processing device 10.
  • the AR glass 20A and the AR glass 20B have a plurality of sensors including a depth sensor, an accelerometer, a microphone and a camera.
  • the sensor data output from the AR glasses 20A and the AR glasses 20B is used, for example, for recognizing the real space RS, estimating the position in the real space RS, and detecting the gesture of the user U.
  • the projector 20C has a plurality of sensors including a depth sensor, a motion sensor, a microphone and a camera.
  • the sensor data output from the projector 20C is used, for example, to detect the position of the user U in the real space RS.
  • the processing device 10 has, for example, a glass sensor data acquisition unit 11, a PJ sensor data acquisition unit 12, a space recognition unit 13, a gesture detection unit 14, an interaction detection unit 15, an error information detection unit 16, a correction unit 17, a content generation unit 18, and a display control unit 19.
  • the glass sensor data acquisition unit 11 acquires, for example, the sensor data output from the AR glass 20A and the AR glass 20B.
  • the glass sensor data acquisition unit 11 outputs, for example, the acquired sensor data to the space recognition unit 13 and the gesture detection unit 14.
  • the PJ sensor data acquisition unit 12 acquires, for example, the sensor data output from the projector 20C.
  • the PJ sensor data acquisition unit 12 outputs, for example, the acquired sensor data to the interaction detection unit 15.
  • the space recognition unit 13, based on the sensor data acquired from the glass sensor data acquisition unit 11, for example, recognizes the real space RS around the AR glass 20A and the AR glass 20B, estimates the positions of the AR glass 20A and the AR glass 20B in the real space RS, and detects the real objects around the AR glass 20A and the AR glass 20B.
  • Recognition of the real space RS and estimation of the position in the real space RS are performed, for example, by using a technique called SLAM (Simultaneous Localization and Mapping).
  • the space recognition unit 13 outputs, for example, information on the glass coordinate system (information on the position of the origin and the position of the coordinate axis) generated using SLAM and information on the environment map to the error information detection unit 16.
  • the gesture detection unit 14 detects the gestures of the first user U1 and the second user U2 based on the sensor data acquired from the glass sensor data acquisition unit 11, for example. For example, the gesture detection unit 14 detects the gesture based on the movement of the operation part OP (the part where the operation subject interacts with the content CT) of the user U who is the operation subject of the content CT.
  • the operation site OP is a site where the position or form of the content CT can be changed by the movement.
  • the operation site OP is the hand or finger of the user U in contact with the content CT.
  • alternatively, the operation site OP is the hand or finger of the user U separated from the content CT by a predetermined distance.
  • the gesture detection unit 14 identifies the user U who operates the content CT, for example, based on the sensor data acquired from the glass sensor data acquisition unit 11.
  • the gesture detection unit 14 outputs, for example, information about the operation subject of the content CT and the gesture for the content CT performed by the operation subject to the content generation unit 18.
  • the gesture detection unit 14 detects, for example, the coordinates of the operation site OP in the glass coordinate system LCA and the glass coordinate system LCB, respectively, based on the sensor data acquired from the glass sensor data acquisition unit 11.
  • the gesture detection unit 14 outputs the coordinate information of the operation site OP detected for each glass coordinate system to the error information detection unit 16.
  • the gesture detection unit 14 outputs a detection error signal to the error information detection unit 16, for example, when the operation subject of the content CT cannot be detected or when the coordinates of the operation site OP cannot be detected.
  • the interaction detection unit 15 detects the coordinates of the operation site OP in the PJ coordinate system LCC, for example, based on the sensor data acquired from the PJ sensor data acquisition unit 12.
  • the interaction detection unit 15 outputs, for example, the coordinate information of the operation site OP in the PJ coordinate system LCC to the error information detection unit 16.
  • the interaction detection unit 15 determines, for example, the accuracy of the sensor data acquired from the PJ sensor data acquisition unit 12.
  • the accuracy of the sensor data is quantified according to, for example, the distance between the sensor and the user U, the size of the operation site OP superimposed on the content CT, and the like. For example, when the user U is located at a position far away from the sensor of the projector 20C, or when occlusion occurs with respect to the sensor of the projector 20C, the accuracy of the calculated sensor data is low. For example, when the accuracy of the sensor data is equal to or less than the threshold value, the interaction detection unit 15 outputs a detection error signal to the error information detection unit 16.
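  • As a rough illustration of this accuracy gating, the following Python sketch scores PJ sensor data and emits a detection error signal below a threshold. The scoring function, parameter names, and threshold are assumptions for illustration only; the publication gives no concrete formula.

```python
# Hypothetical accuracy gate for PJ sensor data. The distance weighting,
# occlusion term, and threshold are illustrative assumptions only.

def sensor_accuracy(distance_m: float, visible_fraction: float) -> float:
    """Score in [0, 1]; nearer users and less occlusion score higher."""
    distance_score = max(0.0, 1.0 - distance_m / 5.0)  # degrades over ~5 m
    return distance_score * visible_fraction           # occlusion penalty

def gate_interaction(distance_m: float, visible_fraction: float,
                     threshold: float = 0.3) -> str:
    """Return 'detection_error' when accuracy is at or below the threshold."""
    if sensor_accuracy(distance_m, visible_fraction) <= threshold:
        return "detection_error"   # correction unit 17 then skips correction
    return "ok"

print(gate_interaction(1.0, 0.9))   # ok
print(gate_interaction(4.8, 0.5))   # detection_error (far away / occluded)
```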
  • the error information detection unit 16 detects information on errors that occur at relative positions of a plurality of local coordinate systems LC.
  • the error information detection unit 16 detects, for example, the deviation generated at the relative positions of the origin and the coordinate axes of the plurality of local coordinate systems LC as error information.
  • the error information detection unit 16 detects error information based on the coordinates of the operation site OP detected for each local coordinate system LC, for example. For example, as an initial setting, it is assumed that the origin positions and coordinate axis positions of all the local coordinate system LCs are calibrated so as to match. In the initial state immediately after calibration, the coordinates of the operation site OP in the PJ coordinate system LCC and the coordinates of the operation site OP in the glass coordinate system LCA match. After that, it is assumed that the coordinates of the operation site OP in the PJ coordinate system LCC become (30, 40, 50) and the coordinates of the operation site OP in the glass coordinate system LCA become (25, 45, 55) due to long-term use. .. In this case, the error generated in the relative position between the PJ coordinate system LCC and the glass coordinate system LCA is calculated as (-5, 5, 5).
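  • The numerical example above reduces to a vector difference. A minimal sketch of this detection step, assuming a translation-only error and the coordinate values quoted above:

```python
import numpy as np

# The same operation site OP observed in two local coordinate systems.
op_in_lcc = np.array([30.0, 40.0, 50.0])   # PJ coordinate system LCC
op_in_lca = np.array([25.0, 45.0, 55.0])   # glass coordinate system LCA

# Error in the relative position between LCC and LCA (translation only).
error = op_in_lca - op_in_lcc
print(error)   # [-5.  5.  5.], matching the example above
```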
  • the error information detection unit 16 outputs, for example, the error information to the correction unit 17. For example, when the error information detection unit 16 acquires a detection error signal from the gesture detection unit 14 or the interaction detection unit 15, it outputs a correction error signal to the correction unit 17.
  • the correction unit 17 corrects, for example, the relative positions of all other local coordinate system LCs with respect to the specific local coordinate system LC based on the error information in response to the start of the operation of the content CT. For example, when the correction unit 17 acquires a correction error signal from the error information detection unit 16, the correction unit 17 does not correct the local coordinate system LC.
  • the specific local coordinate system LC is the local coordinate system LC used to generate the content CT that the operating subject of the content CT recognizes as an operation target.
  • the origin position and the position of the coordinate axis of the local coordinate system LC other than the specific local coordinate system LC are moved to the position where the error of the relative position with the specific local coordinate system LC becomes small. It is preferred that the origins and axes of the other local coordinate system LCs move to positions where the relative position errors that occur with the particular local coordinate system LC are offset.
  • FIG. 3 is a diagram showing the arrangement of each local coordinate system LC after a predetermined time has elapsed from the initial state.
  • the dotted line shows the arrangement of the local coordinate system LC in the initial state.
  • the solid line shows the arrangement of the local coordinate system LC after the lapse of a predetermined time.
  • the glass coordinate system LCA is in a state of being rotated by the rotation amount RT in the first direction D1 as compared with the initial state.
  • the glass coordinate system LCB is in a state of being translated by a distance PM in the second direction D2 as compared with the initial state.
  • the position of the PJ coordinate system LCC has not changed.
  • FIG. 4 shows an example in which the first user U1 performs an operation of moving a piece to the glass content GC as the first case.
  • the operating subject is the first user U1.
  • the content CT recognized by the operation subject as the operation target is the glass content GC generated by using the glass coordinate system LCA.
  • the positions of the glass coordinate system LCB and the PJ coordinate system LCC are corrected with reference to the glass coordinate system LCA. For example, the origin position and the position of the coordinate axis of the glass coordinate system LCB and the PJ coordinate system LCC move to a position where the error of the relative position with the glass coordinate system LCA becomes small.
  • the glass coordinate system LCB is corrected by rotating in the first direction D1 by the rotation amount RT and translating in the third direction D3 opposite to the second direction D2 by the distance PM.
  • a correction is performed to rotate the PJ coordinate system LCC by the rotation amount RT in the first direction D1.
  • FIG. 5 shows, as a second case, a correction method when the second user U2 performs an operation of moving a piece with respect to the glass content GC.
  • the operating subject is the second user U2.
  • the content CT recognized by the operation subject as the operation target is the glass content GC generated by using the glass coordinate system LCB.
  • the positions of the glass coordinate system LCA and the PJ coordinate system LCC are corrected with reference to the glass coordinate system LCB. For example, the origin position and the position of the coordinate axis of the glass coordinate system LCA and the PJ coordinate system LCC move to a position where the error of the relative position with the glass coordinate system LCB becomes small.
  • the glass coordinate system LCA is corrected by rotating in the fourth direction D4 opposite to the first direction D1 by the rotation amount RT and translating in the second direction D2 by the distance PM.
  • correction is performed to translate the PJ coordinate system LCC by the distance PM in the second direction D2.
  • FIG. 6 shows, as a third case, a correction method when the PJ content PJC includes a design change menu for changing the design of the board and the first user U1 operates the design change menu.
  • the operating subject is the first user U1.
  • the content CT recognized by the operation subject as the operation target is the PJ content PJC generated by using the PJ coordinate system LCC.
  • when the first user U1 starts the operation of the design change menu, the positions of the glass coordinate system LCA and the glass coordinate system LCB are corrected with reference to the PJ coordinate system LCC.
  • the origin position and the position of the coordinate axis of the glass coordinate system LCA and the glass coordinate system LCB move to a position where the error of the relative position with the PJ coordinate system LCC becomes small.
  • the glass coordinate system LCA is corrected to rotate by the rotation amount RT in the fourth direction D4.
  • correction is performed to translate the glass coordinate system LCB by the distance PM in the third direction D3.
  • the correction unit 17 ends the correction based on the specific local coordinate system LC in response to the completion of the operation of the content CT. Then, the correction unit 17 switches the local coordinate system LC that is the reference of the correction from the specific local coordinate system LC to the reference local coordinate system (PJ coordinate system LCC). The correction unit 17 corrects the relative positions of all other local coordinate system LCs with respect to the reference local coordinate system based on the error information.
  • the reference local coordinate system is a local coordinate system LC in which an error generated in a position relative to the world coordinate system WC is smaller than that of a specific local coordinate system LC.
  • in the glass coordinate system LCA and the glass coordinate system LCB, the origin position and the position of the coordinate axis are more likely to shift due to changes in the environment or long-term use than in the PJ coordinate system LCC. Therefore, the PJ coordinate system LCC is used as the reference local coordinate system.
  • the origin position and the position of the coordinate axis of the local coordinate system LC other than the reference local coordinate system move to the position where the error of the relative position with the reference local coordinate system becomes small. It is preferable that the origin position and the position of the coordinate axis of the other local coordinate system LC are moved to a position where the error of the relative position generated with the reference local coordinate system is offset.
  • in the first case, when the operation is completed, the correction based on the glass coordinate system LCA ends, and the local coordinate system LC serving as the reference for correction is switched to the reference local coordinate system (the PJ coordinate system LCC).
  • the positions of the glass coordinate system LCA and the glass coordinate system LCB are corrected with reference to the PJ coordinate system LCC.
  • the origin position and the position of the coordinate axis of the glass coordinate system LCA and the glass coordinate system LCB move to a position where the error of the relative position with the PJ coordinate system LCC becomes small.
  • the position of the PJ coordinate system LCC has not changed from the initial state. Therefore, the glass coordinate system LCA and the glass coordinate system LCB are returned to the positions in the initial state shown by the dotted line by the correction based on the PJ coordinate system LCC.
  • the same applies to the second case described above.
  • in the third case, the local coordinate system LC used as the reference for correction when the first user U1 starts the operation of the PJ content PJC is the PJ coordinate system LCC (the reference local coordinate system). Therefore, even when the first user U1 finishes the operation of the PJ content PJC, the correction based on the PJ coordinate system LCC is maintained.
  • the correction unit 17 generates, for example, a first coordinate conversion formula that performs coordinate conversion between the glass coordinate system LCA and the PJ coordinate system LCC.
  • the correction unit 17 generates, for example, a second coordinate conversion formula that performs coordinate conversion between the glass coordinate system LCB and the PJ coordinate system LCC.
  • the correction unit 17 outputs the first coordinate conversion formula and the second coordinate conversion formula to the content generation unit 18.
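  • The first and second coordinate conversion formulas can be modeled as 4x4 homogeneous transforms; a sketch under that assumption (the rotation, translation, and naming are illustrative, not taken from the publication):

```python
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def convert(T: np.ndarray, point: np.ndarray) -> np.ndarray:
    """Map a 3D point through the transform T."""
    return (T @ np.append(point, 1.0))[:3]

# Illustrative "first coordinate conversion formula": glass coordinate
# system LCA -> PJ coordinate system LCC, here a pure translation.
T_lca_to_lcc = make_transform(np.eye(3), np.array([-5.0, 5.0, 5.0]))
T_lcc_to_lca = np.linalg.inv(T_lca_to_lcc)

# Content coordinates set in LCC, re-expressed for display on the AR glass.
content_in_lcc = np.array([10.0, 0.0, 20.0])
print(convert(T_lcc_to_lca, content_in_lcc))   # [15. -5. 15.]
```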
  • the content generation unit 18 generates, for example, a content CT that each device 20 superimposes on the real space RS and displays it, and outputs the content CT to the display control unit 19.
  • the content generation unit 18 generates, for example, a content CT corresponding to the gesture detected by the gesture detection unit 14.
  • the content generation unit 18 sets the coordinates of the content CT using, for example, the PJ coordinate system LCC in which the origin position and the position of the coordinate axis are least likely to deviate.
  • the content generation unit 18 calculates the coordinates of the content CT in the glass coordinate system LCA and the glass coordinate system LCB by using the first coordinate conversion formula and the second coordinate conversion formula.
  • the display control unit 19 outputs, for example, the content CT generated by the content generation unit 18 to each device 20.
  • the display control unit 19 adds a correction process to the content CT generated by the content generation unit 18 as necessary.
  • the projector 20C projects the content CT onto the screen SCR. Therefore, the display control unit 19 outputs the content CT subjected to the geometric correction based on the screen model included in the registration data 32 to the projector 20C.
  • the screen model is a three-dimensional model of the screen SCR.
  • the screen model contains, for example, coordinate information on the surface of the screen SCR on which the content CT is projected.
  • the display control unit 19 gradually changes the position of the content CT to the position calculated by the correction based on the reference local coordinate system in response to the completion of the operation of the content CT. This reduces the sense of discomfort caused by the sudden change in the position of the content CT after the operation is completed.
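  • One way to realize such a gradual change is to interpolate the content position over several display frames; a sketch (the smoothstep easing and step count are assumptions):

```python
import numpy as np

def ease_positions(current, target, steps=30):
    """Yield positions that move content smoothly from `current` to `target`."""
    current, target = np.asarray(current, float), np.asarray(target, float)
    for i in range(1, steps + 1):
        t = i / steps
        s = t * t * (3.0 - 2.0 * t)   # smoothstep: gentle start and stop
        yield (1.0 - s) * current + s * target

# Move content to the position recalculated from the reference local
# coordinate system over three frames (values illustrative).
for p in ease_positions([0.0, 0.0, 0.0], [-5.0, 5.0, 5.0], steps=3):
    print(p)
```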
  • the storage device 30 stores, for example, the program 31 executed by the processing device 10 and the registration data 32.
  • the program 31 is a program that causes a computer to execute information processing according to the present embodiment.
  • the processing device 10 performs various processes according to the program 31 stored in the storage device 30.
  • the storage device 30 may be used as a work area for temporarily storing the processing result of the processing device 10.
  • the storage device 30 includes any non-transitory storage medium such as, for example, a semiconductor storage medium or a magnetic storage medium.
  • the storage device 30 includes, for example, an optical disk, a magneto-optical disk, or a flash memory.
  • the program 31 is stored, for example, in a non-transitory storage medium that can be read by a computer.
  • the processing device 10 is, for example, a computer composed of a processor and a memory.
  • the memory of the processing device 10 includes a RAM (Random Access Memory) and a ROM (Read Only Memory).
  • the processing device 10 executes the program 31 to function as the glass sensor data acquisition unit 11, the PJ sensor data acquisition unit 12, the space recognition unit 13, the gesture detection unit 14, the interaction detection unit 15, the error information detection unit 16, the correction unit 17, the content generation unit 18, and the display control unit 19.
  • FIG. 7 is a diagram showing an example in which the information processing system 1 is applied to an exhibition of a city model.
  • the stereoscopic image of the building is superimposed and displayed on the planar image of the city model projected on the screen SCR.
  • the plane image of the city model projected on the screen SCR is the content CT (PJ content PJC) displayed by the projector 20C.
  • the stereoscopic image of the building superimposed on the planar image of the city model is the content CT (glass content GC) displayed by the AR glass 20A and the AR glass 20B.
  • FIG. 8 is a diagram showing an example of content CT (PJ content PJC and glass content GC) viewed from the viewpoint of the second user U2.
  • the glass content GC is a three-dimensional display of a part of the building BLD displayed on the PJ content PJC based on the viewpoint of the second user U2.
  • the glass content GC is superimposed on the area OVL in which the plan image of the building BLD is displayed in the PJ content PJC.
  • FIG. 7 shows how the second user U2 reaches for the glass content GC and tries to operate the glass content GC with a finger.
  • the processing device 10 generates the glass coordinate system LCB based on the sensor data acquired from the AR glass 20B. If the AR glass 20B is used for a long time, an error occurs in the sensor data of the AR glass 20B. As a result, the region where the glass content GC and the PJ content PJC are superimposed deviates from the region OVL where they should be superimposed.
  • the processing device 10 corrects the PJ coordinate system LCC in order to suppress the deviation. By this correction, the relative position between the glass content GC and the PJ content PJC is corrected.
  • FIGS. 9 and 10 are diagrams showing a method of correcting the relative position between the glass content GC and the PJ content PJC.
  • FIG. 9 is a diagram showing a method adopted in this embodiment.
  • FIG. 10 is a diagram showing a comparative example.
  • the gesture detection unit 14 recognizes the finger as the operation site OP.
  • the gesture detection unit 14 determines that the operation has been started by the second user U2.
  • the correction unit 17 corrects the glass coordinate system LCA and the PJ coordinate system LCC with reference to the glass coordinate system LCB.
  • the position of the PJ content PJC displayed by the projector 20C moves.
  • the position of the glass content GC recognized by the second user U2 as an operation target does not change. Therefore, the second user U2 can continue to operate the glass content GC without any discomfort.
  • the position of the glass content GC displayed on the AR glass 20A also changes by correcting the relative position of the glass coordinate system LCA. Therefore, the position of the glass content GC recognized by the first user U1 via the AR glass 20A changes.
  • the first user U1 is only looking at the operation of the second user U2, and does not perform an operation on the glass content GC. Therefore, even if the position of the glass content GC recognized by the first user U1 changes slightly, the first user U1 does not feel a great deal of discomfort.
  • the glass coordinate system LCA and the glass coordinate system LCB are corrected with reference to the PJ coordinate system LCC.
  • the reason why the PJ coordinate system LCC is used as the correction reference is that the PJ coordinate system LCC is less likely to cause misalignment than the glass coordinate system LCB generated by using SLAM.
  • the position of the PJ content PJC displayed by the projector 20C does not change, but the position of the glass content GC recognized by the second user U2 as the operation target changes. Since the position of the glass content GC being operated changes when the second user U2 starts the operation, the second user U2 feels a great sense of discomfort. When the movement amount of the glass content GC is large, the finger deviates from the glass content GC, and the operation on the glass content GC cannot be performed.
  • the relative position of the other local coordinate system LC is corrected with reference to the local coordinate system LC used for generating the content CT recognized as the operation target by the operation subject.
  • the glass coordinate system LCA and the glass coordinate system LCB are corrected with reference to the PJ coordinate system LCC.
  • FIGS. 11 to 13 are diagrams showing other application examples of the information processing system 1.
  • the processing device 10 combines the PJ content PJC and the glass content GC to form one slider.
  • PJ Content PJC displays a slider track.
  • the glass content GC displays a slider knob and a part of the track.
  • the AR glass 20A and the AR glass 20B set the glass content area GCA at a position superimposing on the PJ content PJC, and display the glass content GC in the glass content area GCA.
  • the PJ content PJC and the glass content GC are displayed without causing misalignment with each other.
  • in FIG. 12, there is a misalignment between the PJ content PJC and the glass content GC.
  • the glass content GC is displayed at a position closer to the viewpoint of the user U than the PJ content PJC. Therefore, the area SHA of the PJ content PJC shielded by the glass content area GCA is widened. Further, a part of the tracks included in the glass content GC intersects with the tracks displayed by the PJ content PJC.
  • in FIG. 13, when the user U starts to operate the knob of the slider, the position of the PJ content PJC is corrected according to the position of the glass content GC.
  • the relative position of the PJ content PJC is corrected so that a part of the track included in the glass content GC and the track displayed by the PJ content PJC are parallel to each other.
  • the correction method is not limited to this. The following correction methods are also conceivable.
  • the correction unit 17 corrects the local coordinate system LC by a coordinate transformation process for translating the local coordinate system LC.
  • the correction unit 17 obtains the difference between the coordinates of the two local coordinate systems LC indicating the same operation point.
  • the correction unit 17 translates one local coordinate system LC based on the difference.
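  • A minimal sketch of this translation-only correction, using the operation-point difference described above (names are illustrative):

```python
import numpy as np

def translation_offset(op_in_reference: np.ndarray,
                       op_in_other: np.ndarray) -> np.ndarray:
    """Offset that aligns the other coordinate system with the reference.

    Both arguments are the SAME operation point, expressed in the two
    local coordinate systems; their difference is the relative error.
    """
    return op_in_reference - op_in_other

offset = translation_offset(np.array([30.0, 40.0, 50.0]),    # reference
                            np.array([25.0, 45.0, 55.0]))    # other
# Adding the offset to coordinates expressed in the other system
# cancels the relative-position error.
print(np.array([25.0, 45.0, 55.0]) + offset)   # [30. 40. 50.]
```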
  • the correction unit 17 corrects the local coordinate system LC by a coordinate transformation process that translates the local coordinate system LC and rotates the local coordinate system LC around one rotation axis.
  • the correction unit 17 obtains a straight line passing through the two operation points in each of the two local coordinate system LCs.
  • the correction unit 17 translates and rotates one local coordinate system LC so that the two straight lines obtained by the two local coordinate system LCs match.
  • the correction unit 17 corrects the local coordinate system LC by a coordinate transformation process that translates the local coordinate system LC and rotates the local coordinate system LC around two rotation axes.
  • the correction unit 17 obtains the plane containing the three operation points in each of the two local coordinate systems LC.
  • the correction unit 17 translates and rotates one local coordinate system LC so that the two planes obtained by the two local coordinate system LCs match.
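  • For the rotational variants, the translation and rotation that best align the operation points can be found with the standard Kabsch algorithm (a well-known technique swapped in here; the publication does not name it). With three non-collinear points this matches the plane-based correction described above:

```python
import numpy as np

def rigid_align(points_other: np.ndarray, points_ref: np.ndarray):
    """Rotation R and translation t with R @ p_other + t ~= p_ref (Kabsch)."""
    c_other, c_ref = points_other.mean(axis=0), points_ref.mean(axis=0)
    H = (points_other - c_other).T @ (points_ref - c_ref)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_ref - R @ c_other
    return R, t

# Three operation points seen in two local coordinate systems (made-up
# values: the "other" system is offset by a pure translation).
ref = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
other = ref + np.array([-5.0, 5.0, 5.0])
R, t = rigid_align(other, ref)
print(np.round(R, 3))   # identity rotation
print(np.round(t, 3))   # [ 5. -5. -5.]
```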
  • FIG. 14 is a flowchart showing the information processing method of the present embodiment.
  • In step S1, the gesture detection unit 14 determines whether or not an operation on the content CT has been performed by the user U. If it is determined in step S1 that an operation on the content CT has been performed (step S1: Yes), the process proceeds to step S2. If it is determined in step S1 that no operation on the content CT has been performed (step S1: No), step S1 is repeated until an operation on the content CT is detected.
  • In step S2, the error information detection unit 16 determines whether or not an error has occurred in the relative positions of the plurality of local coordinate systems LC. If it is determined in step S2 that an error has occurred (step S2: Yes), the process proceeds to step S3. If it is determined in step S2 that no error has occurred (step S2: No), the process ends.
  • In step S3, the gesture detection unit 14 determines whether or not the target of the operation by the user U is the PJ content PJC. If it is determined in step S3 that the operation target is the PJ content PJC (step S3: Yes), the process proceeds to step S4. If it is determined in step S3 that the operation target is not the PJ content PJC (step S3: No), the process proceeds to step S8.
  • In step S4, the interaction detection unit 15 determines the accuracy of the sensor data output from the PJ sensor data acquisition unit 12. If the accuracy of the sensor data is larger than the threshold value in step S4 (step S4: No), the process proceeds to step S5. If the accuracy of the sensor data is equal to or less than the threshold value in step S4 (step S4: Yes), the process proceeds to step S16. In step S16, a correction error is output and correction is not performed.
  • In step S5, the correction unit 17 corrects the relative positions of the glass coordinate system LCA and the glass coordinate system LCB with reference to the PJ coordinate system LCC.
  • In step S6, the correction unit 17 recalculates the positions of the glass contents GC of the AR glass 20A and the AR glass 20B by using the first coordinate conversion formula and the second coordinate conversion formula.
  • In step S7, the display control unit 19 corrects the display position of the glass content GC based on the positions calculated by the correction unit 17.
  • In step S8, the gesture detection unit 14 determines whether or not the target of the operation by the user U is the glass content GC. If it is determined in step S8 that the operation target is the glass content GC (step S8: Yes), the process proceeds to step S9. If it is determined in step S8 that the operation target is not the glass content GC (step S8: No), the process proceeds to step S16. In step S16, a correction error is output and correction is not performed.
  • In step S9, the correction unit 17 calculates a temporarily corrected position of the PJ content PJC.
  • For example, the correction unit 17 corrects the relative position of the PJ coordinate system LCC with reference to the glass coordinate system used to generate the glass content GC recognized as the operation target by the operating subject.
  • The correction unit 17 calculates the position of the PJ content PJC by using a coordinate conversion formula that performs coordinate conversion between that glass coordinate system and the PJ coordinate system LCC.
  • In step S10, the display control unit 19 corrects the display position of the PJ content PJC based on the position calculated by the correction unit 17.
  • In step S11, the gesture detection unit 14 determines whether or not the operation of the glass content GC by the user U is completed. If it is determined in step S11 that the operation of the glass content GC is completed (step S11: Yes), the process proceeds to step S12. If it is not determined in step S11 that the operation of the glass content GC is completed (step S11: No), the process returns to step S9.
  • In step S12, the correction unit 17 corrects the relative positions of the glass coordinate system LCA and the glass coordinate system LCB with reference to the PJ coordinate system LCC.
  • In step S13, the correction unit 17 recalculates the positions of the glass contents GC of the AR glass 20A and the AR glass 20B by using the first coordinate conversion formula and the second coordinate conversion formula.
  • In step S14, the display control unit 19 corrects the display position of the glass content GC based on the positions calculated by the correction unit 17.
  • In step S15, the correction unit 17 ends the correction of the PJ coordinate system LCC with reference to the glass coordinate system.
  • The display control unit 19 moves the position of the PJ content PJC to the position set by using the PJ coordinate system LCC.
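  • The flow of FIG. 14 reduces to the control structure below. Every function name is a placeholder for the corresponding unit described above, not an actual API; the sketch only mirrors the branch order S1 to S16.

```python
# Schematic of the FIG. 14 flowchart. All callees are stubs standing in
# for the units of the processing device 10.

def correct_glass_systems_to_pj():            pass   # S5-S7 / S12-S14
def correct_pj_to_operators_glass_system():   pass   # S9-S10
def end_glass_referenced_correction():        pass   # S15

def on_operation(target: str, error_exists: bool, pj_accuracy_ok: bool) -> str:
    if not error_exists:                       # S2: no relative-position error
        return "no_correction"
    if target == "pj_content":                 # S3: PJ content operated
        if not pj_accuracy_ok:                 # S4: sensor data too inaccurate
            return "correction_error"          # S16
        correct_glass_systems_to_pj()          # S5-S7
        return "corrected_to_pj"
    if target == "glass_content":              # S8: glass content operated
        correct_pj_to_operators_glass_system() # S9-S10, looped until S11
        correct_glass_systems_to_pj()          # S12-S14, after operation ends
        end_glass_referenced_correction()      # S15: back to the PJ reference
        return "corrected_then_restored"
    return "correction_error"                  # S16: unknown target

print(on_operation("glass_content", True, True))
```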
  • the information processing system 1 has an error information detection unit 16 and a correction unit 17.
  • the error information detection unit 16 detects information on errors that occur at relative positions of a plurality of local coordinate systems LC.
  • the correction unit 17 corrects the relative positions of all the other local coordinate system LCs with respect to the specific local coordinate system LC based on the error information.
  • the specific local coordinate system LC is a local coordinate system LC used to generate a content CT recognized as an operation target by the operating subject of the content CT among a plurality of local coordinate system LCs.
  • the information processing of the above-mentioned information processing system 1 is executed by a computer.
  • the program 31 of the present embodiment makes the computer realize the information processing of the above-mentioned information processing system 1.
  • by the correction, the local coordinate systems LC (origin positions and coordinate axis positions) other than the specific local coordinate system LC move to positions where there is no error in the relative position with the specific local coordinate system LC.
  • the local coordinate system LC (specific local coordinate system LC) related to the generation of the content CT recognized by the operating subject does not move. Therefore, the position of the content CT recognized by the operating subject is not changed. Therefore, the operating subject can continue to operate the content CT without feeling uncomfortable.
  • the error information detection unit 16 detects the error information based on, for example, the coordinates, detected for each local coordinate system LC, of the site OP at which the operating subject interacts with the content CT.
  • the position of the operation site OP is detected using the coordinates of each local coordinate system LC.
  • an error generated in the relative position between the local coordinate system LCs is detected.
  • the above error is detected at the timing when the operation is performed. Therefore, the local coordinate system LC can be corrected at the timing when the operation is started.
  • a member called a marker for correcting the position / orientation information of the device 20 is unnecessary.
  • Markers are used in fields such as AR glasses and head-mounted displays.
  • the marker is installed at a specific position in the real space RS.
  • the user U equipped with the device 20 moves to the position of the marker when correcting the error generated in the position / attitude information, and takes a picture of the marker with the camera.
  • by photographing the marker, an error in the position / orientation information (an error generated in the relative position between the world coordinate system WC and the local coordinate system LC) can be corrected.
  • in the correction method using a marker, however, the user U must move to the position of the marker.
  • in the present embodiment, an error generated in the relative positions between the local coordinate systems LC is detected based on the coordinates of the operation site OP detected for each device 20. Therefore, the local coordinate systems LC can be easily corrected without using a marker.
  • the correction unit 17 ends the correction based on the specific local coordinate system LC in response to the completion of the operation of the content CT. After that, the correction unit 17 corrects, for example, the relative positions of all other local coordinate system LCs with respect to the reference local coordinate system based on the error information.
  • the reference local coordinate system is, for example, a local coordinate system LC having a smaller error in a position relative to the world coordinate system than a specific local coordinate system LC.
  • the local coordinate system LC that is the reference for correction is switched from the specific local coordinate system LC to the reference local coordinate system at the timing when the operation of the content CT is completed. Since the coordinates are corrected with reference to the reference local coordinate system, the positional deviation between the real object of the real space RS and the content CT of each device 20 is unlikely to occur.
  • the information processing system 1 has, for example, a display control unit 19.
  • the display control unit 19 gradually changes the position of the content CT to the position calculated by the correction based on the reference local coordinate system in response to the completion of the operation of the content CT.
  • the position of the content CT does not change suddenly. Therefore, the operation subject is less likely to feel uncomfortable.
  • the correction unit 17 corrects the local coordinate system LC by, for example, a coordinate conversion process for translating the local coordinate system LC.
  • the correction of the local coordinate system LC is performed with a small amount of calculation.
  • the correction unit 17 corrects the local coordinate system LC by, for example, a coordinate transformation process for translating the local coordinate system LC and rotating the local coordinate system LC around one or more rotation axes.
  • the correction of the local coordinate system LC is performed with high accuracy.
  • FIG. 15 is a schematic diagram of the information processing system 2 of the second embodiment.
  • this embodiment differs from the first embodiment in that the only content CT used is the glass content GC, and in that the reference local coordinate system is determined based on the operation histories of a plurality of devices.
  • the differences from the first embodiment will be mainly described.
  • FIG. 15 shows an example in which the information processing system 2 is applied to techno sports.
  • a device 20 for displaying the content CT is attached to the heads of a plurality of users U who are players.
  • AR glass 20A is used as the device 20, for example.
  • the AR glass 20A displays a virtual object showing the ball as the glass content GC.
  • the user U puts the ball displayed as the glass content GC into the opponent's goal and competes for the number of goals.
  • the processing device 40 has, for example, a reference local coordinate system determination unit 41.
  • the reference local coordinate system determination unit 41 determines, for example, the glass coordinate system having the smallest error in the relative position with the world coordinate system WC as the reference local coordinate system based on the operation history of the plurality of AR glasses 20A.
  • the operation history of the AR glass 20A is determined based on, for example, the sensor data detected by the AR glass 20A.
  • the space recognition unit 13 uses SLAM to create an environment map for each AR glass 20A and estimate the position of the AR glass 20A in the environment map.
  • based on the movement history of the AR glass 20A in the environment map, the reference local coordinate system determination unit 41 calculates the distance traveled by the AR glass 20A since the relative position of its glass coordinate system LCA was last corrected with respect to the reference local coordinate system.
  • the reference local coordinate system determination unit 41 compares the movement distances after the correction processing calculated for each AR glass 20A, and determines the glass coordinate system LCA of the AR glass 20A having the smallest movement distance as the reference local coordinate system.
  • the reference local coordinate system determination unit 41 may determine the reference local coordinate system based on the acceleration history of each AR glass 20A. For example, the reference local coordinate system determination unit 41 detects the acceleration history of each AR glass 20A based on the acceleration data included in the sensor data. Based on the acceleration history, the reference local coordinate system determination unit 41 calculates the time during which an acceleration exceeding a threshold value has been applied to the AR glass 20A since the relative position of its glass coordinate system was last corrected with respect to the reference local coordinate system. The reference local coordinate system determination unit 41 compares the times calculated for each AR glass 20A and determines the glass coordinate system LCA of the AR glass 20A with the shortest time as the reference local coordinate system. The threshold value is included in, for example, the registration data 52.
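  • A sketch of this selection rule for the movement-distance variant (the per-device history structure and names are assumptions):

```python
import numpy as np

def traveled_distance(path) -> float:
    """Total path length of an N x 3 position history from the environment map."""
    steps = np.diff(np.asarray(path, float), axis=0)
    return float(np.linalg.norm(steps, axis=1).sum())

def pick_reference(histories: dict) -> str:
    """The glass coordinate system of the AR glass that moved least since the
    last reference-based correction becomes the reference local coordinate
    system."""
    return min(histories, key=lambda dev: traveled_distance(histories[dev]))

histories = {
    "AR_glass_A": [[0, 0, 0], [1, 0, 0], [2, 0, 0]],   # moved 2.0 m
    "AR_glass_B": [[0, 0, 0], [0.2, 0, 0]],            # moved 0.2 m
}
print(pick_reference(histories))   # AR_glass_B
```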
  • the correction unit 17 corrects the relative positions of all other glass coordinate system LCA with respect to the specific glass coordinate system LCA based on the error information in response to the start of the operation of the glass content GC.
  • the specific glass coordinate system LCA is the glass coordinate system LCA used to generate the glass content GC recognized as the operation target by the operating subject of the glass content GC.
  • by this correction, for example, the origin positions and the coordinate axis positions of the glass coordinate systems LCA other than the specific glass coordinate system LCA move to positions where the error of the relative position with the specific glass coordinate system LCA becomes small.
  • the correction unit 17 ends the correction based on the specific glass coordinate system LCA in response to the completion of the operation of the glass content GC. Then, the correction unit 17 switches, for example, the glass coordinate system LCA that is the reference for correction from the specific glass coordinate system LCA to the reference local coordinate system determined by the reference local coordinate system determination unit 41.
  • the correction unit 17 corrects, for example, the relative positions of all other glass coordinate system LCA with respect to the reference local coordinate system based on the error information. By this correction, for example, the origin position and the position of the coordinate axis of the glass coordinate system LCA other than the reference local coordinate system are moved to the position where the error of the relative position with the reference local coordinate system becomes small.
  • the storage device 50 stores, for example, the program 51 executed by the processing device 40 and the registration data 52.
  • the program 51 is a program for causing a computer to execute information processing according to the present embodiment.
  • the processing device 40 performs various processes according to the program 51 stored in the storage device 50.
  • the processing device 40 executes the program 51 to function as the glass sensor data acquisition unit 11, the space recognition unit 13, the gesture detection unit 14, the reference local coordinate system determination unit 41, the error information detection unit 16, the correction unit 17, the content generation unit 18, and the display control unit 19.
  • the reference local coordinate system is determined based on the operation history of the plurality of AR glasses 20A. Therefore, even when the local coordinate system LC is deviated due to the operation of the AR glass 20A, the reference local coordinate system as the reference for correction can be appropriately selected.
  • Modification example: FIGS. 16 and 17 are diagrams showing an example in which the information processing system 2 is applied to the correction of the local coordinate system LC set in a robot RB.
  • the plurality of devices 20 to be processed include the AR glass 20A and the robot RB.
  • the robot RB is, for example, a robot that autonomously moves using SLAM.
  • the robot coordinate system LCR is set as the local coordinate system LC in the robot RB.
  • the processing device 40 uses the robot coordinate system LCR to generate a content CT (robot content RC) representing a ball.
  • the robot RB recognizes that the ball exists in the real space RS.
  • the AR glass 20A superimposes and displays the glass content GC representing the same ball on the real space RS.
  • when an operation on the content CT is started, the relative positions are corrected, and when the operation is completed, the local coordinate system LC serving as the reference of the correction is switched from the specific local coordinate system LC to the reference local coordinate system.
  • the correction method of the local coordinate system LC is the same as that described in the second embodiment.
  • the robot RB has a plurality of sensors including a depth sensor, an acceleration sensor, a microphone and a camera.
  • the robot RB, based on the sensor data detected by the plurality of sensors, recognizes the real space RS around the robot RB, estimates the position of the robot RB in the real space RS, and detects the real objects around the robot RB. Recognition of the real space RS and estimation of the position in the real space RS are performed using, for example, SLAM.
  • the robot RB outputs, for example, the robot coordinate system LCR information (information regarding the origin position and the position of the coordinate axis) and the environment map information generated by using SLAM to the error information detection unit 16.
  • the robot RB detects the coordinates of the operation part OP (the part where the robot RB interacts with the robot content RC) in the robot coordinate system LCR based on the sensor data.
  • the robot RB outputs information on the coordinates of the operation site OP to the error information detection unit 16.
  • the error information detection unit 16 detects, for example, information on an error generated in the relative position between the robot coordinate system LCR and the glass coordinate system LCA, based on the coordinates of the operation site OP in the robot coordinate system LCR and the coordinates of the operation site OP in the glass coordinate system LCA.
  • the correction unit 17 corrects the relative position of the glass coordinate system LCA with respect to the robot coordinate system LCR based on the error information in response to the start of the operation of the content CT.
  • the position of the glass content GC recognized by the user U via the AR glass 20A is corrected according to the position of the robot content RC recognized by the robot RB as an operation target.
  • the reference local coordinate system determination unit 41 determines the reference local coordinate system based on the operation history of the robot RB and the operation history of the AR glass 20A.
  • the correction unit 17 switches the local coordinate system LC serving as the correction reference from the robot coordinate system LCR to the glass coordinate system LCA when the operation of the robot content RC by the robot RB is completed.
  • The present technology can also have the following configurations.
  • (1) An information processing system having: an error information detection unit that detects information on an error occurring in the relative positions of a plurality of local coordinate systems; and a correction unit that, in response to the start of an operation on content, corrects, based on the error information, the relative positions of all other local coordinate systems with respect to the specific local coordinate system, among the plurality of local coordinate systems, used to generate the content recognized as the operation target by the operating subject of the content.
  • (3) The information processing system according to (1) or (2) above, wherein the correction unit, in response to the completion of the operation of the content, ends the correction based on the specific local coordinate system and corrects the relative positions of all other local coordinate systems with respect to a reference local coordinate system having a smaller error, based on the error information.
  • (4) The information processing system according to (3) above, having a display control unit that gradually changes the position of the content to the position calculated by the correction based on the reference local coordinate system in response to the completion of the operation of the content.
  • (5) The information processing system according to (3) or (4) above, having a reference local coordinate system determination unit that determines, based on the operation histories of a plurality of devices in which the plurality of local coordinate systems are set, the local coordinate system having the smallest error in position relative to the world coordinate system as the reference local coordinate system.
  • (6) The information processing system according to any one of (1) to (5) above, wherein the correction unit corrects the local coordinate system by a coordinate transformation process that translates the local coordinate system.
  • (7) The information processing system according to any one of (1) to (5) above, wherein the correction unit corrects the local coordinate system by a coordinate transformation process that translates the local coordinate system and rotates it around one or more rotation axes.
  • (8) An information processing method executed by a computer, including: detecting information on an error occurring in the relative positions of a plurality of local coordinate systems; and, in response to the start of an operation on content, correcting, based on the error information, the relative positions of all other local coordinate systems with respect to the specific local coordinate system, among the plurality of local coordinate systems, used to generate the content recognized as the operation target by the operating subject of the content.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/JP2021/016735 2020-05-21 2021-04-27 Information processing system, information processing method, and program WO2021235193A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022524352A JPWO2021235193A1 2020-05-21 2021-04-27

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020088674 2020-05-21
JP2020-088674 2020-05-21

Publications (1)

Publication Number Publication Date
WO2021235193A1 true WO2021235193A1 (ja) 2021-11-25

Family

ID=78708559

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/016735 WO2021235193A1 (ja) 2020-05-21 2021-04-27 Information processing system, information processing method, and program

Country Status (2)

Country Link
JP (1) JPWO2021235193A1
WO (1) WO2021235193A1 (de)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003308514A * 1997-09-01 2003-10-31 Canon Inc Information processing method and information processing apparatus
JP2014514653A * 2011-03-29 2014-06-19 クアルコム,インコーポレイテッド (Qualcomm, Inc.) System for rendering a shared digital interface relative to each user's point of view
WO2017013986A1 * 2015-07-17 2017-01-26 シャープ株式会社 (Sharp Corporation) Information processing device, terminal, and remote communication system
WO2019044003A1 * 2017-09-04 2019-03-07 株式会社ワコム (Wacom Co., Ltd.) Spatial position indication system


Also Published As

Publication number Publication date
JPWO2021235193A1


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21809026

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022524352

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21809026

Country of ref document: EP

Kind code of ref document: A1