WO2021210155A1 - Display device, display method, and program - Google Patents

Display device, display method, and program

Info

Publication number
WO2021210155A1
Authority
WO
WIPO (PCT)
Prior art keywords
cursor
display device
users
display
image
Prior art date
2020-04-17
Application number
PCT/JP2020/016841
Other languages
French (fr)
Japanese (ja)
Inventor
佐藤 隆
誉宗 巻口
正典 横山
高田 英明
Original Assignee
日本電信電話株式会社 (Nippon Telegraph and Telephone Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2020-04-17
Filing date
2020-04-17
Publication date
2021-10-21
Application filed by 日本電信電話株式会社 (Nippon Telegraph and Telephone Corporation)
Priority to PCT/JP2020/016841 priority Critical patent/WO2021210155A1/en
Publication of WO2021210155A1 publication Critical patent/WO2021210155A1/en

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/08 Cursor circuits


Abstract

A display device 1 according to the present embodiment has: a plurality of projectors 20-1 to 20-4 that provide a different video to each of a plurality of users on a screen 50 shared by those users; a plurality of sensors 30-1 to 30-4 that are associated with the projectors 20-1 to 20-4, that acquire the operations of the individual users, and that operate the corresponding cursors; and a control unit 103 that, for each of the projectors 20-1 to 20-4, causes the cursor operated by the sensor corresponding to that projector to be displayed, in the video provided by that projector, in a mode different from that of the cursors operated by the other sensors 30-1 to 30-4.

Description

Display device, display method, and program
The present invention relates to a display device, a display method, and a program.
Non-Patent Document 1 proposes a system in which a plurality of users point at coordinates on a shared screen with laser pointers and perform operations such as drawing figures. In Non-Patent Document 1, a different laser pointer is used for each user, and the coordinates are acquired by a camera.
However, when multiple users share the same screen, it is difficult for each user to keep track of his or her own cursor, and the screen display becomes cluttered as the number of users increases.
The present invention has been made in view of the above, and an object of the present invention is to provide a display device that allows each user to intuitively recognize his or her own input on a screen shared by a plurality of users.
A display device according to one aspect of the present invention has: a plurality of projection units that provide a different video to each of a plurality of users on a display surface shared by the users; a plurality of input units that are associated with the projection units and that acquire each user's operations to operate the corresponding cursor; and a control unit that, for each projection unit, displays, in the video provided by that projection unit, the first cursor operated by the input unit corresponding to that projection unit in a mode different from that of the second cursor group operated by the other input units.
According to the present invention, it is possible to provide a display device that allows each user to intuitively recognize his or her own input on a screen shared by a plurality of users.
FIG. 1 is a diagram illustrating an overview of the display device of the present embodiment.
FIG. 2 is a top view showing an example of the configuration of the display device.
FIG. 3 is a cross-sectional view showing an example of the configuration of the display device.
FIG. 4 is a top view showing an example of the configuration of another display device.
FIG. 5 is a diagram showing an example of the configuration of the information processing device.
FIG. 6 is a diagram showing an example of the correspondence between sensors and projectors held by the information processing device.
FIG. 7 is a flowchart showing the processing flow of the information processing device.
FIG. 8 is a diagram showing examples of cursor images.
FIG. 9 is a diagram showing an example of the hardware configuration of the information processing device.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
As shown in FIG. 1, the display device 1 of the present embodiment is a tubular display device having an opening on its upper surface, with a circular reflective screen 50 on the inner bottom of the main body. Users 200A to 200C look down at the bottom screen 50 through the opening. A plurality of cursors, one operated by each of the users 200A to 200C, are displayed on the screen 50. Users 200A to 200C can operate the cursors displayed on the screen 50 by pointing hand gestures. In the video seen by each of the users 200A to 200C, the cursor that user operates is highlighted in a manner different from the cursors operated by the other users.
As shown in FIGS. 2 and 3, the display device 1 includes a plurality of projectors 20-1 to 20-4 on the upper portion of a housing 40. The projectors 20-1 to 20-4 are fixed to the inside of the housing 40 and project videos onto the screen 50. The screen 50 is a reflective optical screen that images the iris planes of the projectors 20-1 to 20-4 (planes corresponding to the lens aperture) at positions determined by the projection distance and the focal length. The projectors 20-1 to 20-4 can thereby provide a different video to each of the users 200A to 200C on the shared screen 50. Specifically, each of the users 200A to 200C sees the video projected by the projector placed opposite that user. For example, user 200A can see the video projected by the projector 20-3, and user 200C, who faces user 200A, can see the video projected by the projector 20-1 placed opposite the projector 20-3.
Sensors 30-1 to 30-4 are mounted on the projectors 20-1 to 20-4, respectively. The sensors 30-1 to 30-4 sense the movements of the users 200A to 200C and transmit the sensing data to the information processing device 10 described later. The information processing device 10 recognizes the pointing hand gestures of the users 200A to 200C based on the sensing data of the sensors 30-1 to 30-4 and operates the cursor of each user accordingly.
The information processing device 10 associates each of the projectors 20-1 to 20-4 with the sensor facing it. For example, the projector 20-3, which projects the video seen by user 200A, is associated with the sensor 30-1, which detects the pointing hand gesture of user 200A. The information processing device 10 calculates the coordinates of each cursor based on the sensing data of the sensors 30-1 to 30-4. In the video supplied to each of the projectors 20-1 to 20-4, the information processing device 10 highlights the cursor operated via the sensor corresponding to that projector. For example, in the video projected by the projector 20-3, the cursor operated based on the sensing data of the sensor 30-1 is displayed with more emphasis than the cursors operated based on the sensing data of the sensors 30-2 to 30-4. In other words, user 200A sees a video in which the cursor he or she operates is highlighted, and the other users 200B and 200C likewise each see a video in which their own cursor is highlighted.
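To make the pairing concrete, the following minimal sketch shows one way the projector-to-sensor correspondence and the per-view cursor split could be expressed in code. It is an illustration only, not the implementation of the publication: the names, the dictionary-based mapping, and the pairings of projectors 20-2 and 20-4 are assumptions (only the 20-3/30-1 and 20-1/30-3 pairs are stated in the text).

```python
# Illustrative sketch of the projector-sensor pairing and the per-view
# cursor split described above. Names and data shapes are assumptions.

# Each projector is paired with the sensor mounted opposite it, so the
# sensor that watches a user feeds the video that same user sees.
PROJECTOR_TO_SENSOR = {
    "projector_20_1": "sensor_30_3",  # stated in the text / Fig. 6
    "projector_20_2": "sensor_30_4",  # assumed opposite pairing
    "projector_20_3": "sensor_30_1",  # user 200A: watched by sensor 30-1,
    "projector_20_4": "sensor_30_2",  # assumed opposite pairing
}

def split_cursors(projector_id, cursor_coords):
    """Return (cursor to highlight, cursors to draw normally).

    cursor_coords maps a sensor id to the (x, y) cursor coordinates
    computed from that sensor's data; absent ids mean no user is present.
    """
    own_sensor = PROJECTOR_TO_SENSOR[projector_id]
    highlighted = cursor_coords.get(own_sensor)
    normal = {s: xy for s, xy in cursor_coords.items() if s != own_sensor}
    return highlighted, normal
```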
Note that the mounting positions of the projectors 20-1 to 20-4 and the sensors 30-1 to 30-4 are merely examples. The position of the screen 50 is likewise not limited to the inner bottom of the main body of the display device 1. The screen 50 may instead be placed on a wall, with a plurality of projectors arranged facing it and projecting videos at different projection angles; in this case as well, a different video is visible depending on the viewpoint position. The number of projectors and sensors is desirably equal to or greater than the number of simultaneous users. When multiple sensors respond to one user, one representative sensor is selected, associated with that user's cursor, and used for the video visible from that user's position. The representative sensor may be selected, for example, as the sensor with the largest response, or as the sensor located at the center of the group of responding sensors (see the sketch below). When one sensor responds to multiple users, the cursors of those users must all be displayed at the same time in the video those users see in common.
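Both selection rules can be stated compactly. The sketch below assumes responses arrive as a mapping from sensor id to signal strength and that sensor ids sort in the order the sensors are physically arranged; neither assumption comes from the publication.

```python
# Sketch of the two representative-sensor selection rules named above.
# The response format (sensor id -> signal strength) is an assumption.

def strongest_sensor(responses):
    """Rule 1: pick the sensor with the largest response."""
    return max(responses, key=responses.get)

def central_sensor(responses):
    """Rule 2: pick the sensor in the middle of the responding group."""
    ordered = sorted(responses)
    return ordered[len(ordered) // 2]

# Example: three adjacent sensors react to the same user.
responses = {"sensor_30_1": 0.2, "sensor_30_2": 0.9, "sensor_30_3": 0.4}
assert strongest_sensor(responses) == "sensor_30_2"
assert central_sensor(responses) == "sensor_30_2"
```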
FIG. 4 shows an example of another display device. In the display device 1 of FIG. 4, 60 projectors 20 are arranged in a circle. Each projector 20 outputs video of the subject captured over its entire circumference at a different angle. By overlapping the iris planes produced by adjacent projectors 20 in the horizontal direction, linear blending is realized optically, and a three-dimensional video that smoothly interpolates intermediate viewpoints can be presented. By moving along the outer circumference of the display device 1, the user can view the three-dimensional video projected on the screen 50 from all around. In the display device 1 of FIG. 4, one user sees a video in which the videos of several adjacent projectors 20 are superimposed, so the sensors and projectors 20 need not correspond one to one; a plurality of projectors may be associated with one sensor.
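The blending itself happens optically, but the relationship between a viewpoint and the projectors it sees can be sketched in software. The weighting scheme below is an assumption used only to illustrate what "smoothly interpolating intermediate viewpoints" means; the 60-projector ring is from the text.

```python
# Hypothetical mapping from a viewpoint angle on the circumference to the
# two adjacent projectors whose videos the viewer sees, with linear weights.
N_PROJECTORS = 60
STEP_DEG = 360.0 / N_PROJECTORS  # 6 degrees between adjacent projectors

def blend_for_angle(view_deg):
    """Return ((projector index, weight), (next projector index, weight))."""
    pos = (view_deg % 360.0) / STEP_DEG
    i = int(pos)
    frac = pos - i  # 0.0 at projector i, 1.0 at projector i + 1
    return ((i, 1.0 - frac), ((i + 1) % N_PROJECTORS, frac))

# A viewer standing halfway between projectors 1 and 2 sees an equal blend.
assert blend_for_angle(9.0) == ((1, 0.5), (2, 0.5))
```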
The display unit including the projectors 20-1 to 20-4 is not limited to the spatial-imaging iris-plane method described above; it may be an integral-type display, a parallax-barrier display, a 3D display using 3D glasses (optical shutters or polarizing plates), or anything else that can show a different video depending on the viewpoint position. For example, the integral method places a lens array in front of the display surface and switches the displayed image according to the viewing angle. The parallax-barrier method places barrier layers in front of the display surface and likewise switches the displayed image according to the viewing angle.
The input unit including the sensors 30-1 to 30-4 is not limited to the hand-gesture recognition described above; it may be anything that allows each of the users 200A to 200C to operate a cursor individually, such as a mouse, a joystick, or laser-pointer recognition.
Next, the information processing device 10 of the present embodiment will be described with reference to FIG. 5. The information processing device 10 is a device that, within the display device 1, receives cursor operation information from the sensors 30-1 to 30-4 and supplies videos to the projectors 20-1 to 20-4.
The information processing device 10 shown in FIG. 5 includes coordinate input units 101-1 to 101-4, image output units 102-1 to 102-4, a control unit 103, and a storage unit 104.
The coordinate input units 101-1 to 101-4 receive sensing data from the sensors 30-1 to 30-4 and calculate the coordinates of the corresponding cursors. That is, the coordinate input units 101-1 to 101-4 receive cursor operation information from each of the users 200A to 200C and calculate the cursor coordinates based on that information.
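The publication does not specify how the coordinates are computed from the sensing data. One plausible model, shown below purely as an assumption, is to intersect the recognized pointing ray with the plane of the screen.

```python
# Hypothetical coordinate computation for a pointing hand gesture: intersect
# the pointing ray with the screen plane. The ray-plane model and all
# parameter names are assumptions; the publication leaves this step open.
import numpy as np

def cursor_from_pointing(fingertip, direction, screen_z=0.0):
    """Return (x, y) where the pointing ray meets the plane z = screen_z,
    or None if the user points parallel to or away from the screen."""
    p = np.asarray(fingertip, dtype=float)   # 3D fingertip position
    d = np.asarray(direction, dtype=float)   # 3D pointing direction
    if abs(d[2]) < 1e-9:
        return None                          # parallel to the screen plane
    t = (screen_z - p[2]) / d[2]
    if t < 0:
        return None                          # pointing away from the screen
    hit = p + t * d
    return float(hit[0]), float(hit[1])
```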
The image output units 102-1 to 102-4 supply the videos to be shown to the users 200A to 200C to the projectors 20-1 to 20-4, respectively.
The control unit 103 refers to the information stored in the storage unit 104 to identify the correspondence between the projectors 20-1 to 20-4 and the sensors 30-1 to 30-4, and switches which cursor is highlighted for each of the image output units 102-1 to 102-4. For example, onto each source video supplied to the projectors 20-1 to 20-4, the control unit 103 superimposes the highlighted cursor image at the coordinates of the cursor operated via the corresponding sensor, and the normal cursor image at the coordinates of the other cursors. The source videos supplied to the projectors 20-1 to 20-4 may be input from outside or read from the storage unit 104.
As shown in FIG. 6, the storage unit 104 stores the correspondence between the sensors 30-1 to 30-4 and the projectors 20-1 to 20-4. In the example of FIG. 6, two types of cursor image, A and B, are associated with each sensor ID and projector ID: cursor image A is for highlighting, and cursor image B is for normal display. For example, in FIG. 6, the projector 20-1 (projector ID 1) corresponds to the sensor 30-3 (sensor ID 3). The video supplied to the projector 20-1 therefore uses the "highlight 3" image for the cursor operated based on the sensing data of the sensor 30-3 and the "simple 3" image for the other cursors.
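A minimal in-memory form of this table might look as follows. Only the projector-1/sensor-3 row is given in the text; the remaining rows, the field names, and the image identifiers are assumptions for illustration.

```python
# Minimal sketch of the correspondence table of Fig. 6 (assumed field names).
CORRESPONDENCE_TABLE = [
    # projector id, sensor id, cursor image A (highlight), cursor image B (normal)
    {"projector": 1, "sensor": 3, "image_a": "highlight_3", "image_b": "simple_3"},
    {"projector": 2, "sensor": 4, "image_a": "highlight_4", "image_b": "simple_4"},
    {"projector": 3, "sensor": 1, "image_a": "highlight_1", "image_b": "simple_1"},
    {"projector": 4, "sensor": 2, "image_a": "highlight_2", "image_b": "simple_2"},
]

def row_for_projector(projector_id):
    """Look up the table row for one projector."""
    return next(r for r in CORRESPONDENCE_TABLE if r["projector"] == projector_id)
```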
Next, the operation of the information processing device 10 will be described with reference to FIG. 7.
In step S11, the information processing device 10 receives sensing data from each of the sensors 30-1 to 30-4 and obtains the cursor coordinates corresponding to each sensor. For any of the sensors 30-1 to 30-4 in front of which none of the users 200A to 200C is present, no cursor coordinates need be produced (no cursor is displayed).
The information processing device 10 then executes the following steps S12 and S13 for each of the projectors 20-1 to 20-4.
In step S12, the information processing device 10 places the highlighted cursor at the cursor coordinates operated via the sensor corresponding to the projector being processed.
In step S13, the information processing device 10 displays the remaining cursors in the normal manner and supplies the video with the cursors superimposed to the projector being processed.
For example, when the information processing device 10 holds the table shown in FIG. 6, the video supplied to the projector 20-1 has the "highlight 3" image superimposed at the cursor coordinates operated via the sensor 30-3 and the "simple 3" image superimposed at the other cursor coordinates.
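Pulled together, steps S11 to S13 reduce to the per-frame loop sketched below. The sensor, frame, and projector objects and their methods are placeholders rather than a real API; only the control flow follows Fig. 7.

```python
# Control-flow sketch of Fig. 7 (steps S11-S13), using the table sketch above.

def render_frame(sensors, projectors, base_frames):
    # S11: obtain cursor coordinates (or None) for every sensor.
    coords = {s.id: s.read_cursor_coordinate() for s in sensors}

    # Steps S12 and S13 are repeated for each projector.
    for proj in projectors:
        row = row_for_projector(proj.id)
        frame = base_frames[proj.id].copy()

        # S12: highlighted cursor for this projector's own sensor.
        own_xy = coords.get(row["sensor"])
        if own_xy is not None:
            frame.draw(row["image_a"], at=own_xy)

        # S13: the remaining cursors drawn normally, then the frame is output.
        for sensor_id, xy in coords.items():
            if sensor_id != row["sensor"] and xy is not None:
                frame.draw(row["image_b"], at=xy)
        proj.show(frame)
```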
FIG. 8 shows an example of the normal-display cursor image 501 and the highlighted cursor images 511 to 515. Cursor image 511 is an enlarged version of the cursor. Cursor image 512 changes the outline or shadow. Cursor image 513 changes the color; the shape of the cursor may also be changed. Cursor image 514 changes the orientation of the cursor. Cursor image 515 adds an annotation near the cursor.
When the cursor image has a defined top and bottom, or when text is displayed near the cursor, the cursor image may be rotated according to the viewing angle of each of the users 200A to 200C, that is, according to the orientation of the displayed video, to make the cursor easier for the users 200A to 200C to see.
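Assuming one fixed viewing direction per projector, the rotation could be as simple as the sketch below; the angle table and the use of Pillow are assumptions.

```python
# Sketch of orienting a cursor image toward its viewer. One fixed viewing
# angle per projector is assumed; the angle values are placeholders.
from PIL import Image

VIEW_ANGLE_DEG = {1: 0, 2: 90, 3: 180, 4: 270}  # projector id -> rotation

def oriented_cursor(cursor_image_path, projector_id):
    """Rotate a cursor image so that its 'up' faces the viewing user."""
    img = Image.open(cursor_image_path)
    return img.rotate(VIEW_ANGLE_DEG[projector_id], expand=True)
```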
The normally displayed cursors, that is, the cursors operated by other users, may be switchable between shown and hidden. For example, when user 200A sets his or her own cursor to hidden, the cursor operated by user 200A is no longer displayed in the videos of the other users 200B and 200C.
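The visibility switch amounts to one extra check before a normal cursor is drawn. A minimal sketch, assuming a simple set of hidden sensor ids and the table row of the earlier sketch:

```python
# Sketch of the show/hide switch for cursors as seen by other users.
HIDDEN = set()  # sensor ids whose owners have hidden their cursors

def cursor_visible(row, sensor_id):
    """A cursor always appears in its owner's own view; in other users'
    views it appears only while its owner has not hidden it."""
    return sensor_id == row["sensor"] or sensor_id not in HIDDEN

# Example: the user tracked by sensor 1 hides their cursor from other views.
HIDDEN.add(1)
```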
As described above, the display device 1 of the present embodiment includes a plurality of projectors 20-1 to 20-4 that provide a different video to each of a plurality of users on a screen 50 shared by those users, and a plurality of sensors 30-1 to 30-4 that are associated with the projectors 20-1 to 20-4 and that acquire the operations of the individual users to operate the corresponding cursors. In the video projected by each of the projectors 20-1 to 20-4, the information processing device 10 displays the cursor operated via the sensor corresponding to that projector in a mode different from that of the cursors operated via the other sensors. As a result, in the video projected on the shared screen 50, each user's own cursor is displayed in a manner different from the cursors operated by the other users, so each user can easily keep track of his or her own cursor.
The information processing device 10 described above can be implemented with a general-purpose computer system including, for example, a central processing unit (CPU) 901, a memory 902, a storage 903, a communication device 904, an input device 905, and an output device 906, as shown in FIG. 9. In this computer system, the information processing device 10 is realized by the CPU 901 executing a predetermined program loaded into the memory 902. The program can be recorded on a computer-readable recording medium such as a magnetic disk, an optical disk, or a semiconductor memory, or distributed via a network.
1 … Display device
20-1 to 20-4 … Projectors
30-1 to 30-4 … Sensors
40 … Housing
50 … Screen
10 … Information processing device
101-1 to 101-4 … Coordinate input units
102-1 to 102-4 … Image output units
103 … Control unit
104 … Storage unit

Claims (7)

  1.  A display device comprising:
     a plurality of projection units that provide a different video to each of a plurality of users on a display surface shared by the users;
     a plurality of input units that are associated with the plurality of projection units and that acquire the operations of each of the users to operate the corresponding cursors; and
     a control unit that, for each of the plurality of projection units, displays, in the video provided by that projection unit, a first cursor operated by the input unit corresponding to that projection unit in a mode different from that of a second cursor group operated by the other input units.
  2.  The display device according to claim 1, wherein the first cursor is emphasized relative to the second cursor group by at least one of size, outline, color, shape, cursor orientation, and an attached annotation.
  3.  The display device according to claim 1, wherein only the first cursor is displayed and the second cursor group is not displayed.
  4.  The display device according to any one of claims 1 to 3, wherein the display surface reflects the videos from the plurality of projection units to form spatially imaged iris planes.
  5.  The display device according to claim 4, wherein the spatially imaged iris planes formed by the videos from adjacent projection units are superimposed to provide a three-dimensional video to the user.
  6.  A display method comprising the steps of:
     providing, from each of a plurality of projection units, a different video to each of a plurality of users on a display surface shared by the users;
     acquiring, by each of a plurality of input units associated with the plurality of projection units, the operations of each of the users to operate the corresponding cursors; and
     for each of the plurality of projection units, displaying, in the video provided by that projection unit, a first cursor operated by the input unit corresponding to that projection unit in a mode different from that of a second cursor group operated by the other input units.
  7.  A program that causes a computer to operate as each unit of the display device according to any one of claims 1 to 5.
PCT/JP2020/016841 2020-04-17 2020-04-17 Display device, display method, and program WO2021210155A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/016841 WO2021210155A1 (en) 2020-04-17 2020-04-17 Display device, display method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/016841 WO2021210155A1 (en) 2020-04-17 2020-04-17 Display device, display method, and program

Publications (1)

Publication Number Publication Date
WO2021210155A1 (en) 2021-10-21

Family

ID=78084300

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/016841 WO2021210155A1 (en) 2020-04-17 2020-04-17 Display device, display method, and program

Country Status (1)

Country Link
WO (1) WO2021210155A1 (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07120835A (en) * 1993-10-27 1995-05-12 Nippon Telegr & Teleph Corp <Ntt> Multilingual display device
JP2011197380A (en) * 2010-03-19 2011-10-06 Seiko Epson Corp Display device, display system and display method
JP2014089379A (en) * 2012-10-31 2014-05-15 Seiko Epson Corp Image display system, and control method of image display system
JP2015135572A (en) * 2014-01-16 2015-07-27 キヤノン株式会社 Information processing apparatus and control method of the same
US20170212640A1 (en) * 2014-05-23 2017-07-27 Piqs Technology (Shenzhen) Limited Interactive display systems
JP2015146611A (en) * 2015-03-17 2015-08-13 セイコーエプソン株式会社 Interactive system and control method of interactive system

Similar Documents

Publication Publication Date Title
CN105264572B (en) Information processing equipment, information processing method and program
EP3227760B1 (en) Pointer projection for natural user input
US11069146B2 (en) Augmented reality for collaborative interventions
JP5966510B2 (en) Information processing system
US20160357491A1 (en) Information processing apparatus, information processing method, non-transitory computer-readable storage medium, and system
TWI559174B (en) Gesture based manipulation of three-dimensional images
US11284061B2 (en) User input device camera
CN107430325A (en) The system and method for interactive projection
CN108900829B (en) Dynamic display system
JP7182920B2 (en) Image processing device, image processing method and program
US11954268B2 (en) Augmented reality eyewear 3D painting
JP2017187667A (en) Head-mounted display device and computer program
US20170102791A1 (en) Virtual Plane in a Stylus Based Stereoscopic Display System
JPH11155152A (en) Method and system for three-dimensional shape information input, and image input device thereof
US11675198B2 (en) Eyewear including virtual scene with 3D frames
JP6381361B2 (en) DATA PROCESSING DEVICE, DATA PROCESSING SYSTEM, DATA PROCESSING DEVICE CONTROL METHOD, AND PROGRAM
Chapdelaine-Couture et al. The omnipolar camera: A new approach to stereo immersive capture
JP4052357B2 (en) Virtual environment experience display device
WO2021210155A1 (en) Display device, display method, and program
JP2019146155A (en) Image processing device, image processing method, and program
KR100845274B1 (en) Apparatus and method for generating user-interface based on face recognition in a exhibition system
US20230007227A1 (en) Augmented reality eyewear with x-ray effect
JP2016181302A (en) Computer program and computer system for controlling object operation in immersive virtual space
JP5337409B2 (en) Information presentation device
JP2019032713A (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20930919

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20930919

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP