JP2022096634A - Haptic content presentation and implementation - Google Patents
- Publication number
- JP2022096634A (application number JP2021203931A)
- Authority
- JP
- Japan
- Prior art keywords
- interest
- touch screen
- computing device
- video
- video frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- G06V10/462—Salient features, e.g. scale invariant feature transforms [SIFT]
- G06V20/46—Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
- G08B6/00—Tactile signalling systems, e.g. personal calling systems
- G09B21/003—Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
- G09B21/007—Teaching or communicating with blind persons using both tactile and audible presentation of the information
- G08B3/00—Audible signalling systems; Audible personal calling systems
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Health & Medical Sciences (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- User Interface Of Digital Computer (AREA)
Claims (20)
1. A method comprising:
displaying, on a touch screen, a video that includes a video frame;
determining a region of interest within the video frame based on a saliency map of the video frame;
detecting a touch on an area of the touch screen while the video frame is displayed; and
generating a haptic response in response to determining that the area of the touch screen overlaps the region of interest.
2. The method of claim 1, wherein the haptic response is generated using an actuator or a speaker.
3. The method of claim 1, further comprising: detecting an object of interest within the video frame; and generating the saliency map such that the region of interest includes the object of interest.
4. The method of claim 1, further comprising varying an intensity or a frequency of the haptic response based on one or more of a size of an object within the region of interest, a curvature of the object, or a depth of the object within the video frame.
5. The method of claim 1, further comprising varying an intensity or a frequency of the haptic response based on a gradient of saliency in the saliency map.
6. The method of claim 1, further comprising generating an audio response in response to determining that the area of the touch screen overlaps the region of interest.
7. The method of claim 1, wherein the haptic response is generated by communicating an electrical signal through a layer of the touch screen.
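Claims 1-7 describe the method at the level of steps, not code, but the control flow is concrete enough to sketch. The following Python fragment is an illustration only: the saliency threshold, the circular touch model, and the dictionary used as a "haptic command" are all invented for the example, and the saliency map is assumed precomputed and normalized to [0, 1].

```python
import numpy as np

def region_of_interest(saliency: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Binary mask of the region of interest: pixels whose saliency
    exceeds a threshold (the threshold value is an assumption)."""
    return saliency >= threshold

def on_touch(saliency: np.ndarray, x: int, y: int, radius: int = 10):
    """Return a haptic command when the touched area of the screen
    overlaps the region of interest, otherwise None (claim 1)."""
    roi = region_of_interest(saliency)
    h, w = saliency.shape
    ys, xs = np.ogrid[:h, :w]
    # Model the finger contact as a disc of the given radius.
    touched = (xs - x) ** 2 + (ys - y) ** 2 <= radius ** 2
    overlap = roi & touched
    if not overlap.any():
        return None
    # Drive the actuator harder where the content under the finger
    # is more salient.
    return {"intensity": float(saliency[overlap].mean())}

# Toy saliency map: one salient square in an otherwise empty frame.
sal = np.zeros((100, 100))
sal[10:40, 10:40] = 0.9

hit = on_touch(sal, 20, 20)    # finger on the salient square
miss = on_touch(sal, 80, 80)   # finger on empty background
```

A real implementation would run this test per displayed frame and hand the resulting command to an actuator driver (claim 2) or an audio-driven haptics path (claim 6).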
8. An apparatus comprising:
a touch screen configured to display a video that includes a video frame; and
a hardware processor communicatively coupled to the touch screen and configured to:
determine a region of interest within the video frame based on a saliency map of the video frame,
detect a touch on an area of the touch screen while the video frame is displayed, and
generate a haptic response in response to determining that the area of the touch screen overlaps the region of interest.
9. The apparatus of claim 8, further comprising at least one of an actuator or a speaker, wherein the haptic response is generated using the at least one of the actuator or the speaker.
10. The apparatus of claim 8, wherein the hardware processor is configured to: detect an object of interest within the video frame; and generate the saliency map such that the region of interest includes the object of interest.
11. The apparatus of claim 8, wherein the hardware processor is configured to vary an intensity or a frequency of the haptic response based on one or more of a size of an object within the region of interest, a curvature of the object, or a depth of the object within the video frame.
12. The apparatus of claim 8, wherein the hardware processor is configured to vary an intensity or a frequency of the haptic response based on a gradient of saliency in the saliency map.
13. The apparatus of claim 8, further comprising a speaker, wherein the hardware processor is configured to generate, using the speaker, an audio response in response to determining that the area of the touch screen overlaps the region of interest.
14. The apparatus of claim 8, wherein the haptic response is generated by communicating an electrical signal through the touch screen.
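Claims 5, 12, and 18 vary the intensity or frequency of the haptic response with the gradient of saliency. One way to sketch that mapping — the frequency range and the tanh shaping are illustrative choices, not from the patent:

```python
import numpy as np

def haptic_params(saliency: np.ndarray, x: int, y: int,
                  base_freq: float = 150.0, max_freq: float = 250.0):
    """Map saliency and its local gradient at touch point (x, y) to a
    haptic intensity and vibration frequency. The frequency range and
    tanh shaping are assumptions made for this sketch."""
    gy, gx = np.gradient(saliency)          # d/drow, d/dcol
    grad = float(np.hypot(gx[y, x], gy[y, x]))
    intensity = float(np.clip(saliency[y, x], 0.0, 1.0))
    # A steep saliency gradient (an object edge) raises the frequency,
    # so edges feel sharper under the finger than flat regions do.
    frequency = base_freq + (max_freq - base_freq) * float(np.tanh(10.0 * grad))
    return intensity, frequency

sal = np.zeros((100, 100))
sal[10:40, 10:40] = 0.9                     # sharp-edged salient object

inside = haptic_params(sal, 25, 25)         # flat interior of the object
edge = haptic_params(sal, 10, 25)           # left edge of the object
```

The tanh saturates the response, so an arbitrarily steep edge never pushes the frequency beyond `max_freq`.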
15. A system comprising:
a server configured to communicate a video that includes a video frame; and
a computing device comprising:
a touch screen configured to display the video from the server; and
a hardware processor communicatively coupled to the touch screen and configured to:
determine a region of interest within the video frame based on a saliency map of the video frame,
detect a touch on an area of the touch screen while the video frame is displayed, and
generate a haptic response in response to determining that the area of the touch screen overlaps the region of interest.
16. The system of claim 15, wherein the computing device includes at least one of an actuator or a speaker, and the haptic response is generated using the at least one of the actuator or the speaker.
17. The system of claim 15, wherein the hardware processor is configured to vary an intensity or a frequency of the haptic response based on one or more of a size of an object within the region of interest, a curvature of the object, or a depth of the object within the video frame.
18. The system of claim 15, wherein the hardware processor is configured to vary an intensity or a frequency of the haptic response based on a gradient of saliency in the saliency map.
19. The system of claim 15, wherein the computing device includes a speaker, and the hardware processor is configured to generate, using the speaker, an audio response in response to determining that the area of the touch screen overlaps the region of interest.
20. The system of claim 15, wherein the haptic response is generated by communicating an electrical signal through the touch screen.
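Claims 3 and 10 generate the saliency map so that the region of interest covers a detected object of interest. A minimal sketch, assuming an upstream detector supplies bounding boxes — the box format and the example values are invented:

```python
import numpy as np

def saliency_from_objects(shape, boxes):
    """Build a frame's saliency map so the region of interest covers
    every detected object of interest (claims 3 and 10). Boxes are
    (x, y, w, h) pixel rectangles; a real object detector would
    supply them."""
    sal = np.zeros(shape, dtype=float)
    for (x, y, w, h) in boxes:
        sal[y:y + h, x:x + w] = 1.0         # object pixels fully salient
    return sal

frame_shape = (240, 320)                    # rows, cols of the video frame
detections = [(40, 30, 80, 60), (200, 100, 50, 50)]   # hypothetical boxes
sal = saliency_from_objects(frame_shape, detections)
```

Feeding this map into the overlap test of claim 1 makes the haptic response fire exactly when a finger covers a detected object.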
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/125,353 (granted as US11604516B2) | 2020-12-17 | 2020-12-17 | Haptic content presentation and implementation |
US17/125,353 | 2020-12-17 |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2022096634A (ja) | 2022-06-29 |
JP7350831B2 (ja) | 2023-09-26 |
Family
ID=81991932
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2021203931A (Active; granted as JP7350831B2) | Haptic content presentation and implementation | 2020-12-17 | 2021-12-16 |
Country Status (3)
Country | Link |
---|---|
US (2) | US11604516B2 (ja) |
JP (1) | JP7350831B2 (ja) |
CN (1) | CN114647306A (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011077687A1 (ja) * | 2009-12-21 | 2011-06-30 | Kyocera Corporation | Tactile sensation providing device and control method for tactile sensation providing device |
US20140071117A1 (en) * | 2012-09-11 | 2014-03-13 | Dell Products Lp. | Method for Using the GPU to Create Haptic Friction Maps |
US20150280836A1 (en) * | 2014-03-31 | 2015-10-01 | Samsung Electronics Co., Ltd. | Method of sharing and receiving information based on sound signal and apparatus using the same |
US20170090571A1 (en) * | 2015-09-29 | 2017-03-30 | General Electric Company | System and method for displaying and interacting with ultrasound images via a touchscreen |
US20180322908A1 (en) * | 2017-05-02 | 2018-11-08 | Samsung Electronics Co., Ltd. | Method for giving dynamic effect to video and electronic device thereof |
JP2019133679A (ja) * | 2012-11-20 | 2019-08-08 | Immersion Corporation | Method and apparatus for providing haptic cues in cooperation with guides and electrostatic friction |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8573979B2 (en) | 2007-11-21 | 2013-11-05 | Intel-Ge Care Innovations Llc | Tactile display to allow sight impaired to feel visual information including color |
WO2009097866A1 (en) | 2008-02-04 | 2009-08-13 | Nokia Corporation | Device and method for providing tactile information |
EP2856282A4 (en) * | 2012-05-31 | 2015-12-02 | Nokia Technologies Oy | DISPLAY APPARATUS |
KR102091077B1 (ko) * | 2012-12-14 | 2020-04-14 | Samsung Electronics Co., Ltd. | Portable terminal and method for controlling feedback of an input unit, and the input unit and method providing the same |
US9372095B1 (en) | 2014-05-08 | 2016-06-21 | Google Inc. | Mobile robots moving on a visual display |
KR101554256B1 (ko) | 2015-02-16 | 2015-09-18 | Park Dong-hyun | Method for displaying characters for the visually impaired using haptic patterns, a touch screen applying the method, and a display device using the same |
US10795446B2 (en) | 2018-04-25 | 2020-10-06 | Seventh Sense OÜ | Portable electronic haptic vision device |
WO2020139091A1 (es) | 2018-12-28 | 2020-07-02 | Bustamante Solis Cesar Jose | Haptic device in a matrix arrangement |
US11216149B2 (en) * | 2019-03-15 | 2022-01-04 | Samsung Electronics Co., Ltd. | 360° video viewer control using smart device |
US20210216126A1 (en) * | 2020-01-13 | 2021-07-15 | Comcast Cable Communications, Llc | Methods and systems for battery management |
2020
- 2020-12-17: US application US17/125,353 filed (granted as US11604516B2, active)
2021
- 2021-12-15: CN application CN202111535262.4A filed (published as CN114647306A, pending)
- 2021-12-16: JP application JP2021203931A filed (granted as JP7350831B2, active)
2023
- 2023-02-10: US application US18/108,068 filed (published as US20230221804A1, pending)
Also Published As
Publication number | Publication date |
---|---|
US20220197384A1 (en) | 2022-06-23 |
US11604516B2 (en) | 2023-03-14 |
US20230221804A1 (en) | 2023-07-13 |
JP7350831B2 (ja) | 2023-09-26 |
CN114647306A (zh) | 2022-06-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8992232B2 (en) | Interactive and educational vision interfaces | |
JP7033218B2 (ja) | Displaying a physical input device as a virtual object | |
JP6670361B2 (ja) | User interface for a user to select audio objects to be rendered, and/or method of rendering a user interface for a user to select audio objects to be rendered | |
CN106796452A (zh) | Head-mounted display device controlled by tapping, control method therefor, and computer program for controlling the device | |
JP2011198249A (ja) | Image processing device, image processing method, and image processing program | |
US10748003B2 (en) | Mitigation of augmented reality markup blindness | |
US20200082627A1 (en) | Loading indicator in augmented reality environment | |
JP7048784B2 (ja) | Display control system, display control method, and program | |
US20210326094A1 (en) | Multi-device continuity for use with extended reality systems | |
US20230102820A1 (en) | Parallel renderers for electronic devices | |
JP7087367B2 (ja) | Information processing device, program, and control method | |
US10592048B2 (en) | Auto-aligner for virtual reality display | |
US20230221830A1 (en) | User interface modes for three-dimensional display | |
CN110968248B (zh) | Generating a 3D model of a fingertip for visual touch detection | |
JP2022096634A (ja) | Haptic content presentation and implementation | |
US9921651B2 (en) | Video display for visually impaired people | |
CN115087957A (zh) | Virtual scene | |
CN106775245B (zh) | Virtual-reality-based user attribute setting method and device | |
CN109847343B (zh) | Virtual reality interaction method and device, storage medium, and electronic device | |
JP2020080122A (ja) | Information processing device, information processing method, and storage medium | |
US11972088B2 (en) | Scene information access for electronic device applications | |
US11946744B2 (en) | Synchronization of a gyroscope in a virtual-reality environment | |
US20240220069A1 (en) | Scene information access for electronic device applications | |
US20230370578A1 (en) | Generating and Displaying Content based on Respective Positions of Individuals | |
Kapralos et al. | Serious games: Customizing the audio-visual interface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2022-02-17 | A621 | Written request for application examination | JAPANESE INTERMEDIATE CODE: A621 |
2023-01-16 | A977 | Report on retrieval | JAPANESE INTERMEDIATE CODE: A971007 |
2023-02-28 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131 |
2023-05-08 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523 |
| TRDD | Decision of grant or rejection written | |
2023-08-16 | A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01 |
2023-09-13 | A61 | First payment of annual fees (during grant procedure) | JAPANESE INTERMEDIATE CODE: A61 |
| R150 | Certificate of patent or registration of utility model | Ref document number: 7350831; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150 |