JP2021506052A - Communication methods and systems - Google Patents


Info

Publication number
JP2021506052A
Authority
JP
Japan
Prior art keywords
control unit
user
data
gesture
output module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2020550959A
Other languages
English (en)
Japanese (ja)
Other versions
JP2021506052A5 (en)
Inventor
Kornberg, Itai
Retzkin, Or
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Eyefree Assisting Communication Ltd
Original Assignee
Eyefree Assisting Communication Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eyefree Assisting Communication Ltd filed Critical Eyefree Assisting Communication Ltd
Publication of JP2021506052A
Publication of JP2021506052A5
Priority to JP2023188271A (JP7625673B2)
Legal status: Pending (current)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/163 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/369 Electroencephalography [EEG]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; User input means
    • A61B5/7405 Details of notification to user or communication with user or patient; User input means using sound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; User input means
    • A61B5/742 Details of notification to user or communication with user or patient; User input means using visual displays
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; User input means
    • A61B5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233 Character input methods
    • G06F3/0236 Character input methods using selection techniques to select from displayed items
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Physiology (AREA)
  • Artificial Intelligence (AREA)
  • Psychiatry (AREA)
  • Cardiology (AREA)
  • Psychology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Dermatology (AREA)
  • Pulmonology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Evolutionary Computation (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
JP2020550959A 2017-12-07 2018-12-06 Communication methods and systems Pending JP2021506052A (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023188271A JP7625673B2 (ja) 2017-12-07 2023-11-02 Communication methods and systems

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US201762595694P 2017-12-07 2017-12-07
US62/595,694 2017-12-07
US201862642048P 2018-03-13 2018-03-13
US62/642,048 2018-03-13
US201862755680P 2018-11-05 2018-11-05
US62/755,680 2018-11-05
PCT/IL2018/051335 WO2019111257A1 (en) 2017-12-07 2018-12-06 Communication methods and systems

Related Child Applications (1)

Application Number Title Priority Date Filing Date
JP2023188271A Division JP7625673B2 (ja) 2017-12-07 2023-11-02 Communication methods and systems

Publications (2)

Publication Number Publication Date
JP2021506052A (ja) 2021-02-18
JP2021506052A5 (en) 2022-01-11

Family

ID=64665588

Family Applications (2)

Application Number Title Priority Date Filing Date
JP2020550959A Pending JP2021506052A (ja) 2017-12-07 2018-12-06 Communication methods and systems
JP2023188271A Active JP7625673B2 (ja) 2017-12-07 2023-11-02 Communication methods and systems

Family Applications After (1)

Application Number Title Priority Date Filing Date
JP2023188271A Active JP7625673B2 (ja) 2017-12-07 2023-11-02 Communication methods and systems

Country Status (6)

Country Link
US (1) US11612342B2 (en)
EP (1) EP3721320B1 (en)
JP (2) JP2021506052A (en)
CN (1) CN111656304B (en)
IL (1) IL275071B2 (en)
WO (1) WO2019111257A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2025051735A (ja) * 2023-09-22 2025-04-04 SoftBank Group Corp. System

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11457860B2 (en) * 2018-07-09 2022-10-04 Cheng Qian Human-computer interactive device and method
US11966055B2 (en) * 2018-07-19 2024-04-23 Magic Leap, Inc. Content interaction driven by eye metrics
IL268575B2 (en) 2019-08-07 2023-02-01 Eyefree Assisting Communication Ltd System and method for patient monitoring
WO2021156753A1 (en) * 2020-02-04 2021-08-12 Ecole Polytechnique Federale De Lausanne (Epfl) Method and system for determining the intention of performing a voluntary action
IL285071B2 (en) * 2021-07-22 2023-05-01 Eyefree Assisting Communication Ltd A system and method for monitoring the cognitive state of a patient based on eye and brain activity
JP2024527788A (ja) 2021-07-22 2024-07-26 Eyefree Assisting Communication Ltd System for communicating with a patient based on a determined sedation state
WO2023076573A1 (en) * 2021-10-28 2023-05-04 Lifedrive Llc Drive manager for power wheelchair and related methods
US12032156B2 (en) * 2022-04-08 2024-07-09 Mirza Faizan Apparatus to enable differently abled users to communicate and a method thereof
WO2025030206A1 (en) * 2023-08-08 2025-02-13 Squid Eye Pty Ltd Remote eye gaze cursor control technology
US12340013B1 (en) * 2023-11-30 2025-06-24 OpenBCI, Inc. Graphical user interface for computer control through biometric input
CN117519487B (zh) * 2024-01-05 2024-03-22 Anhui Jianzhu University Vision-based motion-capture auxiliary training system for teaching roadheader operation

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10187334A (ja) * 1996-12-26 1998-07-14 Canon Inc Information input method, information input device, and storage medium for information input
WO2009093435A1 (ja) * 2008-01-25 2009-07-30 Panasonic Corporation Electroencephalogram interface system, electroencephalogram interface apparatus, method, and computer program
WO2016142933A1 (en) * 2015-03-10 2016-09-15 Eyefree Assisting Communication Ltd. System and method for enabling communication through eye feedback

Family Cites Families (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4973149A (en) 1987-08-19 1990-11-27 Center For Innovative Technology Eye movement detector
US4836670A (en) 1987-08-19 1989-06-06 Center For Innovative Technology Eye movement detector
US4950069A (en) 1988-11-04 1990-08-21 University Of Virginia Eye movement detector with improved calibration and speed
JPH0749744A (ja) 1993-08-04 1995-02-21 Pioneer Electron Corp Head-mounted display input device
US5912721A (en) 1996-03-13 1999-06-15 Kabushiki Kaisha Toshiba Gaze detection apparatus and its method as well as information display apparatus
US6351273B1 (en) 1997-04-30 2002-02-26 Jerome H. Lemelson System and methods for controlling automatic scrolling of information on a display or screen
JP2000137789A (ja) 1998-10-29 2000-05-16 Sharp Corp Image processing device
US6456262B1 (en) 2000-05-09 2002-09-24 Intel Corporation Microdisplay with eye gaze detection
US6943754B2 (en) 2002-09-27 2005-09-13 The Boeing Company Gaze tracking system, eye-tracking assembly and an associated method of calibration
JP3673834B2 (ja) 2003-08-18 2005-07-20 Yamaguchi University Gaze input communication method using eye movements
US20110077548A1 (en) * 2004-04-01 2011-03-31 Torch William C Biosensors, communicators, and controllers monitoring eye movement and methods for using them
CN102670163B (zh) 2004-04-01 2016-04-13 William C. Torch System and method for controlling a computing device
KR100594117B1 (ko) 2004-09-20 2006-06-28 Samsung Electronics Co., Ltd. Apparatus and method for key input using biosignals in an HMD information terminal
JP4654434B2 (ja) 2004-11-24 2011-03-23 Saga University Gaze direction identification system
EP1943583B1 (en) 2005-10-28 2019-04-10 Tobii AB Eye tracker with visual feedback
CN101336089A (zh) 2006-01-26 2008-12-31 Nokia Corporation Eye tracker device
WO2007113975A1 (ja) 2006-03-31 2007-10-11 National University Corporation Shizuoka University Gaze point detection device
JP4685708B2 (ja) 2006-05-22 2011-05-18 Fujitsu Limited Portable terminal device
US20100149073A1 (en) 2008-11-02 2010-06-17 David Chaum Near to Eye Display System and Appliance
US8788977B2 (en) 2008-11-20 2014-07-22 Amazon Technologies, Inc. Movement recognition as input mechanism
JP5456791B2 (ja) * 2009-01-26 2014-04-02 Tobii Technology AB System and method for determining a person's gaze point on an image of a spatial region
US10019634B2 (en) * 2010-06-04 2018-07-10 Masoud Vaziri Method and apparatus for an eye tracking wearable computer
CN101893934A (zh) 2010-06-25 2010-11-24 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Method and device for intelligently adjusting screen display
US8593375B2 (en) 2010-07-23 2013-11-26 Gregory A Maltz Eye gaze user interface and method
JP2012048358A (ja) 2010-08-25 2012-03-08 Sony Corp Browsing device, information processing method, and program
WO2012083415A1 (en) 2010-11-15 2012-06-28 Tandemlaunch Technologies Inc. System and method for interacting with and analyzing media on a display using eye gaze tracking
US8888287B2 (en) 2010-12-13 2014-11-18 Microsoft Corporation Human-computer interface system having a 3D gaze tracker
US8911087B2 (en) * 2011-05-20 2014-12-16 Eyefluence, Inc. Systems and methods for measuring reactions of head, eyes, eyelids and pupils
US20130022947A1 (en) * 2011-07-22 2013-01-24 Muniz Simas Fernando Moreira Method and system for generating behavioral studies of an individual
AU2011204946C1 (en) 2011-07-22 2012-07-26 Microsoft Technology Licensing, Llc Automatic text scrolling on a head-mounted display
KR101355895B1 (ko) 2011-10-20 2014-02-05 Kyungpook National University Industry-Academic Cooperation Foundation Apparatus and method for determining user intention
US8955973B2 (en) * 2012-01-06 2015-02-17 Google Inc. Method and system for input detection using structured light projection
US9171198B1 (en) 2012-04-02 2015-10-27 Google Inc. Image capture technique
KR101850035B1 (ko) * 2012-05-02 2018-04-20 LG Electronics Inc. Mobile terminal and control method thereof
US9823742B2 (en) 2012-05-18 2017-11-21 Microsoft Technology Licensing, Llc Interaction and management of devices using gaze detection
US9261961B2 (en) 2012-06-07 2016-02-16 Nook Digital, Llc Accessibility aids for users of electronic devices
TWI471808B (zh) * 2012-07-20 2015-02-01 Pixart Imaging Inc Pupil detection device
US9380287B2 (en) * 2012-09-03 2016-06-28 Sensomotoric Instruments Gesellschaft Fur Innovative Sensorik Mbh Head mounted system and method to compute and render a stream of digital images using a head mounted display
US9007301B1 (en) * 2012-10-11 2015-04-14 Google Inc. User interface
JP6195462B2 (ja) 2013-04-01 2017-09-13 Yazaki Corporation Fuse unit
KR102163996B1 (ко) 2013-04-03 2020-10-13 Samsung Electronics Co., Ltd. Method and apparatus for improving performance of a contactless recognition function of a user device
KR102181897B1 (ко) 2013-05-09 2020-11-23 SK Planet Co., Ltd. Mobile tooltip method and apparatus using eye tracking
US20140368442A1 (en) 2013-06-13 2014-12-18 Nokia Corporation Apparatus and associated methods for touch user input
US20140375541A1 (en) * 2013-06-25 2014-12-25 David Nister Eye tracking via depth camera
CN109875501B (zh) * 2013-09-25 2022-06-07 MindMaze Group SA Physiological parameter measurement and feedback system
CN104679226B (zh) * 2013-11-29 2019-06-25 Shanghai Siemens Medical Equipment Ltd. Contactless medical control system, method, and medical device
KR101549645B1 (ko) * 2014-01-28 2015-09-03 Yeungnam University Industry-Academic Cooperation Foundation Facial expression recognition method and apparatus using an expression action dictionary
CN107655461B (zh) * 2014-05-05 2020-07-24 Hexagon Technology Center GmbH Measuring subsystem and measuring system
US9411417B2 (en) * 2014-07-07 2016-08-09 Logitech Europe S.A. Eye gaze tracking system and method
US20170115742A1 (en) * 2015-08-01 2017-04-27 Zhou Tian Xing Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command
CA2996039A1 (en) * 2015-08-21 2017-03-02 Magic Leap, Inc. Eyelid shape estimation using eye pose measurement
WO2017216118A1 (en) 2016-06-13 2017-12-21 SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH Method and eye tracking system for performing a calibration procedure for calibrating an eye tracking device

Also Published As

Publication number Publication date
US20210259601A1 (en) 2021-08-26
WO2019111257A1 (en) 2019-06-13
IL275071B2 (en) 2024-06-01
JP7625673B2 (ja) 2025-02-03
CN111656304B (zh) 2023-10-17
IL275071B1 (en) 2024-02-01
EP3721320B1 (en) 2022-02-23
US11612342B2 (en) 2023-03-28
CN111656304A (zh) 2020-09-11
EP3721320A1 (en) 2020-10-14
IL275071A (en) 2020-07-30
JP2024012497A (ja) 2024-01-30

Similar Documents

Publication Publication Date Title
JP7625673B2 (ja) Communication methods and systems
US11977682B2 Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio
US11284844B2 Electromyography (EMG) assistive communications device with context-sensitive user interface
CN111542800B (zh) Brain-computer interface with adaptation for high-speed, accurate, and intuitive user interaction
JP2021511567A (ja) Brain-computer interface with adaptation for fast, accurate, and intuitive user interaction
JP7701063B2 (ja) Systems and methods for patient monitoring
JP2004527815A (ja) Method and system for initiating activity based on sensed electrophysiological data
US12455623B2 (en) Multiple switching electromyography (EMG) assistive communications device
Pomboza-Junez et al. Toward the gestural interface: comparative analysis between touch user interfaces versus gesture-based user interfaces on mobile devices
Rania et al. EOG Based Text and Voice Controlled Remote Interpreter for Quadriplegic Patients
Perrin et al. A comparative psychophysical and EEG study of different feedback modalities for HRI
Guerreiro Assistive Technologies for Spinal Cord Injured Individuals A Survey
Guerreiro Myographic Mobile Accessibility for Tetraplegics
Strzelecki System Sensor: a Practical Human-Computer Interface for People with Disabilities

Legal Events

Date Code Title Description
A529 Written submission of copy of amendment under Article 34 PCT

Free format text: JAPANESE INTERMEDIATE CODE: A529

Effective date: 20200804

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20211203

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20211203

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20221130

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20221220

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20230320

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20230704