JP2003143683A - Command entry device - Google Patents

Command entry device

Info

Publication number
JP2003143683A
Authority
JP
Japan
Prior art keywords
command
vibration
input device
command input
pulse
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2001335470A
Other languages
Japanese (ja)
Other versions
JP4037086B2 (en)
Inventor
Masaaki Fukumoto
Toshiaki Sugimura
利明 杉村
雅朗 福本
Original Assignee
Ntt Docomo Inc
株式会社エヌ・ティ・ティ・ドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NTT Docomo, Inc. (株式会社エヌ・ティ・ティ・ドコモ)
Priority to JP2001335470A
Publication of JP2003143683A
Application granted
Publication of JP4037086B2
Application status: Expired - Fee Related
Anticipated expiration

Abstract

PROBLEM TO BE SOLVED: To provide a command entry device with excellent operability and ease of use. SOLUTION: An acceleration sensor or the like is incorporated in a command entry device 1 such as an earphone microphone, and senses vibration when the user taps the device body, or a body part near the wearing position, with a finger 2A or the like. Repeated tapping produces a series of vibrations, and the corresponding pulses are converted into a simple command such as on-hook or off-hook. The user can thus enter a command without having to locate a small button on the microphone body.

Description

Description: BACKGROUND OF THE INVENTION

[0001] 1. Field of the Invention

[0002] The present invention relates to a command input device, and more particularly to a command input device that is worn on a user's body and used with an apparatus that performs an operation corresponding to an input command.

2. Description of the Related Art
Portable devices such as cellular phones and PHS (Personal Handyphone System) terminals have spread rapidly in recent years. The housing of this type of portable device is often provided with a jack terminal for attaching an earphone microphone. By attaching an earphone microphone to this jack terminal, a call can be made without holding the portable device itself, with its built-in receiver and microphone, as a handset.

[0003] There are various types of earphone microphones, and the ear-canal insertion type is the most widely used. An earphone microphone of the ear-canal insertion type consists of an insertion portion to be inserted into the ear canal, a plug to be inserted into the jack terminal provided in the housing of the portable device, and a cable for carrying signals between the insertion portion and the plug. A small loudspeaker at the tip of the insertion portion provides the receiver function of an earphone, and a microphone provided in a part of the insertion portion, for example near the connection with the cable, provides the transmitter function.

[0004] For command input to an ear-canal insertion type earphone microphone, the usual method is to press a button provided on the housing with a fingertip. With this method, however, it is difficult to find and press the small button once the earphone is in place, so operability is poor. Moreover, because a certain amount of force is needed to press the button, pressing it shifts the earphone from its initial wearing position, which then has to be corrected, so the device is inconvenient to use.

SUMMARY OF THE INVENTION
The present invention has been made to overcome these drawbacks of the prior art, and its object is to provide a command input device with good operability and ease of use. A command input device according to a first aspect of the present invention is a command input device used with an apparatus that performs an operation corresponding to an input command, comprising vibration detection means for detecting a vibration generated by the user of the device and conversion means for converting the vibration detected by the vibration detection means into the command, the command input device being worn on the body of the user. According to a second aspect, in the first aspect the conversion means includes a filter for extracting a specific frequency component contained in the vibration. According to a third aspect, in the second aspect the conversion means converts a pulse train based on the specific frequency component extracted by the filter into a binary code according to the length of the time interval between pulses. According to a fourth aspect, in any one of the first to third aspects, the vibration is generated by an impact applied directly to the housing of the device. According to a fifth aspect, in any one of the first to third aspects, the vibration is generated by an impact applied near a mounting portion of the device. According to a sixth aspect, in any one of the first to fifth aspects, at least a part of the device is worn near the ear canal. According to a seventh aspect, in any one of the first to fifth aspects, at least a part of the device is worn near the wrist. In short, in the present invention, a command is input by tapping the device housing (any portion will do) or by tapping a body part near where the housing is worn.
Because there is no need to search for a small button, operability is good, and because the device is not pushed out of position by a pressing operation, usability is also good. Quick information input on a portable device can therefore be realized.

DESCRIPTION OF THE PREFERRED EMBODIMENTS
Embodiments of the present invention will now be described with reference to the drawings. In the drawings referred to below, parts that also appear in other drawings are denoted by the same reference numerals.

FIG. 1 shows an embodiment of a command input device according to the present invention, in which the command input device is an earphone microphone for a mobile phone. As shown in FIG. 1, the command input device 1 of this embodiment comprises an earphone 11 that is used inserted into the ear canal of the user's face 20, a microphone 12, and a cable 100 running to a plug inserted into a jack terminal (not shown) of the mobile phone (TEL) 10. In the command input device configured in this way, command input is realized, as described later, by tapping the housing of the device 1.

FIG. 2 is a block diagram showing an example of the internal configuration of the command input device. In addition to the earphone (EARP) 11 and microphone (MIC) 12 described above, the command input device of this embodiment includes an acceleration sensor (SN) 13 for detecting vibrations generated by the user, a sensor amplifier (SA) 14 for amplifying the detection output of the acceleration sensor, a bandpass filter (BPF) 15 for extracting a specific frequency component from the amplified detection output, a comparator (CMP) 16 for converting the output of the bandpass filter 15 into pulses by comparison with a predetermined level, a timer (TIM) 17, a command table (CCB) 18 in which the commands to be sent to the mobile phone are stored, and a command generation unit (CAM) 19 that converts the pulses output from the comparator 16 into the corresponding command by referring to the output of the timer 17 and the contents of the command table 18. These components are provided inside the device housing 1a.

A command input operation using a command input device of the above configuration will now be described. When the housing 1a of the earphone microphone is lightly tapped with the user's finger 2A, the impact acceleration is detected by the acceleration sensor 13. The detected signal is amplified by the sensor amplifier 14, and the bandpass filter 15 selects only the frequency component characteristic of a finger tap; extracting and using components around 80 to 100 Hz eliminates noise from operations other than finger tapping. The comparator 16 then converts the filtered signal into one pulse per tapping operation, and the resulting timing pulse train TYP, from which a binary code of "0"s and "1"s will be formed, is sent to the command generation unit 19. The command generation unit 19 determines the timing of the received pulse train by referring to the pulses output from the timer 17, and determines the command CCM from the timing-to-command relationships stored in the command table 18. The determined command CCM is transmitted to the mobile phone 10.
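As a rough sketch of this detection chain (sensor amplifier, 80 to 100 Hz bandpass filter, comparator), the following Python fragment band-passes an accelerometer trace and thresholds it into one pulse time per tap. It is illustrative only: the sampling rate, threshold level, refractory interval, function name, and the use of SciPy's filter design are assumptions for demonstration and are not specified in the patent.

import numpy as np
from scipy.signal import butter, filtfilt

def tap_pulse_times(accel, fs, band=(80.0, 100.0), threshold=0.5, dead_time=0.05):
    """Return estimated tap (pulse) times in seconds from a raw accelerometer trace.

    accel     : 1-D array of acceleration samples
    fs        : sampling rate in Hz (assumed value, e.g. 1000 Hz)
    band      : pass band in Hz; roughly 80-100 Hz for finger taps per the embodiment
    threshold : comparator level applied to the filtered signal (assumed)
    dead_time : refractory period so one tap yields a single pulse (assumed)
    """
    b, a = butter(2, band, btype="bandpass", fs=fs)   # bandpass filter (BPF 15)
    filtered = filtfilt(b, a, accel)                  # amplified, filtered detection output
    above = np.abs(filtered) > threshold              # comparator (CMP 16)
    times = []
    last = -np.inf
    for i in np.flatnonzero(above):
        t = i / fs
        if t - last >= dead_time:                     # one pulse per tapping action
            times.append(t)
            last = t
    return times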
Next, an example of how a command is generated from the timing pulse train TYP will be described with reference to FIG. 3. One pulse of the timing pulse train TYP is generated for each tapping operation, and its pulse width M is constant. Times T1 and T2 are time constants used for identification, and in this example T1 < T2; these lengths of time are measured from the output data of the timer 17. When a tapping action produces a pulse, the time until the next pulse is generated, that is, the time interval between the pulses, is measured from the rise of the pulse. If the next pulse occurs within time T1 of the rise of the first pulse, the interval is regarded as "0", the reference time is reset to 0, and the device waits for the next pulse. If the next pulse occurs after time T1 has elapsed but within time T2, the interval is regarded as "1", the reference time is reset to 0, and the device again waits for the next pulse. If no pulse occurs within time T2, analysis of the pulse train is terminated. In this way a code string of "0"s and "1"s is formed from the pulse generation intervals, much like Morse code; unlike Morse code, however, the final symbol of a code string generated by this method is always "1".

Referring to FIG. 3A, no second pulse P2 occurs within time T2 of the first pulse P1, so the train is converted into the final code "1". Referring to FIG. 3B, the second pulse P2 occurs within time T1 of the first pulse P1 and is regarded as "0"; since no further pulse occurs within time T2, the final "1" is appended, giving "01". Referring to FIG. 3C, the second pulse P2 occurs after time T1 has elapsed but within time T2 of the first pulse P1 and is regarded as "1"; since no further pulse occurs within time T2, the final "1" is appended, giving "11". Similarly, referring to FIG. 3D, the interval from the first pulse P1 to the second pulse P2 is within T1, the interval from P2 to the third pulse P3 falls after T1 but within T2, the interval from P3 to the fourth pulse P4 is within T1, the interval from P4 to the fifth pulse P5 falls after T1 but within T2, and no further pulse occurs within time T2 of P5. The final "1" is therefore appended and the train is converted into "01011".
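The interval-to-code rule just described can be captured in a few lines. The Python sketch below assumes concrete values for T1 and T2 and a list of pulse rise times purely for illustration; only the classification rule itself comes from the description.

def encode_pulse_train(pulse_times, t1, t2):
    """Convert pulse rise times (seconds) into a '0'/'1' code string.

    interval < t1                        -> "0"
    t1 <= interval < t2                  -> "1"
    interval >= t2 (or no further pulse) -> analysis ends; a final "1" is appended
    """
    code = ""
    for prev, cur in zip(pulse_times, pulse_times[1:]):
        interval = cur - prev
        if interval < t1:
            code += "0"
        elif interval < t2:
            code += "1"
        else:
            break  # a gap of T2 or more terminates the pulse-train analysis
    return code + "1"  # the final symbol is always "1"

# Reproducing the FIG. 3D example: intervals 0/1/0/1 plus the final "1" -> "01011".
# T1 = 0.3 s and T2 = 0.8 s are assumed values chosen only for illustration.
taps = [0.0, 0.2, 0.8, 1.0, 1.5]
print(encode_pulse_train(taps, t1=0.3, t2=0.8))  # prints "01011"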
The keying command obtained in this way, that is, the code string of "0"s and "1"s, is converted into the transmission command CCM by referring to the command table 18. When the transmission command CCM is sent to the mobile phone 10, a predetermined operation is performed in the mobile phone 10. An example of the command assignments is shown in FIG. 4. To avoid malfunctions caused by ambient noise or ordinary movements, an operation command needs a certain length; for inputs whose context of use is clear, however, such as dialing digits, operability can be improved by using short commands. In this embodiment, the command "1101" is assigned to off-hook and the command "0101" to on-hook. Accordingly, if the device housing or the like is tapped four times so that the three pulse intervals are, in order, within time T2 after T1 has elapsed, within time T2 after T1 has elapsed, and within time T1, the final "1" is appended and the code string "1101" is generated. When this code string "1101" is sent to the mobile phone, the mobile phone goes off-hook. Similarly, if the device housing or the like is tapped four times so that the three pulse intervals are, in order, within time T1, within time T2 after T1 has elapsed, and within time T1, the final "1" is appended and the code string "0101" is generated. When this code string "0101" is sent to the mobile phone, the mobile phone goes on-hook.
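In the same illustrative vein, the command table 18 can be modelled as a simple lookup. Only the two assignments named in the embodiment are included; the constant and function names are assumptions.

COMMAND_TABLE = {       # command table 18: code string -> transmission command CCM
    "1101": "OFF_HOOK",
    "0101": "ON_HOOK",
}

def to_command(code_string):
    """Return the transmission command CCM for a keyed code string, or None if unassigned."""
    return COMMAND_TABLE.get(code_string)

# e.g. to_command("0101") -> "ON_HOOK"; an unassigned code string returns None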
In this example the pulse width M is constant, but more elaborate commands could be generated by detecting the strength of each tap and assigning different pulse widths accordingly.

A method of generating the keying command will now be described with reference to FIG. 5, which is a flowchart of the process of generating a command from a pulse train. The apparatus waits until a pulse is produced by an input operation such as tapping the housing (step S501). If the next pulse arrives within time T1 of the preceding pulse, it is regarded as "0" (steps S502 → S503). If, on the other hand, the next pulse arrives after time T1 has elapsed but within time T2, it is regarded as "1" (steps S502 → S504 → S505). Repeating this process replaces each inter-pulse interval with a code of "0" or "1". If no further pulse arrives even after time T2 has elapsed since the preceding pulse, a "1" is appended to the end and the pulse code is finalized (steps S504 → S506).
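The flowchart logic lends itself to a small stateful decoder. The sketch below is an assumed illustration of steps S501 to S506: the class and method names, the polling structure, and the default T1/T2 values are not part of the patent.

class KeyingDecoder:
    """Stateful decoder following the FIG. 5 flow: wait for a pulse (S501),
    classify each inter-pulse interval as "0" (within T1, S503) or "1"
    (after T1 but within T2, S505), and once T2 passes with no new pulse,
    append the final "1" and emit the finished code (S506)."""

    def __init__(self, t1=0.3, t2=0.8):   # T1/T2 values are illustrative
        self.t1, self.t2 = t1, t2
        self.last_pulse = None            # None = waiting state (S501)
        self.code = ""

    def on_pulse(self, now):
        """Call with the current time (seconds) whenever the comparator emits a pulse."""
        if self.last_pulse is not None:
            interval = now - self.last_pulse
            if interval < self.t1:
                self.code += "0"          # S502 -> S503
            elif interval < self.t2:
                self.code += "1"          # S502 -> S504 -> S505
            else:
                self.code = ""            # stale train; poll() should already have emitted it
        self.last_pulse = now             # the new pulse becomes the timing reference

    def poll(self, now):
        """Call periodically; returns a finished code string once T2 has elapsed, else None."""
        if self.last_pulse is not None and now - self.last_pulse >= self.t2:
            finished, self.code, self.last_pulse = self.code + "1", "", None   # S504 -> S506
            return finished
        return None

# Usage idea: call on_pulse(t) from the comparator output and poll(t) from a timer tick.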
By configuring the command input device in this way, the telephone can be controlled by tapping any part of the device housing, and there is no longer any need to search for and press a small button on the housing as in the prior art. The earphone therefore does not shift from its initial wearing position, and usability is good.

In the embodiment described above, the housing 1a is tapped directly with a fingertip, but input can also be performed by lightly tapping, with a fingertip, the cheekbone 2B, the bone behind the ear, the temple, the nose, the chin, or another body part near where the earphone microphone is worn. In this case too, noise can be eliminated by appropriately selecting the constants of the bandpass filter 15 for each input method; in many cases the use of components around 80 to 100 Hz is effective.

In this embodiment an ear-canal type earphone microphone is used, but the present invention can clearly be applied widely to devices worn on the user's body, for example a type worn over the auricle or a type held on the head or neck with a band, such as headphones. Furthermore, in addition to input with a fingertip, the acceleration sensor 13 may detect impact accelerations produced by a repetitive action such as clenching the teeth (action 2C). In this case, the vibration produced by clenching the teeth is transmitted through the jaw and the skull and detected by the acceleration sensor 13, and detection can be performed efficiently by setting the pass band of the bandpass filter 15 to around 600 to 650 Hz.
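Since the embodiment pairs each input method with its own pass band, the filter constants can be thought of as a small configuration table. The band values below come from the text; the names and structure are assumptions for illustration.

PASS_BANDS_HZ = {
    "finger_tap": (80.0, 100.0),     # tapping the housing or nearby skin
    "teeth_clench": (600.0, 650.0),  # vibration conducted through the jaw and skull
}

def filter_band(input_method):
    """Return the assumed bandpass constants (low_hz, high_hz) for an input method."""
    return PASS_BANDS_HZ[input_method]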
With such a configuration, a hands-free input operation can be realized. Besides operating a mobile phone or a voice recorder through an earphone microphone, a combination with, for example, a glasses-type information terminal is also possible; in that case as well, the vibration produced by an impact applied directly to the housing of the device, or near its mounting position (near the ear canal or the temple), may be detected by the acceleration sensor and converted into a command. A combination with a wristwatch-type or ring-type information terminal is also conceivable; here too, the acceleration sensor detects the vibration caused by an impact applied directly to the device housing or near its mounting position (near the wrist or arm), and the vibration is converted into a command. It is clear that the present invention can be widely applied to command input devices used with other apparatuses.

As described above, the present invention detects a vibration generated by the user, converts the detected vibration into a command, and outputs the command, with the effect that a command input device with good operability and ease of use can be realized.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram showing one embodiment of a usage example of a command input device according to the present invention.
FIG. 2 is a block diagram illustrating a configuration example of a command input device according to the present invention.
FIG. 3 is a diagram illustrating an example of a process of generating a command from a pulse train.
FIG. 4 is a diagram showing the correspondence between pulse trains and commands.
FIG. 5 is a flowchart illustrating a process of generating a command from a pulse train according to the present invention.
[Description of Reference Numerals] 1 command input device; 1a housing; 10 mobile phone; 11 earphone; 12 microphone; 13 acceleration sensor; 14 sensor amplifier; 15 bandpass filter; 16 comparator; 17 timer; 18 command table; 19 command generation unit; 100 cable

Claims (1)

1. A command input device for use with an apparatus that performs an operation corresponding to an input command, comprising: vibration detecting means for detecting a vibration generated by a user of the command input device; and converting means for converting the vibration detected by the vibration detecting means into the command, wherein the command input device is worn on the body of the user.
2. The command input device according to claim 1, wherein the converting means includes a filter for extracting a specific frequency component contained in the vibration.
3. The command input device according to claim 2, wherein the converting means converts a pulse train based on the specific frequency component extracted by the filter into a binary code according to the length of the time interval between pulses.
4. The command input device according to any one of claims 1 to 3, wherein the vibration is generated by an impact applied directly to a housing of the command input device.
5. The command input device according to any one of claims 1 to 3, wherein the vibration is generated by an impact applied near a mounting portion of the command input device.
6. The command input device according to any one of claims 1 to 5, wherein at least a part of the command input device is worn near an ear canal.
7. The command input device according to any one of claims 1 to 5, wherein at least a part of the command input device is worn near a wrist.
JP2001335470A 2001-10-31 2001-10-31 Command input device Expired - Fee Related JP4037086B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2001335470A JP4037086B2 (en) 2001-10-31 2001-10-31 Command input device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2001335470A JP4037086B2 (en) 2001-10-31 2001-10-31 Command input device

Publications (2)

Publication Number Publication Date
JP2003143683A (en) 2003-05-16
JP4037086B2 JP4037086B2 (en) 2008-01-23

Family

ID=19150455

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2001335470A Expired - Fee Related JP4037086B2 (en) 2001-10-31 2001-10-31 Command input device

Country Status (1)

Country Link
JP (1) JP4037086B2 (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006137400A1 (en) * 2005-06-21 2006-12-28 Japan Science And Technology Agency Mixing device, method, and program
JP2007019787A (en) * 2005-07-07 2007-01-25 Yamaha Corp Input device, helmet system with input function, and vehicle system provided with the same
EP1940195A2 (en) 2006-12-27 2008-07-02 Sony Corporation Sound outputting apparatus, sound outputting method, sound output processing program and sound outputting system
JP2010187218A (en) * 2009-02-12 2010-08-26 Sony Corp Control device, method and program
JP2010213099A (en) * 2009-03-11 2010-09-24 Sony Ericsson Mobile Communications Ab Apparatus and method for processing sound signal
JP2010258623A (en) * 2009-04-22 2010-11-11 Yamaha Corp Operation detecting apparatus
JP2011524656A (en) * 2008-04-30 2011-09-01 ディーピー テクノロジーズ インコーポレイテッド Improved headset
EP2375775A1 (en) * 2010-04-07 2011-10-12 Sony Corporation Audio signal processing apparatus, audio signal processing method, and program
JP2012194901A (en) * 2011-03-17 2012-10-11 Sharp Corp Electronic apparatus, control method of electronic apparatus, control program, and recording medium
CN103873644A (en) * 2012-12-10 2014-06-18 联想(北京)有限公司 Data processing method and terminal
US8872646B2 (en) 2008-10-08 2014-10-28 Dp Technologies, Inc. Method and system for waking up a device due to motion
US8876738B1 (en) 2007-04-04 2014-11-04 Dp Technologies, Inc. Human activity monitoring device
US8902154B1 (en) 2006-07-11 2014-12-02 Dp Technologies, Inc. Method and apparatus for utilizing motion user interface
US8922808B2 (en) 2008-01-30 2014-12-30 Seiko Epson Corporation Electronic device that receives a wait-for-impact-detection command
US8949070B1 (en) 2007-02-08 2015-02-03 Dp Technologies, Inc. Human activity monitoring device with activity identification
WO2015041206A1 (en) * 2013-09-17 2015-03-26 株式会社ジーデバイス Automatic appliance-activating device, handheld information appliance, handheld-information-appliance system, and program
US9390229B1 (en) 2006-04-26 2016-07-12 Dp Technologies, Inc. Method and apparatus for a health phone
US9529437B2 (en) 2009-05-26 2016-12-27 Dp Technologies, Inc. Method and apparatus for a motion state aware device
US9940161B1 (en) 2007-07-27 2018-04-10 Dp Technologies, Inc. Optimizing preemptive operating system with motion sensing
WO2018167901A1 (en) * 2017-03-16 2018-09-20 ヤマハ株式会社 Headphones

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8023659B2 (en) 2005-06-21 2011-09-20 Japan Science And Technology Agency Mixing system, method and program
WO2006137400A1 (en) * 2005-06-21 2006-12-28 Japan Science And Technology Agency Mixing device, method, and program
JP2007019787A (en) * 2005-07-07 2007-01-25 Yamaha Corp Input device, helmet system with input function, and vehicle system provided with the same
US9390229B1 (en) 2006-04-26 2016-07-12 Dp Technologies, Inc. Method and apparatus for a health phone
US8902154B1 (en) 2006-07-11 2014-12-02 Dp Technologies, Inc. Method and apparatus for utilizing motion user interface
US9495015B1 (en) 2006-07-11 2016-11-15 Dp Technologies, Inc. Method and apparatus for utilizing motion user interface to determine command availability
JP2008166897A (en) * 2006-12-27 2008-07-17 Sony Corp Sound outputting apparatus, sound outputting method, sound output processing program and sound outputting system
US8204241B2 (en) 2006-12-27 2012-06-19 Sony Corporation Sound outputting apparatus, sound outputting method, sound output processing program and sound outputting system
EP1940195A2 (en) 2006-12-27 2008-07-02 Sony Corporation Sound outputting apparatus, sound outputting method, sound output processing program and sound outputting system
US8949070B1 (en) 2007-02-08 2015-02-03 Dp Technologies, Inc. Human activity monitoring device with activity identification
US8876738B1 (en) 2007-04-04 2014-11-04 Dp Technologies, Inc. Human activity monitoring device
US9940161B1 (en) 2007-07-27 2018-04-10 Dp Technologies, Inc. Optimizing preemptive operating system with motion sensing
US8922808B2 (en) 2008-01-30 2014-12-30 Seiko Epson Corporation Electronic device that receives a wait-for-impact-detection command
JP2011524656A (en) * 2008-04-30 2011-09-01 ディーピー テクノロジーズ インコーポレイテッド Improved headset
US8872646B2 (en) 2008-10-08 2014-10-28 Dp Technologies, Inc. Method and system for waking up a device due to motion
JP2010187218A (en) * 2009-02-12 2010-08-26 Sony Corp Control device, method and program
JP2010213099A (en) * 2009-03-11 2010-09-24 Sony Ericsson Mobile Communications Ab Apparatus and method for processing sound signal
JP2010258623A (en) * 2009-04-22 2010-11-11 Yamaha Corp Operation detecting apparatus
US9529437B2 (en) 2009-05-26 2016-12-27 Dp Technologies, Inc. Method and apparatus for a motion state aware device
US8634565B2 (en) 2010-04-07 2014-01-21 Sony Corporation Audio signal processing apparatus, audio signal processing method, and program
EP2375775A1 (en) * 2010-04-07 2011-10-12 Sony Corporation Audio signal processing apparatus, audio signal processing method, and program
US9479883B2 (en) 2010-04-07 2016-10-25 Sony Corporation Audio signal processing apparatus, audio signal processing method, and program
JP2012194901A (en) * 2011-03-17 2012-10-11 Sharp Corp Electronic apparatus, control method of electronic apparatus, control program, and recording medium
CN103873644A (en) * 2012-12-10 2014-06-18 联想(北京)有限公司 Data processing method and terminal
WO2015041206A1 (en) * 2013-09-17 2015-03-26 株式会社ジーデバイス Automatic appliance-activating device, handheld information appliance, handheld-information-appliance system, and program
WO2018167901A1 (en) * 2017-03-16 2018-09-20 ヤマハ株式会社 Headphones

Also Published As

Publication number Publication date
JP4037086B2 (en) 2008-01-23

Similar Documents

Publication Publication Date Title
US8706482B2 (en) Voice coder with multiple-microphone system and strategic microphone placement to deter obstruction for a digital communication device
US6104808A (en) Portable communication device with speakerphone operation
EP0749656B1 (en) Portable communication device
US7953454B2 (en) Wireless hands-free system with silent user signaling
US7215790B2 (en) Voice transmission apparatus with UWB
CN1197422C (en) Sound close detection for mobile terminal and other equipment
CN101110875B (en) Apparatus for preventing loss of portable telephone using a bluetooth communication protocol and control method thereof
US8111842B2 (en) Filter adaptation based on volume setting for certification enhancement in a handheld wireless communications device
DE60034551T2 (en) Communication protocol between a communication device and an external attachment
JP3508776B2 (en) Transceiver
US5790684A (en) Transmitting/receiving apparatus for use in telecommunications
ES2343323T3 (en) Wireless headphones for use in a voice recognition environment.
EP1770976A1 (en) Communication method for a mobile telephone and telecom network
JP3154725B2 (en) Method for transmitting voice information
US7117021B2 (en) Bluetooth cassette-like device for a hands-free cell-phone kit
US5280524A (en) Bone conductive ear microphone and method
US5295193A (en) Device for picking up bone-conducted sound in external auditory meatus and communication device using the same
US4334315A (en) Wireless transmitting and receiving systems including ear microphones
JP5978286B2 (en) Communication device
US6574345B1 (en) Structure of a wearable and hands free earphone
TWI323118B (en) Wireless telephone with uni-directional and omni-directional microphones
US20060120546A1 (en) Ear fixed type conversation device
JP2011023848A (en) Headset
TW401667B (en) Incoming calling system
TW462200B (en) Bone conduction voice transmission apparatus and system

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20040928

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20060613

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20060704

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20060901

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20071016

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20071031

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20101109

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20111109

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20121109

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20131109

Year of fee payment: 6

LAPS Cancellation because of no payment of annual fees