CN101576788A - Multi-point touch positioning method - Google Patents

Multi-point touch positioning method

Info

Publication number
CN101576788A
Authority
CN
China
Prior art keywords
operating point
point
screen
camera
analysis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2009100400111A
Other languages
Chinese (zh)
Other versions
CN101576788B (en)
Inventor
周虎 (Zhou Hu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vtron Group Co Ltd
Original Assignee
Vtron Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vtron Technologies Ltd filed Critical Vtron Technologies Ltd
Priority to CN2009100400111A priority Critical patent/CN101576788B/en
Publication of CN101576788A publication Critical patent/CN101576788A/en
Application granted granted Critical
Publication of CN101576788B publication Critical patent/CN101576788B/en
Expired - Fee Related
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides a multi-point touch positioning method that combines intelligent camera analysis with infrared positioning to locate multi-touch operations on a screen. In the device hardware layer, the camera and the infrared device acquire image data at the same time; the main control unit integrates the data and the analysis results to perform, in turn, operating-point coordinate pairing, operating-point grouping management, and operating-point information encoding, and then sends the result to the driver software layer for verification, error correction, and decoding; finally, the application software layer receives the information and draws a menu at the corresponding position. By combining camera monitoring with infrared positioning, the method optimizes the intelligent features of the system to the greatest extent, locates and judges the operating points on a large screen accurately and quickly without affecting the visual effect of the screen, and can be widely applied in many settings. Moreover, the system has excellent extensibility and can provide the hardware foundation needed for user-friendly operation of application software.

Description

Multi-point touch positioning method
Technical field
The present invention relates to touch-screen positioning technology, and in particular to a multi-point touch positioning method.
Background technology
At present, touch screens are increasingly widely used as a new type of interactive display device. In the touch-screen industry, large screens usually use infrared positioning technology, but this technology faces several technical difficulties that it cannot overcome in terms of multi-point operation and intelligence. For example, when infrared positioning detects two operating points on the screen, there are two X-axis coordinates and two Y-axis coordinates; how to correctly pair these coordinates into two operating points is one of the difficulties of making infrared positioning intelligent, and the problem becomes more pronounced as the number of operating points grows. A second difficulty arises with passive pens: if several users each hold a pen of a different color and write on the screen, the system cannot tell which user is holding a pen of which color, especially in adversarial scenarios such as an electronic sand table. Likewise, when two or more operating points move on the screen at the same time, infrared positioning alone cannot tell which user is performing which operation; this is another difficulty in making infrared positioning intelligent. Capacitive-film touch devices have solved the multi-touch problem, but the screen sizes they can achieve are limited; by optimistic estimates they may handle multi-touch on small screens of around 12 inches within two years, but they cannot solve the problem for large screens in the short term.
A currently preferred scheme is to use cameras for large-screen positioning. This scheme comes in two forms, rear positioning and front positioning, but both still have many shortcomings: rear positioning lags behind in terms of intelligence and is limited on liquid-crystal screens, plasma screens, and the like, while the projection technique used by front positioning locates touches from the shadowed corner that appears at the operating position on the screen, which degrades the user's viewing experience.
Therefore, increasing the degree of intelligence of interactive devices while strengthening their intelligent recognition of users has become a pressing task in embedded system development.
Summary of the invention
The object of the present invention is to overcome the above deficiencies of the prior art and to provide a multi-point touch positioning method that achieves accurate positioning of multiple touch points, improves the intelligence of the device, and makes it easier for users to operate application software.
The present invention is achieved through the following technical solution: a multi-point touch positioning method in which intelligent camera analysis is combined with infrared positioning to locate multi-touch operations on the screen, specifically comprising the following steps:
(1) In the device hardware layer, the camera monitors the screen, captures images of objects entering the screen area in real time, and sends them to the image recognition module of the main control unit for analysis; at the same time, the infrared positioning device separately collects image coordinate data on the X axis and on the Y axis.
(2) Based on the image recognition module's analysis of the images captured by the camera and the X-axis and Y-axis coordinate data collected by the infrared positioning device, the coordinate analysis module, operating-point combination analysis module, and pen-color analysis module of the main control unit perform, in turn, operating-point coordinate pairing, operating-point grouping management, and operating-point information encoding.
(3) The communication transmitter unit of the hardware layer sends the resulting encoded message to the driver software layer.
(4) After the communication receiver unit of the driver software layer receives the encoded message, it first performs verification and error correction of the information, then passes the message to the analysis service unit for decoding; the decoded information is sent to the application software layer through the message trigger interface (the hand-off from hardware to driver to application is sketched immediately after this list).
(5) After the application software layer receives the information, it draws a menu at the corresponding position.
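The patent does not fix a concrete message format, checksum, or API for steps (3) to (5), so the following Python sketch is only a hedged illustration of the hand-off: an assumed byte layout for the operating-point information, a simple XOR checksum standing in for the verification and error-correction step, and a print call standing in for the menu drawing.

```python
# Minimal sketch of the hardware -> driver -> application hand-off
# (steps (3)-(5)). The message layout and the XOR checksum are
# assumptions for illustration; the patent does not specify a format.
import struct
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: int               # screen X coordinate
    y: int               # screen Y coordinate
    mode: int            # 0 = click/select, 1 = zoom, 2 = rotate, 3 = move
    pen_rgb: tuple       # extracted pen color, e.g. (255, 255, 255)

def encode(points):
    """Hardware layer: pack the operating points into one message."""
    body = struct.pack("B", len(points))
    for p in points:
        body += struct.pack("HHB3B", p.x, p.y, p.mode, *p.pen_rgb)
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

def decode(message):
    """Driver layer: verify the checksum, then unpack the points."""
    body, checksum = message[:-1], message[-1]
    calc = 0
    for b in body:
        calc ^= b
    if calc != checksum:
        raise ValueError("checksum mismatch - message rejected")
    count, offset, points = body[0], 1, []
    for _ in range(count):
        x, y, mode, r, g, b = struct.unpack_from("HHB3B", body, offset)
        points.append(TouchPoint(x, y, mode, (r, g, b)))
        offset += struct.calcsize("HHB3B")
    return points

# Application layer: draw a menu at each decoded position (stand-in).
for p in decode(encode([TouchPoint(100, 200, 0, (255, 255, 255))])):
    print(f"draw menu at ({p.x}, {p.y}) for a {p.pen_rgb} pen, mode {p.mode}")
```

In the actual device the message would travel between the communication transmitter and receiver units over one of the channels listed later (infrared, Bluetooth, serial port, parallel port, USB, RFID, or a network interface); here the encode and decode functions are called back to back only to keep the example self-contained.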
Specifically, step (2) is as follows:
(2-1) The image recognition module analyzes the images captured by the camera: it determines the relative positions of the operating points from the photographed position of each point, groups the operating points by analyzing the number of heads in front of the screen and the number of arms on the screen, and extracts the closest of the 28 preset color values by analyzing the pen color of each operating point. For example, the white pen is defined at the factory as RGB (255, 255, 255) and the grey pen as RGB (150, 150, 150); if the color value obtained for the user's pen after color inspection lies between RGB (254, 254, 254) and RGB (225, 225, 225), then after the pen color is extracted the user is taken to be writing with the white pen, whose color value is RGB (255, 255, 255).
(2-2) Combining the relative positions of the operating points with the X-axis and Y-axis coordinate data collected by the infrared positioning device, the coordinate analysis module pairs the operating-point coordinates and obtains the coordinate value of each operating point.
(2-3) Combining the grouping of the operating points with the coordinate value of each operating point, the operating-point combination analysis module manages the operating points by group and judges whether the operating points within the same group constitute a combined operation, thereby determining the operator's operation mode.
(2-4) Once the operator's operation mode is determined, the pen-color values extracted by the intelligent image recognition module are combined with the operating-point coordinates on the screen and the operation mode, and the coordinate, operation-mode, and pen-color attributes are encoded into the message used for communication.
The 28 color values described in step (2-1) are the color values set by default at the factory; the number of preset color values can be increased or decreased as the situation requires, and each color value may take any value in the range RGB (0, 0, 0) to RGB (255, 255, 255).
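The extraction of the closest preset pen color in step (2-1) is effectively a nearest-neighbor match against this factory palette. A minimal Python sketch, assuming a small placeholder palette (the 28 factory color values are not enumerated in the text):

```python
# Minimal sketch of the pen-color extraction in step (2-1):
# map a measured RGB value to the nearest preset pen color.
# The palette below is a placeholder; the patent's 28 factory
# colors are not listed in the text.
PRESET_PENS = {
    "white": (255, 255, 255),
    "grey":  (150, 150, 150),
    "red":   (255, 0, 0),
    # ... remaining preset colors, up to 28 entries
}

def nearest_pen_color(measured_rgb):
    """Return the preset pen color closest to the measured value."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    name = min(PRESET_PENS, key=lambda n: dist2(PRESET_PENS[n], measured_rgb))
    return name, PRESET_PENS[name]

# A pen measured at RGB (240, 240, 240) is treated as the white pen.
print(nearest_pen_color((240, 240, 240)))   # ('white', (255, 255, 255))
```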
The operator's operation mode described in step (2-3) is either a click to select or a given combined function action; the given combined function actions include zooming, rotation, and moving.
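As a hedged illustration of how such a combined action within one group might be recognized, the sketch below classifies a two-point group from two consecutive frames into zoom, rotate, or move; the frame-pair interface and the thresholds are illustrative assumptions rather than part of the described method.

```python
# Minimal sketch of classifying a two-point group (step (2-3)) into
# the combined actions named above: zoom, rotate, or move.
import math

def classify_gesture(prev, curr, dist_tol=10.0, angle_tol=5.0):
    """prev / curr: ((x1, y1), (x2, y2)) for the same group in two frames."""
    def distance(pts):
        (x1, y1), (x2, y2) = pts
        return math.hypot(x2 - x1, y2 - y1)

    def angle(pts):
        (x1, y1), (x2, y2) = pts
        return math.degrees(math.atan2(y2 - y1, x2 - x1))

    if abs(distance(curr) - distance(prev)) > dist_tol:
        return "zoom"        # the two points moved apart or together
    if abs(angle(curr) - angle(prev)) > angle_tol:
        return "rotate"      # the line between the points turned
    return "move"            # the group shifted as a whole

# Two points spreading apart between frames reads as a zoom.
print(classify_gesture(((100, 100), (200, 100)), ((80, 100), (220, 100))))  # zoom
```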
In the above steps, the image recognition module's analysis of the position of each operating point yields the approximate position of each point, i.e. the positional relationship between the operating points. Analyzing the number of heads in front of the screen and the number of arms on the screen determines the number of operators and their relationship to the operating points, so that the points can be grouped, i.e. which person performed the operation at which point. When the pen color of an operating point is analyzed and the operator is using a finger or some other tool whose color cannot be distinguished, the pen color is taken to be a default color.
When the image analysis result of the image recognition module and the data collected by the infrared positioning device are analyzed together in the main control unit, suppose the infrared positioning device collects the following data: two X-axis values (100; 300) and two Y-axis values (100; 200). Their permutations give four candidate combinations, namely the four corners of a rectangle. If the intelligent image recognition module determines from its analysis that there are three operating points, this proves that three points are present on the large screen at the same time, and the vision module also gives the approximate coordinate position of these three points, such as "one on the left, two on the right" or "one at the top, two at the bottom" (it can even give the approximate coordinate range of each of the three points); by combining the data collected by the infrared positioning device with the analysis result of the image recognition module, the position coordinates of the three points can then be determined accurately. If the analysis result of the image recognition module is two operating points, the whole screen can be divided into several regions, and the accurate position coordinates of the two operating points can then be obtained from the region in which each operating point lies together with the data collected by the infrared positioning device.
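A minimal sketch of this pairing logic, assuming the camera analysis yields a rough bounding region per operating point (the exact region representation is not specified in the text): it enumerates the candidate (x, y) combinations produced by the infrared device and keeps the ones consistent with the camera's rough regions.

```python
# Minimal sketch of operating-point coordinate pairing: the IR device
# reports X values and Y values separately; the camera's rough regions
# select the consistent (x, y) combinations. Region handling is assumed.
from itertools import product

def pair_coordinates(ir_x, ir_y, rough_regions):
    """
    ir_x, ir_y:    coordinate values from the infrared device, e.g. [100, 300], [100, 200]
    rough_regions: per-point (x_min, x_max, y_min, y_max) boxes from the camera analysis
    Returns one (x, y) per rough region, chosen from the IR candidates.
    """
    candidates = list(product(ir_x, ir_y))   # all permutations, e.g. 4 rectangle corners
    paired = []
    for (xmin, xmax, ymin, ymax) in rough_regions:
        match = [(x, y) for (x, y) in candidates if xmin <= x <= xmax and ymin <= y <= ymax]
        paired.append(match[0] if match else None)
    return paired

# Camera says: one point near the upper left, one near the lower right.
ir_x, ir_y = [100, 300], [100, 200]
regions = [(0, 150, 0, 150), (250, 400, 150, 300)]
print(pair_coordinates(ir_x, ir_y, regions))   # [(100, 100), (300, 200)]
```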
The device hardware layer comprises the camera, the infrared positioning device, the main control unit, and the communication transmitter unit. The main control unit comprises an image recognition module, a coordinate analysis module, an operating-point combination analysis module, and a pen-color analysis module connected in sequence; the camera and the infrared positioning device are each connected to the image recognition module, and the communication transmitter unit is connected to the pen-color analysis module. The driver software layer comprises a communication receiver unit, an analysis service unit, and a message trigger interface connected in sequence. The application software layer comprises an application hardware interface end and an application-software menu drawing module connected in sequence. The communication receiver unit is connected to the communication transmitter unit, and the message trigger interface is connected to the application hardware interface end. The communication mode between the communication transmitter unit and the communication receiver unit is infrared, Bluetooth, serial port, parallel port, USB, RFID, or a network interface.
A plurality of cameras is used; in a preferred version the number of cameras is 2 to 3, and the cameras communicate with each other to share the image information each has collected. This is because, with only one camera, one operating point may occlude another; using several cameras ensures that the collected image information is complete, so that the intelligent image recognition module can analyze it correctly.
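A hedged sketch of how the image information shared between cameras might be merged so that a point occluded in one view is still retained; the observation format and the merge distance are assumptions for illustration.

```python
# Minimal sketch of merging rough point observations from two cameras
# so that a point occluded in one view survives from the other view.
def merge_camera_points(points_a, points_b, merge_dist=20):
    """points_a / points_b: lists of rough (x, y) positions from each camera."""
    merged = list(points_a)
    for (xb, yb) in points_b:
        duplicate = any(abs(xb - xa) <= merge_dist and abs(yb - ya) <= merge_dist
                        for (xa, ya) in merged)
        if not duplicate:          # seen only by camera B (occluded for camera A)
            merged.append((xb, yb))
    return merged

# Camera A misses the point at (310, 180) because a hand occludes it.
print(merge_camera_points([(105, 95)], [(102, 98), (310, 180)]))
# -> [(105, 95), (310, 180)]
```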
Compared with the prior art described above, the present invention has the following beneficial effects:
By combining camera monitoring with infrared positioning, the present invention optimizes the intelligent features of the system to the greatest extent; it locates and judges the operating points on a large screen accurately and quickly without affecting the visual effect of the screen, and can be widely used in a variety of settings.
The system of the present invention can continually add new figures and actions for intelligent recognition according to the scale of its main control module and its external computing capability; the system itself has good extensibility and can provide, to the greatest extent, the hardware foundation needed for user-friendly operation of application software.
Description of drawings
Fig. 1 is a flow chart of the method of the present invention.
Fig. 2 is a schematic diagram of the intelligent analysis process of the main control unit in the present invention.
Fig. 3 is a schematic diagram of the device of the present invention in operation.
Fig. 4 is a schematic diagram of the workflow of each unit module in the present invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to an embodiment and the accompanying drawings, but the embodiments of the present invention are not limited thereto.
Embodiment
In the multi-point touch positioning method of this embodiment, intelligent camera analysis is combined with infrared positioning to locate multi-touch operations on the screen. This embodiment uses two cameras. As shown in the method flow chart of Fig. 1 and the workflow diagram of each unit module in Fig. 4, the method specifically comprises the following steps:
(1) In the device hardware layer, the camera monitors the screen, captures images of objects entering the screen area in real time, and sends them to the image recognition module of the main control unit for analysis; at the same time, the infrared positioning device separately collects image coordinate data on the X axis and on the Y axis.
(2) Based on the image recognition module's analysis of the images captured by the camera and the X-axis and Y-axis coordinate data collected by the infrared positioning device, the coordinate analysis module, operating-point combination analysis module, and pen-color analysis module of the main control unit perform, in turn, operating-point coordinate pairing, operating-point grouping management, and operating-point information encoding.
(3) The communication transmitter unit of the hardware layer sends the resulting encoded message to the driver software layer.
(4) After the communication receiver unit of the driver software layer receives the encoded message, it first performs verification and error correction of the information, then passes the message to the analysis service unit for decoding; the decoded information is sent to the application software layer through the message trigger interface.
(5) After the application software layer receives the information, it draws a menu at the corresponding position.
The working state of the device in this process is shown in Fig. 3: the two cameras 1 are located at the upper-left and upper-right corners of the large screen, the infrared positioning device is located on the frame of the large screen, and the dotted portions in the figure are the relative positions of the operating points obtained after the intelligent image recognition module analyzes the image information captured by the cameras.
As shown in the intelligent analysis process of the main control unit in Fig. 2, step (2) is specifically as follows:
(2-1) The intelligent image recognition module analyzes the images captured by the camera: it determines the relative positions of the operating points from the photographed position of each point, groups the operating points by analyzing the number of heads in front of the screen and the number of arms on the screen, and extracts the closest of the 28 preset color values by analyzing the pen color of each operating point. For example, the white pen is defined at the factory as RGB (255, 255, 255) and the grey pen as RGB (150, 150, 150); if the color value obtained for the user's pen after color inspection lies between RGB (254, 254, 254) and RGB (225, 225, 225), then after the pen color is extracted the user is taken to be writing with the white pen, whose color value is RGB (255, 255, 255).
(2-2) Combining the relative positions of the operating points with the X-axis and Y-axis coordinate data collected by the infrared positioning device, the coordinate analysis module pairs the operating-point coordinates and obtains the coordinate value of each operating point.
(2-3) Combining the grouping of the operating points with the coordinate value of each operating point, the operating-point combination analysis module manages the operating points by group and judges whether the operating points within the same group constitute a combined operation, thereby determining the operator's operation mode.
(2-4) Once the operator's operation mode is determined, the pen-color values extracted by the intelligent image recognition module are combined with the operating-point coordinates on the screen and the operation mode, and the coordinate, operation-mode, and pen-color attributes are encoded into the message used for communication.
The 28 color values in step (2-1) are the color values set by default at the factory; the number of preset color values can be increased or decreased as the situation requires, and each color value may take any value in the range RGB (0, 0, 0) to RGB (255, 255, 255).
The operator's operation mode in step (2-3) is either a click to select or a given combined function action; the given combined function actions include zooming, rotation, and moving.
In the above steps, the intelligent image recognition module's analysis of the position of each operating point yields the approximate position of each point, i.e. the positional relationship between the operating points. Analyzing the number of heads in front of the screen and the number of arms on the screen determines the number of operators and their relationship to the operating points, so that the points can be grouped, i.e. which person performed the operation at which point. When the pen color of an operating point is analyzed and the operator is using a finger or some other tool whose color cannot be distinguished, the pen color is taken to be a default color.
When the image analysis result of the image recognition module and the data collected by the infrared positioning device are analyzed together in the main control unit, suppose the infrared positioning device collects the following data: two X-axis values (100; 300) and two Y-axis values (100; 200). Their permutations give four candidate combinations, namely the four corners of a rectangle. If the intelligent image recognition module determines from its analysis that there are three operating points, this proves that three points are present on the large screen at the same time, and the vision module also gives the approximate coordinate position of these three points, such as "one on the left, two on the right" or "one at the top, two at the bottom" (it can even give the approximate coordinate range of each of the three points); by combining the data collected by the infrared positioning device with the analysis result of the image recognition module, the position coordinates of the three points can then be determined accurately. If the analysis result of the image recognition module is two operating points, the whole screen can be divided into several regions, and the accurate position coordinates of the two operating points can then be obtained from the region in which each operating point lies together with the data collected by the infrared positioning device.
As shown in Fig. 4, the device hardware layer comprises the camera, the infrared positioning device, the main control unit, and the communication transmitter unit. The main control unit comprises an image recognition module, a coordinate analysis module, an operating-point combination analysis module, and a pen-color analysis module connected in sequence; the camera and the infrared positioning device are each connected to the intelligent image recognition module, and the communication transmitter unit is connected to the pen-color analysis module. The driver software layer comprises a communication receiver unit, an analysis service unit, and a message trigger interface connected in sequence. The application software layer comprises an application hardware interface end and an application-software menu drawing module connected in sequence. The communication receiver unit is connected to the communication transmitter unit, and the message trigger interface is connected to the application hardware interface end.
The communication mode between the communication transmitter unit and the communication receiver unit is infrared, Bluetooth, serial port, parallel port, USB, RFID, or a network interface.
The number of cameras can also be three or more; the cameras communicate with each other to share the image information each has collected. This is because, with only one camera, one operating point may occlude another; using several cameras ensures that the collected image information is complete, so that the intelligent image recognition module can analyze it correctly.
As described above, the present invention can be implemented well. The above embodiment is only a preferred embodiment of the present invention and is not intended to limit the scope of the present invention; all equivalent changes and modifications made according to the content of the present invention are covered by the scope claimed by the claims of the present invention.

Claims (7)

1. A multi-point touch positioning method, characterized in that intelligent camera analysis is combined with infrared positioning to locate multi-touch operations on the screen, the method specifically comprising the following steps:
(1) In the device hardware layer, the camera monitors the screen, captures images of objects entering the screen area in real time, and sends them to the image recognition module of the main control unit for analysis; at the same time, the infrared positioning device separately collects image coordinate data on the X axis and on the Y axis;
(2) Based on the image recognition module's analysis of the images captured by the camera and the X-axis and Y-axis coordinate data collected by the infrared positioning device, the coordinate analysis module, operating-point combination analysis module, and pen-color analysis module of the main control unit perform, in turn, operating-point coordinate pairing, operating-point grouping management, and operating-point information encoding;
(3) The communication transmitter unit of the hardware layer sends the resulting encoded message to the driver software layer;
(4) After the communication receiver unit of the driver software layer receives the encoded message, it first performs verification and error correction of the information, then passes the message to the analysis service unit for decoding, and the decoded information is sent to the application software layer through the message trigger interface;
(5) After the application software layer receives the information, it draws a menu at the corresponding position.
2. The multi-point touch positioning method according to claim 1, characterized in that step (2) is specifically:
(2-1) The image recognition module analyzes the images captured by the camera, determines the relative positions of the operating points from the photographed position of each point, groups the operating points by analyzing the number of heads in front of the screen and the number of arms on the screen, and extracts, by analyzing the pen color of each operating point, the closest of the 28 preset color values;
(2-2) Combining the relative positions of the operating points with the X-axis and Y-axis coordinate data collected by the infrared positioning device, the coordinate analysis module pairs the operating-point coordinates and obtains the coordinate value of each operating point;
(2-3) Combining the grouping of the operating points with the coordinate value of each operating point, the operating-point combination analysis module manages the operating points by group and judges whether the operating points within the same group constitute a combined operation, thereby determining the operator's operation mode;
(2-4) Once the operator's operation mode is determined, the pen-color values extracted by the intelligent image recognition module are combined with the operating-point coordinates on the screen and the operation mode, and the coordinate, operation-mode, and pen-color attributes are encoded into the message used for communication.
3. The multi-point touch positioning method according to claim 2, characterized in that the 28 color values described in step (2-1) are the color values set by default at the factory, and each color value may take any value in the range RGB (0, 0, 0) to RGB (255, 255, 255).
4. The multi-point touch positioning method according to claim 2, characterized in that the operator's operation mode described in step (2-3) is a click to select or a given combined function action, the given combined function actions comprising zooming, rotation, and moving.
5. The multi-point touch positioning method according to claim 1, characterized in that the device hardware layer comprises the camera, the infrared positioning device, the main control unit, and the communication transmitter unit; the main control unit comprises an image recognition module, a coordinate analysis module, an operating-point combination analysis module, and a pen-color analysis module connected in sequence; the camera and the infrared positioning device are each connected to the image recognition module, and the communication transmitter unit is connected to the pen-color analysis module; the driver software layer comprises a communication receiver unit, an analysis service unit, and a message trigger interface connected in sequence; the application software layer comprises an application hardware interface end and an application-software menu drawing module connected in sequence; the communication receiver unit is connected to the communication transmitter unit, and the message trigger interface is connected to the application hardware interface end.
6. The multi-point touch positioning method according to claim 1, characterized in that the communication mode between the communication transmitter unit and the communication receiver unit is infrared, Bluetooth, serial port, parallel port, USB, RFID, or a network interface.
7. The multi-point touch positioning method according to claim 1, characterized in that the number of cameras is 2 to 3, and the cameras communicate with each other to share the image information each has collected.
CN2009100400111A 2009-06-04 2009-06-04 Multi-point touch positioning method Expired - Fee Related CN101576788B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009100400111A CN101576788B (en) 2009-06-04 2009-06-04 Multi-point touch positioning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2009100400111A CN101576788B (en) 2009-06-04 2009-06-04 Multi-point touch positioning method

Publications (2)

Publication Number Publication Date
CN101576788A true CN101576788A (en) 2009-11-11
CN101576788B CN101576788B (en) 2011-01-26

Family

ID=41271732

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009100400111A Expired - Fee Related CN101576788B (en) 2009-06-04 2009-06-04 Multi-point touch positioning method

Country Status (1)

Country Link
CN (1) CN101576788B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105139770A (en) * 2015-09-29 2015-12-09 深圳市汉丰光电有限公司 Rapidly spliced LED display system and display method thereof

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107526518A (en) * 2017-06-15 2017-12-29 北京仁光科技有限公司 Remote command system and remote command method thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100485595C (en) * 2007-07-25 2009-05-06 广东威创视讯科技股份有限公司 Touch panel device and multi-point touch locating method
CN101145091A (en) * 2007-11-01 2008-03-19 复旦大学 Touch panel based on infrared pick-up and its positioning detection method

Also Published As

Publication number Publication date
CN101576788B (en) 2011-01-26


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CP03 Change of name, title or address
CP03 Change of name, title or address

Address after: No. 233 Kezhu Road, Guangzhou High-tech Industrial Development Zone, Guangdong Province, 510670

Patentee after: Vtron Group Co., Ltd.

Address before: No. 6, color road, Hi-tech Industrial Development Zone, Guangzhou, Guangdong, China, 510663

Patentee before: Guangdong Weichuangshixun Science and Technology Co., Ltd.

CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110126

Termination date: 20180604