CN102542554A - Method and system for detecting thin line in image - Google Patents

Method and system for detecting thin line in image

Info

Publication number
CN102542554A
CN102542554A (application CN2010106069727A / CN201010606972A)
Authority
CN
China
Prior art keywords
edge
distance
thin line
relation
pixel
Prior art date
Legal status
Granted
Application number
CN2010106069727A
Other languages
Chinese (zh)
Other versions
CN102542554B (en)
Inventor
袁梦尤
邓玥琳
张宏志
Current Assignee
Fangzhu Wuhan Technology Co ltd
Founder International Co Ltd
Original Assignee
Founder International Co Ltd
Founder International Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Founder International Co Ltd, Founder International Beijing Co Ltd filed Critical Founder International Co Ltd
Priority to CN201010606972.7A priority Critical patent/CN102542554B/en
Publication of CN102542554A publication Critical patent/CN102542554A/en
Application granted granted Critical
Publication of CN102542554B publication Critical patent/CN102542554B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to a method and system for detecting thin lines in an image, belonging to the technical field of image processing. The method comprises the following steps: first, performing edge detection on the image and recording the detected edges; then generating an edge-distance relation graph, which represents, for each pixel in the image, the distance to the edge nearest to that pixel and the direction of that pixel; and finally extracting thin lines from the edge-distance relation graph according to a preset maximum thin-line width threshold. With the method and system of the invention, all thin lines in the image that are narrower than the set maximum width can be detected without being affected by the topological structure of the thin lines, including thin lines of special forms such as gradually varying width.

Description

Method and system for detecting thin lines in an image
Technical field
The present invention relates to a method and system for detecting thin lines in an image, and belongs to the technical field of image processing.
Background technology
At present, the relatively mainstream methods for detecting thin lines in an image mainly include the following two:
Template detection methods: the core idea of this class of methods is to use a filter of a specific size with directivity, and to detect thin lines of a specified width from the responses produced on the two sides of the thin line's edges. When the thin-line widths in the image are inconsistent, such methods have difficulty detecting them with a fixed-size template. Moreover, if the thin line has a complex shape or its width varies gradually, the template size is hard to determine; and when a large template is chosen, the operation efficiency is relatively low.
Streak-line tracking methods: this class of methods is built on the basis of edge detection or template detection; it records streak-line feature points and tracks and extracts streak lines with regular directions. Such methods are relatively effective only for thinner thin lines of fairly uniform width, such as the extraction of fingerprints and palm prints, so their range of application is relatively narrow.
Summary of the invention
In view of the defects in the prior art, the technical problem to be solved by the present invention is to provide a thin-line detection method and system that are not affected by the topological structure of the thin lines and have high efficiency.
To solve the above technical problem, the technical solution adopted by the present invention is as follows:
A method for detecting thin lines in an image, comprising the following steps:
(1) performing edge detection on the image, and recording the detected edges;
(2) generating an edge-distance relation graph, the edge-distance relation graph being used to represent, for each pixel in the image, the distance to the edge nearest to that pixel and the direction of that pixel;
(3) extracting thin lines from the edge-distance relation graph according to a preset maximum thin-line width threshold N_max.
In the above thin-line detection method, the edges in step (1) are recorded with an intermediate-node coordinate linked list or with a coordinate linked list of the edge pixels in the image; the intermediate-node coordinate linked list is a linked list formed by connecting, in order, the coordinates of the middle positions between all the pairs of pixels facing each other on the two sides of the edge.
In the above thin-line detection method, in the edge-distance relation graph in step (2), the distance-relation value of any pixel whose distance to its nearest edge is greater than N_max/2 is set to an invalid value.
In the above thin-line detection method, in the edge-distance relation graph in step (2), for a non-closed edge, the two sides along the initial direction of the edge have opposite signs; for a closed edge, the inside and the outside of the closed edge have opposite signs.
In the above thin-line detection method, the generation process of the edge-distance relation graph in step (2) is as follows:
first, initializing the edge-distance relation graph, i.e., setting the distance from each pixel represented in the graph to its nearest edge to an invalid value;
then scanning the intermediate-node coordinate linked list of each edge; for each pixel in the image whose distance to the current coordinate is within N_max/2, computing the distance from that pixel to the current coordinate, modifying the distance of the corresponding point in the edge-distance relation graph to the distance to the current coordinate, and determining the direction of that point; if the distance of the corresponding point in the edge-distance relation graph has already been modified, comparing the current distance with the previous distance, and if the current distance is smaller than the previous distance, replacing the previous distance with the current distance.
In the above thin-line detection method, the process of extracting thin lines in step (3) is as follows:
(a) locating thin-line regions in the edge-distance relation graph: setting to an invalid value the distances of the points in the edge-distance relation graph that are adjacent to the invalid-value side and whose distance to an edge is within N_max/2;
(b) merging the pixels in the image that correspond to the points in the thin-line regions of the edge-distance relation graph.
In the above thin-line detection method, after the thin lines are obtained, the method further comprises a step of removing thin lines whose aspect ratio does not meet the requirement, where the aspect ratio = the length of the thin line / the maximum width of the thin line.
The length is calculated as: the area of the thin line / the maximum width of the thin line.
Here, the area of a thin line is the number of pixels contained in the thin-line region, and the maximum width of the thin line is twice the maximum distance in that thin-line region of the edge-distance relation graph.
A system for detecting thin lines in an image, comprising a detection device for performing edge detection on the image and recording the detected edges;
a generation device for generating an edge-distance relation graph, the edge-distance relation graph being used to represent, for each pixel in the image, the distance to the edge nearest to that pixel and the direction of that pixel;
and an extraction device for extracting thin lines from the edge-distance relation graph according to a preset maximum thin-line width threshold N_max.
In the above thin-line detection system, the extraction device comprises a positioning unit for locating thin-line regions in the edge-distance relation graph, and a merging unit for merging the pixels in the image that correspond to the points in the thin-line regions of the edge-distance relation graph.
The above thin-line detection system further comprises a removal device for removing, after the thin lines are obtained, thin lines whose aspect ratio does not meet the requirement.
By generating an edge-distance relation graph and detecting, in that graph, the thin lines that satisfy the set maximum width, the method and system of the present invention can detect all thin lines in the image that are narrower than the set maximum width, are not affected by the topological structure of the thin lines, and can detect thin lines of special forms such as gradually varying width. At the same time, compared with the prior art, the problem of low operation efficiency caused by large templates is avoided.
Description of drawings
Fig. 1 is a structural block diagram of the thin-line detection system in an image in the embodiment;
Fig. 2 is a flowchart of the method of detecting thin lines in an image using the system shown in Fig. 1;
Fig. 3 is a schematic diagram of an edge in an image represented by intermediate-node coordinates in the embodiment;
Fig. 4A is schematic diagram 1 of the process of generating an edge-distance relation graph for a non-closed edge in the embodiment, and Fig. 4B is schematic diagram 2 of the process of generating an edge-distance relation graph for a non-closed edge;
Fig. 5 is a schematic diagram of an edge-distance relation graph containing a closed edge in the embodiment;
Fig. 6A is schematic diagram 1 of an edge-distance relation graph containing a corner point of a non-closed edge in the embodiment, and Fig. 6B is schematic diagram 2 of an edge-distance relation graph containing a corner point of a non-closed edge;
Fig. 7A is a schematic diagram of an edge-distance relation graph containing two polyline edges in the embodiment, and Fig. 7B is a schematic diagram of the result of Fig. 7A after erosion processing;
Fig. 8 is a schematic diagram of the result of an edge-distance relation graph containing a closed edge after erosion processing in the embodiment;
Fig. 9 is a schematic diagram of the result of labeling Fig. 7B.
Embodiments
The present invention is described below with reference to an embodiment and the accompanying drawings.
Fig. 1 shows the structure of the thin-line detection system in an image in this embodiment. As shown in Fig. 1, the system comprises a detection device 11, a generation device 12 connected to the detection device 11, an extraction device 13 connected to the generation device 12, and a removal device 14 connected to the extraction device 13. The extraction device 13 comprises a positioning unit 131 and a merging unit 132.
The detection device 11 is used to perform edge detection on the image and record the detected edges.
The generation device 12 is used to generate the edge-distance relation graph, which represents, for each pixel in the image, the distance to the edge nearest to that pixel and the direction of that pixel.
The extraction device 13 is used to extract thin lines from the edge-distance relation graph according to the preset maximum thin-line width threshold N_max. The positioning unit 131 is used to locate thin-line regions in the edge-distance relation graph. The merging unit 132 is used to merge the pixels in the image that correspond to the points in the thin-line regions of the edge-distance relation graph.
The removal device 14 is used to remove, after the thin lines are obtained, thin lines whose aspect ratio does not meet the requirement.
Fig. 2 shows the flow of the method of detecting thin lines in an image using the system shown in Fig. 1. As shown in Fig. 2, the method comprises the following steps:
(1) The detection device 11 performs edge detection on the image and records the detected edges.
The edge detection may adopt an existing method, such as the Canny operator.
In this embodiment, the detected edges are recorded with an intermediate-node coordinate linked list. A rectangular coordinate system is established with the top-left corner point of the image as the origin, the positive x-axis pointing horizontally to the right and the positive y-axis pointing vertically downward. The coordinates of each pixel of the image are defined in this coordinate system, and the coordinate of the middle position between two adjacent pixels is called an intermediate-node coordinate. The coordinates of the middle positions between the pairs of pixels facing each other on the two sides of the edge are connected, in order, into a linked list, and the edge is recorded with this linked list. As shown in Fig. 3, for the edge 31, the coordinates of the middle positions between the pixels on the two sides of the edge 31 are connected into a linked list in order, and this linked list records the edge 31.
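For illustration only (the patent itself contains no code), the following Python sketch shows one way such an intermediate-node coordinate record could be represented; the alias `Edge` and the example values are assumptions and not part of the invention.

```python
from typing import List, Tuple

# Sketch only: an edge is recorded as an ordered list of intermediate-node
# coordinates, i.e. the midpoints between the pairs of pixels facing each
# other across the edge. With integer pixel centres these midpoints have one
# half-integer component. Origin at the top-left corner of the image, x to
# the right, y downward, as described above.
Edge = List[Tuple[float, float]]

# Example: a short vertical edge running between the pixel columns x = 2 and
# x = 3 over the first four rows (loosely modelled on edge 31 of Fig. 3).
edge31: Edge = [(2.5, 0.0), (2.5, 1.0), (2.5, 2.0), (2.5, 3.0)]
```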
The detected edges may also be recorded in other ways, such as with a coordinate linked list of the edge pixels in the image.
(2) The generation device 12 generates the edge-distance relation graph.
The edge-distance relation graph has the same size as the image. Each point in the edge-distance relation graph corresponds one-to-one to a pixel in the image, and each point records the minimum distance from the corresponding pixel to the edge nearest to that pixel, together with the direction of that pixel. The direction indicates on which side of the edge the pixel lies, and is distinguished by the sign of the value in the graph: for a non-closed edge, the two sides along the initial direction of the edge have opposite signs; for a closed edge, the inside and the outside of the closed edge have opposite signs. Of course, the direction of a pixel may be recorded in other ways, as long as its positional relation with the edge can be distinguished.
The concrete generation process of the edge-distance relation graph is as follows:
First, the edge-distance relation graph is initialized by setting the distance of every point to an invalid value, such as 0 or 255.
Then the intermediate-node coordinate linked list of each edge is scanned. For each pixel in the image whose distance to the current coordinate is within N_max/2, the distance from that pixel to the current coordinate is computed, the distance of the corresponding point in the edge-distance relation graph is modified to the distance to the current coordinate, and the direction of that point is determined. If the distance of the corresponding point in the edge-distance relation graph has already been modified (i.e., it is no longer the initial invalid value), the current distance is compared with the previous distance; if the current distance is smaller than the previous distance, the previous distance is replaced with the current distance.
In this embodiment, N_max is taken as 6, i.e., the width of a thin line does not exceed 6 pixels. As shown in Fig. 4A, the intermediate-node coordinate linked list of the edge 41 is first scanned, and the first intermediate-node coordinate is taken as the current coordinate. For the pixels in the image within a range of 3 pixel widths of the current coordinate, i.e., the pixels in the 3 × 3 neighborhood centered on the current coordinate, the distance to the current coordinate is computed, and the distance of the corresponding point in the edge-distance relation graph is modified to the distance between that pixel's coordinate and the current coordinate. If the pixel lies on the left side of the edge 41, the distance is negative; if it lies on the right side of the edge 41, the distance is positive. The scan of the intermediate-node coordinate linked list then continues: the second intermediate-node coordinate is taken as the current coordinate and the above process is repeated, until the intermediate-node coordinate linked list of the edge 41 has been scanned completely. The next edge is then processed in the same way, until all edges have been processed and the final edge-distance relation graph is generated.
As shown in Fig. 4B, when the edge 42 is processed, the distance of the pixel in the seventh row with respect to the edge 42 (i.e., the current distance) is 2. Since the distance of the point corresponding to this pixel in the edge-distance relation graph has already been modified (i.e., it is no longer the invalid value), the current distance is compared with the previous distance of that point. If the current distance is smaller than the previous distance, the previous distance is replaced with the current distance; this indicates that the edge 42 is the nearest edge of that row of pixels. Because that row of pixels lies on the left side of the edge 42, the direction of the corresponding points in the edge-distance relation graph is negative.
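As an illustrative reading of the generation procedure above — not the patent's own implementation — the Python/NumPy sketch below scans each edge (given as an ordered list of intermediate-node coordinates, as in the earlier sketch), updates every pixel within N_max/2 of the current coordinate with the smaller of the old and new distances, and records the side of the edge as a sign. The function names, the invalid value of 255, and the cross-product sign convention are assumptions.

```python
import numpy as np

INVALID = 255.0  # value meaning "no nearby edge" (the patent suggests e.g. 0 or 255)

def local_direction(nodes, i):
    """Approximate edge direction at node i (assumption: forward difference,
    falling back to a backward difference at the last node)."""
    if i + 1 < len(nodes):
        (ax, ay), (bx, by) = nodes[i], nodes[i + 1]
    else:
        (ax, ay), (bx, by) = nodes[i - 1], nodes[i]
    return bx - ax, by - ay

def build_edge_distance_map(shape, edges, n_max=6):
    """Sketch of step (2): edges is a list of edges, each an ordered list of
    (x, y) intermediate-node coordinates."""
    h, w = shape
    dist_map = np.full((h, w), INVALID, dtype=np.float32)  # distance magnitudes
    sign_map = np.ones((h, w), dtype=np.int8)              # direction: +1 / -1
    half = n_max / 2.0
    for edge in edges:
        for i, (cx, cy) in enumerate(edge):
            dx, dy = local_direction(edge, i)
            y0, y1 = max(0, int(cy - half)), min(h - 1, int(cy + half))
            x0, x1 = max(0, int(cx - half)), min(w - 1, int(cx + half))
            for py in range(y0, y1 + 1):
                for px in range(x0, x1 + 1):
                    d = float(np.hypot(px - cx, py - cy))
                    if d > half:
                        continue
                    # keep the smaller distance if this point was already modified
                    if dist_map[py, px] == INVALID or d < dist_map[py, px]:
                        dist_map[py, px] = d
                        # assumed sign convention: the cross product of the local
                        # edge direction with the vector to the pixel decides the side
                        cross = dx * (py - cy) - dy * (px - cx)
                        sign_map[py, px] = 1 if cross >= 0 else -1
    return dist_map, sign_map
```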
As shown in Fig. 5, for a closed edge, the direction of the distance from a pixel inside the edge to the edge is set to positive, and the distance direction outside the edge takes the opposite sign.
As shown in Figs. 6A and 6B, the distance between a pixel within the N_max/2 range and a corner point of an edge can be computed in either of the following two ways:
One method is the corner method. As shown in Fig. 6A, within the N_max/2 range, the diagonal distance between the pixel and the corner point is taken to be equal to the horizontal or vertical distance.
The other method is the fillet method. As shown in Fig. 6B, within the N_max/2 range, the diagonal distance between the pixel and the corner point is taken as the distance of that pixel. The calculation formula is:
distance = √((x₁ − x₂)² + (y₁ − y₂)²)
where x₁ and y₁ are the horizontal and vertical coordinates of the pixel, and x₂ and y₂ are the horizontal and vertical coordinates of the corner point.
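The two corner rules can be stated compactly in code. In the sketch below, the corner method is read as taking the larger of the horizontal and vertical offsets (an assumption consistent with "the diagonal distance equals the horizontal or vertical distance"), while the fillet method is simply the Euclidean formula above; the function names are illustrative.

```python
import math

def corner_method_distance(px, py, cx, cy):
    """Corner method (assumed reading): the diagonal distance to the corner
    point is treated like a horizontal or vertical one, i.e. the larger axis offset."""
    return max(abs(px - cx), abs(py - cy))

def fillet_method_distance(px, py, cx, cy):
    """Fillet method: use the true diagonal (Euclidean) distance to the corner point."""
    return math.sqrt((px - cx) ** 2 + (py - cy) ** 2)

# Example: a pixel one step diagonally away from the corner point (3, 3)
print(corner_method_distance(4, 4, 3, 3))   # 1
print(fillet_method_distance(4, 4, 3, 3))   # 1.4142...
```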
(3) The extraction device 13 extracts thin lines from the edge-distance relation graph according to the preset maximum thin-line width threshold N_max. This specifically comprises the following steps:
(a) The positioning unit 131 locates the thin-line regions.
In the edge-distance relation graph, the distances of the points that are adjacent to the invalid-value side and within N_max/2 of the boundary are eroded, i.e., the distances of the points adjacent to the invalid-value side and within N_max/2 of the boundary are set to the invalid value. The remaining regions are the thin-line regions. Because a thin-line region can only appear between two non-closed edges or inside a closed edge, points within N_max/2 of an invalid value or of the boundary of the edge-distance relation graph cannot belong to a thin-line region. Likewise, if invalid values appear inside a closed edge, the width of the closed edge does not satisfy the set maximum thin-line width, and its interior does not belong to a thin-line region either.
For the edge-distance relation graph shown in Fig. 7A (the invalid value is 0), the effect after erosion is shown in Fig. 7B. For the edge-distance relation graph containing a closed edge shown in Fig. 5, the effect after erosion is shown in Fig. 8; since there are no valid values inside the closed edge, the region enclosed by this closed edge does not belong to a thin-line region.
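One possible reading of the erosion in step (a), for illustration only: a valid point of the edge-distance relation graph is set back to the invalid value whenever an invalid value (or the image border) lies within N_max/2 of it, so that only points lying between two sufficiently close edges survive. The brute-force square-neighbourhood test (used as an approximation of the radius) and the parameter names are assumptions.

```python
import numpy as np

def locate_thin_line_regions(dist_map, n_max=6, invalid=255.0):
    """Sketch of step (a): erode valid points that sit next to the invalid-value
    side; what remains marks the thin-line regions."""
    half = int(np.ceil(n_max / 2.0))
    h, w = dist_map.shape
    out = dist_map.copy()
    # pad with the invalid value so the image border counts as "invalid side"
    padded = np.pad(dist_map, half, mode="constant", constant_values=invalid)
    for y in range(h):
        for x in range(w):
            if dist_map[y, x] == invalid:
                continue
            window = padded[y:y + 2 * half + 1, x:x + 2 * half + 1]
            if np.any(window == invalid):  # adjacent to the invalid-value side
                out[y, x] = invalid
    return out  # remaining valid entries belong to thin-line regions
```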
(b) The merging unit 132 merges the pixels in the image that correspond to the points in the thin-line regions of the edge-distance relation graph.
After the thin-line regions are determined, the different thin lines need to be distinguished from one another. The points in one thin-line region can be given the same label, and the labels are used to distinguish it from the other thin lines. Fig. 9 shows the result of labeling the eroded edge-distance relation graph of Fig. 7B; the pixels in the image that correspond to the points labeled 1 are the pixels of one thin-line region.
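Distinguishing the thin lines amounts to connected-component labelling of the surviving points. The flood-fill sketch below is illustrative only; 4-connectivity is an assumption, and a library routine such as scipy.ndimage.label could equally be used.

```python
from collections import deque
import numpy as np

def label_thin_lines(region_map, invalid=255.0):
    """Sketch of step (b): give every connected group of valid points its own
    label so that different thin lines can be told apart (4-connectivity assumed)."""
    h, w = region_map.shape
    labels = np.zeros((h, w), dtype=np.int32)
    current = 0
    for sy in range(h):
        for sx in range(w):
            if region_map[sy, sx] == invalid or labels[sy, sx] != 0:
                continue
            current += 1
            labels[sy, sx] = current
            queue = deque([(sy, sx)])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w \
                            and region_map[ny, nx] != invalid and labels[ny, nx] == 0:
                        labels[ny, nx] = current
                        queue.append((ny, nx))
    return labels, current  # labels 1..current mark distinct thin lines
```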
After the thin lines are obtained, the removal device 14 removes the thin lines whose aspect ratio does not meet the requirement. The aspect ratio = the length of the thin line / the maximum width of the thin line. The length is calculated as: the length of the thin line = the area of the thin line / the maximum width of the thin line, where the area of the thin line can be determined from the number of pixels contained in the thin-line region, and the maximum width of the thin line is twice the maximum distance in that thin-line region of the edge-distance relation graph. If the aspect ratio is smaller than a preset threshold, the width of the thin line is considered relatively large, and it is not regarded as a thin line.
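The removal step follows directly from the definitions above; a minimal sketch, in which the threshold value min_aspect_ratio is an assumption since the patent only states that it is preset:

```python
import numpy as np

def filter_by_aspect_ratio(labels, dist_map, n_labels, min_aspect_ratio=3.0):
    """Sketch of the removal step: area = number of pixels in the region,
    max width = 2 * (maximum distance in the region), length = area / max width,
    aspect ratio = length / max width; regions below the threshold are dropped."""
    kept = []
    for lbl in range(1, n_labels + 1):
        mask = labels == lbl
        area = int(mask.sum())
        max_width = 2.0 * float(dist_map[mask].max())
        if max_width == 0:
            continue
        length = area / max_width
        if length / max_width >= min_aspect_ratio:
            kept.append(lbl)
    return kept  # labels of the regions accepted as thin lines
```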
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations fall within the scope of the claims of the present invention and their equivalent technology, the present invention is also intended to include these changes and modifications.

Claims (12)

1. A method for detecting thin lines in an image, comprising the following steps:
(1) performing edge detection on the image, and recording the detected edges;
(2) generating an edge-distance relation graph, the edge-distance relation graph being used to represent, for each pixel in the image, the distance to the edge nearest to that pixel and the direction of that pixel;
(3) extracting thin lines from the edge-distance relation graph according to a preset maximum thin-line width threshold N_max.
2. The thin-line detection method according to claim 1, characterized in that: the edges in step (1) are recorded with an intermediate-node coordinate linked list or with a coordinate linked list of the edge pixels in the image; wherein the intermediate-node coordinate linked list is a linked list formed by connecting, in order, the coordinates of the middle positions between all the pairs of pixels facing each other on the two sides of the edge.
3. The thin-line detection method according to claim 1, characterized in that: in the edge-distance relation graph in step (2), the distance-relation value of any pixel whose distance to its nearest edge is greater than N_max/2 is set to an invalid value.
4. The thin-line detection method according to claim 3, characterized in that: in the edge-distance relation graph in step (2), for a non-closed edge, the two sides along the initial direction of the edge have opposite signs; for a closed edge, the inside and the outside of the closed edge have opposite signs.
5. The thin-line detection method according to claim 4, characterized in that: the generation process of the edge-distance relation graph in step (2) is as follows:
first, initializing the edge-distance relation graph, i.e., setting the distance from each pixel represented in the graph to its nearest edge to an invalid value;
then scanning the intermediate-node coordinate linked list of each edge; for each pixel in the image whose distance to the current coordinate is within N_max/2, computing the distance from that pixel to the current coordinate, modifying the distance of the corresponding point in the edge-distance relation graph to the distance to the current coordinate, and determining the direction of that point; if the distance of the corresponding point in the edge-distance relation graph has already been modified, comparing the current distance with the previous distance, and if the current distance is smaller than the previous distance, replacing the previous distance with the current distance.
6. The thin-line detection method according to any one of claims 1 to 5, characterized in that: the process of extracting thin lines in step (3) is as follows:
(a) locating thin-line regions in the edge-distance relation graph: setting to an invalid value the distances of the points in the edge-distance relation graph that are adjacent to the invalid-value side and whose distance to an edge is within N_max/2;
(b) merging the pixels in the image that correspond to the points in the thin-line regions of the edge-distance relation graph.
7. The thin-line detection method according to claim 6, characterized in that: after the thin lines are obtained, the method further comprises a step of removing thin lines whose aspect ratio does not meet the requirement, wherein the aspect ratio = the length of the thin line / the maximum width of the thin line.
8. The thin-line detection method according to claim 7, characterized in that: the length is calculated as: the area of the thin line / the maximum width of the thin line.
9. The thin-line detection method according to claim 8, characterized in that: the area of the thin line is the number of pixels contained in the thin-line region; the maximum width of the thin line is twice the maximum distance in that thin-line region of the edge-distance relation graph.
10. A system for detecting thin lines in an image, comprising a detection device (11) for performing edge detection on the image and recording the detected edges;
a generation device (12) for generating an edge-distance relation graph, the edge-distance relation graph being used to represent, for each pixel in the image, the distance to the edge nearest to that pixel and the direction of that pixel;
and an extraction device (13) for extracting thin lines from the edge-distance relation graph according to a preset maximum thin-line width threshold N_max.
11. The thin-line detection system according to claim 10, characterized in that: the extraction device (13) comprises a positioning unit (131) for locating thin-line regions in the edge-distance relation graph, and a merging unit (132) for merging the pixels in the image that correspond to the points in the thin-line regions of the edge-distance relation graph.
12. The thin-line detection system according to claim 10 or 11, characterized in that: the system further comprises a removal device (14) for removing, after the thin lines are obtained, thin lines whose aspect ratio does not meet the requirement.
CN201010606972.7A 2010-12-16 2010-12-16 Method and system for detecting thin line in image Active CN102542554B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010606972.7A CN102542554B (en) 2010-12-16 2010-12-16 Method and system for detecting thin line in image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201010606972.7A CN102542554B (en) 2010-12-16 2010-12-16 Method and system for detecting thin line in image

Publications (2)

Publication Number Publication Date
CN102542554A true CN102542554A (en) 2012-07-04
CN102542554B CN102542554B (en) 2015-01-28

Family

ID=46349377

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010606972.7A Active CN102542554B (en) 2010-12-16 2010-12-16 Method and system for detecting thin line in image

Country Status (1)

Country Link
CN (1) CN102542554B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727669A (en) * 2008-10-27 2010-06-09 北京大学 Method and device for detecting thin line of image

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727669A (en) * 2008-10-27 2010-06-09 北京大学 Method and device for detecting thin line of image

Also Published As

Publication number Publication date
CN102542554B (en) 2015-01-28

Similar Documents

Publication Publication Date Title
JP5936239B2 (en) Road surface degradation degree estimation method and algorithm (crack detection method using Gabor filter output image of multi-resolution image)
CN107704801B (en) Curve lane line detection method based on segmented straight line and segmented Bezier curve
CN105469027B (en) For the horizontal and vertical lines detection and removal of file and picture
CN107045634B (en) Text positioning method based on maximum stable extremum region and stroke width
WO2018107525A1 (en) Detection method for mold of electric injection molding machine
CN103345743B (en) A kind of image partition method for battery tail end smart flaw detection
CN107452035B (en) Method and apparatus for analyzing lane line image and computer readable medium thereof
CN107154034B (en) State detection method and system for stay wire positioning hook of high-speed rail contact network
Zhang et al. Text line segmentation for handwritten documents using constrained seam carving
JP2014059875A5 (en)
CN109697717B (en) Lining crack identification method based on image automatic search
CN108550138A (en) Refractory brick surface scratch recognition methods based on frequency filtering enhancing
CN113689429B (en) Wood board defect detection method based on computer vision
CN108445009B (en) Crack detection method for solar cell panel
CN109829910B (en) PCB defect detection method based on neighborhood search
CN103942809A (en) Method for detecting joint fissures in rock images
JP2007316685A (en) Traveling path boundary detection device and traveling path boundary detection method
CN105678737A (en) Digital image corner point detection method based on Radon transform
CN111932490A (en) Method for extracting grabbing information of visual system of industrial robot
CN104036514A (en) Circle detection method based on histogram peak value search
Ying et al. Research on an automatic counting method for steel bars' image
CN101777176B (en) Method and device for removing saw teeth in raster image
CN103810713A (en) Eight-connected image processing method and device
CN105469401A (en) Ship groove positioning method based on computer vision
JP2010128616A (en) Circle detection apparatus

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20230815

Address after: 430000 No. 01, floor 11, office tower unit, Guanggu financial center construction project, No. 668, Gaoxin Avenue, Donghu New Technology Development Zone, Wuhan, Hubei Province

Patentee after: Fangzhu (Wuhan) Technology Co.,Ltd.

Patentee after: FOUNDER INTERNATIONAL Co.,Ltd.

Address before: 100080, Beijing City, Haidian District, No. 52 West Fourth Ring Road, SMIC building 19

Patentee before: Founder International Co.,Ltd. (Beijing)

Patentee before: FOUNDER INTERNATIONAL Co.,Ltd.