EP3495093B1 - Eyeglasses lens processing system and eyeglasses lens processing method - Google Patents


Info

Publication number
EP3495093B1
Authority
EP
European Patent Office
Prior art keywords
lens
eyeglasses
edge
edge information
eyeglasses lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP18209972.1A
Other languages
German (de)
French (fr)
Other versions
EP3495093A1 (en)
Inventor
Ryuma SAKUDA
Kyoji Takeichi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nidek Co Ltd
Original Assignee
Nidek Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nidek Co Ltd
Publication of EP3495093A1
Application granted
Publication of EP3495093B1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B24 GRINDING; POLISHING
    • B24B MACHINES, DEVICES, OR PROCESSES FOR GRINDING OR POLISHING; DRESSING OR CONDITIONING OF ABRADING SURFACES; FEEDING OF GRINDING, POLISHING, OR LAPPING AGENTS
    • B24B9/00 Machines or devices designed for grinding edges or bevels on work or for removing burrs; Accessories therefor
    • B24B9/02 Machines or devices designed for grinding edges or bevels on work or for removing burrs; Accessories therefor characterised by a special design with respect to properties of materials specific to articles to be ground
    • B24B9/06 Machines or devices designed for grinding edges or bevels on work or for removing burrs; Accessories therefor characterised by a special design with respect to properties of materials specific to articles to be ground of non-metallic inorganic material, e.g. stone, ceramics, porcelain
    • B24B9/08 Machines or devices designed for grinding edges or bevels on work or for removing burrs; Accessories therefor characterised by a special design with respect to properties of materials specific to articles to be ground of non-metallic inorganic material, e.g. stone, ceramics, porcelain of glass
    • B24B9/14 Machines or devices designed for grinding edges or bevels on work or for removing burrs; Accessories therefor characterised by a special design with respect to properties of materials specific to articles to be ground of non-metallic inorganic material, e.g. stone, ceramics, porcelain of glass of optical work, e.g. lenses, prisms
    • B24B9/146 Accessories, e.g. lens mounting devices
    • B24B47/00 Drives or gearings; Equipment therefor
    • B24B47/22 Equipment for exact control of the position of the grinding tool or work at the start of the grinding operation
    • B24B47/225 Equipment for exact control of the position of the grinding tool or work at the start of the grinding operation for bevelling optical work, e.g. lenses

Definitions

  • the present disclosure relates to an eyeglasses lens processing system, and an eyeglasses lens processing method.
  • eyeglasses lenses processed according to the following operation are fitted in an eyeglasses frame.
  • an operator measures optical characteristics of an eyeglasses lens by using a lens meter, and prints a dot on a surface of the eyeglasses lens.
  • the operator detects the dot by using a shaft alignment apparatus (for example, a cup attachment apparatus attaching a cup to the surface of the eyeglasses lens), and sets a shaft alignment position of the eyeglasses lens (see JP-A-2001-311919, for example).
  • the operator obtains edge information (for example, a position of a front surface, a position of a rear surface, and an edge thickness) by using a lens peripheral edge processing apparatus, and processes a peripheral edge of the eyeglasses lens. More specifically, the edge information of the eyeglasses lens is obtained by an edge information measurement unit mounted on the lens peripheral edge processing apparatus, and the peripheral edge of the eyeglasses lens is processed by a processing unit mounted on the lens peripheral edge processing apparatus. The operator may adjust a position of a bevel formed at the edge of the eyeglasses lens between the step of obtaining the edge information of the eyeglasses lens and the step of processing the peripheral edge of the eyeglasses lens (see JP-A-2009-131939, for example). Further, an eyeglasses lens processing system according to the preamble portion of claim 1 is known from US 2014/297016 A1.
  • An object of the present invention is to provide an eyeglasses lens processing system, and an eyeglasses lens processing method, which are capable of efficiently processing an eyeglasses lens.
  • an eyeglasses lens processing system is provided, as defined in claim 1 and an eyeglasses lens processing method of processing a peripheral edge of an eyeglasses lens, as defined in claim 9.
  • Preferred embodiments are defined in the dependent claims.
  • L and R added to a reference sign respectively indicate a left side and a right side.
  • the terms “identical”, “parallel”, and “vertical” respectively include a substantially identical state, a substantially parallel state, and a substantially vertical state.
  • the following items classified in < > may be used independently or in an associated manner.
  • a shaft alignment apparatus will be described as an example, but at least a part of the technique of the present disclosure is not limited to a case of being applied to the shaft alignment apparatus.
  • at least a part of the technique of the present disclosure is applicable to a lens processing apparatus used to process a lens in a step before an eyeglasses lens is held by a lens holding portion.
  • a shaft alignment apparatus including a lens support portion (for example, a lens support mechanism 10) on which an eyeglasses lens is mounted, and a shaft alignment position setting portion for setting a shaft alignment position, which is an attachment position of the lens holding portion (for example, a lens holding unit 100) with respect to the eyeglasses lens, for the eyeglasses lens mounted on the lens support portion; the lens holding portion clamps and holds the eyeglasses lens in order to process a peripheral edge of the eyeglasses lens.
  • the shaft alignment apparatus may be a cup attachment apparatus (for example, a cup attachment apparatus 1) including a cup attachment portion (for example, a cup attachment mechanism 30) for attaching a cup, which causes the eyeglasses lens to be held by the lens holding portion, to the surface of the eyeglasses lens mounted on the lens support portion, on the basis of the shaft alignment position set by the shaft alignment position setting portion.
  • the shaft alignment apparatus includes the shaft alignment position setting portion (for example, a lens information measurement mechanism 40).
  • the shaft alignment position setting portion sets a shaft alignment position being an attachment position with respect to the eyeglasses lens of the lens holding portion that clamps and holds an eyeglasses lens in order to process a peripheral edge of the eyeglasses lens.
  • the shaft alignment position may be at least one of an optical central position, a geometric central position, and the like of the eyeglasses lens.
  • the shaft alignment apparatus includes a refractive index obtaining portion (for example, a control unit 70).
  • the refractive index obtaining portion obtains a refractive index of an eyeglasses lens.
  • the refractive index obtaining portion may obtain a refractive index of an eyeglasses lens measured by another apparatus which is different from the shaft alignment apparatus.
  • the refractive index obtaining portion may include a refractive index measurement portion for measuring a refractive index of an eyeglasses lens. In this case, a refractive index of an eyeglasses lens is measured by the refractive index measurement portion, and thus the refractive index of the eyeglasses lens is obtained.
  • the shaft alignment apparatus may include a target lens shape obtaining portion (for example, a frame shape measurement mechanism 20).
  • the target lens shape obtaining portion obtains a target lens shape of an eyeglasses lens.
  • a target lens shape applied to an eyeglasses lens may be at least one of an inner peripheral shape of an eyeglasses frame, an outer peripheral shape of a demonstration lens, and the like.
  • the shaft alignment apparatus includes an image data obtaining portion (for example, an image data obtaining mechanism 60).
  • the image data obtaining portion obtains sectional image data including front surface image data for a front surface of an eyeglasses lens and rear surface image data for a rear surface of the eyeglasses lens.
  • the image data obtaining portion may obtain sectional image data based on a target lens shape applied to an eyeglasses lens.
  • the sectional image data may be obtained for at least a part of the entire periphery (all portions in which the target lens shape is formed at each radius vector angle) of the target lens shape of the eyeglasses lens.
  • the sectional image data may be obtained for the entire periphery of the target lens shape of the eyeglasses lens.
  • the sectional image data may be obtained for a plurality of regions (for example, points at respective radius vector angles) in the entire periphery of the target lens shape applied to the eyeglasses lens.
  • the sectional image data may be obtained for some regions in the entire periphery of the target lens shape applied to the eyeglasses lens.
  • the sectional image data may be an image (image data).
  • the sectional image data may be a signal (signal data).
  • the image data obtaining portion includes a projection optical system (for example, a projection optical system 64).
  • the projection optical system projects a measurement light beam toward a front surface or a rear surface of an eyeglasses lens.
  • the projection optical system may perpendicularly apply the measurement light beam to a lens surface of the eyeglasses lens.
  • the measurement light beam may be irregularly reflected on the lens surface of the eyeglasses lens.
  • the projection optical system may apply the measurement light beam to the lens surface of the eyeglasses lens with a predetermined tilt angle.
  • the projection optical system may include a light source (for example, a light source 65).
  • the light source may apply a point-shaped measurement light beam toward an eyeglasses lens.
  • a point light source may be used as the light source.
  • a single point light source may be disposed, and may apply a single point measurement light beam.
  • a plurality of point light sources may be disposed side by side, and may apply a measurement light beam having a width.
  • the light source may apply a slit-shaped measurement light beam toward an eyeglasses lens.
  • a slit plate and a lens may be provided on an optical path between the light source and a lens surface (that is, a front surface or a rear surface of the eyeglasses lens) of the eyeglasses lens.
  • a measurement light beam having a width can be applied to the eyeglasses lens.
  • the projection optical system may include an optical member.
  • as the optical member, at least one of a lens, a mirror, a diaphragm, and the like may be used.
  • a measurement light beam emitted from the light source may be applied toward the lens surface of the eyeglasses lens via each optical member.
  • the optical member is not limited to the above, and other optical members may be used.
  • the image data obtaining portion includes a light receiving optical system (for example, a light receiving optical system 66).
  • the light receiving optical system may include a light receiving element (for example, an imaging element 69).
  • the light receiving optical system receives, with the light receiving element, a first reflected light beam as a result of a measurement light beam being reflected at a front surface of an eyeglasses lens and a second reflected light beam as a result of a measurement light beam being reflected at a rear surface of the eyeglasses lens.
  • the light receiving optical system may receive a reflected light beam (for example, regular reflected light or irregular reflected light) reflected at the lens surface of the eyeglasses lens with the light receiving element.
  • the image data obtaining portion obtains sectional image data (for example, sectional image data 75) including front surface image data formed by the first reflected light beam and rear surface image data formed by the second reflected light beam.
  • the image data obtaining portion can obtain sectional image data in a non-contact manner by using a measurement light beam. Therefore, an operator can efficiently obtain edge information of the eyeglasses lens. Consequently, the image data obtaining portion can obtain sectional image data with a simple configuration.
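As a purely illustrative aside on how front surface image data and rear surface image data of the kind described above might be separated out of a single sectional image, the Python sketch below picks the two brightest, sufficiently separated rows in each image column and treats them as the front- and rear-surface traces. The array layout, the brightness criterion, and the separation threshold are assumptions made for this sketch, not details taken from the patent.

```python
import numpy as np

def extract_surface_profiles(section_img, min_separation_px=5):
    """Locate front- and rear-surface traces in one sectional image.

    Assumes rows correspond to the depth direction and columns to the lateral
    direction, and that the two surface reflections are the two brightest,
    well-separated pixels in each column (illustrative assumption only)."""
    front_rows, rear_rows = [], []
    for col in section_img.T:                      # iterate over image columns
        order = np.argsort(col)[::-1]              # pixel rows, brightest first
        first = int(order[0])
        # the second surface must be at least min_separation_px rows away
        second = next((int(r) for r in order[1:]
                       if abs(int(r) - first) >= min_separation_px), None)
        if second is None:
            front_rows.append(None)
            rear_rows.append(None)
            continue
        front, rear = sorted((first, second))      # assume front surface imaged above rear
        front_rows.append(front)
        rear_rows.append(rear)
    return front_rows, rear_rows
```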
  • the light receiving optical system may include an optical member.
  • as the optical member, at least one of a lens, a mirror, a diaphragm, and the like may be used.
  • a reflected light beam reflected at the lens surface of the eyeglasses lens may be received by the light receiving element via each optical member.
  • the optical member is not limited to the above, and other optical members may be used.
  • the shaft alignment apparatus includes an edge information obtaining portion (for example, the control unit 70).
  • the edge information obtaining portion obtains edge information about an edge of an eyeglasses lens.
  • the edge information may be at least one of a front surface position of the eyeglasses lens, a rear surface position of the eyeglasses lens, an edge thickness of the eyeglasses lens, and the like.
  • the edge information may be at least one of a front surface curve value of the eyeglasses lens, a rear surface curve value of the eyeglasses lens, and the like. Consequently, edge information which is obtained by using a lens peripheral edge processing apparatus in the related art can be obtained by using the shaft alignment apparatus. Therefore, the usage time of the lens peripheral edge processing apparatus is reduced, and thus it is possible to relatively reduce the time required to manufacture eyeglasses.
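The following sketch shows how the edge information listed above (an edge thickness and a surface curve value) could in principle be derived from measured front- and rear-surface positions. The circular sag formula and the 1.523 reference index used to express the curve value in diopters are common optical conventions assumed here, not values stated in the patent.

```python
def edge_thickness(front_z_mm, rear_z_mm):
    """Edge thickness at one radius vector angle: the axial distance between
    the measured front-surface and rear-surface positions."""
    return abs(front_z_mm - rear_z_mm)

def curve_value_from_sag(sag_mm, half_chord_mm, ref_index=1.523):
    """Estimate a surface curve value from one sag measurement.

    The radius of curvature follows from the circular sag formula
        R = (sag^2 + half_chord^2) / (2 * sag)
    and the curve value is expressed in diopters with an assumed reference
    index of 1.523 (a common convention, not taken from the patent)."""
    radius_mm = (sag_mm ** 2 + half_chord_mm ** 2) / (2.0 * sag_mm)
    return (ref_index - 1.0) * 1000.0 / radius_mm
```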
  • the edge information obtaining portion may include an edge measurement portion.
  • the edge measurement portion is used to obtain edge information of an eyeglasses lens.
  • as the edge measurement portion, a measurement optical system (for example, a measurement optical system 61) may be used.
  • the edge measurement portion obtains edge information of an eyeglasses lens mounted on the lens support portion.
  • the edge measurement portion may obtain edge information of the eyeglasses lens in a non-contact manner.
  • the edge measurement portion may obtain edge information of the eyeglasses lens in a contact manner. In this case, edge information may be obtained by bringing a tracking stylus or the like into contact with the eyeglasses lens.
  • the edge information obtaining portion measures the eyeglasses lens by controlling the edge measurement portion, so as to obtain edge information. Consequently, the edge information of the eyeglasses lens can be obtained by using the shaft alignment apparatus, and thus the usage time of the lens peripheral edge processing apparatus can be reduced.
  • the edge information obtaining portion may control the edge measurement portion to obtain edge information on the basis of a target lens shape applied to the eyeglasses lens. Consequently, edge information at a position based on the target lens shape applied to the eyeglasses lens (for example, at least one of a position matching the target lens shape, a position of a bevel at each radius vector angle, and a position of a chamfer at each radius vector angle) can be obtained. In other words, edge information of at least one of a peripheral edge and a periphery of the target lens shape applied to the eyeglasses lens is obtained.
  • for example, edge information of a plurality of regions (for example, points at respective predetermined radius vector angles) in the periphery of a position may be obtained, and edge information at the position may be obtained by performing interpolation using the edge information at those radius vector angles of the periphery.
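Where edge information is measured only at discrete radius vector angles, a value at an arbitrary angle can be produced as sketched below. This is ordinary linear interpolation with wrap-around at 360 degrees, assumed here as one plausible way to realize the interpolation mentioned above.

```python
def interpolate_edge_info(samples, angle_deg):
    """Linearly interpolate edge information (e.g. edge thickness in mm)
    measured at discrete radius vector angles.

    `samples` maps angle in degrees -> measured value; wrap-around at
    360 degrees is handled so any requested angle can be interpolated."""
    angles = sorted(samples)
    a = angle_deg % 360.0
    # measured angles that bracket the requested angle (wrapping if needed)
    prev_a = max((x for x in angles if x <= a), default=angles[-1] - 360.0)
    next_a = min((x for x in angles if x > a), default=angles[0] + 360.0)
    v0 = samples[prev_a % 360.0]
    v1 = samples[next_a % 360.0]
    t = (a - prev_a) / (next_a - prev_a)
    return v0 + t * (v1 - v0)

# usage: thickness measured every 90 degrees, interpolated at 45 degrees
thickness_at_45 = interpolate_edge_info({0.0: 2.0, 90.0: 2.4, 180.0: 3.0, 270.0: 2.6}, 45.0)
```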
  • the edge information obtaining portion may obtain edge information about an edge of the eyeglasses lens on the basis of a refractive index and sectional image data of the eyeglasses lens. Consequently, it is possible to efficiently obtain edge information of the eyeglasses lens by using sectional image data which is changed due to the refractive index of the eyeglasses lens.
  • the edge information obtaining portion may correct edge information on the basis of a refractive index and sectional image data of the eyeglasses lens.
  • the edge information obtaining portion may correct the sectional image data on the basis of the refractive index, and obtain edge information on the basis of the corrected sectional image data.
  • the edge information obtaining portion may correct edge information which is obtained on the basis of the sectional image data, on the basis of the refractive index, so as to obtain edge information.
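A very simplified sketch of a refractive-index based correction is shown below. It uses the plane-parallel, near-normal-incidence approximation that a rear surface viewed through lens material of index n appears at 1/n of its true depth; the patent does not specify this particular correction, so the formula is only a labeled assumption.

```python
def correct_rear_surface_position(front_z_mm, apparent_rear_z_mm, refractive_index):
    """Correct the apparent rear-surface position taken from sectional image data.

    Simplifying assumption (not from the patent): under a plane-parallel,
    near-normal-incidence approximation, the rear surface imaged through the
    lens material appears closer to the front surface by a factor of 1/n,
    so the true depth is the apparent depth multiplied by n."""
    apparent_depth = apparent_rear_z_mm - front_z_mm
    true_depth = apparent_depth * refractive_index
    return front_z_mm + true_depth

# example: an apparent 1.8 mm front-to-rear gap in the image of a lens with
# n = 1.60 corresponds to roughly 2.88 mm of physical edge thickness here
corrected_rear_z = correct_rear_surface_position(0.0, 1.8, 1.60)
```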
  • the shaft alignment apparatus includes a finishing position setting portion (for example, the control unit 70).
  • the finishing position setting portion sets an edge finishing position on the basis of edge information of the eyeglasses lens.
  • the edge finishing position may be at least one of a position of a bevel, a position of a groove, and a position of a chamfer formed at the edge of the eyeglasses lens. Consequently, a finishing position which is set by using a lens peripheral edge processing apparatus in the related art can be set by using the shaft alignment apparatus. Therefore, the usage time of the lens peripheral edge processing apparatus is reduced, the standby time of the lens peripheral edge processing apparatus is mitigated, and thus it is possible to relatively reduce the time required to manufacture eyeglasses.
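One common way such an edge finishing position could be derived from edge information is to place the bevel apex at a fixed fraction of the edge thickness from the front surface, with a minimum offset. The sketch below uses assumed values (a 0.4 ratio and a 0.5 mm minimum) purely for illustration; the patent does not prescribe this rule.

```python
def bevel_apex_position(front_z_mm, rear_z_mm, ratio_from_front=0.4, min_offset_mm=0.5):
    """Place the bevel apex at a fraction of the edge thickness from the front
    surface (assumed rule; the ratio and minimum offset are illustrative values).

    Assumes rear_z_mm >= front_z_mm; the apex is kept at least min_offset_mm
    from the front surface and never placed beyond the rear surface."""
    thickness = rear_z_mm - front_z_mm
    offset = max(ratio_from_front * thickness, min_offset_mm)
    return front_z_mm + min(offset, thickness)
```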
  • the finishing position setting portion displays a finishing position based on edge information on a display portion (for example, a display 2).
  • the finishing position setting portion may automatically set an edge finishing position on the basis of edge information.
  • the finishing position setting portion may allow an operator to manually set an edge finishing position.
  • the finishing position setting portion sets a finishing position on the basis of an operation signal from the operation portion for adjusting the finishing position on the display portion. Consequently, the operator can easily understand the finishing position formed in the eyeglasses lens, and can easily adjust it.
  • the finishing position setting portion may display a finishing position based on edge information of a left eyeglasses lens and a finishing position based on edge information of a right eyeglasses lens on the display portion in a comparable manner.
  • the finishing position setting portion may set a finishing position of at least one of the left eyeglasses lens and the right eyeglasses lens on the basis of an operation signal from the operation portion.
  • the operation portion may be provided in the display portion.
  • the display portion and the operation portion may be separately provided. Consequently, the operator can easily understand the balance between finishing positions in the left eyeglasses lens and the right eyeglasses lens.
  • the operator can adjust each of the finishing positions set in the left eyeglasses lens and the right eyeglasses lens by taking into consideration the balance between the finishing positions. Therefore, the operator can easily manufacture eyeglasses with good appearance.
  • the shaft alignment apparatus, which includes the lens support portion on which an eyeglasses lens is mounted and the shaft alignment position setting portion for setting a shaft alignment position (the attachment position of the lens holding portion, for example a lens holding unit 100, with respect to the eyeglasses lens) for the eyeglasses lens mounted on the lens support portion, may be used as a lens shape measurement apparatus; the lens holding portion clamps and holds the eyeglasses lens in order to process a peripheral edge of the eyeglasses lens.
  • the cup attachment apparatus including the cup attachment portion for attaching a cup for causing an eyeglasses lens to be held by the lens holding portion, to the eyeglasses lens on the basis of a shaft alignment position which is set by the shaft alignment position setting portion, and for attaching the cup to a surface of the eyeglasses lens mounted on the lens support portion, may be used as a lens shape measurement apparatus.
  • for example, a lens peripheral edge processing apparatus (for example, a lens peripheral edge processing apparatus 90) including a processing tool (for example, a processing unit 300) may be used to process the peripheral edge of the eyeglasses lens.
  • at least one of the shaft alignment apparatus, the cup attachment apparatus, and the lens peripheral edge processing apparatus obtains edge information of an eyeglasses lens in a non-contact manner, and thus an operator can efficiently obtain the edge information of the eyeglasses lens.
  • an eyeglasses lens processing system for processing an eyeglasses lens may be built by using the shaft alignment apparatus (or the cup attachment apparatus).
  • the eyeglasses lens processing system may be built by using the shaft alignment apparatus which includes the shaft alignment position setting portion for setting a shaft alignment position being an attachment position with respect to the eyeglasses lens of the lens holding portion (for example, a lens holding unit 100) that clamps and holds the eyeglasses lens in order to process a peripheral edge of the eyeglasses lens, and a lens peripheral edge processing apparatus which includes the processing tool, the lens holding portion for holding the eyeglasses lens, and the process control data obtaining portion (for example, a control unit 95) for obtaining process control data for processing a peripheral edge of the eyeglasses lens, and processes the eyeglasses lens held by the lens holding portion by controlling the processing tool on the basis of the process control data obtained by the process control data obtaining portion.
  • the eyeglasses lens processing system may be built by using the cup attachment apparatus which includes the cup attachment portion for attaching a cup to a surface of an eyeglasses lens, and the lens peripheral edge processing apparatus which includes the processing tool, the lens holding portion for holding the eyeglasses lens attached with the cup, and the process control data obtaining portion for obtaining process control data for processing a peripheral edge of the eyeglasses lens, and processes the eyeglasses lens held by the lens holding portion by controlling the processing tool on the basis of the process control data obtained by the process control data obtaining portion.
  • the cup attachment apparatus may obtain first edge information of the eyeglasses lens.
  • an edge finishing position may be set on the basis of the first edge information of the eyeglasses lens.
  • the lens peripheral edge processing apparatus may obtain second edge information at one or more radius vector angles of the eyeglasses lens. For example, as the second edge information, edge information of one point of the eyeglasses lens may be obtained. For example, as the second edge information, edge information of a plurality of points of the eyeglasses lens may be obtained. In a case where edge information of a plurality of points is obtained as the second edge information, at least one of deformation or tilting of the eyeglasses lens caused by being held by the lens holding portion may be predicted. Consequently, the process control data obtaining portion may obtain process control data on the basis of the first edge information, the second edge information, and the finishing position.
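A rough sketch of how the first edge information, the second edge information, and the finishing position might be combined into process control data is given below. The data layout (dictionaries keyed by radius vector angle) and the use of a simple mean axial shift to model clamping deformation are assumptions made for this sketch, not details from the patent.

```python
def build_process_control_data(first_edge_info, second_edge_info, finishing_pos):
    """Combine the two edge measurements into process control data (sketch).

    `first_edge_info` and `finishing_pos` map radius vector angle -> axial
    position (mm) measured before the lens is chucked; `second_edge_info`
    holds re-measured positions at a few angles after chucking.  The mean
    difference is treated as an axial shift caused by clamping and is applied
    to the planned finishing (bevel) path."""
    deltas = [second_edge_info[a] - first_edge_info[a] for a in second_edge_info]
    axial_shift = sum(deltas) / len(deltas)
    # the shifted finishing path stands in for the process control data
    return {angle: pos + axial_shift for angle, pos in finishing_pos.items()}
```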
  • an eyeglasses lens processing method for processing a peripheral edge of an eyeglasses lens may be performed by using the shaft alignment apparatus (or the cup attachment apparatus).
  • a finishing position setting step of setting an edge finishing position on the basis of the first edge information of the eyeglasses lens may be executed.
  • the eyeglasses lens attached with the cup may be held by the lens holding portion of the lens peripheral edge processing apparatus after the finishing position setting step and the cup attachment step are executed, and, in the process control data obtaining step, the process control data may be obtained on the basis of the finishing position after the finishing position setting step and the cup attachment step are executed.
  • a second edge information obtaining step of obtaining second edge information at one or more radius vector angles of the eyeglasses lens may be executed after the holding step is executed.
  • in the process control data obtaining step, the process control data may be obtained on the basis of the first edge information, the second edge information, and the finishing position.
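Read as a procedure, the method steps above can be outlined as follows. Every method name on the two apparatus objects is a hypothetical placeholder used only to show the order of the steps; none of them is an API defined by the patent.

```python
def process_lens(lens, cup_attachment_apparatus, edger):
    """Outline of the eyeglasses lens processing method (hypothetical API)."""
    # first edge information obtaining step (at the cup attachment apparatus)
    first_edge_info = cup_attachment_apparatus.obtain_edge_info(lens)
    # finishing position setting step
    finishing_pos = cup_attachment_apparatus.set_finishing_position(first_edge_info)
    # cup attachment step
    cup_attachment_apparatus.attach_cup(lens, finishing_pos)
    # holding step (at the lens peripheral edge processing apparatus)
    edger.hold(lens)
    # second edge information obtaining step, at one or more radius vector angles
    second_edge_info = edger.obtain_edge_info(lens, angles=[0, 90, 180, 270])
    # process control data obtaining step
    control_data = edger.obtain_process_control_data(
        first_edge_info, second_edge_info, finishing_pos)
    # processing step
    edger.process(lens, control_data)
```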
  • a lens processing apparatus which is different from a lens peripheral edge processing apparatus processing an eyeglasses lens and executes a processing step before the eyeglasses lens is held by the lens holding portion provided in the lens peripheral edge processing apparatus may include the edge information obtaining portion for obtaining edge information about an edge of the eyeglasses lens.
  • the lens processing apparatus may be a lens meter.
  • the lens processing apparatus may be an eyeglasses frame shape measurement apparatus.
  • a cup attachment apparatus which attaches a cup Cu installed with a lens chuck shaft to a lens LE will be exemplified.
  • at least some techniques exemplified in the present example may be applied to apparatuses other than the cup attachment apparatus.
  • for example, as a shaft alignment apparatus, at least some techniques exemplified in the present example may be applied to an apparatus that holds the lens LE at a lens chuck shaft at a set shaft alignment position, without using the cup Cu.
  • the shaft alignment apparatus will be described as an example, but, at least some techniques exemplified in the present example are not limited to a case of being applied to the shaft alignment apparatus.
  • at least some techniques exemplified in the present example are applicable to a lens processing apparatus used to process a lens in a step before the lens is held by the lens holding portion.
  • a lens processing apparatus which is different from a lens peripheral edge processing apparatus processing an eyeglasses lens and which executes a processing step before the eyeglasses lens is held by the lens holding portion provided in the lens peripheral edge processing apparatus may include the edge information obtaining portion for obtaining edge information about an edge of the eyeglasses lens.
  • the lens processing apparatus may be a lens meter.
  • the lens processing apparatus may be an eyeglasses frame shape measurement apparatus.
  • the lens processing apparatus may include the finishing position setting portion for setting a finishing position.
  • Fig. 1 is an exterior view of a cup attachment apparatus 1.
  • the cup attachment apparatus 1 includes a display (monitor) 2, an input button 3, a lens support mechanism 10, a frame shape measurement mechanism 20, a cup attachment mechanism 30, a lens information measurement mechanism 40, an image data obtaining mechanism 60, and the like.
  • the display 2 may be at least one of a liquid crystal display (LCD), an organic electroluminescence (EL) display, and a plasma display.
  • the display 2 is a touch panel.
  • the display 2 is used as an operation portion (input button 3) inputting signals or parameters for causing the cup attachment apparatus 1 to perform various processes, and a display portion displaying various pieces of information (for example, input parameters, optical characteristics of the lens LE, and edge information of the lens LE).
  • the display 2 and an operation portion may be separately provided.
  • at least one of a mouse, a joystick, a keyboard, and a portable terminal may be used as the operation portion.
  • Fig. 2 is a schematic configuration diagram of the lens support mechanism 10.
  • the lens support mechanism 10 is used to mount the lens LE in a state in which a surface (front surface) of the lens LE is directed upward.
  • the lens support mechanism 10 includes a cylindrical base 11, a ring member 12, a protection cover 13, support pins 14, rotation shafts 15, arms 16, and the like.
  • the protection cover 13 attached to the ring member 12 is provided on the cylindrical base 11.
  • a target plate 46 which will be described later or the like is disposed inside the cylindrical base 11.
  • the rotation shafts 15 are disposed on an outer circumferential portion of the cylindrical base 11.
  • the arms 16 are respectively disposed at upper ends of the rotation shafts 15.
  • the support pins 14 are respectively disposed at distal ends of the arms 16.
  • the support pins 14 are arranged at the same distance from the optical axis L1 and at equal angular intervals about it.
  • the support pins 14 are brought into contact with a back surface (rear surface) of the lens LE so as to hold the lens.
  • the rotation shaft 15 is rotated about a central axis K1 via a rotation transfer mechanism (not illustrated) transmitting rotation of a motor or the like.
  • the arm 16 and the support pin 14 are moved from a retreat position indicated by a solid line to a support position indicated by a dotted line in conjunction with rotation of the rotation shaft 15. Consequently, a size of a region which can be supported by the support pins 14 can be changed by adjusting a distance from the optical axis L1 to the support pin 14 and a gap between the support pins 14.
  • the cylindrical base 11 is rotated about the optical axis L1 via the rotation transfer mechanism configured with a motor 17, a gear mechanism (not illustrated), and the like.
  • rotation of the motor 17 is transmitted to the cylindrical base 11 via the gear mechanism (not illustrated).
  • the rotation shaft 15, the arm 16 provided at the rotation shaft 15, and the support pin 14 provided at the arm 16 are integrally moved around the ring member 12 in conjunction with the rotation of the cylindrical base 11. Consequently, the lens LE mounted on the support pins 14 can be rotated.
  • the frame shape measurement mechanism 20 is used to trace a shape of an eyeglasses frame (hereinafter, referred to as a frame). Consequently, an inner peripheral shape (in other words, a target lens shape applied to the lens LE which will be described later) or the like of the frame may be obtained.
  • JP-A-2015-31847 is referred to with respect to a configuration of the frame shape measurement mechanism 20.
  • Fig. 3 is a schematic configuration diagram of the cup attachment mechanism 30.
  • the cup attachment mechanism 30 is used to attach the cup Cu on the surface (front surface) of the lens LE.
  • the cup attachment mechanism 30 is used to fix (block) the cup Cu to the surface of the lens LE.
  • the cup fixation location is the surface (front surface) of the lens LE, but is not limited thereto.
  • a cup fixation location may be a back surface (rear surface) of the lens.
  • the cup attachment mechanism 30 sets, for the lens LE, a shaft alignment position which is an attachment position of a lens holding unit 100 (refer to Fig. 11) that clamps and holds the lens LE in order to process a peripheral edge of the lens LE.
  • the cup attachment mechanism 30 relatively changes positions of the cup Cu installed at an installation portion 31 which will be described later and the lens LE supported by the lens support mechanism 10, so as to attach the cup Cu at an appropriate position on the surface of the lens LE.
  • an appropriate location (a location where the cup Cu is attached) in the lens LE can be sandwiched between lens chuck shafts 102L and 102R (refer to Fig. 11 ) of the lens peripheral edge processing apparatus 90.
  • a shaft alignment position may be set to an optical central position O (refer to Fig. 6 ) of the lens LE, and may be set to a geometric central position.
  • positions which are different from the above-described positions may be set as a shaft alignment position.
  • the optical central position O of the lens LE is set as a shaft alignment position.
  • the cup attachment mechanism 30 includes the installation portion 31, an arm 32, an arm holding base 33, an X-direction movement mechanism 35, a Y-direction movement mechanism 36, a Z-direction movement mechanism 37, and the like.
  • the installation portion 31 is fixed to the arm 32.
  • the installation portion 31 is fitted to an uneven portion formed in the cup Cu.
  • the arm 32 has a rotation transfer mechanism (not illustrated) variably maintaining a rotation angle of the installation portion 31 in the horizontal direction.
  • the arm 32 is attached to the arm holding base 33.
  • the arm holding base 33 includes a motor 34.
  • rotation of the motor 34 is transferred to the installation portion 31 via the rotation transfer mechanism (not illustrated) provided in the arm 32.
  • the installation portion 31 is rotated about the central axis K1 of the cup Cu due to the rotation of the motor 34. Consequently, a rotation angle of the cup Cu in the horizontal direction can be changed.
  • the X-direction movement mechanism 35 is moved in a leftward-and-rightward direction (X direction) of the cup attachment apparatus 1.
  • the Y-direction movement mechanism 36 is provided on the X-direction movement mechanism 35.
  • the Y-direction movement mechanism 36 is moved in an upward-and-downward direction (Y direction) of the cup attachment apparatus 1.
  • the Z-direction movement mechanism 37 is provided on the Y-direction movement mechanism 36.
  • the Z-direction movement mechanism 37 is moved in a front-and-rear direction (Z direction) of the cup attachment apparatus 1.
  • the Z-direction movement mechanism 37 holds the arm 32, the arm holding base 33, and the motor 34.
  • the X-direction movement mechanism 35 is moved, and thus the Y-direction movement mechanism 36, the Z-direction movement mechanism 37, the arm 32, and the like are moved in the leftward-and-rightward direction with respect to the cup attachment apparatus 1.
  • the Z-direction movement mechanism 37 is moved, and thus the arm 32 and the like are moved in the front-and-rear direction with respect to the cup attachment apparatus 1. Consequently, the installation portion 31 held at the arm 32 is moved up to an upper part of the lens support mechanism 10.
  • the Y-direction movement mechanism 36 is moved, and thus the Z-direction movement mechanism 37, the arm 32, and the like are moved in the upward-and-downward direction with respect to the cup attachment apparatus 1. Consequently, the cup Cu installed at the installation portion 31 is blocked to the front surface of the lens LE mounted on the support pins 14.
  • Fig. 4 is a schematic configuration diagram of the lens information measurement mechanism 40 provided in the cup attachment apparatus 1.
  • the lens information measurement mechanism 40 in the present example is used both as a measurement optical system obtaining optical characteristics of the lens and as a measurement optical system obtaining other information regarding the lens (for example, a shape of the lens, a dot added to the lens, or a hidden mark formed on the lens).
  • the measurement optical system obtaining optical characteristics of the lens and the measurement optical system obtaining the other information regarding the lens may be separately provided.
  • the lens information measurement mechanism 40 includes an illumination optical system 41, a light receiving optical system 45, an imaging optical system 48, and the like.
  • the illumination optical system 41 includes a light source 42, a half mirror 43, a concave mirror 44, and the like.
  • the light source 42 irradiates the lens with a measurement light beam.
  • the light source 42 may be a light emitting diode (LED).
  • a measurement light beam emitted from the light source 42 is reflected at the half mirror 43 disposed on the optical axis L2, so as to coincide with the optical axis L2.
  • the concave mirror 44 reflects a measurement light beam in a direction from the optical axis L1 to the optical axis L2, and shapes the measurement light beam into a parallel light beam (substantially parallel light beam) having a diameter larger than that of the lens LE disposed on the optical axis L1.
  • a lens may be used instead of a concave mirror, but it is advantageous to use the concave mirror in terms of miniaturization of an apparatus.
  • the light receiving optical system 45 includes a target plate 46, an imaging element 47, and the like.
  • the target plate 46 is used to detect the optical center or the like of the lens LE. Details of the target plate 46 will be described later.
  • the imaging element 47 images a measurement light beam which is applied from the light source 42, and passes through the lens LE and the target plate 46.
  • the imaging element 47 may be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS).
  • the light receiving optical system 45 in the present example may have a configuration in which a lens is disposed between the target plate 46 and the imaging element 47.
  • the imaging optical system 48 includes the concave mirror 44, a diaphragm 49, an imaging lens 50, an imaging element 51, and the like.
  • an imaging magnification of the imaging optical system 48 is a magnification at which the entire lens LE is imaged by the imaging element 51.
  • the concave mirror 44 of the imaging optical system 48 is also used as the concave mirror 44 of the illumination optical system 41.
  • the diaphragm 49 is disposed at a focal position (substantial focal position) of the concave mirror 44.
  • the diaphragm 49 has a positional relationship of being conjugate (substantially conjugate) to the light source 42.
  • the imaging element 51 images a reflected light beam which is applied from the light source 42, and is reflected by a recursive reflection member 52 which will be described later.
  • the imaging element 51 may be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS).
  • a focus position of the imaging element 51 is formed near the surface of the lens LE by the imaging lens 50 and the concave mirror 44. Consequently, the dot added to the surface of the lens, the hidden mark formed on the lens, or the like can be imaged in a focused state.
  • Fig. 5 illustrates an example of the target plate 46.
  • a plurality of openings (passing holes for light beams) 55 are formed in the target plate 46 in a predetermined pattern.
  • the openings 55 are formed by sticking the recursive reflection member 52 which will be described later to a region (that is, a region indicated by diagonal lines in Fig. 5 ) other than the openings 55.
  • the circular openings 55 are arranged at an identical interval.
  • shapes or intervals of the openings 55 are not limited to the present example.
  • the openings 55 include a central hole 56 formed at the center of the target plate 46, and peripheral holes 57 formed in the periphery of the central hole 56.
  • the central hole 56 matches the optical axis L1.
  • the central hole 56 in the present example has a size which is different from that of each of the peripheral holes 57, and can thus be differentiated from the peripheral holes 57.
  • a size, the number, a shape, a position, and the like of the central hole 56 may be different from those in the present example as long as the central hole 56 can be differentiated from the peripheral holes 57.
  • a correspondence relationship of each peripheral hole 57 can be specified. More specifically, it is possible to specify a correspondence relationship between an image of the peripheral hole 57 captured in a state in which the lens LE is not mounted on the support pins 14 and an image of the peripheral hole 57 captured in a state in which the lens LE is mounted on the support pins 14.
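As a simplified illustration of how comparing the two captures of the peripheral holes 57 could yield the optical central position O: a spherical lens deflects each hole image by an amount that grows with its distance from the optical center, so fitting a linear displacement field and solving for its zero point gives an estimate of the center. The least-squares model below is an assumption for this sketch, not the algorithm claimed in the patent.

```python
import numpy as np

def estimate_optical_center(holes_without_lens, holes_with_lens):
    """Estimate the optical central position O from peripheral-hole images.

    `holes_without_lens` and `holes_with_lens` are matched (N, 2) arrays of
    hole-image positions captured without and with the lens LE mounted.
    Assumed model: displacement d ~= M @ x + b is linear in position, so the
    zero-displacement point (the optical center) is c = -M^-1 @ b."""
    x = np.asarray(holes_without_lens, dtype=float)
    d = np.asarray(holes_with_lens, dtype=float) - x
    design = np.hstack([x, np.ones((len(x), 1))])       # columns: x, y, 1
    coef, *_ = np.linalg.lstsq(design, d, rcond=None)   # shape (3, 2)
    M = coef[:2].T                                      # 2x2 linear part
    b = coef[2]                                         # offset part
    return -np.linalg.solve(M, b)
```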
  • the recursive reflection member 52 is used to reflect a measurement light beam in the same direction (the substantially same direction) as an incidence direction.
  • the recursive reflection member 52 is stuck to an upper surface of the target plate 46 and an upper surface of a disk member 54 having an opening 53 at the center thereof.
  • the disk member 54 may be rotated about the optical axis L1 by a rotation mechanism (not illustrated).
  • JP-A-2008-299140 is referred to.
  • Fig. 6 is a schematic configuration diagram of the image data obtaining mechanism 60.
  • the image data obtaining mechanism 60 obtains the sectional image data 75 including front surface image data P1s for the front surface of the lens LE and rear surface image data Q2s for the rear surface of the lens LE (refer to Fig. 9 ).
  • the image data obtaining mechanism 60 may include a projection optical system 64 which projects a measurement light beam toward the front surface or the rear surface of the lens LE.
  • the image data obtaining mechanism 60 may include a light receiving optical system 66 which receives a first reflected light beam R1 reflected at the front surface of the lens LE and a second reflected light beam R2 reflected at the rear surface of the lens LE.
  • the image data obtaining mechanism 60 in the present example may obtain the sectional image data 75 including the front surface image data P1s formed by the first reflected light beam R1 and the rear surface image data Q2s formed by the second reflected light beam R2.
  • the image data obtaining mechanism 60 includes a measurement optical system 61, a Y-direction movement mechanism 62, a Z-direction movement mechanism 63, and the like.
  • the measurement optical system 61 in the present example is configured to obtain the sectional image data 75 of the lens LE by using the projection optical system 64 and the light receiving optical system 66 based on the Scheimpflug principle.
  • the measurement optical system 61 may employ optical systems having other configurations instead of optical systems based on the Scheimpflug principle.
  • the projection optical system 64 projects a measurement light beam toward the front surface or the rear surface of the lens LE.
  • the projection optical system 64 may perpendicularly project a measurement light beam to the lens LE.
  • an optical axis L3 of the projection optical system 64 is perpendicular to the lens LE.
  • the projection optical system 64 may project a measurement light beam to the lens LE at a predetermined tilt angle.
  • the measurement light beam may be irregularly reflected at the front surface of the lens LE.
  • the projection optical system 64 has a light source 65.
  • the light source 65 irradiates the lens LE with a measurement light beam.
  • the light source 65 may be an LED, a laser device, or the like.
  • the projection optical system 64 in the present example may have a configuration in which a lens is disposed between the light source 65 and the lens LE.
  • the projection optical system 64 in the present example may have a configuration in which pin holes are disposed between the light source 65 and the lens LE.
  • the light receiving optical system 66 receives the first reflected light beam R1 reflected at the front surface of the lens LE and the second reflected light beam R2 reflected at the rear surface of the lens LE.
  • an optical axis L4 of the light receiving optical system 66 is disposed with a predetermined tilt angle with respect to the optical axis L3 of the projection optical system 64.
  • the light receiving optical system 66 includes a lens 67, a diaphragm 68, an imaging element 69, and the like.
  • the lens 67 collects reflected light beams (for example, a regularly reflected beam of the lens LE and scattered light of the lens LE) reflected at the front surface and the rear surface of the lens LE at a position of the diaphragm 68.
  • the diaphragm 68 is disposed at a focal length of the lens 67.
  • the diaphragm 68 is disposed such that sizes of images of the first reflected light beam R1 and the second reflected light beam R2 captured by the imaging element 69 are not changed (that is, an imaging magnification is constant) even if a distance between the light receiving optical system 66 and the lens LE is changed.
  • the imaging element 69 may be a two-dimensional imaging element (for example, at least one of a CCD and a CMOS), and may be a one-dimensional imaging element (for example, a line sensor).
  • the imaging element 69 is disposed at a position conjugate to the lens LE.
  • the imaging element 69 may be disposed such that a light reception surface thereof is perpendicular to the optical axis L4.
  • the lens 67 and the imaging element 69 are disposed on the optical axis L4 on the basis of the Scheimpflug principle.
  • a measurement light beam applied to the lens LE by the projection optical system 64, the lens system including the lens LE and the lens 67, and the light reception surface (that is, an imaging position) of the imaging element 69 are disposed according to the Scheimpflug relationship.
  • a measurement light beam applied to the lens LE branches into a reflected light beam reflected at the front surface of the lens LE and a measurement light beam directed toward the rear surface of the lens LE.
  • the measurement light beam is reflected in various directions at the front surface of the lens LE, but a reflected light beam (that is, the first reflected light beam R1) reflected in parallel to the optical axis L4 reaches the imaging element 69 via the lens 67 and the diaphragm 68.
  • the measurement light beam directed toward the rear surface of the lens LE is reflected in various directions at the rear surface of the lens LE, but a reflected light beam (that is, the second reflected light beam R2) reflected in parallel to the optical axis L4 reaches the imaging element 69 via the lens 67 and the diaphragm 68.
  • the imaging element 69 can image the first reflected light beam and the second reflected light beam which are reflected with an identical angle at the front surface and the rear surface of the lens LE.
  • a measurement light beam reflected in parallel to the optical axis L4 is guided to the imaging element 69 by using the diaphragm 68, but is not limited thereto.
  • the diaphragm 68 may guide a measurement light beam which is not reflected in parallel to the optical axis L4 to the imaging element 69.
  • the Y-direction movement mechanism 62 adjusts an imaging position of the imaging element 69 by adjusting a position of the measurement optical system 61 in the upward-and-downward direction (Y direction).
  • the Y-direction movement mechanism 62 integrally moves the projection optical system 64 and the light receiving optical system 66 in the Y direction.
  • the Y-direction movement mechanism 62 may be configured with a motor and a slide mechanism.
  • the Z-direction movement mechanism 63 adjusts a position of a measurement light beam which is applied from the light source 65 toward the lens LE by adjusting a position of the measurement optical system 61 in the front-and-rear direction (Z direction).
  • the Z-direction movement mechanism 63 integrally moves the projection optical system 64 and the light receiving optical system 66 in the front-and-rear direction (Z direction).
  • the Z-direction movement mechanism 63 may be configured with a motor and a slide mechanism.
  • an exemplary configuration in which the projection optical system 64 and the light receiving optical system 66 are integrally moved has been described, but this is only an example.
  • the projection optical system 64 and the light receiving optical system 66 may be separately moved.
  • the image data obtaining mechanism 60 includes the Y-direction movement mechanism 62, and can thus obtain the sectional image data 75 (refer to Fig. 9 ) corresponding to the lens LE having various edge thicknesses t.
  • Figs. 12A, 12B and 12C are diagrams for explaining the Y-direction movement mechanism 62 in a case where the edge thicknesses t of the lens LE are different from each other.
  • Fig. 12A is a diagram illustrating a reflected light beam in a case where an edge of the lens LE is thin.
  • Fig. 12B is a diagram illustrating a reflected light beam in a case where an edge of the lens LE is thick.
  • Fig. 12C is a diagram illustrating a reflected light beam in a case where an imaging position of the imaging element 69 is adjusted in the state illustrated in Fig. 12B .
  • a gap W between the first reflected light beam R1 reflected at the front surface of the lens LE and the second reflected light beam R2 reflected at the rear surface of the lens LE changes depending on the edge thickness t of the lens LE.
  • the gap W between the first reflected light beam R1 and the second reflected light beam R2 is wider in the lens LE of which the edge is thick than in the lens LE of which the edge is thin.
  • at least one of the first reflected light beam R1 and the second reflected light beam R2 may not reach the light reception surface of the imaging element 69, depending on the edge thickness t of the lens LE.
  • the second reflected light beam R2 is deviated from the light reception surface of the imaging element 69. Therefore, an imaging position of the imaging element 69 is adjusted by driving the Y-direction movement mechanism 62 such that both of the first reflected light beam R1 and the second reflected light beam R2 reach the light reception surface of the imaging element 69.
  • the Y-direction movement mechanism 62 moves the measurement optical system 61 toward the lens LE. Consequently, the first reflected light beam R1 and the second reflected light beam R2 reach the light reception surface of the imaging element 69 as illustrated in Fig. 12C . Consequently, the image data obtaining mechanism 60 can obtain the sectional image data 75 (refer to Fig. 9 ) including the front surface image data P1s formed by the first reflected light beam R1 and the rear surface image data Q2s formed by the second reflected light beam.
  • in a case where the measurement optical system 61 is disposed in a stationary manner, if a curve value of the lens LE is great, at least one of the first reflected light beam R1 and the second reflected light beam R2 may not reach the light reception surface of the imaging element 69 even if the edge of the lens LE is thin. Even in this case, it is possible to obtain the sectional image data 75 including the front surface image data P1s and the rear surface image data Q2s by driving the Y-direction movement mechanism 62 such that an imaging position of the imaging element 69 is adjusted.
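The dependence of the gap W on the edge thickness t can be estimated with a rough plane-parallel approximation, which also gives a simple criterion for when the Y-direction movement mechanism 62 has to reposition the measurement optical system. Both the W ≈ (t/n)·sin(θ) relation and the sensor-size check below are approximations assumed for this sketch, not relations stated in the patent.

```python
import math

def reflected_beam_gap_mm(edge_thickness_mm, refractive_index, tilt_angle_deg):
    """Rough estimate of the gap W between the images of R1 and R2.

    Plane-parallel approximation (assumed, not from the patent): seen along
    the receiving axis L4, tilted by tilt_angle_deg from the projection axis
    L3, the rear-surface reflection appears at an optical depth of t / n, so
    the two reflections are laterally separated by about (t / n) * sin(theta)."""
    return (edge_thickness_mm / refractive_index) * math.sin(math.radians(tilt_angle_deg))

def needs_y_adjustment(gap_mm, sensor_height_mm, margin_mm=0.5):
    """Return True if both reflections cannot fit on the light reception
    surface with the given margin (sensor size and margin are assumed values)."""
    return gap_mm + 2.0 * margin_mm > sensor_height_mm
```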
  • the image data obtaining mechanism 60 includes the Z-direction movement mechanism 63, and can thus apply a measurement light beam from the projection optical system 64 toward a position based on a target lens shape TG of the lens LE.
  • the position based on the target lens shape applied to the lens LE may be a position matching the target lens shape, and may be a position (for example, a position of a bevel at each radius vector angle or a position of a chamfer at each radius vector angle) obtained through calculation on the basis of the target lens shape.
  • the lens LE mounted on the support pins 14 is rotated in the horizontal direction due to rotation of the cylindrical base 11 (refer to Fig. 2).
  • the image data obtaining mechanism 60 can obtain the sectional image data 75 based on the target lens shape applied to the lens LE.
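Acquiring the sectional image data 75 around the target lens shape can be pictured as the loop below: for each radius vector angle, the cylindrical base 11 rotates the lens and the Z stage aims the measurement beam at the corresponding radius before a sectional image is captured. The device objects and their methods are hypothetical placeholders for this sketch.

```python
def scan_target_lens_shape(target_shape, base, z_stage, camera, step_deg=5):
    """Collect one sectional image per radius vector angle of the target lens
    shape TG (sketch with a hypothetical device API).

    `target_shape` maps radius vector angle (deg) -> radius (mm) and is assumed
    to have an entry at every sampled angle."""
    sections = {}
    for angle in range(0, 360, step_deg):
        radius_mm = target_shape[angle]
        base.rotate_to(angle)        # rotate the lens LE via the cylindrical base 11
        z_stage.move_to(radius_mm)   # aim the measurement beam at the target radius
        sections[angle] = camera.capture()
    return sections
```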
  • Fig. 7 is a schematic configuration diagram of a control system of the cup attachment apparatus 1.
  • a control unit 70 includes a CPU (processor), a RAM, a ROM, and the like.
  • the CPU of the control unit 70 controls the cup attachment apparatus 1.
  • the RAM of the control unit 70 temporarily stores various pieces of information.
  • the ROM of the control unit 70 stores various programs executed by the CPU.
  • control unit 70 is electrically connected to the display 2, the input button 3, the motor 34, the light source 42, the light source 65, the imaging element 47, the imaging element 51, the imaging element 69, a nonvolatile memory 71 (hereinafter, referred to as a memory 71), and the like.
  • control unit 70 is connected to a motor (not illustrated) or the like provided in each of the X-direction movement mechanism 35, the Y-direction movement mechanism 36, and the Z-direction movement mechanism 37 of the cup attachment mechanism 30.
  • control unit 70 is connected to a motor (not illustrated) or the like provided in each of the Y-direction movement mechanism 62 and the Z-direction movement mechanism 63 of the image data obtaining mechanism 60.
  • a non-transitory storage medium which can hold storage contents even if the supply of power is stopped may be used as the memory 71.
  • a hard disk drive, a flash ROM, or a USB memory may be used as the memory 71.
  • the memory 71 may store optical characteristics of the lens LE measured by the lens information measurement mechanism 40, sectional image data of the lens LE obtained by the measurement optical system 61, an inner peripheral shape of a frame traced by using the frame shape measurement mechanism 20, and the like.
  • In order to manufacture eyeglasses, an operator processes the lens LE by using the cup attachment apparatus 1 and the lens peripheral edge processing apparatus.
  • the cup attachment apparatus 1 is used to attach the cup Cu to the surface of the lens LE.
  • the lens peripheral edge processing apparatus is used to obtain edge information of the lens LE, to set a finishing position (for example, a position of a bevel, a position of a groove, or a position of a chamfer) formed on an edge of the lens LE, and to process a peripheral edge of the lens LE.
  • steps S1 to S12 will be described in order with reference to a flowchart illustrated in Fig. 8 .
  • steps S1 to S7 are executed by using the cup attachment apparatus 1
  • steps S8 to S12 are executed by using the lens peripheral edge processing apparatus.
  • the operator activates the cup attachment apparatus 1, and sequentially executes steps described below.
  • the control unit 70 obtains a target lens shape applied to the lens LE.
  • the target lens shape applied to the lens LE may be an outer peripheral shape of a demonstration lens, an inner peripheral shape of a frame, or the like.
  • the entire image of the demonstration lens may be captured by using the lens information measurement mechanism 40, so that the outer peripheral shape thereof is detected.
  • the frame may be traced by using the frame shape measurement mechanism 20, so that the inner peripheral shape thereof is detected.
  • a target lens shape applied to the lens LE obtained by using other apparatuses may be read to the cup attachment apparatus 1.
  • the control unit 70 stores the target lens shape applied to the lens LE obtained as mentioned above, in the memory 71.
  • target lens shapes applied to each of a left lens and a right lens may be obtained.
  • a target lens shape applied to either of a left lens and a right lens may be obtained, and may be laterally reversed such that a target lens shape applied to the other lens is obtained.
  • the control unit 70 obtains a refractive index of the lens LE.
  • the refractive index of the lens LE may be obtained by the operator operating the input button 3.
  • a refractive index is defined by a material (for example, plastic or glass) of the lens LE (in other words, a refractive index is defined by the lens LE to be processed), and thus the operator may input an existing refractive index.
  • the operator may input a refractive index of the lens LE which is obtained in advance by using a separate apparatus.
  • the cup attachment apparatus 1 may have a configuration which can measure a refractive index of the lens LE, and may obtain a refractive index by using the configuration.
  • the control unit 70 stores the refractive index of the lens LE obtained as mentioned above, in the memory 71.
  • the control unit 70 displays the target lens shape on the display 2.
  • the operator may set a processing condition for the lens LE for each of the left lens and the right lens by operating the input button 3.
  • the processing condition of the lens LE may be at least one of the type of lens LE (for example, a single focal lens, a bifocal lens, and a progressive lens), a material of the lens LE, a material of the frame, the presence or absence of various processings (for example, polishing, chamfering, and grooving), and an attachment position of the cup Cu to the lens LE (for example, the optical central position of the lens LE and the geometric central position of the target lens shape).
  • the operator may set a layout of the lens LE by operating the input button 3.
  • the layout of the lens LE may be at least one of a distance between the frame centers, a pupillary distance of an eyeglasses wearer, and an astigmatic axis angle of an eyeglasses wearer.
  • the operator mounts the lens LE (for example, the left lens in the present example) on the support pins 14, and attaches the cup Cu to the installation portion 31.
  • the operator operates a mode selection button (not illustrated), so as to switch a mode from a setting mode to a blocking mode.
  • the control unit 70 turns on the light source 42 of the lens information measurement mechanism 40.
  • a guide mark which is a target for aligning the optical central position (that is, an optical central mark which will be described later) of the lens LE mounted on the support pins 14 with the optical axis L1 is displayed on the display 2.
  • control unit 70 obtains a position coordinate of an opening image captured by the imaging element 51, and detects the optical central position and optical characteristics (for example, a spherical power, a cylindrical power, or an astigmatic axis angle) of the lens LE.
  • JP-A-2008-241694 is referred to with respect to details thereof.
  • an optical central mark is displayed at the optical central position O of the lens LE detected by the control unit 70.
  • control unit 70 performs image processing on an image of the lens LE captured by the imaging element 51 so as to detect an outer shape of the lens LE.
  • the target lens shape applied to the lens LE is displayed to overlap the outer shape of the lens LE on the display 2.
  • a display position and a display size of the target lens shape applied to the lens LE may be determined depending on the optical central position O of the lens LE, the layout data of the lens LE, an imaging magnification of the imaging optical system 48, a positional relationship between the optical axis L1 and the optical axis L2, and the like.
  • the operator moves the lens LE such that the optical central mark matches the guide mark while viewing the display 2, and operates a blocking button (not illustrated) displayed on the display 2.
  • the control unit 70 drives the X-direction movement mechanism 35 and the Y-direction movement mechanism 36 of the cup attachment mechanism 30 so as to move the arm 32 such that a central axis K2 of the cup Cu is located at the optical central position O of the lens LE.
  • the installation portion 31 may be rotated about the central axis K2 on the basis of the detected astigmatic axis angle.
  • the control unit 70 moves the arm 32 downward by driving the Z-direction movement mechanism 37. Consequently, the cup Cu installed at the installation portion 31 is blocked to the front surface of the lens LE mounted on the support pins 14 on the basis of the set shaft alignment position (that is, the optical central position O of the lens LE).
  • the operator operates the mode selection button (not illustrated), and switches the mode from the blocking mode to an edge information measurement mode. Consequently, in the present example, edge information of the lens LE mounted on the lens support mechanism 10 is measured.
  • the control unit 70 turns on the light source 65 of the image data obtaining mechanism 60.
  • the control unit 70 controls driving of the Z-direction movement mechanism 63, so as to match the optical axis L3 with an initial position on the target lens shape TG in the lens LE.
  • the initial position may be a position on a virtual line V1 which extends in the Z direction through the optical central position O of the lens LE, and may be a position on a virtual line V2 which extends in the X direction through the optical central position O of the lens LE.
  • the initial position may be set to any position.
  • the optical axis L3 matches a point P1 which is located on the target lens shape TG applied to the lens LE and on the virtual line V1. Consequently, a measurement light beam applied from the light source 65 branches into the first reflected light beam R1 reflected at the point P1 on the front surface of the lens LE and the second reflected light beam R2 which reaches a point Q1 on the rear surface of the lens LE and is reflected there.
  • control unit 70 controls driving of the Y-direction movement mechanism 62, and adjusts a distance between the image data obtaining mechanism 60 and the lens LE in the Y direction such that the imaging element 69 can receive both of the first reflected light beam R1 and the second reflected light beam R2.
  • control unit 70 performs image processing (for example, edge detection) on the sectional image data 75 (refer to Fig. 9 ) captured by the imaging element 69, so as to detect rising of luminance.
  • the control unit 70 determines that both of an image (that is, the front surface image data P1s of the lens LE which will be described later) of the first reflected light beam R1 and an image (that is, the rear surface image data Q2s of the lens LE which will be described later) of the second reflected light beam R2 are detected, and finishes driving of the Y-direction movement mechanism 62. Consequently, the control unit 70 can obtain the sectional image data 75 at the position of the point P1 in the target lens shape TG applied to the lens LE.
  • Fig. 9 illustrates an example of the sectional image data 75.
  • the sectional image data 75 includes the front surface image data of the lens LE formed by the first reflected light beam R1 and the rear surface image data of the lens LE formed by the second reflected light beam R2.
  • an image of the point P1 on the lens LE is the front surface image data P1s of the lens LE.
  • an image of a point Q2 located on an extension line (that is, the dotted line illustrated in Fig. 6 ) of the second reflected light beam R2 is the rear surface image data Q2s of the lens LE.
  • the lens LE has a predetermined refractive index, and the second reflected light beam R2 is bent at the front surface of the lens LE and then reaches the imaging element 69. It appears that the point Q1 on the rear surface of the lens LE is located at the position of the point Q2 in the imaging element 69.
  • the control unit 70 drives the motor 17 so as to rotate the cylindrical base 11, and rotates the lens LE mounted on the support pins 14 in the horizontal direction by one round.
  • the control unit 70 controls the Y-direction movement mechanism 62 such that the first reflected light beam R1 is adjusted to reach an identical position on the light reception surface of the imaging element 69 at all times.
  • the control unit 70 controls the Z-direction movement mechanism 63 on the basis of the target lens shape TG applied to the lens LE, and matches the optical axis L3 with points (for example, points P2, P3, P4, ..., and Pn) corresponding to the target lens shape.
  • the points corresponding to the target lens shape may be points with gaps of predetermined angles (for example, 0.5 degrees and 1 degree) with the optical central position O as the center, and may be points of any number (for example, 1000 points).
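For illustration only, the following Python sketch (not part of the described apparatus) generates measurement points at a fixed angular gap around the optical central position O; the function name radius_at and the circular example shape are hypothetical placeholders.

```python
import math

def measurement_points(radius_at, step_deg=0.5):
    """Return (angle, x, z) points on the target lens shape at a fixed
    angular gap around the optical central position O.

    radius_at : hypothetical callable giving the target-lens-shape radius
                (distance from O) for a radius vector angle in degrees.
    """
    points = []
    angle = 0.0
    while angle < 360.0:
        r = radius_at(angle)
        points.append((angle,
                       r * math.cos(math.radians(angle)),
                       r * math.sin(math.radians(angle))))
        angle += step_deg
    return points

# Example: a circular target lens shape with a 25 mm radius gives 720 points.
print(len(measurement_points(lambda a: 25.0)))
```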
  • control unit 70 controls the Z-direction movement mechanism 63 while rotating the lens LE in order to consecutively obtain the sectional image data 75 for each point on a radius vector angle in the target lens shape applied to the lens LE.
  • control unit 70 controls the Y-direction movement mechanism 62 while rotating the lens LE such that the front surface image data P1s of the lens LE is displayed at a predetermined position in the sectional image data 75.
  • Fig. 14 is a diagram for explaining control for displaying the front surface image data P1s at a predetermined position.
  • the control unit 70 measures a deviation amount Δd between the front surface image data P1s of the lens LE and a predetermined position M in a depth direction (in other words, an upward-and-downward direction of the sectional image data 75) with respect to the sectional image data 75.
  • the control unit 70 calculates a drive amount of the Y-direction movement mechanism 62 on the basis of the measured deviation amount Δd, and moves the measurement optical system 61. Consequently, an imaging position of the first reflected light beam R1 reaching the light reception surface of the imaging element 69 is changed, and thus the front surface image data P1s matches the predetermined position M in the sectional image data 75.
  • the control unit 70 measures the deviation amount Δd for all radius vector angles in the target lens shape of the lens LE, and drives the Y-direction movement mechanism 62 on the basis thereof. In other words, whenever a position for obtaining the sectional image data 75 on the lens LE is changed, the control unit 70 drives the Y-direction movement mechanism 62 such that the front surface image data P1s is displayed at the predetermined position M in the sectional image data 75.
  • the control unit 70 specifies a front surface position (for example, a position coordinate) of each radius vector angle in the target lens shape applied to the lens LE on the basis of a drive amount in which the Y-direction movement mechanism 62 has been driven by using the deviation amount Δd.
  • the front surface position of each radius vector angle may be specified by obtaining the number of pulses of the motor configuring the Y-direction movement mechanism 62.
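As a rough illustration of this tracking, the hypothetical Python sketch below converts a measured deviation Δd into a drive amount for the Y-direction movement mechanism 62 and accumulates the front surface position; the pixel pitch and pulses-per-millimetre values are assumptions, not values taken from the description.

```python
def track_front_surface(p1s_row, target_row_m, pixel_pitch_mm,
                        pulses_per_mm, y_position_mm):
    """Keep the front surface image P1s at the predetermined row M.

    Returns the motor drive amount (in pulses) and the updated Y position,
    which also serves as the front surface position of the current
    radius vector angle.
    """
    # Deviation Δd between the front surface image and the target row M,
    # converted from pixels into millimetres (assumed pixel pitch).
    delta_d_mm = (p1s_row - target_row_m) * pixel_pitch_mm

    # Drive amount of the Y-direction movement mechanism that cancels Δd.
    drive_pulses = round(delta_d_mm * pulses_per_mm)

    # The accumulated Y position specifies the front surface position.
    new_y_position_mm = y_position_mm + delta_d_mm
    return drive_pulses, new_y_position_mm

# Hypothetical call: P1s observed 15 pixels below M, 0.01 mm per pixel,
# 1000 pulses per millimetre, current Y position 12.3 mm.
print(track_front_surface(415, 400, 0.01, 1000, 12.3))
```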
  • the control unit 70 rotates the lens LE in the horizontal direction by one round, obtains sectional image data of points on all radius vector angles in the target lens shape applied to the lens LE and front surface positions of the lens LE, and stores the sectional image data and the front surface positions in the memory 71.
  • the lens LE mounted on the support pins 14 is rotated by one round, but may be rotated by any round.
  • the lens LE mounted on the support pins 14 may be rotated by two rounds, and sectional image data at each of a bevel position and a chamfer position which are calculated on the basis of a target lens shape applied to the lens LE may be obtained.
  • the lens LE may be rotated by one round such that the optical axis L3 matches the bevel position on the lens LE, and may then be further rotated by one round such that the optical axis L3 matches the chamfer position (for example, a position inward by a predetermined distance from the bevel position) on the lens LE.
  • the cup attachment apparatus 1 may have an edge measurement mechanism obtaining edge information of the lens LE.
  • the control unit 70 may measure the lens LE by controlling the edge measurement mechanism, so as to obtain edge information.
  • the control unit 70 may control the edge measurement mechanism on the basis of the target lens shape applied to the lens LE, so as to obtain edge information.
  • the image data obtaining mechanism 60 is used as the edge measurement mechanism, and the control unit 70 obtains edge information.
  • the control unit 70 obtains edge information (first edge information) about an edge of the lens LE with respect to each piece of the obtained sectional image data 75.
  • edge information based on the target lens shape applied to the lens LE can be obtained.
  • as the edge information, a front surface position of the edge, a rear surface position of the edge, or an edge thickness of the lens LE may be obtained.
  • a front surface curve value or a rear surface curve value of the lens LE may be obtained.
  • a description will be made of a case where an edge thickness is obtained as the edge information of the lens LE.
  • the control unit 70 obtains position coordinates (for example, pixel coordinates) of the front surface image data P1s and the rear surface image data Q2s in the sectional image data 75. For example, the control unit 70 computes a gap h between the front surface image data P1s of the lens LE and the rear surface image data Q2s of the lens LE by using the pixel coordinates of the front surface image data P1s and the rear surface image data Q2s. For example, in this case, the control unit 70 may compute the number of pixels of the gap h by obtaining a difference between the pixel coordinates. For example, in this case, the control unit 70 sets an actual distance per pixel of the imaging element 69, and can thus convert the number of pixels of the gap h into an actual distance.
  • the second reflected light beam is bent at the front surface of the lens LE in the sectional image data 75 as described above, and thus the gap h between the front surface image data P1s and the rear surface image data Q2s of the lens LE does not necessarily match the edge thickness t of the lens LE.
  • the control unit 70 may obtain the edge thickness t of the lens LE based on the refractive index and the sectional image data 75 of the lens LE.
  • an actual distance of the gap h between the front surface image data P1s and the rear surface image data Q2s of the lens LE obtained on the basis of the sectional image data 75 is corrected on the basis of the refractive index of the lens LE, and thus the control unit 70 obtains the edge thickness t of the lens LE.
  • the edge thickness obtained on the basis of the sectional image data is corrected on the basis of the refractive index, and thus the edge thickness of the lens LE is obtained.
  • control unit 70 may correct an actual distance of the gap h between the front surface image data P1s and the rear surface image data Q2s of the lens LE so as to compute the edge thickness t of the lens LE by using the following equation.
  • t = h × n / (cos θ × β)
  • where n indicates a refractive index of the lens LE, β indicates an imaging magnification of the light receiving optical system 66, and θ indicates a tilt angle of the optical axis L4 with respect to the front surface of the lens LE.
  • the imaging magnification β and the tilt angle θ are known values by design.
  • the control unit 70 may call the refractive index n obtained in advance in step S2 from the memory 71, and obtain the edge thickness t of the lens LE by computing the above equation.
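The computation can be pictured with the short Python sketch below. It is only an illustration under stated assumptions: the pixel pitch and the numerical values are hypothetical, and the grouping t = h × n / (cos θ × β) follows the equation as reconstructed above.

```python
import math

def edge_thickness(p1s_row, q2s_row, pixel_pitch_mm, n, beta, theta_deg):
    """Sketch of the edge-thickness computation described above.

    p1s_row, q2s_row : row coordinates (pixels) of the front surface image
                       P1s and the rear surface image Q2s in the sectional
                       image data.
    pixel_pitch_mm   : actual distance per pixel of the imaging element
                       (hypothetical value supplied by the caller).
    n                : refractive index of the lens.
    beta             : imaging magnification of the light receiving optical
                       system (known by design).
    theta_deg        : tilt angle of the optical axis L4 with respect to the
                       lens front surface (known by design).
    """
    # Gap h between the front surface image and the rear surface image,
    # first in pixels, then converted into an actual distance.
    gap_pixels = abs(q2s_row - p1s_row)
    h = gap_pixels * pixel_pitch_mm

    # Reconstructed correction: t = h * n / (cos(theta) * beta).
    return h * n / (math.cos(math.radians(theta_deg)) * beta)

# Example with made-up numbers: a 120-pixel gap, 0.01 mm/pixel, n = 1.60,
# unit magnification and a 30-degree tilt.
print(round(edge_thickness(350, 470, 0.01, 1.60, 1.0, 30.0), 3))
```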
  • the control unit 70 stores the edge thickness t in the memory 71 in correspondence to a position of each point on a radius vector angle in the target lens shape applied to the lens LE.
  • the front surface and the rear surface of the lens LE are curved, and thus a measurement light beam applied from the light source 65 is not necessarily perpendicular to the front surface of the lens LE.
  • the measurement light beam is applied to the front surface of the lens LE in a tilted manner, and the light beam reflected from the rear surface of the lens LE is bent at the front surface of the lens LE. Therefore, the obtained edge thickness t of the lens LE may be further corrected by taking into consideration a front surface curve value of the lens LE.
  • a table (conversion table) of coefficients corresponding to the refractive index n and the front surface curve value of the lens LE may be stored in the memory 71 in advance, and the control unit 70 may multiply the edge thickness t of the lens LE by a corresponding coefficient.
  • the front surface curve value of the lens LE may be obtained in advance by using a separate apparatus, and may be read to the cup attachment apparatus 1, and may be input by the operator.
  • the front surface curve value of the lens LE may be obtained through image processing on the sectional image data 75 which is obtained by using the cup attachment apparatus 1.
  • An equation used for computation of the edge thickness t is not limited to the equation in the present example, and an equation including at least one of the refractive index n of the lens LE, a front surface curve value, and a distance D from the optical central position O to a point (for example, the point P1 in Fig. 6 ) on each radius vector angle, as a parameter, may be used.
  • blocking (step S4), obtainment of sectional image data (step S5), and obtainment of edge information (step S6) have been described above with respect to the left lens.
  • the operator sequentially executes these steps on the right lens in the same manner, so as to obtain the edge thickness t corresponding to positions in the target lens shape applied to the lens LE.
  • the control unit 70 changes an operation screen displayed on the display 2 to a simulation screen, and sets a finishing position.
  • the control unit 70 sets an edge finishing position on the basis of the obtained edge information (the edge thickness t in the present example).
  • the edge finishing position is a position of a bevel, a position of a groove, or a position of a chamfer formed in the edge.
  • the finishing position may be automatically set (that is, initially set) on the basis of the obtained edge thickness t, and may be adjusted by the operator.
  • a description will be made of a case of setting a position of a bevel as the edge finishing position.
  • control unit 70 may display the finishing position on the basis of the edge thickness on the display 2, and may set the finishing position on the basis of an operation signal from the input button 3 for adjusting the finishing position on the display.
  • the control unit 70 may display a position of a bevel of the left lens and a position of a bevel of the right lens in a comparable manner.
  • the control unit 70 may set a finishing position of at least one of the left lens and the right lens on the basis of an operation signal from the input button 3. Consequently, positions of the bevels (apex positions of the bevels) can be disposed with good balance with respect to the respective edge thicknesses t by taking into consideration a difference between the edge thicknesses t of the left lens and the right lens.
  • Figs. 10A and 10B illustrate examples of a simulation screen 80.
  • Fig. 10A is a diagram illustrating the entire simulation screen 80.
  • Fig. 10B is a diagram illustrating a partially enlarged simulation screen 80.
  • target lens shapes (a target lens shape TGL applied to the left lens and a target lens shape TGR applied to the right lens), shapes of the edges (a shape 81L of the edge of the left lens and a shape 81R of the edge of the right lens), and positions of the bevels to be processed in the lenses LE are displayed on the simulation screen 80.
  • cursors (a cursor 82L and a cursor 82R) which are moved on the target lens shapes by an operation of the operator are displayed on the target lens shapes applied to the lenses LE.
  • the operator can designate an observation direction of the edge of the lens LE by using the cursor, and can display a shape of the edge of the lens LE.
  • entry fields 83 (an entry field 83L and an entry field 83R) for changing a curve value of a bevel curve of the lens LE (that is, a path along which the bevel is disposed in the edge of the lens LE), a position of the bevel curve in the front-and-rear direction, a tilt of the bevel curve, and the like are provided on the simulation screen 80.
  • sectional shapes of rims of the frames may be displayed in correspondence to display positions of bevel positions 84L and 84R on the simulation screen 80.
  • the control unit 70 automatically sets a temporary position of a bevel formed in the lens LE.
  • a bevel curve may be a curve along a front surface curve of the lens LE.
  • an apex position of the bevel may be set to pass through a position corresponding to a half of the edge thickness at the portion where the edge thickness t is smallest, on the basis of the edge thickness t of the lens LE.
  • the control unit 70 sets temporary positions of bevels for the left lens and the right lens as mentioned above. In this case, the control unit 70 may automatically adjust temporary positions of bevels formed in the left and right lenses by taking into consideration a difference between the edge thicknesses t of the left lens and the right lens.
  • a distance BL from an apex position of the bevel in the left lens to the front surface of the lens LE and a distance BR from an apex position of the bevel in the right lens to the front surface of the lens LE may be automatically adjusted to be the same as each other.
  • a difference between the distance BL and the distance BR may be automatically adjusted to be equal to or less than a predetermined threshold value.
  • the predetermined threshold value may be set in advance through tests or simulation.
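One possible behaviour of this automatic adjustment is sketched below in Python: the initial apex distance of each lens is taken as half of its smallest edge thickness, and the two distances are pulled to their mean when they differ by more than the threshold. The description does not fix the exact adjustment rule, so this is an assumption for illustration.

```python
def balance_bevel_apexes(edge_thickness_left, edge_thickness_right,
                         threshold_mm=0.3):
    """Set temporary bevel apex distances BL and BR from the lens front
    surfaces and keep their difference within the threshold.

    edge_thickness_left / _right : edge thickness t (mm) at each radius
                                   vector angle of the left / right lens.
    threshold_mm                 : assumed balancing threshold.
    """
    # Initial setting: the apex passes through half of the thinnest edge.
    bl = min(edge_thickness_left) / 2.0
    br = min(edge_thickness_right) / 2.0

    # Automatic adjustment when the left/right difference is too large.
    if abs(bl - br) > threshold_mm:
        bl = br = (bl + br) / 2.0
    return bl, br

# Hypothetical thickness profiles (mm) for the left and right lenses.
print(balance_bevel_apexes([2.4, 2.1, 3.0], [4.2, 3.8, 4.6]))
```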
  • the operator operates a determination button (not illustrated).
  • the control unit 70 creates bevel formation data for forming bevels in the edges of the left lens and the right lens in response to an operation instruction, and stores the bevel formation data in the memory 71.
  • the control unit 70 transmits the created bevel formation data to the lens peripheral edge processing apparatus along with the front surface position of the lens LE.
  • the operator may manually adjust the temporary positions of the bevels which are initially set. For example, the operator moves the cursor 82L by touching the display 2, and thus designates any radius vector angle of the left lens. Consequently, the operator can determine an observation point with the emphasis on appearance based on a positional relationship between the frame and the edge of the lens LE. Consequently, the operator can compare a positional relationship between the front surface of the lens LE and the apex position of the bevel with respect to the left lens and the right lens.
  • the cursor 82R may be automatically moved to a laterally symmetric position in conjunction with the cursor 82L.
  • the operator compares the shape 81L of the edge of the left lens with the shape 81R of the edge of the right lens, and changes each of the distance BL from the apex position of the bevel in the left lens to the front surface of the lens LE and the distance BR from the apex position of the bevel in the right lens to the front surface of the lens LE.
  • the apex positions of the bevels may be moved by the operator entering any values into the entry field 83.
  • the control unit 70 changes display positions of the positions 84R and 84L of the bevels in correspondence to entered values.
  • the operator may change curve values of bevel curves or tilts by entering any values into the entry fields 83.
  • the control unit 70 computes positions of bevels again in correspondence to the entered values, and changes display positions of the positions 84L and 84R of the bevels.
  • the control unit 70 can set a temporary position of a bevel formed in the edge of the lens LE on the basis of an operation signal which is input from the display 2 due to an operation performed by the operator.
  • the operator operates the determination button (not illustrated).
  • the control unit 70 creates bevel formation data for forming bevels in the edges of the left lens and the right lens in response to an operation instruction, and stores the bevel formation data in the memory 71.
  • the control unit 70 transmits the created bevel formation data to the lens peripheral edge processing apparatus along with the front surface position of the lens LE.
  • Fig. 11 is a schematic configuration diagram of a lens peripheral edge processing apparatus 90.
  • the control unit 95 of the lens peripheral edge processing apparatus may receive the above-described bevel formation data and front surface position of the lens LE, and may store the bevel formation data and the front surface position in a memory (not illustrated).
  • the lens peripheral edge processing apparatus 90 in the present example includes a lens holding unit 100 which holds the lens LE with lens chuck shafts 102, an edge information obtaining unit 200 obtaining edge information of the lens LE, a processing unit 300 processing a peripheral edge of the lens LE, and the like.
  • the lens holding unit 100 includes a lens rotation unit 100a rotating the lens LE, a chuck shaft movement unit 100b moving the lens chuck shafts 102 in the X axis direction, and an inter-axial distance changing unit 100c moving the lens chuck shafts 102 in the Y direction (changing an inter-axial distance between the lens chuck shafts and a rotation axis of the processing unit 300).
  • JP-A-2015-131374 is referred to with respect to detailed configurations of the lens peripheral edge processing apparatus 90.
  • the operator holds the left lens with lens chuck shafts 102L and 102R of the lens holding unit 100.
  • a rotation central axis K3 of the lens chuck shafts 102L and 102R is designed to match a central position (in other words, the optical central position O of the lens LE) of the cup Cu.
  • the cup Cu is held to be directed in a predetermined direction with respect to the rotation central axis K3.
  • the cup Cu is held such that the virtual line V1 of the lens LE matches the Y direction of the lens peripheral edge processing apparatus 90, and the virtual line V2 of the lens LE matches the Z direction of the lens peripheral edge processing apparatus 90.
  • the X direction in the cup attachment apparatus 1 corresponds to the Z direction in the lens peripheral edge processing apparatus 90.
  • the Y direction in the cup attachment apparatus 1 corresponds to the X direction in the lens peripheral edge processing apparatus 90.
  • the Z direction in the cup attachment apparatus 1 corresponds to the Y direction in the lens peripheral edge processing apparatus 90.
  • the operator starts processing of the peripheral edge of the lens LE by operating a processing start button (not illustrated).
  • the control unit 95 of the lens peripheral edge processing apparatus 90 controls the edge information obtaining unit 200 to obtain edge information (second edge information) of at least one point at a radius vector angle based on the target lens shape in the front surface of the lens LE.
  • edge information of the point P1 on the virtual line V1 is obtained.
  • control unit 95 moves the lens LE held by the lens chuck shafts 102L and 102R by driving the lens rotation unit 100a, the chuck shaft movement unit 100b, and the inter-axial distance changing unit 100c such that a tracing stylus 201 of the edge information obtaining unit 200 is brought into contact with the point P1 on the lens LE.
  • control unit 95 recognizes positions of the point P1 in the Y direction and the Z direction with the rotation central axis K3 as a reference, and drives the lens rotation unit 100a and the inter-axial distance changing unit 100c.
  • the control unit 95 moves the lens LE in the X direction by driving the chuck shaft movement unit 100b until a sensor (not illustrated) provided in the tracing stylus 201 detects contact with the lens LE.
  • the control unit 95 measures movement amounts in which the lens chuck shafts 102L and 102R are moved in the X direction from initial positions thereof. Consequently, a position where the lens LE is located in a lens chuck shaft direction (that is, the X direction) is recognized. In other words, the position (that is, a position coordinate) of the point P1 on the lens LE is recognized.
  • the control unit 95 may slide the tracing stylus 201 on the lens LE so as to obtain edge information of one surface at a radius vector angle of the lens LE.
  • the control unit 95 correlates the front surface position of each point (that is, a position coordinate of each point) in the lens LE, obtained in the cup attachment apparatus 1, with the position coordinate of the point P1 in the X direction on the lens LE, obtained in the lens peripheral edge processing apparatus 90.
  • position coordinates in the X direction of the lens peripheral edge processing apparatus 90 are obtained with respect to the point P2 and the subsequent points on the lens LE.
  • the position coordinate in the X direction of the lens peripheral edge processing apparatus 90 is obtained in step S9.
  • correlation of position coordinates of the point P2 and the subsequent points will be described.
  • control unit 95 calls a front surface position (position coordinate) of each radius vector angle in the target lens shape applied to the lens LE from the memory (not illustrated), and correlates a position coordinate therewith for each radius vector angle of the lens LE on the basis of the position coordinate of the point P1 on the lens LE. More specifically, a position coordinate of the point P2 on the lens LE in the lens peripheral edge processing apparatus 90 is obtained by adding the position coordinate of the point P1 obtained by using the lens peripheral edge processing apparatus 90 to the position coordinate of the point P2 obtained in the cup attachment apparatus 1.
  • a position coordinate of the point P3 on the lens LE in the lens peripheral edge processing apparatus 90 is obtained by adding the position coordinate of the point P1 obtained by using the lens peripheral edge processing apparatus 90 to the position coordinate of the point P3 obtained in the cup attachment apparatus 1.
  • the control unit 95 correlates a position coordinate of one point on the lens LE, obtained by using the edge information obtaining unit 200 of the lens peripheral edge processing apparatus 90 with a position coordinate of a point at each radius vector angle of the lens LE, obtained by using the image data obtaining mechanism 60 of the cup attachment apparatus 1. Consequently, the bevel formation data created in the cup attachment apparatus 1 can be correlated with the lens LE held by the lens chuck shafts 102.
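A compact sketch of this correlation is given below, under the assumption that the front surface positions stored by the cup attachment apparatus 1 are expressed as offsets relative to the point P1; the numerical values are hypothetical.

```python
def correlate_front_positions(x_p1_measured_mm, cup_offsets_mm):
    """Convert front surface positions measured in the cup attachment
    apparatus into X coordinates of the lens peripheral edge processing
    apparatus.

    x_p1_measured_mm : X coordinate of the point P1 obtained with the
                       tracing stylus 201.
    cup_offsets_mm   : front surface position of each radius vector angle
                       relative to P1 (assumption), as stored in step S5.
    """
    return [x_p1_measured_mm + offset for offset in cup_offsets_mm]

# Hypothetical data: P1 measured at 8.40 mm, offsets for P2, P3, P4.
print(correlate_front_positions(8.40, [0.05, 0.12, 0.20]))
```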
  • control unit 95 obtains process control data for processing the peripheral edge of the lens LE on the basis of the obtained edge information (that is, the second edge information) of one point on the lens LE and bevel formation data (that is, the bevel formation data created on the basis of the first edge information and the finishing position).
  • as the process control data, a drive amount for driving at least one of the lens rotation unit 100a, the chuck shaft movement unit 100b, and the inter-axial distance changing unit 100c is calculated.
  • the tracing stylus 201 is in a state of being brought into contact with the point P1 on the lens LE held by the lens chuck shafts 102.
  • a drive amount from the position where the lens LE is brought into contact with the tracing stylus 201 is calculated as the process control data.
  • the control unit 95 may return the lens chuck shafts 102 to initial positions, and may calculate a drive amount from the initial positions as the process control data.
  • the control unit 95 moves the lens chuck shafts 102 in at least one of the X direction, the Y direction, and the Z direction on the basis of the process control data, and locates the lens LE on a grindstone 310 of the processing unit 300.
  • the control unit 95 moves the lens chuck shafts 102 in at least one of the X direction, the Y direction, and the Z direction on the basis of the process control data, and adjusts a relative positional relationship of the lens LE for the grindstone 310.
  • the peripheral edge of the lens LE is processed (for example, roughing or bevel processing).
  • the operator holds the right lens with the lens chuck shafts 102, and processes the peripheral edge of the lens LE as described above.
  • the shaft alignment apparatus of the present example includes the shaft alignment position setting portion for setting a shaft alignment position being an attachment position with respect to an eyeglasses lens of the lens holding portion that clamps and holds the eyeglasses lens in order to process a peripheral edge of the eyeglasses lens, and the edge information obtaining portion for obtaining edge information about an edge of the eyeglasses lens. Consequently, edge information which is obtained by using a lens peripheral edge processing apparatus in the related art can be obtained by using the shaft alignment apparatus. Therefore, the usage time of the lens peripheral edge processing apparatus is reduced, and thus it is possible to relatively reduce the time required to manufacture eyeglasses.
  • the edge information obtaining portion includes the edge measurement portion for obtaining edge information of an eyeglasses lens, and measures the eyeglasses lens by controlling the edge measurement portion to obtain the edge information. Consequently, the edge information of the eyeglasses lens can be obtained by using the shaft alignment apparatus, and thus the usage time of the lens peripheral edge processing apparatus can be reduced.
  • the shaft alignment apparatus of the present example includes the target lens shape obtaining portion for obtaining a target lens shape applied to the eyeglasses lens. Consequently, the edge information obtaining portion of the shaft alignment apparatus can control the edge measurement portion based on the target lens shape applied to the eyeglasses lens, and can thus obtain the edge information at a position based on the target lens shape.
  • the shaft alignment apparatus of the present example includes the finishing position setting portion for setting a finishing position of the edge based on the edge information of the eyeglasses lens. Consequently, a finishing position which is set by using a lens peripheral edge processing apparatus in the related art can be set by using the shaft alignment apparatus. Therefore, the usage time of the lens peripheral edge processing apparatus is reduced, the standby time of the lens peripheral edge processing apparatus is mitigated, and thus it is possible to relatively reduce the time required to manufacture eyeglasses.
  • the finishing position setting portion displays the finishing position based on edge information on the display portion, and sets a finishing position based on an operation signal from the operation portion for adjusting a finishing position on the display portion. Consequently, the operator can easily understand a finishing position formed in the eyeglasses lens. The operator can also easily adjust a finishing position formed in the eyeglasses lens.
  • the finishing position setting portion displays a finishing position based on the edge information of a left eyeglasses lens and a finishing position based on the edge information of a right eyeglasses lens on the display portion in a comparable manner, and sets a finishing position of at least one of the left eyeglasses lens and the right eyeglasses lens based on an operation signal from the operation portion. Consequently, the operator can easily understand the balance between finishing positions in the left eyeglasses lens and the right eyeglasses lens. The operator can adjust each of the finishing positions set in the left eyeglasses lens and the right eyeglasses lens by taking into consideration the balance between the finishing positions. Therefore, the operator can easily manufacture eyeglasses with good appearance.
  • the shaft alignment apparatus of the present example includes the refractive index obtaining portion for obtaining a refractive index of the eyeglasses lens, the image data obtaining portion for obtaining sectional image data including front surface image data for a front surface of the eyeglasses lens and rear surface image data for a rear surface of the eyeglasses lens, and the edge information obtaining portion for obtaining the edge information about an edge of the eyeglasses lens based on the refractive index and the sectional image data. Consequently, it is possible to efficiently obtain edge information of the eyeglasses lens by using sectional image data which is changed due to the refractive index of the eyeglasses lens.
  • the shaft alignment apparatus of the present example includes a projection optical system which projects a measurement light beam toward the front surface or the rear surface of the eyeglasses lens, and a light receiving optical system which receives a first reflected light beam as a result of the measurement light beam being reflected at the front surface of the eyeglasses lens and a second reflected light beam as a result of the measurement light beam being reflected at the rear surface of the eyeglasses lens.
  • the image data obtaining portion can obtain sectional image data including front surface image data formed by the first reflected light beam and the rear surface image data formed by the second reflected light beam, with a simple configuration.
  • the image data obtaining portion can obtain sectional image data in a non-contact manner by using a measurement light beam. Therefore, it is possible to efficiently obtain edge information of the eyeglasses lens.
  • the shaft alignment apparatus, which includes the lens support portion where an eyeglasses lens is mounted and the shaft alignment position setting portion for setting a shaft alignment position (an attachment position, with respect to the eyeglasses lens, of the lens holding portion that clamps and holds the eyeglasses lens in order to process a peripheral edge of the eyeglasses lens) of the eyeglasses lens mounted on the lens support portion, is also used as a lens shape measurement apparatus.
  • an operator can obtain edge information of an eyeglasses lens in a non-contact manner by using the shaft alignment apparatus, and can thus efficiently obtain the edge information of the eyeglasses lens.
  • the lens peripheral edge processing apparatus including a processing tool for processing a peripheral edge of an eyeglasses lens held by the lens holding portion is also used as a lens shape measurement apparatus.
  • an operator can obtain edge information of an eyeglasses lens in a non-contact manner by using the lens peripheral edge processing apparatus, and can thus efficiently obtain the edge information of the eyeglasses lens.
  • as a configuration of holding the lens LE, an exemplary configuration of mounting the lens LE on the support pins 14 has been described, but this is only an example.
  • as a configuration of holding the lens LE, there may be a configuration of sandwiching the lens LE.
  • a side surface (peripheral edge) of the lens LE may be sandwiched, and the front surface and the rear surface of the lens LE may be sandwiched.
  • the image data obtaining mechanism 60 obtains a sectional image of the lens LE captured by the imaging element 69, but this is only an example.
  • the image data obtaining mechanism 60 may obtain signal data before a sectional image is obtained.
  • the image data obtaining mechanism 60 corrects edge information which is obtained by using the sectional image data 75, on the basis of a refractive index of the lens LE, so as to obtain edge information, but this is only an example.
  • the image data obtaining mechanism 60 may correct the sectional image data 75 on the basis of a refractive index of the lens LE, and may obtain edge information of the lens LE on the basis of the corrected sectional image data.
  • the image data obtaining mechanism 60 may further correct the obtained edge thickness t of the lens LE by taking into consideration a front surface curve value of the lens LE.
  • the image data obtaining mechanism 60 adjusts an irradiation position of a measurement light beam to a position based on the target lens shape TG applied to the lens LE by using rotation of the lens LE in the horizontal direction and the Z-direction movement mechanism 63, but this is only an example.
  • the image data obtaining mechanism 60 may further include an X-direction movement mechanism adjusting a position of the measurement optical system 61 in the leftward-and-rightward direction (X direction), and may adjust an irradiation position of a measurement light beam by using the X-direction movement mechanism and the Z-direction movement mechanism 63. In this case, even if the lens LE is not rotated in the horizontal direction, an irradiation position of a measurement light beam can be adjusted to a position based on the target lens shape TG applied to the lens LE.
  • the control unit 70 may specify a front surface position of the edge of the lens LE on the basis of a position of the front surface image data P1s displayed in each piece of the sectional image data 75.
  • a front surface position of the edge of the lens LE may be obtained on the basis of both of a movement amount in which the measurement optical system 61 is moved by the Y-direction movement mechanism 62 and a position of the front surface image data P1s displayed in each piece of the sectional image data 75.
  • the imaging element 69 of the image data obtaining mechanism 60 is disposed such that the light reception surface thereof is perpendicular to the optical axis L4, but this is only an example.
  • the imaging element 69 may be disposed to be tiltable. Consequently, even if a focal position of the imaging element 69 is deviated in the front surface and the rear surface of the lens LE due to a refractive index and a front surface curve value of the lens LE, a focal position can be appropriately adjusted by changing an angle of the light reception surface of the imaging element 69.
  • a slit-shaped measurement light beam may be applied from the light source 65.
  • a slit plate and a lens may be provided between the light source 65 and the lens LE, and a measurement light beam applied from the light source 65 may be restricted to a slit shape so as to be collected at the lens LE. Consequently, a slit-shaped measurement light beam having a predetermined width is applied to the lens LE, and thus the imaging element 69 can capture images having predetermined widths of the front surface and the rear surface of the lens LE.
  • a front surface curve value and a rear surface curve value of the lens LE may be obtained by performing image processing on the sectional image data 75 captured in the above-described way.
  • the light source 65 is disposed to be movable, and a point-shaped measurement light beam is applied in a line shape. Consequently, the imaging element 69 can also capture images having predetermined widths of the front surface and the rear surface of the lens LE. Therefore, the control unit 70 may obtain a front surface curve value and a rear surface curve value of the lens LE by performing image processing on the sectional image data 75.
  • the image data obtaining mechanism 60 obtains the sectional image data 75 at each radius vector angle of the lens LE, and the control unit 70 obtains edge information on the basis of the sectional image data 75, but this is only an example.
  • for a position at which the sectional image data 75 is not obtained, interpolation may be performed by using edge information based on the sectional image data 75 at peripheral radius vector angles, and thus edge information of the position may be obtained.
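For instance, a hedged sketch of such interpolation (linear interpolation between the two neighbouring measured radius vector angles; the description does not specify the interpolation method) could look as follows.

```python
def interpolate_edge_thickness(angle_deg, measured):
    """Linearly interpolate edge thickness at an unmeasured radius vector
    angle from the neighbouring measured angles.

    measured : list of (angle_deg, thickness_mm) pairs sorted by angle.
    """
    for (a0, t0), (a1, t1) in zip(measured, measured[1:]):
        if a0 <= angle_deg <= a1:
            w = (angle_deg - a0) / (a1 - a0)
            return t0 + w * (t1 - t0)
    raise ValueError("angle outside the measured range")

# Hypothetical measurements at 10 and 11 degrees; interpolate at 10.4 degrees.
print(interpolate_edge_thickness(10.4, [(10.0, 2.30), (11.0, 2.50)]))
```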
  • the light source 65 of the image data obtaining mechanism 60 applies a measurement light beam to a position based on a target lens shape applied to the lens LE, but this is only an example.
  • measurement may be performed at any position on the lens LE.
  • the light source 65 may apply a measurement light beam to a hole position. Consequently, edge information (for example, an edge thickness) at the hole position of the lens LE or curve values of the front and rear surfaces can be obtained.
  • the control unit 70 may control a luminance level by controlling a projected light amount of a measurement light beam applied from the light source 65.
  • the control unit 70 may measure a luminance level (luminance value) from the sectional image data 75, and may determine whether or not the measured luminance level satisfies a predetermined threshold value.
  • the predetermined threshold value may be set in advance to a threshold value causing a luminance level to be determined as being favorable through tests, simulation, or the like.
  • the control unit 70 controls (changes) an amount of light to be projected from the light source 65 on the basis of a determination result. For example, in a case where it is determined that the luminance level of the sectional image data 75 does not satisfy the predetermined threshold value (for example, less than the predetermined threshold value), the control unit 70 increases an amount of light to be projected from the light source 65 such that the luminance level of the sectional image data 75 satisfies the predetermined threshold value. For example, in a case where it is determined that the luminance level of the sectional image data 75 satisfies the predetermined threshold value (for example, equal to or more than the predetermined threshold value), the control unit 70 maintains an amount of light to be projected from the light source 65 without performing control of the luminance level.
  • control unit 70 may control the luminance level such that the luminance level becomes more appropriate. Consequently, it is possible to obtain the sectional image data 75 of the lens LE with high accuracy in correspondence to the lens LE having various curve values.
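A simplified sketch of this luminance feedback is shown below; the step size, the threshold value, and the use of the maximum pixel value as the luminance level are assumptions for illustration.

```python
def adjust_light_amount(sectional_image, light_amount,
                        threshold=120, step=10, max_amount=255):
    """Increase the projected light amount until the luminance level of the
    sectional image data satisfies the threshold; otherwise keep it.

    sectional_image : 2-D list of pixel values (the sectional image data 75).
    light_amount    : current drive value of the light source 65 (assumed scale).
    """
    luminance_level = max(max(row) for row in sectional_image)
    if luminance_level < threshold:            # threshold not satisfied
        return min(light_amount + step, max_amount)
    return light_amount                        # threshold satisfied: keep

# Hypothetical 2x3 image whose brightest pixel is 95: the light amount rises.
print(adjust_light_amount([[10, 95, 40], [12, 80, 30]], light_amount=100))
```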
  • the gap h between the front surface image data P1s and the rear surface image data Q2s of the lens LE obtained from the sectional image data 75 is corrected on the basis of a refractive index of the lens LE, and thus the edge thickness t of the lens LE is obtained, but this is only an example.
  • the sectional image data 75 may be corrected on the basis of a refractive index of the lens LE, and the gap h between the front surface image data and the rear surface image data after the correction may be calculated such that the edge thickness t of the lens LE is obtained.
  • the sectional image data 75 may be corrected on the basis of at least one of a refractive index of the lens LE, a front surface curve value of the lens LE, and the distance D from the optical central position O to a point at each radius vector angle.
  • the sectional image data 75 and edge information are obtained, and then blocking to the lens LE may be performed.
  • the cup attachment apparatus 1 obtains edge information of the lens LE by using the measurement optical system 61, but this is only an example.
  • the cup attachment apparatus 1 may include a tracing stylus, and the tracing stylus may be brought into contact with the lens LE such that edge information of the lens LE is obtained.
  • a finishing position of the lens LE may be set by using the cup attachment apparatus 1, but this is only an example.
  • a finishing position of the lens LE may be set by using the lens peripheral edge processing apparatus 90.
  • a finishing position based on an edge thickness may be displayed on a display (not illustrated) of the lens peripheral edge processing apparatus 90.
  • a finishing position may be set on the basis of an operation signal for adjusting a finishing position.
  • a finishing position of the left lens and a finishing position of the right lens may be displayed in a comparable manner.
  • edge information (second edge information) of one point at a radius vector angle based on a target lens shape in the front surface of the lens LE is obtained by using the lens peripheral edge processing apparatus 90, but this is only an example.
  • edge information of a plurality of points at radius vector angles based on a target lens shape may be obtained by using the lens peripheral edge processing apparatus 90.
  • edge information of a total of four points including two points (that is, the point P1 and the point P5 in Fig. 13 ) on the virtual line V1 and two points on the virtual line V2 may be obtained.
  • control unit 95 can predict deformation or a tilt of the lens LE caused by holding the lens LE with the lens chuck shafts 102L and 102R.
  • in the present example, a description will be made of a case where the lens LE is tilted.
  • Fig. 13 is a diagram illustrating a state in which the lens chuck shafts 102 hold the lens LE.
  • Fig. 13 illustrates a section along the virtual line V1, and the lens LE is tilted.
  • an intersection between a side g1 extending from the point P1 in a direction perpendicular to the rotation central axis K3 and a side g2 extending from the point P5 in a direction parallel to the rotation central axis K3 is indicated by a point P6.
  • the control unit 95 brings the tracing stylus 201 of the lens peripheral edge processing apparatus 90 into contact with two points (that is, the point P1 and the point P5) on the virtual line V1 of the lens LE, and calculates a position coordinate of each point in the X direction.
  • the control unit 95 calculates a length of each of the side g1 and the side g2.
  • the control unit 95 obtains a length of the side g2 by using a difference between position coordinates of the point P1 and the point P5 in the X direction.
  • the control unit 95 obtains a length of the side g1 on the basis of a target lens shape applied to the lens LE.
  • since the distances from the rotation central axis K3 to the point P1 and to the point P5 are known from the target lens shape, the length of the side g1 can be calculated on the basis of these distances.
  • the control unit 95 computes an angle formed between the side g1 and a line segment connecting the point P1 to the point P5, by using a trigonometric function using the side g1 and the side g2. Consequently, it is possible to obtain an angle at which the lens LE is tilted in the vertical direction with respect to the lens chuck shafts 102L and 102R.
  • the control unit 95 may calculate position coordinates of the two points on the virtual line V2 of the lens LE in the lens peripheral edge processing apparatus 90, and may obtain an angle at which the lens LE is tilted in the horizontal direction with respect to the lens chuck shafts 102L and 102R in the same manner as described above.
  • the control unit 95 may cause bevel formation data created in the cup attachment apparatus 1 to correspond to the lens LE held by the lens chuck shafts 102 by taking into consideration such a tilt angle.
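The trigonometric step described above can be sketched as follows; the assumption that the target-lens-shape radii at the two radius vector angles give the length of the side g1, and all numerical values, are illustrative.

```python
import math

def chuck_tilt_angle_deg(x_p1_mm, x_p5_mm, r_p1_mm, r_p5_mm):
    """Angle (degrees) between the side g1 and the segment P1-P5.

    x_p1_mm, x_p5_mm : X coordinates of the points P1 and P5 measured with
                       the tracing stylus 201.
    r_p1_mm, r_p5_mm : distances from the rotation central axis K3 to P1 and
                       P5, taken from the target lens shape (assumption).
    """
    g2 = abs(x_p1_mm - x_p5_mm)   # offset along the chuck shaft direction
    g1 = r_p1_mm + r_p5_mm        # extent of the shape along the virtual line V1
    return math.degrees(math.atan2(g2, g1))

# Hypothetical values: 1.2 mm axial offset over a 48 mm wide target shape.
print(round(chuck_tilt_angle_deg(8.4, 7.2, 24.0, 24.0), 2))
```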
  • the cup attachment apparatus 1 includes an edge measurement mechanism, and measures the lens LE by controlling the edge measurement mechanism, so as to obtain edge information, but this is only an example.
  • the cup attachment apparatus 1 may not include an edge measurement mechanism, and may obtain edge information from a separate apparatus.
  • a description has been made of an exemplary configuration in which, in the setting of a finishing position in step S7, the control unit 70 performs initial setting and automatic adjustment of a temporary position of a bevel, and the operator manually adjusts a temporary position of a bevel, but this is only an example.
  • at least one of such setting and adjustment may be performed.
  • such setting and adjustment may be performed in combination thereof.


Description

    TECHNICAL FIELD
  • The present disclosure relates to an eyeglasses lens processing system, and an eyeglasses lens processing method.
  • BACKGROUND
  • In a case where eyeglasses are manufactured, generally, eyeglasses lenses processed according to the following operation are fitted in an eyeglasses frame. First, an operator measures optical characteristics of an eyeglasses lens by using a lens meter, and prints a dot on a surface of the eyeglasses lens. Next, the operator detects the dot by using a shaft alignment apparatus (for example, a cup attachment apparatus attaching a cup to the surface of the eyeglasses lens), and sets a shaft alignment position of the eyeglasses lens (See JP-A-2001-311919 , for example.).
  • Next, the operator obtains edge information (for example, a position of a front surface, a position of a rear surface, and an edge thickness) by using a lens peripheral edge processing apparatus, and processes a peripheral edge of the eyeglasses lens. More specifically, the edge information of the eyeglasses lens is obtained by an edge information measurement unit mounted on the lens edge processing apparatus, and the peripheral edge of the eyeglasses lens is processed by a processing unit mounted on the lens peripheral edge processing apparatus. The operator may adjust a position of a bevel formed at the edge of the eyeglasses lens between the step of obtaining the edge information of the eyeglasses lens and the step of processing the peripheral edge of the eyeglasses lens (See JP-A-2009-131939 , for example.). Further, an eyeglasses lens processing system according to the preamble portion of claim 1 is known from US 2014/297016 A1 .
  • Thus, in manufacturing of eyeglasses, the above-described steps are required, and much time is taken. An operator therefore desires to reduce the time required to manufacture eyeglasses as much as possible by performing the work efficiently.
  • SUMMARY
  • An object of the present invention is to provide an eyeglasses lens processing system, and an eyeglasses lens processing method, which are capable of efficiently processing an eyeglasses lens.
  • In order to achieve the object, according to the present invention, an eyeglasses lens processing system is provided as defined in claim 1, and an eyeglasses lens processing method of processing a peripheral edge of an eyeglasses lens is provided as defined in claim 9. Preferred embodiments are defined in the dependent claims.
  • BRIEF DESCRIPTION OF DRAWINGS
    • Fig. 1 is an exterior view of a cup attachment apparatus.
    • Fig. 2 is a schematic configuration diagram of a lens support mechanism.
    • Fig. 3 is a schematic configuration diagram of a cup attachment mechanism.
    • Fig. 4 is a schematic configuration diagram of a lens information measurement mechanism.
    • Fig. 5 illustrates an example of a target plate.
    • Fig. 6 is a schematic configuration diagram of a sectional image data obtaining mechanism.
    • Fig. 7 is a schematic configuration diagram of a control system.
    • Fig. 8 is a flowchart for explaining a control operation.
    • Fig. 9 illustrates an example of sectional image data.
    • Figs. 10A and 10B illustrate examples of a simulation screen.
    • Fig. 11 is a schematic configuration diagram of a lens peripheral edge processing apparatus.
    • Figs. 12A, 12B and 12C are diagrams for explaining a Y-direction movement mechanism in a case where edge thicknesses of a lens are different from each other.
    • Fig. 13 is a diagram illustrating a state in which a lens chuck shaft holds a lens.
    • Fig. 14 is a diagram for explaining control for displaying front surface data at a predetermined position.
    DETAILED DESCRIPTION <Summary>
  • A description will be made of a summary of a shaft alignment apparatus according to an embodiment of the present disclosure. In the present disclosure, L and R added to a reference sign respectively indicate a left side and a right side. In the present disclosure, the terms "identical", "parallel", and "vertical" respectively include a substantially identical state, a substantially parallel state, and a substantially vertical state. The items classified in < > below may be used independently or in association with each other.
  • In the technique of the present disclosure, a shaft alignment apparatus will be described as an example, but at least a part of the technique of the present disclosure is not limited to a case of being applied to the shaft alignment apparatus. For example, at least a part of the technique of the present disclosure is applicable to a lens processing apparatus used to process a lens in a step before an eyeglasses lens is held by a lens holding portion.
  • For example, as a shaft alignment apparatus, there may be provided a shaft alignment apparatus including a lens support portion (for example, a lens support mechanism 10) where an eyeglasses lens is mounted, and a shaft alignment position setting portion for setting a shaft alignment position being an attachment position with respect to the eyeglasses lens of the lens holding portion (for example, a lens holding unit 100), and for setting a shaft alignment position of the eyeglasses lens mounted on the lens support portion, the lens holding portion for clamping and holding the eyeglasses lens in order to process a peripheral edge of the eyeglasses lens. For example, the shaft alignment apparatus may be a cup attachment apparatus (for example, a cup attachment apparatus 1) including a cup attachment portion (for example, a cup attachment mechanism 30) for attaching a cup for causing an eyeglasses lens to be held by the lens holding portion, to the eyeglasses lens on the basis of a shaft alignment position which is set by the shaft alignment position setting portion, and for attaching the cup to a surface of the eyeglasses lens mounted on the lens support portion.
  • For example, the shaft alignment apparatus includes the shaft alignment position setting portion (for example, a lens information measurement mechanism 40). The shaft alignment position setting portion sets a shaft alignment position being an attachment position with respect to the eyeglasses lens of the lens holding portion that clamps and holds an eyeglasses lens in order to process a peripheral edge of the eyeglasses lens. For example, the shaft alignment position may be at least one of an optical central position, a geometric central position, and the like of the eyeglasses lens.
  • For example, the shaft alignment apparatus includes a refractive index obtaining portion (for example, a control unit 70). The refractive index obtaining portion obtains a refractive index of an eyeglasses lens. For example, the refractive index obtaining portion may obtain a refractive index of an eyeglasses lens measured by another apparatus which is different from the shaft alignment apparatus. In this case, there may be a configuration in which an operator inputs the refractive index of the eyeglasses lens by operating an operation portion (for example, an input button 3). For example, the refractive index obtaining portion may include a refractive index measurement portion for measuring a refractive index of an eyeglasses lens. In this case, a refractive index of an eyeglasses lens is measured by the refractive index measurement portion, and thus the refractive index of the eyeglasses lens is obtained.
  • For example, the shaft alignment apparatus may include a target lens shape obtaining portion (for example, a frame shape measurement mechanism 20). The target lens shape obtaining portion obtains a target lens shape of an eyeglasses lens. For example, a target lens shape applied to an eyeglasses lens may be at least one of an inner peripheral shape of an eyeglasses frame, an outer peripheral shape of a demonstration lens, and the like.
  • <Image data obtaining portion>
  • For example, the shaft alignment apparatus includes an image data obtaining portion (for example, an image data obtaining mechanism 60). The image data obtaining portion obtains sectional image data including front surface image data for a front surface of an eyeglasses lens and rear surface image data for a rear surface of the eyeglasses lens. For example, in the present embodiment, the image data obtaining portion may obtain sectional image data based on a target lens shape applied to an eyeglasses lens. In this case, the sectional image data may be obtained for at least a part of the entire periphery (all portions in which the target lens shape is formed at each radius vector angle) of the target lens shape applied to the eyeglasses lens. In this case, the sectional image data may be obtained for the entire periphery of the target lens shape of the eyeglasses lens. In this case, the sectional image data may be obtained for a plurality of regions (for example, points at respective radius vector angles) in the entire periphery of the target lens shape applied to the eyeglasses lens. In this case, the sectional image data may be obtained for some regions in the entire periphery of the target lens shape applied to the eyeglasses lens. For example, the sectional image data may be an image (image data). For example, the sectional image data may be a signal (signal data).
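  • As an illustration of how such sectional image data might be organized per radius vector angle of the target lens shape, the following is a minimal sketch; the class name, the sampling step, and the use of a plain image array are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import Dict

import numpy as np

@dataclass
class SectionalImageData:
    """Illustrative container: one cross-sectional image per radius vector
    angle of the target lens shape (names and layout are assumptions)."""
    images: Dict[float, np.ndarray] = field(default_factory=dict)

    def add(self, angle_deg: float, image: np.ndarray) -> None:
        self.images[angle_deg] = image

# e.g. sample the entire periphery of the target lens shape every 5 degrees
data = SectionalImageData()
for angle in range(0, 360, 5):
    frame = np.zeros((480, 640), dtype=np.uint8)  # placeholder for a captured section
    data.add(float(angle), frame)
```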
  • <Projection optical system>
  • For example, the image data obtaining portion includes a projection optical system (for example, a projection optical system 64). The projection optical system projects a measurement light beam toward a front surface or a rear surface of an eyeglasses lens. For example, the projection optical system may perpendicularly apply the measurement light beam to a lens surface of the eyeglasses lens. In this case, the measurement light beam may be irregularly reflected on the lens surface of the eyeglasses lens. For example, the projection optical system may apply the measurement light beam to the lens surface of the eyeglasses lens with a predetermined tilt angle.
  • For example, the projection optical system may include a light source (for example, a light source 65). The light source may apply a point-shaped measurement light beam toward an eyeglasses lens. In this case, a point light source may be used as the light source. A single point light source may be disposed, and may apply a single point measurement light beam. A plurality of point light sources may be disposed side by side, and may apply a measurement light beam having a width. The light source may apply a slit-shaped measurement light beam toward an eyeglasses lens. For example, in this case, a slit plate and a lens may be provided on an optical path between the light source and a lens surface (that is, a front surface or a rear surface of the eyeglasses lens) of the eyeglasses lens. Also in this case, a measurement light beam having a width can be applied to the eyeglasses lens.
  • For example, the projection optical system may include an optical member. For example, as the optical member, at least one of a lens, a mirror, a diaphragm, and the like may be used. In this case, for example, a measurement light beam emitted from the light source may be applied toward the lens surface of the eyeglasses lens via each optical member. Of course, an optical member is not limited to the optical member, and other optical members may be used.
  • <Light receiving optical system>
  • For example, the image data obtaining portion includes a light receiving optical system (for example, a light receiving optical system 66). The light receiving optical system may include a light receiving element (for example, an imaging element 69). The light receiving optical system receives a first reflected light beam as a result of a measurement light beam being reflected at a front surface of an eyeglasses lens and a second reflected light beam as a result of a measurement light beam being reflected at a rear surface of the eyeglasses lens, with the light receiving element. For example, the light receiving optical system may receive a reflected light beam (for example, regular reflected light or irregular reflected light) reflected at the lens surface of the eyeglasses lens with the light receiving element.
  • For example, in the present embodiment, the image data obtaining portion obtains sectional image data (for example, sectional image data 75) including front surface image data formed by the first reflected light beam and rear surface image data formed by the second reflected light beam. Consequently, the image data obtaining portion can obtain sectional image data in a non-contact manner by using a measurement light beam. Therefore, an operator can efficiently obtain edge information of the eyeglasses lens. Consequently, the image data obtaining portion can obtain sectional image data with a simple configuration.
  • For example, the light receiving optical system may include an optical member. For example, as the optical member, at least one of a lens, a mirror, a diaphragm, and the like may be used. In this case, a reflected light beam reflected at the lens surface of the eyeglasses lens may be received by the light receiving element via each optical member. Of course, an optical member is not limited to the optical member, and other optical members may be used.
  • <Edge information obtaining portion>
  • For example, the shaft alignment apparatus includes an edge information obtaining portion (for example, the control unit 70). The edge information obtaining portion obtains edge information about an edge of an eyeglasses lens. For example, the edge information may be at least one of a front surface position of the eyeglasses lens, a rear surface position of the eyeglasses lens, an edge thickness of the eyeglasses lens, and the like. For example, the edge information may be at least one of a front surface curve value of the eyeglasses lens, a rear surface curve value of the eyeglasses lens, and the like. Consequently, edge information which is obtained by using a lens peripheral edge processing apparatus in the related art can be obtained by using the shaft alignment apparatus. Therefore, the usage time of the lens peripheral edge processing apparatus is reduced, and thus it is possible to relatively reduce the time required to manufacture eyeglasses.
  • For example, the edge information obtaining portion may include an edge measurement portion. The edge measurement portion is used to obtain edge information of an eyeglasses lens. For example, in the present embodiment, a measurement optical system (for example, a measurement optical system 61) provided in the image data obtaining portion is also used as the edge measurement portion. For example, in the present embodiment, the edge measurement portion obtains edge information of an eyeglasses lens mounted on the lens support portion. The edge measurement portion may obtain edge information of the eyeglasses lens in a non-contact manner. Of course, the edge measurement portion may obtain edge information of the eyeglasses lens in a contact manner. In this case, edge information may be obtained by bringing a tracking stylus or the like into contact with the eyeglasses lens. For example, the edge information obtaining portion measures the eyeglasses lens by controlling the edge measurement portion, so as to obtain edge information. Consequently, the edge information of the eyeglasses lens can be obtained by using the shaft alignment apparatus, and thus the usage time of the lens peripheral edge processing apparatus can be reduced.
  • For example, the edge information obtaining portion may control the edge measurement portion to obtain edge information on the basis of a target lens shape applied to the eyeglasses lens. Consequently, edge information at a position (for example, at least one of a position matching the target lens shape, a position of a bevel at each radius vector angle, and a position of a chamfer at each radius vector angle) based on the target lens shape applied to the eyeglasses lens is obtained. In other words, edge information of at least one of a peripheral edge and a periphery of the target lens shape applied to the eyeglasses lens is obtained. For example, in a case where edge information of a plurality of regions (for example, points at respective predetermined radius vector angles) on the entire periphery of a target lens shape applied to the eyeglasses lens is obtained as a position based on the target lens shape applied to the eyeglasses lens, edge information at the position may be obtained by performing interpolation by using edge information at the radius vector angles of the periphery.
  • For example, the edge information obtaining portion may obtain edge information about an edge of the eyeglasses lens on the basis of a refractive index and sectional image data of the eyeglasses lens. Consequently, it is possible to efficiently obtain edge information of the eyeglasses lens by using sectional image data which is changed due to the refractive index of the eyeglasses lens. The edge information obtaining portion may correct edge information on the basis of a refractive index and sectional image data of the eyeglasses lens. In this case, the edge information obtaining portion may correct the sectional image data on the basis of the refractive index, and obtain edge information on the basis of the corrected sectional image data. In this case, the edge information obtaining portion may correct edge information which is obtained on the basis of the sectional image data, on the basis of the refractive index, so as to obtain edge information.
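  • As a rough illustration of why the refractive index matters when reading edge information out of the sectional image data, the following is a minimal sketch. It uses the simple near-normal approximation that a surface imaged through the lens material appears at the true depth divided by the refractive index; this approximation, the function name, and the units are assumptions, not the correction actually used by the edge information obtaining portion.

```python
def corrected_edge_thickness(apparent_depth_mm: float,
                             refractive_index: float) -> float:
    """Simplified correction: a rear surface seen through the lens material
    appears at roughly t / n for near-normal rays, so the true edge
    thickness is recovered as apparent depth times n (illustrative only)."""
    return apparent_depth_mm * refractive_index

# Example: the rear surface appears 2.0 mm behind the front surface in the
# sectional image; with a refractive index of 1.60 the corrected edge
# thickness is about 3.2 mm.
print(corrected_edge_thickness(2.0, 1.60))
```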
  • <Finishing position setting portion>
  • For example, the shaft alignment apparatus includes a finishing position setting portion (for example, the control unit 70). The finishing position setting portion sets an edge finishing position on the basis of edge information of the eyeglasses lens. For example, the edge finishing position may be at least one of a position of a bevel, a position of a groove, and a position of a chamfer formed at the edge of the eyeglasses lens. Consequently, a finishing position which is set by using a lens peripheral edge processing apparatus in the related art can be set by using the shaft alignment apparatus. Therefore, the usage time of the lens peripheral edge processing apparatus is reduced, the standby time of the lens peripheral edge processing apparatus is mitigated, and thus it is possible to relatively reduce the time required to manufacture eyeglasses.
  • For example, the finishing position setting portion displays a finishing position based on edge information on a display portion (for example, a display 2). For example, the finishing position setting portion may automatically set an edge finishing position on the basis of edge information. For example, the finishing position setting portion may allow an operator to manually set an edge finishing position. In this case, the finishing position setting portion sets a finishing position on the basis of an operation signal from the operation portion for adjusting a finishing position on the display portion. Consequently, the operator can easily understand a finishing position formed in the eyeglasses lens. The operator can easily adjust a finishing position formed in the eyeglasses lens.
  • The finishing position setting portion may display a finishing position based on edge information of a left eyeglasses lens and a finishing position based on edge information of a right eyeglasses lens on the display portion in a comparable manner. The finishing position setting portion may set a finishing position of at least one of the left eyeglasses lens and the right eyeglasses lens on the basis of an operation signal from the operation portion. For example, in this case, the operation portion may be provided in the display portion. For example, in this case, the display portion and the operation portion may be separately provided. Consequently, the operator can easily understand the balance between finishing positions in the left eyeglasses lens and the right eyeglasses lens. The operator can adjust each of the finishing positions set in the left eyeglasses lens and the right eyeglasses lens by taking into consideration the balance between the finishing positions. Therefore, the operator can easily manufacture eyeglasses with good appearance.
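  • As one common way a temporary bevel position could be derived from edge information before the operator adjusts it, the following is a minimal sketch: the bevel apex is placed at a fixed ratio of the edge thickness from the front surface and clamped to a maximum distance. The ratio, the clamp value, and the function name are assumptions for illustration, not the setting method defined by the finishing position setting portion.

```python
def bevel_apex_position(front_pos_mm: float, rear_pos_mm: float,
                        ratio: float = 0.4,
                        max_from_front_mm: float = 2.0) -> float:
    """Place a temporary bevel apex at `ratio` of the edge thickness from
    the front surface, but never farther than `max_from_front_mm`.
    Positions are measured along the chuck-shaft direction (illustrative)."""
    thickness = rear_pos_mm - front_pos_mm
    return front_pos_mm + min(ratio * thickness, max_from_front_mm)

# Example: front surface at 1.0 mm and rear surface at 6.0 mm give an
# apex at 3.0 mm; the operator may then shift it while comparing the
# left and right lenses on the display.
print(bevel_apex_position(1.0, 6.0))
```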
  • For example, in the present embodiment, the shaft alignment apparatus including the lens support portion where an eyeglasses lens is mounted, and the shaft alignment position setting portion for setting a shaft alignment position being an attachment position with respect to the eyeglasses lens of the lens holding portion (for example, a lens holding unit 100), and for setting a shaft alignment position of the eyeglasses lens mounted on the lens support portion, the lens holding portion for clamping and holding the eyeglasses lens in order to process a peripheral edge of the eyeglasses lens, may be used as a lens shape measurement apparatus. In the present embodiment, the cup attachment apparatus including the cup attachment portion for attaching a cup for causing an eyeglasses lens to be held by the lens holding portion, to the eyeglasses lens on the basis of a shaft alignment position which is set by the shaft alignment position setting portion, and for attaching the cup to a surface of the eyeglasses lens mounted on the lens support portion, may be used as a lens shape measurement apparatus. In the present example, a lens peripheral edge processing apparatus (for example, a lens peripheral edge processing apparatus 90) including a processing tool (for example, a processing unit 300) processing a peripheral edge of an eyeglasses lens held by the lens holding portion may be used as a lens shape measurement apparatus. For example, at least one of the shaft alignment apparatus, the cup attachment apparatus, and the lens peripheral edge processing apparatus obtains edge information of an eyeglasses lens in a non-contact manner, and thus an operator can efficiently obtain the edge information of the eyeglasses lens.
  • In the present embodiment, an eyeglasses lens processing system for processing an eyeglasses lens may be built by using the shaft alignment apparatus (or the cup attachment apparatus). For example, in this case, the eyeglasses lens processing system may be built by using the shaft alignment apparatus which includes the shaft alignment position setting portion for setting a shaft alignment position being an attachment position with respect to the eyeglasses lens of the lens holding portion (for example, a lens holding unit 100) that clamps and holds the eyeglasses lens in order to process a peripheral edge of the eyeglasses lens, and a lens peripheral edge processing apparatus which includes the processing tool, the lens holding portion for holding the eyeglasses lens, and the process control data obtaining portion (for example, a control unit 95) for obtaining process control data for processing a peripheral edge of the eyeglasses lens, and processes the eyeglasses lens held by the lens holding portion by controlling the processing tool on the basis of the process control data obtained by the process control data obtaining portion.
  • For example, in this case, the eyeglasses lens processing system may be built by using the cup attachment apparatus which includes the cup attachment portion for attaching a cup to a surface of an eyeglasses lens, and the lens peripheral edge processing apparatus which includes the processing tool, the lens holding portion for holding the eyeglasses lens attached with the cup, and the process control data obtaining portion for obtaining process control data for processing a peripheral edge of the eyeglasses lens, and processes the eyeglasses lens held by the lens holding portion by controlling the processing tool on the basis of the process control data obtained by the process control data obtaining portion. In the eyeglasses lens processing system, the cup attachment apparatus may obtain first edge information of the eyeglasses lens. In the eyeglasses lens processing system, an edge finishing position may be set on the basis of the first edge information of the eyeglasses lens. In the eyeglasses lens processing system, the lens peripheral edge processing apparatus may obtain second edge information at one or more radius vector angles of the eyeglasses lens. For example, as the second edge information, edge information of one point of the eyeglasses lens may be obtained. For example, as the second edge information, edge information of a plurality of points of the eyeglasses lens may be obtained. In a case where edge information of a plurality of points is obtained as the second edge information, at least one of deformation or tilting of the eyeglasses lens caused by being held by the lens holding portion may be predicted. Consequently, the process control data obtaining portion may obtain process control data on the basis of the first edge information, the second edge information, and the finishing position.
  • In the present embodiment, an eyeglasses lens processing method for processing a peripheral edge of an eyeglasses lens may be performed by using the shaft alignment apparatus (or the cup attachment apparatus). For example, in this case, there may be performed a method of executing a first edge information obtaining step of obtaining first edge information of an eyeglasses lens; a shaft alignment position setting step of setting, for the eyeglasses lens, a shaft alignment position being an attachment position with respect to the eyeglasses lens of the lens holding portion that clamps and holds the eyeglasses lens; a holding step of causing the lens holding portion of a lens peripheral edge processing apparatus to hold the eyeglasses lens on the basis of the set shaft alignment position after the first edge information obtaining step and the shaft alignment position setting step are executed; a process control data obtaining step of obtaining process control data on the basis of the first edge information after the first edge information obtaining step and the shaft alignment position setting step are executed; and a processing step of processing the eyeglasses lens held by the lens holding portion by controlling a processing tool on the basis of the process control data.
  • For example, in this case, there may be performed a method of executing a first edge information obtaining step of obtaining first edge information of an eyeglasses lens; a cup attachment step of attaching a cup to the eyeglasses lens; a holding step of causing the lens holding portion of a lens peripheral edge processing apparatus to hold the eyeglasses lens attached with the cup after the first edge information obtaining step and the cup attachment step are executed; a process control data obtaining step of obtaining process control data on the basis of the first edge information after the first edge information obtaining step and the cup attachment step are executed; and a processing step of processing the eyeglasses lens held by the lens holding portion by controlling a processing tool on the basis of the process control data.
  • In the eyeglasses lens processing method, a finishing position setting step of setting an edge finishing position on the basis of the first edge information of the eyeglasses lens may be executed. In this case, in the holding step, the eyeglasses lens attached with the cup may be held by the lens holding portion of the lens peripheral edge processing apparatus after the finishing position setting step and the cup attachment step are executed, and, in the process control data obtaining step, the process control data may be obtained on the basis of the finishing position after the finishing position setting step and the cup attachment step are executed.
  • In the eyeglasses lens processing method, a second edge information obtaining step of obtaining second edge information at one or more radius vector angles of the eyeglasses lens may be executed after the holding step is executed. In this case, in the process control data obtaining step, the process control data may be obtained on the basis of the first edge information, the second edge information, and the finishing position.
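  • The overall flow of the processing method described above can be pictured with the following minimal sketch, in which every function is a placeholder stub standing in for the corresponding step (none of these names is an actual apparatus API, and the numbers are dummies).

```python
def obtain_first_edge_info(lens):            # first edge information obtaining step
    return {"front": 1.0, "rear": 4.0}

def set_finishing_position(edge_info):       # finishing position setting step
    return 0.5 * (edge_info["front"] + edge_info["rear"])

def attach_cup(lens):                        # cup attachment step
    lens["cup"] = True

def hold_lens(lens):                         # holding step (lens holding portion)
    lens["held"] = True

def obtain_second_edge_info(lens):           # second edge information obtaining step
    return {"front": 1.05, "rear": 4.05}

def obtain_process_control_data(first, second, finishing):
    shift = second["front"] - first["front"]  # e.g. an axial shift after chucking
    return {"finishing": finishing + shift}   # process control data obtaining step

def process_peripheral_edge(lens, control_data):
    print("processing with", control_data)    # processing step

lens = {}
first = obtain_first_edge_info(lens)
finishing = set_finishing_position(first)
attach_cup(lens)
hold_lens(lens)
second = obtain_second_edge_info(lens)
control_data = obtain_process_control_data(first, second, finishing)
process_peripheral_edge(lens, control_data)
```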
  • The technique of the present disclosure may be applied not only to the shaft alignment apparatus but also to a lens processing apparatus processing an eyeglasses lens. In this case, a lens processing apparatus which is different from a lens peripheral edge processing apparatus processing an eyeglasses lens and executes a processing step before the eyeglasses lens is held by the lens holding portion provided in the lens peripheral edge processing apparatus may include the edge information obtaining portion for obtaining edge information about an edge of the eyeglasses lens. For example, the lens processing apparatus may be a lens meter. The lens processing apparatus may be an eyeglasses frame shape measurement apparatus.
  • <Example>
  • A description will be made of an Example of the present disclosure with reference to the drawings. In the present example, as a shaft alignment apparatus, a cup attachment apparatus which attaches a cup Cu installed with a lens chuck shaft to a lens LE will be exemplified. However, at least some techniques exemplified in the present example may be applied to apparatuses other than the cup attachment apparatus. For example, as a shaft alignment apparatus, at least some techniques exemplified in the present example may be applied to an apparatus holding the lens LE at a lens chuck shaft without using the cup Cu at a set shaft alignment position.
  • In the present example, the shaft alignment apparatus will be described as an example, but at least some techniques exemplified in the present example are not limited to a case of being applied to the shaft alignment apparatus. For example, at least some techniques exemplified in the present example are applicable to a lens processing apparatus used to process a lens in a step before the lens is held by the lens holding portion. In this case, a lens processing apparatus which is different from a lens peripheral edge processing apparatus processing an eyeglasses lens and executes a processing step before the eyeglasses lens is held by the lens holding portion provided in the lens peripheral edge processing apparatus may include the edge information obtaining portion for obtaining edge information about an edge of the eyeglasses lens. For example, the lens processing apparatus may be a lens meter. The lens processing apparatus may be an eyeglasses frame shape measurement apparatus. Of course, the lens processing apparatus may include the finishing position setting portion for setting a finishing position.
  • Fig. 1 is an exterior view of a cup attachment apparatus 1. The cup attachment apparatus 1 includes a display (monitor) 2, an input button 3, a lens support mechanism 10, a frame shape measurement mechanism 20, a cup attachment mechanism 30, a lens information measurement mechanism 40, an image data obtaining mechanism 60, and the like. For example, the display 2 may be at least one of a liquid crystal display (LCD), an organic electroluminescence (EL) display, and a plasma display. For example, the display 2 is a touch panel. In other words, the display 2 is used as an operation portion (input button 3) inputting signals or parameters for causing the cup attachment apparatus 1 to perform various processes, and a display portion displaying various pieces of information (for example, input parameters, optical characteristics of the lens LE, and edge information of the lens LE). The display 2 and an operation portion may be separately provided. In this case, at least one of a mouse, a joystick, a keyboard, and a portable terminal may be used as the operation portion.
  • <Lens support mechanism>
  • Fig. 2 is a schematic configuration diagram of the lens support mechanism 10. The lens support mechanism 10 is used to mount the lens LE in a state in which a surface (front surface) of the lens LE is directed upward. For example, the lens support mechanism 10 includes a cylindrical base 11, a ring member 12, a protection cover 13, support pins 14, rotation shafts 15, arms 16, and the like. The protection cover 13 attached to the ring member 12 is provided on the cylindrical base 11. A target plate 46 which will be described later or the like is disposed inside the cylindrical base 11. The rotation shafts 15 are disposed on an outer circumferential portion of the cylindrical base 11. The arms 16 are respectively disposed at upper ends of the rotation shafts 15. The support pins 14 are respectively disposed at distal ends of the arms 16. In the present example, the support pins 14 are arranged at the same distance from and at the same angular interval about an optical axis L1. The support pins 14 are brought into contact with a back surface (rear surface) of the lens LE so as to hold the lens.
  • For example, the rotation shaft 15 is rotated about a central axis K1 via a rotation transfer mechanism (not illustrated) transmitting rotation of a motor or the like. For example, the arm 16 and the support pin 14 are moved from a retreat position indicated by a solid line to a support position indicated by a dotted line in conjunction with rotation of the rotation shaft 15. Consequently, a size of a region which can be supported by the support pins 14 can be changed by adjusting a distance from the optical axis L1 to the support pin 14 and a gap between the support pins 14.
  • For example, the cylindrical base 11 is rotated about the optical axis L1 via the rotation transfer mechanism configured with a motor 17, a gear mechanism (not illustrated), and the like. For example, rotation of the motor 17 is transmitted to the cylindrical base 11 via the gear mechanism (not illustrated). For example, the rotation shaft 15, the arm 16 provided at the rotation shaft 15, and the support pin 14 provided at the arm 16 are integrally moved around the ring member 12 in conjunction with the rotation of the cylindrical base 11. Consequently, the lens LE mounted on the support pins 14 can be rotated.
  • <Frame shape measurement mechanism>
  • For example, the frame shape measurement mechanism 20 is used to trace a shape of an eyeglasses frame (hereinafter, referred to as a frame). Consequently, an inner peripheral shape (in other words, a target lens shape applied to the lens LE which will be described later) or the like of the frame may be obtained. For example, JP-A-2015-31847 is referred to with respect to a configuration of the frame shape measurement mechanism 20.
  • <Cup attachment mechanism>
  • Fig. 3 is a schematic configuration diagram of the cup attachment mechanism 30. The cup attachment mechanism 30 is used to attach the cup Cu on the surface (front surface) of the lens LE. In other words, the cup attachment mechanism 30 is used to fix (block) the cup Cu to the surface of the lens LE. In the present example, the cup fixation location is the surface of the lens LE, but is not limited thereto. The cup fixation location may be the back surface (rear surface) of the lens.
  • For example, in the present example, the cup attachment mechanism 30 sets, for the lens LE, a shaft alignment position being an attachment position of a lens holding unit 100 (refer to Fig. 11) clamping and holding the lens LE in order to process a peripheral edge of the lens LE. For example, the cup attachment mechanism 30 relatively changes positions of the cup Cu installed at an installation portion 31 which will be described later and the lens LE supported by the lens support mechanism 10, so as to attach the cup Cu at an appropriate position on the surface of the lens LE. As a result, an appropriate location (a location where the cup Cu is attached) in the lens LE can be sandwiched between lens chuck shafts 102L and 102R (refer to Fig. 11) of the lens peripheral edge processing apparatus 90. A shaft alignment position may be set to an optical central position O (refer to Fig. 6) of the lens LE, and may be set to a geometric central position. Of course, positions which are different from the above-described positions may be set as a shaft alignment position. For example, in the present example, the optical central position O of the lens LE is set as a shaft alignment position.
  • For example, the cup attachment mechanism 30 includes the installation portion 31, an arm 32, an arm holding base 33, an X-direction movement mechanism 35, a Y-direction movement mechanism 36, a Z-direction movement mechanism 37, and the like. For example, the installation portion 31 is fixed to the arm 32. For example, the installation portion 31 is fitted to an uneven portion formed in the cup Cu. For example, the arm 32 has a rotation transfer mechanism (not illustrated) variably maintaining a rotation angle of the installation portion 31 in the horizontal direction. For example, the arm 32 is attached to the arm holding base 33. For example, the arm holding base 33 includes a motor 34. For example, rotation of the motor 34 is transferred to the installation portion 31 via the rotation transfer mechanism (not illustrated) provided in the arm 32. In other words, the installation portion 31 is rotated about the central axis K1 of the cup Cu due to the rotation of the motor 34. Consequently, a rotation angle of the cup Cu in the horizontal direction can be changed.
  • For example, the X-direction movement mechanism 35 is moved in a leftward-and-rightward direction (X direction) of the cup attachment apparatus 1. For example, the Y-direction movement mechanism 36 is provided on the X-direction movement mechanism 35. For example, the Y-direction movement mechanism 36 is moved in an upward-and-downward direction (Y direction) of the cup attachment apparatus 1. For example, the Z-direction movement mechanism 37 is provided on the Y-direction movement mechanism 36. For example, the Z-direction movement mechanism 37 is moved in a front-and-rear direction (Z direction) of the cup attachment apparatus 1. For example, the Z-direction movement mechanism 37 holds the arm 32, the arm holding base 33, and the motor 34. For example, in the present example, the X-direction movement mechanism 35 is moved, and thus the Y-direction movement mechanism 36, the Z-direction movement mechanism 37, the arm 32, and the like are moved in the leftward-and-rightward direction with respect to the cup attachment apparatus 1. For example, the Z-direction movement mechanism 37 is moved, and thus the arm 32 and the like are moved in the front-and-rear direction with respect to the cup attachment apparatus 1. Consequently, the installation portion 31 held at the arm 32 is moved up to an upper part of the lens support mechanism 10. For example, the Y-direction movement mechanism 36 is moved, and thus the Z-direction movement mechanism 37, the arm 32, and the like are moved in the upward-and-downward direction with respect to the cup attachment apparatus 1. Consequently, the cup Cu installed at the installation portion 31 is blocked to the front surface of the lens LE mounted on the support pins 14.
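  • As a minimal sketch of how a set shaft alignment position might be turned into movement targets for the cup attachment mechanism 30, the following assumes that the optical center has been found at an (x, z) offset from the support center and that the cup Cu is additionally rotated to a given cylinder-axis angle; the coordinate conventions, the data class, and the function name are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class MoveCommand:
    x_mm: float        # target for the X-direction movement mechanism 35
    z_mm: float        # target for the Z-direction movement mechanism 37
    angle_deg: float   # rotation of the installation portion 31 by the motor 34

def cup_attachment_move(shaft_alignment_xz, axis_angle_deg):
    """Return the movement targets that bring the installation portion 31
    (and thus the cup Cu) over the shaft alignment position before the
    Y-direction movement mechanism 36 lowers the cup (illustrative only)."""
    x, z = shaft_alignment_xz
    return MoveCommand(x_mm=x, z_mm=z, angle_deg=axis_angle_deg)

# Example: optical center found 1.2 mm to the right of and 0.8 mm behind
# the support center, with a 30 degree cylinder-axis rotation.
print(cup_attachment_move((1.2, 0.8), 30.0))
```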
  • <Lens information measurement mechanism>
  • Fig. 4 is a schematic configuration diagram of the lens information measurement mechanism 40 provided in the cup attachment apparatus 1. For example, the lens information measurement mechanism 40 in the present example is used as both a measurement optical system obtaining optical characteristics of the lens and a measurement optical system obtaining other information regarding the lens (for example, a shape of the lens, a dot added to the lens, or a hidden mark formed on the lens). The measurement optical system obtaining optical characteristics of the lens and the measurement optical system obtaining the other information regarding the lens may be separately provided.
  • For example, the lens information measurement mechanism 40 includes an illumination optical system 41, a light receiving optical system 45, an imaging optical system 48, and the like. For example, the illumination optical system 41 includes a light source 42, a half mirror 43, a concave mirror 44, and the like. For example, the light source 42 irradiates the lens with a measurement light beam. For example, the light source 42 may be a light emitting diode (LED). For example, a measurement light beam emitted from the light source 42 is reflected at the half mirror 43 disposed on the optical axis L2, so as to coincide with the optical axis L2. For example, the concave mirror 44 reflects a measurement light beam in a direction from the optical axis L1 to the optical axis L2, and shapes the measurement light beam into a parallel light beam (substantially parallel light beam) having a diameter larger than that of the lens LE disposed on the optical axis L1. A lens may be used instead of a concave mirror, but it is advantageous to use the concave mirror in terms of miniaturization of an apparatus.
  • For example, the light receiving optical system 45 includes a target plate 46, an imaging element 47, and the like. For example, the target plate 46 is used to detect the optical center or the like of the lens LE. Details of the target plate 46 will be described later. For example, the imaging element 47 images a measurement light beam which is applied from the light source 42, and passes through the lens LE and the target plate 46. For example, the imaging element 47 may be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS). The light receiving optical system 45 in the present example may have a configuration in which a lens is disposed between the target plate 46 and the imaging element 47.
  • For example, the imaging optical system 48 includes the concave mirror 44, a diaphragm 49, an imaging lens 50, an imaging element 51, and the like. For example, an imaging magnification of the imaging optical system 48 is a magnification at which the entire lens LE is imaged by the imaging element 51. For example, the concave mirror 44 of the imaging optical system 48 is also used as the concave mirror 44 of the illumination optical system 41. For example, the diaphragm 49 is disposed at a focal position (substantial focal position) of the concave mirror 44. For example, the diaphragm 49 has a positional relationship of being conjugate (substantially conjugate) to the light source 42. For example, the imaging element 51 images a reflected light beam which is applied from the light source 42, and is reflected by a recursive reflection member 52 which will be described later. For example, the imaging element 51 may be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS). For example, a focus position of the imaging element 51 is formed near the surface of the lens LE by the imaging lens 50 and the concave mirror 44. Consequently, the dot added to the surface of the lens, the hidden mark formed on the lens, or the like can be imaged in a focused state.
  • <Target plate>
  • Fig. 5 illustrates an example of the target plate 46. For example, a plurality of openings (passing holes for light beams) 55 are formed in the target plate 46 in a predetermined pattern. For example, in the present example, the openings 55 are formed by sticking the recursive reflection member 52 which will be described later to a region (that is, a region indicated by diagonal lines in Fig. 5) other than the openings 55. For example, in the present example, the circular openings 55 are arranged at an identical interval. As long as a pattern is formed in the target plate 46 such that the optical center or optical characteristics of the lens LE can be detected, shapes or intervals of the openings 55 are not limited to the present example.
  • For example, the openings 55 include a central hole 56 formed at the center of the target plate 46, and peripheral holes 57 formed in the periphery of the central hole 56. For example, the central hole 56 matches the optical axis L1. For example, the central hole 56 in the present example has a size which is different from that of each of the peripheral holes 57, and can thus be differentiated from the peripheral holes 57. A size, the number, a shape, a position, and the like of the central hole 56 may be different from those in the present example as long as the central hole 56 can be differentiated from the peripheral holes 57. Consequently, in a case where an image (hereinafter, referred to as an opening image) of the opening 55 captured by the imaging element 47 is deviated due to optical characteristics of the lens LE, a correspondence relationship of each peripheral hole 57 can be specified. More specifically, it is possible to specify a correspondence relationship between an image of the peripheral hole 57 captured in a state in which the lens LE is not mounted on the support pins 14 and an image of the peripheral hole 57 captured in a state in which the lens LE is mounted on the support pins 14.
  • For example, the recursive reflection member 52 is used to reflect a measurement light beam in the same direction (the substantially same direction) as an incidence direction. For example, the recursive reflection member 52 is stuck to an upper surface of the target plate 46 and an upper surface of a disk member 54 having an opening 53 at the center thereof. For example, the disk member 54 may be rotated about the optical axis L1 by a rotation mechanism (not illustrated). For example, with respect to details of the recursive reflection member 52 and the rotation mechanism, JP-A-2008-299140 is referred to.
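  • As one conceivable way to locate the optical center from the shift of the opening images, the following is a minimal sketch: the displacement of each matched peripheral hole 57 (with the lens LE mounted versus not mounted) is fitted with an affine model, and the optical center is estimated as the point where the fitted displacement vanishes. This particular estimation, the function name, and the synthetic numbers are assumptions for illustration, not the detection procedure of the lens information measurement mechanism 40.

```python
import numpy as np

def estimate_optical_center(ref_pts: np.ndarray, meas_pts: np.ndarray) -> np.ndarray:
    """ref_pts, meas_pts: (N, 2) matched opening-image positions captured
    without and with the lens LE on the support pins.  Fit d(p) = A @ p + b
    to the displacements and return the point where d vanishes."""
    d = meas_pts - ref_pts                             # displacement field
    X = np.hstack([ref_pts, np.ones((len(ref_pts), 1))])
    coef, *_ = np.linalg.lstsq(X, d, rcond=None)       # rows: x, y, constant
    A, b = coef[:2].T, coef[2]
    return np.linalg.solve(A, -b)                      # solve A p + b = 0

# Synthetic example: a prism-like displacement field centered at (1.0, -0.5)
# is recovered as the optical center.
ref = np.array([[-10., -10.], [10., -10.], [10., 10.], [-10., 10.], [0., 0.]])
meas = ref + 0.1 * (ref - np.array([1.0, -0.5]))
print(estimate_optical_center(ref, meas))              # approx. [1.0, -0.5]
```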
  • <Image data obtaining mechanism>
  • Fig. 6 is a schematic configuration diagram of the image data obtaining mechanism 60. The image data obtaining mechanism 60 obtains the sectional image data 75 including front surface image data P1s for the front surface of the lens LE and rear surface image data Q2s for the rear surface of the lens LE (refer to Fig. 9). For example, in the present example, the image data obtaining mechanism 60 may include a projection optical system 64 which projects a measurement light beam toward the front surface or the rear surface of the lens LE. For example, in the present example, the image data obtaining mechanism 60 may include a light receiving optical system 66 which receives a first reflected light beam R1 reflected at the front surface of the lens LE and a second reflected light beam R2 reflected at the rear surface of the lens LE. In other words, the image data obtaining mechanism 60 in the present example may obtain the sectional image data 75 including the front surface image data P1s formed by the first reflected light beam R1 and the rear surface image data Q2s formed by the second reflected light beam R2.
  • The image data obtaining mechanism 60 includes a measurement optical system 61, a Y-direction movement mechanism 62, a Z-direction movement mechanism 63, and the like. For example, the measurement optical system 61 in the present example is configured to obtain the sectional image data 75 of the lens LE by using the projection optical system 64 and the light receiving optical system 66 arranged on the basis of the Scheimpflug principle. Of course, the measurement optical system 61 may employ optical systems having other configurations instead of optical systems based on the Scheimpflug principle.
  • The projection optical system 64 projects a measurement light beam toward the front surface or the rear surface of the lens LE. For example, the projection optical system 64 may perpendicularly project a measurement light beam to the lens LE. In this case, an optical axis L3 of the projection optical system 64 is perpendicular to the lens LE. For example, the projection optical system 64 may project a measurement light beam to the lens LE at a predetermined tilt angle. For example, in the present example, a description will be made of an example in which the projection optical system 64 perpendicularly projects a measurement light beam toward the front surface of the lens LE. In this case, the measurement light beam may be irregularly reflected at the front surface of the lens LE.
  • The projection optical system 64 has a light source 65. The light source 65 irradiates the lens LE with a measurement light beam. For example, the light source 65 may be an LED, a laser device, or the like. The projection optical system 64 in the present example may have a configuration in which a lens is disposed between the light source 65 and the lens LE. The projection optical system 64 in the present example may have a configuration in which pin holes are disposed between the light source 65 and the lens LE.
  • The light receiving optical system 66 receives the first reflected light beam R1 reflected at the front surface of the lens LE and the second reflected light beam R2 reflected at the rear surface of the lens LE. For example, an optical axis L4 of the light receiving optical system 66 is disposed with a predetermined tilt angle with respect to the optical axis L3 of the projection optical system 64. The light receiving optical system 66 includes a lens 67, a diaphragm 68, an imaging element 69, and the like. The lens 67 collects reflected light beams (for example, a regularly reflected beam of the lens LE and scattered light of the lens LE) reflected at the front surface and the rear surface of the lens LE at a position of the diaphragm 68. The diaphragm 68 is disposed at the focal position of the lens 67. The diaphragm 68 is disposed such that sizes of images of the first reflected light beam R1 and the second reflected light beam R2 captured by the imaging element 69 are not changed (that is, an imaging magnification is constant) even if a distance between the light receiving optical system 66 and the lens LE is changed. The imaging element 69 may be a two-dimensional imaging element (for example, at least one of a CCD and a CMOS), and may be a one-dimensional imaging element (for example, a line sensor). The imaging element 69 is disposed at a position conjugate to the lens LE. The imaging element 69 may be disposed such that a light reception surface thereof is perpendicular to the optical axis L4. The lens 67 and the imaging element 69 are disposed on the optical axis L4 on the basis of the Scheimpflug principle. For example, a measurement light beam applied to the lens LE by the projection optical system 64, the lens system including the lens LE and the lens 67, and the light reception surface (that is, an imaging position) of the imaging element 69 are disposed according to the Scheimpflug relationship.
  • For example, a measurement light beam applied to the lens LE branches into a reflected light beam reflected at the front surface of the lens LE and a measurement light beam directed toward the rear surface of the lens LE. The measurement light beam is reflected in various directions at the front surface of the lens LE, but a reflected light beam (that is, the first reflected light beam R1) reflected in parallel to the optical axis L4 reaches the imaging element 69 via the lens 67 and the diaphragm 68. The measurement light beam directed toward the rear surface of the lens LE is reflected in various directions at the rear surface of the lens LE, but a reflected light beam (that is, the second reflected light beam R2) reflected in parallel to the optical axis L4 reaches the imaging element 69 via the lens 67 and the diaphragm 68. For example, in the above-described way, the imaging element 69 can image the first reflected light beam and the second reflected light beam which are reflected with an identical angle at the front surface and the rear surface of the lens LE.
  • In the present example, a measurement light beam reflected in parallel to the optical axis L4 is guided to the imaging element 69 by using the diaphragm 68, but is not limited thereto. For example, the diaphragm 68 may guide a measurement light beam which is not reflected in parallel to the optical axis L4 to the imaging element 69.
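  • As an illustration of how the front surface image data and the rear surface image data might be extracted from a captured section, the following is a minimal sketch: in each image column the two brightest rows are taken as the front-surface and rear-surface reflections. The intensity threshold, the assumption that the front surface is imaged above the rear surface, and the function name are all illustrative, not the extraction actually performed on the sectional image data 75.

```python
import numpy as np

def surface_rows(section: np.ndarray, min_intensity: int = 50):
    """Return, per column, the rows of the two strongest reflections,
    read (by assumption) as front and rear surface; None if no clear peak."""
    front, rear = [], []
    for col in section.T:
        rows = np.argsort(col)[-2:]           # two brightest rows in the column
        if col[rows].min() < min_intensity:
            front.append(None)
            rear.append(None)
            continue
        lo, hi = sorted(int(r) for r in rows)
        front.append(lo)                      # assumed front-surface reflection
        rear.append(hi)                       # assumed rear-surface reflection
    return front, rear

# Tiny synthetic section: a bright front reflection on row 2 and a weaker
# rear reflection on row 7 in every column.
img = np.zeros((10, 4), dtype=np.uint8)
img[2, :] = 200
img[7, :] = 120
print(surface_rows(img))                      # ([2, 2, 2, 2], [7, 7, 7, 7])
```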
  • The Y-direction movement mechanism 62 adjusts an imaging position of the imaging element 69 by adjusting a position of the measurement optical system 61 in the upward-and-downward direction (Y direction). For example, the Y-direction movement mechanism 62 integrally moves the projection optical system 64 and the light receiving optical system 66 in the Y direction. For example, the Y-direction movement mechanism 62 may be configured with a motor and a slide mechanism. The Z-direction movement mechanism 63 adjusts a position of a measurement light beam which is applied from the light source 65 toward the lens LE by adjusting a position of the measurement optical system 61 in the front-and-rear direction (Z direction). For example, the Z-direction movement mechanism 63 integrally moves the projection optical system 64 and the light receiving optical system 66 in the front-and-rear direction (Z direction). For example, the Z-direction movement mechanism 63 may be configured with a motor and a slide mechanism. In the present example, an exemplary configuration in which the projection optical system 64 and the light receiving optical system 66 are integrally moved has been described, but this is only an example. For example, the projection optical system 64 and the light receiving optical system 66 may be separately moved.
  • For example, the image data obtaining mechanism 60 includes the Y-direction movement mechanism 62, and can thus obtain the sectional image data 75 (refer to Fig. 9) corresponding to the lens LE having various edge thicknesses t. Figs. 12A, 12B and 12C are diagrams for explaining the Y-direction movement mechanism 62 in a case where the edge thicknesses t of the lens LE are different from each other. Fig. 12A is a diagram illustrating a reflected light beam in a case where an edge of the lens LE is thin. Fig. 12B is a diagram illustrating a reflected light beam in a case where an edge of the lens LE is thick. Fig. 12C is a diagram illustrating a reflected light beam in a case where an imaging position of the imaging element 69 is adjusted in the state illustrated in Fig. 12B.
  • For example, a gap W between the first reflected light beam R1 reflected at the front surface of the lens LE and the second reflected light beam R2 reflected at the rear surface of the lens LE changes depending on the edge thickness t of the lens LE. For example, the gap W between the first reflected light beam R1 and the second reflected light beam R2 is wider in the lens LE of which the edge is thick than in the lens LE of which the edge is thin. For example, as illustrated in Figs. 12A and 12B, in a case where the measurement optical system 61 is disposed in a stationary manner, one or both of the first reflected light beam R1 and the second reflected light beam R2 may not reach the light reception surface of the imaging element 69, depending on the edge thickness t of the lens LE. For example, in Fig. 12B, the second reflected light beam R2 is deviated from the light reception surface of the imaging element 69. Therefore, an imaging position of the imaging element 69 is adjusted by driving the Y-direction movement mechanism 62 such that both of the first reflected light beam R1 and the second reflected light beam R2 reach the light reception surface of the imaging element 69. For example, in the present example, the Y-direction movement mechanism 62 moves the measurement optical system 61 toward the lens LE. Consequently, the first reflected light beam R1 and the second reflected light beam R2 reach the light reception surface of the imaging element 69 as illustrated in Fig. 12C. Consequently, the image data obtaining mechanism 60 can obtain the sectional image data 75 (refer to Fig. 9) including the front surface image data P1s formed by the first reflected light beam R1 and the rear surface image data Q2s formed by the second reflected light beam R2.
  • In a case where the measurement optical system 61 is disposed in a stationary manner, even if the edge of the lens LE is thin, if the curve value of the lens LE is large, one or both of the first reflected light beam R1 and the second reflected light beam R2 may not reach the light reception surface of the imaging element 69. Even in this case, it is possible to obtain the sectional image data 75 including the front surface image data P1s and the rear surface image data Q2s by driving the Y-direction movement mechanism 62 such that an imaging position of the imaging element 69 is adjusted.
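  • The adjustment just described can be pictured with the following minimal sketch: the section is captured, the front-surface and rear-surface reflections are looked for, and the Y-direction movement mechanism 62 is stepped until both fall on the light reception surface. The step size, the injected helper callables, and the dummy usage are assumptions for illustration only.

```python
def adjust_imaging_position(capture, find_peaks, move_y_mm,
                            sensor_rows: int = 480,
                            step_mm: float = 0.2, max_steps: int = 50) -> bool:
    """Step the measurement optics in the Y direction until both the front-
    and rear-surface reflections lie on the sensor; `capture`, `find_peaks`
    (image -> (front_row, rear_row) or None) and `move_y_mm` are injected."""
    for _ in range(max_steps):
        peaks = find_peaks(capture())
        if peaks is not None:
            front_row, rear_row = peaks
            if 0 <= front_row < sensor_rows and 0 <= rear_row < sensor_rows:
                return True                   # both reflections are imaged
        move_y_mm(step_mm)                    # move toward the lens and retry
    return False

# Dummy usage: the rear reflection comes onto the sensor after two steps.
state = {"y": 0.0}
print(adjust_imaging_position(
    capture=lambda: state["y"],
    find_peaks=lambda y: (100, 300) if y >= 0.4 else (100, 500),
    move_y_mm=lambda d: state.__setitem__("y", state["y"] + d)))
```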
  • For example, the image data obtaining mechanism 60 (refer to Fig. 6) includes the Z-direction movement mechanism 63, and can thus apply a measurement light beam from the projection optical system 64 toward a position based on a target lens shape TG of the lens LE. The position based on the target lens shape applied to the lens LE may be a position matching the target lens shape, and may be a position (for example, a position of a bevel at each radius vector angle or a position of a chamfer at each radius vector angle) obtained through calculation on the basis of the target lens shape. For example, in the present example, the lens LE mounted on the support pins 14 is rotated in the horizontal direction due to rotation of the cylindrical base 11 (refer to Fig. 2), and a measurement light beam is applied to a position based on the target lens shape TG applied to the lens LE by the Z-direction movement mechanism 63 moving the measurement optical system 61 in a length direction of the radius vector of the lens LE. Consequently, the image data obtaining mechanism 60 can obtain the sectional image data 75 based on the target lens shape applied to the lens LE.
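  • A minimal sketch of how measurement positions could be derived from the target lens shape TG is given below: each radius vector (angle, radius) is mapped to a lens rotation angle for the cylindrical base 11 and a radial position for the Z-direction movement mechanism 63, optionally offset toward an intended bevel position. The representation of the target lens shape and the offset parameter are assumptions for illustration.

```python
def measurement_positions(target_shape, radial_offset_mm: float = 0.0):
    """target_shape: iterable of (angle_deg, radius_mm) radius vectors of the
    target lens shape TG.  Returns (lens_rotation_deg, radial_position_mm)
    pairs for the measurement light beam (illustrative only)."""
    return [(angle, radius + radial_offset_mm) for angle, radius in target_shape]

# Example: a circular 50 mm target lens shape sampled every 90 degrees.
tg = [(float(a), 25.0) for a in range(0, 360, 90)]
print(measurement_positions(tg))
```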
  • <Control system>
  • Fig. 7 is a schematic configuration diagram of a control system of the cup attachment apparatus 1. A control unit 70 includes a CPU (processor), a RAM, a ROM, and the like. For example, the CPU of the control unit 70 controls the cup attachment apparatus 1. The RAM of the control unit 70 temporarily stores various pieces of information. The ROM of the control unit 70 stores various programs executed by the CPU.
  • For example, the control unit 70 is electrically connected to the display 2, the input button 3, the motor 34, the light source 42, the light source 65, the imaging element 47, the imaging element 51, the imaging element 69, a nonvolatile memory 71 (hereinafter, referred to as a memory 71), and the like. For example, the control unit 70 is connected to a motor (not illustrated) or the like provided in each of the X-direction movement mechanism 35, the Y-direction movement mechanism 36, and the Z-direction movement mechanism 37 of the cup attachment mechanism 30. For example, the control unit 70 is connected to a motor (not illustrated) or the like provided in each of the Y-direction movement mechanism 62 and the Z-direction movement mechanism 63 of the image data obtaining mechanism 60.
  • For example, a non-transitory storage medium which can hold storage contents even if the supply of power is stopped may be used as the memory 71. For example, as the memory 71, a hard disk drive, a flash ROM, or a USB memory may be used. For example, the memory 71 may store optical characteristics of the lens LE measured by the lens information measurement mechanism 40, sectional image data of the lens LE obtained by the measurement optical system 61, an inner peripheral shape of a frame traced by using the frame shape measurement mechanism 20, and the like.
  • <Control operation>
  • A description will be made of control of the cup attachment apparatus 1 having the above-described configuration. In order to manufacture eyeglasses, an operator processes the lens LE by using the cup attachment apparatus 1 and the lens peripheral edge processing apparatus. In the related art, the cup attachment apparatus 1 is used to attach the cup Cu to the surface of the lens LE, whereas the lens peripheral edge processing apparatus is used to obtain edge information of the lens LE, to set a finishing position (for example, a position of a bevel, a position of a groove, or a position of a chamfer) formed on an edge of the lens LE, and to process a peripheral edge of the lens LE. In such a series of operations, the time for which the lens peripheral edge processing apparatus is used is considerably longer than the time for which the cup attachment apparatus 1 is used. As a result, even after attachment of the cup Cu to the lens LE is finished, the operator may be unable to proceed to the next step because the lens peripheral edge processing apparatus is still in use, and the work cannot be performed efficiently.
  • Therefore, in the present example, in order to mitigate a state in which the operator waits for the lens peripheral edge processing apparatus to be available, attachment of the cup Cu to the lens LE, obtainment of edge information, and setting of a finishing position are performed by using the cup attachment apparatus 1, and a peripheral edge of the lens LE is processed by using the lens peripheral edge processing apparatus. Hereinafter, steps S1 to S12 will be described in order with reference to a flowchart illustrated in Fig. 8. For example, in the present example, steps S1 to S7 are executed by using the cup attachment apparatus 1, and steps S8 to S12 are executed by using the lens peripheral edge processing apparatus.
  • <Cup attachment apparatus>
  • For example, the operator activates the cup attachment apparatus 1, and sequentially executes steps described below.
  • <Obtainment of target lens shape (S1)>
  • First, the control unit 70 obtains a target lens shape applied to the lens LE. The target lens shape applied to the lens LE may be an outer peripheral shape of a demonstration lens, an inner peripheral shape of a frame, or the like. For example, in a case where an outer peripheral shape of a demonstration lens is obtained, the entire image of the demonstration lens may be captured by using the lens information measurement mechanism 40, so that the outer peripheral shape thereof is detected. For example, in a case where an inner peripheral shape of a frame is obtained, the frame may be traced by using the frame shape measurement mechanism 20, so that the inner peripheral shape thereof is detected. Of course, a target lens shape applied to the lens LE obtained by using other apparatuses may be read to the cup attachment apparatus 1. For example, the control unit 70 stores the target lens shape applied to the lens LE obtained as mentioned above, in the memory 71. Regarding the target lens shape applied to the lens LE, target lens shapes applied to each of a left lens and a right lens may be obtained. A target lens shape applied to either of a left lens and a right lens may be obtained, and may be laterally reversed such that a target lens shape applied to the other lens is obtained.
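  • As an illustration of the lateral reversal mentioned above, the following minimal sketch (in Python, with hypothetical function and variable names) mirrors a radius-vector target lens shape about the vertical axis so that a shape obtained for one of the left lens and the right lens can be reused for the other; the radius-vector representation and the resampling step are assumptions made only for this sketch.

```python
import numpy as np

def mirror_target_lens_shape(radii, angles_deg):
    # Laterally reverse a radius-vector target lens shape (mirror about the
    # vertical axis) so that a shape traced for one eye can be reused for the
    # other eye. radii and angles_deg (degrees, 0-360) describe the shape.
    mirrored_angles = (180.0 - np.asarray(angles_deg)) % 360.0
    # Re-sample the mirrored shape back onto the original angle grid so that
    # both shapes share the same radius-vector indexing (numpy sorts the
    # periodic x-coordinates internally when period is given).
    return np.interp(angles_deg, mirrored_angles, np.asarray(radii), period=360.0)

# Example: a demonstration shape sampled every degree for the right lens,
# reused (mirrored) as the target lens shape for the left lens.
angles = np.arange(0.0, 360.0, 1.0)
right_radii = 25.0 + 3.0 * np.cos(np.radians(2.0 * angles))
left_radii = mirror_target_lens_shape(right_radii, angles)
```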
  • <Obtainment of refractive index (S2)>
  • Next, the control unit 70 obtains a refractive index of the lens LE. The refractive index of the lens LE may be obtained by the operator operating the input button 3. For example, a refractive index is defined by a material (for example, plastic or glass) of the lens LE (in other words, a refractive index is defined by the lens LE to be processed), and thus the operator may input an existing refractive index. The operator may input a refractive index of the lens LE which is obtained in advance by using a separate apparatus. The cup attachment apparatus 1 may have a configuration which can measure a refractive index of the lens LE, and may obtain a refractive index by using the configuration. For example, the control unit 70 stores the refractive index of the lens LE obtained as mentioned above, in the memory 71.
  • <Setting of processing condition and layout (S3)>
  • For example, in a case where the target lens shape and the refractive index of the lens LE are obtained, the control unit 70 displays the target lens shape on the display 2. The operator may set a processing condition for the lens LE for each of the left lens and the right lens by operating the input button 3. For example, the processing condition of the lens LE may be at least one of the type of lens LE (for example, a single focal lens, a bifocal lens, and a progressive lens), a material of the lens LE, a material of the frame, the presence or absence of various processings (for example, polishing, chamfering, and grooving), and an attachment position of the cup Cu to the lens LE (for example, the optical central position of the lens LE and the geometric central position of the target lens shape). The operator may set a layout of the lens LE by operating the input button 3. For example, the layout of the lens LE may be at least one of a distance between the frame centers, a pupillary distance of an eyeglasses wearer, and an astigmatic axis angle of an eyeglasses wearer.
  • <Blocking (S4)>
  • In a case where setting of the processing condition and the layout of the lens LE is finished, the operator mounts the lens LE (for example, the left lens in the present example) on the support pins 14, and attaches the cup Cu to the installation portion 31. The operator operates a mode selection button (not illustrated), so as to switch a mode from a setting mode to a blocking mode. In a case where the blocking mode is selected, the control unit 70 turns on the light source 42 of the lens information measurement mechanism 40. A guide mark, which serves as a target for aligning the optical central position of the lens LE mounted on the support pins 14 (indicated by an optical central mark which will be described later) with the optical axis L1, is displayed on the display 2.
  • For example, the control unit 70 obtains a position coordinate of an opening image captured by the imaging element 51, and detects the optical central position and optical characteristics (for example, a spherical power, a cylindrical power, or an astigmatic axis angle) of the lens LE. A well-known technique is applied to detection of the optical central position and optical characteristics of the lens LE using an opening image, and thus JP-A-2008-241694 is referred to with respect to details thereof. For example, an optical central mark is displayed at the optical central position O of the lens LE detected by the control unit 70.
  • For example, the control unit 70 performs image processing on an image of the lens LE captured by the imaging element 51 so as to detect an outer shape of the lens LE. For example, the target lens shape applied to the lens LE is displayed to overlap the outer shape of the lens LE on the display 2. In this case, a display position and a display size of the target lens shape applied to the lens LE may be determined depending on the optical central position O of the lens LE, the layout data of the lens LE, an imaging magnification of the imaging optical system 48, a positional relationship between the optical axis L1 and the optical axis L2, and the like.
  • For example, the operator moves the lens LE such that the optical central mark matches the guide mark while viewing the display 2, and operates a blocking button (not illustrated) displayed on the display 2. In response to an operation signal, the control unit 70 drives the X-direction movement mechanism 35 and the Y-direction movement mechanism 36 of the cup attachment mechanism 30 so as to move the arm 32 such that a central axis K2 of the cup Cu is located at the optical central position O of the lens LE. In this case, in a case where the lens LE has a cylindrical power, the installation portion 31 may be rotated about the central axis K2 on the basis of the detected astigmatic axis angle. In a case where adjustment of the position and angle of the cup Cu is completed, the control unit 70 moves the arm 32 downward by driving the Z-direction movement mechanism 37. Consequently, the cup Cu installed at the installation portion 31 is blocked to the front surface of the lens LE mounted on the support pins 14 on the basis of the set shaft alignment position (that is, the optical central position O of the lens LE).
  • <Obtainment of sectional image data (S5)>
  • For example, in a case where blocking of the cup Cu is finished, the operator operates the mode selection button (not illustrated), and switches the mode from the blocking mode to an edge information measurement mode. For example, consequently, in the present example, edge information of the lens LE mounted on the lens support mechanism 10 is measured. In a case where the edge information measurement mode is selected, the control unit 70 turns on the light source 65 of the image data obtaining mechanism 60.
  • For example, the control unit 70 controls driving of the Z-direction movement mechanism 63, so as to match the optical axis L3 with an initial position on the target lens shape TG in the lens LE. For example, the initial position may be a position on a virtual line V1 which extends in the Z direction through the optical central position O of the lens LE, and may be a position on a virtual line V2 which extends in the X direction through the optical central position O of the lens LE. Of course, the initial position may be set to any position. For example, in the present example, the optical axis L3 matches a point P1 which is located on the target lens shape TG applied to the lens LE and on the virtual line V1. Consequently, a measurement light beam applied from the light source 65 branches into the first reflected light beam R1 reflected at the point P1 on the lens LE and the second reflected light beam R2, which reaches a point Q1 on the rear surface of the lens LE and is reflected thereat.
  • For example, the control unit 70 controls driving of the Y-direction movement mechanism 62, and adjusts a distance between the image data obtaining mechanism 60 and the lens LE in the Y direction such that the imaging element 69 can receive both of the first reflected light beam R1 and the second reflected light beam R2. For example, the control unit 70 performs image processing (for example, edge detection) on the sectional image data 75 (refer to Fig. 9) captured by the imaging element 69, so as to detect rising of luminance. For example, in a case where rising of luminance appears twice, the control unit 70 determines that both of an image (that is, the front surface image data P1s of the lens LE which will be described later) of the first reflected light beam R1 and an image (that is, the rear surface image data Q2s of the lens LE which will be described later) of the second reflected light beam R2 are detected, and finishes driving of the Y-direction movement mechanism 62. Consequently, the control unit 70 can obtain the sectional image data 75 at the position of the point P1 in the target lens shape TG applied to the lens LE.
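  • The following is a minimal sketch, under the assumption that the sectional image data 75 is available as a two-dimensional luminance array, of how the control unit 70 might determine that rising of luminance appears twice; the threshold value and the axis conventions are illustrative assumptions, not values taken from the present example.

```python
import numpy as np

def count_luminance_risings(profile, threshold=50):
    # Count upward threshold crossings in a 1-D luminance profile taken along
    # the depth direction of the sectional image data.
    above = np.asarray(profile) >= threshold
    return int(np.count_nonzero(above[1:] & ~above[:-1]))

def both_reflections_received(section_image, threshold=50):
    # Collapse the image to a luminance-versus-depth profile (here by taking
    # the maximum over the other axis) and check whether rising of luminance
    # appears twice, i.e. whether both the front-surface reflection and the
    # rear-surface reflection fall on the light reception surface.
    depth_profile = np.asarray(section_image).max(axis=1)
    return count_luminance_risings(depth_profile, threshold) >= 2
```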
  • Fig. 9 illustrates an example of the sectional image data 75. For example, the sectional image data 75 includes the front surface image data of the lens LE formed by the first reflected light beam R1 and the rear surface image data of the lens LE formed by the second reflected light beam R2. For example, in the present example, an image of the point P1 on the lens LE is the front surface image data P1s of the lens LE. For example, in the present example, instead of an image of the point Q1 on the rear surface of the lens LE, an image of a point Q2 located on an extension line (that is, the dotted line illustrated in Fig. 6) of the second reflected light beam R2 is the rear surface image data Q2s of the lens LE. This is because the lens LE has a predetermined refractive index, and the second reflected light beam R2 is bent at the front surface of the lens LE and then reaches the imaging element 69. It appears that the point Q1 on the rear surface of the lens LE is located at the position of the point Q2 in the imaging element 69.
  • For example, in a case where the sectional image data 75 at the initial position (that is, the point P1) on the lens LE is obtained, the control unit 70 drives the motor 17 so as to rotate the cylindrical base 11, and rotates the lens LE mounted on the support pins 14 in the horizontal direction by one round. For example, the control unit 70 controls the Y-direction movement mechanism 62 such that the first reflected light beam R1 is adjusted to reach an identical position on the light reception surface of the imaging element 69 at all times. For example, the control unit 70 controls the Z-direction movement mechanism 63 on the basis of the target lens shape TG applied to the lens LE, and matches the optical axis L3 with points (for example, points P2, P3, P4, ..., and Pn illustrated in Fig. 6) corresponding to the target lens shape applied to the lens LE. For example, the points corresponding to the target lens shape may be points with gaps of a predetermined angle (for example, 0.5 degrees or 1 degree) with the optical central position O as the center, and may be points of any number (for example, 1000 points).
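  • As one possible concretization of the sampling described above, the following sketch generates measurement points at a fixed angular pitch (for example, 0.5 degrees) around a radius-vector target lens shape with the optical central position O taken as the origin; the data layout and the pitch are assumptions made only for this sketch.

```python
import numpy as np

def measurement_points(radii, angles_deg, step_deg=0.5):
    # Sample points around the target lens shape at a fixed angular pitch
    # about the optical central position O (taken as the origin here).
    sample_angles = np.arange(0.0, 360.0, step_deg)
    sample_radii = np.interp(sample_angles, angles_deg, radii, period=360.0)
    # Convert each (radius, angle) pair into lens-plane coordinates that the
    # lens rotation and the Z-direction movement mechanism would visit in turn.
    x = sample_radii * np.cos(np.radians(sample_angles))
    z = sample_radii * np.sin(np.radians(sample_angles))
    return np.column_stack([x, z])

# Example: 720 measurement points (0.5-degree pitch) on a circular shape.
points = measurement_points(radii=[25.0] * 360, angles_deg=np.arange(360.0))
```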
  • For example, the control unit 70 controls the Z-direction movement mechanism 63 while rotating the lens LE in order to consecutively obtain the sectional image data 75 for each point on a radius vector angle in the target lens shape applied to the lens LE. For example, the control unit 70 controls the Y-direction movement mechanism 62 while rotating the lens LE such that the front surface image data P1s of the lens LE is displayed at a predetermined position in the sectional image data 75. Fig. 14 is a diagram for explaining control for displaying the front surface image data P1s at a predetermined position. For example, the control unit 70 measures a deviation amount Δd between the front surface image data P1s of the lens LE and a predetermined position M in a depth direction (in other words, an upward-and-downward direction of the sectional image data 75) with respect to the sectional image data 75. For example, the control unit 70 calculates a drive amount of the Y-direction movement mechanism 62 on the basis of the measured deviation amount Δd, and moves the measurement optical system 61. Consequently, an imaging position of the first reflected light beam R1 reaching the light reception surface of the imaging element 69 is changed, and thus the front surface image data P1s matches the predetermined position M in the sectional image data 75.
  • For example, the control unit 70 measures the deviation amount Δd for all radius vector angles in the target lens shape of the lens LE, and drives the Y-direction movement mechanism 62 on the basis thereof. In other words, whenever a position for obtaining the sectional image data 75 on the lens LE is changed, the control unit 70 drives the Y-direction movement mechanism 62 such that the front surface image data P1s is displayed at the predetermined position M in the sectional image data 75. For example, the control unit 70 specifies a front surface position (for example, a position coordinate) of each radius vector angle in the target lens shape applied to the lens LE on the basis of a drive amount in which the Y-direction movement mechanism 62 has been driven, by using the deviation amount Δd. For example, in this case, the front surface position of each radius vector angle may be specified by obtaining the number of pulses of the motor configuring the Y-direction movement mechanism 62. For example, the control unit 70 rotates the lens LE in the horizontal direction by one round, obtains sectional image data of points on all radius vector angles in the target lens shape applied to the lens LE and front surface positions of the lens LE, and stores the sectional image data and the front surface positions in the memory 71.
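  • The following is a minimal sketch of the tracking described above, assuming hypothetical calibration values (millimeters per pixel of the imaging element 69 and millimeters per pulse of the motor of the Y-direction movement mechanism 62); it shows how the front surface position at each radius vector angle could be recovered from the accumulated drive amount.

```python
def track_front_surface(deviations_px, mm_per_pixel=0.01, mm_per_pulse=0.002,
                        start_position_mm=0.0):
    # At each radius vector angle the deviation (in pixels) of the front
    # surface image P1s from the reference position M is converted into a
    # drive amount of the Y-direction movement mechanism; the accumulated
    # drive amount then gives the front surface position at that angle.
    position_mm = start_position_mm
    front_positions = []
    for dev_px in deviations_px:
        drive_mm = dev_px * mm_per_pixel          # image deviation -> stage move
        pulses = round(drive_mm / mm_per_pulse)   # command sent to the motor
        position_mm += pulses * mm_per_pulse      # displacement actually driven
        front_positions.append(position_mm)
    return front_positions

# Example: three successive deviations of +5, -2 and 0 pixels.
positions = track_front_surface([5.0, -2.0, 0.0])
```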
  • In the present example, the lens LE mounted on the support pins 14 is rotated by one round, but may be rotated by any number of rounds. For example, the lens LE mounted on the support pins 14 may be rotated by two rounds, and sectional image data at each of a bevel position and a chamfer position which are calculated on the basis of a target lens shape applied to the lens LE may be obtained. For example, in this case, by controlling the Z-direction movement mechanism 63, the lens LE may be rotated by one round such that the optical axis L3 matches the bevel position on the lens LE, and may then be further rotated by one round such that the optical axis L3 matches the chamfer position (for example, a position inward by a predetermined distance from the bevel position) on the lens LE.
  • <Obtainment of first edge information (S6)>
  • For example, the cup attachment apparatus 1 may have an edge measurement mechanism obtaining edge information of the lens LE. For example, the control unit 70 may measure the lens LE by controlling the edge measurement mechanism, so as to obtain edge information. In this case, the control unit 70 may control the edge measurement mechanism on the basis of the target lens shape applied to the lens LE, so as to obtain edge information. In the present example, as an example, a description will be made of a case where the image data obtaining mechanism 60 is used as the edge measurement mechanism, and the control unit 70 obtains edge information.
  • For example, the control unit 70 obtains edge information (first edge information) about an edge of the lens LE with respect to each piece of the obtained sectional image data 75. For example, in the present example, since the sectional image data 75 based on the target lens shape applied to the lens LE is obtained, edge information based on the target lens shape applied to the lens LE can be obtained. For example, as the edge information, a front surface position of the edge, a rear surface position, or an edge thickness of the lens LE may be obtained. Of course, as the edge information, a front surface curve value or a rear surface curve value of the lens LE may be obtained. In the present example, as an example, a description will be made of a case where an edge thickness is obtained as the edge information of the lens LE.
  • For example, the control unit 70 obtains position coordinates (for example, pixel coordinates) of the front surface image data P1s and the rear surface image data Q2s in the sectional image data 75. For example, the control unit 70 computes a gap h between the front surface image data P1s of the lens LE and the rear surface image data Q2s of the lens LE by using the pixel coordinates of the front surface image data P1s and the rear surface image data Q2s. For example, in this case, the control unit 70 may compute the number of pixels of the gap h by obtaining a difference between the pixel coordinates. For example, in this case, the control unit 70 sets an actual distance per pixel of the imaging element 69, and can thus convert the number of pixels of the gap h into an actual distance.
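  • As a small worked example of this conversion, the following sketch turns the pixel gap between the front surface image data P1s and the rear surface image data Q2s into an actual distance on the light reception surface; the pixel pitch used here is a hypothetical value.

```python
def gap_actual_distance(front_px, rear_px, mm_per_pixel=0.01):
    # Convert the pixel gap h between the front surface image data P1s and
    # the rear surface image data Q2s into an actual distance on the light
    # reception surface of the imaging element (mm_per_pixel is hypothetical).
    return abs(rear_px - front_px) * mm_per_pixel

# Example: a 230-pixel gap with a 10 micrometer pixel pitch is 2.3 mm.
h_mm = gap_actual_distance(front_px=120, rear_px=350)
```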
  • However, for example, the second reflected light beam is bent at the front surface of the lens LE in the sectional image data 75 as described above, and thus the gap h between the front surface image data P1s and the rear surface image data Q2s of the lens LE does not necessarily match the edge thickness t of the lens LE. Thus, for example, the control unit 70 may obtain the edge thickness t of the lens LE based on the refractive index and the sectional image data 75 of the lens LE. For example, in the present example, an actual distance of the gap h between the front surface image data P1s and the rear surface image data Q2s of the lens LE obtained on the basis of the sectional image data 75 is corrected on the basis of the refractive index of the lens LE, and thus the control unit 70 obtains the edge thickness t of the lens LE. In other words, in the present example, the edge thickness obtained on the basis of the sectional image data is corrected on the basis of the refractive index, and thus the edge thickness of the lens LE is obtained.
  • For example, the control unit 70 may correct an actual distance of the gap h between the front surface image data P1s and the rear surface image data Q2s of the lens LE so as to compute the edge thickness t of the lens LE by using the following equation:

    t = h × n / cos θ × β
  • Here, n indicates a refractive index of the lens LE, β indicates an imaging magnification of the light receiving optical system 66, and θ indicates a tilt angle of the optical axis L4 with respect to the front surface of the lens LE. The imaging magnification β and the tilt angle θ are values known from the design. For example, the control unit 70 may call the refractive index n obtained in advance in step S2 from the memory 71, and obtain the edge thickness t of the lens LE by computing the above equation. For example, in a case where the edge thickness t of the lens LE is obtained as mentioned above in each piece of the sectional image data 75, the control unit 70 stores the edge thickness t in the memory 71 in correspondence to a position of each point on a radius vector angle in the target lens shape applied to the lens LE.
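  • The following is a minimal worked example of the above correction, assuming hypothetical numerical values and reading the expression as t = h × n / (cos θ × β); this grouping is an interpretation made for the sketch and is not confirmed by the equation as printed.

```python
import math

def edge_thickness(h_mm, n, beta, theta_deg):
    # Correct the measured gap h into the edge thickness t. The grouping
    # t = h * n / (cos(theta) * beta) is an interpretation of the printed
    # expression; n is the refractive index, beta the imaging magnification
    # of the light receiving optical system 66, theta the tilt of axis L4.
    return h_mm * n / (math.cos(math.radians(theta_deg)) * beta)

# Worked example with hypothetical values: h = 2.3 mm, n = 1.60, beta = 1.0,
# theta = 30 degrees  ->  t = 2.3 * 1.60 / (cos 30 deg * 1.0) ≈ 4.25 mm.
t = edge_thickness(h_mm=2.3, n=1.60, beta=1.0, theta_deg=30.0)
```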
  • The front surface and the rear surface of the lens LE are curved, and thus a measurement light beam applied from the light source 65 is not necessarily perpendicular to the front surface of the lens LE. In that case, the measurement light beam strikes the front surface of the lens LE obliquely, and the light beam advancing toward the rear surface of the lens LE and returning from the rear surface is bent at the front surface of the lens LE. Therefore, the obtained edge thickness t of the lens LE may be further corrected by taking into consideration a front surface curve value of the lens LE.
  • For example, in this case, a table (conversion table) of correction coefficients corresponding to the refractive index n and the front surface curve value of the lens LE may be stored in the memory 71 in advance, and the control unit 70 may multiply the edge thickness t of the lens LE by the corresponding coefficient. The front surface curve value of the lens LE may be obtained in advance by using a separate apparatus and read into the cup attachment apparatus 1, or may be input by the operator. Of course, the front surface curve value of the lens LE may be obtained through image processing on the sectional image data 75 which is obtained by using the cup attachment apparatus 1.
  • An equation used for computation of the edge thickness t is not limited to the equation in the present example, and an equation including at least one of the refractive index n of the lens LE, a front surface curve value, and a distance D from the optical central position O to a point (for example, the point P1 in Fig. 6) on each radius vector angle, as a parameter, may be used.
  • For example, in a case where the operator has finished blocking (step S4), obtainment of sectional image data (step S5), and obtainment of edge information (step S6) with respect to the left lens, the operator sequentially executes the steps on the right lens in the same manner, so as to obtain the edge thickness t corresponding to positions in the target lens shape applied to the lens LE.
  • <Setting of finishing position (S7)>
  • For example, in a case where the edge thickness t is obtained, the control unit 70 changes an operation screen displayed on the display 2 to a simulation screen, and sets a finishing position. For example, the control unit 70 sets an edge finishing position on the basis of the obtained edge information (the edge thickness t in the present example). For example, the edge finishing position is a position of a bevel, a position of a groove, or a position of a chamfer formed in the edge. For example, the finishing position may be automatically set (that is, initially set) on the basis of the obtained edge thickness t, and may be adjusted by the operator. In the present example, as an example, a description will be made of a case of setting a position of a bevel as the edge finishing position.
  • For example, the control unit 70 may display the finishing position on the basis of the edge thickness on the display 2, and may set the finishing position on the basis of an operation signal from the input button 3 for adjusting the finishing position on the display. In this case, the control unit 70 may display a position of a bevel of the left lens and a position of a bevel of the right lens in a comparable manner. In this case, the control unit 70 may set a finishing position of at least one of the left lens and the right lens on the basis of an operation signal from the input button 3. Consequently, positions of the bevels (apex positions of the bevels) can be disposed with good balance with respect to the respective edge thicknesses t by taking into consideration a difference between the edge thicknesses t of the left lens and the right lens.
  • Figs. 10A and 10B illustrate examples of a simulation screen 80. Fig. 10A is a diagram illustrating the entire simulation screen 80. Fig. 10B is a diagram illustrating a partially enlarged simulation screen 80. For example, target lens shapes (a target lens shape TGL applied to the left lens and a target lens shape TGR applied to the right lens) of the lens LE are displayed on the simulation screen 80. Shapes of the edges (a shape 81L of the edge of the left lens and a shape 81R of the edge of the right lens) after the bevels are processed in the lenses LE are displayed on the simulation screen 80. For example, cursors (a cursor 82L and a cursor 82R) which are moved on the target lens shapes by an operation of the operator are displayed on the target lens shapes applied to the lenses LE. The operator can designate an observation direction of the edge of the lens LE by using the cursor, and can display a shape of the edge of the lens LE. For example, entry fields 83 (an entry field 83L and an entry field 83R) for changing a curve value of a bevel curve of the lens LE (that is, a path along which the bevel is disposed in the edge of the lens LE), a position of the bevel curve in the front-and-rear direction, a tilt of the bevel curve, and the like are provided on the simulation screen 80. In the present example, sectional shapes of rims of the frames may be displayed in correspondence to display positions of bevel positions 84L and 84R on the simulation screen 80.
  • For example, the control unit 70 automatically sets a temporary position of a bevel formed in the lens LE. For example, a bevel curve may be a curve along a front surface curve of the lens LE. For example, an apex position of the bevel may be set, on the basis of the edge thickness t of the lens LE, to pass through a position at half of the edge thickness at the portion where the edge thickness t is smallest. For example, the control unit 70 sets temporary positions of bevels for the left lens and the right lens as mentioned above. In this case, the control unit 70 may automatically adjust temporary positions of bevels formed in the left and right lenses by taking into consideration a difference between the edge thicknesses t of the left lens and the right lens. For example, in this case, a distance BL from an apex position of the bevel in the left lens to the front surface of the lens LE and a distance BR from an apex position of the bevel in the right lens to the front surface of the lens LE may be automatically adjusted to be the same as each other. For example, in this case, a difference between the distance BL and the distance BR may be automatically adjusted to be equal to or less than a predetermined threshold value. For example, the predetermined threshold value may be set in advance through tests or simulation.
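  • The following sketch illustrates one way the temporary bevel apex positions could be initialized and balanced between the left lens and the right lens as described above; the threshold value and the averaging rule used when the distances BL and BR differ are illustrative assumptions.

```python
def initial_bevel_positions(left_thicknesses, right_thicknesses, threshold_mm=0.5):
    # Place each temporary bevel apex at half of the smallest edge thickness
    # of its own lens (distance B measured from the lens front surface); if
    # the distances BL and BR differ by more than the threshold, pull both
    # toward their mean. Threshold and averaging rule are illustrative only.
    b_left = min(left_thicknesses) / 2.0
    b_right = min(right_thicknesses) / 2.0
    if abs(b_left - b_right) > threshold_mm:
        b_left = b_right = (b_left + b_right) / 2.0
    return b_left, b_right

# Example: a thin left lens (minimum 2.0 mm) and a thicker right lens
# (minimum 4.4 mm); both apexes are balanced to 1.6 mm from the front surface.
BL, BR = initial_bevel_positions([2.0, 2.4, 3.1], [4.4, 5.0, 5.6])
```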
  • For example, in a case where there is no problem in the temporary positions of the bevels which are initially set as mentioned above, the operator operates a determination button (not illustrated). The control unit 70 creates bevel formation data for forming bevels in the edges of the left lens and the right lens in response to an operation instruction, and stores the bevel formation data in the memory 71. For example, the control unit 70 transmits the created bevel formation data to the lens peripheral edge processing apparatus along with the front surface position of the lens LE.
  • For example, the operator may manually adjust the temporary positions of the bevels which are initially set. For example, the operator moves the cursor 82L by touching the display 2, and thus designates any radius vector angle of the left lens. Consequently, the operator can determine an observation point with the emphasis on appearance based on a positional relationship between the frame and the edge of the lens LE. Consequently, the operator can compare a positional relationship between the front surface of the lens LE and the apex position of the bevel with respect to the left lens and the right lens. The cursor 82R may be automatically moved to a laterally symmetric position in conjunction with the cursor 82L.
  • For example, the operator compares the shape 81L of the edge of the left lens with the shape 81R of the edge of the right lens, and changes each of the distance BL from the apex position of the bevel in the left lens to the front surface of the lens LE and the distance BR from the apex position of the bevel in the right lens to the front surface of the lens LE. For example, the apex positions of the bevels may be moved by the operator entering any values into the entry field 83. For example, the control unit 70 changes display positions of the positions 84R and 84L of the bevels in correspondence to entered values. For example, the operator may change curve values of bevel curves or tilts by entering any values into the entry fields 83. For example, in this case, the control unit 70 computes positions of bevels again in correspondence to the entered values, and changes display positions of the positions 84L and 84R of the bevels. For example, as mentioned above, the control unit 70 can set a temporary position of a bevel formed in the edge of the lens LE on the basis of an operation signal which is input from the display 2 due to an operation performed by the operator. For example, in a case where adjustment of the temporary positions of the bevels is finished, the operator operates the determination button (not illustrated). The control unit 70 creates bevel formation data for forming bevels in the edges of the left lens and the right lens in response to an operation instruction, and stores the bevel formation data in the memory 71. For example, the control unit 70 transmits the created bevel formation data to the lens peripheral edge processing apparatus along with the front surface position of the lens LE.
  • <Lens peripheral edge processing apparatus>
  • For example, in a case where the position of the bevel formed in the edge of the lens LE is set in response to the operation by using the cup attachment apparatus 1, the operator processes the edge of the lens LE by using the lens peripheral edge processing apparatus. Fig. 11 is a schematic configuration diagram of a lens peripheral edge processing apparatus 90. For example, in a case where the operator activates the lens peripheral edge processing apparatus 90, the control unit 95 of the lens peripheral edge processing apparatus may receive the above-described bevel formation data and front surface position of the lens LE, and may store the bevel formation data and the front surface position in a memory (not illustrated).
  • For example, the lens peripheral edge processing apparatus 90 in the present example includes a lens holding unit 100 which holds the lens LE with lens chuck shafts 102, an edge information obtaining unit 200 obtaining edge information of the lens LE, a processing unit 300 processing a peripheral edge of the lens LE, and the like. The lens holding unit 100 includes a lens rotation unit 100a rotating the lens LE, a chuck shaft movement unit 100b moving the lens chuck shafts 102 in the X axis direction, and an inter-axial distance changing unit 100c moving the lens chuck shafts 102 in the Y direction (the direction of changing an inter-axial distance between the lens chuck shafts 102 and a rotation axis of the processing unit 300). For example, JP-A-2015-131374 is referred to with respect to detailed configurations of the lens peripheral edge processing apparatus 90.
  • <Holding of lens (S8)>
  • For example, the operator holds the left lens with lens chuck shafts 102L and 102R of the lens holding unit 100. For example, a rotation central axis K3 of the lens chuck shafts 102L and 102R is designed to match a central position (in other words, the optical central position O of the lens LE) of the cup Cu. For example, the cup Cu is held to be directed in a predetermined direction with respect to the rotation central axis K3. For example, in the present example, the cup Cu is held such that the virtual line V1 of the lens LE matches the Y direction of the lens peripheral edge processing apparatus 90, and the virtual line V2 of the lens LE matches the Z direction of the lens peripheral edge processing apparatus 90. Thus, the X direction in the cup attachment apparatus 1 corresponds to the Z direction in the lens peripheral edge processing apparatus 90. The Y direction in the cup attachment apparatus 1 corresponds to the X direction in the lens peripheral edge processing apparatus 90. The Z direction in the cup attachment apparatus 1 corresponds to the Y direction in the lens peripheral edge processing apparatus 90.
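  • The axis correspondence described above can be summarized by a trivial mapping; the following sketch, with hypothetical function names, relabels a coordinate expressed in the axes of the cup attachment apparatus 1 into the axes of the lens peripheral edge processing apparatus 90.

```python
def cup_to_edger_axes(x_cup, y_cup, z_cup):
    # Relabel a coordinate from the axes of the cup attachment apparatus 1 to
    # the axes of the lens peripheral edge processing apparatus 90:
    # cup X -> edger Z, cup Y -> edger X, cup Z -> edger Y.
    return y_cup, z_cup, x_cup  # returned as (x_edger, y_edger, z_edger)
```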
  • <Obtainment of second edge information (S9)>
  • For example, the operator starts processing of the peripheral edge of the lens LE by operating a processing start button (not illustrated). First, the control unit 95 of the lens peripheral edge processing apparatus 90 controls the edge information obtaining unit 200 to obtain edge information (second edge information) of at least one point at a radius vector angle based on the target lens shape in the front surface of the lens LE. For example, in the present example, edge information of the point P1 on the virtual line V1 is obtained.
  • For example, the control unit 95 moves the lens LE held by the lens chuck shafts 102L and 102R by driving the lens rotation unit 100a, the chuck shaft movement unit 100b, and the inter-axial distance changing unit 100c such that a tracing stylus 201 of the edge information obtaining unit 200 is brought into contact with the point P1 on the lens LE. For example, the control unit 95 recognizes positions of the point P1 in the Y direction and the Z direction with the rotation central axis K3 as a reference, and drives the lens rotation unit 100a and the inter-axial distance changing unit 100c. For example, the control unit 95 moves the lens LE in the X direction by driving the chuck shaft movement unit 100b until a sensor (not illustrated) provided in the tracing stylus 201 detects contact with the lens LE. For example, in this case, the control unit 95 measures movement amounts in which the lens chuck shafts 102L and 102R are moved in the X direction from initial positions thereof. Consequently, a position where the lens LE is located in a lens chuck shaft direction (that is, the X direction) is recognized. In other words, the position (that is, a position coordinate) of the point P1 on the lens LE is recognized. The control unit 95 may slide the tracing stylus 201 on the lens LE so as to obtain edge information of one surface at a radius vector angle of the lens LE.
  • <Correlation of position coordinate (S10)>
  • Here, for example, the control unit 95 correlates the front surface position of each point (that is, a position coordinate of each point) in the lens LE, obtained in the cup attachment apparatus 1, with the position coordinate of the point P1 in the X direction on the lens LE, obtained in the lens peripheral edge processing apparatus 90. For example, consequently, position coordinates in the X direction of the lens peripheral edge processing apparatus 90 are obtained with respect to the point P2 and the subsequent points on the lens LE. With respect to the point P1 on the lens LE, the position coordinate in the X direction of the lens peripheral edge processing apparatus 90 is obtained in step S9. Hereinafter, correlation of position coordinates of the point P2 and the subsequent points will be described.
  • For example, the control unit 95 calls a front surface position (position coordinate) of each radius vector angle in the target lens shape applied to the lens LE from the memory (not illustrated), and correlates a position coordinate therewith for each radius vector angle of the lens LE on the basis of the position coordinate of the point P1 on the lens LE. More specifically, a position coordinate of the point P2 on the lens LE in the lens peripheral edge processing apparatus 90 is obtained by adding the position coordinate of the point P1 obtained by using the lens peripheral edge processing apparatus 90 to the position coordinate of the point P2 obtained in the cup attachment apparatus 1. For example, a position coordinate of the point P3 on the lens LE in the lens peripheral edge processing apparatus 90 is obtained by adding the position coordinate of the point P1 obtained by using the lens peripheral edge processing apparatus 90 to the position coordinate of the point P3 obtained in the cup attachment apparatus 1. For example, in the above-described way, the control unit 95 correlates a position coordinate of one point on the lens LE, obtained by using the edge information obtaining unit 200 of the lens peripheral edge processing apparatus 90, with a position coordinate of a point at each radius vector angle of the lens LE, obtained by using the image data obtaining mechanism 60 of the cup attachment apparatus 1. Consequently, the bevel formation data created in the cup attachment apparatus 1 can be correlated with the lens LE held by the lens chuck shafts 102.
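  • The following is a minimal sketch of the correlation described above, read as an offset addition anchored at the point P1; the list layout and the example values are assumptions made only for this sketch.

```python
def correlate_front_positions(front_positions_cup, index_p1, x_p1_edger):
    # Anchor the per-angle front surface positions stored by the cup
    # attachment apparatus 1 to the X axis of the lens peripheral edge
    # processing apparatus 90, using the single stylus measurement of P1.
    offset = x_p1_edger - front_positions_cup[index_p1]
    return [p + offset for p in front_positions_cup]

# Example: P1 is the first stored point and the tracing stylus finds it at
# X = 12.40 mm; the remaining points are shifted by the same offset.
x_edger = correlate_front_positions([0.00, 0.15, 0.32, 0.41],
                                    index_p1=0, x_p1_edger=12.40)
# -> approximately [12.40, 12.55, 12.72, 12.81]
```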
  • <Obtainment of process control data (S11)>
  • For example, the control unit 95 obtains process control data for processing the peripheral edge of the lens LE on the basis of the obtained edge information (that is, the second edge information) of one point on the lens LE and bevel formation data (that is, the bevel formation data created on the basis of the first edge information and the finishing position). For example, as the process control data, a drive amount for driving at least one of the lens rotation unit 100a, the chuck shaft movement unit 100b, and the inter-axial distance changing unit 100c is calculated. The tracing stylus 201 is in a state of being brought into contact with the point P1 on the lens LE held by the lens chuck shafts 102. Thus, in the present example, a drive amount from the position where the lens LE is brought into contact with the tracing stylus 201 is calculated as the process control data. Of course, the control unit 95 may return the lens chuck shafts 102 to initial positions, and may calculate a drive amount from the initial positions as the process control data.
  • <Processing of edge (S12)>
  • For example, the control unit 95 moves the lens chuck shafts 102 in at least one of the X direction, the Y direction, and the Z direction on the basis of the process control data, and locates the lens LE on a grindstone 310 of the processing unit 300. For example, the control unit 95 moves the lens chuck shafts 102 in at least one of the X direction, the Y direction, and the Z direction on the basis of the process control data, and adjusts a relative positional relationship of the lens LE for the grindstone 310. For example, consequently, the peripheral edge of the lens LE is processed (for example, roughing or bevel processing). For example, in a case where the peripheral edge of the lens LE has been processed in the above-described way with respect to the left lens, the operator holds the right lens with the lens chuck shafts 102, and processes the peripheral edge of the lens LE as described above.
  • As described above, the shaft alignment apparatus of the present example includes the shaft alignment position setting portion for setting a shaft alignment position being an attachment position with respect to an eyeglasses lens of the lens holding portion that clamps and holds the eyeglasses lens in order to process a peripheral edge of the eyeglasses lens, and the edge information obtaining portion for obtaining edge information about an edge of the eyeglasses lens. Consequently, edge information which is obtained by using a lens peripheral edge processing apparatus in the related art can instead be obtained by using the shaft alignment apparatus. Therefore, the usage time of the lens peripheral edge processing apparatus is reduced, and thus it is possible to relatively reduce the time required to manufacture eyeglasses.
  • For example, in the shaft alignment apparatus of the present example, the edge information obtaining portion includes the edge measurement portion for obtaining edge information of an eyeglasses lens, and measures the eyeglasses lens by controlling the edge measurement portion to obtain the edge information. Consequently, the edge information of the eyeglasses lens can be obtained by using the shaft alignment apparatus, and thus the usage time of the lens peripheral edge processing apparatus can be reduced.
  • For example, the shaft alignment apparatus of the present example includes the target lens shape obtaining portion for obtaining a target lens shape applied to the eyeglasses lens. Consequently, the edge information obtaining portion of the shaft alignment apparatus can control the edge measurement portion based on the target lens shape applied to the eyeglasses lens, and can thus obtain the edge information at a position based on the target lens shape.
  • For example, the shaft alignment apparatus of the present example includes the finishing position setting portion for setting a finishing position of the edge based on the edge information of the eyeglasses lens. Consequently, a finishing position which is set by using a lens peripheral edge processing apparatus in the related art can be set by using the shaft alignment apparatus. Therefore, the usage time of the lens peripheral edge processing apparatus is reduced, the time during which the operator waits for the lens peripheral edge processing apparatus to become available is shortened, and thus it is possible to relatively reduce the time required to manufacture eyeglasses.
  • For example, in the shaft alignment apparatus of the present example, the finishing position setting portion displays the finishing position based on edge information on the display portion, and sets a finishing position based on an operation signal from the operation portion for adjusting a finishing position on the display portion. Consequently, the operator can easily understand a finishing position formed in the eyeglasses lens, and can easily adjust the finishing position formed in the eyeglasses lens.
  • For example, in the shaft alignment apparatus of the present example, the finishing position setting portion displays a finishing position based on the edge information of a left eyeglasses lens and a finishing position based on the edge information of a right eyeglasses lens on the display portion in a comparable manner, and sets a finishing position of at least one of the left eyeglasses lens and the right eyeglasses lens based on an operation signal from the operation portion. Consequently, the operator can easily understand the balance between finishing positions in the left eyeglasses lens and the right eyeglasses lens. The operator can adjust each of the finishing positions set in the left eyeglasses lens and the right eyeglasses lens by taking into consideration the balance between the finishing positions. Therefore, the operator can easily manufacture eyeglasses with good appearance.
  • For example, the shaft alignment apparatus of the present example includes the refractive index obtaining portion for obtaining a refractive index of the eyeglasses lens, the image data obtaining portion for obtaining sectional image data including front surface image data for a front surface of the eyeglasses lens and rear surface image data for a rear surface of the eyeglasses lens, and the edge information obtaining portion for obtaining the edge information about an edge of the eyeglasses lens based on the refractive index and the sectional image data. Consequently, it is possible to efficiently obtain edge information of the eyeglasses lens by using sectional image data which is changed due to the refractive index of the eyeglasses lens.
  • For example, the shaft alignment apparatus of the present example includes a projection optical system which projects a measurement light beam toward the front surface or the rear surface of the eyeglasses lens, and a light receiving optical system which receives a first reflected light beam as a result of the measurement light beam being reflected at the front surface of the eyeglasses lens and a second reflected light beam as a result of the measurement light beam being reflected at the rear surface of the eyeglasses lens. Consequently, the image data obtaining portion can obtain sectional image data including front surface image data formed by the first reflected light beam and the rear surface image data formed by the second reflected light beam, with a simple configuration. The image data obtaining portion can obtain sectional image data in a non-contact manner by using a measurement light beam. Therefore, it is possible to efficiently obtain edge information of the eyeglasses lens.
  • In the present embodiment, the shaft alignment apparatus, which includes the lens support portion on which an eyeglasses lens is mounted and the shaft alignment position setting portion for setting a shaft alignment position of the eyeglasses lens mounted on the lens support portion (the shaft alignment position being an attachment position, with respect to the eyeglasses lens, of the lens holding portion that clamps and holds the eyeglasses lens in order to process a peripheral edge of the eyeglasses lens), is also used as a lens shape measurement apparatus. For example, an operator can obtain edge information of an eyeglasses lens in a non-contact manner by using the shaft alignment apparatus, and can thus efficiently obtain the edge information of the eyeglasses lens.
  • In the present example, the lens peripheral edge processing apparatus including a processing tool for processing a peripheral edge of an eyeglasses lens held by the lens holding portion is also used as a lens shape measurement apparatus. For example, an operator can obtain edge information of an eyeglasses lens in a non-contact manner by using the lens peripheral edge processing apparatus, and can thus efficiently obtain the edge information of the eyeglasses lens.
  • <Modification examples>
  • In the present example, as a configuration of holding the lens LE, an exemplary configuration of mounting the lens LE on the support pins 14 has been described, but this is only an example. For example, as a configuration of holding the lens LE, there may be a configuration of sandwiching the lens LE. In this case, a side surface (peripheral edge) of the lens LE may be sandwiched, or the front surface and the rear surface of the lens LE may be sandwiched.
  • In the present example, a description has been made of an exemplary configuration in which the cylindrical base 11 is rotated (in other words, the support pins 14 are rotated), and thus the lens LE mounted on the support pins 14 is rotated, but this is only an example. For example, there may be a configuration in which, when the cup Cu is blocked to the surface of the lens LE, the installation portion 31 is rotated about the central axis K2 without separating the cup Cu from the installation portion 31.
  • In the present example, a description has been made of an exemplary configuration in which the image data obtaining mechanism 60 obtains a sectional image of the lens LE captured by the imaging element 69, but this is only an example. For example, the image data obtaining mechanism 60 may obtain signal data before a sectional image is obtained.
  • In the present example, a description has been made of an exemplary configuration in which the image data obtaining mechanism 60 corrects edge information which is obtained by using the sectional image data 75, on the basis of a refractive index of the lens LE, so as to obtain edge information, but this is only an example. For example, the image data obtaining mechanism 60 may correct the sectional image data 75 on the basis of a refractive index of the lens LE, and may obtain edge information of the lens LE on the basis of the corrected sectional image data. Of course, also in this case, the image data obtaining mechanism 60 may further correct the obtained edge thickness t of the lens LE by taking into consideration a front surface curve value of the lens LE.
  • In the present example, a description has been made of an exemplary configuration in which the image data obtaining mechanism 60 adjusts an irradiation position of a measurement light beam to a position based on the target lens shape TG applied to the lens LE by using rotation of the lens LE in the horizontal direction and the Z-direction movement mechanism 63, but this is only an example. For example, the image data obtaining mechanism 60 may further include an X-direction movement mechanism adjusting a position of the measurement optical system 61 in the leftward-and-rightward direction (X direction), and may adjust an irradiation position of a measurement light beam by using the X-direction movement mechanism and the Z-direction movement mechanism 63. In this case, even if the lens LE is not rotated in the horizontal direction, an irradiation position of a measurement light beam can be adjusted to a position based on the target lens shape TG applied to the lens LE.
  • In the present example, a description has been made of an exemplary configuration in which the measurement optical system 61 is moved by driving the Y-direction movement mechanism 62, and thus a front surface position of the edge of the lens LE is obtained, but this is only an example. For example, there may be a configuration in which the measurement optical system 61 is disposed in a stationary manner without driving the Y-direction movement mechanism 62, and a front surface position of the edge of the lens LE is obtained. For example, in this case, the control unit 70 may specify a front surface position of the edge of the lens LE on the basis of a position of the front surface image data P1s displayed in each piece of the sectional image data 75. Of course, a front surface position of the edge of the lens LE may be obtained on the basis of both of a movement amount in which the measurement optical system 61 is moved by the Y-direction movement mechanism 62 and a position of the front surface image data P1s displayed in each piece of the sectional image data 75.
  • In the present example, a description has been made of an exemplary configuration in which the imaging element 69 of the image data obtaining mechanism 60 is disposed such that the light reception surface thereof is perpendicular to the optical axis L4, but this is only an example. For example, the imaging element 69 may be disposed to be tiltable. Consequently, even if a focal position of the imaging element 69 deviates between the front surface and the rear surface of the lens LE due to a refractive index and a front surface curve value of the lens LE, the focal position can be appropriately adjusted by changing an angle of the light reception surface of the imaging element 69.
  • In the present example, a description has been made of an exemplary configuration in which a point-shaped measurement light beam is applied from the light source 65 of the image data obtaining mechanism 60, but this is only an example. For example, a slit-shaped measurement light beam may be applied from the light source 65. In this case, a slit plate and a lens may be provided between the light source 65 and the lens LE, and a measurement light beam applied from the light source 65 may be restricted to a slit shape so as to be collected at the lens LE. Consequently, a slit-shaped measurement light beam having a predetermined width is applied to the lens LE, and thus the imaging element 69 can capture images having predetermined widths of the front surface and the rear surface of the lens LE. A front surface curve value and a rear surface curve value of the lens LE may be obtained by performing image processing on the sectional image data 75 captured in the above-described way.
  • In the present example, a description has been made of an exemplary configuration in which a point-shaped measurement light beam is applied from the light source 65 of the image data obtaining mechanism 60 toward one point on the lens LE, but this is only an example. For example, there may be a configuration in which the light source 65 is disposed to be movable, and a point-shaped measurement light beam is applied in a line shape. Consequently, the imaging element 69 can also capture images having predetermined widths of the front surface and the rear surface of the lens LE. Therefore, the control unit 70 may obtain a front surface curve value and a rear surface curve value of the lens LE by performing image processing on the sectional image data 75.
  • In the present example, a description has been made of an exemplary configuration in which the image data obtaining mechanism 60 obtains the sectional image data 75 at each radius vector angle of the lens LE, and the control unit 70 obtains edge information on the basis of the sectional image data 75, but this is only an example. For example, with respect to a position (for example, a position between points corresponding to a target lens shape) at which the sectional image data 75 is not obtained in the lens LE, interpolation may be performed by using edge information based on the sectional image data 75 at peripheral radius vector angles, and thus edge information of the position may be obtained.
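  • The following sketch shows one possible interpolation of edge information at radius vector angles where no sectional image data 75 was obtained; periodic linear interpolation is used here only as an illustration, since the present example does not prescribe a particular interpolation method.

```python
import numpy as np

def interpolate_edge_thickness(measured_angles_deg, measured_thicknesses,
                               query_angles_deg):
    # Fill in edge information at radius vector angles where no sectional
    # image data was obtained, by periodic linear interpolation between the
    # neighbouring measured angles.
    return np.interp(query_angles_deg, measured_angles_deg, measured_thicknesses,
                     period=360.0)

# Example: thickness measured every degree, queried at a half-degree position.
t_mid = interpolate_edge_thickness([0.0, 1.0, 2.0, 3.0],
                                   [2.0, 2.1, 2.3, 2.2], [1.5])
# -> array([2.2])
```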
  • In the present example, a description has been made of an exemplary configuration in which the light source 65 of the image data obtaining mechanism 60 applies a measurement light beam to a position based on a target lens shape applied to the lens LE, but this is only an example. For example, measurement may be performed at any position on the lens LE. As an example, for example, in a case where processing of making a hole in the lens surface of the lens LE is performed, the light source 65 may apply a measurement light beam to a hole position. Consequently, edge information (for example, an edge thickness) at the hole position of the lens LE or curve values of the front and rear surfaces can be obtained.
  • For example, depending on the curve value of the lens LE, the imaging element 69 of the image data obtaining mechanism 60 may be unable to obtain the sectional image data 75 of the lens LE with high accuracy when the luminance level of a reflected light beam (for example, at least one of the first reflected light beam R1 or the second reflected light beam R2) is not favorable. Thus, for example, the control unit 70 may control the luminance level by controlling the projected light amount of the measurement light beam applied from the light source 65. In this case, the control unit 70 may measure the luminance level (luminance value) from the sectional image data 75, and may determine whether or not the measured luminance level satisfies a predetermined threshold value. The predetermined threshold value may be set in advance, through tests, simulation, or the like, to a value at which the luminance level is determined as being favorable.
  • For example, the control unit 70 controls (changes) the amount of light to be projected from the light source 65 on the basis of the determination result. For example, in a case where it is determined that the luminance level of the sectional image data 75 does not satisfy the predetermined threshold value (for example, is less than the predetermined threshold value), the control unit 70 increases the amount of light to be projected from the light source 65 such that the luminance level of the sectional image data 75 satisfies the predetermined threshold value. For example, in a case where it is determined that the luminance level of the sectional image data 75 satisfies the predetermined threshold value (for example, is equal to or more than the predetermined threshold value), the control unit 70 maintains the amount of light to be projected from the light source 65 without adjusting the luminance level. Of course, even in a case where it is determined that the luminance level of the sectional image data 75 satisfies the predetermined threshold value, the control unit 70 may control the luminance level such that it becomes more appropriate. Consequently, it is possible to obtain the sectional image data 75 of the lens LE with high accuracy for lenses LE having various curve values (a minimal sketch of this threshold-based adjustment is given after this list).
  • In the present example, a description has been made of an exemplary configuration in which the gap h between the front surface image data P1s and the rear surface image data Q2s of the lens LE obtained from the sectional image data 75 is corrected on the basis of the refractive index of the lens LE, and thus the edge thickness t of the lens LE is obtained, but this is only an example. For example, there may be a configuration in which the sectional image data 75 itself is corrected on the basis of the refractive index of the lens LE, and the gap h between the front surface image data and the rear surface image data after the correction is calculated such that the edge thickness t of the lens LE is obtained. The sectional image data 75 may be corrected on the basis of at least one of the refractive index of the lens LE, the front surface curve value of the lens LE, and the distance D from the optical central position O to the point at each radius vector angle (a simplified refractive-index correction sketch is given after this list).
  • In the present example, a description has been made of an exemplary configuration in which blocking is performed by using the cup attachment apparatus 1, and then the sectional image data 75 and edge information (for example, the edge thickness t of the lens LE) are obtained, but this is only an example. For example, blocking, obtainment of the sectional image data 75, and obtainment of edge information may all be performed by the cup attachment apparatus 1 in a different order. In this case, the sectional image data 75 and the edge information may be obtained first, and blocking of the lens LE may then be performed.
  • In the present example, a description has been made of an exemplary configuration in which the cup attachment apparatus 1 obtains edge information of the lens LE by using the measurement optical system 61, but this is only an example. For example, there may be a configuration in which the cup attachment apparatus 1 includes a tracing stylus, and the tracing stylus is brought into contact with the lens LE such that edge information of the lens LE is obtained.
  • In the present example, a description has been made of an exemplary configuration in which a finishing position of the lens LE is set by using the cup attachment apparatus 1, but this is only an example. For example, a finishing position of the lens LE may be set by using the lens peripheral edge processing apparatus 90. In this case, for example, a finishing position based on an edge thickness may be displayed on a display (not illustrated) of the lens peripheral edge processing apparatus 90. For example, a finishing position may be set on the basis of an operation signal for adjusting a finishing position. For example, a finishing position of the left lens and a finishing position of the right lens may be displayed in a comparable manner.
  • In the present example, a description has been made of an exemplary configuration in which edge information (second edge information) of one point at a radius vector angle based on a target lens shape in the front surface of the lens LE is obtained by using the lens peripheral edge processing apparatus 90, but this is only an example. For example, edge information of a plurality of points at radius vector angles based on a target lens shape may be obtained by using the lens peripheral edge processing apparatus 90. In this case, for example, edge information of a total of four points including two points (that is, the point P1 and the point P5 in Fig. 13) on the virtual line V1 and two points on the virtual line V2 may be obtained. Consequently, the control unit 95 can predict deformation or a tilt of the lens LE caused by holding the lens LE with the lens chuck shafts 102L and 102R. Hereinafter, as an example, a description will be made of a case where the lens LE is tilted.
  • Fig. 13 is a diagram illustrating a state in which the lens chuck shafts 102 hold the lens LE. For example, Fig. 13 illustrates a section along the virtual line V1, and the lens LE is tilted. In Fig. 13, an intersection between a side g1 extending from the point P1 in a direction perpendicular to the rotation central axis K3 and a side g2 extending from the point P5 in a direction parallel to the rotation central axis K3 is indicated by a point P6. For example, the control unit 95 brings the tracing stylus 201 of the lens peripheral edge processing apparatus 90 into contact with two points (that is, the point P1 and the point P5) on the virtual line V1 of the lens LE, and calculates a position coordinate of each point in the X direction. Next, the control unit 95 calculates a length of each of the side g1 and the side g2. For example, the control unit 95 obtains a length of the side g2 by using a difference between position coordinates of the point P1 and the point P5 in the X direction. For example, the control unit 95 obtains a length of the side g1 on the basis of a target lens shape applied to the lens LE. For example, in the present example, since a distance from the rotation central axis K3 to the point P1 in the Y direction and a distance from the rotation central axis K3 to the point P5 in the Y direction are known on the basis of the target lens shape, the length of the side g1 can be calculated on the basis of the distances.
  • For example, the control unit 95 computes an angle α formed between the side g1 and a line segment connecting the point P1 to the point P5, by applying a trigonometric function to the lengths of the side g1 and the side g2. Consequently, it is possible to obtain the angle at which the lens LE is tilted in the vertical direction with respect to the lens chuck shafts 102L and 102R (a sketch of this computation is given after this list). For example, the control unit 95 may calculate position coordinates of the two points on the virtual line V2 of the lens LE in the lens peripheral edge processing apparatus 90, and may obtain the angle at which the lens LE is tilted in the horizontal direction with respect to the lens chuck shafts 102L and 102R in the same manner as described above. For example, the control unit 95 may cause the bevel formation data created in the cup attachment apparatus 1 to correspond to the lens LE held by the lens chuck shafts 102 by taking such a tilt angle into consideration.
  • In the present example, a description has been made of an exemplary configuration in which the cup attachment apparatus 1 includes an edge measurement mechanism, and measures the lens LE by controlling the edge measurement mechanism, so as to obtain edge information, but this is only an example. For example, there may be a configuration in which the cup attachment apparatus 1 does not include an edge measurement mechanism, and obtains edge information from a separate apparatus.
  • In the present example, a description has been made of an exemplary configuration in which, in the setting of a finishing position in step S7, the control unit 70 performs initial setting and automatic adjustment of a temporary bevel position, and the operator manually adjusts the temporary bevel position, but this is only an example. For example, at least one of such setting and adjustment may be performed. Of course, such setting and adjustment may also be performed in combination.
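For the remark on obtaining surface curve values by image processing of the sectional image data 75, the following is a minimal sketch, not the actual processing of the image data obtaining mechanism 60, of how a curve value could be estimated from points extracted along one imaged surface. The function names, the Kasa-style circle fit, and the n = 1.53 base-curve convention are all assumptions introduced here for illustration.

    import numpy as np

    def fit_circle_radius(points):
        # Least-squares circle fit (Kasa method) to 2-D points (y, z) sampled
        # along one lens surface in a sectional image (preprocessing not shown).
        y, z = points[:, 0], points[:, 1]
        # Solve y^2 + z^2 = 2*a*y + 2*b*z + c in the least-squares sense.
        A = np.column_stack([2 * y, 2 * z, np.ones_like(y)])
        rhs = y ** 2 + z ** 2
        (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        return float(np.sqrt(c + a ** 2 + b ** 2))  # radius of curvature

    def curve_value_from_radius(radius_mm, tool_index=1.53):
        # Convert a radius of curvature [mm] to a base-curve style value,
        # assuming the common n = 1.53 convention (the apparatus may differ).
        return (tool_index - 1.0) * 1000.0 / radius_mm

    # Synthetic front-surface points lying on a 130 mm radius arc.
    theta = np.linspace(-0.2, 0.2, 50)
    front = np.column_stack([130 * np.sin(theta), 130 * (1 - np.cos(theta))])
    print(round(curve_value_from_radius(fit_circle_radius(front)), 2))  # ~4.08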
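For the remark on interpolating edge information at radius vector angles where no sectional image data 75 was obtained, the sketch below assumes simple linear interpolation over angle with wrap-around at 0/360 degrees; the function name and the synthetic thickness values are illustrative only.

    import numpy as np

    def interpolate_edge_thickness(measured_angles_deg, measured_thickness, query_angles_deg):
        # Linearly interpolate edge thickness over radius vector angle,
        # padding one period on each side to handle the 0/360 degree wrap-around.
        order = np.argsort(measured_angles_deg)
        ang = np.asarray(measured_angles_deg, dtype=float)[order]
        thk = np.asarray(measured_thickness, dtype=float)[order]
        ang_ext = np.concatenate([ang - 360.0, ang, ang + 360.0])
        thk_ext = np.tile(thk, 3)
        return np.interp(np.mod(query_angles_deg, 360.0), ang_ext, thk_ext)

    # Sections measured every 10 degrees; thickness queried at 3.5 degrees.
    angles = np.arange(0, 360, 10)
    thickness = 2.0 + 0.3 * np.sin(np.radians(angles))   # synthetic thickness [mm]
    print(round(float(interpolate_edge_thickness(angles, thickness, 3.5)), 3))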
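For the remark on applying the measurement light beam to a hole position, the sketch below only converts a hole position given as (x, y) from the optical central position O into a radius vector angle and a radial distance; the coordinate convention and the function name are assumptions.

    import math

    def hole_measurement_target(hole_x_mm, hole_y_mm):
        # Convert a drill-hole position given as (x, y) from the optical central
        # position O into the radius vector angle and radial distance at which
        # the measurement light beam would be applied.
        distance = math.hypot(hole_x_mm, hole_y_mm)
        angle_deg = math.degrees(math.atan2(hole_y_mm, hole_x_mm)) % 360.0
        return angle_deg, distance

    # A hole 12 mm to the right of and 5 mm above the optical center.
    print(hole_measurement_target(12.0, 5.0))   # ~(22.6 deg, 13.0 mm)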
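For the remark on comparing the luminance level of the sectional image data 75 with a predetermined threshold value and adjusting the projected light amount, the sketch below shows one possible threshold-based adjustment loop; the step size, the clamp, and the linear sensor response used in the example are invented for illustration.

    def adjust_light_amount(current_amount, measured_luminance, threshold,
                            step=0.1, max_amount=1.0):
        # If the measured luminance of the sectional image does not reach the
        # threshold, raise the projected light amount by one step (clamped);
        # otherwise keep the current amount.
        if measured_luminance < threshold:
            return min(current_amount + step, max_amount)
        return current_amount

    # The amount is raised until the (hypothetical) luminance reaches the threshold.
    amount = 0.2
    for _ in range(3):
        measured = 80 + 60 * amount          # invented sensor response
        amount = adjust_light_amount(amount, measured, threshold=100)
    print(round(amount, 2))                  # 0.4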
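For the remark on correcting the gap h by the refractive index to obtain the edge thickness t, the sketch below uses only a paraxial approximation in which refraction compresses the apparent gap by roughly a factor of 1/n; the actual correction may also depend on the front surface curve value and the distance D, which are ignored here.

    def edge_thickness_from_gap(apparent_gap_mm, refractive_index):
        # Refraction makes the rear surface appear closer to the front surface
        # by roughly a factor of 1/n, so the true edge thickness t is
        # approximated here by h * n (paraxial, near-normal incidence only).
        return apparent_gap_mm * refractive_index

    print(round(edge_thickness_from_gap(1.4, 1.60), 2))   # 2.24 mm, illustrative values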
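For the remark on estimating the tilt of the lens LE from the points P1 and P5 on the virtual line V1, the sketch below applies the trigonometric step described above; the function signature is illustrative, and it assumes that P1 and P5 lie on opposite sides of the rotation central axis K3 so that the length of the side g1 is the sum of the two known Y distances.

    import math

    def lens_tilt_angle_deg(x_p1, x_p5, y_dist_p1, y_dist_p5):
        # x_p1, x_p5:  X coordinates of P1 and P5 measured with the tracing stylus.
        # y_dist_*:    known Y distances from the rotation central axis K3 to P1
        #              and P5, taken from the target lens shape.
        g2 = abs(x_p1 - x_p5)          # offset along the chuck axis (side g2)
        g1 = y_dist_p1 + y_dist_p5     # extent across the axis (side g1)
        return math.degrees(math.atan2(g2, g1))   # angle alpha

    # A 0.8 mm X offset over a 42 mm span gives a tilt of about 1.1 degrees.
    print(round(lens_tilt_angle_deg(10.3, 9.5, 21.0, 21.0), 2))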
  • 1
    cup attachment apparatus
    2
    display
    10
    lens support mechanism
    20
    frame shape measurement mechanism
    30
    cup attachment mechanism
    40
    lens information measurement mechanism
    60
    image data obtaining mechanism
    70
    control unit
    71
    memory
    75
    sectional image data
    80
    simulation screen
    90
    lens peripheral edge processing apparatus
    100
    lens holding unit
    102
    lens chuck shaft
    200
    edge information obtaining unit
    300
    processing unit

Claims (10)

  1. An eyeglasses lens processing system that processes a peripheral edge of an eyeglasses lens (LE), comprising:
    a lens peripheral edge processing apparatus (90) that includes a processing tool (300), a lens holding portion (100) for holding the eyeglasses lens (LE), and a process control data obtaining portion (95) for obtaining process control data for processing a peripheral edge of the eyeglasses lens (LE),
    wherein the lens peripheral edge processing apparatus (90) is adapted to control the processing tool (300) based on the process control data obtained by the process control data obtaining portion (95) to process the eyeglasses lens (LE) held by the lens holding portion (100),
    characterized in that the eyeglasses lens processing system further comprises:
    a shaft alignment apparatus (1) that includes a shaft alignment position setting portion (40) for setting a shaft alignment position being an attachment position with respect to an eyeglasses lens (LE) of a lens holding portion (100) that clamps and holds the eyeglasses lens (LE) in order to process a peripheral edge of the eyeglasses lens (LE),
    wherein the shaft alignment apparatus (1) includes an edge information obtaining portion (70) for obtaining edge information about an edge of the eyeglasses lens (LE), and
    wherein the eyeglasses lens processing system is adapted to:
    • in the shaft alignment apparatus (1):
    ∘ set the shaft alignment position and
    ∘ obtain the edge information of the eyeglasses lens (LE) and
    • after the edge information is obtained and the shaft alignment position is set, in the lens peripheral edge processing apparatus (90):
    ∘ hold the eyeglasses lens (LE) by the lens holding portion (100) based on the shaft alignment position set by the shaft alignment apparatus (1),
    ∘ obtain the process control data based on the edge information obtained in the shaft alignment apparatus (1), and
    ∘ control the processing tool (300) based on the process control data to process the eyeglasses lens (LE) held by the lens holding portion (100).
  2. The eyeglasses lens processing system according to claim 1,
    wherein the edge information obtaining portion (70) includes an edge measurement portion for obtaining edge information of the eyeglasses lens (LE), and is adapted to measure the eyeglasses lens (LE) by controlling the edge measurement portion to obtain the edge information.
  3. The eyeglasses lens processing system according to claim 2, further comprising:
    a target lens shape obtaining portion (20) for obtaining a target lens shape applied to the eyeglasses lens (LE),
    wherein the edge information obtaining portion (70) is adapted to control the edge measurement portion based on the target lens shape to obtain the edge information.
  4. The eyeglasses lens processing system according to any one of claims 1 to 3, further comprising:
    a finishing position setting portion for setting a finishing position of the edge based on the edge information of the eyeglasses lens (LE).
  5. The eyeglasses lens processing system according to claim 4,
    wherein the finishing position setting portion is adapted to display the finishing position based on the edge information on a display portion (2), and to set the finishing position based on an operation signal from an operation portion for adjusting the finishing position on the display portion (2).
  6. The eyeglasses lens processing system according to claim 5,
    wherein the finishing position setting portion is adapted to display the finishing position based on the edge information of a left eyeglasses lens (LE) and the finishing position based on the edge information of a right eyeglasses lens (LE) on the display portion (2) in a comparable manner, and to set the finishing position of at least one of the left eyeglasses lens (LE) and the right eyeglasses lens (LE) based on an operation signal from the operation portion.
  7. The eyeglasses lens processing system according to any one of claims 2 to 6, further comprising:
    a lens support portion (10) where the eyeglasses lens (LE) is mounted,
    wherein the edge measurement portion is adapted to obtain edge information of the eyeglasses lens (LE) mounted on the lens support portion (10).
  8. The eyeglasses lens processing system according to any one of claims 1 to 7,
    wherein the shaft alignment apparatus (1) is a cup attachment apparatus that includes a cup attachment portion (30) for attaching a cup for holding the eyeglasses lens (LE) with the lens holding portion (100) to the eyeglasses lens (LE) based on the shaft alignment position set by the shaft alignment position setting portion.
  9. An eyeglasses lens processing method of processing a peripheral edge of an eyeglasses lens, the method comprising:
    an edge information obtaining step (S6) of obtaining, by a shaft alignment apparatus (1), edge information of the eyeglasses lens (LE);
    a shaft alignment position setting step (S7) of setting, by the shaft alignment apparatus (1), a shaft alignment position being an attachment position with respect to the eyeglasses lens (LE) of a lens holding portion (100) that clamps and holds the eyeglasses lens (LE);
    a holding step (S8) of causing the lens holding portion (100) of a lens peripheral edge processing apparatus (90) to hold the eyeglasses lens (LE) based on the set shaft alignment position after the edge information obtaining step (S6) and the shaft alignment position setting step (S7) are executed;
    a process control data obtaining step (S11) of obtaining process control data based on the edge information after the edge information obtaining step (S6) and the shaft alignment position setting step (S7) are executed; and
    a processing step (S12) of controlling a processing tool of the lens peripheral edge processing apparatus (90) based on the process control data to process the eyeglasses lens (LE) held by the lens holding portion (100).
  10. The eyeglasses lens processing method according to claim 9, wherein the shaft alignment position setting step (S7) is a cup attachment step of attaching a cup (Cu) to the eyeglasses lens (LE).
EP18209972.1A 2017-12-05 2018-12-04 Eyeglasses lens processing system and eyeglasses lens processing method Active EP3495093B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2017233834A JP7087366B2 (en) 2017-12-05 2017-12-05 Axis setting device, spectacle lens processing system, and spectacle lens processing method

Publications (2)

Publication Number Publication Date
EP3495093A1 EP3495093A1 (en) 2019-06-12
EP3495093B1 true EP3495093B1 (en) 2023-08-09

Family

ID=64604483

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18209972.1A Active EP3495093B1 (en) 2017-12-05 2018-12-04 Eyeglasses lens processing system and eyeglasses lens processing method

Country Status (2)

Country Link
EP (1) EP3495093B1 (en)
JP (1) JP7087366B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7467896B2 (en) 2019-12-02 2024-04-16 株式会社ニデック Apparatus for processing periphery of eyeglass lenses, program for processing periphery of eyeglass lenses, and method for processing eyeglass lenses
KR102399153B1 (en) * 2020-07-27 2022-05-18 주식회사 휴비츠 Lens shape measuring device using multiple retroreflective materials with different luminance and measurement method thereof

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6012965A (en) * 1997-10-07 2000-01-11 Micro Optics Design Corp. Manufacturing ophthalmic lenses using lens structure cognition and spatial positioning system
JP3828685B2 (en) * 1999-08-31 2006-10-04 株式会社ニデック Cup mounting device
JP3842953B2 (en) 2000-04-28 2006-11-08 株式会社ニデック Cup mounting device
JP5202011B2 (en) 2007-02-28 2013-06-05 株式会社ニデック Lens meter
JP4970149B2 (en) 2007-05-31 2012-07-04 株式会社ニデック Cup mounting device
JP5134346B2 (en) 2007-11-30 2013-01-30 株式会社ニデック Eyeglass lens peripheral processing equipment
JP6338039B2 (en) * 2012-03-09 2018-06-06 株式会社ニデック Device with cup mounting unit
JP6187742B2 (en) * 2013-03-29 2017-08-30 株式会社ニデック Eyeglass lens processing equipment
JP6338042B2 (en) 2013-08-02 2018-06-06 株式会社ニデック Glasses parameter calculation device, glasses parameter calculation program
JP6390103B2 (en) 2014-01-14 2018-09-19 株式会社ニデック Lens peripheral processing apparatus and lens peripheral processing program

Also Published As

Publication number Publication date
EP3495093A1 (en) 2019-06-12
JP2019098484A (en) 2019-06-24
JP7087366B2 (en) 2022-06-21

Similar Documents

Publication Publication Date Title
CN110998225B (en) Spectacle frame shape measuring device and lens processing device
JP6236313B2 (en) Block device, spectacle lens manufacturing method and program
EP1997585A1 (en) Cup attaching apparatus
EP3495093B1 (en) Eyeglasses lens processing system and eyeglasses lens processing method
EP2636481B1 (en) Apparatus having cup attaching unit
JP2003329541A (en) Positioning method of eye point and eye point measuring system
JP2007526510A (en) Method for centering a spectacle lens in a centering-blocking device and associated centering-blocking device
JP7172029B2 (en) Alignment device
JP7243043B2 (en) Target shape measuring device
CN112577443B (en) Spectacle frame shape measuring device and lens processing device
EP3290892B1 (en) Lens measurement device
EP3575033B1 (en) Processing control data acquiring apparatus, processing control data acquiring method and processing control data acquiring program
JP6338042B2 (en) Glasses parameter calculation device, glasses parameter calculation program
JP7187799B2 (en) Spectacle lens periphery processing information acquisition device and spectacle lens periphery processing information acquisition program
JP7225644B2 (en) lens measuring device
JP7243706B2 (en) Lens shape acquisition device
KR102141448B1 (en) Apparatus having means for attaching cup
JP7156288B2 (en) Spectacle frame shape measuring device and spectacle frame shape measuring program
JP2022057964A (en) Spectacle frame shape measurement device and control program of spectacle frame shape measurement device
JP2022057965A (en) Spectacle frame shape measurement device and control program of spectacle frame shape measurement device
JP2018200411A (en) Loupe mounting position determination method for binocular loupe device, system, loupe mounting auxiliary tool, and assembly method of binocular loupe device
JP2006281331A (en) Suction tool fitting device of spectacle lens and suction tool fitting position determination method

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20191205

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20220713

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20230517

RIN1 Information on inventor provided before grant (corrected)

Inventor name: TAKEICHI, KYOJI

Inventor name: SAKUDA, RYUMA

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602018054921

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230821

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20230809

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1596949

Country of ref document: AT

Kind code of ref document: T

Effective date: 20230809

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231110

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231209

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231211

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231109

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231209

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231110

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20231110

Year of fee payment: 6

Ref country code: FR

Payment date: 20231108

Year of fee payment: 6

Ref country code: DE

Payment date: 20231031

Year of fee payment: 6

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230809