US20180373220A1 - System and Method for Designing Scleral Lenses - Google Patents


Publication number
US20180373220A1
US20180373220A1 (application US16/100,205)
Authority
US
United States
Prior art keywords
curve
generating
curves
haptic
lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/100,205
Inventor
Gregory Gemoules
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US16/100,205
Publication of US20180373220A1
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems, electric
    • G05B19/18 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/19 Numerical control characterised by positioning or contouring control systems, e.g. to control position from one programmed point to another or to control movement along a programmed continuous path
    • G05B19/21 Contouring control using an incremental digital measuring device
    • G05B19/23 Contouring control using an incremental digital measuring device for point-to-point control
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B29 WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29D PRODUCING PARTICULAR ARTICLES FROM PLASTICS OR FROM SUBSTANCES IN A PLASTIC STATE
    • B29D11/00 Producing optical elements, e.g. lenses or prisms
    • B29D11/00009 Production of simple or compound lenses
    • B29D11/00038 Production of contact lenses
    • B29D11/00105 Production of contact lenses covering a large part of the cornea
    • B29D11/00951 Measuring, controlling or regulating
    • B29D11/00961 Measuring, controlling or regulating using microprocessors or computers
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer, electric
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/36 Nc in input of data, input key till input tape
    • G05B2219/36204 Lathe, turning

Definitions

  • the present invention relates to optical systems.
  • the present invention relates to systems and methods for designing scleral lenses.
  • Keratoconus is a degenerative disorder of the eye in which structural changes within the cornea cause thinning and loss of curvature. Keratoconus can cause substantial distortion of vision, including double vision (diplopia), streaking and hyper-sensitivity to light. Keratoconus is typically diagnosed during adolescence. Debilitating deterioration in vision can occur.
  • LASIK laser-assisted in situ keratomileusis
  • flap complications occur in 0.3 to 5.7 percent of LASIK procedures, according to the April 2006 issue of the American Journal of Ophthalmology. These flap complications can lead to an irregularly shaped eye surface and distorted vision.
  • Irregular astigmatism also may occur from LASIK correction that is not centered properly on the eye or from irregular healing after the procedure.
  • epithelial ingrowth occurs when cells from the outer layer of the cornea, the epithelium, grow under the flap. In some cases, blurred vision or chronic discomfort can result.
  • DLK diffuse lamellar keratitis
  • corneal implants called “Intacs,” may be prescribed to hold the cornea in place.
  • U.S. Pat. No. 5,570,142 to Lieberman discloses a contact lens for asymmetric aspheric corneas with a peripheral portion shaped to fit the peripheral portion of the cornea as determined by a scan of the subject eye.
  • the contact lens is not substantially greater in diameter than the cornea.
  • the process for manufacturing the lens uses three-dimensional topographic data from points on the cornea. The data is used to shape at least the peripheral portion of the posterior surface of the lens.
  • the process in Lieberman is limited to corneal lenses only and is not suitable for scleral lenses.
  • U.S. Pat. No. 5,452,031 to Ducharme discloses a method for manufacturing a contact lens.
  • the contact lens is made through use of a computer implementing an approximation of the cornea.
  • Piecewise polynomials approximating the corneal topology have equal first and second derivatives where they join.
  • a polynomial representing the central optical portion of the lens and the piecewise polynomial adjacent to the central optical portion curve have an equal first derivative where they join.
  • a contact lens is cut corresponding to the lens surface defined by the piecewise polynomials.
  • the method in Ducharme requires an approximation of only the cornea, thereby leading to an inaccurate representation of the scleral surface and an uncomfortable fit.
  • the prior art fails to disclose or suggest a system and method for designing scleral lenses that conform to the shape of the eye surface. Therefore, there is a need in the art for a system and method for designing scleral lenses that accurately follows the shape of the eye surface such that the lens does not rotate or move excessively when worn, and can be worn comfortably and safely on the eye.
  • a system and method for designing scleral lenses includes a computer, a camera, and a lathe connected by a network.
  • the camera captures a set of sagittal images at each of a set of sagittal planes of an eye and sends the set of sagittal images to the computer.
  • a combined sagittal image is created at each sagittal plane from each set of sagittal images.
  • a spline curve is defined for each combined sagittal image to create a set of spline curves.
  • a set of back surface curves is created from the set of spline curves.
  • a back optical surface is generated.
  • a back haptic surface curve is created for each back surface curve creating a set of back haptic surface curves.
  • a back haptic surface is generated from the set of back haptic surface curves.
  • a blend surface is generated to connect the back haptic surface to the back optical surface to create the back lens surface. The process is repeated to create the front lens surface.
  • a front surface curve is created at each plane adjacent to the back surface to create a set of front surface curves.
  • a front optical surface is generated.
  • a front haptic surface curve is created for each front surface curve creating a set of front haptic surface curves.
  • a front haptic surface is generated from the set of front haptic surface curves.
  • a blend surface is generated to connect the front haptic surface to the front optical surface to create the front lens surface.
  • a “point cloud” is generated from the back lens surface and the front lens surface.
  • the point cloud is converted to a lens image.
  • the lens image is converted to a text file.
  • the text file is sent to the lathe and a scleral lens is cut by the lathe using the text file.
  • FIG. 1 is a schematic for a system for designing a scleral lens of a preferred embodiment.
  • FIG. 2 is a flowchart of a method for designing a scleral lens of a preferred embodiment.
  • FIG. 3A is a set of sagittal planes of an eye of a preferred embodiment.
  • FIG. 3B is a sagittal image captured on a sagittal plane of a preferred embodiment.
  • FIG. 3C is a sagittal image taken on a sagittal plane of a preferred embodiment.
  • FIG. 3D is a combined sagittal image of a preferred embodiment.
  • FIG. 4 is a spline curve on a corneal surface of a sagittal image of a preferred embodiment.
  • FIG. 5 is a near point curve on a sagittal image of a preferred embodiment.
  • FIG. 6 is a set of near point curves of a preferred embodiment.
  • FIG. 7A is a point cloud of an optical section of a preferred embodiment.
  • FIG. 7B is a set of back surface points of a preferred embodiment.
  • FIG. 7C is a point cloud of an optical section and an optical surface of a preferred embodiment.
  • FIG. 7D is an optical surface and a trim tool of a preferred embodiment.
  • FIG. 7E is an optical surface of a preferred embodiment.
  • FIG. 8 is a drawing of an optical back surface trimmed to a near point curve of a preferred embodiment.
  • FIG. 9 is a drawing of an optical back surface section and a haptic surface section of a preferred embodiment.
  • FIG. 10 is a drawing of a back surface of a lens of a preferred embodiment.
  • FIG. 11 is a drawing of a back surface of a lens, a spline of a front surface of a lens, and a front optical section of a preferred embodiment.
  • FIG. 12 is a drawing of a front surface and a back surface of a lens.
  • FIG. 13 is a drawing of a point cloud of a lens.
  • FIG. 14 is a drawing of a completed lens.
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Therefore, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combined software and hardware implementation that may all generally be referred to herein as a “circuit,” “module,” “component,” or “system.” Further, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
  • the computer readable media may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, but not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave.
  • the propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, or any suitable combination thereof.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, C#, .NET, Objective C, Ruby, Python, SQL, or other modern and commercially available programming languages.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions, when stored in the computer readable medium, produce an article of manufacture including instructions which, when executed, cause a computer to implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable instruction execution apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatuses or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • system 100 includes computer 101 , camera 102 , and lathe 103 , each connected to network 104 .
  • Computer 101 includes processor 105 and memory 106 connected to processor 105 .
  • Lens design process 107 is saved in memory 106 and executed by processor 105 .
  • camera 102 captures a set of sagittal images of an eye.
  • the set of sagittal images are sent to computer 101 through network 104 .
  • a lens is designed using lens design process 107 from the set of sagittal images, as will be further described below.
  • the lens design is sent to lathe 103 through network 104 .
  • Lathe 103 cuts the lens according to the design.
  • camera 102 is a Visante® optical coherence tomography (“OCT”) unit available from Carl Zeiss Meditec, Inc.
  • OCT optical coherence tomography
  • Other suitable imaging devices known in the art may be employed.
  • lathe 103 is a DAC ALM lens lathe available from DAC International, Inc. Other suitable lens lathes known in the art may be employed.
  • lens design process 200 will be described.
  • a back lens surface is first created that will rest on an eye.
  • a front lens surface is created second, based on the back lens surface, as will be further described below.
  • in step 201, a set of overlapping sagittal images is captured at each of a set of sagittal planes of an eye.
  • each set of sagittal images are combined to create a combined sagittal image for each sagittal plane.
  • each set of sagittal images are digitally “stitched” together using a graphics editing software program to create the combined sagittal image, as will be further described below.
  • portions of each set of sagittal images are cut and pasted together using a graphics editing software program to create the combined sagittal image.
  • Other image editing techniques known in the art may be employed.
  • a set of back surface curves are created.
  • a spline curve is defined on each combined sagittal image by tracing the surface of the eye to create a set of spline curves, as will be further described below.
  • a lens curve is also generated for each combined sagittal image by converting each spline curve to a near point curve at a predetermined distance from the eye surface, as will be further described below.
  • in step 204, an optical surface is generated.
  • a haptic surface curve is created for each lens curve to create a set of haptic surface curves.
  • a haptic surface is generated from the set of haptic surface curves.
  • in step 207, a blend surface is generated for the optical surface and the haptic surface.
  • in step 208, the haptic surface, the blend surface, and the optical surface are combined to form a lens surface.
  • in step 209, if the front surface has been created, then lens design process 200 proceeds to step 211 . If the front surface has not been created, then lens design process 200 proceeds to step 210 .
  • a set of front surface curves are created.
  • a front lens curve is generated at each sagittal plane by creating a near point curve at a predetermined distance from the back lens surface, as will be further described below.
  • Steps 204 , 205 , 206 , 207 , and 208 are repeated to create the front lens surface from the set of front surface curves.
  • in step 211, the back lens surface and front lens surface are converted to a point cloud.
  • in step 212, the point cloud is converted to a three-dimensional lens image.
  • in step 213, the three-dimensional lens image is converted to a text file.
  • in step 214, the lens is cut by a lathe using the text file.
  • the text file is a cutting pattern of the lens that guides the movement of the lathe.
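Steps 211 through 213 amount to merging the back and front surface samples into one point set and serializing it as plain text for the lathe. The sketch below illustrates that idea in Python; the function names and the one-point-per-line text format are illustrative assumptions, not the patent's actual lathe file format.

```python
# Illustrative sketch of steps 211-213: merge back and front surface
# samples into one point cloud, then serialize it as a plain-text
# cutting file. The "x y z" line format is an assumption.
import io

def surfaces_to_point_cloud(back_pts, front_pts):
    """Combine back and front surface samples into one (x, y, z) list."""
    return list(back_pts) + list(front_pts)

def point_cloud_to_text(points):
    """Serialize the point cloud as one 'x y z' line per point."""
    buf = io.StringIO()
    for x, y, z in points:
        buf.write(f"{x:.4f} {y:.4f} {z:.4f}\n")
    return buf.getvalue()

back = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.1)]
front = [(0.0, 0.0, 0.3), (1.0, 0.0, 0.4)]
cloud = surfaces_to_point_cloud(back, front)
text = point_cloud_to_text(cloud)
print(text)
```

In the patent the resulting text file is the cutting pattern that guides the lathe's movement; a real file would carry many more points sampled from both surfaces.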
  • Eye 301 has sagittal planes 302 , 303 , 304 , 305 , 306 , and 307 , and reference point 344 .
  • Angle a separates each of sagittal planes 302 , 303 , 304 , 305 , 306 , and 307 .
  • angle a is approximately 30°. In other embodiments, other angles may be used.
  • a set of sagittal images is captured at each of sagittal planes 302 , 303 , 304 , 305 , 306 , and 307 .
  • sagittal image 308 is captured from point 309 to point 310 and sagittal image 311 is captured from point 312 to point 313 .
  • sagittal image 314 is captured from point 315 to point 316 and sagittal image 317 is captured from point 318 to point 319 .
  • sagittal image 320 is captured from point 321 to point 322 and sagittal image 323 is captured from point 324 to point 325 .
  • sagittal image 326 is captured from point 327 to point 328 and sagittal image 329 is captured from point 330 to point 331 .
  • sagittal image 332 is captured from point 333 to point 334 and sagittal image 335 is captured from point 336 to point 337 .
  • sagittal image 338 is captured from point 339 to point 340 and sagittal image 341 is captured from point 342 to point 343 .
  • two overlapping sagittal images are captured at each sagittal plane.
  • the pupil of the subject eye is dilated.
  • a low power infrared light beam is centered on the dilated pupil to create the reference point.
  • the eye is moved laterally to move the infrared radiation beam off-center to capture the two overlapping sagittal images.
  • Other techniques known in the art may be employed.
  • sagittal image 308 spans from point 309 to point 310 .
  • Sagittal image 308 has reference point 344 , eye surface 345 , and pupillary center 349 .
  • Sagittal image 311 spans from point 312 to point 313 .
  • Sagittal image 311 has reference point 344 , eye surface 346 , and pupillary center 349 .
  • Combined sagittal image 347 includes sagittal image 308 and sagittal image 311 .
  • Sagittal image 308 has eye surface 345 .
  • Sagittal image 311 has eye surface 346 .
  • Combined sagittal image 347 has reference point 344 , combined eye surface 348 , and pupillary center 349 . Eye surfaces 345 and 346 form combined eye surface 348 .
  • each of eye surfaces 345 and 346 is a “surface profile” created by surface detection in the Visante® OCT unit when each of sagittal images 308 and 311 is captured.
  • sagittal image 308 and 311 are combined by overlapping eye surfaces 345 and 346 at pupillary center 349 to create a “best fit” combined eye surface.
  • sagittal images 308 and 311 are digitally “stitched” together using a graphics editing software.
  • sagittal images 308 and 311 are combined by overlapping eye surfaces 345 and 346 and aligning eye surfaces 345 and 346 at reference point 344 to create a “best fit” combined eye surface.
  • combined sagittal image 347 is created by digitally “stitching” sagittal images 308 and 311 using Adobe Photoshop software available from Adobe Systems, Inc.
  • sagittal images 308 and 311 are imported into Adobe Photoshop and combined using the “Photomerge” tool in Adobe Photoshop.
  • Other suitable image editing software known in the art may be employed.
  • portions of sagittal images 308 and 311 are “cut” and “pasted” together, using Adobe Photoshop to create combined sagittal image 347 .
  • Other image editing software and techniques known in the art may be employed.
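The patent stitches the two overlapping sagittal images with Photoshop's Photomerge tool. Purely as an illustration of the underlying idea of forming a "best fit" combined eye surface, the sketch below averages two overlapping 1-D surface profiles in their shared region; the function and sample values are hypothetical.

```python
# Minimal sketch of combining two overlapping surface profiles into one
# "best fit" profile by averaging where they overlap. This stands in
# for the image stitching described in the text; it is not Photomerge.
def combine_profiles(left, right, overlap):
    """left and right are lists of surface heights; the last `overlap`
    samples of `left` cover the same region as the first `overlap`
    samples of `right`."""
    merged = list(left[:-overlap])
    for a, b in zip(left[-overlap:], right[:overlap]):
        merged.append((a + b) / 2.0)   # average the shared region
    merged.extend(right[overlap:])
    return merged

left = [1.0, 1.2, 1.4, 1.6]
right = [1.5, 1.7, 1.9, 2.1]
print(combine_profiles(left, right, overlap=2))
```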
  • Combined sagittal image 401 includes eye surface 403 .
  • Eye surface 403 has surface apex 405 .
  • Spline curve 402 traces eye surface 403 and intersects surface apex 405 .
  • Spline curve 402 has a plurality of control points 404 to manipulate the curvature of spline curve 402 .
  • spline curve 402 and control points 404 are created by importing combined sagittal image 401 into CATIA Design software, available from Dassault Systemes Americas Corp., and drawing spline curve 402 and control points on eye surface 403 .
  • CATIA Design software available from Dassault Systemes Americas Corp.
  • Other computer-aided design (“CAD”), computer-aided manufacturing (“CAM”) and/or computer-aided engineering (“CAE”) software known in the art may be employed.
  • each of control points 404 is digitally “attached” to spline curve 402 at anatomical parts of eye surface 403 .
  • the density and location of each control points 404 depend on the curvature of eye surface 403 . Any of control points 404 may be repositioned in any direction by selecting and dragging any of control points 404 to alter spline curve 402 .
  • In a preferred embodiment, approximately 30 control points are attached to spline curve 402 . In other embodiments, other numbers of control points are employed.
  • haptic section 406 is defined along spline curve 402 by end point 407 and point 408 .
  • Optical section 409 is defined along spline curve 402 by point 410 and point 411 .
  • Haptic section 412 is defined along spline curve 402 by point 413 and point 414 .
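The spline drawn in the CAD software passes through control points attached along the eye surface. As a stand-in for that CAD spline, the sketch below evaluates a Catmull-Rom segment, a simple interpolating spline that passes through its control points; the control-point heights are hypothetical sample values.

```python
# Sketch of interpolating an eye-surface profile through control
# points. A Catmull-Rom segment is used as an illustrative stand-in
# for the CAD spline described in the text.
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a Catmull-Rom spline segment between p1 and p2 at t in [0, 1]."""
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

heights = [2.0, 1.0, 1.0, 2.0]      # control-point heights along the profile
mid = catmull_rom(*heights, t=0.5)  # interpolated height between the inner pair
print(mid)
```

Dragging a control point in the CAD tool corresponds to changing one of these input heights, which locally reshapes the curve, which matches the repositioning behavior described for control points 404.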
  • surface curve 501 is distance 503 from eye surface 502 at apex 516 .
  • Surface curve 501 has a plurality of control points 504 to manipulate the curvature and location of surface curve 501 .
  • distance 503 is in a range from approximately 0.35 mm to approximately 0.5 mm to accommodate for a “settling distance” of the lens on the eye.
  • the settling distance is in a range of approximately 0.2 mm to 0.25 mm. In other embodiments, other distances are employed.
  • each of control points 504 is digitally “attached” to surface curve 501 at anatomical parts of eye surface 502 .
  • the density and location of each control points 504 depend on the curvature of eye surface 502 . Any of control points 504 may be repositioned in any direction by selecting and dragging any of control points 504 to reposition surface curve 501 .
  • surface curve 501 is created by selecting and dragging control points 504 to reposition a portion of spline curve 402 in FIG. 4 by a predetermined distance from eye surface 502 .
  • Surface curve 501 contacts eye surface 502 at contact points 505 and 506 .
  • Surface curve 501 has end points 507 and 508 .
  • End point 507 is distance 509 from centerline 510 .
  • Contact point 505 is distance 511 from centerline 510 .
  • Contact point 506 is distance 512 from centerline 510 .
  • End point 508 is distance 513 from centerline 510 .
  • Haptic section curve 514 of surface curve 501 is defined by end point 507 and contact point 505 .
  • Haptic section curve 515 of surface curve 501 is defined by end point 508 and contact point 506 .
  • Haptic section curves 514 and 515 contact eye surface 502 .
  • the distances of surface curve 501 are listed in Table 1 below. In other embodiments, other distances are employed according to the desired lens design.
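The near point curve sits a settling clearance above the eye surface at the apex but meets the surface at the haptic contact points. The sketch below derives such a curve by adding a clearance that tapers to zero at both ends of the profile; the half-sine weighting and sample heights are illustrative assumptions, not the patent's method.

```python
# Hypothetical sketch of deriving a near point (lens) curve from the
# traced eye-surface curve: raise the profile by a settling clearance
# that is maximal at the apex and zero at the contact points.
import math

def near_point_curve(surface, clearance=0.4):
    """surface: list of heights along one sagittal profile; returns the
    heights offset by a clearance tapering to zero at both ends."""
    n = len(surface)
    offset = []
    for i, h in enumerate(surface):
        # half-sine weighting: 0 at the ends, 1 at the center
        w = math.sin(math.pi * i / (n - 1))
        offset.append(h + clearance * w)
    return offset

surface = [0.0, -0.5, -0.7, -0.5, 0.0]          # apex at the center
lens = near_point_curve(surface, clearance=0.4)  # 0.4 mm clearance
print(lens)
```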
  • set of surface curves 600 includes surface curves 601 , 602 , 603 , 604 , 605 , and 606 .
  • Surface curves 601 , 602 , 603 , 604 , 605 , and 606 are joined at apex 607 .
  • Each of surface curves 601 , 602 , 603 , 604 , 605 , and 606 is separated from each other by angle a.
  • Each of surface curves 601 , 602 , 603 , 604 , 605 , and 606 extends along a sagittal plane as shown in FIG. 3A .
  • Surface curve 601 extends along sagittal plane 305 .
  • Surface curve 602 extends along sagittal plane 304 .
  • Surface curve 603 extends along sagittal plane 303 .
  • Surface curve 604 extends along sagittal plane 302 .
  • Surface curve 605 extends along sagittal plane 307 .
  • Surface curve 606 extends along sagittal plane 306 .
  • Step 204 will be further described with reference to FIGS. 7A, 7B, 7C, 7D, 8, and 9 .
  • optical point cloud 700 is imported from a spreadsheet of predetermined point values.
  • Optical point cloud 700 is a set of three-dimensional coordinates.
  • Optical point cloud 700 has set of back surface points 701 and set of front surface points 702 , thickness 703 , diameter 704 , and height 705 , each of which may vary according to the desired lens design.
  • the predetermined point values are a set of polar coordinates generated from standard optical lens design methods known in the art that depend on the desired diameter, material, focal power, refractive index of the desired material, sagittal value of the curve of the desired material, and a minimum thickness of the desired material, and the general shape of the optical surfaces, i.e., spherical, aspherical, or toric.
  • the curvature of set of front surface points 702 depends on the curvature of set of back surface points 701 . For example, at a given diameter and a given set of back surface points the curve of the front optical surface will generally have a steeper curve as the focal power increases and a flatter curve as the focal power decreases.
  • the predetermined point values of set of back surface points 701 are determined by the method disclosed in U.S. application Ser. No. 13/277,139, filed on Oct. 19, 2011, which is incorporated herein by reference.
  • set of back surface points 701 is imported first and set of front surface points 702 is imported second.
  • set of back surface points 701 is separated from optical point cloud 700 by determining a first set of surface coordinates that, positioned on an eye, are closest to the pupillary center.
  • set of front surface points 702 is separated from optical point cloud 700 by determining a second set of surface coordinates that, when positioned on an eye, are furthest from the pupillary center.
  • set of back surface points 701 is imported and separated from optical point cloud 700 .
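Separating the optical point cloud into its back-surface (nearer the eye) and front-surface (farther from the eye) point sets can be sketched as a simple partition on the axial coordinate. The midpoint threshold below is an illustrative heuristic, not the patent's separation method, and the sample coordinates are hypothetical.

```python
# Sketch of splitting the optical point cloud into back- and
# front-surface sets. Back-surface points are those closest to the
# pupillary center; here that is approximated by a z-midpoint split.
def split_point_cloud(points):
    """points: list of (x, y, z); returns (back, front) split on the
    midpoint of the z range."""
    zs = [z for _, _, z in points]
    mid = (min(zs) + max(zs)) / 2.0
    back = [p for p in points if p[2] <= mid]   # nearer the eye
    front = [p for p in points if p[2] > mid]   # farther from the eye
    return back, front

cloud = [(0, 0, 0.0), (1, 0, 0.1), (0, 0, 0.5), (1, 0, 0.6)]
back, front = split_point_cloud(cloud)
print(len(back), len(front))
```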
  • Set of back surface points 701 has diameter 706 and height 707 .
  • surface 708 is generated from set of back surface points 701 by connecting the point values of set of back surface points 701 .
  • surface 708 is a best fit non-uniform rational B-spline (“NURBS”) surface.
  • NURBS non-uniform rational B-spline
  • surface 708 is a t-spline surface.
  • surface 708 is a subdivision surface.
  • Other surface models known in the art may be employed.
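Generating surface 708 means fitting a smooth surface through the imported points. Real NURBS, T-spline, or subdivision fitting needs a CAD kernel or a dedicated library; as a minimal stand-in, the sketch below evaluates a bilinear patch through four corner heights to show how a surface interpolates scattered point values (corner values and parameters are hypothetical).

```python
# Minimal stand-in for fitting a surface through back-surface points:
# bilinear interpolation over a unit patch defined by four corner
# heights. A production system would use a NURBS library instead.
def bilinear(z00, z10, z01, z11, u, v):
    """Interpolate height on a unit patch at parameters (u, v) in [0, 1]."""
    return (z00 * (1 - u) * (1 - v) + z10 * u * (1 - v)
            + z01 * (1 - u) * v + z11 * u * v)

center = bilinear(0.0, 1.0, 1.0, 2.0, 0.5, 0.5)  # patch center height
print(center)
```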
  • Optical surface 710 has diameter 711 and height 712 that match diameter 706 and height 707 of set of back surface points 701 , respectively.
  • Optical surface 901 has surface edge 916 .
  • Optical surface 901 is overlaid on surface curve 902 and eye surface 903 .
  • Surface edge 916 of optical surface 901 is aligned with surface curve 902 through control points 920 and 921 by selecting and dragging optical surface 901 to reposition optical surface 901 .
  • Surface curve 902 is further aligned with surface edge 916 of optical surface 901 at control points 920 and 921 by selecting and dragging control points 920 and 921 so that surface curve 902 is substantially tangential to surface edge 916 at control points 920 and 921 .
  • Optical surface 901 is trimmed with trim tool 918 to create surface edge 917 and define surface edge 916 from point 920 to point 921 .
  • Surface edge 917 is adjacent eye surface 903 and does not contact eye surface 903 .
  • Surface curve 902 is trimmed with trim tool 918 by deleting portions of surface curve 902 covered by trim tool 918 to create haptic surface curves 904 and 915 .
  • Haptic surface curve 904 is defined from point 922 to point 926 .
  • Haptic surface curve 915 is defined from point 927 to point 931 .
  • Haptic surface curves 904 and 915 contact eye surface 903 .
  • the slope of haptic surface curve 904 may be altered using points 922 , 923 , 924 , 925 , and 926 by selecting and dragging any of points 922 , 923 , 924 , 925 , and 926 to reposition haptic surface curve 904 .
  • the slope of haptic surface curve 915 may be altered using points 927 , 928 , 929 , 930 , and 931 by selecting and dragging any of points 927 , 928 , 929 , 930 , and 931 to reposition haptic surface curve 915 . Altering the slopes of haptic surface curves 904 and 915 provides a better fit of haptic surface curves 904 and 915 on eye surface 903 .
  • Each of haptic surface curves 905 , 906 , 907 , 908 , 909 , 910 , 911 , 912 , 913 , and 914 is created as previously described.
  • Each of haptic surface curves 904 , 905 , 906 , 907 , 908 , 909 , 910 , 911 , 912 , 913 , 914 , and 915 extends along a sagittal plane as shown in FIG. 3A .
  • Each of haptic surface curves 904 , 905 , 906 , 907 , 908 , 909 , 910 , 911 , 912 , 913 , 914 , and 915 is separated from each other by angle a.
  • Haptic surface curves 904 and 915 extend along sagittal plane 302 .
  • Haptic surface curves 905 and 914 extend along sagittal plane 303 .
  • Haptic surface curves 907 and 912 extend along sagittal plane 304 .
  • Haptic surface curves 909 and 910 extend along sagittal plane 305 .
  • Haptic surface curves 911 and 908 extend along sagittal plane 306 .
  • Haptic surface curves 913 and 906 extend along sagittal plane 307 .
  • Haptic surface 1002 is generated to connect haptic surface curves 1003 , 1004 , 1005 , 1006 , 1007 , 1008 , 1009 , 1010 , 1011 , and 1012 .
  • haptic surface 1002 is a best fit NURBS surface. In another embodiment, haptic surface 1002 is a t-spline surface. In another embodiment, haptic surface 1002 is a subdivision surface. Other surface models known in the art may be employed.
  • optical surface 1001 is a best fit NURBS surface. In another embodiment, optical surface 1001 is a t-spline surface. In another embodiment, optical surface 1001 is a subdivision surface. Other surface models known in the art may be employed.
  • Blend surface 1103 is a best fit surface generated to connect optical surface 1102 with haptic surface 1104 .
  • Lens surface 1101 includes optical surface 1102 , blend surface 1103 attached to optical surface 1102 , and haptic surface 1104 attached to blend surface 1103 .
  • blend surface 1103 is a best fit NURBS surface. In another embodiment, blend surface 1103 is a t-spline surface. In another embodiment, blend surface 1103 is a subdivision surface. Other surface models known in the art may be employed.
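The "best fit" surface idea recurring in the embodiments above can be illustrated with a minimal sketch. This is not the NURBS, t-spline, or subdivision fitting a CAD package performs; it is a plain least-squares biquadratic fit, used only to show how scattered lens-surface samples are reduced to one smooth surface. The function name and the paraboloid sample data are illustrative assumptions.

```python
import numpy as np

def fit_quadric(points):
    """Least-squares best-fit surface z = a + bx + cy + dx^2 + exy + fy^2.

    points: iterable of (x, y, z) surface samples.
    Returns a callable z = f(x, y). A stand-in for the NURBS/t-spline/
    subdivision fits named in the text, showing only the best-fit idea.
    """
    p = np.asarray(points, dtype=float)
    x, y, z = p[:, 0], p[:, 1], p[:, 2]
    basis = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coef, *_ = np.linalg.lstsq(basis, z, rcond=None)

    def f(xq, yq):
        return float(np.dot([1.0, xq, yq, xq * xq, xq * yq, yq * yq], coef))

    return f
```

Fitting exact samples of a paraboloid (a crude lens-like surface) recovers the surface to floating-point accuracy; real eye-surface data would instead be smoothed by the fit.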
  • Each front surface curve 1202 is generated adjacent to back lens surface 1201 and extends along a sagittal plane as shown in FIG. 3A .
  • front surface curve 1202 extends along sagittal plane 302.
  • Front surface curve 1202 is constrained by back lens surface 1201 .
  • Front surface curve 1202 has haptic surface curves 1217 and 1218 .
  • Front surface curve 1202 is trimmed as previously described to create haptic surface curves 1217 and 1218 .
  • Haptic surface curve 1217 is defined from end point 1206 to point 1210 .
  • Haptic surface curve 1218 is defined from point 1212 to end point 1214 .
  • Front optical surface 1220 is overlaid onto front surface curve 1202 .
  • Edge 1225 of optical surface 1220 is aligned with front surface curve 1202 through points 1221, 1222, 1223, 1224, and 1204 by selecting and dragging front optical surface 1220, as previously described.
  • Optical surface 1220 is trimmed as previously described to define optical surface 1220 from point 1222 to point 1223 .
  • Point 1204 is distance 1205 from back lens surface apex 1203 .
  • End point 1206 is distance 1207 from center line 1208 and distance 1209 from back lens surface 1201 .
  • Point 1210 is distance 1211 from back lens surface 1201.
  • Point 1212 is distance 1213 from back lens surface 1201 .
  • End point 1214 is distance 1215 from center line 1208 and distance 1216 from back lens surface 1201 .
  • Distance 1205 depends on the desired lens design. In a preferred embodiment, distance 1205 is in a range from 0.15 mm to 0.4 mm.
  • the distances of front surface curve 1202 are listed in Table 2 below. In other embodiments, other distances are employed according to the desired lens design.
  • lens 1300 includes back lens surface 1301 and front lens surface 1302 adjacent to back lens surface 1301.
  • Front lens surface 1302 includes front haptic surface 1303 , front blend surface 1304 attached to haptic surface 1303 , and front optical surface 1305 attached to front blend surface 1304 .
  • Each of front haptic surface 1303, front blend surface 1304, and front optical surface 1305 is generated as previously described with respect to the back surface.
  • front haptic surface 1303 is a best fit NURBS surface. In another embodiment, front haptic surface 1303 is a t-spline surface. In another embodiment, front haptic surface 1303 is a subdivision surface. Other surface models known in the art may be employed.
  • front blend surface 1304 is a best fit NURBS surface. In another embodiment, front blend surface 1304 is a t-spline surface. In another embodiment, front blend surface 1304 is a subdivision surface. Other surface models known in the art may be employed.
  • front optical surface 1305 is a best fit NURBS surface. In another embodiment, front optical surface 1305 is a t-spline surface. In another embodiment, front optical surface 1305 is a subdivision surface. Other surface models known in the art may be employed.
  • Point cloud 1401 includes front lens point cloud 1402 and back lens point cloud 1403 . Front lens point cloud 1402 and back lens point cloud 1403 are separated by thickness 1406 . Thickness 1406 varies depending on the shape of back lens point cloud 1403 and the desired lens design. Point cloud 1401 further includes point density 1404 and separation distance 1405 . Point cloud 1401 is a collection of points. Each point is a three-dimensional coordinate to model a three-dimensional lens.
  • point density 1404 is approximately 0.1 mm. Other densities may be employed.
  • separation distance 1405 is approximately 0.3 mm. Other distances may be employed.
  • thickness 1406 is approximately 0.25 mm at apex 1407. In other embodiments, thickness 1406 varies according to the shape of back lens point cloud 1403 and the desired lens design.
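The two-shell structure of point cloud 1401 can be sketched in a few lines: back-surface points sampled over a cap, and front-surface points offset outward along the surface normal by the lens thickness. The spherical cap, radius, and sample counts here are illustrative assumptions, not the patent's actual back-surface geometry, which follows the fitted eye-surface curves.

```python
import math

def lens_point_cloud(radius=7.8, thickness=0.25, n_theta=24, n_phi=8):
    """Sketch of a two-shell lens point cloud (all values illustrative).

    Back points lie on a spherical cap; each front point is its back point
    offset outward along the unit surface normal by `thickness`, mirroring
    how front lens point cloud 1402 sits thickness 1406 above back lens
    point cloud 1403.
    """
    back, front = [], []
    for i in range(n_phi):
        phi = (i / n_phi) * (math.pi / 4)            # from apex toward edge
        for j in range(n_theta):
            theta = (j / n_theta) * 2 * math.pi
            n = (math.sin(phi) * math.cos(theta),    # outward unit normal
                 math.sin(phi) * math.sin(theta),
                 math.cos(phi))
            back.append(tuple(radius * c for c in n))
            front.append(tuple((radius + thickness) * c for c in n))
    return back, front
```

Because front points are offset along unit normals, corresponding back/front points are exactly `thickness` apart, matching the constant apex thickness described above.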
  • Lens image 1500 has front lens surface image 1501 , back lens surface image 1502 , and lens thickness 1503 .
  • Lens thickness 1503 varies depending on the shape of back lens surface image 1502 and the desired lens design.
  • lens thickness 1503 is approximately 0.25 mm at apex 1504 .

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Eyeglasses (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)

Abstract

A system and method for designing scleral lenses includes a computer, a camera, and a lathe connected by a network. The camera captures a set of sagittal images at each of a set of sagittal planes of an eye. A combined sagittal image is created from each set of sagittal images. A spline curve is defined for each combined sagittal image, creating a set of spline curves. A set of back surface curves is created from the set of spline curves. A back lens surface is generated from the set of back surface curves. A front lens surface is generated adjacent to the back lens surface. A point cloud is generated from the back lens surface and the front lens surface. The point cloud is converted to a lens image. The lens image is converted to a text file for use by the lathe to cut the scleral lens.

Description

    FIELD OF THE INVENTION
  • The present invention relates to optical systems. In particular, the present invention relates to systems and methods for designing scleral lenses.
  • BACKGROUND OF THE INVENTION
  • Keratoconus is a degenerative disorder of the eye in which structural changes within the cornea cause thinning and loss of curvature. Keratoconus can cause substantial distortion of vision, including double vision (diplopia), streaking and hyper-sensitivity to light. Keratoconus is typically diagnosed during adolescence. Debilitating deterioration in vision can occur.
  • Refractive surgical procedures, such as Laser-Assisted In Situ Keratomileusis (“LASIK”), are often prescribed to correct common vision disorders. In general, the LASIK procedure is performed by making a thin flap of corneal tissue, folding the flap out of the way, altering the shape of the cornea by subtracting tissue using an excimer laser, then replacing the flap.
  • Despite the many advantages of LASIK, severe side effects can occur. For example, halos, starbursts, loss of low-light sensitivity and dryness are common side effects of the procedure. In other less common side effects, the flap may fail to adhere properly to the eye's surface or may cause microscopic wrinkles in the flap called corneal striae. Studies indicate that flap complications occur in from 0.3 to 5.7 percent of LASIK procedures, according to the April 2006 issue of American Journal of Ophthalmology. These flap complications can lead to an irregularly shaped eye surface and distorted vision.
  • Irregular astigmatism also may occur from LASIK correction that is not centered properly on the eye or from irregular healing after the procedure.
  • In another side effect, epithelial ingrowth occurs when cells from the outer layer of the cornea, the epithelium, grow under the flap. In some cases, blurred vision or chronic discomfort can result.
  • In yet another side effect, diffuse lamellar keratitis (“DLK”), nicknamed “Sands of the Sahara,” occurs, in which inflammation develops under the LASIK flap. If the inflammation is uncontrolled, as in DLK, it can interfere with healing and cause vision loss.
  • The prior art has provided different methods to compensate for the side effects of LASIK surgery; however, none is completely satisfactory. For example, corneal implants, called “Intacs,” may be prescribed to hold the cornea in place.
  • Another remedy for LASIK side effects is gas permeable contact lenses, such as scleral lenses, which are prescribed as a cost effective and safe remedy. However, the prior art methods to design and construct scleral lenses, so far, have been ineffective to produce an adequate “fit” at an economical price, so that the remedy cannot be widely prescribed or used.
  • For example, U.S. Pat. No. 5,570,142 to Lieberman discloses a contact lens for asymmetric aspheric corneas with a peripheral portion shaped to fit the peripheral portion of the cornea as determined by a scan of the subject eye. The contact lens is not substantially greater in diameter than the cornea. The process for manufacturing the lens uses three-dimensional topographic data from points on the cornea. The data is used to shape at least the peripheral portion of the posterior surface of the lens. However, the process in Lieberman is limited to corneal lenses and is not suitable for scleral lenses.
  • U.S. Pat. No. 5,452,031 to Ducharme discloses a method for manufacturing a contact lens. The contact lens is made through use of a computer implementing an approximation of the cornea. Piecewise polynomials approximating the corneal topology have equal first and second derivatives where they join. A polynomial representing the central optical portion of the lens and the piecewise polynomial adjacent to the central optical portion curve have an equal first derivative where they join. A contact lens is cut corresponding to the lens surface defined by the piecewise polynomials. However, the method in Ducharme relies on an approximation of only the cornea, thereby leading to an inaccurate representation of the scleral surface and an uncomfortable fit.
  • The prior art fails to disclose or suggest a system and method for designing scleral lenses that conform to the shape of the eye surface. Therefore, there is a need in the art for a system and method for designing scleral lenses that accurately follow the shape of the eye surface such that the lens does not rotate or move excessively when worn, and can be worn comfortably and safely on the eye.
  • SUMMARY
  • In a preferred embodiment, a system and method for designing scleral lenses is disclosed. The system includes a computer, a camera, and a lathe connected by a network.
  • In a preferred embodiment, the camera captures a set of sagittal images at each of a set of sagittal planes of an eye and sends the set of sagittal images to the computer. A combined sagittal image is created at each sagittal plane from each set of sagittal images. A spline curve is defined for each combined sagittal image to create a set of spline curves. A set of back surface curves is created from the set of spline curves. A back optical surface is generated. A back haptic surface curve is created for each back surface curve creating a set of back haptic surface curves. A back haptic surface is generated from the set of back haptic surface curves. A blend surface is generated to connect the back haptic surface to the back optical surface to create the back lens surface. The process is repeated to create the front lens surface.
  • A front surface curve is created at each plane adjacent to the back surface to create a set of front surface curves. A front optical surface is generated. A front haptic surface curve is created for each front surface curve creating a set of front haptic surface curves. A front haptic surface is generated from the set of front haptic surface curves. A blend surface is generated to connect the front haptic surface to the front optical surface to create the front lens surface.
  • A “point cloud” is generated from the back lens surface and the front lens surface. The point cloud is converted to a lens image. The lens image is converted to a text file. The text file is sent to the lathe and a scleral lens is cut by the lathe using the text file.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The disclosed embodiments will be described with reference to the accompanying drawings.
  • FIG. 1 is a schematic for a system for designing a scleral lens of a preferred embodiment.
  • FIG. 2 is a flowchart of a method for designing a scleral lens of a preferred embodiment.
  • FIG. 3A is a set of sagittal planes of an eye of a preferred embodiment.
  • FIG. 3B is a sagittal image captured on a sagittal plane of a preferred embodiment.
  • FIG. 3C is a sagittal image taken on a sagittal plane of a preferred embodiment.
  • FIG. 3D is a combined sagittal image of a preferred embodiment.
  • FIG. 4 is a spline curve on a corneal surface of a sagittal image of a preferred embodiment.
  • FIG. 5 is a near point curve on a sagittal image of a preferred embodiment.
  • FIG. 6 is a set of near point curves of a preferred embodiment.
  • FIG. 7A is a point cloud of an optical section of a preferred embodiment.
  • FIG. 7B is a set of back surface points of a preferred embodiment.
  • FIG. 7C is a point cloud of an optical section and an optical surface of a preferred embodiment.
  • FIG. 7D is an optical surface and a trim tool of a preferred embodiment.
  • FIG. 7E is an optical surface of a preferred embodiment.
  • FIG. 8 is a drawing of an optical back surface trimmed to a near point curve of a preferred embodiment.
  • FIG. 9 is a drawing of an optical back surface section and a haptic surface section of a preferred embodiment.
  • FIG. 10 is a drawing of a back surface of a lens of a preferred embodiment.
  • FIG. 11 is a drawing of a back surface of a lens, a spline of a front surface of a lens, and a front optical section of a preferred embodiment.
  • FIG. 12 is a drawing of a front surface and a back surface of a lens.
  • FIG. 13 is a drawing of a point cloud of a lens.
  • FIG. 14 is a drawing of a completed lens.
  • DETAILED DESCRIPTION
  • It will be appreciated by those skilled in the art that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or context including any new and useful process, machine, manufacture, or a composition of matter, or any new and useful improvement thereof. Therefore, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or combining software and hardware implementation that may all generally be referred to herein as a “circuit,” “module,” “component,” or “system.” Further, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
  • Any combination of one or more computer readable media may be utilized. The computer readable media may be a computer readable signal medium or a computer readable storage medium. For example, a computer readable storage medium may be, but is not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the computer readable storage medium would include, but are not limited to: a portable computer diskette, a hard disk, a random access memory (“RAM”), a read-only memory (“ROM”), an erasable programmable read-only memory (“EPROM” or Flash memory), an appropriate optical fiber with a repeater, a portable compact disc read-only memory (“CD-ROM”), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. Thus, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. The propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, or any suitable combination thereof.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, C#, .NET, Objective C, Ruby, Python, SQL, or other modern and commercially available programming languages.
  • Aspects of the present disclosure are described with reference to flowchart illustrations and/or block diagrams of methods, systems and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable instruction execution apparatus, create a mechanism for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that, when executed, can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions when stored in the computer readable medium produce an article of manufacture including instructions which, when executed, cause a computer to implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable instruction execution apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatuses or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • Referring to FIG. 1, system 100 includes computer 101, camera 102, and lathe 103, each connected to network 104. Computer 101 includes processor 105 and memory 106 connected to processor 105. Lens design process 107 is saved in memory 106 and executed by processor 105.
  • In use, camera 102 captures a set of sagittal images of an eye. The set of sagittal images are sent to computer 101 through network 104. A lens is designed using lens design process 107 from the set of sagittal images, as will be further described below. The lens design is sent to lathe 103 through network 104. Lathe 103 cuts the lens according to the design.
  • In a preferred embodiment, camera 102 is a Visante® optical coherence tomography (“OCT”) unit available from Carl Zeiss Meditec, Inc. Other suitable imaging devices known in the art may be employed.
  • In a preferred embodiment, lathe 103 is a DAC ALM lens lathe available from DAC International, Inc. Other suitable lens lathes known in the art may be employed.
  • Referring to FIG. 2, lens design process 200 will be described.
  • In a preferred embodiment, a back lens surface is first created that will rest on an eye. In this embodiment, a front lens surface is created second, based on the back lens surface, as will be further described below.
  • In step 201, a set of overlapping sagittal images is captured at each of a set of sagittal planes of an eye.
  • In step 202, each set of sagittal images are combined to create a combined sagittal image for each sagittal plane.
  • In a preferred embodiment, each set of sagittal images are digitally “stitched” together using a graphics editing software program to create the combined sagittal image, as will be further described below.
  • In another embodiment, portions of each set of sagittal images are cut and pasted together using a graphics editing software program to create the combined sagittal image. Other image editing techniques known in the art may be employed.
  • In step 203, a set of back surface curves are created. A spline curve is defined on each combined sagittal image by tracing the surface of the eye to create a set of spline curves, as will be further described below. A lens curve is also generated for each combined sagittal image by converting each spline curve to a near point curve at a predetermined distance from the eye surface, as will be further described below.
  • In step 204, an optical surface is generated.
  • In step 205, a haptic surface curve is created for each lens curve to create a set of haptic surface curves.
  • In step 206, a haptic surface is generated from the set of haptic surface curves.
  • In step 207, a blend surface is generated for the optical surface and the haptic surface.
  • In step 208, the haptic surface, the blend surface, and the optical surface are combined to form a lens surface.
  • In step 209, if the front surface has been created, then lens design process 200 proceeds to step 211. If the front surface has not been created, then lens design process 200 proceeds to step 210.
  • In step 210, a set of front surface curves are created. In this embodiment, a front lens curve is generated at each sagittal plane by creating a near point curve at a predetermined distance from the back lens surface, as will be further described below.
  • Steps 204, 205, 206, 207, and 208 are repeated to create the front lens surface from the set of front surface curves.
  • In step 211, the back lens surface and front lens surface are converted to a point cloud.
  • In step 212, the point cloud is converted to a three-dimensional lens image.
  • In step 213, the three-dimensional lens image is converted to a text file. In step 214, the lens is cut by a lathe using the text file. In this step, the text file is a cutting pattern of the lens that guides the movement of the lathe.
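The export of step 213 can be sketched as a small writer. The actual file layout consumed by the lathe is vendor-specific and not given in the text, so this hypothetical format (one "x y z" coordinate triple in millimeters per line) is only an assumption standing in for the real cutting pattern.

```python
def write_cut_file(points, path):
    """Hypothetical text-file export in the spirit of step 213.

    points: iterable of (x, y, z) lens coordinates in mm.
    Writes one space-separated triple per line; the real lathe format
    is vendor-specific and would differ.
    """
    with open(path, "w") as f:
        for x, y, z in points:
            f.write(f"{x:.4f} {y:.4f} {z:.4f}\n")
```

A plain text format like this keeps the hand-off to the lathe (step 214) inspectable: the file can be opened and checked before cutting.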
  • Referring to FIG. 3A, step 201 will be further described. Eye 301 has sagittal planes 302, 303, 304, 305, 306, and 307, and reference point 344. Angle a separates each of sagittal planes 302, 303, 304, 305, 306, and 307.
  • In the preferred embodiment, six planes are employed. In other embodiments, other numbers of planes may be employed.
  • In a preferred embodiment, angle a is approximately 30°. In other embodiments, other angles may be used.
  • A set of sagittal images is captured at each of sagittal planes 302, 303, 304, 305, 306, and 307. At sagittal plane 302, sagittal image 308 is captured from point 309 to point 310 and sagittal image 311 is captured from point 312 to point 313. At sagittal plane 303, sagittal image 314 is captured from point 315 to point 316 and sagittal image 317 is captured from point 318 to point 319. At sagittal plane 304, sagittal image 320 is captured from point 321 to point 322 and sagittal image 323 is captured from point 324 to point 325. At sagittal plane 305, sagittal image 326 is captured from point 327 to point 328 and sagittal image 329 is captured from point 330 to point 331. At sagittal plane 306, sagittal image 332 is captured from point 333 to point 334 and sagittal image 335 is captured from point 336 to point 337. At sagittal plane 307, sagittal image 338 is captured from point 339 to point 340 and sagittal image 341 is captured from point 342 to point 343.
  • In a preferred embodiment, two overlapping sagittal images are captured at each sagittal plane. In this embodiment, the pupil of the subject eye is dilated. A low power infrared light beam is centered on the dilated pupil to create the reference point. The eye is moved laterally to move the infrared radiation beam off-center to capture the two overlapping sagittal images. Other techniques known in the art may be employed.
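The capture geometry above reduces to a small calculation: six sagittal planes separated by angle a (approximately 30° in the preferred embodiment), with two overlapping images per plane. This sketch, with its assumed function name, only makes those numbers explicit.

```python
def capture_plan(n_planes=6, separation_deg=30.0, images_per_plane=2):
    """Plane orientations and total image count for the capture of FIG. 3A.

    Returns the rotation angle of each sagittal plane (degrees from the
    first plane) and the total number of sagittal images to capture.
    """
    angles = [i * separation_deg for i in range(n_planes)]
    return angles, n_planes * images_per_plane
```

With the preferred values, the planes span 0° through 150° and twelve images are captured, matching images 308 through 341 in the text.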
  • Referring to FIGS. 3B and 3C by way of example, sagittal image 308 spans from point 309 to point 310. Sagittal image 308 has reference point 344, eye surface 345, and pupillary center 349. Sagittal image 311 spans from point 312 to point 313. Sagittal image 311 has reference point 344, eye surface 346, and pupillary center 349.
  • Referring to FIG. 3D, step 202 will be further described. Combined sagittal image 347 includes sagittal image 308 and sagittal image 311. Sagittal image 308 has eye surface 345. Sagittal image 311 has eye surface 346. Combined sagittal image 347 has reference point 344, combined eye surface 348, and pupillary center 349. Eye surfaces 345 and 346 form combined eye surface 348.
  • In a preferred embodiment, each of eye surfaces 345 and 346 is a “surface profile” created by surface detection in the Visante® OCT unit when each of sagittal images 308 and 311 is captured.
  • In a preferred embodiment, sagittal images 308 and 311 are combined by overlapping eye surfaces 345 and 346 at pupillary center 349 to create a “best fit” combined eye surface. In this embodiment, sagittal images 308 and 311 are digitally “stitched” together using graphics editing software.
  • In another embodiment, sagittal images 308 and 311 are combined by overlapping eye surfaces 345 and 346 and aligning eye surfaces 345 and 346 at reference point 344 to create a “best fit” combined eye surface.
  • In a preferred embodiment, combined sagittal image 347 is created by digitally “stitching” sagittal images 308 and 311 using Adobe Photoshop software available from Adobe Systems, Inc. In this embodiment, sagittal images 308 and 311 are imported into Adobe Photoshop and combined using the “Photomerge” tool in Adobe Photoshop. Other suitable image editing software known in the art may be employed.
  • In another embodiment, portions of sagittal images 308 and 311 are “cut” and “pasted” together, using Adobe Photoshop to create combined sagittal image 347. Other image editing software and techniques known in the art may be employed.
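The stitching described above can be illustrated with a toy one-dimensional analogue. The patent merges full 2-D images in Photoshop's Photomerge; this sketch only shows the merge-at-overlap idea on two surface-profile sample lists that share some samples around the pupillary center. The function and its `overlap` parameter are illustrative assumptions.

```python
def stitch_profiles(left, right, overlap):
    """Merge two 1-D surface profiles that share `overlap` trailing/leading
    samples: average the shared samples, concatenate the rest.

    Toy analogue of stitching sagittal images 308 and 311 into combined
    eye surface 348; real stitching operates on 2-D images.
    """
    merged = list(left[:-overlap])
    merged.extend((a + b) / 2.0 for a, b in zip(left[-overlap:], right[:overlap]))
    merged.extend(right[overlap:])
    return merged
```

Averaging in the overlap region is one simple way to realize the “best fit” combination the text describes; cut-and-paste (the alternative embodiment) would instead keep one profile's samples outright.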
  • Referring to FIG. 4, step 203 will be further described. Combined sagittal image 401 includes eye surface 403. Eye surface 403 has surface apex 405. Spline curve 402 traces eye surface 403 and intersects surface apex 405. Spline curve 402 has a plurality of control points 404 to manipulate the curvature of spline curve 402.
  • In a preferred embodiment, spline curve 402 and control points 404 are created by importing combined sagittal image 401 into CATIA Design software, available from Dassault Systemes Americas Corp., and drawing spline curve 402 and control points on eye surface 403. Other computer-aided design (“CAD”), computer-aided manufacturing (“CAM”) and/or computer-aided engineering (“CAE”) software known in the art may be employed.
  • In a preferred embodiment, each of control points 404 is digitally “attached” to spline curve 402 at anatomical parts of eye surface 403. The density and location of control points 404 depend on the curvature of eye surface 403. Any of control points 404 may be repositioned in any direction by selecting and dragging to alter spline curve 402.
  • In a preferred embodiment, approximately 30 control points are attached to spline curve 402. In other embodiments, other numbers of control points are employed.
  • In a preferred embodiment, haptic section 406 is defined along spline curve 402 by end point 407 and point 408. Optical section 409 is defined along spline curve 402 by point 410 and point 411. Haptic section 412 is defined along spline curve 402 by point 413 and point 414.
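A spline that passes through its control points, as spline curve 402 traces eye surface 403, can be sketched with a Catmull-Rom segment. This is a generic interpolating spline standing in for whatever spline type the CAD software uses (the text does not specify one); moving a control point reshapes the curve locally, as in FIG. 4.

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate one Catmull-Rom segment between p1 and p2 at t in [0, 1].

    Points are (x, y) tuples. The curve interpolates its control points,
    so the traced curve passes through them, as spline curve 402 passes
    through control points 404.
    """
    t2, t3 = t * t, t * t * t
    return tuple(
        0.5 * (2 * b + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t2
               + (-a + 3 * b - 3 * c + d) * t3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )
```

Chaining one such segment per consecutive control-point quadruple gives the full traced curve; dragging any interior point changes only the nearby segments.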
  • Referring to FIG. 5, surface curve 501 is distance 503 from eye surface 502 at apex 516. Surface curve 501 has a plurality of control points 504 to manipulate the curvature and location of surface curve 501.
  • In a preferred embodiment, distance 503 is in a range from approximately 0.35 mm to approximately 0.5 mm to accommodate for a “settling distance” of the lens on the eye. In this embodiment, the settling distance is in a range of approximately 0.2 mm to 0.25 mm. In other embodiments, other distances are employed.
  • In a preferred embodiment, each of control points 504 is digitally “attached” to surface curve 501 at anatomical parts of eye surface 502. The density and location of control points 504 depend on the curvature of eye surface 502. Any of control points 504 may be repositioned in any direction by selecting and dragging to reposition surface curve 501.
  • In a preferred embodiment, surface curve 501 is created by selecting and dragging control points 504 to reposition a portion of spline curve 402 in FIG. 4 by a predetermined distance from eye surface 502.
  • Surface curve 501 contacts eye surface 502 at contact points 505 and 506. Surface curve 501 has end points 507 and 508. End point 507 is distance 509 from centerline 510. Contact point 505 is distance 511 from centerline 510. Contact point 506 is distance 512 from centerline 510. End point 508 is distance 513 from centerline 510. Haptic section curve 514 of surface curve 501 is defined by end point 507 and contact point 505. Haptic section curve 515 of surface curve 501 is defined by end point 508 and contact point 506. Haptic section curves 514 and 515 contact eye surface 502.
  • In one embodiment, the distances of surface curve 501 are listed in Table 1 below. In other embodiments, other distances are employed according to the desired lens design.
  • TABLE 1
    Distance No.    Length
    503             0.4 mm
    509             9.5 mm
    513             9.5 mm
    511             7.25 mm
    512             7.25 mm
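The near-point curve of FIG. 5 and the Table 1 values can be sketched together: the eye-surface profile is vaulted by the apex clearance (distance 503), tapering to zero at the haptic contact radius (distances 511/512), beyond which the haptic sections rest on the eye. The linear taper is an illustrative simplification of the control-point dragging the text describes.

```python
def vaulted_curve(eye_profile, clearance=0.4, contact_radius=7.25):
    """Offset an eye-surface profile into a near-point (vaulted) curve.

    eye_profile: list of (r, z) samples, r = distance from centerline in mm.
    The apex (r = 0) is lifted by `clearance` (distance 503, 0.4 mm in
    Table 1); the lift tapers linearly to zero at `contact_radius`
    (distances 511/512), so the haptic region still touches the eye.
    """
    out = []
    for r, z in eye_profile:
        lift = clearance * max(0.0, 1.0 - abs(r) / contact_radius)
        out.append((r, z + lift))
    return out
```

The resulting clearance profile leaves room for the lens to "settle" onto the eye without the optical zone touching the cornea.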
  • Referring to FIG. 6, set of surface curves 600 includes surface curves 601, 602, 603, 604, 605, and 606. Surface curves 601, 602, 603, 604, 605, and 606 are joined at apex 607. Each of surface curves 601, 602, 603, 604, 605, and 606 is separated from each other by angle a. Each of surface curves 601, 602, 603, 604, 605, and 606 extends along a sagittal plane as shown in FIG. 3A. Surface curve 601 extends along sagittal plane 305. Surface curve 602 extends along sagittal plane 304. Surface curve 603 extends along sagittal plane 303. Surface curve 604 extends along sagittal plane 302. Surface curve 605 extends along sagittal plane 307. Surface curve 606 extends along sagittal plane 306.
  • Step 204 will be further described with reference to FIGS. 7A, 7B, 7C, 7D, 8, and 9.
  • Referring to FIG. 7A, optical point cloud 700 is imported from a spreadsheet of predetermined point values. Optical point cloud 700 is a set of three-dimensional coordinates. Optical point cloud 700 has set of back surface points 701, set of front surface points 702, thickness 703, diameter 704, and height 705, each of which may vary according to the desired lens design. The predetermined point values are a set of polar coordinates generated from standard optical lens design methods known in the art, which depend on the desired diameter, the focal power, the refractive index of the desired material, the sagittal value of the curve, the minimum thickness of the desired material, and the general shape of the optical surfaces, i.e., spherical, aspherical, or toric. The curvature of set of front surface points 702 depends on the curvature of set of back surface points 701. For example, at a given diameter and a given set of back surface points, the curve of the front optical surface will generally be steeper as the focal power increases and flatter as the focal power decreases.
  • In one embodiment, the predetermined point values of set of back surface points 701 are determined by the method disclosed in U.S. application Ser. No. 13/277,139, filed on Oct. 19, 2011, which is incorporated herein by reference.
  • In a preferred embodiment, set of back surface points 701 is imported first and set of front surface points 702 is imported second. In this embodiment, set of back surface points 701 is separated from optical point cloud 700 by determining a first set of surface coordinates that, when positioned on an eye, are closest to the pupillary center. In this embodiment, set of front surface points 702 is separated from optical point cloud 700 by determining a second set of surface coordinates that, when positioned on an eye, are furthest from the pupillary center.
  • Referring to FIG. 7B by way of example, set of back surface points 701 is imported and separated from optical point cloud 700. Set of back surface points 701 has diameter 706 and height 707.
  • Referring to FIG. 7C by way of example, surface 708 is generated from set of back surface points 701 by connecting the point values of set of back surface points 701.
  • In a preferred embodiment, surface 708 is a best fit non-uniform rational B-spline (“NURBS”) surface. In another embodiment, surface 708 is a t-spline surface. In another embodiment, surface 708 is a subdivision surface. Other surface models known in the art may be employed.
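The best-fit surface step can be illustrated with a far simpler stand-in: a least-squares fit of a rotationally symmetric quadric to the imported points. This is not the NURBS, t-spline, or subdivision surface of the embodiments above, only a minimal sketch of the same idea, one smooth analytic surface approximating the point set.

```python
def fit_quadric(points):
    """Least-squares fit of z = a + b*x + c*y + d*(x^2 + y^2) to a set
    of (x, y, z) points, via the normal equations and Gaussian
    elimination. A simplified stand-in for a best-fit NURBS surface."""
    rows = [(1.0, x, y, x * x + y * y) for x, y, _z in points]
    z = [p[2] for p in points]
    n = 4
    # Normal equations: (A^T A) coef = A^T z.
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(n)]
           for i in range(n)]
    atz = [sum(r[i] * zi for r, zi in zip(rows, z)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        atz[col], atz[piv] = atz[piv], atz[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            atz[r] -= f * atz[col]
    coef = [0.0] * n
    for r in range(n - 1, -1, -1):
        coef[r] = (atz[r] - sum(ata[r][c] * coef[c]
                                for c in range(r + 1, n))) / ata[r][r]
    return coef
```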
  • Referring to FIGS. 7D and 7E, corners of surface 708 are trimmed using trim tool 709 to create optical surface 710. Optical surface 710 has diameter 711 and height 712, which match diameter 706 and height 707 of set of back surface points 701, respectively.
  • Referring to FIG. 8, steps 204 and 205 will be further described. Optical surface 901 has surface edge 916. Optical surface 901 is overlaid on surface curve 902 and eye surface 903. Surface edge 916 of optical surface 901 is aligned with surface curve 902 through control points 920 and 921 by selecting and dragging optical surface 901 to reposition optical surface 901. Surface curve 902 is further aligned with surface edge 916 of optical surface 901 at control points 920 and 921 by selecting and dragging control points 920 and 921 so that surface curve 902 is substantially tangential to surface edge 916 at control points 920 and 921. Optical surface 901 is trimmed with trim tool 918 to create surface edge 917 and define surface edge 916 from point 920 to point 921. Surface edge 917 is adjacent eye surface 903 and does not contact eye surface 903.
  • Surface curve 902 is trimmed with trim tool 918 by deleting portions of surface curve 902 covered by trim tool 918 to create haptic surface curves 904 and 915. Haptic surface curve 904 is defined from point 922 to point 926. Haptic surface curve 915 is defined from point 927 to point 931. Haptic surface curves 904 and 915 contact eye surface 903.
  • The slope of haptic surface curve 904 may be altered using points 922, 923, 924, 925, and 926 by selecting and dragging any of points 922, 923, 924, 925, and 926 to reposition haptic surface curve 904. The slope of haptic surface curve 915 may be altered using points 927, 928, 929, 930, and 931 by selecting and dragging any of points 927, 928, 929, 930, and 931 to reposition haptic surface curve 915. Altering the slopes of haptic surface curves 904 and 915 provides a better fit of haptic surface curves 904 and 915 on eye surface 903.
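Dragging a control point to reshape a curve, as described above for points 922 through 931, can be sketched with a Bezier curve and de Casteljau evaluation. The control-point values are made up for the example; the patent's curves are edited interactively in a CAD environment.

```python
def bezier_point(ctrl, t):
    """Evaluate a Bezier curve at parameter t by de Casteljau's
    algorithm. Moving any entry of ctrl reshapes the curve, which is
    what dragging a control point in the CAD tool does."""
    pts = [list(p) for p in ctrl]
    while len(pts) > 1:
        pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
               for p, q in zip(pts, pts[1:])]
    return tuple(pts[0])

ctrl = [(0.0, 0.0), (1.0, 0.5), (2.0, 0.4), (3.0, 0.0)]
before = bezier_point(ctrl, 0.5)
ctrl[1] = (1.0, 1.0)            # "drag" the second control point upward
after = bezier_point(ctrl, 0.5)
```

Raising one control point lifts the mid-curve height without moving the endpoints, which is how the slope of a haptic curve is adjusted to sit better on the eye surface.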
  • Each of haptic surface curves 905, 906, 907, 908, 909, 910, 911, 912, 913, and 914 is created as previously described. Each of haptic surface curves 904, 905, 906, 907, 908, 909, 910, 911, 912, 913, 914, and 915 extends along a sagittal plane as shown in FIG. 3A. Each of haptic surface curves 904, 905, 906, 907, 908, 909, 910, 911, 912, 913, 914, and 915 is separated from each other by angle a. Haptic surface curves 904 and 915 extend along sagittal plane 302. Haptic surface curves 905 and 914 extend along sagittal plane 303. Haptic surface curves 907 and 912 extend along sagittal plane 304. Haptic surface curves 909 and 910 extend along sagittal plane 305. Haptic surface curves 911 and 908 extend along sagittal plane 306. Haptic surface curves 913 and 906 extend along sagittal plane 307.
  • Referring to FIG. 9, step 206 will be further described. Haptic surface 1002 is generated to connect haptic surface curves 1003, 1004, 1005, 1006, 1007, 1008, 1009, 1010, 1011, and 1012.
  • In a preferred embodiment, haptic surface 1002 is a best fit NURBS surface. In another embodiment, haptic surface 1002 is a t-spline surface. In another embodiment, haptic surface 1002 is a subdivision surface. Other surface models known in the art may be employed.
  • In a preferred embodiment, optical surface 1001 is a best fit NURBS surface. In another embodiment, optical surface 1001 is a t-spline surface. In another embodiment, optical surface 1001 is a subdivision surface. Other surface models known in the art may be employed.
  • Referring to FIG. 10, steps 207 and 208 will be described in further detail. Blend surface 1103 is a best fit surface generated to connect optical surface 1102 with haptic surface 1104. Lens surface 1101 includes optical surface 1102, blend surface 1103 attached to optical surface 1102, and haptic surface 1104 attached to blend surface 1103.
  • In a preferred embodiment, blend surface 1103 is a best fit NURBS surface. In another embodiment, blend surface 1103 is a t-spline surface. In another embodiment, blend surface 1103 is a subdivision surface. Other surface models known in the art may be employed.
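The role of the blend surface, joining the optical zone to the haptic zone with matching position and slope, can be illustrated with a cubic Hermite blend along one sagittal section. The preferred embodiment uses a best-fit NURBS surface; this Hermite form is only an illustrative stand-in, and the endpoint and tangent values below are invented for the example.

```python
def hermite_blend(p0, m0, p1, m1, t):
    """Cubic Hermite blend from the optical-zone edge (position p0,
    tangent m0) to the haptic edge (position p1, tangent m1), for
    t in [0, 1]. It matches both positions and both tangents, which is
    the smoothness the blend surface provides between zones."""
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return tuple(h00 * a + h10 * b + h01 * c + h11 * d
                 for a, b, c, d in zip(p0, m0, p1, m1))

# Blend midpoint between an optical edge at (0, 0) and a haptic edge
# at (1, 1), with assumed unit tangents.
mid = hermite_blend((0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (1.0, 0.0), 0.5)
```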
  • Referring to FIG. 11, step 210 will be further described. Each front surface curve 1202 is generated adjacent to back lens surface 1201 and extends along a sagittal plane as shown in FIG. 3A. By way of example, front surface curve 1202 extends along sagittal plane 302. Front surface curve 1202 is constrained by back lens surface 1201. Front surface curve 1202 is trimmed as previously described to create haptic surface curves 1217 and 1218. Haptic surface curve 1217 is defined from end point 1206 to point 1210. Haptic surface curve 1218 is defined from point 1212 to end point 1214.
  • Front optical surface 1220 is overlaid onto front surface curve 1202. Edge 1225 of optical surface 1220 is aligned with front surface curve 1202 through points 1221, 1222, 1223, 1224, and 1204 by selecting and dragging front optical surface 1220 to align edge 1225 of optical surface 1220 with front surface curve 1202 as previously described. Optical surface 1220 is trimmed as previously described to define optical surface 1220 from point 1222 to point 1223.
  • Point 1204 is distance 1205 from back lens surface apex 1203. End point 1206 is distance 1207 from center line 1208 and distance 1209 from back lens surface 1201. Point 1210 is distance 1211 from back lens surface 1201. Point 1212 is distance 1213 from back lens surface 1201. End point 1214 is distance 1215 from center line 1208 and distance 1216 from back lens surface 1201.
  • Distance 1205 depends on the desired lens design. In a preferred embodiment, distance 1205 is in a range from 0.15 mm to 0.4 mm.
  • In one embodiment, the distances of front surface curve 1202 are listed in Table 2 below. In other embodiments, other distances are employed according to the desired lens design.
  • TABLE 2
    Distance No.   Length
    Distance 1205  0.25 mm
    Distance 1207  9.3 mm
    Distance 1209  0.18 mm
    Distance 1211  0.35 mm
    Distance 1213  0.35 mm
    Distance 1215  9.3 mm
    Distance 1216  0.18 mm
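The Table 2 values can be captured as a small validated data structure. The lengths are taken from Table 2; the identifier name and the comment wording are ours, and the range check reflects the preferred 0.15 mm to 0.4 mm range stated for distance 1205.

```python
# Front surface curve 1202 distances (mm) from Table 2; one example
# embodiment -- other lens designs use other values.
FRONT_CURVE_DISTANCES_MM = {
    1205: 0.25,   # point 1204 from back lens surface apex 1203
    1207: 9.3,    # end point 1206 from center line 1208
    1209: 0.18,   # end point 1206 from back lens surface 1201
    1211: 0.35,   # point 1210 from back lens surface 1201
    1213: 0.35,   # point 1212 from back lens surface 1201
    1215: 9.3,    # end point 1214 from center line 1208
    1216: 0.18,   # end point 1214 from back lens surface 1201
}

# Distance 1205 stays within the preferred 0.15-0.4 mm range.
assert 0.15 <= FRONT_CURVE_DISTANCES_MM[1205] <= 0.4
```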
  • Referring to FIG. 12, lens 1300 includes back lens surface 1301 and front lens surface 1302 adjacent to back lens surface 1301. Front lens surface 1302 includes front haptic surface 1303, front blend surface 1304 attached to front haptic surface 1303, and front optical surface 1305 attached to front blend surface 1304. Each of front haptic surface 1303, front blend surface 1304, and front optical surface 1305 is generated as previously described with respect to the back surface.
  • In a preferred embodiment, front haptic surface 1303 is a best fit NURBS surface. In another embodiment, front haptic surface 1303 is a t-spline surface. In another embodiment, front haptic surface 1303 is a subdivision surface. Other surface models known in the art may be employed.
  • In a preferred embodiment, front blend surface 1304 is a best fit NURBS surface. In another embodiment, front blend surface 1304 is a t-spline surface. In another embodiment, front blend surface 1304 is a subdivision surface. Other surface models known in the art may be employed.
  • In a preferred embodiment, front optical surface 1305 is a best fit NURBS surface. In another embodiment, front optical surface 1305 is a t-spline surface. In another embodiment, front optical surface 1305 is a subdivision surface. Other surface models known in the art may be employed.
  • Referring to FIG. 13, step 211 will be further described. Point cloud 1401 includes front lens point cloud 1402 and back lens point cloud 1403. Front lens point cloud 1402 and back lens point cloud 1403 are separated by thickness 1406. Thickness 1406 varies depending on the shape of back lens point cloud 1403 and the desired lens design. Point cloud 1401 further includes point density 1404 and separation distance 1405. Point cloud 1401 is a collection of points. Each point is a three-dimensional coordinate to model a three-dimensional lens.
  • In a preferred embodiment, point density 1404 is approximately 0.1 mm. Other densities may be employed.
  • In a preferred embodiment, separation distance 1405 is approximately 0.3 mm. Other distances may be employed.
  • In one embodiment, thickness 1406 is approximately 0.25 mm at apex 1407. In other embodiments, thickness 1406 varies according to the shape of back lens point cloud 1403 and the desired lens design.
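Exporting a surface as a point cloud at a given point density can be sketched as a grid sampler. The square-grid sampling scheme and the flat test surface are assumptions for the example; the patent specifies only the approximate density (0.1 mm) and separation distance (0.3 mm), not a sampling pattern.

```python
def sample_surface(f, diameter_mm, density_mm):
    """Sample the surface z = f(x, y) on a square grid whose spacing is
    the point density, keeping only points within the lens diameter.
    A sketch of how a point cloud like 1401 could be generated."""
    r = diameter_mm / 2.0
    n = int(diameter_mm / density_mm) + 1
    pts = []
    for i in range(n):
        for j in range(n):
            x = -r + i * density_mm
            y = -r + j * density_mm
            if x * x + y * y <= r * r:
                pts.append((x, y, f(x, y)))
    return pts

# Coarse demonstration: a flat 2 mm disc sampled at 1 mm spacing.
demo = sample_surface(lambda x, y: 0.0, 2.0, 1.0)
```

Halving the density value quadruples the point count, so the 0.1 mm density of the preferred embodiment yields a dense cloud suitable for modeling a three-dimensional lens.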
  • Referring to FIG. 14, step 212 will be described in further detail. Lens image 1500 has front lens surface image 1501, back lens surface image 1502, and lens thickness 1503. Lens thickness 1503 varies depending on the shape of back lens surface image 1502 and the desired lens design.
  • In one embodiment, lens thickness 1503 is approximately 0.25 mm at apex 1504.
  • It will be appreciated by those skilled in the art that modifications can be made to the embodiments disclosed and remain within the inventive concept. Therefore, this invention is not limited to the specific embodiments disclosed, but is intended to cover changes within the scope and spirit of the claims.

Claims (45)

1-22. (canceled)
23. A method of designing a scleral contact lens comprising:
capturing a sagittal image of the anterior surface of an eye;
extracting a surface curve from the sagittal image;
creating a back surface curve from the surface curve;
generating a back lens surface from the back surface curve;
creating a front surface curve from the back surface curve; and
generating a front lens surface from the front surface curve.
24. The method of claim 23 wherein said creating a back surface curve comprises:
creating a spline curve for the sagittal image; and
converting the spline curve to a back surface curve.
25. The method of claim 23 wherein the sagittal image is captured by a camera.
26. The method of claim 23, wherein the step of generating a back lens surface from the back surface curve further comprises the steps of:
generating a back optical surface;
creating a back haptic surface curve for the back surface curve;
generating a back haptic surface from the back haptic surface curve;
generating a back blend surface; and
combining the back optical surface, the back haptic surface, and the back blend surface to form the back lens surface.
27. The method of claim 26, wherein the step of generating a back optical surface further comprises the steps of:
receiving a set of back surface points;
generating a back optical surface from the set of back surface points; and
trimming the back optical surface.
28. The method of claim 26 further comprising the step of changing a slope of the back haptic surface curve.
29. The method of claim 23 wherein the step of generating a front lens surface from the front surface curve further comprises the steps of:
generating a front optical surface;
creating a front haptic surface curve for the front surface curve;
generating a front haptic surface from the front haptic surface curve;
generating a front blend surface; and
combining the front optical surface, the front haptic surface, and the front blend surface to form the front lens surface.
30. The method of claim 29 wherein the step of generating a front optical surface further comprises the steps of:
receiving a set of front surface points;
generating a front optical surface from the set of front surface points; and
trimming the front optical surface.
31. The method of claim 29, further comprising the step of changing the slope of the front haptic surface curve.
32. A method of designing a scleral contact lens comprising:
capturing a plurality of sagittal images of the anterior surface of an eye;
extracting a plurality of surface curves from the plurality of sagittal images;
creating a plurality of back surface curves from the plurality of surface curves;
generating a back lens surface from the plurality of back surface curves;
creating a plurality of front surface curves from the plurality of back surface curves; and
generating a front lens surface from the plurality of front surface curves.
33. The method of claim 32 wherein a computer is used to create a set of combined sagittal images from the plurality of sagittal images.
34. The method of claim 33 wherein said creating a plurality of back surface curves comprises:
creating a spline curve for each combined sagittal image of the set of combined sagittal images to create a set of spline curves; and
converting each spline curve to a back surface curve to create the plurality of back surface curves.
35. The method of claim 32 wherein the plurality of sagittal images are captured by a camera.
36. The method of claim 32, wherein the step of generating a back lens surface from the plurality of back surface curves further comprises the steps of:
generating a back optical surface;
creating a back haptic surface curve for each back surface curve of the plurality of back surface curves to create a plurality of back haptic surface curves;
generating a back haptic surface from the plurality of back haptic surface curves;
generating a back blend surface; and
combining the back optical surface, the back haptic surface, and the back blend surface to form the back lens surface.
37. The method of claim 36, wherein the step of generating a back optical surface further comprises the steps of:
receiving a set of back surface points;
generating a back optical surface from the set of back surface points; and
trimming the back optical surface.
38. The method of claim 36 further comprising the step of changing a slope of each back haptic surface curve of the plurality of back haptic surface curves.
39. The method of claim 32 wherein the step of generating a front lens surface from the plurality of front surface curves further comprises the steps of:
generating a front optical surface;
creating a front haptic surface curve for each front surface curve of the plurality of front surface curves to create a plurality of front haptic surface curves;
generating a front haptic surface from the plurality of front haptic surface curves;
generating a front blend surface; and
combining the front optical surface, the front haptic surface, and the front blend surface to form the front lens surface.
40. The method of claim 39 wherein the step of generating a front optical surface further comprises the steps of:
receiving a set of front surface points;
generating a front optical surface from the set of front surface points; and
trimming the front optical surface.
41. The method of claim 39, further comprising the step of changing a slope of each front haptic surface curve of the plurality of front haptic surface curves.
42. A method of creating a scleral contact lens comprising:
receiving a front lens surface and a back lens surface at a computer;
the computer commanding a lathe to cut the scleral contact lens using the front lens surface and the back lens surface.
43. A system for designing a scleral contact lens comprising:
a device for capturing a sagittal image of the anterior surface of an eye;
a computer program for extracting a surface curve from the sagittal image;
a computer program for creating a back surface curve from the surface curve;
a computer program for generating a back lens surface from the back surface curve;
a computer program for creating a front surface curve from the back surface curve; and
a computer program for generating a front lens surface from the front surface curve.
44. The system of claim 43 wherein said computer program for creating a back surface curve comprises:
software code for creating a spline curve for the sagittal image; and
software code for converting the spline curve to a back surface curve.
45. The system of claim 43 further comprising a camera to capture the sagittal image.
46. The system of claim 43 wherein the computer program for generating a back lens surface from the back surface curve further comprises:
software code for generating a back optical surface;
software code for creating a back haptic surface curve for the back surface curve;
software code for generating a back haptic surface from the back haptic surface curve;
software code for generating a back blend surface; and
software code for combining the back optical surface, the back haptic surface, and the back blend surface to form the back lens surface.
47. The system of claim 46, wherein the software code for generating a back optical surface further comprises:
software code for receiving a set of back surface points;
software code for generating a back optical surface from the set of back surface points; and
software code for trimming the back optical surface.
48. The system of claim 46 further comprising software for changing a slope of the back haptic surface curve.
49. The system of claim 43 wherein the computer program for generating a front lens surface from the front surface curve further comprises:
software code for generating a front optical surface;
software code for creating a front haptic surface curve for the front surface curve;
software code for generating a front haptic surface from the front haptic surface curve;
software code for generating a front blend surface; and
software code for combining the front optical surface, the front haptic surface, and the front blend surface to form the front lens surface.
50. The system of claim 49 wherein the software code for generating a front optical surface further comprises:
software code for receiving a set of front surface points;
software code for generating a front optical surface from the set of front surface points; and
software code for trimming the front optical surface.
51. The system of claim 49, further comprising software code for changing the slope of the front haptic surface curve.
52. A system for designing a scleral contact lens comprising:
a device for capturing a plurality of sagittal images of the anterior surface of an eye;
a computer program for extracting a plurality of surface curves from the plurality of sagittal images;
a computer program for creating a plurality of back surface curves from the plurality of surface curves;
a computer program for generating a back lens surface from the plurality of back surface curves;
a computer program for creating a plurality of front surface curves from the plurality of back surface curves; and
a computer program for generating a front lens surface from the plurality of front surface curves.
53. The system of claim 52 wherein a computer program is used to create a set of combined sagittal images from the plurality of sagittal images.
54. The system of claim 52 wherein the computer program for creating a plurality of back surface curves comprises:
software code for creating a spline curve for each combined sagittal image of the set of combined sagittal images to create a set of spline curves; and
software code for converting each spline curve to a back surface curve to create the plurality of back surface curves.
55. The system of claim 52 further comprising a camera to capture the plurality of sagittal images.
56. The system of claim 52, wherein the computer program for generating a back lens surface from the plurality of back surface curves comprises:
software code for generating a back optical surface;
software code for creating a back haptic surface curve for each back surface curve of the plurality of back surface curves to create a plurality of back haptic surface curves;
software code for generating a back haptic surface from the plurality of back haptic surface curves;
software code for generating a back blend surface; and
software code for combining the back optical surface, the back haptic surface, and the back blend surface to form the back lens surface.
57. The system of claim 56 wherein the software code for generating a back optical surface comprises:
software code for receiving a set of back surface points;
software code for generating a back optical surface from the set of back surface points; and
software code for trimming the back optical surface.
58. The system of claim 56 further comprising a computer program for changing a slope of each back haptic surface curve of the plurality of back haptic surface curves.
59. The system of claim 52 wherein the computer program for generating a front lens surface from the plurality of front surface curves further comprises:
software code for generating a front optical surface;
software code for creating a front haptic surface curve for each front surface curve of the plurality of front surface curves to create a plurality of front haptic surface curves;
software code for generating a front haptic surface from the plurality of front haptic surface curves;
software code for generating a front blend surface; and
software code for combining the front optical surface, the front haptic surface, and the front blend surface to form the front lens surface.
60. The system of claim 59 wherein the software code for generating a front optical surface further comprises:
software code for receiving a set of front surface points;
software code for generating a front optical surface from the set of front surface points; and
software code for trimming the front optical surface.
61. The system of claim 59, further comprising a computer program for changing a slope of each front haptic surface curve of the plurality of front haptic surface curves.
62. A system for creating a scleral contact lens comprising:
a computer for receiving a front lens surface and a back lens surface;
a computer program for commanding a lathe to cut the scleral contact lens using the front lens surface and the back lens surface.
63. A method of designing a scleral contact lens comprising:
receiving a surface curve generated from a sagittal image of the anterior surface of an eye;
creating a back surface curve from the surface curve;
generating a back lens surface from the back surface curve;
creating a front surface curve from the back surface curve; and
generating a front lens surface from the front surface curve.
64. A method of designing a scleral contact lens comprising:
receiving a plurality of surface curves generated from a plurality of sagittal images of the anterior surface of an eye;
creating a plurality of back surface curves from the plurality of surface curves;
generating a back lens surface from the plurality of back surface curves;
creating a plurality of front surface curves from the plurality of back surface curves; and
generating a front lens surface from the plurality of front surface curves.
65. A system for designing a scleral contact lens comprising:
a computer program for receiving a surface curve generated from a sagittal image of the anterior surface of an eye;
a computer program for creating a back surface curve from the surface curve;
a computer program for generating a back lens surface from the back surface curve;
a computer program for creating a front surface curve from the back surface curve; and
a computer program for generating a front lens surface from the front surface curve.
66. A system for designing a scleral contact lens comprising:
a computer program for receiving a plurality of surface curves generated from a plurality of sagittal images of the anterior surface of an eye;
a computer program for creating a plurality of back surface curves from the plurality of surface curves;
a computer program for generating a back lens surface from the plurality of back surface curves;
a computer program for creating a plurality of front surface curves from the plurality of back surface curves; and
a computer program for generating a front lens surface from the plurality of front surface curves.
US16/100,205 2013-09-20 2018-08-10 System and Method for Designing Scleral Lenses Abandoned US20180373220A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/100,205 US20180373220A1 (en) 2013-09-20 2018-08-10 System and Method for Designing Scleral Lenses

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US14/032,920 US9421721B2 (en) 2013-09-20 2013-09-20 System and method for designing scleral lenses
US15/216,062 US10095213B2 (en) 2013-09-20 2016-07-21 System and method for designing scleral lenses
US16/100,205 US20180373220A1 (en) 2013-09-20 2018-08-10 System and Method for Designing Scleral Lenses

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/216,062 Continuation US10095213B2 (en) 2013-09-20 2016-07-21 System and method for designing scleral lenses

Publications (1)

Publication Number Publication Date
US20180373220A1 true US20180373220A1 (en) 2018-12-27

Family

ID=52691648

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/032,920 Active 2035-01-13 US9421721B2 (en) 2013-09-20 2013-09-20 System and method for designing scleral lenses
US15/216,062 Active 2034-02-05 US10095213B2 (en) 2013-09-20 2016-07-21 System and method for designing scleral lenses
US16/100,205 Abandoned US20180373220A1 (en) 2013-09-20 2018-08-10 System and Method for Designing Scleral Lenses

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US14/032,920 Active 2035-01-13 US9421721B2 (en) 2013-09-20 2013-09-20 System and method for designing scleral lenses
US15/216,062 Active 2034-02-05 US10095213B2 (en) 2013-09-20 2016-07-21 System and method for designing scleral lenses

Country Status (1)

Country Link
US (3) US9421721B2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10054802B2 (en) 2011-10-17 2018-08-21 Eydeal Scanning, LLC Topology guided ocular lens design
US9421721B2 (en) 2013-09-20 2016-08-23 Gregory Gemoules System and method for designing scleral lenses
WO2017078820A1 (en) 2015-11-03 2017-05-11 Boston Foundation For Sight Chiral scleral lenses
US9885882B2 (en) 2016-02-03 2018-02-06 Diversified Ophthalmics, Inc. Progressive contact lens
JP2020525842A (en) * 2017-06-27 2020-08-27 アイディール スキャニング リミテッド ライアビリティ カンパニー Ophthalmic lens design based on topology

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB655901A (en) * 1948-07-24 1951-08-08 Leonard Stidwill Improvements in or relating to ophthalmic contact lenses
ES2046877T3 (en) 1990-04-24 1994-02-01 Ciba-Geigy Ag PROCEDURE FOR THE MANUFACTURE OF CONTACT LENSES.
US5872613A (en) * 1992-11-23 1999-02-16 Innotech, Inc. Method of manufacturing contact lenses
US5452031A (en) 1993-05-05 1995-09-19 Boston Eye Technology, Inc. Contact lens and a method for manufacturing contact lens
US5502518A (en) 1993-09-09 1996-03-26 Scient Optics Inc Asymmetric aspheric contact lens
US5804107A (en) 1994-06-10 1998-09-08 Johnson & Johnson Vision Products, Inc. Consolidated contact lens molding
US5861114A (en) 1994-06-10 1999-01-19 Johnson&Johnson Vision Products, Inc. Method of manufacturing complex optical designs in soft contact lenses
US5880809A (en) 1996-12-30 1999-03-09 Scientific Optics, Inc. Contact lens
WO2000008516A1 (en) 1998-08-06 2000-02-17 Lett John B W Multifocal aspheric lens
US7048759B2 (en) * 2000-02-24 2006-05-23 Advanced Medical Optics, Inc. Intraocular lenses
US7275827B2 (en) 2000-04-11 2007-10-02 Benedikt Jean Selective corneal aberrometry
US6595639B1 (en) 2000-11-10 2003-07-22 Ocular Sciences, Inc. Junctionless ophthalmic lenses and methods for making same
US7152975B2 (en) 2000-11-10 2006-12-26 Cooper Vision, Inc. Junctionless ophthalmic lenses and methods for making same
US20040237971A1 (en) 2003-06-02 2004-12-02 Hema Radhakrishnan Methods and apparatuses for controlling optical aberrations to alter modulation transfer functions
US20060055071A1 (en) 2004-08-18 2006-03-16 Stephen Kendig Using higher order mathematical functions to create asymmetric molding back pieces
US7320587B2 (en) 2005-08-09 2008-01-22 Cooper Vision, Inc. Contact lens molds and systems and methods for producing same
US7296890B2 (en) * 2005-10-25 2007-11-20 Truform Optics Contact lens with controlled shape
AR062067A1 (en) 2006-07-17 2008-10-15 Novartis Ag TORICAS CONTACT LENSES WITH CONTROLLED OPTICAL POWER PROFILE
US7625085B2 (en) 2007-08-07 2009-12-01 Novartis Ag Toric contact lens with improved posterior surface design
US7862176B2 (en) * 2007-11-24 2011-01-04 Truform Optics Method of fitting rigid gas-permeable contact lenses from high resolution imaging
US8696117B2 (en) * 2010-09-14 2014-04-15 Truform Optics Fitting a contact lens
US8668333B2 (en) * 2010-12-08 2014-03-11 Valdemar Portney Contra-aspheric toric ophthalmic lens
US9554889B2 (en) * 2012-05-07 2017-01-31 Boston Foundation For Sight Customized wavefront-guided methods, systems, and devices to correct higher-order aberrations
US9421721B2 (en) * 2013-09-20 2016-08-23 Gregory Gemoules System and method for designing scleral lenses

Also Published As

Publication number Publication date
US9421721B2 (en) 2016-08-23
US20150088297A1 (en) 2015-03-26
US20160327928A1 (en) 2016-11-10
US10095213B2 (en) 2018-10-09


Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION