EP2350798A1 - Representation system - Google Patents

Representation system

Info

Publication number
EP2350798A1
Authority
EP
European Patent Office
Prior art keywords
representation
contact
data processor
control signals
touch input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP09749183A
Other languages
German (de)
French (fr)
Inventor
David Carberry
Loren Picco
Arturas Ulcinas
Mervyn Miles
James Grieve
Sriram Subramanian
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Carl Zeiss Microscopy GmbH
Original Assignee
University of Bristol
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB0819931A
Priority claimed from GB0916176A
Application filed by University of Bristol
Publication of EP2350798A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text


Abstract

A representation system comprises a material representation apparatus, for generating a real-time representation of a working space, which may be an imaging device or may also have material processing functionality. A multi-touch input device is connected to the material representation apparatus, for displaying the real-time representation of the working space on a display, and detects multiple points of contact. A data processor, connected to the multi-touch input device, and to the material representation apparatus, receives inputs representing the multiple points of contact, and generates control signals for the material representation apparatus in response to changes in the received inputs representing movement of the multiple points of contact. In the case of an imaging device, the control signals can for example change the scan size and the zoom of the image. In the case of a device with material processing functions, the control signals can control the material processing.

Description

REPRESENTATION SYSTEM
This invention relates to a representation system, and more particularly to a system including a material representation apparatus, which can be controlled from an input device.
Various forms of material representation apparatus are known, including forms of material representation apparatus such as microscopes that simply provide an image of a sample when located in a working space, and also including forms of material representation apparatus that provide an image of a sample, but also have a material processing function.
In the case of a material representation apparatus that provides an image of a sample, the control can take the form of variation of the field of view, or of the focussing of the imaging system, for example. In the case of a material representation apparatus that also provides a material processing function, the control can take any suitable form, depending on the available processing functionality.
One known form of material representation apparatus that also provides a material processing function is an optical tweezers apparatus, such apparatus having evolved into a powerful micro-manipulation tool in the fields of physics, chemistry, biology and material science. Many such apparatus have additional optics to enable the control of numerous optical traps at once. Such additions include the use of spatial light modulators for holographic optical tweezers, for example as in M. Reicherter, T. Haist, E. U. Wagemann, and H. J. Tiziani, "Optical particle trapping with computer-generated holograms written on a liquid-crystal display," Opt. Lett. 24(9), 608-610 (1999); or GPC-based systems as in P. J. Rodrigo, L. Gammelgaard, P. Bøggild, I. Perch-Nielsen, and J. Glückstad, "Actuation of microfabricated tools using multiple GPC-based counterpropagating beam traps," Opt. Express 13(18), 6899-6904 (2005); the use of acousto-optic deflectors as in K. C. Neuman and S. M. Block, "Optical trapping," Review of Scientific Instruments 75(9), 2787-2809 (2004); and the use of rapid scanning mirrors as in C. Mio, T. Gong, A. Terray, and D. W. M. Marr, "Design of a scanning laser optical trap for multiparticle manipulation," Review of Scientific Instruments 71(5), 2196-2200 (2000). The widespread deployment of multiple-trap optical tweezers has led to experimentation with a variety of interfaces, as disclosed in G. Gibson, L. Barron, F. Beck, G. Whyte, and M. Padgett, "Optically controlled grippers for manipulating micron-sized particles," New Journal of Physics 9(1), 14-14 (2007); G. Whyte, G. Gibson, J. Leach, M. Padgett, D. Robert, and M. Miles, "An optical trapped microhand for manipulating micron-sized objects," Opt. Express 14(25), 12,497-12,502 (2006); and I.Y. Park, S.Y. Sung, J. H. Lee, and Y.G. Lee, "Manufacturing micro-scale structures by an optical tweezers system controlled by five finger tips," Journal of Micromechanics and Microengineering 17(10), N82-N89 (2007).
The most common method uses a PC and some form of control program to enable the real-time updating of trap locations. For example, in J. Leach, K. Wulff, G. Sinclair, P. Jordan, J. Courtial, L. Thomson, G. Gibson, K. Karunwi, J. Cooper, Z. J. Laczik, and M. Padgett, "Interactive approach to optical tweezers control," Appl. Opt. 45(5), 897-903 (2006), trap locations are controlled by either positioning a digital marker over a live video feed of the sample or by modifying simple coordinate values. While simple to develop and implement, these "point-and-click" interfaces may not be sufficiently user-friendly for use by non-specialists and seldom allow for the control of more than one trap at a time.
Alternatives such as a joystick "gripper" as in G. Gibson, L. Barron, F. Beck, G. Whyte, and M. Padgett, "Optically controlled grippers for manipulating micron-sized particles," New Journal of Physics 9(1), 14-14 (2007), allow the user to perform a repetitive task such as the translation, rotation and scaling of a group of 4 optical traps. Similarly, an interface making use of fiducial markers attached to the user's fingertips, which were imaged and tracked by a typical web-camera, is disclosed in G. Whyte, G. Gibson, J. Leach, M. Padgett, D. Robert, and M. Miles, "An optical trapped microhand for manipulating micron-sized objects," Opt. Express 14(25), 12,497-12,502 (2006), allowing for more flexibility and limited control over the z-axis.
In a similar vein, animation gloves have also been used, as in I.Y. Park, S.Y. Sung, J. H. Lee, and Y.G. Lee, "Manufacturing micro-scale structures by an optical tweezers system controlled by five finger tips," Journal of Micromechanics and Microengineering 17(10), N82-N89 (2007), to map optical traps to the user's fingertips. Unfortunately, these systems are highly optimised for specific tasks and, for all their elegance, focus mainly on the manipulation of only a handful of particles.

One known material representation apparatus that simply provides an image of a sample when located in a working space is the atomic force microscope (AFM), as described in G. Binnig, C. F. Quate, and Ch. Gerber, "Atomic force microscope," Phys. Rev. Lett., 56(9):930-933, Mar 1986. Now commercialised and increasingly widespread, a conventional AFM typically produces an image area of a few square micrometres in two to three minutes. Imaging times increase for larger scan areas, for increased image resolution, and for samples with larger height variations and/or lower stiffness. This limitation arises from the fact that the AFM is a mechanical microscope based on a serial process. A secondary effect of this limited imaging rate is that, unlike, for example, optical microscopy or electron microscopy (SEM or TEM), there is no low magnification mode in AFM. Instead, the sample must be mapped sequentially with a series of images, each limited to a maximum of 100 μm² or less. When dealing with functional biological structures, it quickly becomes entirely impractical to find the structures of interest while they are in the desired state.
High-speed AFM (HSAFM) is a relatively new development that enables the collection of images in timescales of less than 1 second. G. E. Fantner, G. Schitter, J. H. Kindt, T. Ivanov, K. Ivanova, R. Patel, N. Holten-Andersen, J. Adams, P. J. Thurner, I. W. Rangelow, and P. K. Hansma, "Components for high speed atomic force microscopy", Ultramicroscopy, 106:881, 2006, and T. Ando, N. Kodera, E. Takai, D. Maruyama, K. Saito, and A. Toda, "A high-speed atomic force microscope for studying biological macromolecules", PNAS, 98(22):12468, 2001, describe high-speed AFM achieved by reducing the dimensions of the cantilever (resulting in cantilevers with both high force-sensitivity and high resonant frequency). By contrast, A. D. L. Humphris, M. J. Miles, and J. K. Hobbs, "A mechanical microscope: High-speed atomic force microscopy", Appl. Phys. Lett., 86(3):034106, 2005, and L. Picco, L. Bozec, A. Ulcinas, D. Engledew, M. Antognozzi, M. Horton, and M. J. Miles, "Breaking the speed limit with atomic force microscopy", Nanotechnology, 18:044030, 2006, describe a contact-mode technique capable of imaging at up to 1,300 frames per second. Going fast is a step towards one of the ultimate aims of AFM: the direct observation of a biological process in real-time.
The move to higher imaging rates presents three significant benefits. Firstly, conventional AFM is typically restricted to collecting images of 512 × 512 pixels regardless of the size of the scan area, and increasing the scan size sacrifices the lateral resolution of the image. Therefore, to maintain nanometre resolution over large (tens of microns) scan areas, it is necessary to image the area in smaller (one to two micron) sections which can then be stitched together. As described above, the imaging rate of a conventional AFM renders this impractical under most conditions. HSAFM dramatically reduces the time required to collect these smaller images and enables the creation of high-resolution, large-area scans in timescales similar to that of a single conventional AFM image. Secondly, the high frame rate means that the effects of any adjustments made to the system (the tip-sample interaction force, scan size, location etc.) are immediately presented to the operator. Finally, the HSAFM gives the user the freedom to explore large areas with nanometre resolution, making the search for key features of interest a simple procedure. The responsiveness of the HSAFM and the ability to freely scan large regions of the sample surface make it more comparable in operation to an optical microscope than a conventional AFM.
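To put rough numbers on the first of these benefits, the tiling arithmetic can be sketched in a few lines of Python. This is illustrative only: the region and tile sizes are assumptions chosen to match the "tens of microns" and "one to two micron" figures above, and the per-frame times are the rough figures quoted in the text.

import math

def tiles_needed(region_um, tile_um):
    # number of small tiles required to cover a square region
    return math.ceil(region_um / tile_um) ** 2

region_um = 40.0   # side of the large scan area, in microns (assumed)
tile_um = 2.0      # side of each small high-resolution tile (assumed)
pixels = 512       # pixels per image line, as quoted above

n = tiles_needed(region_um, tile_um)
print(f"{n} tiles at {1000 * tile_um / pixels:.1f} nm/pixel")
print(f"one large frame instead: {1000 * region_um / pixels:.1f} nm/pixel")
print(f"conventional AFM at ~150 s/frame: {n * 150 / 3600:.1f} hours")
print(f"HSAFM at ~1 s/frame: about {n} seconds")

For these assumed values the tiled approach needs 400 tiles, which is prohibitive at conventional imaging rates but takes only minutes with the HSAFM.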
Multi-touch interfaces, such as those demonstrated at the 18th Annual ACM Symposium on User Interface Software and Technology, "Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection" (ACM, Seattle, Washington, USA, 2005), provide the user with the ability to interact with a computer in multiple locations simultaneously. In this manner several digital objects can be controlled independently. Applications of this technology are in computer games, digital image manipulation and musical instruments.
According to the first aspect of the present invention there is provided a representation system, comprising: a material representation apparatus, for generating a real-time representation of a working space; a multi-touch input device, connected to the material representation apparatus, for displaying the real-time representation of the working space on a display, and for detecting multiple points of contact; and a data processor, connected to the multi-touch input device, and to the material representation apparatus, for receiving inputs representing the multiple points of contact, and for generating control signals for the material representation apparatus in response to changes in the received inputs representing movement of the multiple points of contact.
Further features of the representation system are defined in dependent claims relating to the representation system.

According to the second aspect of the present invention there is provided a data processor, for use in a representation system, wherein the representation system comprises: a material representation apparatus, for generating a real-time representation of a working space; and a multi-touch input device, connected to the material representation apparatus, for displaying the real-time representation of the working space on a display, and for detecting multiple points of contact; wherein the data processor is adapted to be connected to the multi-touch input device, and to the material representation apparatus, for receiving inputs representing the multiple points of contact, and for generating control signals for the material representation apparatus in response to changes in the received inputs representing movement of the multiple points of contact.
Further features of the data processor are defined in dependent claims relating to the representation system.
According to the third aspect of the present invention there is provided a method of controlling a representation system, wherein the representation system comprises: a material representation apparatus, for generating a real-time representation of a working space; and a multi-touch input device, connected to the material representation apparatus, for displaying the real-time representation of the working space on a display, and for detecting multiple points of contact; wherein the method comprises receiving inputs representing the multiple points of contact, and generating control signals for the material representation apparatus in response to changes in the received inputs representing movement of the multiple points of contact.
Further features of the method are defined in dependent claims relating to the representation system.
According to the fourth aspect of the present invention there is provided a computer program product configured to implement the method of the third aspect.

Thus, the new interface gives even an untrained user a high degree of control over a material representation apparatus.
For a better understanding of the present invention, and to show how it may be put into effect, reference will now be made, by way of example, to the accompanying drawings, in which:
Figure 1 is a block diagram, illustrating a representation system in accordance with an embodiment of the invention.
Figure 2 is a block diagram, illustrating in more detail a holographic optical tweezers system in accordance with an embodiment of the invention.
Figure 3 illustrates in more detail a part of the touch screen in the system of Figures 1 and 2.
Figure 4 shows the touch screen in the system of Figure 2, in use.
Figure 5 illustrates a first use of the system of Figure 2.
Figure 6 illustrates a second use of the system of Figure 2.
Figure 7 illustrates a third use of the system of Figure 2.
Figure 8 is a block diagram, illustrating in more detail an atomic force microscope in accordance with an embodiment of the invention.
Figure 9 illustrates a first use of the system of Figure 8.
Figure 10 illustrates a second use of the system of Figure 8.
Figure 1 is a block schematic diagram, showing a representation system, including a material representation apparatus 10. The material representation apparatus 10 includes an imaging device 12, which can in principle be any form of microscope using visible light or other forms of radiation, or generating a representation of a sample placed in a working space 16 by any other method. In this illustrated embodiment of the invention, the material representation apparatus 10 also includes a material processing device 14 for providing some form of material processing function on a sample when located in a working space, but, in other embodiments, the material representation apparatus 10 simply provides an image of a sample when located in the working space 16. It will be appreciated that the imaging device 12 and the material processing device 14 may be elements of a single device.
The representation system also includes a multi-touch input device 20, having a display 22 on which is generated an image produced by the imaging device 12. The multi-touch input device 20 also includes a touch screen 24, which may advantageously overlie the display 22, although it could in principle be provided separately therefrom. The multi-touch input device 20 further includes a touch discrimination unit 26, for identifying locations on the touch screen 24 at which a user is touching the screen.
The representation system also includes a data processor 30, including a control interface 32, for receiving signals from the touch discrimination unit 26 and for generating suitable control signals for the imaging device 12 and/or the material processing device 14, as appropriate.
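As a rough sketch of this dataflow (all class and method names here are illustrative assumptions, not part of the system described), the control interface can be viewed as a loop that compares each reported point of contact with its previous position and emits a control signal for the difference:

from dataclasses import dataclass

@dataclass
class Touch:
    touch_id: int  # identifier assigned by the touch discrimination unit 26
    x: float       # position on the touch screen 24, normalised to 0..1
    y: float

class ControlInterface:
    # plays the role of the data processor 30 / control interface 32:
    # movement of the points of contact becomes control signals for the
    # imaging device 12 and/or the material processing device 14
    def __init__(self, device):
        self.device = device
        self.last = {}  # touch_id -> Touch seen in the previous frame

    def on_touches(self, touches):
        for t in touches:
            prev = self.last.get(t.touch_id)
            if prev is not None:
                self.device.move(t.touch_id, t.x - prev.x, t.y - prev.y)
        self.last = {t.touch_id: t for t in touches}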
Figure 2 shows in slightly more detail a representation system in which the material representation apparatus is in the form of an optical tweezers 40.
In this example, the multi-touch input device 20 uses frustrated total internal reflection (FTIR), as described by Han, 18th Annual ACM Symposium on User Interface Software and Technology, "Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection" (ACM, Seattle, Washington, USA, 2005), in order to track the positions of each of the user's fingertips. It will be appreciated that other types of touch screen can be used if preferred.
As shown in more detail in Figure 3, the console is physically made from a sheet 50 of PMMA (poly(methyl methacrylate)), measuring 100 cm × 80 cm × 1 cm, with countersunk holes 52 for LEDs 54 drilled into the edge of the sheet at 2.5 cm intervals. The 880 nm LEDs 54 (SFH486, Osram) are used to illuminate the console uniformly, effectively making it a waveguide with minimal losses arising from scattering. An overlay of drafting paper 56, which acts as a diffuser for a data projector 58 (MP771, BenQ), is coated with lightly textured silicone rubber 60 to facilitate the coupling between the user's fingers 62 and the waveguide 50.
When the user presses his/her hand onto the console, infra-red light is scattered, as shown in Figure 3, before being collected by a 1.3 MP CCD camera 64 (Dragonfly2, Point Grey Research). An 880 nm bandpass filter (not shown) can be added to the optical path immediately prior to the camera to reduce noise generated through ambient light. As a result, the camera detects a single "spot" of light at each location where a finger is pressed on the console.
The holographic optical tweezers device 40 is designed around a commercially available inverted microscope 72 (Axiovert 200, Zeiss), a 1.3 NA 100× Plan-Neofluar objective 74 (Zeiss), and a motorised xyz translation stage 76 (MS-2000, ASI). The optical traps are powered by a titanium-sapphire laser 78 (Coherent 899), producing 4 W at 800 nm, which is pumped by a solid-state laser (Verdi-V18, Coherent). The wavefront is shaped by an optically-addressed spatial light modulator (SLM) 80 (X8267-14DB, Hamamatsu), controlled by a computer using either the CPU or a dedicated graphics card (GPU) 82 for calculating holograms 84, as shown in M. Reicherter, S. Zwick, T. Haist, C. Kohler, H. Tiziani, and W. Osten, "Fast digital hologram generation and adaptive force measurement in liquid-crystal-display based holographic tweezers," Appl. Opt. 45(5), 888-896 (2006).
A live video feed is provided by a camera 86, such as a CMOS camera (EC1280, Prosilica) or a high-speed camera capable of position tracking at rates of up to 150 kHz (microCam-640, Durham Smart Imaging), as described in G. M. Gibson, J. Leach, S. Keen, A. J. Wright, and M. J. Padgett, "Measuring the accuracy of particle position and force in optical tweezers using high-speed video microscopy," Opt. Express 16(19), 14,561-14,570 (2008). The optical tweezers 40 is described in more detail in G. Gibson, D. M. Carberry, G. Whyte, J. Leach, J. Courtial, J. C. Jackson, D. Robert, M. Miles, and M. Padgett, "Holographic assembly workstation for optical manipulation," Journal of Optics A: Pure and Applied Optics 10(4), 044009 (2008).
In this embodiment, the multitouch software 26 and the interface software 32 operate on respective different computers 88, 90. This modular design enables other interfaces (joystick, fiducial marker tracking, or future interfaces) to be swapped into the system and provide optical trap locations with a minimum configuration change. A further benefit is that the processing speed of each individual system is increased. The hologram calculation software is designed to run on either a dedicated graphics card 82 or on a third computer. Data is passed between computers over the local network using UDP datagrams.
Figure 2 shows how the hardware and software combine to form the entire system. The multitouch software is designed to detect the positions of each user-generated event. A typical image 92 captured by the multitouch CCD camera 64 is shown in Figure 2. It shows a user pressing each finger on one hand onto the console, and the light spots 94 show the positions of the user's fingertips. The software subtracts the background signal, applies a threshold, converts the image to pure black and white, and calculates the centre of each spot.
The coordinates of every spot generated by the user's actions are passed to the interface software 32, in XY format, via UDP. The multitouch software can be implemented using either the NUI Group's open source library TouchLib ("TouchLib: A Multitouch Development Kit," 2008, available at http://nuigroup.com/touchlib/) or LabVIEW's blob detection and tracking algorithms (National Instruments).
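A minimal sketch of this spot-finding and hand-off pipeline is given below, using numpy/scipy stand-ins rather than the TouchLib or LabVIEW implementations named above; the address, port number and packet format are assumptions for illustration only.

import socket
import numpy as np
from scipy import ndimage

def find_spots(frame, background, threshold=40):
    # subtract the background signal, apply a threshold (pure black and
    # white), and take the centre of mass of each connected bright spot
    diff = frame.astype(np.int16) - background.astype(np.int16)
    binary = diff > threshold
    labels, n = ndimage.label(binary)
    return ndimage.center_of_mass(binary, labels, range(1, n + 1))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_spots(spots, addr=("192.168.0.2", 5005)):  # interface PC (assumed)
    # one "x,y" pair per fingertip in a single UDP datagram; note that
    # center_of_mass returns (row, col), i.e. (y, x)
    payload = ";".join(f"{x:.1f},{y:.1f}" for (y, x) in spots)
    sock.sendto(payload.encode(), addr)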
The interface software 32 determines what each press or movement on the multitouch console will perform. To ensure that the user is aware of what each console event does, the interface software is projected onto the multitouch console to provide immediate feedback to the user.
Figure 4 shows the interface window, having two main sections, namely a manipulation area 122, which is a projection of a live image from the microscope with a graphics overlay; and a service area 124, which houses all of the control buttons and sliders. Optical traps are represented by circles 126 in the manipulation area 122. The system has been configured so that a temporary trap is formed when the user presses a finger onto the console in the manipulation area 122. These traps operate only while pressure is applied to the console; removing one's fingers destroys the traps. When the user wishes to maintain the currently selected optical traps without touching the console, they simply need to press the Group button 128 or the Persistent button 130. The user can then remove their hands and generate other optical traps, select other previously generated traps, or perform one of the other manipulations available in the service area 124 (such as changing the z-height or intensity of the traps by means of the sliders 132, 134 respectively). Releasing a persistent or grouped trap location simply requires pressing the designated trap and then pressing the Persistent button 130 or Group button 128 again, as appropriate.
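This press-to-trap behaviour amounts to a small state machine, loosely sketched below; the names are assumptions, not those of the actual interface software 32.

class TrapManager:
    def __init__(self):
        self.traps = {}          # touch_id -> (x, y) trap position
        self.persistent = set()  # traps that survive the finger being lifted

    def finger_down(self, touch_id, x, y):
        self.traps[touch_id] = (x, y)      # temporary trap under the finger

    def finger_move(self, touch_id, x, y):
        if touch_id in self.traps:
            self.traps[touch_id] = (x, y)  # the trap follows the fingertip

    def finger_up(self, touch_id):
        if touch_id not in self.persistent:
            self.traps.pop(touch_id, None)  # non-persistent traps are destroyed

    def persistent_button(self, selected_ids):
        # the first press keeps the selected traps alive; pressing a
        # designated trap and the button a second time releases it again
        self.persistent ^= set(selected_ids)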
The image of the traps generated by the user is superimposed on the live, or real-time, display of the image from the microscope, which, in this case, shows a part of the physical structure 136.
Groups of traps can be manipulated as single entities and transformed using rotations, translations and scaling in 3D. A group is represented by a colour unique to the optical traps, and its centre is represented by a coloured square. An entire group can be translated by pressing on a square and moving it. Scale and rotate operations are performed by selecting the group centre and a single trap marker. The exact function is determined by the change in the (r, θ, φ) coordinate of the selected trap relative to the group centre. Each trap within the group can still be manipulated individually by selecting only that one trap.
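A compact way to sketch the scale-and-rotate rule just described is with complex arithmetic in the image plane (two dimensions only, for brevity; the description above uses the full (r, θ, φ) coordinate, and the function below is illustrative rather than the actual implementation). The ratio of the selected trap's new and old offsets from the group centre encodes both the scale (its modulus) and the rotation (its argument), and the same ratio is applied to every trap in the group:

def transform_group(traps, centre, selected_old, selected_new):
    # traps, centre and the two selected positions are complex numbers x + 1j*y
    factor = (selected_new - centre) / (selected_old - centre)
    # abs(factor) is the change in r (scale); the argument of factor is the
    # change in theta (rotation); both act about the group centre
    return [centre + (t - centre) * factor for t in traps]

Translating the whole group, by contrast, is simply a matter of adding the same offset to the centre and to every trap.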
Thus, the display 22 displays the position of each optical trap, and the movement of the user's fingertips on the touch screen 24 is converted to desired movements of the positions. The desired position of each optical trap is then passed to the hologram calculation processor 82. Holograms are calculated using either a modified Gerchberg-Saxton algorithm or simple "Gratings and Lenses" algorithms, as known from prior art documents such as J. Liesener, M. Reicherter, T. Haist, and H. J. Tiziani, "Multifunctional optical tweezers using computer-generated holograms," Optics Communications 185(1-3), 77-82 (2000); E. R. Dufresne, G. C. Spalding, M. T. Dearing, S. A. Sheets, and D. G. Grier, "Computer-generated holographic optical tweezer arrays," Review of Scientific Instruments 72(3), 1810-1816 (2001); G. Sinclair, J. Leach, P. Jordan, G. Gibson, E. Yao, Z. Laczik, M. Padgett, and J. Courtial, "Interactive application in holographic optical tweezers of a multi-plane Gerchberg-Saxton algorithm for three-dimensional light shaping," Opt. Express 12(8), 1665-1670 (2004); and J. Leach, K. Wulff, G. Sinclair, P. Jordan, J. Courtial, L. Thomson, G. Gibson, K. Karunwi, J. Cooper, Z. J. Laczik, and M. Padgett, "Interactive approach to optical tweezers control," Appl. Opt. 45(5), 897-903 (2006).

Thus, the system described herein allows simultaneous and independent interactive control of multiple traps, for example in experiments where a rapid response to changing conditions is required, or in experiments where optical traps must be manipulated at the same time. For example, achieving independent translation of two trapped silica beads in two dimensions is possible.
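The "Gratings and Lenses" calculation named above lends itself to a short numpy sketch: each trap contributes a blazed grating (for lateral displacement) and a Fresnel lens (for axial displacement), and the displayed hologram is the argument of the summed complex field. The aperture size, focal length and wavelength below are assumed values for illustration, not the parameters of the system described.

import numpy as np

def gratings_and_lenses(traps, n=512, half_width=5e-3,
                        wavelength=800e-9, focal_length=1.8e-3):
    # traps: list of (x, y, z) trap displacements from the focus, in metres
    coords = np.linspace(-half_width, half_width, n)
    u, v = np.meshgrid(coords, coords)  # coordinates in the SLM plane
    field = np.zeros((n, n), dtype=complex)
    for (x, y, z) in traps:
        grating = 2 * np.pi * (u * x + v * y) / (wavelength * focal_length)
        lens = np.pi * z * (u ** 2 + v ** 2) / (wavelength * focal_length ** 2)
        field += np.exp(1j * (grating + lens))
    return np.angle(field)  # phase pattern for the SLM 80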
Figure 5 shows how simple this two-sphere problem becomes with the multitouch console. In Figure 5(a), the user touches the console inside the manipulation area 122, generating optical traps at each of the user's finger locations 140, 142, trapping particles. When the user moves his fingers across the console, the updated particle positions 144, 146 reflect the new coordinates of the user's finger locations, as shown in Figure 5(b). The optical traps are then destroyed when the user removes his fingers.
Figure 6 shows another feature of the multitouch console. In Figure 6(a), three microspheres are trapped by three traps at locations 150, 152, 154 as described with reference to Figure 5; the traps have been made permanent and have been formed into a group 156 in Figure 6(b), whose position is represented by a square 158. This group 156 is then transformed by translating in Figure 6(c), by moving a finger from a starting position of the square 158 to a desired new position, by rotating in Figure 6(d), and by scaling in Figure 6(e) using two fingers. These operations can be performed sequentially, or simultaneously. Any number of such groups can be manipulated at the same time.
Figure 7 illustrates the use of the system in trapping non-spherical objects. Figure 7(a) shows traps 160, 162 being formed at each end of a 300 nm diameter, 12 μm long cadmium sulphide (CdS) rod 164, trapping it perpendicular to the optical axis. The high refractive index of CdS (n = 2.26 at 800 nm) makes optical trapping more challenging, and the morphology of the rod is such that the optical traps 160, 162 need to be applied virtually instantly at both ends of the rod to prevent it aligning itself with the beam's propagation direction. The rapid Brownian fluctuations of the rod are not overly troublesome in this case, because of the possibility for a user to create an optical trap in any position in the plane and make rapid adjustments immediately prior to trapping. Figure 7(b) shows the traps 160, 162 being moved by the user to manipulate the rod 164.

Thus, Figures 2 to 7 show a multitouch interface system for the interactive real-time control of a holographic optical tweezers system, allowing users to control optical traps and perform complex operations with just the touch of a finger.
The same principle can be used in other systems where complex functions need to be modified simultaneously or controlled due to changing sample conditions, for example in other experimental or industrial systems in which multiple events occur at once.
Figure 8 shows the application of this principle to a high speed scanned probe microscope, in which a multitouch interface produces a highly intuitive and responsive control environment. This enables nanometre resolution to be maintained whilst scanning the sample over tens of microns, and allows arbitrary paths to be traversed.
In more detail, Figure 8 shows a High Speed Atomic Force Microscope (HSAFM) 200, which is described in the document L. Picco, L. Bozec, A. Ulcinas, D. Engledew, M. Antognozzi, M. Horton, and M. J. Miles, "Breaking the speed limit with atomic force microscopy", Nanotechnology, 18:044030, 2006, and is therefore not described in further detail herein. As described in that prior art document, a fast scan axis is provided by a high-speed flexure sample stage, driven by two piezoceramic actuators operating 180 degrees out of phase with one another. This is mounted on a Dimension 3100 AFM (Veeco, US), which provides the slow scan axis, the pan axes (used to translate the high-speed imaging window across the sample surface) and the feedback loop, which operates on a frame-by-frame timescale and is thus only used to correct for sample slope. Software written in LabVIEW (National Instruments, US) can take direct control of the x and y piezo actuators of the Dimension 3100 and collect the deflection data for the formation of an image of the imaging area 202. Standard MSNL cantilevers (Veeco, US) were used throughout.
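For illustration only, the push-pull drive of the fast-scan flexure stage might be generated as below; the sinusoidal waveform, sample rate and duration are assumptions (the actual control software was written in LabVIEW, as noted above).

```python
import numpy as np

def fast_axis_drive(scan_freq_hz, amplitude_v, sample_rate_hz=500_000,
                    duration_s=0.01):
    """Drive waveforms for the two piezoceramic actuators of the
    fast-scan flexure stage: identical sinusoids 180 degrees out of
    phase, so the actuators work in push-pull opposition."""
    t = np.arange(int(duration_s * sample_rate_hz)) / sample_rate_hz
    drive_a = amplitude_v * np.sin(2 * np.pi * scan_freq_hz * t)
    drive_b = -drive_a   # equivalent to a 180-degree phase shift
    return t, drive_a, drive_b
```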
The image generated by the AFM 200 is passed through the control interface 32 to the display portion of the multi-touch input device 20, which is as shown in Figure 2. The multitouch console of the multi-touch input device 20 obtains the positions of the operator's fingertips via frustrated total internal reflection, as described in more detail above.
These positions, detected by blob detection and tracking algorithms, are passed to the control interface 32, where they are converted into control signals; these control signals are passed to the AFM 200 to control its operation. As multiple blobs can be detected, pre-defined gestures utilising multiple fingers can be used to trigger a variety of control effects.
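A blob tracker of the kind described might associate centroids between successive camera frames by greedy nearest-neighbour matching, so that each fingertip keeps a persistent identity as it moves. The sketch below is an illustration under stated assumptions, not the system's actual algorithm; the distance threshold and data layout are invented for clarity.

```python
import numpy as np

def track_blobs(prev, detected, max_jump=30.0):
    """Greedy nearest-neighbour association of blob centroids between
    consecutive camera frames.

    prev:     dict of touch_id -> (x, y) from the previous frame
    detected: list of (x, y) centroids found in the current frame
    Returns an updated touch_id -> (x, y) dict: matched blobs keep
    their ids, unmatched centroids become new touches, and ids with
    no nearby centroid are dropped (the finger was lifted).
    """
    next_id = max(prev, default=-1) + 1
    tracked, remaining = {}, list(detected)
    for tid, (px, py) in prev.items():
        if not remaining:
            break
        # Closest unclaimed centroid to this touch's last position.
        k, (cx, cy) = min(enumerate(remaining),
                          key=lambda item: np.hypot(item[1][0] - px,
                                                    item[1][1] - py))
        if np.hypot(cx - px, cy - py) <= max_jump:
            tracked[tid] = remaining.pop(k)
    for centroid in remaining:          # brand-new touches
        tracked[next_id] = centroid
        next_id += 1
    return tracked
```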
Figure 9 shows in more detail the form of the HSAFM control panel projected onto the multi-touch display 22. As seen in Figure 9, the display is dominated in this example by two imaging panels. On the left is the scan window 210, which displays the scan area currently being imaged, while on the right is the Max Scan Area 212, which takes the images and tiles them as the scan window is panned across the sample surface. The current image tile 214 remains on top of the image. The panning of the scan window across the sample surface is controlled by the movement of the user's finger across the screen.
In addition, the screen 22 presents various buttons and sliders 216, 218, 220, etc., or other types of soft key, that control the frame rate, the fast and slow scan frequency and amplitude, the signal filtering and other image controls. All of these controls can be altered by the operator in real time, enabling the frame rate, scan size and resolution of each frame to be controlled precisely. All of the buttons and sliders may be responsive to the touch of a single finger on the screen 22.
In addition, the Max Scan Area panel 212 presents the opportunity to use a wider range of gestures.
Although in this example the display has two panels, it is also possible to display more than two windows simultaneously. For example, in the case of an AFM, it is possible to read out multiple data types from a single scan, such as a friction force image (using the lateral deflection of the probe), a height image (from the vertical deflection), and other images relating to other material properties of the sample (such as variations in the dielectric constant of the sample, the Young's modulus, the chemical composition, etc.). Thus, the system can display multiple different representations of the same working space, allowing the user to identify their next course of action from one or more of these.
Figure 9 also illustrates the image scan function. As shown in Figure 9(a), by placing a single finger on the current image tile in the Max Scan Area panel 212, the user is able to "pick up" the scan window. Figures 9(b) and 9(c) show the situation as the user drags his finger across the screen to pan the scan window across the sample surface. Any trajectory can be followed, as the x and y axes are completely integrated. The speed of the pan motion is limited by the tiling rate (largely determined by the frame rate): the faster the user pans, the fewer the tiles laid down along the chosen trajectory. In Figures 9(b) and 9(c), the user has selected the current scan window with a single finger and translated it with respect to the absolute area, generating the arbitrary pattern shown.
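Because one tile is laid down per completed frame, the spacing of tiles along the finger's trajectory is set by the pan speed and the frame rate. The sketch below illustrates this relationship; the trajectory representation and names are assumptions for illustration.

```python
def tile_positions(finger_track, frame_period_s):
    """Where image tiles are laid down while the scan window is panned.

    finger_track: list of (t_seconds, x_um, y_um) samples of the
    finger's trajectory mapped onto sample coordinates.
    One tile is laid per completed frame, so the faster the pan, the
    fewer (and more widely spaced) the tiles along the same path.
    """
    tiles, next_t = [], finger_track[0][0]
    for t, x, y in finger_track:
        if t >= next_t:
            tiles.append((x, y))
            next_t = t + frame_period_s
    return tiles

# e.g. a 10 frames/s microscope panned along a straight line:
track = [(i * 0.02, i * 0.1, 0.0) for i in range(50)]   # 1 s of samples
print(tile_positions(track, frame_period_s=0.1))        # ~10 tiles
```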
Figure 10 shows a zoom function. As shown in Figure 10(a), the user places two fingers within the Max Scan Area panel 212; by moving them towards or away from each other, as shown in Figures 10(b) and 10(c), the operator is able to zoom in and out of the max scan area, allowing large image areas to be built up while still being able to zoom in to see the high resolution information contained within the image. Thus, when the control interface identifies two simultaneous finger touches, changes in the separation of the fingers are used to control the zooming of the image. A third gesture using three fingers allows the user to move the Max Scan Area window around when zoomed in on a specific region. By employing a combination of these gestures the user can generate a tiled image of a large and arbitrarily shaped region of the sample surface.

This approach has a significant advantage compared to simply collecting a conventional AFM image of the same area because, as mentioned previously, a standard AFM is typically restricted to a maximum pixel resolution of 512 × 512 per image regardless of the scan size used. For example, a 20 μm² image taken using the Dimension 3100 operating in its standard mode will have a lateral pixel resolution of approximately 39 nm and take several minutes to collect. In comparison, an image of the same region generated by tiling a sequence of HSAFM scans collected as described above can not only be obtained in a shorter time, but also with a much greater lateral resolution. This is because a typical HSAFM frame will cover an area of 1 μm², will take 100 ms to collect and will have a lateral resolution that is comparable to a conventional image (the exact pixel count per frame will depend on the ratio of the fast scan frequency to the frame rate). Tiling these 1 μm² frames together to create the 20 μm² scan results in a final tiled image with a lateral resolution that is approximately 20 times greater than that of conventional AFM. Using a frame rate of 10 s⁻¹ and a conservative xy overlap between each consecutive frame of 50% implies that the entire 20 μm² area can be imaged by the operator in as little as 80 seconds, without compromising the nanometre lateral resolution available from the individual HSAFM frames.

The benefits of greater pixel density are obvious when imaging DNA or other long, thin samples. In conventional AFM, such a sample cannot be found and mapped in large scan areas because the lateral resolution becomes too poor to identify the sample, leaving the user to hop around the surface at random with a small scan window. The combination of HSAFM and multi-touch input described herein not only maintains a high lateral resolution, but the fast panning across the surface allows the user to follow the contours of the sample, avoiding the need for large scan images filled mostly with blank substrate. The mobility of the scan window of the HSAFM is also particularly useful for tracking the moving interface of a rapidly evolving or dynamic system, such as following phase changes in a polymer crystal during melting or crystallisation. Additional benefits are demonstrated when imaging chromosomes.
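The pinch gesture reduces to scaling the current zoom factor by the ratio of the new to the old finger separation, so spreading the fingers zooms in and pinching them together zooms out. A minimal sketch, with illustrative names and a guard against zero separation, is given below.

```python
import numpy as np

def pinch_zoom(p0, p1, q0, q1, zoom):
    """Update the Max Scan Area zoom from a two-finger pinch.

    (p0, p1) are the fingers' previous positions and (q0, q1) their
    current positions; the zoom changes by the ratio of separations.
    """
    d_prev = np.hypot(p1[0] - p0[0], p1[1] - p0[1])
    d_now = np.hypot(q1[0] - q0[0], q1[1] - q0[1])
    # Avoid division by zero if the two touches coincide momentarily.
    return zoom * d_now / max(d_prev, 1e-9)

# e.g. doubling the finger separation doubles the zoom factor:
print(pinch_zoom((0, 0), (1, 0), (0, 0), (2, 0), zoom=1.0))  # 2.0
```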
As described in L. M. Picco, P. G. Dunton, A. Ulcinas, D. J. Engledew, O. Hoshi, T. Ushiki, and M. J. Miles, "High-speed AFM of human chromosomes in liquid", Nanotechnology, 19(38):384018, 2008, HSAFM is capable of imaging large, fully hydrated chromosomes. This sample represents the typical challenge of biological samples, namely a very soft sample on a hard background, where, for the best results, the two surfaces require different imaging conditions. The combination of HSAFM and multi-touch input described herein first allows the easy identification of the extent of the features of interest. The scan window can then be moved over the feature of interest while at the same time the imaging parameters (scan size, tip-sample interaction force, etc.) are optimised. The HSAFM permits rapid alterations to be made and assessed because of the fast response time afforded by the high frame rate. It is also worth noting that the arrangement of the Current Scan Window panel 210 and the Max Scan Area panel 212 aids the user by providing a detailed and high-resolution view of the sample, while helping to navigate the overall shape of the surface feature and maintain a sense of position relative to other landmarks on the substrate.
So far, the applications and advantages of this development that have been discussed and demonstrated have used the HSAFM primarily as an imaging tool. However, one of the major strengths of AFM is that, as a mechanical microscope, the interaction of the probe with the sample can be used for a wide range of non-imaging techniques. As described in J. A. Vicary and M. J. Miles, "Real-time nanofabrication with high-speed atomic force microscopy", Nanotechnology, 20(9):095302, 2009, the localised tip-induced oxidation of hydrogen-passivated silicon wafers can be performed in parallel with HSAFM, providing the user with a unique level of feedback and control over the dimensions of the oxide layer thus produced. The application of the electric signals used to oxidise the surface can be synchronised and controlled by the HSAFM, and it therefore becomes possible to generate them periodically during high-speed scanning (such that, for example, they occur once per fast scan trace and re-trace). Thus, the growth of the surface oxide can be monitored on a frame-by-frame basis. The development of the integrated HSAFM and multi-touch input system has the potential to increase the versatility of this application even further, because the operator can control all of the oxidation parameters (and hence the dimensions and shape of the oxide layer produced) at the same time as the nanostructure being fabricated is scanned. Other techniques, such as the field-enhanced deposition of metallic nanostructures from a coated cantilever, as described in P. Brogueira and L. V. Melo, "Novel nanosized patterning technology based on electropulsed scanning probe microscopy", Materials Science and Engineering: C, 23:77, 2003, or localised heating of the substrate by a heated tip, as described in H. J. Mamin and D. Rugar, "Thermomechanical writing with an atomic force microscope tip", Applied Physics Letters, 61(8):1003-1005, 1992, could also benefit from the increased quality control and responsiveness that this combination of simultaneous nanostructure fabrication or modification with high-speed visualisation provides.
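As a rough illustration of the synchronisation described above, a tip-bias pulse train locked to the fast scan axis might be generated as follows; the pulse shape, duty cycle and parameter names are assumptions rather than the published drive scheme.

```python
import numpy as np

def oxidation_pulses(t, fast_scan_hz, pulse_v, duty=0.1):
    """Tip bias synchronised to the fast scan axis: one short pulse at
    the start of every trace and every retrace (twice per scan period),
    so the oxide growth can be followed frame by frame."""
    # Phase within each half-period (trace or retrace), in [0, 1).
    half_phase = (t * fast_scan_hz * 2.0) % 1.0
    return np.where(half_phase < duty, pulse_v, 0.0)

# e.g. a 1 kHz fast scan sampled at 1 MHz for 5 ms:
t = np.arange(5000) / 1e6
bias = oxidation_pulses(t, fast_scan_hz=1000.0, pulse_v=8.0)
```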
In conclusion, the multitouch interface has been integrated with various devices, including the HSAFM, to enhance the operator's control of those devices. The combination pushes the representation apparatus towards real-time visualisation and manipulation, and provides the user with an intuitive interface.

Claims

1. A representation system, comprising: a material representation apparatus, for generating a real-time representation of a working space; a multi-touch input device, connected to the material representation apparatus, for displaying the real-time representation of the working space on a display, and for detecting multiple points of contact; and a data processor, connected to the multi-touch input device, and to the material representation apparatus, for receiving inputs representing the multiple points of contact, and for generating control signals for the material representation apparatus in response to changes in the received inputs representing movement of the multiple points of contact.
2. A system as claimed in claim 1, wherein the multi-touch input device is adapted to detect said points of contact on the display.
3. A system as claimed in claim 1 or 2, wherein a touch detection component of the multi-touch input device is superimposed, or appears to a user to be superimposed, on the display.
4. A system as claimed in any of claims 1 to 3, wherein the multi-touch input device is adapted to display soft keys, and to detect points of contact on said soft keys, and wherein the data processor is adapted to generate further control signals in response to detected points of contact on said soft keys.
5. A system as claimed in claim 4, wherein the soft keys comprise buttons or sliders.
6. A system as claimed in any preceding claim, wherein the material representation apparatus comprises a material processing device, and wherein the data processor is adapted to generate control signals for controlling a processing parameter of the material processing device.
7. A system as claimed in claim 6, wherein the material processing device comprises a sample handling device, and wherein the data processor is adapted to generate control signals for controlling a material movement parameter of the material processing device.
8. A system as claimed in claim 6 or 7, wherein, in response to movement of points of contact on the multi-touch input device, the data processor is adapted to generate control signals to cause multiple objects in a sample to be individually controlled and/or to be selectively grouped and for the group to undergo scaling, translation or rotation.
9. A system as claimed in claim 6, 7 or 8, wherein the material processing device comprises an optical trapping device, and wherein the data processor is adapted to generate control signals for controlling positions of optical traps in the optical trapping device.
10. A system as claimed in claim 9, when dependent directly or indirectly on claim 4, wherein, in response to detected points of contact on said soft keys, the data processor is adapted to generate further control signals to control one or more of: a height of a trap or a group of traps; a strength of a trap or a group of traps; a duration of a trap or a group of traps.
11. A system as claimed in claim 9 or 10, wherein, in one mode of operation, the data processor is adapted to generate control signals for creating a trap only when a region of the multi-touch input device is depressed.
12. A system as claimed in any of claims 9 to 11, in which the positions of the traps are displayed on the display.
13. A system as claimed in any of claims 9 to 12, in which the traps are controlled by dynamically generated holograms.
14. A system as claimed in claim 6, 7 or 8, wherein the material processing device comprises a scanning force microscope, and wherein the data processor is adapted to generate control signals for controlling an interaction between a probe of the scanning force microscope and a sample, for the purposes of structure modification or generation via one or more of an electrochemical, optical, thermal, or indentation force parameter.
15. A system as claimed in any preceding claim, wherein the material representation apparatus comprises an imaging device, and wherein the data processor is adapted to generate control signals for controlling an imaging parameter of the imaging device.
16. A system as claimed in claim 15, wherein the data processor is adapted to generate control signals for the imaging device in response to changes in the received inputs representing movement of at least one point of contact, in order to control a position and/or size of a scan region of the imaging device.
17. A system as claimed in claim 15 or 16, wherein the imaging device comprises a scanning force microscope, and wherein the data processor is adapted to generate control signals for controlling a scanning parameter of the scanning force microscope.
18. A system as claimed in claim 17, wherein the data processor is adapted to: sequentially collect a plurality of data images generated based on respective positions of a point of contact on the multi-touch input device; and stitch together said plurality of data images so as to reflect their relative physical locations to form a composite digital image.
19. A system as claimed in claim 17 or 18, wherein the scanning force microscope comprises an atomic force microscope, a scanning tunnelling microscope, or a transverse dynamic force microscope etc.
20. A system as claimed in claim 15 or 16, wherein the imaging device comprises a radiation imaging microscope, and wherein the data processor is adapted to generate control signals for controlling a parameter of the radiation imaging microscope.
21. A system as claimed in any of claims 15 to 20, wherein the multi-touch input device comprises: a first display portion, configured to display a real-time representation of a part of the working space, and a second display portion, configured to display said real-time representation on a smaller scale at a location indicating a position of said part within the working space.
22. A system as claimed in claim 21, wherein the data processor is adapted to receive an input representing one point of contact, and to generate control signals for moving said part within the working space in response to changes in the received input representing movement of the point of contact.
23. A system as claimed in claim 21 or 22, wherein the data processor is adapted to receive inputs representing two points of contact, and to generate control signals for altering a size of said part within the working space in response to changes in a separation of the two points of contact.
24. A data processor, for use in a representation system, wherein the representation system comprises: a material representation apparatus, for generating a real-time representation of a working space; and a multi-touch input device, connected to the material representation apparatus, for displaying the real-time representation of the working space on a display, and for detecting multiple points of contact; wherein the data processor is adapted to be connected to the multi-touch input device, and to the material representation apparatus, for receiving inputs representing the multiple points of contact, and for generating control signals for the material representation apparatus in response to changes in the received inputs representing movement of the multiple points of contact.
25. A data processor adapted to interface a multi-touch input device with an apparatus where parameters of the apparatus or objects acted on by the apparatus can be represented on a display associated with or part of the multi-touch input device, wherein the data processor is adapted to allow multiple parameters or objects to be individually controlled and/or to be selectively grouped and for the group to undergo scaling, translation or rotation in response to movement of points of contact on the multi-touch input device, and wherein the data processor generates commands for the apparatus.
26. A method of controlling a representation system, wherein the representation system comprises: a material representation apparatus, for generating a real-time representation of a working space; and a multi-touch input device, connected to the material representation apparatus, for displaying the real-time representation of the working space on a display, and for detecting multiple points of contact; wherein the method comprises receiving inputs representing the multiple points of contact, and generating control signals for the material representation apparatus in response to changes in the received inputs representing movement of the multiple points of contact.
27. A method as claimed in claim 26, wherein the material representation apparatus comprises a material processing device, and wherein the method comprises generating control signals for controlling a processing parameter of the material processing device.
28. A method as claimed in claim 26, wherein the material representation apparatus comprises an imaging device, and wherein the method comprises generating control signals for controlling an imaging parameter of the imaging device.
29. A computer program product configured to implement the method of claim 26, 27 or 28.
EP09749183A 2008-10-30 2009-10-30 Representation system Ceased EP2350798A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GB0819931A GB0819931D0 (en) 2008-10-30 2008-10-30 Interface for optical tweezers
GB0916176A GB0916176D0 (en) 2009-09-15 2009-09-15 High-speed AFM user interface
PCT/GB2009/051463 WO2010049738A1 (en) 2008-10-30 2009-10-30 Representation system

Publications (1)

Publication Number Publication Date
EP2350798A1 true EP2350798A1 (en) 2011-08-03

Family

ID=41429470

Family Applications (1)

Application Number Title Priority Date Filing Date
EP09749183A Ceased EP2350798A1 (en) 2008-10-30 2009-10-30 Representation system

Country Status (2)

Country Link
EP (1) EP2350798A1 (en)
WO (1) WO2010049738A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007143736A2 (en) * 2006-06-07 2007-12-13 Fei Company Compact scanning electron microscope

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080122799A1 (en) * 2001-02-22 2008-05-29 Pryor Timothy R Human interfaces for vehicles, homes, and other applications
US7724242B2 (en) * 2004-08-06 2010-05-25 Touchtable, Inc. Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
US8441467B2 (en) * 2006-08-03 2013-05-14 Perceptive Pixel Inc. Multi-touch sensing display through frustrated total internal reflection

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007143736A2 (en) * 2006-06-07 2007-12-13 Fei Company Compact scanning electron microscope

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LEACH J ET AL: "INTERACTIVE APPROACH TO OPTICAL TWEEZERS CONTROL", APPLIED OPTICS, OPTICAL SOCIETY OF AMERICA, WASHINGTON, DC; US, vol. 45, no. 5, 10 February 2006 (2006-02-10), pages 897 - 903, XP001239181, ISSN: 0003-6935, DOI: 10.1364/AO.45.000897 *
See also references of WO2010049738A1 *

Also Published As

Publication number Publication date
WO2010049738A1 (en) 2010-05-06

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20110523

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: CARL ZEISS MICROSCOPY GMBH

17Q First examination report despatched

Effective date: 20150713

APBK Appeal reference recorded

Free format text: ORIGINAL CODE: EPIDOSNREFNE

APBN Date of receipt of notice of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA2E

APBR Date of receipt of statement of grounds of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA3E

APAF Appeal reference modified

Free format text: ORIGINAL CODE: EPIDOSCREFNE

APBT Appeal procedure closed

Free format text: ORIGINAL CODE: EPIDOSNNOA9E

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20201123