EP2350798A1 - Representation system - Google Patents
Representation system
Info
- Publication number
- EP2350798A1 (application EP09749183A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- representation
- contact
- data processor
- control signals
- touch input
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- This invention relates to a representation system, and more particularly to a system including a material representation apparatus, which can be controlled from an input device.
- Various forms of material representation apparatus are known, including apparatus, such as microscopes, that simply provide an image of a sample located in a working space, and also apparatus that provide an image of a sample but also have a material processing function.
- The control can take the form of variation of the field of view, or of the focussing of the imaging system, for example.
- The control can take any suitable form, depending on the available processing functionality.
- One known form of material representation apparatus that also provides a material processing function is an optical tweezers apparatus, such apparatus having evolved into a powerful micro-manipulation tool in the fields of physics, chemistry, biology and materials science. Many such apparatus have additional optics to enable the control of numerous optical traps at once.
- Such additions include the use of spatial light modulators for holographic optical tweezers, for example as in M. Reicherter, T. Haist, E. U. Wagemann, and H. J. Tiziani, "Optical particle trapping with computer-generated holograms written on a liquid-crystal display," Opt. Lett. 24(9), 608-610 (1999); or GPC-based systems, as in P. J. Rodrigo, L.
- Trap locations are controlled either by positioning a digital marker over a live video feed of the sample or by modifying simple coordinate values. While simple to develop and implement, these "point-and-click" interfaces may not be sufficiently user-friendly for use by non-specialists, and seldom allow the control of more than one trap at a time.
- A conventional AFM typically produces an image of an area of a few square micrometres in two to three minutes. Imaging times increase for larger scan areas, for increased image resolution, and for samples with larger height variations and/or lower stiffness.
- This limitation arises from the fact that the AFM is a mechanical microscope based on a serial process.
- A secondary effect of this limited imaging rate is that, unlike, for example, optical microscopy or electron microscopy (SEM or TEM), there is no low-magnification mode in AFM. Instead, the sample must be mapped sequentially with a series of images, each limited to a maximum of about 100 μm².
- High-speed AFM (HSAFM) is a relatively new development that enables the collection of images on timescales of less than 1 second.
- See, for example, "Components for high speed atomic force microscopy", Ultramicroscopy, 106:881, 2006, and T. Ando, N. Kodera, E. Takai, D. Maruyama, K. Saito, and A.
- The high frame rate means that the effects of any adjustments made to the system (the tip-sample interaction force, scan size, location, etc.) are immediately presented to the operator.
- The HSAFM gives the user the freedom to explore large areas with nanometre resolution, making the search for key features of interest a simple procedure.
- The responsiveness of the HSAFM and the ability to freely scan large regions of the sample surface make it more comparable in operation to an optical microscope than to a conventional AFM.
- Multi-touch interfaces, such as those demonstrated at the 18th Annual ACM Symposium on User Interface Software and Technology in "Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection" (ACM, Seattle, Washington, USA, 2005), provide the user with the ability to interact with a computer in multiple locations simultaneously. In this manner, several digital objects can be controlled independently. Applications of this technology include computer games, digital image manipulation and musical instruments.
- According to a first aspect of the invention, there is provided a representation system comprising: a material representation apparatus, for generating a real-time representation of a working space; a multi-touch input device, connected to the material representation apparatus, for displaying the real-time representation of the working space on a display, and for detecting multiple points of contact; and a data processor, connected to the multi-touch input device, and to the material representation apparatus, for receiving inputs representing the multiple points of contact, and for generating control signals for the material representation apparatus in response to changes in the received inputs representing movement of the multiple points of contact.
- According to a second aspect, there is provided a data processor for use in a representation system, wherein the representation system comprises: a material representation apparatus, for generating a real-time representation of a working space; and a multi-touch input device, connected to the material representation apparatus, for displaying the real-time representation of the working space on a display, and for detecting multiple points of contact; wherein the data processor is adapted to be connected to the multi-touch input device, and to the material representation apparatus, for receiving inputs representing the multiple points of contact, and for generating control signals for the material representation apparatus in response to changes in the received inputs representing movement of the multiple points of contact.
- According to a further aspect, there is provided a method of controlling a representation system, wherein the representation system comprises: a material representation apparatus, for generating a real-time representation of a working space; and a multi-touch input device, connected to the material representation apparatus, for displaying the real-time representation of the working space on a display, and for detecting multiple points of contact; wherein the method comprises receiving inputs representing the multiple points of contact, and generating control signals for the material representation apparatus in response to changes in the received inputs representing movement of the multiple points of contact.
- The new interface gives even an untrained user a high degree of control over a material representation apparatus.
- Figure 1 is a block diagram, illustrating a representation system in accordance with an embodiment of the invention.
- Figure 2 is a block diagram, illustrating in more detail a holographic optical tweezers system in accordance with an embodiment of the invention.
- Figure 3 illustrates in more detail a part of the touch screen in the system of Figures 1 and 2.
- Figure 4 shows the touch screen in the system of Figure 2, in use.
- Figure 5 illustrates a first use of the system of Figure 2.
- Figure 6 illustrates a second use of the system of Figure 2.
- Figure 7 illustrates a third use of the system of Figure 2.
- Figure 8 is a block diagram, illustrating in more detail an atomic force microscope in accordance with an embodiment of the invention.
- Figure 9 illustrates a first use of the system of Figure 8.
- Figure 10 illustrates a second use of the system of Figure 8.
- Figure 1 is a block schematic diagram, showing a representation system, including a material representation apparatus 10.
- The material representation apparatus 10 includes an imaging device 12, which can in principle be any form of microscope using visible light or other forms of radiation, or generating a representation of a sample placed in a working space 16 by any other method.
- In some embodiments, the material representation apparatus 10 also includes a material processing device 14, for providing some form of material processing function on a sample located in the working space, but, in other embodiments, the material representation apparatus 10 simply provides an image of a sample located in the working space 16.
- The imaging device 12 and the material processing device 14 may be elements of a single device.
- The representation system also includes a multi-touch input device 20, having a display 22 on which an image produced by the imaging device 12 is generated.
- The multi-touch input device 20 also includes a touch screen 24, which may advantageously overlie the display 22, although it could in principle be provided separately therefrom.
- The multi-touch input device 20 further includes a touch discrimination unit 26, for identifying the locations on the touch screen 24 at which a user is touching the screen.
- The representation system also includes a data processor 30, including a control interface 32, for receiving signals from the touch discrimination unit 26 and for generating suitable control signals for the imaging device 12 and/or the material processing device 14, as appropriate.
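By way of illustration only, the following minimal sketch shows the role the data processor 30 plays in this architecture: it receives inputs representing the points of contact, detects changes representing their movement, and generates control signals in response. The wire format, port number and output handling are assumptions made for the sketch, not details taken from the patent.

```python
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5005))       # hypothetical port fed by the touch discrimination unit 26

last: dict[int, tuple[float, float]] = {}

while True:
    msg = sock.recv(1024).decode()  # assumed format "id:x,y;id:x,y", e.g. "3:120.5,88.0;4:301.2,90.4"
    contacts = {int(cid): tuple(map(float, xy.split(",")))
                for cid, xy in (c.split(":") for c in msg.split(";") if c)}
    for cid, pos in contacts.items():
        if cid in last and pos != last[cid]:
            dx, dy = pos[0] - last[cid][0], pos[1] - last[cid][1]
            # here the control interface 32 would translate (dx, dy) into
            # control signals for the imaging device 12 and/or the material
            # processing device 14
            print(f"contact {cid} moved by ({dx:+.1f}, {dy:+.1f})")
    last = contacts
```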
- Figure 2 shows in slightly more detail a representation system in which the material representation apparatus is in the form of an optical tweezers 40.
- The multi-touch input device 20 uses frustrated total internal reflection (FTIR), as described by Han in "Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection", 18th Annual ACM Symposium on User Interface Software and Technology (ACM, Seattle, Washington, USA, 2005), in order to track the positions of each of the user's fingertips.
- The console is physically made from a sheet 50 of PMMA (poly(methyl methacrylate)), measuring 100 cm x 80 cm x 1 cm, with countersunk holes 52 for LEDs 54 drilled into the edge of the sheet at 2.5 cm intervals.
- The 880 nm LEDs 54 (SFH486, Osram) are used to illuminate the console uniformly, effectively making it a waveguide with minimal losses arising from scattering.
- An overlay of drafting paper 56, which acts as a diffuser for a data projector 58 (MP771, BenQ), is coated with lightly textured silicone rubber 60 to facilitate the coupling between the user's fingers 62 and the waveguide 50.
- The holographic optical tweezers device 40 is designed around a commercially available inverted microscope 72 (Axiovert 200, Zeiss), a 1.3 NA 100× Plan-Neofluar objective 74 (Zeiss), and a motorised xyz translation stage 76 (MS-2000, ASI).
- The optical traps are powered by a titanium-sapphire laser 78 (Coherent 899), producing 4 W at 800 nm, which is pumped by a solid-state laser (Verdi-V18, Coherent).
- The wavefront is shaped by an optically-addressed spatial light modulator (SLM) 80 (X8267-14DB, Hamamatsu), controlled by a computer using either the CPU or a dedicated graphics card (GPU) 82 for calculating holograms 84, as shown in M. Reicherter, S. Zwick, T. Haist, C. Kohler, H. Tiziani, and W. Osten, "Fast digital hologram generation and adaptive force measurement in liquid-crystal-display based holographic tweezers," Appl. Opt. 45(5), 888-896 (2006).
- A live video feed is provided by a camera 86, such as a CMOS camera (EC1280, Prosilica) or a high-speed camera capable of position tracking at rates of up to 150 kHz (microCam-640, Durham Smart Imaging), as described in G. M. Gibson, J. Leach, S. Keen, A. J. Wright, and M. J. Padgett, "Measuring the accuracy of particle position and force in optical tweezers using high-speed video microscopy," Opt. Express 16(19), 14,561-14,570 (2008).
- The optical tweezers 40 is described in more detail in G. Gibson, D. M. Carberry, G. Whyte, J. Leach, J. Courtial, J. C. Jackson, D. Robert, M. Miles, and M. Padgett, "Holographic assembly workstation for optical manipulation," Journal of Optics A: Pure and Applied Optics 10(4), 044009 (2008).
- The multitouch software 26 and the interface software 32 operate on respective different computers 88, 90.
- This modular design enables other interfaces (joystick, fiducial marker tracking, or future interfaces) to be swapped into the system and to provide optical trap locations with minimal configuration changes.
- A further benefit is that the processing speed of each individual system is increased.
- The hologram calculation software is designed to run on either a dedicated graphics card 82 or a third computer. Data is passed between the computers over the local network using UDP datagrams.
- Figure 2 shows how the hardware and software combine to form the entire system.
- The multitouch software is designed to detect the position of each user-generated event.
- A typical image 92 captured by the multitouch CCD camera 64 is shown in Figure 2. It shows a user pressing each finger of one hand onto the console, and the light spots 94 show the positions of the user's fingertips.
- The software subtracts the background signal, applies a threshold, converts the image to pure black and white, and calculates the centre of each spot.
- The coordinates of every spot generated by the user's actions are passed to the interface software 32, in XY format, via UDP.
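The spot-finding pipeline just described (background subtraction, thresholding, binarisation, centroid calculation, then UDP transmission in XY format) can be sketched as follows. This is a minimal illustration using OpenCV rather than the Touchlib or LabVIEW implementations mentioned below; the threshold, minimum blob area and destination address are assumed values.

```python
import socket

import cv2
import numpy as np

UDP_ADDR = ("192.168.0.2", 5005)    # hypothetical address of the interface computer 90
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def fingertip_centres(frame: np.ndarray, background: np.ndarray,
                      threshold: int = 40, min_area: int = 20):
    """Subtract the background, threshold to pure black and white, and
    return the centre of each remaining bright spot (one per fingertip).
    frame and background are assumed to be 8-bit greyscale images."""
    diff = cv2.subtract(frame, background)              # remove the static background signal
    _, bw = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(bw)
    return [tuple(centroids[i]) for i in range(1, n)    # label 0 is the background
            if stats[i, cv2.CC_STAT_AREA] >= min_area]

def send_centres(centres):
    """Pass every spot centre to the interface software in XY format via UDP."""
    payload = ";".join(f"{x:.1f},{y:.1f}" for x, y in centres)
    sock.sendto(payload.encode(), UDP_ADDR)
```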
- The multitouch software can be implemented both using the open-source library Touchlib (Nuigroup, "TouchLib A Multitouch Development Kit," 2008, available at http://nuigroup.com/touchlib/) and using LabVIEW's blob detection and tracking algorithms (National Instruments).
- The interface software 32 determines what each press or movement on the multitouch console will do. To ensure that the user is aware of what each console event does, the interface software's display is projected onto the multitouch console, providing immediate feedback to the user.
- Figure 4 shows the interface window, having two main sections, namely a manipulation area 122, which is a projection of a live image from the microscope with a graphics overlay; and a service area 124, which houses all of the control buttons and sliders.
- Optical traps are represented by circles 126 in the manipulation area 122.
- The system has been configured so that a temporary trap is formed when the user presses a finger onto the console in the manipulation area 122. These traps operate only while pressure is applied to the console; removing one's fingers destroys the traps.
- If the user wishes to maintain the currently selected optical traps without touching the console, they simply need to press the Group button 128 or the Persistent button 130.
- The user can then remove their hands and generate other optical traps, select other previously generated traps, or perform one of the other manipulations available in the service area 124 (such as changing the z-height or intensity of the traps by means of the sliders 132, 134 respectively). Releasing a persistent or grouped trap location simply requires pressing the designated trap and then pressing the Persistent button 130 or Group button 128 again, as appropriate.
- The image of the traps generated by the user is superimposed on the live, or real-time, display of the image from the microscope, which, in this case, shows a part of the physical structure 136.
- Groups of traps can be manipulated as single entities and transformed using rotations, translations and scaling in 3D.
- A group is represented by a colour unique to its optical traps, and its centre is represented by a coloured square.
- An entire group can be translated by pressing on a square and moving it.
- Scale and rotate operations are performed by selecting the group centre and a single trap marker. The exact function is determined by the change in the (r, θ, φ) coordinate of the selected trap relative to the group centre.
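As a sketch of how such a decision rule might look (the function and its return convention are illustrative, not taken from the patent), the selected trap's position can be expressed in spherical coordinates relative to the group centre; a change in r then reads as a scaling, and changes in θ or φ as rotations:

```python
import math

def spherical(dx: float, dy: float, dz: float):
    """Cartesian offset from the group centre -> (r, theta, phi)."""
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    theta = math.acos(dz / r) if r else 0.0     # polar angle
    phi = math.atan2(dy, dx)                    # azimuthal angle
    return r, theta, phi

def interpret_drag(centre, trap_old, trap_new):
    """Return (scale_factor, d_theta, d_phi) for a drag of a single trap
    marker while the group centre is also selected."""
    r0, t0, p0 = spherical(*(a - b for a, b in zip(trap_old, centre)))
    r1, t1, p1 = spherical(*(a - b for a, b in zip(trap_new, centre)))
    return (r1 / r0 if r0 else 1.0), t1 - t0, p1 - p0
```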
- Each trap within the group can still be manipulated individually by selecting only that one trap.
- The display 22 displays the position of each optical trap, and the movement of the user's fingertips on the touch screen 24 is converted into desired movements of those positions.
- The desired position of each optical trap is then passed to the hologram calculation processor 82.
- Holograms are calculated using either a modified Gerchberg–Saxton algorithm or simple "Gratings and Lenses" algorithms, as known from prior art documents such as J. Liesener, M. Reicherter, T. Haist, and H. J. Tiziani, "Multifunctional optical tweezers using computer-generated holograms," Optics Communications 185(1-3), 77-82 (2000); E. R. Dufresne, G. C. Spalding, M. T. Dearing, S. A. Sheets, and D.
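As an indicative sketch of the "Gratings and Lenses" approach (the wavelength, effective focal length and SLM geometry below are assumed values, not parameters from the patent), each trap at (x, y, z) contributes a blazed grating for its lateral displacement plus a Fresnel-lens term for its axial displacement, and the displayed phase is the argument of the complex sum:

```python
import numpy as np

def gratings_and_lenses(traps, n=512, pitch=20e-6, wavelength=800e-9, f=1.6e-3):
    """traps: iterable of (x, y, z) trap positions in metres in the sample
    plane. Returns an n x n phase hologram in [0, 2*pi)."""
    u = (np.arange(n) - n / 2) * pitch          # SLM pixel coordinates
    U, V = np.meshgrid(u, u)
    field = np.zeros((n, n), dtype=complex)
    for x, y, z in traps:
        grating = 2 * np.pi / (wavelength * f) * (x * U + y * V)   # lateral shift
        lens = np.pi * z / (wavelength * f**2) * (U**2 + V**2)     # axial shift
        field += np.exp(1j * (grating + lens))
    return np.mod(np.angle(field), 2 * np.pi)
```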
- Figure 5 shows how simple this two-sphere problem becomes with the multitouch console.
- The user touches the console inside the manipulation area 122, generating optical traps at each of the user's finger locations 140, 142 and trapping particles.
- The updated particle positions 144, 146 reflect the new coordinates of the user's finger locations, as shown in Figure 5(b).
- The optical traps are then destroyed when the user removes his fingers.
- Figure 6 shows another feature of the multitouch console.
- In Figure 6(a), three microspheres are trapped by three traps at locations 150, 152, 154, as described with reference to Figure 5; in Figure 6(b), the traps have been made permanent and formed into a group 156, whose position is represented by a square 158.
- This group 156 is then transformed by translating in Figure 6(c), by moving a finger from a starting position of the square 158 to a desired new position, by rotating in Figure 6(d) and by scaling in Figure 6(e) using two fingers. These operations can be performed sequentially, or simultaneously. Any number of such groups can be manipulated at the same time.
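A minimal sketch of such a group update (illustrative only; the patent does not specify the implementation) applies the translation, rotation and scaling about the group centre to every trap in one step, which is why the operations can be performed sequentially or simultaneously:

```python
import numpy as np

def transform_group(traps, centre, new_centre, angle=0.0, scale=1.0):
    """traps: (N, 3) array of trap coordinates. The rotation here is about
    the optical (z) axis through the group centre and the scaling is
    isotropic; both are applied together with the translation."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    rel = (np.asarray(traps) - centre) @ R.T * scale   # rotate and scale about the centre
    return rel + new_centre                            # then translate with the centre
```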
- Figure 7 illustrates the use of the system in trapping non-spherical objects.
- Figure 7(a) shows traps 160, 162 being formed at each end of a 300 nm diameter, 12 μm long cadmium sulphide (CdS) rod 164, trapping it perpendicular to the optical axis.
- Figure 7(b) shows the traps 160, 162 being moved by the user to manipulate the rod 164.
- Figures 2 to 7 show a multitouch interface system for the interactive real-time control of a holographic optical tweezers system, allowing users to control optical traps and perform complex operations with just the touch of a finger.
- Figure 8 shows the application of this principle to a high speed scanned probe microscope, in which a multitouch interface produces a highly intuitive and responsive control environment. This enables nanometre resolution to be maintained whilst scanning the sample over tens of microns, and allows arbitrary paths to be traversed.
- Figure 8 shows a High Speed Atomic Force Microscope (HSAFM) 200, which is described in the document L. Picco, L. Bozec, A. Ulcinas, D. Engledew, M. Antognozzi, M. Horton, and M. J. Miles, "Breaking the speed limit with atomic force microscopy", Nanotechnology, 18:044030, 2006, and is therefore not described in further detail herein.
- The image generated by the AFM 200 is passed through the control interface 32 to the display portion of the multi-touch input device 20, which is as shown in Figure 2.
- The multitouch console of the multi-touch input device 20 obtains the positions of the operator's fingertips via frustrated total internal reflection, as described in more detail above.
- These fingertip positions are passed to the control interface 32, where they are converted into control signals, and these control signals are passed to the AFM 200 to control the operation thereof.
- Pre-defined gestures utilising multiple fingers can be used to trigger a variety of control effects.
- Figure 9 shows in more detail the form of the HSAFM control panel projected onto the multi-touch display 22.
- The display is dominated in this example by two imaging panels.
- On the left is the scan window 210, which displays the scan area currently being imaged, while on the right is the Max Scan Area 212, which takes the images and tiles them as the scan window is panned across the sample surface.
- The current image tile 214 remains on top of the image.
- The panning of the scan window across the sample surface is controlled by the movement of the user's finger across the screen.
- The display also includes buttons and sliders 216, 218, 220, etc., or other types of soft key, that control the frame rate, the fast and slow scan frequency and amplitude, the signal filtering, and other image controls. All of these controls can be altered by the operator in real time, enabling the operator to control the frame rate, scan size and resolution of each frame precisely. All of the buttons and sliders may be responsive to the touch of a single finger on the screen 22.
- The Max Scan Area panel 212 presents the opportunity to use a wider range of gestures.
- Although the display has two panels in this example, it is also possible to display more than two windows simultaneously.
- In an AFM, it is possible to read out multiple data types from a single scan, such as a friction force image (using the lateral deflection of the probe), a height image (from the vertical deflection), and other images relating to other material properties of the sample (such as variations in the dielectric constant of the sample, the Young's modulus, the chemical composition, etc.).
- Thus, the system can display multiple different representations of the same working space, allowing the user to identify their next course of action from one or more of these.
- Figure 9 illustrates the image scan function.
- As shown in Figure 9(a), by placing a single finger on the current image tile in the Max Scan Area panel 212, the user is able to "pick up" the scan window.
- Figures 9(b) and 9(c) show the situation as the user drags his finger across the screen to pan the scan window across the sample surface. Any trajectory can be followed, as the x and y axes are completely integrated. The speed of the pan motion is limited by the tiling rate (largely determined by the frame rate). Thus, the faster the user pans, the fewer the tiles laid down along the chosen trajectory.
- In Figures 9(b) and 9(c), the user has selected the current scan window with a single finger and translated it with respect to the absolute area, generating the arbitrary pattern shown.
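The tiling behaviour can be pictured with the following sketch (the canvas size and nanometre-per-pixel scale are assumptions): each completed frame is pasted into a large canvas at the position the scan window occupied when the frame was captured, so a faster pan simply lays down fewer tiles along the trajectory, and the newest tile overwrites older data and so stays on top.

```python
import numpy as np

CANVAS_NM = 50_000                 # assumed width/height of the Max Scan Area, in nm
CANVAS_PX = 2048                   # assumed canvas resolution
NM_PER_PX = CANVAS_NM / CANVAS_PX

canvas = np.zeros((CANVAS_PX, CANVAS_PX), dtype=np.float32)

def lay_tile(canvas: np.ndarray, frame: np.ndarray, origin_nm: tuple[float, float]):
    """Blit one HSAFM frame (assumed already resampled to the canvas scale,
    and assumed to lie inside the canvas) into the Max Scan Area at the
    scan window's current origin."""
    c0 = int(origin_nm[0] / NM_PER_PX)
    r0 = int(origin_nm[1] / NM_PER_PX)
    h, w = frame.shape
    canvas[r0:r0 + h, c0:c0 + w] = frame   # newest tile overwrites older ones
    return canvas
```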
- Figure 10 shows a zoom function.
- When the user places two fingers within the Max Scan Area panel 212 and moves them towards or away from each other, as shown in Figures 10(b) and 10(c), the operator is able to zoom in and out of the max scan area, allowing large image areas to be built up while still being able to zoom in to see the high-resolution information contained within the image.
- That is, when the control interface identifies two simultaneous finger touches, changes in the separation of the fingers are used to control the zooming of the image.
- A third gesture, using three fingers, allows the user to move the Max Scan Area window around when zoomed in on a specific region.
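Taken together, the one-, two- and three-finger behaviours described above amount to a simple gesture dispatcher, sketched below (the handler names and the view object are illustrative assumptions, not part of the patent):

```python
import math

def _separation(pts):
    return math.hypot(pts[0][0] - pts[1][0], pts[0][1] - pts[1][1])

def dispatch(prev, touches, view):
    """prev/touches: lists of (x, y) fingertip positions on successive
    frames; view: object exposing the three hypothetical handlers used here."""
    if len(touches) != len(prev):
        return                                  # gesture is changing; wait for a stable frame
    if len(touches) == 1:
        dx, dy = touches[0][0] - prev[0][0], touches[0][1] - prev[0][1]
        view.pan_scan_window(dx, dy)            # one finger pans the scan window
    elif len(touches) == 2:
        view.zoom(_separation(touches) / _separation(prev))   # pinch zooms the Max Scan Area
    elif len(touches) == 3:
        dx = sum(t[0] - p[0] for t, p in zip(touches, prev)) / 3
        dy = sum(t[1] - p[1] for t, p in zip(touches, prev)) / 3
        view.move_max_scan_area(dx, dy)         # three fingers move the zoomed view
```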
- In this way, the user can generate a tiled image of a large and arbitrarily shaped region of the sample surface.
- This approach has a significant advantage over simply collecting a conventional AFM image of the same area because, as mentioned previously, a standard AFM is typically restricted to a maximum pixel resolution of 512 × 512 per image regardless of the scan size used. For example, a 20 μm × 20 μm image taken using the Dimension 3100 operating in its standard mode will have a lateral pixel resolution of approximately 39 nm and take several minutes to collect.
- An image of the same region generated by tiling a sequence of HSAFM scans collected as described above can not only be obtained in a shorter time, but also with a much greater lateral resolution.
- The HSAFM is capable of imaging large, fully hydrated chromosomes. This sample represents the typical challenge of biological samples, namely a very soft sample on a hard background, where, for the best results, the two surfaces require different imaging conditions.
- The combination of HSAFM and multi-touch input described herein first allows the easy identification of the extent of the features of interest.
- The scan window can then be moved over the feature of interest while, at the same time, the imaging parameters (scan size, tip-sample interaction force, etc.) are optimised.
- The HSAFM permits rapid alterations to be made and assessed because of the fast response time resulting from the high frame rate. It is also worth noting that the arrangement of the Current Scan Window panel 210 and the Max Scan Area panel 212 aids the user by providing a detailed, high-resolution view of the sample, while helping to navigate the overall shape of the surface feature and maintain a sense of position relative to other landmarks on the substrate.
- The description above treats the HSAFM primarily as an imaging tool.
- In an AFM, however, the interaction of the probe with the sample can also be used for a wide range of non-imaging-based techniques.
- For example, the localised tip-induced oxidation of hydrogen-passivated silicon wafers can be performed in parallel with HSAFM imaging, providing the user with a unique level of feedback and control over the dimensions of the oxide layer thus produced.
- The application of the electric signals used to oxidise the surface can be synchronised and controlled by the HSAFM, and it therefore becomes possible to generate them periodically during high-speed scanning (such that, for example, they occur once per fast-scan trace and re-trace).
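Purely as an illustration of the timing involved (all values below are assumptions, not parameters from the patent), generating one oxidation pulse per fast-scan trace and one per re-trace means pulsing at twice the fast-scan frequency:

```python
import numpy as np

fast_scan_hz = 2_000.0        # assumed fast-scan frequency
sample_rate = 1_000_000       # assumed waveform output rate, samples/s
pulse_s = 5e-6                # assumed oxidation pulse width
frame_s = 0.1                 # assumed frame period (10 frames per second)

t = np.arange(int(frame_s * sample_rate)) / sample_rate
phase = (t * 2 * fast_scan_hz) % 1.0                   # one cycle per trace or re-trace
pulses = (phase < pulse_s * 2 * fast_scan_hz) * 1.0    # unit-amplitude pulse train
```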
- The growth of the surface oxide can thus be monitored on a frame-by-frame basis.
- The development of the integrated HSAFM and multi-touch input system has the potential to increase the versatility of this application even further, because the operator can control all of the oxidation parameters (and hence the dimensions and shape of the oxide layer produced) at the same time as the nanostructure being fabricated is scanned.
- Other techniques, such as the field-enhanced deposition of metallic nanostructures from a coated cantilever as described in P. Brogueira and L. V., can be integrated in the same way.
- The multitouch interface has been integrated with various devices, including the HSAFM, to enhance the operator's control of those devices.
- The combination pushes the representation apparatus towards real-time visualisation and manipulation, and provides the user with an intuitive interface.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Microscopes, Condenser (AREA)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0819931A GB0819931D0 (en) | 2008-10-30 | 2008-10-30 | Interface for optical tweezers |
GB0916176A GB0916176D0 (en) | 2009-09-15 | 2009-09-15 | High-speed AFM user interface |
PCT/GB2009/051463 WO2010049738A1 (en) | 2008-10-30 | 2009-10-30 | Representation system |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2350798A1 (en) | 2011-08-03 |
Family
ID=41429470
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP09749183A Ceased EP2350798A1 (en) | 2008-10-30 | 2009-10-30 | Representation system |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP2350798A1 (en) |
WO (1) | WO2010049738A1 (en) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080122799A1 (en) * | 2001-02-22 | 2008-05-29 | Pryor Timothy R | Human interfaces for vehicles, homes, and other applications |
US7724242B2 (en) * | 2004-08-06 | 2010-05-25 | Touchtable, Inc. | Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter |
US8441467B2 (en) * | 2006-08-03 | 2013-05-14 | Perceptive Pixel Inc. | Multi-touch sensing display through frustrated total internal reflection |
2009
- 2009-10-30 EP EP09749183A patent/EP2350798A1/en not_active Ceased
- 2009-10-30 WO PCT/GB2009/051463 patent/WO2010049738A1/en active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007143736A2 (en) * | 2006-06-07 | 2007-12-13 | Fei Company | Compact scanning electron microscope |
Non-Patent Citations (2)
Title |
---|
LEACH J ET AL: "INTERACTIVE APPROACH TO OPTICAL TWEEZERS CONTROL", APPLIED OPTICS, OPTICAL SOCIETY OF AMERICA, WASHINGTON, DC; US, vol. 45, no. 5, 10 February 2006 (2006-02-10), pages 897 - 903, XP001239181, ISSN: 0003-6935, DOI: 10.1364/AO.45.000897 * |
See also references of WO2010049738A1 * |
Also Published As
Publication number | Publication date |
---|---|
WO2010049738A1 (en) | 2010-05-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11048333B2 (en) | System and method for close-range movement tracking | |
Taylor et al. | The nanomanipulator: A virtual-reality interface for a scanning tunneling microscope | |
JP6050935B2 (en) | MICROSCOPE WITH TOUCH SCREEN, CONTROL DEVICE FOR CONTROLLING MICROSCOPE, AND METHOD FOR OPERATING MICROSCOPE | |
Malik et al. | Visual touchpad: a two-handed gestural input device | |
JP6116064B2 (en) | Gesture reference control system for vehicle interface | |
Ferreira et al. | Virtual reality and haptics for nanorobotics | |
US9910498B2 (en) | System and method for close-range movement tracking | |
Yoo et al. | 3D user interface combining gaze and hand gestures for large-scale display | |
JP2013037675A5 (en) | ||
US9097909B2 (en) | Manipulation device for navigating virtual microscopy slides/digital images and methods related thereto | |
Grieve et al. | Hands-on with optical tweezers: a multitouch interface for holographic optical trapping | |
Iwata et al. | Development of nanomanipulator using a high-speed atomic force microscope coupled with a haptic device | |
Carberry et al. | Mapping real-time images of high-speed AFM using multitouch control | |
WO2010049738A1 (en) | Representation system | |
Bae et al. | Tangible NURBS-curve manipulation techniques using graspable handles on a large display | |
Sharma et al. | Virtual reality and haptics in nano-and bionanotechnology | |
Gerena et al. | 3D force-feedback optical tweezers for experimental biology | |
EP2834651A1 (en) | Touch-screen based scanning probe microscopy (spm) | |
Obushi et al. | MagniFinger: Fingertip-mounted microscope for augmenting human perception | |
Taylor | Haptics for scientific visualization | |
Obushi et al. | MagniFinger: magnified perception by a fingertip probe microscope | |
CN116682488B (en) | Visual interaction method and device for relaxation of biological molecular structure | |
Ishisaki et al. | Nanomanipulator based on a high-speed atomic force microscopy | |
Iwata et al. | Development of a nano manipulator based on an atomic force microscope coupled with a haptic device: a novel manipulation tool for scanning electron microscopy | |
TW201322752A (en) | Bi-directional synchronous operation displacement device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20110523 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR |
| DAX | Request for extension of the european patent (deleted) | |
| RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: CARL ZEISS MICROSCOPY GMBH |
| 17Q | First examination report despatched | Effective date: 20150713 |
| APBK | Appeal reference recorded | Free format text: ORIGINAL CODE: EPIDOSNREFNE |
| APBN | Date of receipt of notice of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA2E |
| APBR | Date of receipt of statement of grounds of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA3E |
| APAF | Appeal reference modified | Free format text: ORIGINAL CODE: EPIDOSCREFNE |
| APBT | Appeal procedure closed | Free format text: ORIGINAL CODE: EPIDOSNNOA9E |
| REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| 18R | Application refused | Effective date: 20201123 |