US20070300307A1 - Security Using Physical Objects - Google Patents
- Publication number
- US20070300307A1 (application US 11/426,101)
- Authority
- US
- United States
- Prior art keywords
- objects
- security pattern
- security
- display
- pattern
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/36—User authentication by graphic or iconic representation
Definitions
- Accessing secured content may require the placement of physical objects against a display surface that can detect the objects.
- The user can request to define any security pattern they wish, and the system may respond by asking the user to place the desired object or objects in the desired pattern on the display.
- The system may then detect the various visual attributes of the placed object(s), such as their outline shape, position on the display, relative positioning with respect to other objects, rotation orientation, interior design pattern, etc., and may ask the user to select which ones will be used in the security pattern.
- The display surface may be a table top configuration, which may be used as a desk, so it may contain other objects not necessarily intended by the user to be security pattern objects.
- The system, when defining a security pattern, may give the user the option of de-selecting certain objects so that they are ignored in the pattern being created.
- The system may display an attribute menu adjacent to each detected object on the display.
- The menu can list the various detected attributes, and may give the user the option to check/un-check the individual attributes to allow the user to customize the level of security desired. For example, a user might simply wish to use the object's outline shape for one security pattern, and may wish to use the outline shapes and rotational orientations of a group of objects for another security pattern.
- The system may also allow the user to define a margin of error for the application of the security pattern. Accordingly, if the security pattern requires that an object having a particular shape be placed at a particular location on the display, the system may be configured to accept placements of the object in slightly different locations (e.g., 5% to the left or right, 10% above, etc.).
- FIG. 1 illustrates an example computing environment in which features described herein may be implemented.
- FIG. 2 illustrates an example optical detection system that may be used as a display to implement features described herein.
- FIGS. 3 and 4 illustrate example table embodiments of the display shown in FIG. 2 .
- FIGS. 5A and 5B illustrate example displays having one or more objects placed on top or against them as a security pattern.
- FIG. 6 illustrates an example process employing various features described herein.
- FIG. 7 illustrates an example attribute menu that may be displayed for proposed security pattern objects.
- FIGS. 8 and 9 illustrate example interior patterns that may be used as part of a security pattern.
- FIG. 1 illustrates an example of a suitable computing system environment 100 on which the features herein may be implemented.
- The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the features described herein. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
- Program modules include routines, programs, objects, components, data structures, etc. that can perform particular tasks or implement particular abstract data types.
- Program modules may be located in both local and remote computer storage media including memory storage devices.
- The exemplary system 100 for implementing features described herein includes a general-purpose computing device in the form of a computer 110 including a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120.
- Computer 110 may include a variety of computer readable media.
- Computer readable media may include computer storage media and communication media.
- The system memory 130 may include computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132.
- A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, may be stored in ROM 131.
- RAM 132 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120 .
- FIG. 1 illustrates operating system 134 , application programs 135 , other program modules 136 , and program data 137 .
- The computer 110 may also include other removable/nonremovable, volatile/nonvolatile computer storage media.
- FIG. 1 illustrates a hard disk drive 141 that reads from or writes to nonremovable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152 , and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 such as a CD ROM or other optical media.
- Removable/nonremovable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
- The hard disk drive 141 may be connected to the system bus 121 through a non-removable memory interface such as interface 140.
- The magnetic disk drive 151 and optical disk drive 155 may be connected to the system bus 121 by a removable memory interface, such as interface 150.
- The drives and their associated computer storage media discussed above and illustrated in FIG. 1 may provide storage of computer readable instructions, data structures, program modules and other data for the computer 110.
- Hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147.
- Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies.
- A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball or touch pad.
- Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
- These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB).
- A monitor 191 or other type of display device may also be connected to the system bus 121 via an interface, such as a video interface 190.
- The video interface 190 may be bidirectional, and may receive video input from sensors associated with the monitor 191.
- The monitor 191 may be touch and/or proximity sensitive, such that contacts to a monitor surface may be used as input data.
- The input sensors for effecting this could be a capacitive touch sensitive device, an array of resistive contact sensors, an optical sensor or camera, or any other desired sensor to make the monitor 191 touch and/or proximity sensitive.
- The monitor itself may be optically sensitive, such as the example shown in FIG. 2 and described below.
- A touch, optical and/or proximity sensitive input system may be separate from monitor 191, and may include a planar surface such as a table top 192 and any applicable sensing systems to make the planar surface touch sensitive, such as camera 193.
- Computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 195.
- The computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180.
- The remote computer 180 may be a personal computer, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 1.
- The logical connections depicted in FIG. 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks.
- When used in a LAN networking environment, the computer 110 may be connected to the LAN 171 through a network interface or adapter 170.
- When used in a WAN networking environment, the computer 110 may include a modem 172 or other means for establishing communications over the WAN 173, such as the Internet.
- The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism.
- Program modules depicted relative to the computer 110 may be stored in the remote memory storage device.
- FIG. 1 illustrates remote application programs 185 as residing on memory device 181 .
- The computing device shown in FIG. 1 may be incorporated into a system having table display device 200, as shown in FIG. 2.
- The display device 200 may include a display surface 201, which may be a planar surface such as the table top 192. As described hereinafter, the display surface 201 may also help to serve as a user interface.
- The display device 200 may display a computer-generated image on its display surface 201, which allows the device 200 to be used as a display monitor (such as monitor 191) for computing processes, displaying graphical user interfaces, displaying television or other visual images, video games, and the like.
- The display may be projection-based, and may use a digital light processing (e.g., DLP by Texas Instruments, Inc., Plano, Tex.) technique, or it may be based on other display technologies, such as liquid crystal display (LCD) technology.
- A projector 202 may be used to project light onto the underside of the display surface 201. It may do so directly, or may do so using one or more mirrors. As shown in FIG. 2, the projector 202 in this example projects light for a desired image onto a first reflective surface 203 a, which may in turn reflect light onto a second reflective surface 203 b, which may ultimately reflect that light onto the underside of the display surface 201, causing the surface 201 to emit light corresponding to the desired display.
- The device 200 may also be used as an input-receiving device.
- The device 200 may include one or more light emitting devices 204, such as IR light emitting diodes (LEDs), mounted in the device's interior.
- The light from devices 204 may be projected upwards through the display surface 201, and may reflect off of various objects that are above the display surface 201.
- One or more objects 205 may be placed in physical contact with the display surface 201.
- One or more other objects 206 may be placed near the display surface 201, but not in physical contact (e.g., closely hovering).
- The light emitted from the emitting device(s) 204 may reflect off of these objects, and may be detected by a camera 207, which may be an IR camera if IR light is used.
- The signals from the camera 207 may then be forwarded to a computing device (e.g., the computer 110 shown in FIG. 1) for processing, which, based on various configurations for various applications, may identify the object and its orientation (e.g., touching or hovering, tilted, partially touching, etc.) based on its shape and the amount/type of light reflected.
- The objects may include a reflective pattern, such as a bar code, on their lower surface.
- The display surface 201 may include a translucent layer that diffuses emitted light. Based on the amount of light reflected back to the camera 207 through this layer, the associated processing system may determine whether an object is touching the surface 201, and if the object is not touching, a distance between the object and the surface 201. Accordingly, various physical objects (e.g., fingers, elbows, hands, stylus pens, blocks, etc.) may be used as physical control members, providing input to the device 200 (or to an associated computing device).
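The touch-versus-hover determination described above can be sketched as a toy classifier. The intensity model and threshold values here are illustrative assumptions, not figures from the patent:

```python
def classify_contact(reflected_intensity, touch_threshold=0.90, hover_floor=0.20):
    """Classify an object's proximity from a normalized reflected IR
    intensity (1.0 = maximum reflection).  The diffusing layer attenuates
    light from objects hovering above the surface, so dimmer reflections
    imply greater distance.  Thresholds are illustrative assumptions."""
    if reflected_intensity >= touch_threshold:
        return "touching"
    if reflected_intensity >= hover_floor:
        return "hovering"
    return "absent"

print(classify_contact(0.95))  # touching
print(classify_contact(0.50))  # hovering
```

A real system would calibrate these thresholds per display, and could map intensity to an estimated distance rather than discrete classes.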
- The device 200 shown in FIG. 2 is illustrated as using light projection and sensing techniques for the display of data and the reception of input, but other techniques may be used as well.
- Stylus-sensitive displays are currently available for use with Tablet-based laptop computers, and such displays may be used as device 200.
- Stylus- and other touch-sensitive displays are available with many personal data assistants (PDAs), and those types of displays may also be used as device 200.
- The device 200 is also shown in a substantially horizontal orientation, with the display surface 201 acting as a tabletop. Other orientations may also be used.
- The device 200 may be oriented to project a display onto any desired surface, such as a vertical wall. Reflective IR light may also be received from any such oriented surface.
- FIG. 3 illustrates an example configuration of an implementation of the system shown in FIG. 2, in which device 301 is used as a tabletop display device.
- FIG. 4 illustrates an overhead view of such a table, around which a number of users 401 may be seated or standing. Each user 401 may wish to interact with the display on the surface of table 301 , for example to place and/or touch an object, or to play a party video game.
- Although the display area of table 301 shown in this example is circular, it may be any desired shape, such as oval, rectangular, square, hexagonal, octagonal, etc.
- FIG. 5A illustrates an example of a security pattern that may be used in the place of, or together with, alphanumeric passwords to restrict access to a location or content on a computing system.
- The display 501 may be in a tabletop or horizontal configuration, and may have several objects resting on top (other configurations may also be used, with objects in contact with the display surface).
- FIG. 5A shows the placement of a star-shaped object 502 on the display 501 .
- The object may be a personal effect of the user, such as a keychain ornament, that the user can easily remember to place on the tabletop display whenever he/she needs to access some particular piece of secured content (e.g., parental control blocking predetermined program types, such as adult content, from programs viewed on a television or the display 501 itself).
- The display 501 may detect the outline shape of the object 502, and may require that it see the object 502 somewhere on the display 501 before granting access to the restricted content.
- The user may configure the display 501 (or the secured content) to require more than just the placement of the object 502 having the appropriate outline.
- The user may configure the system to require that the object be placed at a predefined location on the display 501 surface, using for example X,Y pixel coordinates.
- The user may also configure the system to require that the object 502 be placed at a particular angle 503 of rotation orientation from a normal orientation.
- FIG. 5B illustrates an example in which four example objects are placed on the display 501 as a security pattern.
- The security pattern involves one or more dice 504, a trophy having an oval base 505, and a stapler 506, all of which must be placed on the display 501 before the pattern is satisfied and access is granted to the secured content.
- These objects may also be required to be placed at the proper locations, orientations, and/or at the proper angles, before access will be granted to the secured content.
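A multi-object check along these lines might be sketched as follows. The dictionary layout, tolerance values, and matching rules are hypothetical assumptions, not part of the described system:

```python
import math

def object_matches(required, detected, pos_tol_px=50, angle_tol_deg=5.0):
    """Check one detected object against one required spec.  Both are dicts
    with optional 'shape', 'position' (x, y), and 'rotation' (degrees) keys;
    only the attributes present in the required spec are enforced."""
    if 'shape' in required and required['shape'] != detected.get('shape'):
        return False
    if 'position' in required:
        rx, ry = required['position']
        dx, dy = detected.get('position', (math.inf, math.inf))
        if math.hypot(dx - rx, dy - ry) > pos_tol_px:
            return False
    if 'rotation' in required:
        delta = detected.get('rotation', 0.0) - required['rotation']
        if abs((delta + 180) % 360 - 180) > angle_tol_deg:  # wrap-aware diff
            return False
    return True

def pattern_satisfied(required_objects, detected_objects):
    """Every required spec must be matched by some detected object;
    extraneous detections are ignored."""
    return all(any(object_matches(req, det) for det in detected_objects)
               for req in required_objects)
```

Usage: a star required at (100, 100) rotated 30 degrees would be satisfied by a star detected at (95, 100) rotated 32 degrees, while an unrelated mug elsewhere on the surface is ignored.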
- FIG. 6 illustrates a process by which a security pattern may be defined and used.
- In step 601, the system (e.g., display 501 and/or its underlying computing system) may receive a request to define a new security pattern.
- The request may be generated in many ways. It may be generated in response to a user's request, for example, to set a parental control limit on a television program source, or to change the security pattern for a previously-secured application or piece of content.
- The request may be generated automatically, such as when a security-enabled application program is first installed on the display 501's computing system, or when a periodic update is scheduled.
- The request may also identify the target that is to be locked by the security pattern.
- The secured content can be any aspect of the system, such as application software, configuration settings, data files, Internet sites, television programs, radio channels, etc.
- Step 602 may occur prior to, or simultaneously with, the request in step 601 .
- The system then proceeds to step 603 and scans to determine what objects are seen or detected on the display.
- The system may trace the outlines of the objects detected on the display, and may measure and record placement and orientation data identifying various aspects (e.g., appearance, outline, location, rotation, etc., described below) of how the objects were arranged.
- The system may display an attribute menu 701 (as illustrated in FIG. 7) for each detected object. The present discussion will digress briefly to discuss the attribute menu in greater detail.
- A separate attribute menu 701 may be displayed adjacent to each object detected on the display surface in step 602.
- The attribute menu 701 may display a number of object characteristics that have been detected by the system and may be selected for inclusion in the security pattern being generated, as well as the option to have certain detected objects ignored or excluded from the security pattern being defined. For example, the user may be given an option to de-select a particular object, to indicate that a detected object is not intended to be part of the security pattern. This may be useful, for example, if the user has other miscellaneous objects on the display surface and does not wish to clear off the entire surface to generate the pattern. This may also be useful if inadvertent objects, such as the user's elbow, were placed on the display and detected in step 602.
- The user may have an option 703 to indicate that the object's outline shape is an attribute to be used in the security pattern. This may be useful, for example, if the user has a particular object (e.g., the star-shaped keychain ornament 502) that he/she intends to treat as the “key” to the secured content. If this option is selected (e.g., by checking a selection box), then when the security pattern is in use, it will require the presence, somewhere on the display, of the object's outline before granting access to the secured content.
- The user may request that the object's interior pattern also be a required attribute of the security pattern.
- The display may detect visual patterns on the bottom surface of objects resting on the display, and may incorporate those patterns in the security pattern when this option is selected. So, for example, if object 504 is a die, then the pattern shown in FIG. 8 may be the detected interior pattern. If this interior pattern is included in the security pattern, then the secured content will only be unlocked if the object having this pattern (in the FIG. 8 example, the number ‘5’ on the die) is placed on the display surface.
- The interior pattern may be any printed pattern or surface feature.
- FIG. 9 shows an example of a bar code that may be affixed to (or carved in) the bottom of stapler 506 and detected. The pattern may be printed using visible ink, or any other form of ink (e.g., invisible to humans) that can be detected by the display system's camera 207.
- The user may request that the current position of the object be used as a required attribute of the security pattern.
- The location may be expressed in terms of X,Y display (e.g., pixel) coordinates, or any other desired coordinate system. This requirement will require the placement of an object at the specified location, but will not necessarily require the same object used in defining the security pattern.
- If the security pattern only includes object position 705 criteria, without other object-specific criteria such as shape outline 703 and shape interior pattern 704, the placement of any object at the specified location may serve to satisfy the security pattern. This may be useful, for example, if the user does not want to have to remember to bring specific objects to the display to unlock secured content. Instead, the user might simply want to remember that unlocking that content requires placing, for example, any object in the upper-left corner of the display, or placing multiple objects in the shape of a square at some location on the display.
- The options may also include a granularity option.
- The granularity option may specify a margin for error permitted in unlocking the secured content. So, for example, if the object's position 705 is a required part of the security pattern, the user may specify a 5% margin of error that will also satisfy the requirement. In such a case, if the user attempts to unlock content locked by a security pattern that requires placing an object at location 100, 100, but places the object 5% off of the required location (e.g., at coordinates 95, 100; or 95, 95), the system may still treat that as satisfying the requirement.
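The 5% example above can be sketched as a simple coordinate comparison. Treating the percentage as a fraction of each display dimension is an assumption; with a 100-by-100 coordinate space it reproduces the (95, 100) example:

```python
def within_margin(required_xy, observed_xy, margin_pct, display_wh=(100, 100)):
    """True if each coordinate of the observed placement is within
    margin_pct percent of the corresponding display dimension.  The
    percentage-of-display-size interpretation is an assumption."""
    (rx, ry), (ox, oy), (w, h) = required_xy, observed_xy, display_wh
    return (abs(ox - rx) <= margin_pct / 100 * w and
            abs(oy - ry) <= margin_pct / 100 * h)

print(within_margin((100, 100), (95, 100), 5))  # True: within the 5% margin
print(within_margin((100, 100), (95, 95), 5))   # True
print(within_margin((100, 100), (80, 100), 5))  # False: 20 units off
```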
- Some programs may automatically include a degree of granularity in some or all of the security pattern's requirements, to allow users some flexibility in use.
- Some programs may place limitations on the ranges of granularity permitted, such as a guaranteed minimum margin for error to prevent users from requiring too high a degree of precision, or a granularity ceiling to avoid giving so much margin for error that the attribute loses value as a security measure.
- Another option 706 may specify the rotational orientation angle made by the object.
- Some objects may have a normal, vertical axis, based on their shape, and the rotational angle may indicate how much the object has been rotated away from its normal axis.
- The star keychain 502 may be defined as upright with one point pointing straight up, and its arrangement in FIGS. 5A and 5B is considered to be rotated by the angle 503.
- The normal axis may be predetermined as part of shape recognition software used in the system, or the user may define the object's normal axis in a configuration process (e.g., by placing the object on the display and identifying the normal axis by touching two points to define the axis, or by touching a point outside the object through which the normal axis passes if the center of the object is otherwise specified).
- The rotational offset is also subject to a granularity option to allow for angular imperfections when the pattern is in use.
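A sketch of how the normal axis and angular granularity might be computed. The function names and the two-point axis convention are assumptions based on the configuration process described above:

```python
import math

def axis_angle_deg(p1, p2):
    """Angle, in degrees counterclockwise from the +x axis, of the normal
    axis the user defines by touching two points on the display."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def rotation_offset_deg(observed_angle, normal_angle):
    """Signed rotation of the object away from its normal orientation,
    wrapped into the range [-180, 180)."""
    return (observed_angle - normal_angle + 180) % 360 - 180

def rotation_within_granularity(observed_angle, required_offset,
                                normal_angle, tol_deg=5.0):
    """Granularity check: accept angular imperfections up to tol_deg."""
    delta = rotation_offset_deg(observed_angle, normal_angle) - required_offset
    return abs((delta + 180) % 360 - 180) <= tol_deg
```

The wrap-aware difference ensures that, for example, 359 degrees and 1 degree are treated as 2 degrees apart rather than 358.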
- These are example visual attributes that may be used, and other aspects of an object may be required as well. For example, some implementations may require specific colors, reflectivity/light scattering, patterns of movement or placement in multiple locations and/or orientations over a period of time, radio-frequency identifiers, etc. to be present in an object to satisfy a security pattern requirement.
- In step 604, the user selects the desired attribute(s) that will be requirements in the security pattern being defined (e.g., by checking selection boxes in the various attribute menus 701), and in step 605, the user may define granularities for the various attributes.
- The process may then move to step 606 to determine if a request has been made to unlock content that is secured by a security pattern.
- The request may be made in any way desired. For example, if the security pattern is registered as restricting a particular video program (e.g., adult video content), the request may be made when a user operates a remote control (e.g., a handheld remote, or using a remote control graphical user interface on display 501) to select the restricted content.
- The request may also occur automatically, for example, if a user had previously scheduled a program to be recorded before the program was locked by another user, but the program had become locked by the time it was scheduled to air.
- In step 607, the user attempting to access the content is prompted to present the required security pattern.
- The system may continuously scan the display surface and determine whether the required pattern is present, or alternatively, it may simply ask the user to indicate (e.g., by pressing a “Ready” button displayed with the prompt) when he/she is finished arranging objects to present as the security pattern.
- One or more of the required objects may already have been on the display surface, and might only need to be moved to the correct location after the request in order to satisfy the security pattern.
- The user may already have objects like coffee mugs, staplers, telephones, computers, pen holders, etc. lying about on the display surface. Some of these may have been designated security objects, and the user might configure a security pattern to simply require moving one of these objects to its proper location after the request to unlock the secured content.
- The system may scan the display surface to determine whether the required security pattern is present. During this process, the system may simply ignore those objects that do not satisfy any part of the security pattern, looking just for the required attributes. This may allow the user to avoid having to clear off the entire surface of the display 501 in order to present the pattern.
- The security pattern may be configured to require just that: that only the objects specified by the pattern are placed on the display 501 during the pattern verification process.
- Applications may define a subset area of the display 501 that must be cleared of extraneous objects and only contain the security pattern objects (i.e., all objects detected in the subset area during verification are compared to the security pattern, and objects that do not satisfy an attribute of the security pattern cause the verification to fail).
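The verification pass, including the optional strict subset area, might be sketched like this. The `matches` predicate and the tuple conventions are assumptions for illustration:

```python
def verify_pattern(required, detected, matches, strict_area=None):
    """Grant access only if every required spec is matched by some detected
    object.  Extraneous objects are normally ignored, but if a strict_area
    (x0, y0, x1, y1) is defined, any object inside that area that matches
    no part of the pattern causes verification to fail."""
    if not all(any(matches(req, det) for det in detected) for req in required):
        return False
    if strict_area is not None:
        x0, y0, x1, y1 = strict_area
        for det in detected:
            x, y = det['position']
            inside = x0 <= x <= x1 and y0 <= y <= y1
            if inside and not any(matches(req, det) for req in required):
                return False
    return True
```

With no strict area, a stray coffee mug on the surface is ignored; inside a strict area, the same mug would cause the verification to fail.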
- If the required security pattern is present, the system may grant the requested access in step 609. If the security pattern is not present, the system may deny access in step 610. The security pattern process may then return to step 601 to begin anew. Note that the security pattern process may be used in conjunction with traditional alphanumeric passwords, if desired, to provide a higher degree of security.
- The security pattern may be stored as a data structure in any of the computer-readable media and storage devices described above.
- The security pattern may be stored as a file having the contents shown below:
- The <Security Pattern Name> parameter may be an alphanumeric name given by the user when defining the security pattern.
- The <Security Pattern Target> may be an identifier of the television program, television channel, Internet site, software application, data file, or other content whose access will require that the user correctly provide the security pattern.
- The security pattern is not limited to locking just one piece of content, and may instead be used to lock a variety of different pieces of content (e.g., multiple files, applications, programs, etc.).
- The <Number of Objects> parameter may identify a number of objects that will need to be presented to satisfy the security pattern.
- For each object, the security pattern file may identify one or more attributes associated with the object. So, for example, Object 1 might only require that a particular item be placed somewhere on the display, and may list a single <Shape Outline ID> attribute.
- The <Shape Outline ID> attribute may identify (e.g., a file pointer, file name, etc.) a source from which the outline of the required shape may be obtained.
- This may be an image file, such as a *.BMP, *.JPG, *.PDF, etc., and may be generated using a computer assisted drawing (CAD) program, such as MICROSOFT VISIO™ (product of Microsoft Corporation, Redmond, Wash.).
- Object 2 might also include a <Shape Position> attribute.
- This attribute may identify one or more pixel coordinates (e.g., 100, 200) on the display that are required to be occupied by the given object, as well as any granularity tolerances (e.g., 5%, 10%, etc.) specified by the user.
- The position attribute may refer to relative positions among the objects, as opposed to fixed coordinates on the display.
- The security pattern may indicate that three objects form the vertices of an equilateral triangle, with no requirement of how large or small the triangle must be. So long as the three objects maintain the correct relative position with one another, they may satisfy such an attribute.
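The equilateral-triangle example can be sketched as a scale-independent comparison of pairwise distances (the tolerance value is an assumption):

```python
import math

def forms_equilateral_triangle(p1, p2, p3, rel_tol=0.05):
    """Relative-position check: the three pairwise distances must agree to
    within rel_tol, so a triangle of any size satisfies the attribute."""
    d = sorted([math.dist(p1, p2), math.dist(p2, p3), math.dist(p3, p1)])
    return d[0] > 0 and (d[2] - d[0]) / d[2] <= rel_tol

print(forms_equilateral_triangle((0, 0), (2, 0), (1, math.sqrt(3))))  # True
print(forms_equilateral_triangle((0, 0), (1, 0), (0, 1)))             # False
```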
- Other objects may include a <Shape Interior Pattern> attribute that may identify the source of a pattern image (akin to the image file identified above) to be matched for the object, and a <Shape Rotation> attribute that may identify a required angle of rotation and associated granularity values.
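Taken together, the parameters above suggest a file layout along the following lines. This is a hypothetical sketch inferred from the parameter descriptions; the actual file contents referred to earlier are not reproduced in this text:

```
<Security Pattern Name>
<Security Pattern Target>
<Number of Objects>
Object 1:
    <Shape Outline ID>
Object 2:
    <Shape Outline ID>
    <Shape Position>           (coordinates plus granularity tolerance)
Object 3:
    <Shape Interior Pattern>   (pattern image source)
    <Shape Rotation>           (required angle plus granularity tolerance)
```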
Abstract
Description
- Passwords have become ubiquitous in daily life. Today, it is not uncommon for a person to have to remember dozens of unique words, codes, numbers and phrases to gain access to bank automated teller machines (ATMs), subscription Internet sites, work computers, e-mail programs, cell phone accounts, cable television pay-per-view (and other television parental control features), and a plethora of other secure locations. Many times, these passwords are randomly-generated sequences of letters and numbers that may enhance security, but may also be difficult to remember. Couple that with the equally random account numbers that often go with these services, and it is easy to see why many have resorted to using the same password at different locations, or writing passwords down on a handy piece of paper by the computer. Obviously, such efforts compromise security, undermining the purpose for their existence in the first place.
- As technology marches onward, new genres of products offer opportunities for doing things differently. One such technology offers video displays that can see, or optically detect, objects that are placed against or near the display surface. The description below offers password security features that may take advantage of the capabilities of these kinds of displays.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features, essential features, or required advantages of the claimed subject matter, nor is it intended to be used in limiting the scope of the appended claims.
- Methods are described herein that allow users to define security levels in a computing system that uses the shape and/or layout arrangement of one or more physical objects. Accessing secured content may require the placement of physical objects against a display surface that can detect the objects. The user can request to define any security pattern they wish, and the system may respond by asking the user to place the desired object or objects in the desired pattern on the display. The system may then detect the various visual attributes of the placed object(s), such as their outline shape, position on the display, relative positioning with respect to other objects, rotation orientation, interior design pattern, etc., and may ask the user to select which ones will be used in the security pattern.
- The display surface may be a table top configuration, which may be used as a desk, so it may contain other objects not necessarily intended by the user to be security pattern objects. The system, when defining a security pattern, may give the user the option of de-selecting certain objects so that they are ignored in the pattern being created.
- When configuring the security pattern, the system may display an attribute menu adjacent to each detected object on the display. The menu can list the various detected attributes, and may give the user the option to check/un-check the individual attributes to allow the user to customize the level of security desired. For example, a user might simply wish to use the object's outline shape for one security pattern, and may wish to use the outline shapes and rotational orientations of a group of objects for another security pattern.
- For the attributes that are to be used, the system may also allow the user to define a margin of error for the application of the security pattern. Accordingly, if the security pattern requires that an object having a particular shape be placed at a particular location on the display, the system may be configured to accept placements of the object in slightly different locations (e.g., 5% to the left or right, 10% above, etc.).
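As a concrete sketch of the margin-of-error check described above, the following function accepts an object placement if it falls within a user-specified percentage of the display's dimensions from the required location. The function name, the (x, y) tuple representation, and the percentage-of-display-size interpretation are illustrative assumptions, not details taken from the system itself.

```python
def position_within_margin(required, observed, display_size, margin_pct):
    """Return True if the observed (x, y) placement is within margin_pct
    percent of the display's width/height from the required position."""
    x_limit = display_size[0] * margin_pct / 100.0
    y_limit = display_size[1] * margin_pct / 100.0
    return (abs(observed[0] - required[0]) <= x_limit and
            abs(observed[1] - required[1]) <= y_limit)
```

For example, with a 1000x800 display and a 5% margin, an object required at (100, 100) would be accepted anywhere within 50 pixels horizontally and 40 pixels vertically of that point.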
- Users may find it easier to remember a physical object-based security pattern, and having a physical object associated with the security pattern may allow users to easily recognize how secure (or unsecure) their content is, both of which may give the user greater confidence in the system. A user might not readily understand the difference in security between a five-letter alphanumeric password and a ten-letter one, but the user may easily understand the relative difference between requiring the placement and/or arrangement of five objects versus ten. Additionally, since a physical object cannot be duplicated as easily as an alphanumeric password, a user who temporarily loses the object (e.g., to a child who sneaks the object from a mother's purse) can rest assured that when the object is returned, the security is restored (without having to create a new security pattern). These advantages and others may be realized using some or all of the features described herein.
- These and other features will be described in greater detail below.
-
FIG. 1 illustrates an example computing environment in which features described herein may be implemented. -
FIG. 2 illustrates an example optical detection system that may be used as a display to implement features described herein. -
FIGS. 3 and 4 illustrate example table embodiments of the display shown in FIG. 2. -
FIGS. 5A and 5B illustrate example displays having one or more objects placed on top of or against them as a security pattern. -
FIG. 6 illustrates an example process employing various features described herein. -
FIG. 7 illustrates an example attribute menu that may be displayed for proposed security pattern objects. -
FIGS. 8 and 9 illustrate example interior patterns that may be used as part of a security pattern. -
FIG. 1 illustrates an example of a suitable computing system environment 100 on which the features herein may be implemented. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the features described herein. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100. - The features herein are described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that can perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the features may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The features may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
- With reference to FIG. 1, the exemplary system 100 for implementing features described herein includes a general-purpose computing device in the form of a computer 110 including a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120. -
Computer 110 may include a variety of computer readable media. By way of example, and not limitation, computer readable media may include computer storage media and communication media. The system memory 130 may include computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, may be stored in ROM 131. RAM 132 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, FIG. 1 illustrates operating system 134, application programs 135, other program modules 136, and program data 137. - The
computer 110 may also include other removable/nonremovable, volatile/nonvolatile computer storage media. By way of example only, FIG. 1 illustrates a hard disk drive 141 that reads from or writes to nonremovable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 such as a CD ROM or other optical media. Other removable/nonremovable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 141 may be connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 may be connected to the system bus 121 by a removable memory interface, such as interface 150. - The drives and their associated computer storage media discussed above and illustrated in
FIG. 1 may provide storage of computer readable instructions, data structures, program modules and other data for the computer 110. In FIG. 1, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A monitor 191 or other type of display device may also be connected to the system bus 121 via an interface, such as a video interface 190. The video interface 190 may be bidirectional, and may receive video input from sensors associated with the monitor 191. For example, the monitor 191 may be touch and/or proximity sensitive, such that contacts to a monitor surface may be used as input data. The input sensors for effecting this could be a capacitive touch sensitive device, an array of resistive contact sensors, an optical sensor or camera, or any other desired sensor to make the monitor 191 touch and/or proximity sensitive. The monitor itself may be optically sensitive, such as the example shown in FIG. 2 and described below.
In an alternative arrangement, or in addition, a touch, optical and/or proximity sensitive input system may be separate from monitor 191, and may include a planar surface such as a table top 192 and any applicable sensing systems to make the planar surface touch sensitive, such as camera 193. In addition to the monitor, computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 195. - The
computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks. - When used in a LAN networking environment, the
computer 110 may be connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 may include a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 1 illustrates remote application programs 185 as residing on memory device 181. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used. Many of the features described herein may be implemented using computer-executable instructions stored on one or more computer-readable media, such as the media described above, for execution on the one or more units that make up processing unit 120. - The computing device shown in
FIG. 1 may be incorporated into a system having table display device 200, as shown in FIG. 2. The display device 200 may include a display surface 201, which may be a planar surface such as the table top 192. As described hereinafter, the display surface 201 may also help to serve as a user interface. - The
display device 200 may display a computer-generated image on its display surface 201, which allows the device 200 to be used as a display monitor (such as monitor 191) for computing processes, displaying graphical user interfaces, displaying television or other visual images, video games, and the like. The display may be projection-based, and may use a digital light processing (e.g., DLP by Texas Instruments, Inc., Plano, Tex.) technique, or it may be based on other display technologies, such as liquid crystal display (LCD) technology. Where a projection-style display is used, a projector 202 may be used to project light onto the underside of the display surface 201. It may do so directly, or may do so using one or more mirrors. As shown in FIG. 2, the projector 202 in this example projects light for a desired image onto a first reflective surface 203 a, which may in turn reflect light onto a second reflective surface 203 b, which may ultimately reflect that light onto the underside of the display surface 201, causing the surface 201 to emit light corresponding to the desired display. - In addition to being used as an output display for displaying images, the
device 200 may also be used as an input-receiving device. As illustrated in FIG. 2, the device 200 may include one or more light emitting devices 204, such as IR light emitting diodes (LEDs), mounted in the device's interior. The light from devices 204 may be projected upwards through the display surface 201, and may reflect off of various objects that are above the display surface 201. For example, one or more objects 205 may be placed in physical contact with the display surface 201. One or more other objects 206 may be placed near the display surface 201, but not in physical contact (e.g., closely hovering). The light emitted from the emitting device(s) 204 may reflect off of these objects, and may be detected by a camera 207, which may be an IR camera if IR light is used. The signals from the camera 207 may then be forwarded to a computing device (e.g., the computer 110 shown in FIG. 1) for processing, which, based on various configurations for various applications, may identify the object and its orientation (e.g., touching or hovering, tilted, partially touching, etc.) based on its shape and the amount/type of light reflected. To assist in distinguishing objects in contact 205 from hovering objects 206, the display surface 201 may include a translucent layer that diffuses emitted light. Based on the amount of light reflected back to the camera 207 through this layer, the associated processing system may determine whether an object is touching the surface 201, and if the object is not touching, a distance between the object and the surface 201. Accordingly, various physical objects (e.g., fingers, elbows, hands, stylus pens, blocks, etc.) may be used as physical control members, providing input to the device 200 (or to an associated computing device). - The
device 200 shown in FIG. 2 is illustrated as using light projection and sensing techniques for the display of data and the reception of input, but other techniques may be used as well. For example, stylus-sensitive displays are currently available for use with Tablet-based laptop computers, and such displays may be used as device 200. Additionally, stylus- and other touch-sensitive displays are available with many personal data assistants (PDAs), and those types of displays may also be used as device 200. - The
device 200 is also shown in a substantially horizontal orientation, with the display surface 201 acting as a tabletop. Other orientations may also be used. For example, the device 200 may be oriented to project a display onto any desired surface, such as a vertical wall. Reflective IR light may also be received from any such oriented surface.
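The reflected-light sensing described above, in which the diffusing layer attenuates light from objects that hover rather than touch, can be sketched as a simple intensity classifier. The threshold values and the function's shape are assumptions for illustration only; a real system would calibrate against its own optics.

```python
def classify_contact(reflected_intensity, touch_threshold=0.8, noise_floor=0.1):
    """Classify a detected object from the fraction of emitted IR light
    reflected back to the camera through the diffusing layer: strong
    reflection suggests contact, weaker reflection suggests hovering."""
    if reflected_intensity >= touch_threshold:
        return "touching"
    if reflected_intensity > noise_floor:
        return "hovering"
    return "none"
```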
FIG. 3 illustrates an example configuration of an implementation of the system shown in FIG. 2, in which device 301 is used as a tabletop display device. FIG. 4 illustrates an overhead view of such a table, around which a number of users 401 may be seated or standing. Each user 401 may wish to interact with the display on the surface of table 301, for example to place and/or touch an object, or to play a party video game. Although the display area of table 301 shown in this example is circular, it may be any desired shape, such as oval, rectangular, square, hexagonal, octagonal, etc. -
FIG. 5A illustrates an example of a security pattern that may be used in place of, or together with, alphanumeric passwords to restrict access to a location or content on a computing system. The display 501 may be in a tabletop or horizontal configuration, and may have several objects resting on top (other configurations may also be used, with objects in contact with the display surface). FIG. 5A shows the placement of a star-shaped object 502 on the display 501. The object may be a personal effect of the user, such as a keychain ornament, that the user can easily remember to place on the tabletop display whenever he/she needs to access some particular piece of secured content (e.g., parental control blocking predetermined program types, such as adult content, from programs viewed on a television or the display 501 itself). The display 501 may detect the outline shape of the object 502, and may require that it see the object 502 somewhere on the display 501 before granting access to the restricted content. For added security, the user may configure the display 501 (or the secured content) to require more than just the placement of the object 502 having the appropriate outline. For example, the user may configure the system to require that the object be placed at a predefined location on the display 501 surface, using for example X,Y pixel coordinates. The user may also configure the system to require that the object 502 be placed at a particular angle 503 of rotation orientation from a normal orientation. These options are described further below. - The user may also configure the system to require the placement of more than one object.
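The attribute checks described above (outline shape, display position, and rotation angle) might be combined per object as in the sketch below. The dictionary layout and key names are assumptions for illustration; only attributes present in the requirement are enforced, mirroring the user's ability to select which attributes matter.

```python
def object_satisfies(required, observed):
    """Check one observed object against the attributes selected for it
    in a security pattern. Unselected attributes are simply absent from
    `required` and therefore not enforced."""
    if "outline" in required and observed.get("outline") != required["outline"]:
        return False
    if "position" in required:
        rx, ry = required["position"]
        ox, oy = observed["position"]
        tol = required.get("position_tolerance", 0)
        if abs(ox - rx) > tol or abs(oy - ry) > tol:
            return False
    if "rotation" in required:
        # Compare angles with wraparound so 359 and 1 degree are 2 apart.
        diff = abs(observed["rotation"] - required["rotation"]) % 360.0
        diff = min(diff, 360.0 - diff)
        if diff > required.get("rotation_tolerance", 0):
            return False
    return True

# A hypothetical star-shaped keychain required at (200, 300), rotated about
# 30 degrees from its normal orientation:
star_rule = {"outline": "star", "position": (200, 300),
             "position_tolerance": 10, "rotation": 30, "rotation_tolerance": 5}
```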
FIG. 5B illustrates an example in which four example objects are placed on the display 501 as a security pattern. In the FIG. 5B example, the security pattern requires that one or more dice 504, a trophy having an oval base 505, and a stapler 506 all be placed on the display 501 before the pattern is satisfied and access is granted to the secured content. As with the keychain 502, these objects may also be required to be placed at the proper locations, in the proper orientations, and/or at the proper angles, before access will be granted to the secured content. -
FIG. 6 illustrates a process by which a security pattern may be defined and used. In step 601, the system (e.g., display 501 and/or its underlying computing system) may check to see if a request has been made to register a new security pattern to restrict access to some content (e.g., a television program, a software application, a data file, etc.). The request may be generated in many ways. It may be generated in response to a user's request, for example, to set a parental control limit on a television program source, or to change the security pattern for a previously-secured application or piece of content. Alternatively, the request may be generated automatically, such as when a security-enabled application program is first installed on the display 501's computing system, or when a periodic update is scheduled. The request may also identify the target that is to be locked by the security pattern. The secured content can be any aspect of the system, such as application software, configuration settings, data files, Internet sites, television programs, radio channels, etc. - If a request has been received, the process may move to step 602 and await the user's placement of objects on the
display 501. The system may prompt the user with a pop-up display or other message requesting that the user place the proposed security pattern objects on the display. Step 602 may occur prior to, or simultaneously with, the request in step 601. - When the proposed security pattern objects are placed on the display, the system proceeds to step 603 and scans to determine what objects are seen or detected on the display. During this process, the system may trace the outlines of the objects detected on the display, and may measure and record placement and orientation data identifying various aspects (e.g., appearance, outline, location, rotation, etc., described below) of how the objects were arranged. The system may display an attribute menu 701 (as illustrated in
FIG. 7) for each detected object. The present discussion will digress briefly to discuss the attribute menu in greater detail. - A
separate attribute menu 701 may be displayed adjacent to each object detected on the display surface in step 602. The attribute menu 701 may display a number of object characteristics that have been detected by the system and may be selected for inclusion in the security pattern being generated, as well as the option to have certain detected objects ignored or excluded from the security pattern being defined. For example, the user may be given an option to de-select a particular object, to indicate that a detected object is not intended to be part of the security pattern. This may be useful, for example, if the user has other miscellaneous objects on the display surface and does not wish to clear off the entire surface to generate the pattern. This may also be useful if inadvertent objects, such as the user's elbow, were placed on the display and detected in step 602. - For objects that the user wishes to include in the security pattern, the user may have an
option 703 to indicate that the object's outline shape is an attribute to be used in the security pattern. This may be useful, for example, if the user has a particular object (e.g., the star-shaped keychain ornament 502) that he/she intends to treat as the “key” to the secured content. If this option is selected (e.g., by checking a selection box), then when the security pattern is in use, it will require the presence, somewhere on the display, of the object's outline before granting access to the secured content. - As another
option 704, the user may request that the object's interior pattern also be a required attribute of the security pattern. The display may detect visual patterns on the bottom surface of objects resting on the display, and may incorporate those patterns in the security pattern when this option is selected. So, for example, if object 504 is a die, then the pattern shown in FIG. 8 may be the detected interior pattern. If this interior pattern is included in the security pattern, then the secured content will only be unlocked if the object having this pattern (in the FIG. 8 example, the number '5' on the die) is placed on the display surface. The interior pattern may be any printed pattern or surface feature. FIG. 9 shows an example of a bar code that may be affixed to (or carved in) the bottom of stapler 506 and detected. The pattern may be printed using visible ink, or any other form of ink (e.g., invisible to humans) that can be detected by the display system's camera 207. - As another
option 705, the user may request that the current position of the object be used as a required attribute of the security pattern. The location may be expressed in terms of X,Y display (e.g., pixel) coordinates, or any other desired coordinate system. This will require the placement of an object at the specified location, but will not necessarily require the same object used in defining the security pattern. For example, if the security pattern only includes object position 705 criteria, without other object-specific criteria such as shape outline 703 and shape interior pattern 704, the placement of any object at the specified location may serve to satisfy the security pattern. This may be useful, for example, if the user does not want to have to remember to bring specific objects to the display to unlock secured content. Instead, the user might simply want to remember that unlocking that content requires placing, for example, any object in the upper-left corner of the display, or placing multiple objects in the shape of a square at some location on the display. - The options may also include a granularity option. The granularity option may specify a margin for error permitted in unlocking the secured content. So, for example, if the object's
position 705 is a required part of the security pattern, the user may specify a 5% margin of error that will also satisfy the requirement. In such a case, if the user attempts to unlock content locked by a security pattern that requires placing an object at location (100, 100), and instead places the object 5% off of the required location (e.g., at coordinates 95, 100; or 95, 95), the system may still treat that as satisfying the requirement. Some programs may automatically include a degree of granularity in some or all of the security pattern's requirements, to allow users some flexibility in use. Some programs may place limitations on the ranges of granularity permitted, such as a guaranteed minimum margin for error to prevent users from requiring too high a degree of precision, or a granularity ceiling to avoid giving so much margin for error that the attribute loses value as a security measure. - Another
option 706 may specify the rotational orientation angle made by the object. Some objects may have a normal, vertical axis, based on their shape, and the rotational angle may indicate how much the object has been rotated away from its normal axis. For example, the star keychain 502 may be defined as upright with one point pointing straight up, and its arrangement in FIGS. 5A and 5B is considered to be rotated by an angle 503 (θ). The normal axis may be predetermined as part of shape recognition software used in the system, or the user may define the object's normal axis in a configuration process (e.g., by placing the object on the display and identifying the normal axis by touching two points to define the axis, or by touching a point outside the object through which the normal axis passes if the center of the object is otherwise specified). The rotational offset may also be subject to a granularity option to allow for angular imperfections when the pattern is in use. These are example visual attributes that may be used, and other aspects of an object may be required as well. For example, some implementations may require specific colors, reflectivity/light scattering, patterns of movement or placement in multiple locations and/or orientations over a period of time, radio-frequency identifiers, etc. to be present in an object to satisfy a security pattern requirement. - Returning now to the process of
FIG. 6, in step 604, the user selects the desired attribute(s) that will be requirements in the security pattern being defined (e.g., by checking selection boxes in the various attribute menus 701), and in step 605, the user may define granularities for the various attributes. - The process may then move to step 606 to determine if a request has been made to unlock content that is secured by a security pattern. The request may be made in any way desired. For example, if the security pattern is registered as restricting a particular video program (e.g., adult video content), the request may be made when a user operates a remote control (e.g., a handheld remote, or using a remote control graphical user interface on display 501) to select the restricted content. The request may also occur automatically, for example, if a user had previously scheduled a program to be recorded before the program was locked by another user, but the program had become locked by the time it was scheduled to air.
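The selection steps 604 and 605 might be modeled as per-object menus of check boxes, with each user-requested granularity clamped to the program-imposed bounds discussed earlier. The data layout and the floor/ceiling values below are illustrative assumptions, not details from the system itself.

```python
def selected_attributes(menu):
    """Return the attribute names the user checked for one object,
    or nothing at all if the object was de-selected from the pattern."""
    if menu["excluded"]:
        return []
    return [name for name, checked in menu["attributes"].items() if checked]

def clamp_granularity(requested_pct, floor_pct=2.0, ceiling_pct=25.0):
    """Keep a requested margin of error within program-imposed limits: a
    guaranteed minimum so placement stays practical, and a ceiling so the
    attribute retains value as a security measure."""
    return max(floor_pct, min(requested_pct, ceiling_pct))

# One detected object's attribute menu, with two attributes checked:
menu = {"object_id": 1, "excluded": False,
        "attributes": {"outline": True, "interior_pattern": False,
                       "position": False, "rotation": True}}
```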
- If no request is made, the process may return to step 601 to begin anew. If a request is made to access locked content, the process may proceed to step 607, where the user attempting to access the content is prompted to present the required security pattern. The system may continuously scan the display surface and determine whether the required pattern is present, or alternatively, it may simply ask the user to indicate (e.g., by pressing a “Ready” button displayed with the prompt) when he/she is finished arranging objects to present as the security pattern. In some instances, one or more of the required objects may already have been on the display surface, and might only need to be moved to the correct location after the request in order to satisfy the security pattern. For example, if the user is using the
display 501 as a desk, he/she may already have objects like coffee mugs, staplers, telephones, computers, pen holders, etc. lying about on the display surface. Some of these may have been designated security objects, and the user might configure a security pattern to simply require moving one of these objects to its proper location after the request to unlock the secured content. - Once the objects are in place, in
step 608, the system may scan the display surface to determine whether the required security pattern is present. During this process, the system may simply ignore those objects that do not satisfy any part of the security pattern, looking just for the required attributes. This may allow the user to avoid having to clear off the entire surface of the display 501 in order to present the pattern. Alternatively, the security pattern may be configured to require just that: only the objects specified by the pattern may be placed on the display 501 during the pattern verification process. In some situations, applications may define a subset area of the display 501 that must be cleared of extraneous objects and only contain the security pattern objects (i.e., all objects detected in the subset area during verification are compared to the security pattern, and objects that do not satisfy an attribute of the security pattern cause the verification to fail). - If the security pattern is present, the system may grant the requested access in
step 609. If the security pattern is not present, the system may deny access in step 610. The security pattern process may then return to step 601 to begin anew. Note that the security pattern process may be used in conjunction with traditional alphanumeric passwords, if desired, to provide a higher degree of security. - The security pattern may be stored as a data structure in any of the computer-readable media and storage devices described above. For example, the security pattern may be stored as a file having the contents shown below:
-
<Security Pattern Name>
<Security Pattern Target>
<Number of Objects>
Object 1: <Shape Outline ID>
Object 2: <Shape Outline ID> <Shape Position>
Object 3: <Shape Interior Pattern> <Shape Position> <Shape Rotation>
- The <Security Pattern Name> parameter may be an alphanumeric name given by the user when defining the security pattern. The <Security Pattern Target> may be an identifier of the television program, television channel, Internet site, software application, data file, or other content whose access will require that the user correctly provide the security pattern. The security pattern is not limited to locking just one piece of content, and may instead be used to lock a variety of different pieces of content (e.g., multiple files, applications, programs, etc.).
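The file layout above maps naturally onto a small nested record. The JSON rendering below is one possible serialization, with made-up field names and values mirroring the parameters in the text; the document does not prescribe any particular file format.

```python
import json

# Hypothetical serialization of the security pattern structure shown above.
pattern = {
    "name": "LivingRoomTV",
    "target": "channel:adult-block",
    "objects": [
        {"shape_outline_id": "star_keychain.bmp"},
        {"shape_outline_id": "die.bmp", "shape_position": [100, 200]},
        {"shape_interior_pattern": "barcode.bmp",
         "shape_position": [400, 150], "shape_rotation": 30},
    ],
}
serialized = json.dumps(pattern)  # <Number of Objects> is len(pattern["objects"])
```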
- The <Number of Objects> parameter may identify a number of objects that will need to be presented to satisfy the security pattern. For each object, the security pattern file may identify one or more attributes associated with the object. So, for example,
Object 1 might require only that a particular item be placed somewhere on the display, and may list a single <Shape Outline ID> attribute. The <Shape Outline ID> attribute may identify (e.g., by a file pointer, file name, etc.) a source from which the outline of the required shape may be obtained. This may be an image file, such as a *.BMP, *.JPG, *.PDF, etc., and may be generated using a computer-aided design (CAD) program, such as MICROSOFT VISIO™ (product of Microsoft Corporation, Redmond, Wash.). - Other parameters may be stored as well. For example, Object 2 might also include a <Shape Position> attribute. This attribute may identify one or more pixel coordinates (e.g., 100, 200) on the display that are required to be occupied by the given object, as well as any granularity tolerances (e.g., 5%, 10%, etc.) specified by the user. If multiple objects are included in the security pattern, the position attribute may refer to relative positions among the objects, as opposed to fixed coordinates on the display. For example, the security pattern may indicate that three objects form the vertices of an equilateral triangle, with no requirement as to how large or small the triangle must be. So long as the three objects maintain the correct relative positions with one another, they may satisfy such an attribute.
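The relative-position variant described above can be checked without fixed coordinates. Below is a minimal sketch; the function names and the 5% default tolerance are illustrative assumptions standing in for the user-specified granularity values, not values from the patent.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def forms_equilateral_triangle(p1, p2, p3, tolerance=0.05):
    """True if the three points are the vertices of a roughly equilateral
    triangle of any size: each side within `tolerance` of the mean side."""
    sides = [dist(p1, p2), dist(p2, p3), dist(p3, p1)]
    mean = sum(sides) / 3
    return all(abs(s - mean) <= tolerance * mean for s in sides)
```

Because only the ratios of the side lengths matter, scaling all three object positions by the same factor does not change the result, which matches the "no requirement of how large or small" behavior described above.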
- Other objects, such as Object 3, may include a <Shape Interior Pattern> attribute that may identify the source of a pattern image (akin to the image file identified above) to be matched for the object, and a <Shape Rotation> attribute that may identify a required angle of rotation and associated granularity values.
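A <Shape Rotation> comparison needs to account for angles wrapping around at 360 degrees. A minimal sketch, assuming the granularity value is expressed in degrees (the function name and the 5-degree default are assumptions for illustration):

```python
def rotation_matches(detected_deg, required_deg, granularity_deg=5.0):
    """True if the detected angle is within `granularity_deg` of the required
    angle, treating e.g. 358 and 2 degrees as only 4 degrees apart."""
    diff = abs(detected_deg - required_deg) % 360.0
    diff = min(diff, 360.0 - diff)  # take the shorter way around the circle
    return diff <= granularity_deg
```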
- Using one or more of the features and approaches described above, users' experiences with various orientations can be improved. Although the description above provides illustrative examples and sequences of actions, it should be understood that the various examples and sequences may be rearranged, divided, combined and subcombined as desired. For example, steps and features described may be omitted, or additional steps and features may be added. Accordingly, although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Claims (19)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/426,101 US8001613B2 (en) | 2006-06-23 | 2006-06-23 | Security using physical objects |
Publications (2)
Publication Number | Publication Date |
---|---|
US20070300307A1 true US20070300307A1 (en) | 2007-12-27 |
US8001613B2 US8001613B2 (en) | 2011-08-16 |
Family
ID=38874947
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/426,101 Expired - Fee Related US8001613B2 (en) | 2006-06-23 | 2006-06-23 | Security using physical objects |
Country Status (1)
Country | Link |
---|---|
US (1) | US8001613B2 (en) |
US20030063132A1 (en) * | 2001-08-16 | 2003-04-03 | Frank Sauer | User interface for augmented and virtual reality systems |
US7036090B1 (en) * | 2001-09-24 | 2006-04-25 | Digeo, Inc. | Concentric polygonal menus for a graphical user interface |
US20030119576A1 (en) * | 2001-12-20 | 2003-06-26 | Mcclintic Monica A. | Gaming devices and methods incorporating interactive physical skill bonus games and virtual reality games in a shared bonus event |
US20050166264A1 (en) * | 2002-01-08 | 2005-07-28 | Kazuhiro Yamada | Content delivery method and content delivery system |
US20040005920A1 (en) * | 2002-02-05 | 2004-01-08 | Mindplay Llc | Method, apparatus, and article for reading identifying information from, for example, stacks of chips |
US7104891B2 (en) * | 2002-05-16 | 2006-09-12 | Nintendo Co., Ltd. | Game machine and game program for displaying a first object casting a shadow formed by light from a light source on a second object on a virtual game space |
US20050122308A1 (en) * | 2002-05-28 | 2005-06-09 | Matthew Bell | Self-contained interactive video display system |
US6795452B2 (en) * | 2002-05-31 | 2004-09-21 | Sandbridge Technologies, Inc. | Method of tracking time intervals for a communication signal |
US20040032409A1 (en) * | 2002-08-14 | 2004-02-19 | Martin Girard | Generating image data |
US20040090432A1 (en) * | 2002-11-01 | 2004-05-13 | Fujitsu Limited | Touch panel device and contact position detection method |
US20040119746A1 (en) * | 2002-12-23 | 2004-06-24 | Authenture, Inc. | System and method for user authentication interface |
US20040212617A1 (en) * | 2003-01-08 | 2004-10-28 | George Fitzmaurice | User interface having a placement and layout suitable for pen-based computers |
US7327375B2 (en) * | 2003-05-13 | 2008-02-05 | Sega Corporation | Control program for display apparatus |
US20050017709A1 (en) * | 2003-07-25 | 2005-01-27 | Honeywell International Inc. | Magnetoresistive turbocharger compressor wheel speed sensor |
US6847856B1 (en) * | 2003-08-29 | 2005-01-25 | Lucent Technologies Inc. | Method for determining juxtaposition of physical components with use of RFID tags |
US20050054392A1 (en) * | 2003-09-04 | 2005-03-10 | Too Yew Teng | Portable digital device orientation |
US20050069186A1 (en) * | 2003-09-30 | 2005-03-31 | Konica Minolta Medical & Graphic, Inc. | Medical image processing apparatus |
US20050183035A1 (en) * | 2003-11-20 | 2005-08-18 | Ringel Meredith J. | Conflict resolution for graphic multi-user interface |
US20050110781A1 (en) * | 2003-11-25 | 2005-05-26 | Geaghan Bernard O. | Light emitting stylus and user input device using same |
US7085590B2 (en) * | 2003-12-31 | 2006-08-01 | Sony Ericsson Mobile Communications Ab | Mobile terminal with ergonomic imaging functions |
US20050146508A1 (en) * | 2004-01-06 | 2005-07-07 | International Business Machines Corporation | System and method for improved user input on personal computing devices |
US20050162402A1 (en) * | 2004-01-27 | 2005-07-28 | Watanachote Susornpol J. | Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback |
US20050177054A1 (en) * | 2004-02-10 | 2005-08-11 | Dingrong Yi | Device and process for manipulating real and virtual objects in three-dimensional space |
US7483015B2 (en) * | 2004-02-17 | 2009-01-27 | Aruze Corp. | Image display system |
US20050200291A1 (en) * | 2004-02-24 | 2005-09-15 | Naugler W. E. Jr. | Method and device for reading display pixel emission and ambient luminance levels |
US7397464B1 (en) * | 2004-04-30 | 2008-07-08 | Microsoft Corporation | Associating application states with a physical object |
US20060015501A1 (en) * | 2004-07-19 | 2006-01-19 | International Business Machines Corporation | System, method and program product to determine a time interval at which to check conditions to permit access to a file |
US20060017709A1 (en) * | 2004-07-22 | 2006-01-26 | Pioneer Corporation | Touch panel apparatus, method of detecting touch area, and computer product |
US20060161871A1 (en) * | 2004-07-30 | 2006-07-20 | Apple Computer, Inc. | Proximity detector in handheld device |
US20060026535A1 (en) * | 2004-07-30 | 2006-02-02 | Apple Computer Inc. | Mode-based graphical user interfaces for touch sensitive input devices |
US20060075250A1 (en) * | 2004-09-24 | 2006-04-06 | Chung-Wen Liao | Touch panel lock and unlock function and hand-held device |
US20060077211A1 (en) * | 2004-09-29 | 2006-04-13 | Mengyao Zhou | Embedded device with image rotation |
US20080192005A1 (en) * | 2004-10-20 | 2008-08-14 | Jocelyn Elgoyhen | Automated Gesture Recognition |
US20060090078A1 (en) * | 2004-10-21 | 2006-04-27 | Blythe Michael M | Initiation of an application |
US20060119541A1 (en) * | 2004-12-02 | 2006-06-08 | Blythe Michael M | Display system |
US20060156249A1 (en) * | 2005-01-12 | 2006-07-13 | Blythe Michael M | Rotate a user interface |
US20070063981A1 (en) * | 2005-09-16 | 2007-03-22 | Galyean Tinsley A Iii | System and method for providing an interactive interface |
US20070188518A1 (en) * | 2006-02-10 | 2007-08-16 | Microsoft Corporation | Variable orientation input mode |
US20070220444A1 (en) * | 2006-03-20 | 2007-09-20 | Microsoft Corporation | Variable orientation user interface |
US20070236485A1 (en) * | 2006-03-31 | 2007-10-11 | Microsoft Corporation | Object Illumination in a Virtual Environment |
US20080040692A1 (en) * | 2006-06-29 | 2008-02-14 | Microsoft Corporation | Gesture input |
Cited By (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7612786B2 (en) | 2006-02-10 | 2009-11-03 | Microsoft Corporation | Variable orientation input mode |
US20070188518A1 (en) * | 2006-02-10 | 2007-08-16 | Microsoft Corporation | Variable orientation input mode |
US8930834B2 (en) | 2006-03-20 | 2015-01-06 | Microsoft Corporation | Variable orientation user interface |
US8139059B2 (en) | 2006-03-31 | 2012-03-20 | Microsoft Corporation | Object illumination in a virtual environment |
US20070284429A1 (en) * | 2006-06-13 | 2007-12-13 | Microsoft Corporation | Computer component recognition and setup |
US20070300182A1 (en) * | 2006-06-22 | 2007-12-27 | Microsoft Corporation | Interface orientation using shadows |
US7552402B2 (en) | 2006-06-22 | 2009-06-23 | Microsoft Corporation | Interface orientation using shadows |
US20080291283A1 (en) * | 2006-10-16 | 2008-11-27 | Canon Kabushiki Kaisha | Image processing apparatus and control method thereof |
US10318076B2 (en) | 2006-10-16 | 2019-06-11 | Canon Kabushiki Kaisha | Image displaying apparatus with changed menu based on detection of mobile information terminal placed thereon |
EP2472373A4 (en) * | 2009-08-27 | 2016-08-17 | Sony Corp | Information processing device, information processing method, and program |
US20110283338A1 (en) * | 2010-05-14 | 2011-11-17 | Microsoft Corporation | Sensor-based authentication to a computer network-based service |
US8935767B2 (en) | 2010-05-14 | 2015-01-13 | Microsoft Corporation | Overlay human interactive proof system and techniques |
US8621583B2 (en) * | 2010-05-14 | 2013-12-31 | Microsoft Corporation | Sensor-based authentication to a computer network-based service |
EP2569727A4 (en) * | 2010-05-14 | 2013-12-25 | Microsoft Corp | Overlay human interactive proof system and techniques |
EP2569727A2 (en) * | 2010-05-14 | 2013-03-20 | Microsoft Corporation | Overlay human interactive proof system and techniques |
US11443577B2 (en) | 2010-06-16 | 2022-09-13 | Delphian Systems, LLC | Wireless device enabled locking system |
US9077716B2 (en) | 2010-06-16 | 2015-07-07 | Delphian Systems, LLC | Wireless device enabled locking system |
CN103026682A (en) * | 2010-06-16 | 2013-04-03 | 德尔斐系统有限公司 | Wireless device enabled locking system |
US9691201B2 (en) | 2010-06-16 | 2017-06-27 | Delphian Systems, LLC | Wireless device enabled locking system |
WO2011159921A1 (en) * | 2010-06-16 | 2011-12-22 | Delphian Systems, LLC | Wireless device enabled locking system |
US11354958B2 (en) | 2010-06-16 | 2022-06-07 | Delphian Systems, LLC | Wireless device enabled locking system having different modalities |
US9141779B2 (en) | 2011-05-19 | 2015-09-22 | Microsoft Technology Licensing, Llc | Usable security of online password management with sensor-based authentication |
US9858402B2 (en) | 2011-05-19 | 2018-01-02 | Microsoft Technology Licensing, Llc | Usable security of online password management with sensor-based authentication |
US20140210703A1 (en) * | 2013-01-31 | 2014-07-31 | Samsung Electronics Co. Ltd. | Method of unlocking and subsequent application launch in portable electronic device via orientation sensing |
US11100736B2 (en) | 2013-05-20 | 2021-08-24 | Delphian Systems, LLC | Access control via selective direct and indirect wireless communications |
US10529156B2 (en) | 2013-05-20 | 2020-01-07 | Delphian Systems, LLC | Access control via selective direct and indirect wireless communications |
US20170123622A1 (en) * | 2015-10-28 | 2017-05-04 | Microsoft Technology Licensing, Llc | Computing device having user-input accessory |
WO2018057121A1 (en) * | 2016-09-20 | 2018-03-29 | Sony Interactive Entertainment Inc. | Input method for modeling physical objects in VR/digital |
RU180346U1 (en) * | 2016-12-09 | 2018-06-08 | Chuan-En LI | PRIVACY PROTECTION FILTER FOR DISPLAYS |
US11341227B2 (en) * | 2017-10-17 | 2022-05-24 | Tencent Technology (Shenzhen) Company Limited | Verification code generation method and apparatus, computer device, and storage medium |
CN109670291A (en) * | 2017-10-17 | 2019-04-23 | Tencent Technology (Shenzhen) Company Limited | A kind of implementation method of identifying code, device and storage medium |
WO2021216712A1 (en) * | 2020-04-21 | 2021-10-28 | Roblox Corporation | Systems and methods for accessible computer-user interactions |
US11740853B1 (en) | 2020-10-26 | 2023-08-29 | Wells Fargo Bank, N.A. | Smart table system utilizing extended reality |
US11429957B1 (en) | 2020-10-26 | 2022-08-30 | Wells Fargo Bank, N.A. | Smart table assisted financial health |
US11397956B1 (en) | 2020-10-26 | 2022-07-26 | Wells Fargo Bank, N.A. | Two way screen mirroring using a smart table |
US11457730B1 (en) | 2020-10-26 | 2022-10-04 | Wells Fargo Bank, N.A. | Tactile input device for a touch screen |
US11572733B1 (en) | 2020-10-26 | 2023-02-07 | Wells Fargo Bank, N.A. | Smart table with built-in lockers |
US11687951B1 (en) | 2020-10-26 | 2023-06-27 | Wells Fargo Bank, N.A. | Two way screen mirroring using a smart table |
US11727483B1 (en) | 2020-10-26 | 2023-08-15 | Wells Fargo Bank, N.A. | Smart table assisted financial health |
US11741517B1 (en) | 2020-10-26 | 2023-08-29 | Wells Fargo Bank, N.A. | Smart table system for document management |
US11969084B1 (en) | 2020-10-26 | 2024-04-30 | Wells Fargo Bank, N.A. | Tactile input device for a touch screen |
US20220171845A1 (en) * | 2020-11-30 | 2022-06-02 | Rovi Guides, Inc. | Enhancing intelligence in parental control |
US11543931B2 (en) * | 2021-01-27 | 2023-01-03 | Ford Global Technologies, Llc | Systems and methods for interacting with a tabletop model using a mobile device |
US20230350515A1 (en) * | 2021-07-13 | 2023-11-02 | Novatek Microelectronics Corp. | Transmission system |
GB2625153A (en) * | 2022-12-09 | 2024-06-12 | Andrews & Wykeham Ltd | Security credential |
Also Published As
Publication number | Publication date |
---|---|
US8001613B2 (en) | 2011-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8001613B2 (en) | Security using physical objects | |
US9171142B2 (en) | Arrangements for identifying users in a multi-touch surface environment | |
US11989394B2 (en) | Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs | |
US7552402B2 (en) | Interface orientation using shadows | |
US8930834B2 (en) | Variable orientation user interface | |
US7574739B2 (en) | Password authenticating apparatus, method, and program | |
RU2589397C2 (en) | Authentication graphic gestures | |
US9736137B2 (en) | System and method for managing multiuser tools | |
US7791597B2 (en) | Uniquely identifiable inking instruments | |
KR100686272B1 (en) | Display method for table type information terminal | |
US8390568B2 (en) | Display system | |
US20070188445A1 (en) | Uniquely identifiable inking instruments | |
US20040010722A1 (en) | Computer system and method of controlling booting of the same | |
US20050240871A1 (en) | Identification of object on interactive display surface by identifying coded pattern | |
US20060206717A1 (en) | Image or pictographic based computer login systems and methods | |
JP2007128486A (en) | Method and system for conducting transaction using recognized text | |
JPH11149454A (en) | Authenticating device, user authenticating method, card for authenticating user and recording medium | |
US11435866B2 (en) | Time-based device interfaces | |
US9557914B2 (en) | Electronic device, unlocking method, and non-transitory storage medium | |
US20070140533A1 (en) | Input device with a fingerprint recognizing mechanism | |
US20090109030A1 (en) | Using a physical object and its position on a surface to control an enablement state of a surface based computing device | |
US20170199994A1 (en) | Imaging devices and methods for authenticating a user | |
US20200382664A1 (en) | Recording medium storing control program and electronic device | |
CN107437015B (en) | System and method for orientation sensing of objects on electronic devices | |
US9424416B1 (en) | Accessing applications from secured states |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:., DUNCAN;REEL/FRAME:017855/0564 Effective date: 20060622 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034542/0001 Effective date: 20141014 |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20190816 |