WO1993007561A1 - Apparatus and method for projection upon a three-dimensional object - Google Patents

Apparatus and method for projection upon a three-dimensional object Download PDF

Info

Publication number
WO1993007561A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
data
guest
simplified
color pattern
Prior art date
Application number
PCT/US1992/008626
Other languages
French (fr)
Inventor
Marshall M. Monroe
Original Assignee
The Walt Disney Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Walt Disney Company filed Critical The Walt Disney Company
Priority to EP92922049A priority Critical patent/EP0615637A1/en
Priority to JP5507209A priority patent/JPH07504515A/en
Publication of WO1993007561A1 publication Critical patent/WO1993007561A1/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/54Accessories
    • G03B21/56Projection screens
    • G03B21/60Projection screens characterised by the nature of the surface
    • G03B21/606Projection screens characterised by the nature of the surface for relief projection
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/005Projectors using an electronic spatial light modulator but not peculiar thereto
    • G03B21/006Projectors using an electronic spatial light modulator but not peculiar thereto using LCD's
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Definitions

  • the present invention relates to projection devices and, more particularly, to an apparatus and method for projection upon a three-dimensional object.
  • Holograms, for example, are very limited in color palette and exhibit uncontrollable color shift with varying viewing angle.
  • the techniques they employ simply are not practical for reproducing a three-dimensional image from a two-dimensional depiction, because the two-dimensional image has to be initially captured and subsequently processed to include a depth component so that a three-dimensional image can be reconstituted.
  • interactive image modification is provided through a video shopping device that superimposes computer generated apparel upon an image of a human figure which is captured by a video camera.
  • the human figure adorns an orientation identifying feature that can be recognized by image control circuitry which maps the appropriate orientation of the computer stored apparel image onto the figure and which then displays the composite two-dimensional image upon a viewing screen.
  • This method has drawbacks in that it requires a human figure to wear at least one orientation identifying feature, and it does not provide for the projection of vivid and realistic three-dimensional images.
  • a method of projection which can interactively recreate three-dimensional images from two-dimensional depictions without the need for a video camera, advance processing or the adornment of orientation identifying features would therefore be desirable.
  • the present invention provides an apparatus and method for projecting images upon a three-dimensional object so as to impart a vivid and realistic appearance upon that object.
  • the apparatus employs graphics processing equipment and a projection means for projecting an image upon the object in a manner which allows for user interaction with the projected image.
  • Specific methods in accordance with the invention allow for an image to be created corresponding to the surface contour of the object, as well as the definition of regions within that contour which may be independently processed for projection upon the object.
  • a user or guest may create and edit a complete artwork data file which contains all of the perspective, registration and optical keystoning corrections necessary for projecting a vivid and realistic image, and which accurately conveys depth when projected upon the object.
  • the projected image also can be modified in real time, providing animation, interactivity, transformation and even translation.
  • the apparatus includes a projector and an addressable light filter means which is adapted to filter and color the light projected onto the object.
  • a user interface means receives graphics data for the creation of a properly aligned projection contour and regions within the contour.
  • the graphics data is then fed to a graphics processing device, such as a computer coupled to the addressable light filter means, to generate and control projection of the desired image utilizing multi-dimensional bit-mapping.
  • the user interface means consists of a user interface and one or more simplified guest interfaces.
  • a simplified guest interface which may be in the form of a Polhemus device, joystick, gimballed stylus, mouse, etc., may be added so as to permit a guest of the user to input graphics data that is used to manipulate color patterns projected onto regions of the object without user supervision.
  • the data received from the simplified guest interface corresponds to a particular position on the object and allows for selection of an active color pattern and for selective painting of that color pattern upon a region corresponding to the position on the object.
  • Contour and other graphics data is typically generated at the user interface, which may include a stylus and digitizing pad or a mouse, by tracing an image or the like on the object.
  • the computer processes the graphics data and generates an output representing an image which corresponds to the surface contour of the object, as traced by the user.
  • This output controls the light filter means and commands it to filter the projected light such that the image is projected onto the object in various colors, with the appearance of shading, surface textures and other characteristics as desired.
  • the light filter means includes two or more optically superpositioned liquid crystal panels that are individually composited with a color filter. These displays are controlled in response to the output from the computer to filter the light from the projector to thereby produce color components and other projection features of the image.
  • a display monitor also may be provided for connection to the processing device to permit two-dimensional display of the bit map data file on the monitor.
  • the liquid crystal filter may be driven from the same graphics signal that is supplied to control the display monitor, and accordingly comprises three such displays that subtract light to create a color projected image.
  • the processing device comprises a computer having graphics software, with the display monitor being coupled to the computer.
  • the graphics software is designed to store the processed graphics data in a memory, and to permit graphics data and image projection patterns corresponding to locations on the object to be created, generated, modified, maintained, erased or retrieved for subsequent projection.
  • a method of projecting an image onto a three-dimensional object includes the steps of entering the graphics data into the graphics input device and then processing that data to generate an output representing an image which corresponds to the surface contour of the object.
  • the light filter is then controlled in response to the output to filter light from the projector such that the image is projected onto the object with a desired appearance.
  • the output may be stored in a buffer and then processed by the user to interactively modify the image.
  • the output representing the image may be stored in a memory for subsequent recall and projection on the object.
  • a plurality of outputs may be stored to form a sequence of different but related images for sequential projection upon the object so as to make the object appear to be in motion.
  • FIG. 1 is a perspective view of an apparatus embodying the novel features of the present invention, showing a three-dimensional object and a simplified guest interface according to one preferred embodiment of the current invention;
  • FIG. 2 is another perspective view of the apparatus, showing a colorless three-dimensional object, projector, computer and interface for controlling projection upon the object;
  • FIG. 3 is another perspective view, similar to FIG. 2, showing projection of an image upon the three-dimensional object as selected by the simplified guest interface;
  • FIG. 4 is a plan view of the object and a projection device used for projecting the image onto the object and illustrates the depth of field of the object;
  • FIG. 5 is a block diagram depicting the functional interaction between the computer, the simplified guest interface, a liquid crystal filter projector, and a switch/lamp assembly;
  • FIG. 6 is another block diagram showing the functional interaction between a polhemus device, the computer and an overhead-type liquid crystal filter projector;
  • FIG. 7 is a logic block diagram of the software necessary to direct the computer of the preferred embodiment to control projection of an image according to the current invention.
  • an amusement apparatus for use in projecting images onto a three-dimensional object 12, for example, having the form of animated characters on a stage.
  • An amusement guest 14 positioned in front of the object 12 directs a wand 18 at regions 16 of the object and utilizes a button 20, located on the wand, for selectively coloring or drawing upon a region determined from the orientation of the wand.
  • FIGS. 2-3 show the apparatus 10 in more detail.
  • the three-dimensional object 12 is in the form of a carousel horse supported on a raised platform 26 or the like.
  • the three-dimensional object is a bas-relief, with attention paid to avoiding "undercuts" where the projected light cannot reach. While the object illustrated in the preferred embodiment is an animated character, it will be appreciated that various other types and forms of three-dimensional objects having diverse surface shapes and configurations may be used in the context of the present invention.
  • the surface 28 of the object 12 is a monochromatic, neutral-colored projection surface. In this way, the images projected upon the object will not be affected by unwanted colors on the object itself.
  • the apparatus 10 includes a projection means that is aligned so as to project light upon the object 12.
  • the projection means, which may include any device suitable for projecting light, such as a liquid crystal display projector or light valve projector, is illustrated in the preferred embodiment as comprising an overhead-type projector 29 with a large-format liquid crystal light filter.
  • data processing means are responsive to inputs received by a user interface means for processing graphics data to generate an image 24 and for controlling projection of the image upon the object 12. More particularly, the data processing means includes a computer graphics system 30.
  • the user interface means may have one, two or even more interfaces for interaction with the computer graphics system 30.
  • a single user interface 32 consisting of a keyboard 34 and gimballed stylus 36 with a drawing surface 37 of a digitizing pad 38, and one or more simplified guest interfaces, generally referred to by the reference numeral 40.
  • the simplified guest interface 40 may be any device that allows a guest to position a projected cursor or otherwise indicate position on the object 12, decide to paint a region or draw upon the object and select a current color pattern for projection onto the object.
  • a user enters graphics data at the user interface 32 to define "regions" corresponding to positions 16 on the object 12.
  • the user can trace the object while viewing both the projection of the traced lines on the object 12 and the formation of contours as depicted on a computer monitor 42.
  • by tracing these contours the user preferably creates at least one computer region corresponding to the object 12, may further subdivide each region into multiple regions, and may store the created image to memory.
  • the system is then set up for interaction with an amusement guest via the simplified guest interface 40.
  • the user interface 32 of the preferred embodiment utilizes a keyboard 34, stylus 36 and digitizing pad 38, but may instead comprise a digitizing pad and stylus alone, a mouse, a keyboard alone, a track ball, or any other device that is effective to input data to the computer graphics system 30.
  • the guest 14 may use the simplified guest interface 40 to projectively color the object 12 in accordance with data provided by the guest 14.
  • the simplified guest interface 40 includes the wand 18, a "mouseclick" button 20, and a set of twelve color pattern buttons 46. Four of these buttons will select a pattern, e.g., polka dot, stripes, stars and crescent moons, etc., and eight buttons are employed to present a color selection.
  • the wand 18 may be presented as shown in FIG. 1, having an enlarged paintbrush 48 with a bristle-end 50 that is to be pointed at a position on the object 12.
  • by pushing the mouseclick 20, the guest causes the simplified guest interface 40 to direct the computer graphics system 30 to projectively paint the portion of the object 12 corresponding to the wand's orientation with the current color pattern.
  • the bristle-end 50 may carry lights for illuminating the bristle-end with the current color pattern, in a manner to simulate paint on the bristle-end of the brush.
  • the simplified guest interface 40 may present the guest with an option to projectively paint brush strokes or draw upon the object 12, in addition to flooding portions of the object as defined by the software delineated "Regions.”
  • the wand 18 may house a so-called Polhemus device 92.
  • the polhemus device utilizes low-frequency magnetic field technology to determine the position and orientation of a sensor 94 in relation to a source unit 96 and presents information representative of six degrees of freedom in ASCII or binary format.
  • a unit sold under the name 3SPACE™ ISOTRAK™ has been found to be acceptable. There are, however, other units that may also be conveniently used in accordance with the invention.
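The six-degree-of-freedom information mentioned above can be consumed by a short parser. The following is a minimal sketch, assuming a whitespace-separated ASCII record of x, y, z, azimuth, elevation and roll; the record layout and the name `parse_pose` are illustrative, not taken from the ISOTRAK documentation.

```c
#include <stdio.h>

/* One six-degree-of-freedom sample: position plus Euler angles. */
typedef struct { double x, y, z, az, el, roll; } Pose;

/* Parse one ASCII record; returns 1 on success, 0 on a malformed record. */
int parse_pose(const char *line, Pose *p)
{
    return sscanf(line, "%lf %lf %lf %lf %lf %lf",
                  &p->x, &p->y, &p->z, &p->az, &p->el, &p->roll) == 6;
}
```

A binary-format unit would instead require fixed-width field decoding, but the ASCII path keeps the serial handling trivial.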
  • FIG. 7 includes a block diagram that illustrates the logic steps that the software incorporating the modifications needs to accomplish. A more detailed statement of the software is also located at the end of this detailed description.
  • the guest 14 manipulates the wand 18 to provide data to a computer 52 in a manner to projectively draw upon the object 12 or paint portions 16 of the object 12.
  • by processing the graphics data fed to the computer 52, all of the perspective, registration and optical keystoning corrections necessary for exact alignment of a re-projected image onto the object 12 are made.
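One way to model the perspective and keystoning correction described above is as a planar projective (homography) warp applied to each bit-map coordinate before projection. This is a sketch under that assumption; the patent does not specify the correction math, and the names here are illustrative.

```c
/* A 3x3 projective transform; m[2][2] is normally 1. */
typedef struct { double m[3][3]; } Homography;

/* Map a source pixel (x, y) to its corrected location (*xo, *yo). */
void apply_homography(const Homography *h, double x, double y,
                      double *xo, double *yo)
{
    double w = h->m[2][0] * x + h->m[2][1] * y + h->m[2][2];
    *xo = (h->m[0][0] * x + h->m[0][1] * y + h->m[0][2]) / w;
    *yo = (h->m[1][0] * x + h->m[1][1] * y + h->m[1][2]) / w;
}
```

With the identity matrix the warp is a no-op; off-diagonal and bottom-row terms introduce the shear and keystone needed to register the projection with the object.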
  • the processing of the graphics data may be divided into two segments. In the first, input from the user (not shown) to the user interface 32 is used to create an artwork data file. The user traces closed contours corresponding to portions 16 of the object 12 which are bit-mapped and used to define "regions" via the software.
  • contours are aligned in projection with the object, because the user has observed the projected contours corresponding to the portions 16 of the object 12 contemporaneously with their having been traced upon the object.
  • Inputs from the user interface 32 are also received to indicate color, shading, texturing and perspective which are selectively assigned to regions.
  • This artwork file is then ready for interaction by guests, and may be stored to memory of the computer 52 if it is desired to recall the original artwork data file or impose a default image projection.
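The "regions" mechanism above can be pictured as a region-ID map rasterized from the traced contours, parallel to the projected bit map; painting a region then recolors every pixel whose ID matches the one under the cursor. A minimal sketch; the 8x4 canvas, IDs and names are invented for illustration.

```c
#define W 8
#define H 4

/* region_id[y][x]: 0 = background, 1..n = regions traced by the user. */
static const unsigned char region_id[H][W] = {
    {0, 1, 1, 0, 2, 2, 2, 0},
    {0, 1, 1, 0, 2, 2, 2, 0},
    {0, 1, 1, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
};

static unsigned char frame[H][W]; /* color-pattern index per pixel */

/* Flood the whole region under (px, py) with the current color pattern. */
void paint_region(int px, int py, unsigned char color)
{
    unsigned char id = region_id[py][px];
    if (id == 0)
        return; /* background is not paintable */
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (region_id[y][x] == id)
                frame[y][x] = color;
}
```

Because the map is precomputed when the user traces the contours, a guest's click costs only one lookup plus a scan, which keeps the interaction real-time.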
  • the artwork data file also can be enhanced on the computer 52 by using the keyboard 34 to select appropriate commands so that when the images are projected onto the object, as shown in FIGS. 3-4, the object appears to be a real character having appropriate coloring, shading and desired textured appearances.
  • the second segment of data processing includes processing of data from the simplified guest interface 40 to allow guest manipulation of the image.
  • the guest 14 enters data via the simplified guest interface 40 that accesses the artwork data file.
  • the simplified guest interface 40 may be any interface that permits the guest to interact with the artwork data file and preferably allows only limited access to that file, i.e., the guest 14 preferably does not have the choice to completely erase all information corresponding to the object 12 including definition of regions.
  • the guest 14 may employ the twelve buttons 46 to select a current color pattern and then utilize the mouseclick 20 to assign that color pattern to one of the pre-defined regions that is identified by the orientation of the wand 18 such that the bristle end 50 of the paint brush 48 points to the portion 16 of the object that corresponds to that region.
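Identifying the region "pointed to" by the wand reduces to intersecting the sensor's pointing ray with the projection surface; the resulting hit point then indexes the region map. A sketch assuming, for simplicity, a planar surface at z = 0 and a wand direction with a negative z component; the patent leaves this geometry unspecified.

```c
/* Intersect the ray from the sensor at (sx, sy, sz) with direction
   (dx, dy, dz) against the plane z = 0.  Returns 1 and writes the hit
   point to (*hx, *hy), or returns 0 if the ray points away. */
int wand_hit(double sx, double sy, double sz,
             double dx, double dy, double dz,
             double *hx, double *hy)
{
    if (dz >= 0.0)
        return 0;
    double t = -sz / dz; /* ray parameter where it crosses z = 0 */
    *hx = sx + t * dx;
    *hy = sy + t * dy;
    return 1;
}
```

A bas-relief surface could be handled the same way by intersecting against its depth map instead of a single plane.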
  • the user will first trace the boundaries of the tail 58 by watching a projection pen tip of the stylus 36 create a contour on the object 12 itself.
  • the VGA signal emulated by the computer 52 to the monitor 42 is readily used as the projection control signal for the image.
  • a guest 14 may subsequently utilize the wand 18, mouseclick 20 and set of buttons 46 to select in real time an appropriate color, shading or the appearance of a particular texture to be projected within the traced area.
  • a single button may replace the set of twelve buttons 46 which, when depressed, causes the computer 52 to select a new current color pattern.
  • color pattern selection could also be made by directing the wand 18 at a target, such as a paint bucket 51 of FIG. 1, and pressing the mouseclick 20.
  • entry of the graphics data via either of the user interface 32 or the simplified guest interface 40 generates signals which are transferred to a computer 52.
  • the computer may be chosen to be a personal computer having one or more disk drives 54 for receiving graphics software.
  • the keyboard 34 and the visual display monitor 42 are coupled to the central processing unit of the computer.
  • the computer processes the graphics data entered by the user on the digitizing pad 38 and generates a VGA format RGB output corresponding to the image drawn or traced upon the pad. This output is then emulated to both the monitor 42 and the projector 29 for projection onto the object 12.
  • the computer comprises a Macintosh personal computer manufactured by Apple Computer, Inc. of Cupertino, California. In the case of computers other than the Macintosh, a VGA board may be necessary for emulation of signals for driving the projector 29.
  • projector 29 which is preferably an overhead type projector, is aligned to face the three-dimensional object 12 and includes a 1,000 watt light source 86 for projecting light upon the object.
  • An addressable light filter means 62 is mounted on the projector 29 in between an objective lens 64 and a condensing lens 66. These two lenses are conventional in most overhead-type projectors to enable focusing and alignment of the light for projection purposes. In the present invention, however, the objective lens and condensing lens are specially configured to provide a relatively large depth of field as shown by the reference designation D in FIG. 4. This is accomplished by using a wide angle objective lens 64.
  • the addressable light filter means 62 which is taken to include any projection device that electronically enables projection of an image by filtering light, is coupled to the computer 52 and is adapted to selectively filter the light projected onto the object 12 in response to output control signals generated by the computer.
  • the light filter means 62 comprises a plurality of optically superpositioned liquid crystal panels 68. Each of these panels is composited with a color filter and is comprised of a plurality of addressable pixels (not shown) which are individually controlled in response to the computer output to generate color elements for the composite image.
  • the liquid crystal panels 68 in the preferred embodiment are designed to accept VGA format signals from the computer 52. Accordingly, an RGB transcoder 70 is connected between the computer 52 and the liquid crystal panels 68 to convert the high resolution output graphics signals, generated by the Macintosh computer, into a VGA format.
  • the liquid crystal filter 68 is comprised of three superpositioned filters, yellow, cyan and magenta, and accordingly has three liquid crystal panels that are individually composited with these light filters. These secondary colors are chosen to subtract light from the projected light to project the image upon the object.
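Because each panel subtracts its complementary primary (cyan attenuates red, magenta green, yellow blue), panel drive values for a desired RGB pixel can be derived by simple complementation. A minimal sketch assuming 8-bit channels and a linear panel response; the names are illustrative, and a real filter stack would need a measured calibration curve.

```c
typedef struct { unsigned char r, g, b; } RGB;
typedef struct { unsigned char c, m, y; } CMY; /* panel drive levels */

/* Full drive (255) means maximum attenuation of that panel's primary. */
CMY rgb_to_cmy(RGB in)
{
    CMY out;
    out.c = (unsigned char)(255 - in.r);
    out.m = (unsigned char)(255 - in.g);
    out.y = (unsigned char)(255 - in.b);
    return out;
}
```

White leaves all three panels clear so the full lamp output passes; black drives all three to maximum attenuation.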
  • commercially available liquid crystal filters include a circuit, designated by the reference numeral 72 in FIGS. 2-3, that appropriately converts an RGB signal from the computer into control signals for each of the three panels.
  • FIG. 5 shows a functional block diagram of an embodiment of the system that utilizes an IBM compatible personal computer 74.
  • the wand 18 is in the form of a joystick 78 with the familiar paintbrush 48 mounted at the end of the joystick.
  • the mouseclick 20 is also located on either the paintbrush or the joystick.
  • the wand 18 is coupled to the personal computer which is fitted with I/O boards 82 and 84 for communication, with the LCD overhead display 29 and a twelve button set 46 for selection of colors and patterns, respectively.
  • FIG. 6 shows a partial connection block diagram of the preferred embodiment, including the wand 18 that includes the paintbrush 48, a pivotal mounting 90, and the polhemus device 92.
  • the polhemus device 92 consists of a sensor 94, a source 96 and a polhemus controller 98 which emulates signals for receipt by the computer. Selection of a region identified by the orientation of the paintbrush 48 is accomplished by pushing the mouseclick 20, which is coupled to a mouse device 100.
  • the mouse device 100 serially emulates a digital word to the computer 52 that indicates to the computer 52 and to the custom software that the guest desires to modify the current region.
  • the computer 52 then edits the bit map data file stored in memory and emulates signals to the monitor and liquid crystal filter 68 of the overhead projector 29 to project the image upon the object 12.
  • the projector 29 has a one-thousand watt light source 86 for projecting the images onto the object 12.
  • liquid crystal panels 68 having a ten-inch-by-ten-inch cross-section mounted over the condensing lens 66 are best suited for in-focus projection over approximately a twenty-inch range of depth D with respect to the three-dimensional object 12.
  • the lens system of the projector 29 may be modified as desired to achieve a different in-focus range of depth D over the object 12. Whatever range of depth D is selected, however, care must be taken to ensure that the surface 28 of the object 12 to be projected upon does not have a contour that varies from front to back by more than the desired range of depth.
  • a very high quality image can be generated, aligned on the object and enhanced by the computer graphics system 30 so that the user or other viewers will perceive the object 12 as having full, realistic three-dimensional features.
  • Sophisticated graphics software is used to select and generate various colors, shading effects, the appearance of texture, animated transformation, and other commands for processing, manipulating, editing and storing the image to be projected on the object 12.
  • suitable bit-map painting programs include PIXELPAINT™, PHOTOSHOP™ or MACROMIND DIRECTOR™.
  • suitable software programs offering the added feature of animation as well as bit-map painting include DELUXE PAINT II™ or ANIMATOR™ (the latter being an animation program, as the name implies).
  • FIG. 7 shows a block diagram of the software used to control projection and the sophisticated graphics software and communication with peripherals, such as the simplified guest interface 40, to boards 82 and 84 and Polhemus device 92. Appendices A-F are a more detailed statement of this software.
  • a printer 102 is connected to the computer 52.
  • the printer is adapted to produce hard color copies 104 of the viewed image for the amusement of the guest 14 or other viewers.
  • This feature has special usefulness when the apparatus 10 is used as an amusement device in an amusement park or the like, since the guests will be able to take a sample of their design home with them.
  • the printer 102 also has advantages to artists and others who may use the apparatus 10 in a commercial environment or other contexts. Engineers can use the device for analyzing and comparing different optical systems.
  • a user will pick up the stylus 36 and apply it to the drawing surface 37 of the digitizing pad 38. Once the virtual pen tip of the stylus contacts the drawing surface, the user will see this pen tip as a point on the visual monitor 42 and on the three-dimensional object 12.
  • the user may, for example, trace the contours, forms and outline of the projection object 12 by watching the pen tip of the stylus 36 move around on the object.
  • the traced image which is displayed on the monitor 42 and object 12, defines software regions that carry data signifying desired colors, shading, or the appearance of texture. For example, the user may trace forms corresponding to items of clothing to be displayed upon the object, each such item having a corresponding region.
  • a two-dimensional bit-mapped digital artwork data file will have been created and may be stored in the computer 52.
  • the artwork file thus contains graphics data which is processed by the computer 52 in conjunction with the graphics software to generate an output representing the images traced by the user.
  • the system is then ready for interactive use with a guest via a simplified guest interface.
  • the simplified guest interface has been chosen to include a polhemus device 92, but any interface sufficient to designate or change position will suffice.
  • a mouse or a joystick may be used in equivalent fashion.
  • the computer 52 then processes this graphics information to generate the desired image for projection onto, for example, the character's tail 58, as shown in FIG. 4. Because the user and the guest 14 may observe the results of moving the wand 18 by looking up at the projection object 12 and observing a moving cursor, contour or virtual pen tip, the projected image registers exactly with the object's appearance or shape. In this way, the object will have high-quality three-dimensional color images and characteristics of three-dimensional form.
  • user created software translates inputs from the simplified guest interface 40 for acceptance by the sophisticated graphics software, and performs incidental tasks, such as periodic blinking of the character's eyes (not shown).
  • the software of the preferred embodiment for interaction with the sophisticated graphics software is attached hereto as Appendices A-F.
  • the user has the option of selecting commands using the keyboard 34 which are displayed in a border on the computer monitor 42.
  • These commands allow for presentation to the guest of selection of various colors, shading effects, the appearance of textures, and various other commands for manipulating, storing and editing the displayed image.
  • modification of the graphics software may be necessary if the simplified guest interface is to be presented with a limited subset of available colors, or with only a decision to select a new color pattern rather than a selection from the full range of color patterns available.
  • the switch I/O board 84 may be configured with physical switches (not shown) to allow changes in the emulated protocol such that the graphics software may recognize signals emulated by the I/O board 84 as representing alternate sets of colors and patterns.
  • the output representing the image may be stored in a memory of the computer 52 such that it may be recalled and subsequently projected onto the object 12 as desired.
  • a sequence of different but related images may be collated.
  • the object may appear to be in motion or to display some other characteristic of movement, even though the object 12 itself is stationary.
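The "sequence of different but related images" amounts to cycling through the stored outputs at a fixed rate. A sketch of the bookkeeping only; the structure and names are invented, and the actual frames would be the stored bit-map data files recalled from memory.

```c
/* Cyclic playback position over a stored sequence of nframes outputs. */
typedef struct { int nframes; int current; } FrameSeq;

/* Advance to the next stored output, wrapping to repeat the motion. */
int next_frame(FrameSeq *s)
{
    s->current = (s->current + 1) % s->nframes;
    return s->current;
}
```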
  • FIG. 7 contains a software block 108 that automatically projects eyes on the object that appear to be blinking.
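A periodic blink like that of software block 108 can be driven from the system tick counter (cf. TickCount in Appendix A): the eyes are drawn closed for a short window out of every period. The period and blink duration below are assumptions, as is the 60 ticks-per-second Macintosh rate.

```c
/* Returns 1 while the projected eyes should be drawn closed. */
int eyes_closed(unsigned long ticks)
{
    const unsigned long period = 240; /* ~4 s at 60 ticks/s (assumed) */
    const unsigned long shut = 12;    /* ~0.2 s blink (assumed) */
    return (ticks % period) < shut;
}
```

Polling this predicate each pass through the event loop and redrawing only on a state change keeps the incidental task cheap.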
  • An alternative, but as yet untested application of the present invention, would be to have a physical mechanical movement of the object. Video information could then be synched to the physical movement and a file of projected animation designed. By projecting the two-dimensional image in exact synchronization with the repeated movement of the projection object, a three-dimensional object having a wide range of physical movement would result.
  • the present invention has far ranging applications, including the fields of video shopping and cosmetic surgery, as well as the amusement field.
  • an image may be projected upon a shopper (not shown) to create an impression that the shopper is wearing desired apparel.
  • the shopper's outline may be traced and a specific field within that outline traced, perhaps corresponding to a shirt on the shopper's torso.
  • the graphics information generated by this tracing may thereafter be processed and the light filter may be controlled such that green polka dots, for example, appear on the shopper's torso to make it appear that the shopper is wearing a green polka dotted shirt.
  • images may be projected upon a model of a car or house (also not shown) so as to impart apparent colors, styles or customizations to the car or house.
  • Features desired by cosmetic surgery may be projected onto the subject. The quality of projection is so good that the subject may be made to appear to have a differently shaped nose, a beard, eyebrows, or nearly any other physical characteristic.
  • One specific contemplated use of this invention is as a guest-operated amusement device, such as a three-dimensional coloring book for artistic applications.
  • The device could be used as a display or as an attraction at an amusement park, whereby users can designate color features for display on an object and see the results of their work as they do it, in three-dimensional rather than two-dimensional form.
  • The above-mentioned applications of this invention are listed as examples only, to illustrate the broad applications and the usefulness of this invention, and they are not intended to limit this invention in any way.
  • Appendix A is a software listing of a "C" language program that calls routines to set up menus and windows, initialize the system, move the cursor, reset the polhemus and define the paint regions.
  • Appendix B defines menus for Macintosh and
  • Appendix C is a routine for implementing polhemus driven cursor movements.
  • Appendix D is a routine for initializing serial communications and pausing the polhemus output.
  • Appendix E is a routine for managing window/cursor scaling, region definitions, random color/pattern files, cursor tracking and character eye blinking.
  • Appendix F is a resource file for dialog boxes and display.
  • int windowCode = FindWindow(theEvent->where, &theWindow); switch (windowCode)
  • DragWindow(PolhemusWindow, theEvent->where, &dragRect);
  • PlusCursorHdl = GetCursor(plusCursor);
  • NewTime = TickCount();
  • InsertMenu(appleMenu = GetMenu(AppleMenuID), 0);
  • InsertMenu(rsMenu = GetMenu(RS232MenuID), 0);
  • Boolean DA = kind < 0;
  • DialogPtr genDlgPtr;
  • genDlgPtr = GetNewDialog(AboutDialogID, NIL, Pointer1);
  • ModalDialog(NIL, &dummyInt);
  • genDlgPtr = GetNewDialog(DIPsettingDialogID, NIL, Pointer1); ModalDialog(NIL, &dummyInt); DisposDialog(genDlgPtr);
  • EnableItem(polMenu, Increment);
  • genDlgPtr = GetNewDialog(IncrementDialogID, NIL, Pointer1); GetDItem(genDlgPtr, 2, &itemType, &itemHdl, &box);
  • ToPolhemus("I", 1); /* send set increment command with value */ ToPolhemus(text2, text[0]);
  • xmax = screenBits.bounds.right - 1;
  • errorCode = SerHShake(modemIn, &flags);
  • errorCode = SerHShake(modemOut, &flags);
  • errorCode = SerSetBuf(modemIn, InBuffer, 1000);
  • errorCode = FSWrite(modemOut, &length, msg);
  • errorCode = FSRead(modemIn, &oneCount, &tempString[0]);
  • yaw = 10 * yaw + (int) c - (int) '0';
  • pitch = 10 * pitch + (int) c - (int) '0';
  • PutMouse(sp, inLocal);
  • Rect windowBounds = { 20, 0, 480, 640 };
  • Rect queenBounds = { 100, 100, 200, 200 };
  • Pattern squares = { 0xF0, 0xF0, 0xF0, 0xF0, 0xF0, 0x0F, 0x0F, 0x0F, 0x0F, 0x0F };
  • Pattern hearts = { 0x00, 0x6C, 0x92, 0x82, 0x82, 0x44, 0x28, 0x10 };
  • #define NumberOfRegions 10
  • DragRect = screenBits.bounds;
  • PolhemusWindow = NewWindow(0L, &windowBounds, "\pImageworks 3D Paint Demo", true, ...); SetPort(PolhemusWindow);
  • console = TENew(&d, &v);
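The yaw and pitch fragments above build an angle value one received ASCII digit at a time. A minimal, self-contained sketch of that accumulation (the function name is ours, not taken from the listings):

```c
#include <assert.h>

/* Accumulate a run of ASCII digits into an integer, one character at a
 * time, exactly as in the yaw/pitch fragments above:
 *   value = 10 * value + (int)c - (int)'0';
 */
int accumulate_digits(const char *s)
{
    int value = 0;
    while (*s >= '0' && *s <= '9') {
        value = 10 * value + (int)*s - (int)'0';
        s++;
    }
    return value;
}
```

In the actual routine (Appendix C) the digits arrive serially from the polhemus, so the accumulation happens across successive reads rather than over a stored string.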


Abstract

A projection apparatus and method for realistic projection with applications to amusement, optical engineering, video shopping and cosmetics. Graphics data is entered into a user interface (32, 42, 52) and is processed to generate an output (24) representing an image to be projected onto a three-dimensional object (12). This output controls a light filter (68), such as a plurality of optically superposed color composite liquid crystal panels, to selectively filter projected light so that an image having a desired appearance is projected upon the object (12). The projected image may be interactively modified, stored in memory and projected as part of an image sequence to create apparent motion in the object.

Description

APPARATUS AND METHOD FOR PROJECTION
UPON A THREE-DIMENSIONAL OBJECT
A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.

BACKGROUND OF THE INVENTION
The present invention relates to projection devices and, more particularly, to an apparatus and method for projection upon a three-dimensional object.
The projection of an image onto a three-dimensional object having various contours and shapes is not an easy task. It is generally known that the correction of all optical distortion problems inherent in flat, two-dimensional image projection is especially difficult when working with three-dimensional projection surfaces. These problems include proper image registration on the object, proper keystoning, corrections to ensure appropriate perspective appearances and focusing of the image within a specified range of depth.
Through the years, various attempts have been made to project images onto three-dimensional objects. For example, from the days of early artists it has been known to transfer three-dimensional images to two-dimensional images by using a pane of glass and tracing the three-dimensional image by eye onto the glass. In general, these three-dimensional images could not be accurately reconstituted from the glass by projection, because the human eye cannot reproduce the optical distortions induced by the condensing and objective lens systems used in most projectors. This is due in large part to the fact that the human eye generally cannot perceive depth or perspective in projected images.
In recent times, artists have nevertheless used projection in an effort to recreate three dimensional images. Attempts have been made, for example, to use two distinct but overlapping image projections of polarized light to create an illusion of depth. In these circumstances, special three-dimensional viewing glasses are worn for viewing the polarized image projections. These viewing glasses filter the polarized light to present one of the image projections to each eye. The discrepancies between these image projections create the impression of depth in the image.
Other approaches, such as holography, present a three-dimensional image through the interference patterns of two distinct projections of coherent light. In these applications, the phase difference between the light projections is varied such that some points in three-dimensional space appear brighter than others because of the superposition of the crests of the light waves.
The foregoing approaches have inherent limitations, however. Holograms, for example, are very limited in color palette and exhibit uncontrollable color shift with varying viewing angle. The techniques they employ simply are not practical for reproducing a three-dimensional image from a two-dimensional depiction, because the two-dimensional image has to be initially captured and subsequently processed to include a depth component so that a three-dimensional image can be reconstituted.
Attempts have also been made to recreate three-dimensional images by projecting a two-dimensional image upon a stationary three-dimensional object or a molded screen. However, these images are very difficult to edit and they cannot be modified in real time. The need for registration and alignment between the projected two-dimensional image and the three-dimensional projection surface limits the utility of these methods. Additionally, the requirement that these three-dimensional images be recorded in advance generally necessitates film preparation, which further contributes to the registration and keystoning difficulties, not to mention focusing problems associated with the desired depth of field.
In yet another system, interactive image modification is provided through a video shopping device that superimposes computer-generated apparel upon an image of a human figure which is captured by a video camera. The human figure wears an orientation-identifying feature that can be recognized by image control circuitry, which maps the appropriate orientation of the computer-stored apparel image onto the figure and then displays the composite two-dimensional image upon a viewing screen. This method, however, has drawbacks in that it requires the human figure to wear at least one orientation-identifying feature, and it does not provide for the projection of vivid and realistic three-dimensional images. A method of projection that can interactively recreate three-dimensional images from two-dimensional depictions, without the need for a video camera, advance processing or the wearing of orientation-identifying features, would therefore be desirable.
Accordingly, there has existed a definite need for an apparatus and method of projection which can recreate three-dimensional images from two-dimensional depictions without advance processing, and which solves the distortion problems inherent in flat image projection. Additionally, there has existed a need for such an apparatus and method that would allow for interactive image modification, and would therefore have applications in a wide range of fields, including, by way of example, a guest-interactive amusement attraction, optical engineering, video shopping and cosmetic surgery. The present invention satisfies these needs and provides further related advantages.
SUMMARY OF THE INVENTION
The present invention provides an apparatus and method for projecting images upon a three-dimensional object so as to impart a vivid and realistic appearance upon that object. The apparatus employs graphics processing equipment and a projection means for projecting an image upon the object in a manner which allows for user interaction with the projected image. Specific methods in accordance with the invention allow for an image to be created corresponding to the surface contour of the object, as well as the definition of regions within that contour which may be independently processed for projection upon the object. In this way, a user or guest may create and edit a complete artwork data file which contains all of the perspective, registration and optical keystoning corrections necessary for projecting a vivid and realistic image, and which accurately conveys depth when projected upon the object. The projected image also can be modified in real time, providing animation, interactivity, transformation and even translation.
More particularly, the apparatus includes a projector and an addressable light filter means which is adapted to filter and color the light projected onto the object. A user interface means receives graphics data for the creation of a properly aligned projection contour and regions within the contour. The graphics data is then fed to a graphics processing device, such as a computer coupled to the addressable light filter means, to generate and control projection of the desired image utilizing multi-dimensional bit-mapping.
In one form of the invention, the user interface means consists of a user interface and one or more simplified guest interfaces. A simplified guest interface, which may be in the form of a Polhemus device, joystick, gimballed stylus, mouse, etc., may be added so as to permit a guest of the user to input graphics data that is used to manipulate color patterns projected onto regions of the object without user supervision. The data received from the simplified guest interface corresponds to a particular position on the object and allows for selection of an active color pattern and for selective painting of that color pattern upon a region corresponding to the position on the object. Contour and other graphics data is typically generated at the user interface, which may include a stylus and digitizing pad or a mouse, by tracing an image or the like on the object.
The computer processes the graphics data and generates an output representing an image which corresponds to the surface contour of the object, as traced by the user. This output controls the light filter means and commands it to filter the projected light such that the image is projected onto the object in various colors, with the appearance of shading, surface textures and other characteristics as desired.
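The region painting described above can be pictured in miniature. In the following hypothetical sketch, every pixel of the bit map carries a region identifier, and painting floods each pixel of the selected region with one bit of an 8×8 monochrome pattern tile; all names, sizes and the tiling scheme are our assumptions, not details taken from the patent's listings:

```c
#include <assert.h>

enum { W = 8, H = 8 }; /* toy frame size; the real bit map is screen-sized */

/* Flood every pixel belonging to region_id with the tiled pattern.
 * pattern is 8 rows of 8 bits, as in the QuickDraw-style Pattern data
 * shown in the appendix fragments (e.g. the "squares" and "hearts" bytes). */
void paint_region(unsigned char region_map[H][W],
                  unsigned char frame[H][W],
                  unsigned char region_id,
                  const unsigned char pattern[8])
{
    int x, y;
    for (y = 0; y < H; y++)
        for (x = 0; x < W; x++)
            if (region_map[y][x] == region_id)
                frame[y][x] = (unsigned char)((pattern[y % 8] >> (7 - (x % 8))) & 1);
}
```

Pixels outside the chosen region are left untouched, which is what lets a guest recolor the horse's tail without disturbing the saddle.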
In one aspect of the invention, the light filter means includes two or more optically superpositioned liquid crystal panels that are individually composited with a color filter. These displays are controlled in response to the output from the computer to filter the light from the projector to thereby produce color components and other projection features of the image. In addition, a display monitor also may be provided for connection to the processing device to permit two-dimensional display of the bit map data file on the monitor. Typically, the liquid crystal filter may be driven from the same graphics signal that is supplied to control the display monitor, and therefore consists of three such displays that subtract light for creating a color projected image.
In another aspect of the invention, the processing device comprises a computer having graphics software, with the display monitor being coupled to the computer. The graphics software is designed to store the processed graphics data in a memory, and to permit graphics data and image projection patterns corresponding to locations on the object to be created, generated, modified, maintained, erased or retrieved for subsequent projection.
In another form of the invention, a method of projecting an image onto a three-dimensional object is provided. The method includes the steps of entering the graphics data into the graphics input device and then processing that data to generate an output representing an image which corresponds to the surface contour of the object. The light filter is then controlled in response to the output to filter light from the projector such that the image is projected onto the object with a desired appearance. In various aspects of this method, the output may be stored in a buffer and then processed by the user to interactively modify the image. In addition, the output representing the image may be stored in a memory for subsequent recall and projection on the object. In this regard, a plurality of outputs may be stored to form a sequence of different but related images for sequential projection upon the object so as to make the object appear to be in motion.

Other features and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the principles of the invention.
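The sequential-projection idea, in which stored outputs are replayed in order to create apparent motion, reduces to cycling through a ring of stored frame indices. A trivial sketch (the frame count is an arbitrary assumption):

```c
#include <assert.h>

enum { NUM_FRAMES = 4 }; /* number of stored, related images; assumed */

/* Advance to the next stored output, wrapping around so the sequence
 * repeats and the projected object appears to be in continuous motion. */
int next_frame(int current)
{
    return (current + 1) % NUM_FRAMES;
}
```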
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings illustrate the invention. In such drawings:
FIG. 1 is a perspective view of an apparatus embodying the novel features of the present invention, showing a three-dimensional object and a simplified guest interface according to one preferred embodiment of the current invention;
FIG. 2 is another perspective view of the apparatus, showing a colorless three-dimensional object, projector, computer and interface for controlling projection upon the object;
FIG. 3 is another perspective view, similar to FIG. 2, showing projection of an image upon the three-dimensional object as selected by the simplified guest interface;
FIG. 4 is a plan view of the object and a projection device used for projecting the image onto the object and illustrates the depth of field of the object;

FIG. 5 is a block diagram depicting the functional interaction between the computer, the simplified guest interface, a liquid crystal filter projector, and a switch/lamp assembly;
FIG. 6 is another block diagram showing the functional interaction between a polhemus device, the computer and an overhead-type liquid crystal filter projector; and
FIG. 7 is a logic block diagram of the software necessary to direct the computer of the preferred embodiment to control projection of an image according to the current invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
As shown in FIG. 1, the preferred embodiment of the present invention is embodied in an amusement apparatus, generally referred to by the reference numeral 10, for use in projecting images onto a three-dimensional object 12, for example, having the form of animated characters on a stage. An amusement guest 14 positioned in front of the object 12 directs a wand 18 at regions 16 of the object and utilizes a button 20, located on the wand, for selectively coloring or drawing upon a region determined from the orientation of the wand.
FIGS. 2-3 show the apparatus 10 in more detail. As shown in FIG. 2, the three-dimensional object 12 is in the form of a carousel horse supported on a raised platform 26 or the like. The three-dimensional object is a bas-relief, with attention paid to avoiding "undercuts" where the projected light cannot reach. While the object illustrated in the preferred embodiment is an animated character, it will be appreciated that various other types and forms of three-dimensional objects having diverse surface shapes and configurations may be used in the context of the present invention. In the preferred embodiment, the surface 28 of the object 12 is a monochromatic, neutral-colored projection surface. In this way, the images projected upon the object will not be affected by unwanted colors on the object itself.
In accordance with the invention, the apparatus 10 includes a projection means that is aligned so as to project light upon the object 12. The projection means, which may include any device suitable for projecting light, such as a liquid crystal display projector or a light valve projector, is illustrated in the preferred embodiment as comprising an overhead-type projector 29 with a large-format liquid crystal light filter. As described in more detail below, data processing means are responsive to inputs received by a user interface means for processing graphics data to generate an image 24 and for controlling projection of the image upon the object 12. More particularly, the data processing means includes a computer graphics system 30. The user interface means may have one, two or even more interfaces for interaction with the computer graphics system 30. In the preferred embodiment, there is typically a single user interface 32, consisting of a keyboard 34 and a gimballed stylus 36 used with a drawing surface 37 of a digitizing pad 38, and one or more simplified guest interfaces, generally referred to by the reference numeral 40. The simplified guest interface 40 may be any device that allows a guest to position a projected cursor or otherwise indicate position on the object 12, decide to paint a region or draw upon the object, and select a current color pattern for projection onto the object.
In using the preferred embodiment of the apparatus and method of the invention, a user enters graphics data at the user interface 32 to define "regions" corresponding to positions 16 on the object 12. By inputting the data via the keyboard 34, stylus 36 and digitizing pad 38, the user can trace the object while viewing both the projection of the traced lines on the object 12 and the formation of contours as depicted on a computer monitor 42. With these contours the user preferably creates at least one computer region corresponding to the object 12, may further subdivide each region into multiple regions, and may store the created image to memory.
The system is then set up for interaction with an amusement guest via the simplified guest interface 40.
Although the user interface 32 of the preferred embodiment utilizes a keyboard 34, stylus 36 and digitizing pad 38, there are many devices that may be equivalently employed as a user interface, for example, a digitizing pad and stylus alone, or a mouse, or a keyboard alone, or a track ball, as well as any other device that is effective to input data to the computer graphics system 30.
The guest 14 may use the simplified guest interface 40 to projectively color the object 12 in accordance with data provided by the guest 14. In the preferred embodiment, the simplified guest interface 40 includes the wand 18, a "mouseclick" button 20, and a set of twelve color pattern buttons 46. Four of these buttons select a pattern, e.g., polka dots, stripes, stars or crescent moons, and eight buttons are employed to present a color selection. The wand 18 may be presented as shown in FIG. 1, with an enlarged paintbrush 48 having a bristle-end 50 that is to be pointed at a position on the object 12. By pushing the mouseclick 20, which as shown in FIG. 6 is a button on the rearward end of the paintbrush 48, the simplified guest interface 40 causes the computer graphics system 30 to projectively paint the portion of the object 12 corresponding to the wand's orientation with the current color pattern. As an optional feature to the embodiment shown in FIG. 1, the bristle-end 50 may carry lights for illuminating the bristle-end with the current color pattern, in a manner to simulate paint on the bristle-end of the brush.
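The twelve-button selection described above amounts to a small piece of state: four buttons set the active pattern and eight set the active color. A hypothetical sketch of that state (the struct, the button numbering, and the split into the first four and last eight buttons are our assumptions about one reasonable wiring, not details from the patent):

```c
#include <assert.h>

/* Current color pattern selected by the guest: one of four patterns
 * (e.g. polka dots, stripes, stars, crescent moons) and one of eight colors. */
typedef struct {
    int pattern; /* 0..3 */
    int color;   /* 0..7 */
} ColorPattern;

/* Assumed mapping: buttons 0-3 pick the pattern, buttons 4-11 pick the color.
 * Out-of-range button numbers are ignored. */
void press_button(ColorPattern *cp, int button)
{
    if (button >= 0 && button < 4)
        cp->pattern = button;
    else if (button >= 4 && button < 12)
        cp->color = button - 4;
}
```

Pressing the mouseclick would then stamp the current (pattern, color) pair onto whichever region the wand points at.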
As a further refinement of the preferred embodiment, the simplified guest interface 40 may present the guest with an option to projectively paint brush strokes or draw upon the object 12, in addition to flooding portions of the object as defined by the software delineated "Regions."
There are many devices that may be incorporated into the simplified guest interface 40 to emulate signals to the computer graphics system 30 for identification of the orientation of the paintbrush 48 with respect to the object 12. As shown in FIG. 6, the wand 18 may house a so-called Polhemus device 92. The polhemus device utilizes low-frequency magnetic field technology to determine the position and orientation of a sensor 94 in relation to a source unit 96 and presents information representative of six degrees of freedom in ASCII or binary format. In the preferred embodiment, a unit sold under the name 3SPACE™ ISOTRAK™ has been found to be acceptable. There are, however, other units that may also be conveniently used in accordance with the invention. Utilization of a polhemus device may require modifications to the computer's graphics software such that the software accepts the protocol of the polhemus data output. These modifications are well within the skill of anyone familiar with computer programming. For convenience, FIG. 7 includes a block diagram that illustrates the logic steps that the software incorporating the modifications needs to accomplish. A more detailed statement of the software is also located at the end of this detailed description.
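Once the sensor's yaw and pitch have been read, driving a projected cursor requires only mapping those angles onto screen coordinates. A hypothetical sketch using a linear mapping over an assumed ±30° sweep onto a 640×480 display (the sweep, the clamping, and the function names are our assumptions; the actual routine in Appendix C also handles signs and increment commands):

```c
#include <assert.h>

enum { SCREEN_W = 640, SCREEN_H = 480, SWEEP_DEG = 30 };

static int clampi(int v, int lo, int hi)
{
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Map yaw in [-SWEEP_DEG, +SWEEP_DEG] linearly onto x in [0, SCREEN_W - 1]. */
int yaw_to_x(int yaw_deg)
{
    int y = clampi(yaw_deg, -SWEEP_DEG, SWEEP_DEG);
    return (y + SWEEP_DEG) * (SCREEN_W - 1) / (2 * SWEEP_DEG);
}

/* Map pitch in [-SWEEP_DEG, +SWEEP_DEG] linearly onto y in [0, SCREEN_H - 1]. */
int pitch_to_y(int pitch_deg)
{
    int p = clampi(pitch_deg, -SWEEP_DEG, SWEEP_DEG);
    return (p + SWEEP_DEG) * (SCREEN_H - 1) / (2 * SWEEP_DEG);
}
```

Because the projected image is registered with the object, the same mapping simultaneously places the cursor on the monitor and on the object's surface.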
In use, the guest 14 manipulates the wand 18 to provide data to a computer 52 in a manner to projectively draw upon the object 12 or paint portions 16 of the object 12. By processing the graphics data fed to the computer 52, all of the perspective, registration and optical keystoning corrections necessary for exact alignment of a re-projected image onto the object 12 are made. The processing of the graphics data may be divided into two segments. In the first, input from the user (not shown) to the user interface 32 is used to create an artwork data file. The user traces closed contours corresponding to portions 16 of the object 12, which are bit-mapped and used to define "regions" via the software. These contours are aligned in projection with the object, because the user has observed the projected contours corresponding to the portions 16 of the object 12 contemporaneously with tracing them upon the object. Inputs from the user interface 32 are also received to indicate color, shading, texturing and perspective, which are selectively assigned to regions. This artwork file is then ready for interaction by guests, and may be stored to memory of the computer 52 if it is desired to recall the original artwork data file or impose a default image projection. The artwork data file also can be enhanced on the computer 52 by using the keyboard 34 to select appropriate commands so that when the images are projected onto the object, as shown in FIGS. 3-4, the object 12 appears to be a real character having appropriate coloring, shading and desired textured appearances.
The second segment of data processing includes processing of data from the simplified guest interface 40 to allow guest manipulation of the image. The guest 14 enters data via the simplified guest interface 40 that accesses the artwork data file. As the name implies, the simplified guest interface 40 may be any interface that permits the guest to interact with the artwork data file while preferably allowing only limited access to that file; i.e., the guest 14 preferably does not have the ability to completely erase all information corresponding to the object 12, including the definition of regions. In the preferred embodiment, the guest 14 may employ the twelve buttons 46 to select a current color pattern and then utilize the mouseclick 20 to assign that color pattern to one of the pre-defined regions, identified by orienting the wand 18 such that the bristle-end 50 of the paintbrush 48 points to the portion 16 of the object that corresponds to that region. By way of example, if a user desires to facilitate projection of an image on the character's tail 58 shown in FIGS. 2-4, the user will first trace the boundaries of the tail 58 by watching the projected pen tip of the stylus 36 create a contour on the object 12 itself. Because the graphics software formats for display on the computer monitor 42 an image created by bit-mapping the trace data received from the user interface 32, the VGA signal emulated by the computer 52 to the monitor 42 is readily used as the projection control signal for the image. Once the outline of the tail 58 has been traced, a guest 14 may subsequently utilize the wand 18, mouseclick 20 and set of buttons 46 to select in real time an appropriate color, shading or the appearance of a particular texture to be projected within the traced area.
As an alternative option, a single button may replace the set of twelve buttons 46 which, when depressed, causes the computer 52 to select a new current color pattern. Or, color pattern selection could also be made by directing the wand 18 at a target, such as a paint bucket 51 of FIG. 1, and pressing the mouseclick 20.
In the preferred form of the invention's embodiments, entry of the graphics data via either the user interface 32 or the simplified guest interface 40 generates signals which are transferred to a computer 52. The computer may be chosen to be a personal computer having one or more disk drives 54 for receiving graphics software. The keyboard 34 and the visual display monitor 42 are coupled to the central processing unit of the computer. The computer processes the graphics data entered by the user on the digitizing pad 38 and generates a VGA format RGB output corresponding to the image drawn or traced upon the pad. This output is then emulated to both the monitor 42 and the projector 29 for projection onto the object 12. In the preferred embodiment, the computer comprises a Macintosh personal computer manufactured by Apple Computer, Inc. of Cupertino, California. In the case of computers other than the Macintosh, a VGA board may be necessary for emulation of signals for driving the projector 29.
As shown in FIGS. 2 and 3, projector 29, which is preferably an overhead type projector, is aligned to face the three-dimensional object 12 and includes a 1,000 watt light source 86 for projecting light upon the object. An addressable light filter means 62 is mounted on the projector 29 in between an objective lens 64 and a condensing lens 66. These two lenses are conventional in most overhead-type projectors to enable focusing and alignment of the light for projection purposes. In the present invention, however, the objective lens and condensing lens are specially configured to provide a relatively large depth of field as shown by the reference designation D in FIG. 4. This is accomplished by using a wide angle objective lens 64. The addressable light filter means 62, which is taken to include any projection device that electronically enables projection of an image by filtering light, is coupled to the computer 52 and is adapted to selectively filter the light projected onto the object 12 in response to output control signals generated by the computer.
In the preferred embodiment, the light filter means 62 comprises a plurality of optically superpositioned liquid crystal panels 68. Each of these panels is composited with a color filter and is comprised of a plurality of addressable pixels (not shown) which are individually controlled in response to the computer output to generate color elements for the composite image. As mentioned, the liquid crystal panels 68 in the preferred embodiment are designed to accept VGA format signals from the computer 52. Accordingly, an RGB transcoder 70 is connected between the computer 52 and the liquid crystal panels 68 to convert the high resolution output graphics signals, generated by the Macintosh computer, into a VGA format.
The liquid crystal filter 68 is comprised of three superpositioned filters, yellow, cyan and magenta, and accordingly has 3 liquid crystal panels that are individually composited with these light filters. These secondary colors are chosen to subtract light from the projected light to project the image upon object. In addition, commercially available liquid crystal filters include a circuit, designated by the reference numeral 72 in FIGS. 2-3, that appropriately converts an RGB signal from the computer into control signals for each of the three panels.
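The subtractive relationship between the RGB drive signal and the three secondary-color panels can be made concrete: each secondary filter (cyan, magenta, yellow) absorbs its complementary primary, so the drive for each panel is the complement of the corresponding primary. The 8-bit scale and names in this sketch are our assumptions, not the patent's:

```c
#include <assert.h>

typedef struct {
    unsigned char c, m, y; /* cyan, magenta, yellow panel drive values */
} CMY;

/* Convert an RGB pixel to subtractive panel values on an assumed 8-bit
 * scale: cyan absorbs red, magenta absorbs green, yellow absorbs blue. */
CMY rgb_to_cmy(unsigned char r, unsigned char g, unsigned char b)
{
    CMY out;
    out.c = (unsigned char)(255 - r);
    out.m = (unsigned char)(255 - g);
    out.y = (unsigned char)(255 - b);
    return out;
}
```

A pure red pixel, for example, requires the cyan panel fully open (absorbing nothing) while the magenta and yellow panels subtract the green and blue components from the white light source.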
FIG. 5 shows a functional block diagram of an embodiment of the system that utilizes an IBM-compatible personal computer 74. The wand 18 is in the form of a joystick 78 with the familiar paintbrush 48 mounted at the end of the joystick. The mouseclick 20 is located on either the paintbrush or the joystick. The wand 18 is coupled to the personal computer, which is fitted with I/O boards 82 and 84 for communication with the LCD overhead display 29 and a twelve-button set 46 for selection of colors and patterns, respectively.
FIG. 6 shows a partial connection block diagram of the preferred embodiment, including the wand 18 that includes the paintbrush 48, a pivotal mounting 90, and the polhemus device 92. The polhemus device 92 consists of a sensor 94, a source 96 and a polhemus controller 98 which emulates signals for receipt by the computer 52. Selection of a region identified by the orientation of the paintbrush 48 is accomplished by pushing the mouseclick 20, which is coupled to a mouse device 100. The mouse device 100 serially emulates a digital word to the computer 52 that indicates to the computer 52 and to the custom software that the guest desires to modify the current region. The computer 52 then edits the bit map data file stored in memory and emulates signals to the monitor and liquid crystal filter 68 of the overhead projector 29 to project the image upon the object 12.
In the preferred embodiment, the projector 29 has a one-thousand watt light source 86 for projecting the images onto the object 12. When using this arrangement, it has been found that liquid crystal panels 68 having a ten-inch-by-ten-inch cross-section mounted over the condensing lens 66 are best suited for in-focus projection over approximately a twenty-inch range of depth D with respect to the three-dimensional object 12. Of course, the lens system of the projector 29 may be modified as desired to achieve a different in-focus range of depth D over the object 12. Whatever range of depth D is selected, however, care must be taken to ensure that the surface 28 of the object 12 to be projected upon does not have a contour that varies from front to back by more than the desired range of depth.
By selecting the projection optics and optimizing the range of depth in which a focused image will be projected upon the object 12, a very high quality image can be generated, aligned on the object and enhanced by the computer graphics system 30 so that the user or other viewers will perceive the object 12 as having full, realistic three-dimensional features.
Sophisticated graphics software is used to select and generate various colors, shading effects, the appearance of texture, animated transformation, and other commands for processing, manipulating, editing and storing the image to be projected on the object 12. If a Macintosh brand personal computer is used, then commercial software programs sold under the names PIXEL PAINT™, PHOTO SHOP™ or MACROMIND DIRECTOR™ are suitable software programs (the latter offering the added feature of animation as well as bit-map painting) for processing of graphics data. When an IBM format personal computer is used, software programs sold under the names DELUXE PAINT II™ or ANIMATOR™ are suitable (the latter being an animation program, as the name implies).
Also, as mentioned, additional software may be necessary in order that these programs can receive data from non-standard inputs. The Polhemus device 92 of the preferred embodiment is such a non-standard input. FIG. 7 shows a block diagram of the software used to control projection and the sophisticated graphics software and communication with peripherals, such as the simplified guest interface 40, to boards 82 and 84 and Polhemus device 92. Appendices A-F are a more detailed statement of this software.
In another aspect of the invention, a printer 102 is connected to the computer 52. The printer is adapted to produce hard color copies 104 of the viewed image for the amusement of the guest 14 or other viewers. This feature has special usefulness when the apparatus 10 is used as an amusement device in an amusement park or the like, since the guests will be able to take a sample of their design home with them. The printer 102 also has advantages to artists and others who may use the apparatus 10 in a commercial environment or other contexts. Engineers can use the device for analyzing and comparing different optical systems.
In operation, after all of the equipment described above is connected to the power source and turned on, a user will pick up the stylus 36 and apply it to the drawing surface 37 of the digitizing pad 38. Once the virtual pen tip of the stylus contacts the drawing surface, the user will see this pen tip as a point on the visual monitor 42 and on the three-dimensional object 12. Once an appropriate starting point has been selected, the user may, for example, trace the contours, forms and outline of the projection object 12 by watching the pen tip of the stylus 36 move around on the object. The traced image, which is displayed on the monitor 42 and object 12, defines software regions that carry data signifying desired colors, shading, or the appearance of texture. For example, the user may trace forms corresponding to items of clothing to be displayed upon the object, each such item having a corresponding region.
When the user has finished tracing on the digitizing pad, a two-dimensional bit-mapped digital artwork data file will have been created and may be stored in the computer 52. The artwork file thus contains graphics data which is processed by the computer 52 in conjunction with the graphics software to generate an output representing the images traced by the user.
The system is then ready for interactive use with a guest via a simplified guest interface. In the preferred embodiment, the simplified guest interface has been chosen to include a Polhemus device 92, but any interface capable of designating or changing position will suffice. For example, a mouse or a joystick may be used in equivalent fashion. The computer 52 then processes this graphics information to generate the desired image for projection onto, for example, the character's tail 58, as shown in FIG. 4. Because the user and the guest 14 may observe the results of moving the wand 18 by looking up at the projection object 12 and observing a moving cursor, contour or virtual pen tip, the projected image registers exactly with the object's appearance or shape. In this way, the object will have high-quality three-dimensional color images and characteristics of three-dimensional form.
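Any position-designating interface drives the projected cursor the same way: the device's readings are linearly mapped onto window coordinates. This is essentially the arithmetic the appendix code performs against the minimum and maximum Polhemus angles seen so far; the sketch below restates it in plain C with hypothetical names:

```c
/* Linearly map a Polhemus angle reading onto a pixel coordinate,
   given the minimum and maximum angles observed so far. Mirrors the
   cursor-positioning arithmetic in the CheckForInput() routine of
   the appendices. */
int map_angle_to_pixel(long angle, long min_angle, long max_angle,
                       int lo_pixel, int hi_pixel)
{
    if (max_angle == min_angle)
        return lo_pixel; /* avoid divide-by-zero before any motion */
    return (int)((angle - min_angle) * (hi_pixel - lo_pixel)
                 / (max_angle - min_angle)) + lo_pixel;
}
```

A mouse or joystick substitutes trivially: only the source of the raw readings changes, not the mapping.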
As shown in FIG. 7, user created software translates inputs from the simplified guest interface 40 for acceptance by the sophisticated graphics software, and performs incidental tasks, such as periodic blinking of the character's eyes (not shown). As an example, the software of the preferred embodiment for interaction with the sophisticated graphics software is attached hereto as Appendices A-F.
By using sophisticated graphics software of the type identified above, the user has the option of selecting commands using the keyboard 36 which are displayed in a border on the computer monitor 42. These commands allow for presentation to the guest of a selection of various colors, shading effects, the appearance of textures, and various other commands for manipulating, storing and editing the displayed image. In this regard also, modification of the graphics software may be necessary if the simplified guest interface is to be presented with a limited subset of available colors, or with only a decision to select a new color pattern rather than a selection from a range of available color patterns. Alternatively, the switch I/O board 84 may be configured with physical switches (not shown) to allow changes in the emulated protocol such that the graphics software may recognize signals emulated by the I/O board 84 as representing alternate sets of colors and patterns. In other aspects of the invention, the output representing the image may be stored in a memory of the computer 52 such that it may be recalled and subsequently projected onto the object 12 as desired. By storing a plurality of these outputs, a sequence of different but related images may be collated. When these stored images are sequentially projected upon the object 12, the object may appear to be in motion or to display some other characteristic of movement, even though the object 12 itself is stationary. For example, FIG. 7 contains a software block 108 that automatically projects eyes on the object that appear to be blinking. An alternative, but as yet untested, application of the present invention would be to impart physical mechanical movement to the object. Video information could then be synched to the physical movement and a file of projected animation designed.
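The stored-image sequencing described above amounts to cycling through a small table of frames. A minimal sketch in plain C (names hypothetical; the appendix software does the equivalent for the blinking eyes of block 108):

```c
/* A stored sequence of related images can be projected in rotation to
   suggest motion on a stationary object -- e.g. alternating "eyes
   open" and "eyes closed" frames to make the character blink. */
#define NUM_FRAMES 3

const unsigned char *next_frame(const unsigned char *frames[NUM_FRAMES],
                                int *current)
{
    const unsigned char *f = frames[*current];
    *current = (*current + 1) % NUM_FRAMES; /* wrap to loop the cycle */
    return f;
}
```

Called once per projection interval, the function loops indefinitely over the stored outputs, so apparent motion costs nothing beyond the memory holding the frames.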
By projecting the two-dimensional image in exact synchronization with the repeated movement of the projection object, a three-dimensional object having a wide range of physical movement would result.
The present invention has far-ranging applications, including the fields of video shopping and cosmetic surgery, as well as the amusement field. As an example of the invention's application to video shopping, an image may be projected upon a shopper (not shown) to create an impression that the shopper is wearing desired apparel. In accordance with the invention, the shopper's outline may be traced and a specific field within that outline traced, perhaps corresponding to a shirt on the shopper's torso. The graphics information generated by this tracing may thereafter be processed and the light filter may be controlled such that green polka dots, for example, appear on the shopper's torso to make it appear that the shopper is wearing a green polka-dotted shirt. Similarly, images may be projected upon a model of a car or house (also not shown) so as to impart apparent colors, styles or customizations to the car or house. As an example of the invention's cosmetic applications, features desired by cosmetic surgery may be projected onto the subject. The quality of projection is so good that the subject may be made to appear to have a differently shaped nose, a beard, eyebrows, or nearly any other physical characteristic.
One specific contemplated use of this invention is as a guest operated amusement device, such as a three- dimensional coloring book for artistic applications. The device could be used as a display or as an attraction at an amusement park, whereby users can designate color features for display on an object and see the results of their work as they do it, in three-dimensional rather than two-dimensional form. It should be understood, however, that the above-mentioned applications of this invention are listed as examples only, to illustrate the broad applications and the usefulness of this invention, and they are not intended to limit this invention in any way.
While a particular form of the invention has been illustrated and described, it will be apparent that various modifications can be made without departing from the spirit and scope of the invention. Accordingly, it is not intended that the invention be limited, except as by the appended claims. APPENDICES
Appendix A is a software listing of a "C" language program that calls routines to set up menus and windows, initialize the system, move the cursor, reset the Polhemus and define the paint regions. Appendix B defines menus for Macintosh and
Polhemus set-up parameters and also handles menu actions. Appendix C is a routine for implementing Polhemus-driven cursor movements.
Appendix D is a routine for initializing serial communications and parsing the Polhemus output. Appendix E is a routine for managing window/cursor scaling, region definitions, random color/pattern files, cursor tracking and character eye blinking.
Appendix F is a resource file for dialog boxes and display.
/*
* Brush.c
* A simple paint tool using a Polhemus sensor as a brush
*
*/
extern WindowPtr PolhemusWindow;
extern Rect dragRect;
extern int GraphicMode;
Str255 unitsString = "\pinches";
Str255 IncrementString = "\p00.00";
CursHandle PlusCursorHdl;
/*****
 * InitMacintosh()
 *
 * Initialize all the managers & memory
 *
 *****/
InitMacintosh()
{
MaxApplZone();
MoreMasters();
InitGraf(&thePort);
InitFonts();
FlushEvents(everyEvent, 0);
InitWindows();
InitMenus();
TEInit();
InitDialogs (0L) ;
InitCursor();
}
/* end InitMacintosh */
/*****
 * HandleMouseDown (theEvent)
 *
 * Take care of mouseDown events.
 *
 *****/
HandleMouseDown(theEvent)
EventRecord *theEvent;
{
WindowPtr theWindow;
int windowCode = FindWindow (theEvent->where, &theWindow);
switch (windowCode)
{
case inSysWindow:
SystemClick (theEvent, theWindow);
break;
case inMenuBar:
AdjustMenus () ;
HandleMenu (MenuSelect (theEvent->where));
break;
case inDrag:
if (theWindow == PolhemusWindow)
DragWindow (PolhemusWindow, theEvent->where, &dragRect);
break;
case inContent:
if (theWindow == PolhemusWindow)
{
if (theWindow != FrontWindow ())
SelectWindow(PolhemusWindow) ;
else
if (GraphicMode)
{
Point p;
p = theEvent->where;
GlobalToLocal (&p) ;
PaintFillAt (p) ;
}
if ( !GraphicMode)
{
/* arrange redraw of text */
InvalRect (&PolhemusWindow->portRect) ;
}
}
break;
case inGoAway:
if (theWindow == PolhemusWindow && TrackGoAway (PolhemusWindow, theEvent->where))
{
HideWindow (PolhemusWindow);
QuitRoutines (noErr);
}
break;
}
}
/* end HandleMouseDown */

/*****
* HandleEvent()
*
* The main event dispatcher. This routine should be called
* repeatedly (it handles only one event).
*
 *****/
HandleEvent () {
int ok;
EventRecord theEvent;
HiliteMenu(0);
SystemTask () ; /* Handle desk accessories */
ok = GetNextEvent (everyEvent, &theEvent);
if (ok)
switch (theEvent.what)
{
case mouseDown:
HandleMouseDown(&theEvent);
break;
case keyDown:
case autoKey:
if ((theEvent.modifiers & cmdKey) != 0)
{
AdjustMenus () ;
HandleMenu (MenuKey ((char) (theEvent.message & charCodeMask)));
}
break;
case updateEvt:
BeginUpdate(PolhemusWindow);
if (GraphicMode)
{
BackColor(blackColor) ;
}
else
{
BackColor(whiteColor) ;
}
DrawPolhemusWindow(((WindowPeek) PolhemusWindow)->hilited);
if (GraphicMode)
{
DrawRegionOutlines () ;
}
EndUpdate(PolhemusWindow);
break;
case activateEvt:
InvalRect(&PolhemusWindow->portRect) ;
break;
}
SystemTask () ; /* Handle desk accessories */
}
/* end HandleEvent */

/*****
 * QuitRoutines()
 *
 * Clean up any stray processes, then quit.
 *
 *****/
QuitRoutines (ErrorCode)
int ErrorCode;
{
CloseSerial ();
ClosePolhemusWindow ();
if (ErrorCode != noErr) SysError (ErrorCode);
ExitToShell();
}

/*****
 * main()
 *
 * This is where everything happens
 *
 *****/
main ()
{
InitMacintosh();
SetUpMenus();
SetUpWindow();
InitializeSerialPort();
PlusCursorHdl = GetCursor(plusCursor);
ResetPolhemus ();
MakeRegions ();
for (;;)
{
HandleEvent () ;
CheckForlnput () ;
}
}
/* end main */
/*****
 * PolhemusMenus.c
 *
 * Routines for Polhemus demo menus.
 *
 *****/
#include <SerialDvr.h>
extern void ToPolhemus();
extern void SetRate(int);
extern Str255 unitsString;
extern Str255 IncrementString;
extern int baudRate;
extern WindowPtr PolhemusWindow;
/* constants */
#define NIL 0L
#define Pointer1 -1L
#define emptyString "\p"
#define inGlobal 1
#define inLocal 0
/* dialog/alert IDs */
#define AboutDialogID 128
#define IncrementDialogID 129
#define IncrementAlertID 130
#define DIPsettingDialogID 131
/* menu IDs */
#define AppleMenuID 1000
#define FileMenuID 1001
#define EditMenuID 1002
#define RS232MenuID 1003
#define PolhemusMenuID 1004
#define CommandMenuID 1005
/* apple menu items */
#define Aboutltem 1
/* file menu items */
#define Newltem 1
#define Openltem 2
#define Saveltem 3
#define SaveAsItem 4
#define Closeltem 5
#define Quitltem 7
/* edit menu items * /
#define Undoltem 1
#define Cutltem 3
#define Copyltem 4
#define Pasteltem 5
#define Clearltem 6
/* rs232 menu items */
#define Baud19200Item 1
#define Baud9600Item 2
#define Baud4800Item 3
#define Baud2400Item 4
#define Baud1200Item 5
#define Baud300Item 6
#define DIPsetting 8
/* polhemus menu items */
#define Continuous 1
#define Send 2
#define Increment 3
#define Boresight 5
#define Averaging 7
#define ASCII 9
#define Binary 10
#define Inches 11
#define Centimeters 12
#define AngleOutOnly 14
#define Reset 16
/* command menu items */
#define PrintData 1
#define AdjustCursor 2
#define PaintingMode 3
static MenuHandle appleMenu, fileMenu, editMenu, rsMenu, polMenu, comMenu;
Point mousePosition;
int AdjustCursorFlag = false;
int PrintDataFlag = true;
int GraphicMode = false;
void
ResetPolhemus ()
{
long OldTime, NewTime;
int TimeCount = 5;
/* Continuous default is off */
Checkltem(polMenu,Continuous, false);
Enableltem (polMenu, Send);
Disableltem(polMenu, Increment);
/* Boresight default is off */
Checkltem(polMenu,Boresight, false);
/* Averaging default is off */
Checkltem(polMenu,Averaging, false);
/* default ASCII mode */
Checkltem(polMenu,ASCII,true);
Checkltem(polMenu,Binary, false);
Enableltem (polMenu, Inches);
Enableltem (polMenu, Centimeters);
/* default inches mode */
Checkltem(polMenu, Inches,true);
Checkltem (polMenu,Centimeters, false);
stccpy (unitsString, "\pinches\0", 13);
/* default output records */
Checkltem(polMenu,AngleOutOnly, false);
Checkltem(comMenu,AdjustCursor, false);
DisableItem(comMenu,AdjustCursor);
/* output the reset command */
ToPolhemus("\031",1);
print ("Polhemus is being reset");
OldTime = TickCount();
while (TimeCount > 0)
{
NewTime = TickCount () ;
if ((NewTime - OldTime) >= 60)
{
OldTime = NewTime;
print (".");
TimeCount--;
}
}
print("ready.\015");
}
/*
* SetUpMenus()
*
*/
SetUpMenus()
{
InsertMenu(appleMenu = GetMenu(AppleMenuID), 0);
InsertMenu(fileMenu = GetMenu(FileMenuID), 0);
InsertMenu(editMenu = GetMenu(EditMenuID), 0);
InsertMenu(rsMenu = GetMenu(RS232MenuID), 0);
InsertMenu(polMenu = GetMenu(PolhemusMenuID), 0);
InsertMenu(comMenu = GetMenu(CommandMenuID), 0);
DrawMenuBar() ;
AddResMenu(appleMenu, 'DRVR');
}
/* end SetUpMenus */

/*****
 * AdjustMenus()
 *
 * Enable or disable the items in the File & Edit menus if a DA window
 * comes up or goes away. The application doesn't do anything with
 * the File & Edit menus, except Quit.
 *
 *****/
AdjustMenus ()
{
/* determine if DA is running */
register WindowPeek wp = (WindowPeek) FrontWindow();
short kind = wp ? wp->windowKind : 0;
Boolean DA = kind < 0;
enable (editMenu, Undoltem, DA);
enable (editMenu, Cutltem, DA);
enable (editMenu, Copyltem, DA);
enable (editMenu, Pasteltem, DA);
enable (editMenu, Clearltem, DA);
enable (fileMenu, Newltem, DA);
enable (fileMenu, Openltem, DA);
enable (fileMenu, Saveltem, DA);
enable (fileMenu, SaveAsItem, DA);
enable (fileMenu, Closeltem, DA);
}

static
enable (menu, item, ok)
Handle menu;
int item, ok;
{
if (ok)
Enableltem (menu, item);
else
Disableltem (menu, item);
}

/*****
* HandleMenu(mSelect)
*
* Handle the menu selection. mSelect is what MenuSelect() and
* MenuKey() return: the high word is the menu ID, the low word
* is the menu item
*
 *****/
UncheckBauds()
/* clear multiple selection of baud rates */
{
Checkltem(rsMenu,Baud19200Item, false);
Checkltem(rsMenu,Baud9600Item, false);
Checkltem(rsMenu, Baud4800Item, false);
Checkltem(rsMenu,Baud2400Item, false);
Checkltem(rsMenu,Baud1200Item, false);
Checkltem(rsMenu,Baud300Item, false);
}
stccpy (s1, s2, n)
register char *s1, *s2;
register int n;
{
register char *s0 = s1;
if (n <= 0) return (0);
while (n-- && (*s1++ = *s2++));
if (n && *(s1-1)) *s1 = '\0';
return (s1-s0-1);
}

HandleMenu (mSelect)
long mSelect;
{
int menuID = HiWord(mSelect) ;
int menultem = LoWord(mSelect);
Str255 name;
GrafPtr savePort;
WindowPeek frontWindow;
DialogPtr genDlgPtr;
int dummyInt;
int isChecked;
int itemType;
Handle itemHdl;
Str255 text,text2,text3;
char ctext[10];
Rect box;
int i;
int dp;
int invalid;
switch (menuID)
{
case AppleMenuID:
if (menultem == Aboutltem)
{
genDlgPtr = GetNewDialog(AboutDialogID, NIL, Pointer1);
ModalDialog (NIL, &dummyInt);
DisposDialog (genDlgPtr) ;
}
else
{
GetPort(&savePort);
Getltem(appleMenu, menultem, name);
OpenDeskAcc (name);
SetPort (savePort) ;
}
break;
case FileMenuID:
switch (menultem)
{
case Quitltem:
QuitRoutines (noErr);
break;
}
break;
case EditMenuID:
break;
case RS232MenuID:
switch (menultem)
{
case Baud19200Item:
UncheckBauds ();
Checkltem(rsMenu, Baud19200Item, true);
SetRate (baud19200);
break;
case Baud9600ltem:
UncheckBauds();
Checkltem(rsMenu, Baud9600Item, true);
SetRate (baud9600);
break;
case Baud4800ltem:
UncheckBauds ();
Checkltem(rsMenu, Baud4800Item, true);
SetRate (baud4800);
break;
case Baud2400Item:
UncheckBauds ();
Checkltem(rsMenu, Baud2400Item, true);
SetRate (baud2400);
break;
case Baud1200Item:
UncheckBauds ();
Checkltem(rsMenu, Baud1200Item, true);
SetRate (baud1200);
break;
case Baud300Item:
UncheckBauds ();
Checkltem(rsMenu, Baud300Item, true);
SetRate (baud300);
break;
case DIPsetting:
switch (baudRate)
{
case baud19200:
stccpy (text, "\p\23\327\327\23\23\23\23\23", 256);
break;
case baud9600:
stccpy (text, "\p\327\23\327\23\23\23\23\23", 256);
break;
case baud4800:
stccpy (text, "\p\23\23\327\23\23\23\23\23", 256);
break;
case baud2400:
stccpy (text, "\p\327\327\23\23\23\23\23\23", 256);
break;
case baud1200:
stccpy (text, "\p\23\327\23\23\23\23\23\23", 256);
break;
case baud300:
stccpy(text, "\p\327\23\23\23\23\23\23\23", 256);
break;
}
stccpy(text2, "\p\23", 256);
stccpy (text3, "\p\327", 256);
ParamText (text,text2,text3,emptyString);
genDlgPtr = GetNewDialog(DIPsettingDialogID, NIL, Pointer1);
ModalDialog(NIL, &dummyInt);
DisposDialog (genDlgPtr);
break;
}
break;
case PolhemusMenuID:
switch (menultem)
{
case Continuous:
GetltemMark(polMenu,Continuous, &isChecked);
if (isChecked == noMark)
{
Checkltem (polMenu, Continuous, true) ;
Disableltem (polMenu, Send) ;
Enableltem (polMenu, Increment) ;
print ("Continuous updates: ON\015") ;
ToPolhemus ("C", 1) ;
}
else
{
Checkltem(polMenu,Continuous, false);
Enableltem(polMenu,Send);
Disableltem(polMenu,Increment);
print("Continuous updates: OFF\015");
ToPolhemus ("c",1);
}
break;
case Send:
ToPolhemus ("P",1) ;
break;
case Increment:
ResetAlrtStage();
ParamText(unitsString,emptyString,emptyString,emptyString);
genDlgPtr = GetNewDialog(IncrementDialogID, NIL, Pointer1);
GetDItem(genDlgPtr, 2, &itemType, &itemHdl, &box);
SetlText(itemHdl, IncrementString);
/* get value & insure that it is a valid number before exiting */
do
{
ModalDialog(NIL, &dummyInt);
GetDItem(genDlgPtr, 2, &itemType, &itemHdl, &box);
GetlText(itemHdl, text);
dp = -1;
invalid = false;
for (i =1; i < text[0]; i++)
{
if (dp >= 0)
{
if (dp > 2)
invalid = true;
else if (text[i] < '0')
invalid = true;
else if (text[i] > '9')
invalid = true;
dp++;
} else
{
if (text[i] == '.')
dp++;
else if (text[i] < '0')
invalid = true;
else if (text[i] > '9')
invalid = true;
else if (i > 2)
invalid = true;
}
text2 [i-1] = text [i] ;
}
if (invalid != 0) NoteAlert (IncrementAlertID, NIL);
}
while (invalid != 0);
stccpy (IncrementString,text, 256);
ToPolhemus ("I",1); /* send set increment command with value */
ToPolhemus (text2,text[0]);
ToPolhemus ("\015", 1);
DisposDialog(genDlgPtr);
break;
case Boresight:
GetltemMark (polMenu, Boresight, &isChecked);
if (isChecked == noMark)
{
Checkltem(polMenu,Boresight,true);
ToPolhemus ("Bl\015", 3);
}
else
{
Checkltem(polMenu,Boresight, false);
ToPolhemus ("bl\015", 3);
}
break;
case Averaging:
GetltemMark(polMenu,Averaging, &isChecked);
if (isChecked == noMark)
{
Checkltem(polMenu,Averaging,true);
ToPolhemus ("K",1) ;
}
else
{
Checkltem (polMenu,Averaging, false);
ToPolhemus ("m",1);
}
break;
case ASCII:
Checkltem(polMenu,ASCII, true);
Checkltem(polMenu,Binary, false);
Enableltem(polMenu, Inches);
Enableltem(polMenu,Centimeters);
ToPolhemus ("F",1);
break;
case Binary:
Checkltem(polMenu,Binary,true);
Checkltem(polMenu,ASCII, false);
Disableltem(polMenu,Inches);
Disableltem(polMenu,Centimeters);
ToPolhemus ("f",1);
break;
case Inches:
Checkltem(polMenu,Inches,true);
Checkltem(polMenu,Centimeters,false);
stccpy(unitsString, "\pinches\0",13);
ToPolhemus("U",1);
break;
case Centimeters:
Checkltem(polMenu,Centimeters,true);
Checkltem(polMenu,Inches, false);
ToPolhemus ("u",1) ;
stccpy(unitsString, "\pcentimeters\0", 13);
break;
case AngleOutOnly:
GetltemMark(polMenu,AngleOutOnly, &isChecked);
if (isChecked == noMark)
{
Checkltem(polMenu,AngleOutOnly,true);
Enableltem(comMenu,AdjustCursor);
ToPolhemus ("04,1\015", 5) ;
}
else
{
Checkltem(polMenu,AngleOutOnly,false);
Checkltem(comMenu,AdjustCursor,false);
Disableltem(comMenu,AdjustCursor);
ToPolhemus("k",1);
}
break;
case Reset:
ResetPolhemus () ;
break;
}
break;
case CommandMenuID:
switch (menultem)
{
case PrintData:
GetltemMark(comMenu,PrintData, &isChecked);
if (isChecked == noMark)
{
Checkltem(comMenu,PrintData,true);
PrintDataFlag = true;
}
else
{
Checkltem(comMenu,PrintData, false);
PrintDataFlag = false;
}
break;
case AdjustCursor:
GetltemMark (comMenu,AdjustCursor, &isChecked);
if (isChecked == noMark)
{
Checkltem(comMenu,AdjustCursor,true);
AdjustCursorFlag = true;
}
else
{
Checkltem(comMenu,AdjustCursor, false);
AdjustCursorFlag = false;
}
break;
case PaintingMode:
GetltemMark (comMenu,PaintingMode, &isChecked);
if (isChecked == noMark)
{
Checkltem(comMenu,PaintingMode,true);
GraphicMode = true;
InvalRect (&PolhemusWindow->portRect);
}
else
{
Checkltem(comMenu,PaintingMode, false);
GraphicMode = false;
InvalRect (&PolhemusWindow->portRect);
}
break;
}
break;
}
}
/* end HandleMenu */
/*****
* BrushMouse.c
*
* Routines for mouse activities, including cursor control and Polhemus->mouse event.
*
 *****/
#define inGlobal 1
#define inLocal 0

/*
 per Mike Clark, though not used here...
 global variables: Point: mTemp, rawMouse
 put point in
 move byte: cursorCouple to cursorNew
 samples have tablet driver.a ... routine for reference
 should have system equates in MPW
 if raw mouse were not updated, jcursor will attempt to scale
*/
void
PutMouse(curpos, flag)
/* move the mouse coordinates to curpos. */
/* if flag != 0, curpos is in Global coords. */
/* if flag == 0, curpos is in current window coords. */
Point *curpos;
int flag;
{
extern BitMap screenBits;
int xmax, ymax, xmin, ymin;
Point pos;
int *ptr;
char *bptr;
/* set local variables */
pos.v = curpos->v;
pos.h = curpos->h;
xmax = screenBits.bounds.right - 1;
xmin = 0;
ymax = screenBits.bounds.bottom - 3;
ymin = 0;
/* convert to global coords if needed */
if (!flag) LocalToGlobal(&pos);
/* clamp to screen bounds */
if (pos.v > ymax) pos.v = ymax;
if (pos.v < ymin) pos.v = ymin;
if (pos.h > xmax) pos.h = xmax;
if (pos.h < xmin) pos.h = xmin;
/* prepare pointers to mouse data */
ptr = (int *) 0x828;
bptr = (char *) 0x8ce;
/* write data to mouse record and trigger update */
*ptr++ = pos.v;
*ptr++ = pos.h;
*ptr++ = pos.v;
*ptr++ = pos.h;
*ptr++ = pos.v;
*ptr = pos.h;
*bptr = 0xff;
return;
}

ExtractYaw()
/* return an integer 100x the yaw angle */

ExtractPitch ()
/* return an integer 100x the pitch angle */
/*****
 * BrushSerial.c
 *
 *
 *
 *****/
#include <SerialDvr.h>
#define modemin -6
#define modemOut -7
#define NIL 0L
#define LF '\012'
#define inGlobal 1
#define inLocal 0
/* positions in the Polhemus angle-only output stream */
#define pitchStart 10
#define yawStart 3
extern WindowPtr PolhemusWindow;
extern Rect queenBounds;
extern Print();
extern void PutMouse();
extern int AdjustCursorFlag;
extern int PrintDataFlag;
extern int GraphicMode;
OSErr errorCode;
int baudRate;
char* InBuffer;
int minusFlag;
long yaw, pitch;
long maxYaw = 0L, minYaw = 0L;
long maxPitch = 0L, minPitch = 0L;

void
IOFailure(errorCode)
{
SysError (errorCode);
} void
SetRate (rateCode)
int rateCode;
{
errorCode = SerReset(modemin, rateCode + stop10 + noParity + data8);
if (errorCode != noErr) IOFailure(errorCode);
errorCode = SerReset(modemOut, rateCode + stop10 + noParity + data8);
if (errorCode != noErr) IOFailure (errorCode);
baudRate = rateCode;
}

void
InitializeSerialPort ()
{
static SerShk flags = {false, false, '\23', '\21', hwOverrunErr + framingErr, 0, false, false};
/* Open modem port and set handshaking parameters */
errorCode = RAMSDOpen(sPortA); /* open RAM serial drivers (in & out) for modem port */
if (errorCode != noErr) IOFailure (errorCode);
/* default to no hardware or software flow control */
errorCode = SerHShake (modemin, &flags);
if (errorCode != noErr) IOFailure (errorCode) ;
errorCode = SerHShake (modemOut, &flags);
if (errorCode != noErr) IOFailure (errorCode);
SetRate (baud9600);
InBuffer = NewPtr (1000);
if (InBuffer == NIL) IOFailure (memFullErr);
errorCode = SerSetBuf (modemin, InBuffer, 1000);
if (errorCode != noErr) IOFailure (errorCode);
}
CloseSerial ()
{
RAMSDClose (sPortA);
DisposPtr (InBuffer);
}

void
ToPolhemus (msg, len)
char* msg;
char len;
{
long length;
length = len;
errorCode = FSWrite (modemOut, &length, msg);
if (errorCode != noErr) IOFailure (errorCode);
}
void
CheckForlnput ()
{
char tempString[2]; /* a C string used as an input buffer */
static Str255 inputString; /* a P string used to accumulate output record */
SerStaRec serSta;
long count;
long oneCount;
static int inStringLength = 0;
int commandTerm;
int i;
char c;
Rect window;
Point p;
errorCode = SerStatus (modemin, &serSta);
if (errorCode != noErr) IOFailure (errorCode);
if (serSta.cumErrs != noErr)
{
if (serSta.cumErrs & swOverrunErr)
print("ERR: Data In, Software Overrun\015");
else if (serSta.cumErrs & hwOverrunErr)
print ("ERR: Data In, Hardware Overrun\015");
else if (serSta.cumErrs & framingErr)
print("ERR: Data In, Framing Error\015");
}
errorCode = SerGetBuf(modemin, &count);
if (errorCode != noErr) IOFailure(errorCode);
if (count != 0)
{
/* get serial data up to linefeed*/
commandTerm = false;
while ((errorCode != eofErr) && (commandTerm == false))
{
oneCount =1;
errorCode = FSRead(modemin, &oneCount, &tempString[0]);
if ((errorCode != noErr) && (errorCode != eofErr)) IOFailure (errorCode);
if (errorCode == noErr)
{
inputString[inStringLength++] = tempString[0];
if (tempString[0] == LF) commandTerm = true;
}
}
if (commandTerm) /* parse output record */
{
/* string from Polhemus received */
inputString[inStringLength] = '\0';
if (PrintDataFlag) print(inputString);
/* translate into pitch and yaw values */
inStringLength = yawStart;
yaw = 0;
minusFlag = false;
for (i = 0; i < 7; i++)
{
switch (c = inputString[inStringLength++] )
{
case '-':
minusFlag = true;
break;
case '.':
case ' ':
break;
case '0':
case '1':
case '2':
case '3':
case '4':
case '5':
case '6':
case '7':
case '8': case '9':
yaw = 10 * yaw + (int) c - (int) '0';
break;
}
}
if (minusFlag) yaw = -yaw;
pitch = 0;
minusFlag = false;
for (i = 0; i < 7; i++)
{
switch (c = inputString[inStringLength++])
{
case '-':
{
minusFlag = true;
break;
}
case '.':
case ' ':
break;
case '0':
case '1':
case '2':
case '3':
case '4':
case '5':
case '6':
case '7':
case '8':
case '9':
{
pitch = 10 * pitch + (int) c - (int) '0';
break;
}
}
}
if (minusFlag) pitch = -pitch;
/* note limits */
if (yaw < minYaw) minYaw = yaw;
if (yaw > maxYaw) maxYaw = yaw;
if (pitch < minPitch) minPitch = pitch;
if (pitch > maxPitch) maxPitch = pitch;
if (AdjustCursorFlag)
{
window = queenBounds; /*+++was+++ (*PolhemusWindow) .portRect;*/
p.h = window.right - (yaw - minYaw) * (window.right - window. left) /
(maxYaw - minYaw) ;
p.v = (pitch - minPitch) * (window.bottom - window.top) /
(maxPitch - minPitch) + window.top;
PutMouse (&p, inLocal);
if (GraphicMode) DrawEyes();
}
/* reset input string */
inStringLength = 0;
}
}
}
/*****
 * BrushWindow.c
 *
 * The window routines for the Brush demo
 *****/
extern CursHandle PlusCursorHdl;
WindowPtr PolhemusWindow;
Rect dragRect;
Rect windowBounds = { 20, 0, 480, 640 };
Rect queenBounds = { 100, 100, 200, 200 };
int width = 5;
TEHandle console;
int linesInWindow;
extern int GraphicMode;
Pattern squares = {0xF0,0xF0,0xF0,0xF0,0x0F,0x0F,0x0F,0x0F};
Pattern hearts = {0x00, 0x6C, 0x92, 0x82, 0x82, 0x44, 0x28, 0x10};

#define NumberOfRegions 10
RgnHandle RegionNumber [NumberOfRegions];
RgnHandle tempRgn;
void
print (text)
/* add text to window, scroll as needed */
char* text;
{
long length = 0;
while (text[length] != '\0') length++;
if ( (**console) .nLines >= linesInWindow)
{
(**console).selEnd = (**console).lineStarts[(**console).nLines - linesInWindow + 1];
(**console).selStart = 0;
TEDelete (console);
}
(**console) .selEnd = (**console) .teLength;
(**console) .selStart = (**console) .teLength;
TEInsert (text, length, console);
}
ClosePolhemusWindow ()
/* ready window for text stream...*/
{
TEDispose (console);
}
SetUpWindow()
/* Create the Polhemus Window, and open it. */
{
Rect d,v;
dragRect = screenBits.bounds;
PolhemusWindow = NewWindow(0L, &windowBounds, "\pImageworks 3D Paint Demo", true, /* ... */);
SetPort(PolhemusWindow);
TextFont (monaco);
TextSize(9);
d.top = v.top = (*PolhemusWindow) .portRect.top;
d.left = v.left = (*PolhemusWindow) .portRect.left;
d.bottom = v.bottom = (*PolhemusWindow) .portRect.bottom;
d.right = v.right = (*PolhemusWindow) .portRect.right;
d.top += 4;
d.left += 4;
d.bottom -= 4;
d.right -= 4;
console = TENew(&d,&v) ;
linesInWindow = (d.bottom - d.top)/ (**console) .lineHeight;
(**console) .crOnly = -1;
print("EyePhone/Polhemus Demo Active...\015");
}
DrawPolhemusWindow(active)
/* Draws the Polhemus window dressings. */
short active;
{
Rect myRect;
int color = true;
int i;
SetPort(PolhemusWindow);
EraseRect(&(*PolhemusWindow).portRect);
if (!GraphicMode)
{
SetCursor(&arrow);
PenPat (black) ;
BackColor (whiteColor) ;
ForeColor (blackColor) ;
TEUpdate (&(*PolhemusWindow).portRect, console);
}
if (GraphicMode)
{
/* +++ redraw the region outlines (and buttons, if any) */
SetCursor (*PlusCursorHdl) ;
PenPat (black) ;
BackColor(whiteColor);
ForeColor(blackColor);
for (i=0; i< NumberOfRegions; i++)
{
FrameRgn(RegionNumber[i]);
}
}
}
/* definition order is important, as higher numbers are subtracted from lower */
#define skirt 0
#define trim 1
#define apron 4
#define blouse 2
#define arms 3
#define head 5
#define hat 6
#define mouth 7
#define brow 8
#define eyes 9
MakeRegions ()
/* create random regions in window */
{
int i=0,j;
Rect r;
/*skirt 0*/
RegionNumber[skirt] = NewRgn();
OpenRgn();
MoveTo(0x006E, 0x01A7);
LineTo(0x007D, 0x0167);
LineTo(0x0095, 0x0131);
LineTo(0x00A3, 0x010F);
LineTo(0x00B6, 0x0104);
LineTo(0x00CF, 0x0106);
LineTo(0x00F7, 0x0109);
LineTo(0x0119, 0x0105);
LineTo(0x012B, 0x0103);
LineTo(0x0147, 0x0113);
LineTo(0x0165, 0x0146);
LineTo(0x0179, 0x0188);
LineTo(0x017D, 0x01B0);
LineTo(0x006E, 0x01A7);
CloseRgn(RegionNumber[skirt]);
/*apron 1*/
RegionNumber[apron] = NewRgn();
OpenRgn();
MoveTo(0x00AC, 0x01B2);
LineTo(0x00BA, 0x0197);
LineTo(0x00C8, 0x0167);
LineTo(0x00D9, 0x0149);
LineTo(0x00E8, 0x0131);
LineTo(0x00F7, 0x0122);
LineTo(0x010A, 0x0137);
LineTo(0x011D, 0x0154);
LineTo(0x0129, 0x016F);
LineTo(0x0135, 0x018E);
LineTo(0x013E, 0x01B0);
LineTo(0x0142, 0x01BC);
LineTo(0x00AC, 0x01B2);
CloseRgn(RegionNumber[apron]);
/*arms 2*/
RegionNumber[arms] = NewRgn();
OpenRgn();
MoveTo(0x00D5, 0x00F5);
LineTo(0x00CE, 0x00F0);
LineTo(0x00C8, 0x00F0);
LineTo(0x00C0, 0x00F2);
LineTo(0x00B9, 0x00F6);
LineTo(0x00B7, 0x00FF);
LineTo(0x00BB, 0x0105);
LineTo(0x00C0, 0x0109);
LineTo(0x00C9, 0x0109);
LineTo(0x00CF, 0x0104);
LineTo(0x00D5, 0x00F5);
MoveTo(0x011B, 0x00F3);
LineTo(0x0122, 0x00F1);
LineTo(0x012B, 0x00F0);
LineTo(0x0134, 0x00F4);
LineTo(0x0139, 0x00F8);
LineTo(0x013A, 0x00FF);
LineTo(0x0138, 0x0102);
LineTo(0x0133, 0x0107);
LineTo(0x0129, 0x0109);
LineTo(0x0122, 0x0107);
LineTo(0x011D, 0x0105);
LineTo(0x011A, 0x00FA);
LineTo(0x011B, 0x00F3);
CloseRgn(RegionNumber[arms]);
/*blouse 3*/
RegionNumber[blouse] = NewRgn();
OpenRgn();
MoveTo(0x00DC, 0x00BD);
LineTo(0x00E0, 0x00C3);
LineTo(0x00E7, 0x00CD);
LineTo(0x00F6, 0x00D3);
LineTo(0x0100, 0x00D3);
LineTo(0x0109, 0x00D1);
LineTo(0x010E, 0x00CE);
LineTo(0x0114, 0x00C9);
LineTo(0x011A, 0x00C5);
LineTo(0x011D, 0x00C0);
LineTo(0x011E, 0x00BF);
LineTo(0x0123, 0x00BE);
LineTo(0x012F, 0x00C5);
LineTo(0x0135, 0x00D0);
LineTo(0x013A, 0x00DF);
LineTo(0x013B, 0x00E8);
LineTo(0x0140, 0x00EF);
LineTo(0x013D, 0x00F8);
LineTo(0x0137, 0x0106);
LineTo(0x0125, 0x0108);
LineTo(0x0118, 0x010E);
LineTo(0x0106, 0x0111);
LineTo(0x00F4, 0x0111);
LineTo(0x00DD, 0x0110);
LineTo(0x00D1, 0x010C);
LineTo(0x00CB, 0x0109);
LineTo(0x00C0, 0x0107);
LineTo(0x00B1, 0x00FF);
LineTo(0x00B4, 0x00E0);
LineTo(0x00BC, 0x00CB);
LineTo(0x00C4, 0x00C1);
LineTo(0x00CB, 0x00C1);
LineTo(0x00DC, 0x00BD);
CloseRgn(RegionNumber[blouse]);
/*trim on skirt 4*/
RegionNumber[trim] = NewRgn();
OpenRgn();
MoveTo(0x0074, 0x019F);
LineTo(0x0086, 0x01A4);
LineTo(0x0098, 0x01A4);
LineTo(0x00A1, 0x019E);
LineTo(0x00A6, 0x0199);
LineTo(0x00B1, 0x017B);
LineTo(0x00BF, 0x015B);
LineTo(0x00D1, 0x013A);
LineTo(0x00E5, 0x0124);
LineTo(0x00F1, 0x0118);
LineTo(0x00F9, 0x0122);
LineTo(0x0102, 0x0119);
LineTo(0x0111, 0x012C);
LineTo(0x011F, 0x013D);
LineTo(0x012D, 0x0154);
LineTo(0x013A, 0x0171);
LineTo(0x0145, 0x0187);
LineTo(0x014B, 0x0199);
LineTo(0x0150, 0x01A0);
LineTo(0x0156, 0x01A3);
LineTo(0x015F, 0x01A3);
LineTo(0x0169, 0x01A3);
LineTo(0x016F, 0x01A1);
LineTo(0x0180, 0x01BF);
LineTo(0x0116, 0x01B5);
LineTo(0x00B4, 0x01B5);
LineTo(0x0089, 0x01B8);
LineTo(0x0067, 0x01B8);
LineTo(0x0067, 0x01B8);
LineTo(0x0074, 0x019F);
CloseRgn(RegionNumber[trim]);
/*head 5*/
RegionNumber[head] = NewRgn();
OpenRgn();
MoveTo(0x00BD, 0x00BB);
LineTo(0x00FE, 0x00DB);
LineTo(0x0126, 0x00BD);
LineTo(0x0128, 0x00AD);
LineTo(0x010B, 0x0075);
LineTo(0x0101, 0x006A);
LineTo(0x00EA, 0x006C);
LineTo(0x00BD, 0x00BB);
CloseRgn(RegionNumber[head]);
/*hat 6*/
RegionNumber[hat] = NewRgn();
OpenRgn();
MoveTo(0x00DF, 0x007D);
LineTo(0x00F1, 0x0071);
LineTo(0x00F7, 0x006F);
LineTo(0x0107, 0x0070);
LineTo(0x00FA, 0x0055);
LineTo(0x00E6, 0x0064);
LineTo(0x00DF, 0x007D);
CloseRgn(RegionNumber[hat]);
/*mouth 7*/
RegionNumber[mouth] = NewRgn();
OpenRgn();
MoveTo(0x00E8, 0x00A9);
LineTo(0x00F2, 0x00AA);
LineTo(0x00F8, 0x00A8);
LineTo(0x00FF, 0x00A8);
LineTo(0x0104, 0x00A8);
LineTo(0x0108, 0x00AA);
LineTo(0x010C, 0x00A9);
LineTo(0x0112, 0x00A5);
LineTo(0x0108, 0x00B2);
LineTo(0x0102, 0x00B5);
LineTo(0x00FE, 0x00B6);
LineTo(0x00F6, 0x00B3);
LineTo(0x00F0, 0x00AE);
LineTo(0x00E8, 0x00A9);
CloseRgn(RegionNumber[mouth]);
/*brow 8*/
RegionNumber[brow] = NewRgn();
OpenRgn();
MoveTo(0x00F9, 0x0083);
LineTo(0x00F7, 0x007F);
LineTo(0x00F3, 0x007F);
LineTo(0x00EE, 0x0081);
LineTo(0x00EC, 0x0087);
LineTo(0x00E9, 0x008D);
LineTo(0x00EC, 0x008F);
LineTo(0x00EE, 0x008A);
LineTo(0x00F1, 0x0085);
LineTo(0x00F4, 0x0084);
LineTo(0x00F8, 0x0084);
LineTo(0x00FA, 0x0087);
LineTo(0x00FC, 0x0087);
LineTo(0x00FC, 0x0087);
LineTo(0x00F9, 0x0083);
MoveTo(0x00FA, 0x0084);
LineTo(0x00FE, 0x007E);
LineTo(0x0109, 0x0083);
LineTo(0x010B, 0x0087);
LineTo(0x010C, 0x008C);
LineTo(0x0109, 0x008C);
LineTo(0x0106, 0x0088);
LineTo(0x0104, 0x0085);
LineTo(0x0101, 0x0083);
LineTo(0x00FF, 0x0086);
LineTo(0x00FC, 0x0086);
LineTo(0x00FA, 0x0084);
CloseRgn(RegionNumber[brow]);
/*eyes 9*/
RegionNumber[eyes] = NewRgn();
OpenRgn();
#ifdef FALSE
/* original pupil definitions */
MoveTo(0x00F5, 0x0096); /* $8A-$96 in height */
LineTo(0x00F2, 0x0095); /* $F1-$FA in width */
LineTo(0x00F1, 0x0091);
LineTo(0x00F1, 0x008E);
LineTo(0x00F4, 0x008B);
LineTo(0x00F7, 0x008A);
LineTo(0x00FA, 0x008E);
LineTo(0x00FA, 0x0091);
LineTo(0x00FA, 0x0094);
LineTo(0x00F5, 0x0096);
MoveTo(0x0104, 0x008E); /* $89-$95 in height */
LineTo(0x0104, 0x0091); /* $FB-$104 in width */
LineTo(0x0104, 0x0093);
LineTo(0x0104, 0x0095);
LineTo(0x00FF, 0x0095);
LineTo(0x00FD, 0x0095);
LineTo(0x00FB, 0x0091);
LineTo(0x00FB, 0x008F);
LineTo(0x00FB, 0x008D);
LineTo(0x00FC, 0x008B);
LineTo(0x00FD, 0x008A);
LineTo(0x00FF, 0x0089);
LineTo(0x0101, 0x008A);
LineTo(0x0102, 0x008B);
LineTo(0x0104, 0x008C);
LineTo(0x0104, 0x008E);
#endif
#define leftEye (0x0F1 + 0x0FA)/2
#define rightEye (0x0FB + 0x104)/2
#define eyeHeight (0x08A + 0x096)/2
#define eyeRadius 3
SetRect(&r, rightEye - eyeRadius, eyeHeight - eyeRadius,
rightEye + 2 * eyeRadius - 1, eyeHeight + 2 * eyeRadius - 1);
FrameOval(&r);
SetRect(&r, leftEye - eyeRadius, eyeHeight - eyeRadius,
leftEye + 2 * eyeRadius - 1, eyeHeight + 2 * eyeRadius - 1);
FrameOval(&r);
CloseRgn(RegionNumber[eyes]);
/* do region subtractions to purify zones of influence */
for (i=0; i < NumberOfRegions-1; i++) {
for (j=i+1; j < NumberOfRegions; j++)
{
DiffRgn(RegionNumber[i], RegionNumber[j], RegionNumber[i]);
}
}
/* calculate queen boundary rectangle for Polhemus scaling */
queenBounds = (**(RegionNumber[0])).rgnBBox;
for (i=1; i < NumberOfRegions; i++)
{
UnionRect(&queenBounds, &((**(RegionNumber[i])).rgnBBox), &queenBounds);
}
InsetRgn(RegionNumber[eyes], -1, -2);
}
DrawRegionOutlines()
{
int i;
for (i = 0; i < NumberOfRegions; i++)
{
/* clear to white */
PenPat(black);
ForeColor(whiteColor);
PaintRgn(RegionNumber[i] );
/* outline in black */
if (i != head)
{
PenPat(black);
BackColor(whiteColor);
ForeColor(blackColor);
FrameRgn(RegionNumber[i]);
}
}
}
DrawEyes()
{
static int blink = 0;
Point p;
int height;
int slope;
Point left,right;
int temp;
left.h = leftEye;
left.v = eyeHeight;
right.h = rightEye;
right.v = eyeHeight;
GetMouse(&p);
temp = (p.h == left.h) ? 100 : (5 * (p.v-left.v)) / (p.h-left.h);
if (temp < 0) temp = -temp;
if (temp == 0)
{
/* slope is less than 0.2*/
if (p.h < left.h)
left.h -= eyeRadius;
else
left.h += eyeRadius;
}
else if (temp <= 5)
{
/* slope is between 0.2 and 1.0 */
if (p.h < left.h)
left.h -= eyeRadius;
else
left.h += eyeRadius;
if (p.v < left.v)
left.v -= eyeRadius / 2;
else
left.v += eyeRadius / 2;
}
else if (temp <= 25)
{
/* slope is between 1.0 and 5.0 */
if (p.h < left.h)
left.h -= eyeRadius / 2;
else
left.h += eyeRadius / 2;
if (p.v < left.v)
left.v -= eyeRadius;
else
left.v += eyeRadius;
}
else
{
/* slope is greater than 5.0 */
if (p.v < left.v)
left.v -= eyeRadius;
else
left.v += eyeRadius;
}
temp = (p.h == right.h) ? 100 : (5 * (p.v-right.v)) / (p.h-right.h);
if (temp < 0) temp = -temp;
if (temp == 0)
{
/* slope is less than 0.2*/
if (p.h < right.h)
right.h -= eyeRadius;
else
right.h += eyeRadius;
}
else if (temp <= 5)
{
/* slope is between 0.2 and 1.0 */
if (p.h < right.h)
right.h -= eyeRadius;
else
right.h += eyeRadius;
if (p.v < right.v)
right.v -= eyeRadius / 2;
else
right.v += eyeRadius / 2;
}
else if (temp <= 25)
{
/* slope is between 1.0 and 5.0 */
if (p.h < right.h)
right.h -= eyeRadius / 2;
else
right.h += eyeRadius / 2;
if (p.v < right.v)
right.v -= eyeRadius;
else
right.v += eyeRadius;
}
else
{
/* slope is greater than 5.0 */
if (p.v < right.v)
right.v -= eyeRadius;
else
right.v += eyeRadius;
}
PenPat(black);
ForeColor(whiteColor);
PaintRgn(RegionNumber[eyes]);
blink++;
blink %= 200;
if (blink > 10)
{
ForeColor(blueColor);
PenSize(3,3);
MoveTo(left.h,left.v);
LineTo (left.h, left.v);
/* note that only a single height is used */
MoveTo (right.h,left.v);
LineTo(right.h, left.v);
PenSize(1,1);
}
}
PaintFillAt(p)
/* searches for region containing point p, then fills with current color & pattern */
Point p;
{
int i=0;
EventRecord dummyEvent;
while (i < NumberOfRegions)
{
if ((i != eyes) && (i != head) && (PtInRgn(p, RegionNumber[i])))
{
SysBeep(10); /* sploot! sound */
if ( (i != brow) && (i != mouth) && (i != eyes))
{
RandomPattern();
}
else
{
PenPat (gray);
}
RandomColor ();
PaintRgn (RegionNumber[i]);
PenPat (black);
BackColor (whiteColor);
ForeColor (blackColor);
if ( (i != brow) && (i != eyes) && (i != mouth))
{
FrameRgn (RegionNumber[i]);
}
i = NumberOfRegions;
while ( ! SoundDone ()); /* wait for beep to finish */
while (GetNextEvent(mouseDown+mouseUp, &dummyEvent)); /* purge unneeded mice */
}
i++;
}
}
RandomPattern()
/* selects a random rated pattern */
{
static int i = 0;
i = (i + 1) % 4;
switch (i)
{
case 0:
PenPat (black) ;
break;
case 1:
PenPat (squares) ;
break;
case 2:
PenPat (hearts) ;
break;
case 3:
PenPat (gray) ;
break;
}
}
RandomColor ()
/* selects random foreground and background colors */
{
static int i = 0;
static int j = 0;
i = (i + 1) % 8;
j = (j + 1) % 7;
switch (i)
{
case 0:
ForeColor(blackColor); break;
case 1:
ForeColor(whiteColor); break;
case 2:
ForeColor (redColor); break;
case 3:
ForeColor (greenColor) ; break;
case 4:
ForeColor(blueColor) ; break;
case 5:
ForeColor (cyanColor); break;
case 6:
ForeColor (magentaColor) ; break;
case 7:
ForeColor(yellowColor); break;
}
switch ((i + j + 1) % 8)
{
case 0:
BackColor (blackColor) ; break;
case 1:
BackColor (whiteColor) ; break;
case 2:
BackColor (redColor); break;
case 3:
BackColor(greenColor); break;
case 4:
BackColor (blueColor) ; break;
case 5:
BackColor (cyanColor) ; break;
case 6:
BackColor(magentaColor); break;
case 7:
BackColor(yellowColor); break;
}
}
*
* Imageworks 3D Paint Demo: Resource File
* ©1991 Walt Disney Imagineering
* Research & Development
* Bill Redmann
*
*
* Output File Spec
*
Brush.π.rsrc
????????
*
* Menus
*
type MENU
,1000
\14
About Brush.
(-
,1001
File
(New
(Open
(Save
(Save As
(Close
(-
Quit/Q
,1002
Edit
(Undo/Z
(-
(Cut/X
(Copy/C
(Paste/V
(Clear
,1003
RS232
19200
! 9600
4800
2400
1200
300
(-
DIP Setting.
,1004
Polhemus Continuous Updates/C
Send Update/S
(Increment...
(-
Boresight/B
(-
Averaging/A
(-
! ASCII Output
Binary Output
! Inch Units
Centimeter Units
(-
Angle Output Only
(-
Reset/R
,1005
Commands
! Print Polhemus Data
(Adjust Cursor
Graphics Mode/G
*
* About Dialog
*
type DLOG
,128
About Brush Demo
107 128 235 384
visible NoGoAway
2 ; ;plainDBox 9
123
type DITL
,129
1
PicItem Enabled
0 0 128 256
128
type PICT = GNRL
,128
.H
08D9 0000 0000 0080 0100 1101
A000 82A1 0064 000A 5350 4E54
03E8 0001 0000 0100 0A00 0000
0002 D002 4009 AA55 AA55 AA55
AA55 2200 01FF 9DFF FFA1 00C0
0492 2520 50322D76 3135202D
2043 6F70 7972 6967 68742031 4265 6163 6820 536F 6674 7761 72652C20 496E 632E 0D757365
72646963742F 6D64 206B 6E6F 776E 7B63 75727265 6E74 6469 6374206D 6420 65717D7B 6661 6C73 657D 6966656C 73657B62 757D 69662063 75727265 6E74 646963742F50 325F 6420 6B6E 6F77 6E20 6E6F 747B 2F50325F 627B 50325F640D62 6567 696E 7D62 696E 6420 6465 662F 5032 5F64 20323720 6469 63742064 6566207573657264 6963742F 6D64 206B 6E6F 776E 7B637572
7265 6E74 6469 6374 206D 6420 6571 7D7B 6661 6C73 657D 6966 656C 73652050 325F 6220 6475 7020 6475 700D 2F6D 6B20 6578 6368 2064 6566 7B6D 642F 7061 7420 6B6E 6F77 6E20 6D64 2F73 6720 6B6E 6F77 6E20 6D64 2F67 7220 6B6E 6F77 6E20 616E 6420 616E 647D 7B66 616C 7365 7D69 6665 6C73 652F 706B 2065 7663 6820 6465667B 6D640D2F 7365 7454784D 6F64 6520 6B6E 6F77 6E7D 7B66 616C 7365 7D69 6665 6C73 652F 736B 2065 7663 6620 6465 662F 627B 6269 6E64 2064 65667D62 696E 6420 6465 662F 73617B6D 6174 7269 7820 6375 7272 656E 746D 6174 7269 7820 50325F74 700D 636F 6E63 6174 20616C6F 6164 2070 6F707D62 2F73 627B 6D61 7472 69782063 75727265 6E74 6D6174726978 20657863 6820 636F 6E63 6174 2050 325F 7470206D 61747269 7820696E 766572746D617472 69782063 6F6E 63617420 616C 6F61 640D 706F 707D 622F 7365 7B6D 6174 72697620 6173746F 72652073 6574 6D6174726978 7D622F62 627B 6773 6176 6520 50325F74 7020 636F 6E63 6174 206E 657770617468 206D 6F76 6574 6F7D 622F 6263 7B637572 7665746F 7D622F62 6C0D 7B6C 696E 6574 6F7D 622F 6278 7B63 6C6F 736570617468 7D622F62 707B 6773 6176 6520 656F 6669 6C6C 2067 72657374 6F72 657D 622F 62667B73 6361 6C65 2031 2073 6574 6C69 6E65 7769 6474 6620 7374726F 6B65 7D622F62 650D 7B67 7265 7374 6F72 657D 622F 707B 2F67 6620 6661 6C73 6520 6465 667D 622F 677B 2F67 6620 7472 7565 2064 6566 7D62 2067 2070 6B7B 2F5F 7061 742F 7061 7420 6C6F 6164 2064 6566 2F5F 6772 2F67 7220 6C6F 6164
2064 6566 7D7B 2F5F 6772 0D7B 3634 2E30 2064 6976 2073 6574 6772 6179 7D62 7D69 6665 6C73 6520 736B 7B2F 5F73 544D 2F73 6574 5478 4D6F 6465 206C 6F61 6420 6465 667D 6966 2F67787B 2F74 6720 6578 6368 2064 6566 7D62 2030 2067 782F 78367B61 7620 3638 2067 740D 7B66 616C 7365 7D69 667D 6220 656E 6420 5032 5F62 2070 6B20 656E 647B 2F70 6174 7B50 325F 6220 6766 7B65 6E64 2070 6F70 2073 6720 6176 2036 3820 6774 7B70 6F70 7D69 667D 7B2F 5F70 6174 206C 6F61 6420 656E 6420 6578 6563 7D0D 6966 656C 7365 7D62 696E 6420 6465 667D 7B2F 7061 747B 5032 5F62 2070 6F70 205F 6772
2065 6E64 7D62 696E 6420 6465 667D 6966 656C 7365 2050 325F 6220 736B 2065 6E64 7B2F 7365 7454 784D 6F64 657B 5032 5F62 2F5F 7354 4D20 6C6F 6164 0D65 6E64 2065 7865 6320 50325F62 2074 672F 5F67 7220 6C6F 6164 2065 6E64 2065 7865 637D 6269 6E64 2064 6566 7D7B 2F73 6574 5478 4D6F 6465 7B70 6F70 2050 325F 6220 7467 2F5F 6772206C 6F61 6420 656E 6420 6578 6563 7D62 696E 640D 6465 667D 6966 656C 7365 7D69 660D 0700 0000 0022 0001 0001 FFFF A100 C000 6430 2030 2031 2069 6E64 6578 206E 6567 2031 2069 6E64 6578 206E 6567 206D 6174 7269 7820 7472 616E 736C 6174 6520 3320 3120 726F 6C6C 0D63 7572 7265 6E74 706F 696E 7420 3220 636F 7079 206D 6174 7269 7820 7472 616E 736C 6174 6520 3620 3120 726F 6C6C 0D22 007F 00FF 0101 A100 C000 8432 3536 2031 3238 2063 7572 7265 6E74 706F 696E 7420 3120 696E 6465 7820 3620 696E 6465 7820 7375 6220 3420 696E 6465 7820 3920 696E 6465 7820 7375 6220 6469 760D 3120 696E 6465 7820 3620 696E 6465 78207375 6220 3420 696E 6465
7820 3920 696E 64657820 7375
6220 6469 760D 6D61 74726978
20736361 6C65 20313120 3120
726F 6C6C 0DA1 00C0 006F 5B20
3920 3120726F 6C6C 2063 6C65
6172746F 6D61 726B 0D33 2032
20726F6C 6C20 6D6174726978
2063 6F6E 6361 746D 61747269
780D 6578 6368206D 61747269
7820 636F 6E63 6174 6D617472
6978 0D2F 5032 5F747020 6578
63682064 65662050325F 6220
6D6B 2065 6E64 7B62 6E7D 6966
0D07 000100012200 01FF 9DFF
FFA10064 00OA 5350 4E540BB8
000B 0000 A000 8CA10064000A
5350 4E54 0BB8 0004 00030900
0000 0000 0000 00310000 0000
0080 0100 09FF FFFF FFFF FFFF
FF38 A100 6400 0A53504E 540B
B800 0100 00A1 0064 001A 5350
4E54 0C260007 0008 0021 OOFF
0005 00020000 0000 0000 0000
A100 6400 0A53 504E 540C 9400
0100 01A1 00C0 001D 50325F62
2030 2067 7820 78362065 6E64
20312073 6574 5478 4D6F 6465 0DA1 0096000C 0500 0000 0200 00000000 0000 0100 0A00 0000 0000 1C02 402C 0009 0003 0647 656E 6576 6103 0003 0401 0D00 0C10 0080 0100 0080 01002B0E 181F 4272 7573 683A 20612050 6F6C 6865 6D757320 7061696E 7469 6E672064 656D 6FA00097 A100 6400 0A53 504E 540B B800 010000A10064 001A 5350 4E54 0C260019 005A 0030 009F 0005 00020000 0000 0000 0000 A100 6400 0A53 504E 540C 94000100 01A100C0 001D 50325F622030 20677820 78362065 6E642031 2073 6574 5478 4D6F 64650DA1
0096 000C 0500 0000 0200 0000 0000 0000 0100 0A00 0000 0000 2B024004 000D 000A 2B52100B 76657273 696F 6E20 312E 30A0
0097 A100 6400 0A53 504E 540B B800 0100 00A10064 001A 5350 4E54 0C26003A 0025006B 00D8 0005 00020000 0000 00000000 AlOO 6400 0A53 504E 540C 9400 0100 01A100C1 001D 50325F62
203020677820 78362065 6E64
20312073 6574 5478 4D6F 6465 0DA1 0096 000C 0600 0000 0200
0000 0000 0000 0100 0A00 0000
0000 6602 4028 0049 002B 1FA9
3139 3931 2057 616C 7420 4469
736E 6579 2049 6D61 6769 6E65
6572 696E 670D 2B18 0D17 5265
7365 6172 6368 2026 2044 6576
656C 6F70 6D65 6E74 0D2B 020D
1442 696C 6C20 5265 646D 616E
6E20 312F 3234 2F39 31A0 0097
A100 6400 0A53 504E 540B B600
0C00 00A0 008D A100 6400 0653
504E 5403 E9A0 0083 FF
* End of picture data *
* Increment Dialog
*
type DLOG
,129 (4)
Set Update Increment
107 128 235 384
Visible NoGoAway
1 ; ; dBoxProc
0
129
type DITL
,129
3
Button Enabled
99 186 118 236
Ok
EditText Disabled
79 73 99 143
0
StatText Disabled
10 20 64 236
Enter minimum change required for automatic update (in ^0)
*
* Increment Alert
*
type ALRT
,130 (4)
139 108 203 404
130
5511
type DITL
,130
2
Button Enabled
20 206 40 256
Ok
StatText Disabled
57259186
Acceptable numeric format is 12.34
*
* DIPsettings Dialog
*
type DLOG
,131 (4)
DIP Switch Settings
107 128 205 404
Visible NoGoAway
1 ; ;dBoxProc
0
131
type DITL
,131
2
Button Enabled
69 206 88 256
Ok
StatText Disabled
10 20 64 266
Set DIP switches on EyePhone control box like this: ^0, where ^1 is off and ^2 is on *
* Icon
*
type ICN# = GNRL
,128
.H
*icon data
3000 0000 0000 0000
0000 0000 0000 0000
0000 0000 0000 0000
3700 0000 0700 0000
0740 0000 0240 C000
0F30 0000 121C 0000
2218 0000 0214 0000
0F02 0000 10A1 0FFE
2090 8FFF 0008 4FFF
0004 281F 0002 0BDF
0001 4B5F 0000 CA5F
0001 CBDF 0000 080F
0000 0801 0000 0801
0000 0801 0000 0841
0000 0F81 0000 0801
0000 0801 0000 0FFF
*icon mask
0000 0000 0000 0000
0000 0000 0000 0000
0000 0000 0F80 0000
0F80 0000 0FE0 0000
0FE0 0000 1FE0 0000
3FFE 0000 7FFE 0000
773E 0000 7FBF 0000
3FFF 9FFF 7FFF DFFF
73FF FFFF 71FF FFFF
001F FFFF 000F FFFF
0007 FFFF 0003 FFFF
0003 FFFF 0003 FFFF
0000 1FFF 0000 1FFF
0000 1FFF 0000 1FFF
0000 1FFF 0000 1FFF
0000 1FFF 0000 1FFF
*
* Bundle Data
*
type FREF
,128
APPL 0
type IW3p = STR
,0
Imageworks 3D Paint Demo, ©1991 Bill Redmann, Walt Disney Imagineering R&D
type BNDL
,128
IW3p 0
ICN#
0 128
FREF
0 128

Claims

I claim:
1. An image display apparatus, comprising: a three-dimensional object;
projection means aligned so as to project light upon the three-dimensional object;
interface means for receiving data from a user, said data representative of a portion of an image to be projected upon the three-dimensional object; and,
data processing means for processing, in response to data received by the interface means, data corresponding to the image to be projected upon the three-dimensional object and for controlling the projection means so as to project the image.
2. The apparatus of claim 1 wherein the projection means comprises a projector and includes addressable light filter means for selectively filtering the light of the projector so that an image is projected upon the three-dimensional object.
3. The apparatus of claim 2 wherein the projector has a condensing lens and an objective lens and wherein the addressable light filter means includes a plurality of optically superposed separately addressable liquid crystal panels which themselves are individually optically superposed with a color filter, said addressable light filter means positioned between the objective and condensing lens of the projector.
4. The apparatus of claim 3 wherein the separately addressable liquid crystal filters are three in number and are individually optically superposed with yellow, magenta and cyan filters.
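The stacked yellow, magenta and cyan panels of claims 3 and 4 work subtractively: each panel removes one primary from the projector's white light. A minimal sketch, not taken from the patent and with invented names, of how per-panel drive values could be derived from a desired RGB color:

```c
#include <assert.h>

/* Hypothetical sketch: with yellow, magenta and cyan liquid crystal panels
   in series, each panel subtracts one primary from white light. To pass a
   desired RGB color, each panel's density is set to the complement of the
   primary it absorbs: cyan gates red, magenta gates green, yellow gates blue. */
typedef struct { unsigned char c, m, y; } PanelDrive;

PanelDrive DriveForRGB(unsigned char r, unsigned char g, unsigned char b)
{
    PanelDrive d;
    d.c = 255 - r;   /* cyan density: how much red to remove    */
    d.m = 255 - g;   /* magenta density: how much green to remove */
    d.y = 255 - b;   /* yellow density: how much blue to remove   */
    return d;
}
```

Pure red (255, 0, 0), for example, leaves the cyan panel fully clear while the magenta and yellow panels absorb at full density.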
5. The apparatus of claim 2,
wherein the interface means comprises user interface means for receiving data to create regions corresponding to said image, and
simplified guest interface means for selecting among regions projected upon the three-dimensional object by the image display apparatus and for selectively changing the graphical content of a selected region; and
wherein the data processing means is responsive to data from the user interface means to create said regions and to control the addressable light filter means to cause the projection of said image upon the three-dimensional object, and is also responsive to the simplified guest interface means to select one of said regions, to selectively paint that region with graphical data and to project corresponding graphical information within the region projected upon the three-dimensional object.
6. The apparatus of claim 5 wherein the data processing means includes a color pattern generator that is responsive to the simplified guest interface means to selectively generate a color pattern that may be painted in the region.
7. The apparatus of claim 5 wherein the simplified guest interface is a Polhemus device having an output that is representative of a position on the three-dimensional object and of a decision to paint a region corresponding to said position with graphical data.
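The Polhemus output of claim 7 must be mapped into image coordinates and matched to a region, much as the demo program scales tracker data into queenBounds and hit-tests the regions. A sketch under assumed names, not the patent's own code:

```c
#include <assert.h>

/* Illustrative sketch: a tracker reading in a known physical range is scaled
   into the image's bounding rectangle, then hit-tested against region
   bounding boxes to pick the region to paint. All names are invented. */
typedef struct { int left, top, right, bottom; } Box;

int MapToImage(double sensor, double sensorMin, double sensorMax,
               int imgMin, int imgMax)
{
    double t = (sensor - sensorMin) / (sensorMax - sensorMin);
    return imgMin + (int)(t * (imgMax - imgMin));
}

int HitRegion(const Box *boxes, int n, int x, int y)
{
    int i;
    for (i = 0; i < n; i++)
        if (x >= boxes[i].left && x < boxes[i].right &&
            y >= boxes[i].top  && y < boxes[i].bottom)
            return i;           /* index of first region containing point */
    return -1;                  /* no region under the point */
}
```

The listing's PtInRgn call plays the role of the rectangle test here, against arbitrary region shapes rather than boxes.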
8. An image display apparatus comprising: a three-dimensional object;
a projector having a lens system and being aligned so as to project light upon the three-dimensional object, said projector including addressable light filter means for selectively filtering the light of the projector so that an image is projected upon the three-dimensional object;
user interface means for receiving data from a user to create at least one region corresponding to said image;
simplified guest interface means for selecting among regions corresponding to the three-dimensional object and for selectively changing the graphical content of a selected region;
a color pattern generator that is responsive to the simplified guest interface means to selectively generate a current color pattern that may be painted in the selected region; and,
data processing means, responsive to the user interface means, for creating regions, for filling a region with the color pattern in response to an impetus to draw or fill a region from the simplified guest interface means and in response to the color pattern generator, and for controlling the addressable light filter means to cause the projection of the image upon the three-dimensional object.
9. The apparatus of claim 8 wherein the simplified guest interface means includes a guest data input indicative of a guest's decision either to change the current color pattern or to paint a particular region of the three-dimensional object with the current color pattern, and wherein the addressable light filter means may be driven by the computer's monitor output signal.
10. The apparatus of claim 8 wherein the simplified guest interface means includes a Polhemus device having an output that corresponds to a selected position on the three-dimensional object.
11. The apparatus of claim 8 wherein the data processing means includes a memory for storing a plurality of images for sequential projection upon the three-dimensional object.
12. The apparatus of claim 8 wherein the simplified guest interface means includes a joystick having an output that corresponds to a selected position on the three-dimensional object.
13. The apparatus of claim 8 wherein the simplified guest interface includes a stylus and a digitizing pad having an output that corresponds to a selected position on the three-dimensional object.
14. The apparatus of claim 8 wherein the simplified guest interface includes a mouse having an output that corresponds to a selected position on the three-dimensional object.
15. A method of projecting an image onto a three-dimensional object, comprising the steps of:
(a) entering data into a user interface to create at least one region that corresponds to a portion of the object;
(b) entering data into the user interface to select a current color pattern corresponding to a region;
(c) processing the data to generate an output representing an image corresponding to the object;
(d) projecting the image onto the object such that selected color patterns for each region are projected upon each portion of the object corresponding to said regions; and
(e) controlling the projection in response to the output such that the image is projected on the object as desired.
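Steps (b) through (e) above amount to a small interactive loop: the guest either advances the current color pattern or paints the region under the selected point, and the per-region fill state is what gets rendered for projection. A minimal sketch with invented names, not the claimed implementation:

```c
#include <assert.h>

/* Sketch of the claim's paint loop state: one pattern id per region; the
   projection stage would render this state onto the object each frame. */
#define NREGIONS 4

typedef struct {
    int current;            /* current color pattern id (step b) */
    int fill[NREGIONS];     /* pattern painted in each region    */
} PaintState;

void NextPattern(PaintState *s, int npatterns)
{
    s->current = (s->current + 1) % npatterns;   /* cycle to a new pattern */
}

void PaintRegionAt(PaintState *s, int region)
{
    if (region >= 0 && region < NREGIONS)
        s->fill[region] = s->current;   /* steps (c)-(e): fill, then project */
}
```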
16. The method of claim 15, further comprising the step of storing the output representing the image in a buffer and then processing that stored output to interactively modify the image.
17. The method of claim 15, further comprising the step of (f) storing the output representing the image in a memory such that it may be recalled and subsequently projected onto the object as desired.
18. The method of claim 17, further comprising the steps of
(g) storing the output representing the image in a buffer and then processing that stored output to interactively modify the image, and
(h) performing steps (f) and (g) a plurality of times to form a sequence in memory of related images for sequential projection upon the object.
19. The method of claim 15, wherein:
the step of projecting the image onto the object includes the step of projecting light through a light filter, and
the step of controlling the projection in response to the output includes the step of controlling the light filter to subtract projected light so as to project the image on the object as desired.
20. The method of claim 19, wherein:
the step of projecting light through the light filter includes the step of projecting light through optically superposed liquid crystal panels to project the image onto the object, and
the step of controlling the light filter includes the step of controlling two or more optically superposed liquid crystal panels in response to the output to thereby reproduce corresponding color components of the image.
21. The method of claim 15, wherein the step of processing the graphics data to generate an output representing the image corresponding to the object includes the step of displaying the image upon a monitor for viewing by the user.
22. The method of claim 15,
wherein the step of entering data into a user interface to select color pattern information includes the steps of entering, via a simplified guest interface, data to select a current color pattern for projection upon selective portions of the object, and of entering, via the simplified guest interface, data to select a particular area of the object,
wherein the step of processing the data to generate an output includes the step of identifying a region corresponding to the area of the object selected and painting the region with the current color pattern, and
wherein the step of controlling the projection in response to the output includes the step of projecting the color pattern upon the portion of the object corresponding to said region.
23. The method of claim 15,
wherein the step of entering data into a user interface to select a current color pattern includes the steps of entering, via a simplified guest interface, data to indicate a decision to projectively draw upon the object and of entering, via the simplified guest interface, data corresponding to a point on the object,
wherein the step of processing the data to generate an output includes the step of identifying a pixel within a region corresponding to the point on the object and painting that pixel with a selected current color pattern, and
wherein the step of controlling the projection in response to the output includes the step of projecting the selected color pattern upon an area of the object corresponding to said pixel.
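Claim 23's pixel-level drawing paints a point only where it falls within the identified region, so a guest's stroke stays clipped to the part of the object that region covers. A hedged sketch with invented names:

```c
#include <assert.h>

/* Sketch: paint a single pixel with the current pattern, but only if the
   region mask covers it, so strokes cannot spill outside the region. */
void PaintPixel(int *canvas, int w, int h, const char *mask,
                int x, int y, int pattern)
{
    if (x < 0 || y < 0 || x >= w || y >= h) return;  /* off the image */
    if (!mask[y*w + x]) return;                      /* outside the region */
    canvas[y*w + x] = pattern;
}
```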
24. A method of projecting light onto a three-dimensional object comprising the steps of:
(a) entering, via a user interface, graphics data defining a region corresponding to a portion of the three-dimensional object;
(b) retrieving data from a simplified guest interface indicative of a point on the object, said simplified guest interface being manipulable by a guest to identify various points on the object;
(c) retrieving data entered by the guest at the simplified guest interface indicative of a decision by the guest to select a current color pattern to be projected upon the object and processing said data indicative of a decision to select a current color pattern;
(d) retrieving data entered by the guest at the simplified guest interface indicative of a decision to project the current color pattern upon an area on the object and processing said data indicative of a decision to project the current color pattern upon an area of the object and said data indicative of a point on the object to select a region corresponding to said point and to paint that region with the current color pattern;
(e) processing the data received to format an image containing the region to be projected upon the object; and
(f) projecting the image upon the object.
25. The method of claim 24 wherein the step of projecting the image upon the object includes the steps of projecting light through individual addressable liquid crystal filters that are individually optically superposed with color filters and with each other, of separating out color components of the image to be projected corresponding to the number of individual liquid crystal filters and of controlling the liquid crystal filters such that the image is projected upon the object.
26. The method of claim 24, further comprising the step of retrieving data entered by the guest at the simplified guest interface indicative of a decision to projectively draw upon the object and processing said data indicative of a point on the object to select a pixel corresponding to said point so as to paint that pixel with a selected color pattern, said selected color pattern being predefined as one of the current color pattern or a user defined color pattern.
27. The method of claim 26, further comprising the steps of:
(a) performing a plurality of times the step of retrieving data entered by the guest at the simplified guest interface indicative of a decision to projectively draw upon the object and processing said data indicative of a point on the object to select a pixel corresponding to said point so as to paint that pixel with a selected color pattern, so as to generate a set of pixels,
(b) determining whether said set of pixels forms one of a closed contour or, in combination with a region's boundaries, a closed contour, and
(c) defining said closed contour as a region.
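Claim 27's test, whether a drawn set of pixels seals off an area (alone or together with a region boundary), can be checked by flood-filling the background from the border: any undrawn cell the fill cannot reach is enclosed. A small sketch on a fixed grid, with invented names and not the patent's method:

```c
#include <assert.h>
#include <string.h>

/* Sketch of the closure test: flood-fill the background from the grid border;
   an undrawn cell the fill never reaches is enclosed by the drawn pixels,
   so the stroke can be promoted to a new region. */
#define W 8
#define H 8

static void Flood(const char *drawn, char *seen, int x, int y)
{
    if (x < 0 || y < 0 || x >= W || y >= H) return;
    if (seen[y*W + x] || drawn[y*W + x]) return;
    seen[y*W + x] = 1;
    Flood(drawn, seen, x+1, y); Flood(drawn, seen, x-1, y);
    Flood(drawn, seen, x, y+1); Flood(drawn, seen, x, y-1);
}

int EnclosesArea(const char *drawn)
{
    char seen[W*H];
    int x, y;
    memset(seen, 0, sizeof seen);
    for (x = 0; x < W; x++) { Flood(drawn, seen, x, 0); Flood(drawn, seen, x, H-1); }
    for (y = 0; y < H; y++) { Flood(drawn, seen, 0, y); Flood(drawn, seen, W-1, y); }
    for (y = 0; y < H; y++)
        for (x = 0; x < W; x++)
            if (!seen[y*W + x] && !drawn[y*W + x])
                return 1;       /* an interior cell is sealed off */
    return 0;                   /* no enclosed area */
}
```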
PCT/US1992/008626 1991-10-11 1992-10-09 Apparatus and method for projection upon a three-dimensional object WO1993007561A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP92922049A EP0615637A1 (en) 1991-10-11 1992-10-09 Apparatus and method for projection upon a three-dimensional object
JP5507209A JPH07504515A (en) 1991-10-11 1992-10-09 Apparatus and method for projecting onto three-dimensional objects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US07/776,075 US5325473A (en) 1991-10-11 1991-10-11 Apparatus and method for projection upon a three-dimensional object
US776,075 1991-10-11

Publications (1)

Publication Number Publication Date
WO1993007561A1 true WO1993007561A1 (en) 1993-04-15

Family

ID=25106387

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1992/008626 WO1993007561A1 (en) 1991-10-11 1992-10-09 Apparatus and method for projection upon a three-dimensional object

Country Status (4)

Country Link
US (1) US5325473A (en)
EP (1) EP0615637A1 (en)
JP (1) JPH07504515A (en)
WO (1) WO1993007561A1 (en)

Families Citing this family (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4337161C1 (en) * 1993-10-30 1995-03-02 Deutsche Aerospace Manually guided (hand-guided) entry device for a computer
US5687305A (en) * 1994-03-25 1997-11-11 General Electric Company Projection of images of computer models in three dimensional space
JPH07294866A (en) * 1994-04-27 1995-11-10 Mitsubishi Electric Corp Projector device
US5513991A (en) * 1994-12-02 1996-05-07 Vamp, Inc. Method of simulating personal individual art instruction
EP0812447B1 (en) * 1995-03-02 2004-05-26 Parametric Technology Corporation Computer graphics system for creating and enhancing texture maps
US6421165B2 (en) * 1996-02-07 2002-07-16 Light & Sound Design Ltd. Programmable light beam shape altering device using programmable micromirrors
US6288828B1 (en) * 1997-09-10 2001-09-11 Light And Sound Design Ltd. Programmable light beam shape altering device using programmable micromirrors
US6167562A (en) * 1996-05-08 2000-12-26 Kaneko Co., Ltd. Apparatus for creating an animation program and method for creating the same
US5999194A (en) * 1996-11-14 1999-12-07 Brunelle; Theodore M. Texture controlled and color synthesized animation process
US6268865B1 (en) * 1998-01-13 2001-07-31 Disney Enterprises, Inc. Method and apparatus for three-dimensional painting
US6175610B1 (en) * 1998-02-11 2001-01-16 Siemens Aktiengesellschaft Medical technical system controlled by vision-detected operator activity
US6238217B1 (en) * 1999-05-17 2001-05-29 Cec Entertainment, Inc. Video coloring book
US8035612B2 (en) * 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Self-contained interactive video display system
US8300042B2 (en) * 2001-06-05 2012-10-30 Microsoft Corporation Interactive video display system using strobed light
US7259747B2 (en) * 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
US7348963B2 (en) * 2002-05-28 2008-03-25 Reactrix Systems, Inc. Interactive video display system
US7710391B2 (en) 2002-05-28 2010-05-04 Matthew Bell Processing an image utilizing a spatially varying pattern
US7576727B2 (en) * 2002-12-13 2009-08-18 Matthew Bell Interactive directed light/sound system
WO2004055776A1 (en) 2002-12-13 2004-07-01 Reactrix Systems Interactive directed light/sound system
WO2005041578A2 (en) 2003-10-24 2005-05-06 Reactrix Systems, Inc. Method and system for managing an interactive video display system
US7536032B2 (en) 2003-10-24 2009-05-19 Reactrix Systems, Inc. Method and system for processing captured image information in an interactive video display system
US20050281480A1 (en) * 2004-06-18 2005-12-22 Baldwin Robert A Computer apparatus and method for generating a paint-by-number picture from a digital image
US20060084039A1 (en) * 2004-10-19 2006-04-20 Massachusetts Institute Of Technology Drawing tool for capturing and rendering colors, surface images and movement
DE202005001702U1 (en) * 2005-02-02 2006-06-14 Sata Farbspritztechnik Gmbh & Co.Kg Virtual painting system and paint spray gun
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US8081822B1 (en) 2005-05-31 2011-12-20 Intellectual Ventures Holding 67 Llc System and method for sensing a feature of an object in an interactive video display
US8098277B1 (en) 2005-12-02 2012-01-17 Intellectual Ventures Holding 67 Llc Systems and methods for communication between a reactive video system and a mobile communication device
DE502007000825D1 (en) 2006-12-05 2009-07-16 Sata Gmbh & Co Kg Ventilation for the gravity cup of a paint spray gun
AU2008299883B2 (en) 2007-09-14 2012-03-15 Facebook, Inc. Processing of gesture-based user interactions
US8159682B2 (en) 2007-11-12 2012-04-17 Intellectual Ventures Holding 67 Llc Lens system
US8259163B2 (en) 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
US9327301B2 (en) 2008-03-12 2016-05-03 Jeffrey D. Fox Disposable spray gun cartridge
US8595218B2 (en) 2008-06-12 2013-11-26 Intellectual Ventures Holding 67 Llc Interactive display management systems and methods
US8231225B2 (en) * 2008-08-08 2012-07-31 Disney Enterprises, Inc. High dynamic range scenographic image projection
DE202008014389U1 (en) * 2008-10-29 2010-04-08 Sata Gmbh & Co. Kg Gravity cup for a paint spray gun
DE102009032399A1 (en) 2009-07-08 2011-01-13 Sata Gmbh & Co. Kg Spray Gun
DE202010007355U1 (en) 2010-05-28 2011-10-20 Sata Gmbh & Co. Kg Nozzle head for a spraying device
EP2646166B1 (en) 2010-12-02 2018-11-07 SATA GmbH & Co. KG Spray gun and accessories
CN107537707B (en) 2011-06-30 2021-09-03 萨塔有限两合公司 Spray gun, spray medium guide unit, cover, base body and related method
US20140111629A1 (en) * 2012-10-20 2014-04-24 Margaret Morris System for dynamic projection of media
US10380921B2 (en) * 2013-03-15 2019-08-13 University Of Central Florida Research Foundation, Inc. Physical-virtual patient bed system
US9679500B2 (en) 2013-03-15 2017-06-13 University Of Central Florida Research Foundation, Inc. Physical-virtual patient bed system
CA155474S (en) 2013-09-27 2015-08-27 Sata Gmbh & Co Kg Spray gun
DE202013105779U1 (en) 2013-12-18 2015-03-19 Sata Gmbh & Co. Kg Air nozzle termination for a paint spray gun
JP6459194B2 (en) 2014-03-20 2019-01-30 セイコーエプソン株式会社 Projector and projected image control method
CN110996080B (en) * 2014-04-22 2021-10-08 日本电信电话株式会社 Video presentation device, video presentation method, and recording medium
CA159961S (en) 2014-07-31 2015-07-17 Sata Gmbh & Co Kg Spray gun
USD758537S1 (en) 2014-07-31 2016-06-07 Sata Gmbh & Co. Kg Paint spray gun rear portion
CN105289870B (en) 2014-07-31 2019-09-24 萨塔有限两合公司 Manufacturing method, spray gun, gun body and the lid of spray gun
USD768820S1 (en) 2014-09-03 2016-10-11 Sata Gmbh & Co. Kg Paint spray gun with pattern
DE102015006484A1 (en) 2015-05-22 2016-11-24 Sata Gmbh & Co. Kg Nozzle arrangement for a spray gun, in particular paint spray gun and spray gun, in particular paint spray gun
DE102015016474A1 (en) 2015-12-21 2017-06-22 Sata Gmbh & Co. Kg Air cap and nozzle assembly for a spray gun and spray gun
FR3048786B1 (en) * 2016-03-10 2018-04-06 Smartpixels DYNAMIC ADJUSTMENT OF THE SHARPNESS OF AT LEAST ONE IMAGE PROJECTED ON AN OBJECT
CN205966208U (en) 2016-08-19 2017-02-22 萨塔有限两合公司 Hood subassembly and spray gun
CN205995666U (en) 2016-08-19 2017-03-08 萨塔有限两合公司 Spray gun and its trigger
JP6556680B2 (en) * 2016-09-23 2019-08-07 日本電信電話株式会社 VIDEO GENERATION DEVICE, VIDEO GENERATION METHOD, AND PROGRAM
EP3462411A1 (en) 2017-09-27 2019-04-03 Arkite NV Configuration tool and method for a quality control system
JP6545415B1 (en) * 2018-07-02 2019-07-17 三菱電機株式会社 Editing device, editing method, editing program, and editing system
DE102018118737A1 (en) 2018-08-01 2020-02-06 Sata Gmbh & Co. Kg Nozzle for a spray gun, nozzle set for a spray gun, spray guns and method for producing a nozzle for a spray gun
EP3829778A2 (en) 2018-08-01 2021-06-09 SATA GmbH & Co. KG Set of nozzles for a spray gun, spray gun system, method for embodying a nozzle module, method for seelcting a nozzle module from a set of nozzles for a paint job, selection system and computer program product
DE102018118738A1 (en) 2018-08-01 2020-02-06 Sata Gmbh & Co. Kg Base body for a spray gun, spray guns, spray gun set, method for producing a base body for a spray gun and method for converting a spray gun
US11772276B2 (en) 2020-01-02 2023-10-03 Universal City Studios Llc Systems and methods for optical performance captured animated figure with real-time reactive projected media
US12008917B2 (en) 2020-02-10 2024-06-11 University Of Central Florida Research Foundation, Inc. Physical-virtual patient system
US11207606B2 (en) 2020-03-02 2021-12-28 Universal City Studios Llc Systems and methods for reactive projection-mapped show robot
DE102020123769A1 (en) 2020-09-11 2022-03-17 Sata Gmbh & Co. Kg Sealing element for sealing a transition between a base body of a spray gun and an add-on part of a spray gun, add-on part, in particular paint nozzle arrangement, for a spray gun and spray gun, in particular paint spray gun

Citations (5)

Publication number Priority date Publication date Assignee Title
US1653180A (en) * 1924-12-15 1927-12-20 Parisienne De Confection Soc Lay figure
US3610745A (en) * 1969-08-01 1971-10-05 James Mark Wilson Visual effects combining motion pictures and three dimensional objects
US4076398A (en) * 1973-10-10 1978-02-28 Ortho Pharmaceutical Corporation Visual communications system
US4200867A (en) * 1978-04-03 1980-04-29 Hill Elmer D System and method for painting images by synthetic color signal generation and control
US5115305A (en) * 1990-07-05 1992-05-19 Baur Thomas G Electrically addressable liquid crystal projection system with high efficiency and light output

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US3420598A (en) * 1966-05-06 1969-01-07 Daniel Goss Screen animator
US4243315A (en) * 1978-08-21 1981-01-06 Wolf Clifford R Device for selectively distorting reflected images and the method of performing same
US4514818A (en) * 1980-12-04 1985-04-30 Quantel Limited Video image creation system which simulates drafting tool
US4539585A (en) * 1981-07-10 1985-09-03 Spackova Daniela S Previewer

Non-Patent Citations (2)

Title
Microsoft Paintbrush User's Guide, 1986, see pages 34-35. *
Microsoft Windows Paint User's Guide, Version 2.0, 1987, see pages 12-21. *

Also Published As

Publication number Publication date
JPH07504515A (en) 1995-05-18
EP0615637A1 (en) 1994-09-21
US5325473A (en) 1994-06-28

Similar Documents

Publication Publication Date Title
WO1993007561A1 (en) Apparatus and method for projection upon a three-dimensional object
US7433760B2 (en) Camera and animation controller, systems and methods
EP0875042B1 (en) Computer-assisted animation construction system and method and user interface
Agrawala et al. Artistic multiprojection rendering
Derakhshani Introducing Autodesk Maya 2012
US20190347865A1 (en) Three-dimensional drawing inside virtual reality environment
US4602286A (en) Video processing for composite images
US5960099A (en) System and method for creating a digitized likeness of persons
Spencer ZBrush character creation: advanced digital sculpting
Staples Representation in virtual space: visual convention in the graphical user interface
US7277571B2 (en) Effective image processing, apparatus and method in virtual three-dimensional space
JPH06507743A (en) Image synthesis and processing
CN111324334B (en) Design method for developing virtual reality experience system based on narrative oil painting works
Maraffi Maya character creation: modeling and animation controls
CN103777915A (en) Immersed type interaction system
Balcisoy et al. Interaction between real and virtual humans in augmented reality
Land Computer art: Color-stereo displays
Derakhshani Introducing Maya 2011
Booth et al. Computers: Computers animate films and video: Computers permit spectacular visual effects and take the drudgery out of film cartooning
Pearson The computer: liberator or jailer of the creative spirit
Lammers et al. Maya 4.5 Fundamentals
WO2020261341A1 (en) Graphic game program
WO2020261454A1 (en) Graphic game program
Lewis et al. Maya 5 fundamentals
CN116524155A (en) MR fish tank interaction system and method supporting manual creation

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 1992922049

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1992922049

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1992922049

Country of ref document: EP