US20070013718A1 - Image processor, image processing method, recording medium, computer program and semiconductor device - Google Patents


Info

Publication number
US20070013718A1
US 2007/0013718 A1 (U.S. application Ser. No. 11/522,775)
Authority
US
Grant status
Application
Prior art keywords
image
images
target
time
series
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11522775
Inventor
Akio Ohba
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/0007 Image acquisition
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Structure of client; Structure of client peripherals using Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. Global Positioning System [GPS]
    • H04N21/4223 Cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network, synchronizing decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network, synchronizing decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213 Monitoring of end-user related data
    • H04N21/44218 Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/44 Receiver circuitry
    • H04N5/4403 User interfaces for controlling a television receiver or set top box [STB] through a remote control device, e.g. graphical user interfaces [GUI]; Remote control devices therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed circuit television systems, i.e. systems in which the signal is not broadcast
    • H04N7/183 Closed circuit television systems, i.e. systems in which the signal is not broadcast for receiving images from a single remote source
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/44 Receiver circuitry
    • H04N5/4403 User interfaces for controlling a television receiver or set top box [STB] through a remote control device, e.g. graphical user interfaces [GUI]; Remote control devices therefor
    • H04N2005/4405 Hardware details of remote control devices
    • H04N2005/4428 Non-standard components, e.g. timer, speaker, sensors for detecting position, direction or movement of the remote control, microphone, battery charging device

Abstract

An image processor is provided which utilizes images taken by an imaging device, e.g., a digital camera, as an input interface for entering commands, etc. A memory is operable to store in real time a series of images of a location captured by the imaging device over time. A detector detects a target captured in the series of images, and a quantitative value of a movement component of the target, by detecting differences between features of the captured series of images at different points in time. An image generator is operable to generate an object image that follows a movement of the detected target and includes an image representing a trace of the movement, and is further operable to generate a combined image from the object image and an image from the captured series of images. An output is operable to output in real time a signal representing the combined image at the location captured in the series of images.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a division of U.S. patent application Ser. No. 10/872,917 filed Jun. 21, 2004, which is a continuation of U.S. patent application Ser. No. 09/971,962 filed Oct. 5, 2001, the disclosures of both of these applications being hereby incorporated by reference herein. U.S. patent application Ser. No. 09/971,962 is based upon and claims the benefit of priority from prior Japanese Patent Applications Nos. 2000-307574 filed Oct. 6, 2000, and 2001-295098 filed Sep. 26, 2001, the entire contents of which are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • The present invention relates to an image processing technology for using an image taken by an image pickup apparatus such as a video camera as an interface for inputting commands, etc.
  • A keyboard, mouse, controller, etc. are input devices often used with a computer, video game machine, etc. The operator inputs desired commands by operating these input devices to cause the computer, etc. to execute processing according to the commands entered. The operator then views images on a display device and listens to sound from a speaker obtained as the results of such processing.
  • The operator enters commands by operating many buttons provided on the input device while watching a cursor shown on the display device.
  • Such operations depend greatly on the operator's experience. For example, for a person who has never used a keyboard, entering desired commands with it is troublesome and time-consuming, and prone to input errors due to mistyping. For this reason, there is a demand for a man-machine interface that is easy for the operator to use.
  • On the other hand, with the progress of multimedia technologies, people in ordinary households can now readily capture images from a video camera into a computer, etc., edit them and display them on a display device. Such technologies are also used for personal authentication, in which images of a physical feature such as a face are analyzed and characteristic parts thereof are extracted to identify individuals.
  • Conventionally, such images serve only as information to be processed by a computer, e.g., for editing or analysis. They have not so far been used for purposes such as entering commands to a computer.
  • SUMMARY OF THE INVENTION
  • It is an object of the present invention to provide an image processing technology to use images taken by an image pickup apparatus, etc. as an input interface to enter commands, etc.
  • According to an aspect of the invention, an image processor is provided which includes a memory operable to store in real time a series of images of a location captured by an imaging device over time, the captured series of images at least partially including a target that is subject to move within the captured series of images from one point in the time to another point in the time. The image processor further includes a detector operable to detect the target and a quantitative value of a movement component thereof by detecting differences between features of the captured series of images at a first point in the time and at a second point in the time. An image generator is included in the image processor, and is operable to generate an object image representing a predetermined object so that the object image follows a movement of the detected target and includes an image representing a trace of the movement, the image generator being further operable to generate a combined image from the object image and an image from the captured series of images. The image processor further includes an output operable to output in real time a signal representing the combined image at the location captured in the series of images.
  • According to one or more preferred aspects of the invention, the image processor is operable to produce a combined image in which the combined image includes a mirrored image of the target.
  • According to one or more preferred aspects of the invention, the output of the image processor is operable to output the signal representing the combined image to a predetermined display device.
  • According to one or more preferred aspects of the invention, the target is a first target, the captured series of images at least partially includes a plurality of targets including the first target, each of the plurality of targets being subject to move within the captured series of images from one point in the time to another point in the time, and the detector is operable to detect the quantitative value of the movement component of each of the plurality of targets and to detect a particular one of the plurality of targets based on the detected quantitative values of the movement components of the plurality of targets such that the object image follows the movement of the particular target, and the image representing the trace of the movement represents the trace of the movement of the particular target.
  • In accordance with one or more preferred aspects of the invention, the object image is associated with predetermined processing and the image generator is further operable to perform the predetermined processing when the detected quantitative value of the movement component satisfies a predetermined condition.
  • According to one or more preferred aspects of the invention, the detected quantitative value of the movement component includes a rate of movement of the target.
  • According to one or more preferred aspects of the invention the detected quantitative value of the movement component includes a cumulative amount of movement of the target.
  • In accordance with another aspect of the invention, an image processing method is provided which includes the following:
      • storing in real time a series of images of a location captured by an imaging device over time, the captured series of images at least partially including a target that is subject to move within the captured series of images from one point in the time to another point in the time;
      • detecting the target and a quantitative value of a movement component thereof by detecting differences between features of the captured series of images at a first point in the time and at a second point in the time;
      • generating an object image representing a predetermined object so that the object image follows a movement of the detected target and includes an image representing a trace of the movement;
      • generating a combined image from the object image and an image from the captured series of images; and
      • outputting in real time a signal representing the combined image at the location captured in the series of images.
  • In accordance with another aspect of the invention, a computer-readable recording medium is provided which has instructions recorded thereon, the instructions being executable by a computer or image processing system to perform a method, wherein the method includes the following:
      • storing in real time a series of images of a location captured by an imaging device over time, the captured series of images at least partially including a target that is subject to move within the captured series of images from one point in the time to another point in the time;
      • detecting the target and a quantitative value of a movement component thereof by detecting differences between features of the captured series of images at a first point in the time and at a second point in the time;
      • generating an object image representing a predetermined object so that the object image follows a movement of the detected target and includes an image representing a trace of the movement;
      • generating a combined image from the object image and an image from the captured series of images; and
      • outputting in real time a signal representing the combined image at the location captured in the series of images.
  • In accordance with another aspect of the invention, a system is provided which is operable to process an image. The system includes one or more semiconductor devices, and the one or more semiconductor devices include:
      • a memory operable to store in real time a series of images of a location captured by an imaging device over time, the captured series of images at least partially including a target that is subject to move within the captured series of images from one point in the time to another point in the time,
      • a detector operable to detect the target and a quantitative value of a movement component thereof by detecting differences between features of the captured series of images at a first point in the time and at a second point in the time,
      • an image generator operable to generate an object image representing a predetermined object so that the object image follows a movement of the detected target and includes an image representing a trace of the movement, the image generator being further operable to generate a combined image from the object image and an image from the captured series of images, and
      • an output operable to output in real time a signal representing the combined image at the location captured in the series of images.
  • According to a further preferred aspect of the invention, an image processor includes a memory operable to store an image from a series of real time images of a location captured by an imaging device over time, the stored image at least partially including a target that is subject to movement from one point in the time to another point in the time. The image processor further includes a detector operable to detect the target and a movement component thereof by detecting features of the captured images at different points in the time, and includes an image generator operable to generate an object image representing a predetermined object so that the object image follows a movement of the detected target and includes an image representing a trace of the movement. Such image generator is further operable to generate a combined image from the object image and the stored image and to output in real time a signal representing the combined image, to permit the combined image to be displayed to the location imaged by the imaging device.
  • In accordance with an aspect of the invention, an image processor is operable to generate object images according to the movements of targets included in the mirrored moving image. That is, the movement of the target determines the movement, color and shape of the object image to be displayed on the display device and, when there is a plurality of object images, which object image should be displayed. For example, if the target is the operator, the object is determined according to the movement of the operator. Thus, the mirrored moving image is available as a kind of input interface.
  • The image processor may also comprise means for making preparations for executing required processing based on the generated object image according to the movement component of the target.
  • The image processor may further comprise means for comparing a combined image, obtained by combining the object image generated by the image generating means with the mirrored moving image at the current point in time, against a template image, i.e., an image of the part of the target included in the immediately preceding mirrored moving image; detecting the part of the combined image whose image features most closely resemble the template image; and making preparations for executing required processing based on the object image when the detected part of the combined image includes the object image.
  • By associating the object image with predetermined processing and further comprising means for executing the processing linked to the object image when the movement component of the target detected by the detecting means satisfies predetermined conditions, it is possible to execute processing using the movement of the target as an input.
  • The image processor may also be constructed so that the mirrored moving image includes a plurality of targets, the detecting means detects the movement components of the plurality of targets and selects one target based on the respective detected movement components, and the image generating means changes the object image according to the movement component of the target so selected.
  • According to still other aspects of the invention, a recording medium is provided having instructions recorded thereon for performing a method such as described above, and a semiconductor device is provided which has functions to perform such method.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These objects and other objects and advantages of the present invention will become more apparent upon reading of the following detailed description and the accompanying drawings in which:
  • FIG. 1 is an overall configuration diagram of an image processing system applying the present invention;
  • FIG. 2 is a configuration diagram of an image processor according to an embodiment of the present invention;
  • FIG. 3 is a functional block diagram of the image processor according to the embodiment of the present invention;
  • FIG. 4 is a flow chart showing a processing procedure of Embodiment 1;
  • FIG. 5 is a flow chart showing a processing procedure of Embodiment 1;
  • FIG. 6 illustrates a combined image according to Embodiment 1;
  • FIG. 7 illustrates a menu image;
  • FIG. 8 is a flow chart showing a processing procedure of Embodiment 2;
  • FIG. 9 illustrates a combined image according to Embodiment 2;
  • FIG. 10 is a view illustrating a drawing using a recursive texture;
  • FIG. 11 is a flow chart showing a processing procedure of Embodiment 3;
  • FIG. 12 is a flow chart showing a processing procedure of Embodiment 3; and
  • FIG. 13 illustrates a combined image according to Embodiment 3.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • An embodiment of the present invention will be specifically described with reference to the drawings accompanying herewith.
  • FIG. 1 is a configuration example of an image processing system applying the present invention.
  • This image processing system takes pictures of an operator sitting in front of a display device 3 using an analog or digital video camera 1, and captures the resulting moving images into an image processor 2 consecutively in a time series to generate mirrored moving images. With these mirrored moving images, the image processing system combines object images expressing objects such as a menu and cursor at the positions where remarked objects such as the eyes and hands of the operator (hereinafter, the remarked objects are referred to as “targets”) appear, to generate a combined image (which is itself a moving image), and displays this combined image on the display device 3 in real time.
  • A mirrored moving image can be generated by subjecting the moving image captured from the video camera 1 to mirroring (right/left inversion of the image) in the image processor 2. Alternatively, a mirror may be placed in front of the video camera 1, and the video camera 1 may take pictures of the moving image on the mirror surface reflecting the operator. In either case, a combined image whose display mode changes in real time according to the movement of the target is displayed on the display device 3.
  • The image processor 2 is implemented by a computer that forms the required functions using a computer program.
  • The computer according to this embodiment, whose hardware configuration is shown by way of example in FIG. 2, has two buses: a main bus B1 and a sub bus B2, to which a plurality of semiconductor devices, each having specific functions, is connected. These buses B1 and B2 are mutually connected or disconnected via a bus interface INT.
  • The main bus B1 is connected to a main CPU 10 which is a main semiconductor device, a main memory 11 made up of a RAM, a main DMAC (Direct Memory Access Controller) 12, an MPEG (Moving Picture Experts Group) decoder (MDEC) 13 and a graphic processing unit (hereinafter referred to as “GPU”) 14 incorporating a frame memory 15 which serves as a drawing memory. The GPU 14 is connected with a CRTC (CRT controller) 16 for generating a video signal so as to display the data drawn in the frame memory 15 on the display device 3.
  • At startup, the main CPU 10 loads a start program from the ROM 23 on the sub bus B2 via the bus interface INT, executes the start program and starts the operating system. The main CPU 10 also controls the media drive 27, reads an application program or data from the medium 28 mounted in this media drive 27 and stores it in the main memory 11. The main CPU 10 further applies geometry processing (coordinate value calculation processing) to various data read from the medium 28, for example, three-dimensional object data (coordinate values of the vertices (typical points) of polygons, etc.) made up of a plurality of basic graphics (polygons), and generates a display list containing geometry-processed polygon definition information (specifications of the shape of the polygon used, its drawing position, and the type, color, texture, etc. of the components of the polygon).
  • The GPU 14 is a semiconductor device having the functions of storing drawing context (drawing data including polygon components), carrying out rendering processing (drawing processing) by reading necessary drawing context according to the display list notified from the main CPU 10 and drawing polygons in the frame memory 15. The frame memory 15 can also be used as a texture memory. Thus, a pixel image in the frame memory can be pasted as texture to a polygon to be drawn.
  • The main DMAC 12 is a semiconductor device that carries out DMA transfer control over the circuits connected to the main bus B1 and also carries out DMA transfer control over the circuits connected to the sub bus B2 according to the condition of the bus interface INT. The MDEC 13 is a semiconductor device that operates in parallel with the CPU 10 and has the function of expanding data compressed in MPEG (Moving Picture Experts Group) or JPEG (Joint Photographic Experts Group) systems, etc.
  • The sub bus B2 is connected to a sub CPU 20 made up of a microprocessor, etc., a sub memory 21 made up of a RAM, a sub DMAC 22, a ROM 23 that records a control program such as operating system, a sound processing semiconductor device (SPU: Sound Processing Unit) 24 that reads sound data stored in the sound memory 25 and outputs as audio output, a communication control section (ATM) 26 that transmits/receives information to/from an external apparatus via a network (not shown), a media drive 27 for setting a medium 28 such as CD-ROM and DVD-ROM and an input device 31.
  • The sub CPU 20 carries out various operations according to the control program stored in the ROM 23. The sub DMAC 22 is a semiconductor device that carries out control such as DMA transfer over the circuits connected to the sub bus B2 only when the bus interface INT separates the main bus B1 from the sub bus B2. The input device 31 is provided with a connection terminal 32 through which an input signal from an operating device 35 is input, a connection terminal 33 through which an image signal from the video camera 1 is input and a connection terminal 34 through which a sound signal from the video camera 1 is input.
  • For convenience, this specification explains only images and omits explanations of sound.
  • In the computer constructed as shown above, and with reference to FIG. 3, the main CPU 10, sub CPU 20 and GPU 14 read and execute a predetermined computer program from the recording medium such as the ROM 23 and medium 28, and thereby form a functional block necessary for operating as the image processor 2, that is, an image input device 101, an image inverter 102, an object data storage device 103, an object data input device 104, an object controller 105, a superimposing image generator 106, a difference value detector 107 and a display controller 108.
  • In relation to the hardware shown in FIG. 2, the image input device 101 is formed by the input device 31 and the sub CPU 20 that controls its operation; the image inverter 102, the object data input device 104, the object controller 105 and the difference value detector 107 are formed by the main CPU 10; the superimposing image generator 106 is formed by the GPU 14; and the display controller 108 is formed by the GPU 14 and the CRTC 16 cooperating with each other. The object data storage device 103 is formed in a memory area accessible to the main CPU 10, for example, the main memory 11.
  • The image input device 101 incorporates images taken by the video camera 1 via the connection terminal 33 of the input device 31. In the case where the image entered is a digital image, the image input device 101 incorporates the image as is. In the case where the image taken and entered is an analog image, the image input device 101 incorporates the image after converting it from analog to digital.
  • The image inverter 102 subjects the image incorporated by the image input device 101 to mirroring, that is, right/left inversion to form a mirrored moving image.
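As a concrete illustration of this right/left inversion, the following is a minimal Python/NumPy sketch (not part of the patent; the function name `mirror_frame` and the array layout are assumptions). Reversing the width axis of each captured frame yields the corresponding frame of the mirrored moving image.

```python
import numpy as np

def mirror_frame(frame: np.ndarray) -> np.ndarray:
    """Right/left inversion (mirroring) of a single frame.

    `frame` is assumed to be an array of shape (height, width, channels).
    Reversing the width axis produces the mirrored frame, so that the
    operator's movements on screen match those of a mirror image.
    """
    return frame[:, ::-1, :]
```

Applying this per frame to the incoming time series produces the mirrored moving image described above.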
  • The object data storage device 103 stores object data to express objects such as a menu (including a submenu), matchstick, and cursor together with identification data thereof.
  • The object data input device 104 incorporates necessary object data from the object data storage device 103 and sends the object data to the object controller 105. The object data to be incorporated is instructed by the object controller 105.
  • The object controller 105 generates an object image based on the object data incorporated from the object data input device 104 according to the instruction content. In particular, the object controller 105 determines the object display condition based on a difference value sent from the difference value detector 107 and generates an object image to realize that display condition. The difference value will be described later.
  • The superimposing image generator 106 draws a combined image obtained by superimposing the mirrored moving image output from the image inverter 102 on the object image generated by the object controller 105 in the frame memory 15.
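One plausible way to realize this superimposition is per-pixel alpha blending of the object image onto the mirrored frame. The sketch below is illustrative only; the split of the object image into `obj_rgb` and a coverage map `obj_alpha` is an assumption, not the patent's actual drawing path through the frame memory 15.

```python
import numpy as np

def superimpose(mirrored: np.ndarray, obj_rgb: np.ndarray,
                obj_alpha: np.ndarray) -> np.ndarray:
    """Combine an object image with a mirrored frame by alpha blending.

    mirrored:  (H, W, 3) background frame of the mirrored moving image
    obj_rgb:   (H, W, 3) object image (menu, cursor, etc.)
    obj_alpha: (H, W)    coverage in [0, 1]; 0 leaves the frame untouched
    """
    a = obj_alpha[..., None]  # broadcast alpha over the color channels
    return (obj_rgb * a + mirrored * (1.0 - a)).astype(mirrored.dtype)
```

Pixels where the object's alpha is zero pass the mirrored frame through unchanged, so the object appears to float over the live mirror view.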
  • By the way, in addition to generating a combined image by superimposing the object image, it is also possible to display the object image on the mirrored moving image using publicly known superimposing processing.
  • The difference value detector 107 compares the image features of the mirrored moving image of the combined image generated by the superimposing image generator 106 frame by frame and derives the difference value of the image features between the mirrored moving images of the preceding and following frames. Furthermore, the difference value detector 107 generates a difference image between the mirrored moving images of the preceding and following frames as required.
  • The difference value in the image features is a value quantitatively expressing a variation per frame of the movement component of the target included in the mirrored moving image. For example, the difference value indicates a distance that the target has moved in the mirrored moving image, or the area between the region occupied by the target after the movement and the region occupied before the movement.
  • When a plurality of targets is included within one mirrored moving image, a difference value in the image features expresses a variation in the movement of each target, and therefore it is possible to quantitatively calculate the variation in the movement of each target by calculating this difference value.
  • The difference image is an image expressing a variation in the movement per frame of each target included in the mirrored moving image at every point in time. For example, when a target moves between two mirrored moving images, the difference image is an image made up of the image of the target before the movement and the image of the target after the movement.
  • In order to derive the difference value and difference image, the difference value detector 107 stores a certain mirrored moving image in the main memory 11 as a “reference image” relative to the mirrored moving images of other frames. The stored image may be a full frame's worth of the mirrored moving image, or may be a mirrored moving image of only part of the target, because all that is required is that a difference value in the image features can be derived.
  • In the following explanations, whenever a distinction should be made between a full-frame image and an image of only part of a target, the latter is called a “template image”.
  • The difference value detected by the difference value detector 107 is sent to the object controller 105 and used to control movements of object images.
  • The display controller 108 converts the combined image generated by the superimposing image generator 106 to a video signal and outputs the video signal to the display device 3. The display device 3 displays the combined image (moving image) on a screen using this video signal.
  • <Image Processing Method>
  • An embodiment of the image processing method carried out using the above-described image processing system will now be explained.
  • Embodiment 1
  • On the display device 3, as shown in FIG. 6, suppose the image processor 2 displays a combined image consisting of the mirrored moving image of the operator taken by the video camera 1 and subjected to mirroring with a menu image as an example of an object image superimposed.
  • As a target, it is possible to select various objects such as the eyes, mouth, hands, etc. of the operator. Here, a case will be described where the operator's hand is the target and instructions are entered to the menu image by detecting the amount of movement of the hand in the area in which the menu image is displayed.
  • The menu image has a hierarchic structure as shown in FIG. 7. When the operator selects “menu” at the top layer, a pull-down image highlighting one of “select1”, “select2” or “select3” at the lower layer is displayed, and when one item is selected from the pull-down image, the process determining images (for example, “process 21”, “process 22”, “process 23”, “process 24”) of the menu at the lower layer of the selected pull-down image are displayed.
  • The process determining image is stored in the object data storage device 103 linked to a program that causes the main CPU 10 to execute the determined process (event); when a certain process determining image is selected, the program linked thereto starts and executes the corresponding process (event).
  • FIG. 4 and FIG. 5 show the procedure for processing by the image processor 2 to enable such an operation.
  • First, with reference to FIG. 4, when the difference value detector 107 updates the mirrored moving image to that of the next frame and the combined image generated by the superimposing image generator 106 is thereby updated (step S101), the image features of the mirrored moving images included in the combined images before and after the update are compared and the difference value is calculated (step S102). The difference value calculated here expresses one movement of the operator's hand in the area in which the menu image is displayed. The calculated difference values are recorded in the main memory 11 and cumulatively added for a certain period of time (step S103). Difference values are cumulatively added because the image processor 2 detects the operator's intent to issue an operation instruction from a plurality of movements of the operator's hand. If the operator's intent can be confirmed from the amount of a single movement of the hand, cumulative addition need not always be performed.
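The per-frame comparison and cumulative addition of steps S102 and S103 can be sketched as follows. This is an illustrative model, not the patented implementation: frames are grayscale pixel-row lists, the difference value is taken as the sum of absolute pixel differences, and all names are hypothetical:

```python
def frame_difference(prev, curr):
    """Difference value between two frames: sum of absolute
    pixel differences (a simple image-feature comparison)."""
    return sum(abs(a - b)
               for prev_row, curr_row in zip(prev, curr)
               for a, b in zip(prev_row, curr_row))

def exceeds_threshold(frames, threshold):
    """Cumulatively add per-frame difference values and report
    whether the cumulative sum reaches the threshold (cf. step S105)."""
    total = 0
    for prev, curr in zip(frames, frames[1:]):
        total += frame_difference(prev, curr)
        if total >= threshold:
            return True, total
    return False, total
```

In this model, a sequence of small hand movements accumulates until the threshold is crossed, at which point the “menu” item would be treated as selected.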
  • The difference value detector 107 sends the difference value (cumulative sum) to the object controller 105.
  • The object controller 105 determines the color of the menu image according to the difference value (cumulative sum) received from the difference value detector 107 (step S104). For example, a plurality of colors is provided for the menu image, and the color is changed every time a movement of the hand is detected. The color may also be changed from transparent to semitransparent, opaque, and so on. Next, the difference value (cumulative sum) is compared with a predetermined threshold (step S105), and if the cumulative sum is smaller than the threshold (step S105: N), the routine returns to step S101 on the assumption that it is not yet sufficient to determine that “menu” of the menu screen has been selected.
  • When the cumulative sum exceeds the threshold (step S105: Y), the object controller 105 determines that “menu” of the menu screen has been selected, shows a pull-down image and reports it to the difference value detector 107 (step S106).
  • Thus, when the cumulative sum of the movement of the operator's hand detected in the area in which the menu image is displayed exceeds the threshold, the object controller 105 detects that “menu” of the menu image has been selected and shows the pull-down image. The color of the menu image changes according to the cumulative sum of the amount of movement of the hand, and therefore the operator can know a rough amount of additional movement of the hand required to select “menu”.
  • Furthermore, since the display device 3 shows a mirrored moving image, the operator can perform the above-described operation in much the same way the operator looks in a mirror, providing a man-machine interface easy-to-operate for the operator.
  • Next, with reference to FIG. 5, when it is detected that “menu” on the menu screen has been selected, that is, when the difference value (cumulative sum) has exceeded the threshold, the difference value detector 107 stores the image of the operator's hand (target) at that time as a template image (step S107).
  • When the frame is updated and the menu image is thereby replaced by the pull-down image in its subordinate layer and a combined image is shown (step S108), a search is started for the location of the image of the operator's hand in the new combined image. That is, the difference value detector 107 searches for an image that matches the template image from the combined image (step S109).
  • More specifically, the difference value detector 107 divides the combined image into areas of the same size as the template image and searches for the image most resembling the template image from among the images in the respective divided areas. The image most resembling the template image is, for example, when the sum total of absolute values (or squares) of the differences between corresponding pixels of the compared images is expressed as a distance, the image whose distance from the template image is a minimum.
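The search just described can be sketched as a minimum-distance template match over non-overlapping blocks. This is an illustrative sketch under the patent's stated distance measure (sum of absolute pixel differences); function names and the block layout details are assumptions:

```python
def sad(a, b):
    """Sum of absolute differences between two equal-sized pixel blocks
    (the 'distance' between an area and the template)."""
    return sum(abs(x - y)
               for row_a, row_b in zip(a, b)
               for x, y in zip(row_a, row_b))

def find_best_match(image, template):
    """Divide the image into areas the size of the template and return
    the (top, left) position of the area with minimum distance."""
    th, tw = len(template), len(template[0])
    best = None
    for top in range(0, len(image) - th + 1, th):
        for left in range(0, len(image[0]) - tw + 1, tw):
            block = [row[left:left + tw] for row in image[top:top + th]]
            d = sad(block, template)
            if best is None or d < best[0]:
                best = (d, (top, left))
    return best[1]
```

The returned position would then be tested against the pull-down, process determining, and menu image areas as in steps S111 to S116.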
  • When a matched image is found (step S110: Y), it is determined whether the matched image is a pull-down image or not (step S111). If the matched image is a pull-down image (step S111: Y), it is detected which of the areas of the pull-down images “select1”, “select2” or “select3” contains the matched image (step S112). The detected pull-down image becomes the pull-down image indicated and selected by the operator. Information on the selected pull-down image is reported from the difference value detector 107 to the object controller 105.
  • The object controller 105 reads a process-determining image accompanying the selected pull-down image from the object data storage device 103 and generates an object image to which this process-determining image is attached (step S113).
  • In this way, the display device 3 shows how the menus are selected one after another by the operator.
  • In the example in FIG. 7, the pull-down image of “select2” is selected from the menu image at the top layer and the process determining images (“process 21”, “process 22”, “process 23” and “process 24”) accompanying the pull-down image of “select2” are displayed.
  • The template image is replaced by a new one for every frame.
  • That is, the difference value detector 107 discards the template image used for the preceding frame and stores the above-described matched image (image of the operator's hand used to select the pull-down image) as a new template image (step S114). Then, the routine returns to step S108 to specify any one of the process determining images (“process 21”, “process 22”, “process 23” and “process 24”) as shown above.
  • In step S111, when the matched image is outside the area of the pull-down image but is any one of the process determining images within the process determining image area (step S111: N, S115: Y), the process determining image is assumed to have been selected and the content of the process linked thereto is determined, that is, the program is made executable and the process using the menu image is finished (step S118).
  • When the matched image is outside the areas of the pull-down image and the process determining image but within the menu image area (step S111: N, S115: N, S116: Y), this means that the operator attempts to select another pull-down image, and therefore the routine discards the template image, stores the matched image as a new template image and returns to step S108 (step S117).
  • In step S110, when no matched image is found (step S110: N), or when a matched image is found but is outside the area of the menu image (step S111: N, S115: N, S116: N), the processing using the menu image is finished at that time.
  • By carrying out processing according to the menu image in the above procedure, the operator can easily select the process with a desired content while watching his/her own mirrored moving image shown on the screen of the display device 3. Furthermore, the operator can enter instructions while checking his/her own behavior on the screen at any time, which prevents the operator from averting his/her eyes from the display device 3, as happens when using an input device such as a keyboard.
  • Embodiment 2
  • The image processing system according to this embodiment links an object image to a program that causes the main CPU 10 to execute an event to be subjected to image processing so that processing of the relevant event is executed according to the action of the operator within the mirrored moving image on the object image.
  • As an example of an object image to be superimposed on the mirrored moving image, this embodiment shows a case of using an image of a matchstick and an image of a flame expressing that the matchstick ignites and burns.
  • As a premise, the image of the matchstick, which is the object image, is linked beforehand to a program to display an ignition animation indicating that the matchstick has ignited on the display device 3. Then, when the operator in the mirrored moving image behaves as if he/she struck the image of the match within the combined image, the ignition animation is designed to appear in the ignition part of the image of the matchstick. The image of the flame is displayed when the operator strikes the image of the matchstick.
  • The image of the flame can be generated using a technique of, for example, recursive texture drawing.
  • The “recursive texture drawing” refers to a drawing technique of referencing an image of an object rendered by texture mapping as texture of another image and carrying out texture mapping recursively. “Texture mapping” is a technique of rendering an image of an object to enhance the texture of the image by pasting bitmap data of the texture to the surface of the object, and can be implemented by also using the frame memory 15 as a texture memory. When carrying out such recursive texture drawing, Gouraud shading is applied to a polygon on which the texture is drawn; that is, the brightness at the vertices of the polygon is calculated, and the brightness inside the polygon is calculated by interpolating the brightness of each vertex (this technique is called “Gouraud shading drawing”).
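The vertex-brightness interpolation at the heart of Gouraud shading can be illustrated with barycentric coordinates over a single triangle. This is a general sketch of the technique, not the device's GPU implementation; the function name and parameterization are assumptions:

```python
def gouraud_brightness(p, v0, v1, v2, b0, b1, b2):
    """Brightness at point p inside triangle (v0, v1, v2), interpolated
    from the brightness values b0, b1, b2 computed at the vertices."""
    (x, y), (x0, y0), (x1, y1), (x2, y2) = p, v0, v1, v2
    # Barycentric weights of p with respect to the triangle's vertices.
    denom = (y1 - y2) * (x0 - x2) + (x2 - x1) * (y0 - y2)
    w0 = ((y1 - y2) * (x - x2) + (x2 - x1) * (y - y2)) / denom
    w1 = ((y2 - y0) * (x - x2) + (x0 - x2) * (y - y2)) / denom
    w2 = 1.0 - w0 - w1
    # Interior brightness is the weighted mix of the vertex brightness.
    return w0 * b0 + w1 * b1 + w2 * b2
```

At a vertex the weights collapse to that vertex's brightness; at the centroid the result is the average of the three vertex values, giving the smooth gradient characteristic of Gouraud shading.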
  • To express the flame image, the positions of vertices of a mesh which is the source of the flame image are shifted using random numbers as shown in FIG. 10 and the positions of new vertices are determined. The brightness at the vertices is also determined based on random numbers. The positions of the vertices and brightness at the vertices are determined every time the frame is updated. Every unit of the mesh which is the source of the flame image becomes a polygon.
  • On each polygon, the image that becomes the basis of the flame drawn in the frame memory 15 is formed through the above-described recursive texture drawing, and the above-described Gouraud shading is applied based on the brightness at each vertex of the polygon. This makes it possible to express the rising air current caused by the flame, its shimmering, and its attenuation in a more realistic way.
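The per-frame randomization of the flame-source mesh described above can be sketched as follows. This is an illustrative model (names, amplitude parameter, and brightness range are assumptions, not the patent's values):

```python
import random

def jitter_mesh(vertices, amplitude, seed=None):
    """Shift each mesh vertex position by a random offset and assign a
    random brightness, as done once per frame for the flame-source mesh."""
    rng = random.Random(seed)  # seedable for reproducibility in this sketch
    out = []
    for (x, y) in vertices:
        nx = x + rng.uniform(-amplitude, amplitude)
        ny = y + rng.uniform(-amplitude, amplitude)
        brightness = rng.uniform(0.0, 1.0)
        out.append((nx, ny, brightness))
    return out
```

Calling this on every frame update, then drawing each mesh cell as a Gouraud-shaded polygon, produces the flickering appearance of the flame.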
  • Suppose the image processor 2 shows a combined image with the image of a matchstick superimposed on the mirrored moving image of the operator on the display device 3. Here, suppose the target is the operator's hand. By detecting the amount of movement of the hand in the area in which the image of the matchstick is displayed, the program linked to the image of the matchstick is executed and the ignition animation is displayed on the display device 3.
  • FIG. 8 shows the processing procedure using the image processor 2 to realize such an operation.
  • When the mirrored moving image is updated to the image of the next frame and the combined image generated by the superimposing image generator 106 is thereby updated (step S201), the difference value detector 107 compares the image features of the mirrored moving images included in the combined images before and after the update, calculates a difference value of the image in the ignition section of the image of the matchstick, and generates a difference image of the ignition section of the image of the matchstick (step S202). The difference value calculated here quantitatively expresses the movement of the hand in the ignition section of the image of the matchstick. The difference image generated is made up of the images of the hand before and after the hand moves in the ignition section of the image of the matchstick.
  • The calculated difference value is recorded in the main memory 11 and cumulatively added for a certain period of time (step S203).
  • The difference value detector 107 sends the difference image and the cumulative sum of the difference values to the object controller 105.
  • The object controller 105 determines the color of the difference image according to the cumulative sum received from the difference value detector 107 and generates a flame image based on this difference image (step S204). The flame image is generated, for example, by dividing the difference image into meshes and applying the aforementioned recursive texture drawing based on these meshes. The color of the flame image is determined according to the color of the difference image. The flame image generated is superimposed on the ignition section of the image of the matchstick.
  • In this way, the flame image with the color according to the amount of movement of the hand added is displayed in the area showing the movement of the hand in the ignition section of the image of the matchstick.
  • Determining the color of the flame image according to the cumulative sum of difference values makes it possible, for example, to express how the color of the flame image displayed in the ignition section of the matchstick gradually changes according to the amount of movement of the hand.
  • Then, the object controller 105 compares the value indicating the color of the flame image with a predetermined threshold (step S205). For example, if the color of the flame image is expressed by R, G and B values, the sum of their respective values can be used.
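The color comparison of step S205 can be sketched with the R+G+B sum mentioned above. A minimal illustration (the threshold value and sample colors are hypothetical):

```python
def ignition_ready(rgb, threshold):
    """Decide whether to start the ignition animation by comparing the
    sum of the flame image's R, G and B values with a threshold."""
    r, g, b = rgb
    return (r + g + b) >= threshold

# A bright yellow flame (255 + 220 + 40 = 515) clears a threshold of 500,
# while a dark red one (120 + 20 + 10 = 150) does not.
```

As the flame color brightens with the accumulated hand movement, this check eventually succeeds and triggers step S206.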
  • When the value indicating the color is equal to or greater than the threshold (step S205: Y), the object controller 105 determines to execute the program that displays the ignition animation indicating that the match has ignited (step S206).
  • That is, whether or not to start the ignition animation is determined according to the color of the flame image. For example, when the color of the flame image changes from red to yellow according to the amount of movement of the hand, the ignition animation starts when the flame image turns yellow. The operator can thus know roughly how much additional movement of the hand is required to start the ignition animation.
  • The superimposing image generator 106 generates a combined image superimposing the ignition animation on the object image including the matchstick image and flame image, on the mirrored moving image obtained from the video camera 1 (step S207). The ignition animation is displayed in the ignition section of the matchstick image.
  • When the value indicating the color is smaller than the threshold (step S205: N), the object controller 105 sends the object image superimposing the flame image on the matchstick image to the superimposing image generator 106. The superimposing image generator 106 generates a combined image by superimposing this object image on the mirrored moving image obtained from the video camera 1 (step S208).
  • Then, if, for example, an instruction for finishing the processing is received from the operation device 35, the processing is finished (step S209: Y). If no instruction for finishing the processing is received (step S209: N), the display controller 108 displays the combined image generated in step S207 or step S208 on the display device 3, and the routine returns to step S201.
  • As shown above, the system executes the process of determining whether or not to execute the program for displaying the ignition animation linked to the matchstick image according to how much the operator moves his/her hand in the ignition section of the matchstick image.
  • Since the operator can perform operations for executing various events while watching the mirrored moving image, it is possible to perform input operations for executing processes more easily than conventional operations using input devices such as a keyboard and mouse.
  • Embodiment 3
  • Another embodiment will now be explained. As a premise, suppose the image processor 2 shows a combined image with a cursor (pointer) image, which is an example of an object image, superimposed on the mirrored moving image of the operator on the display device 3 as shown in FIG. 13(a). Also suppose a plurality of targets, such as the hands, eyes and mouth of the operator, are included in the mirrored moving image.
  • Here, a case will be explained where, focusing on the movement of the operator's hand from among this plurality of targets, the cursor image is made to follow the movement of the hand.
  • As shown in FIG. 13(a), the cursor image is an image like a face with an emphasis put on the eyes, which allows the eyes to be oriented toward the target. Furthermore, the cursor image moves following the movement of the target. That is, when the cursor image is distant from the target, the cursor image moves toward the target and when the cursor image catches the target, the cursor image follows the movement of the target.
  • FIG. 11 and FIG. 12 show the processing procedure using the image processor 2 to realize such an operation.
  • According to FIG. 11, when the mirrored moving image is updated to the image of the next frame and the combined image generated by the superimposing image generator 106 is thereby updated (step S301), the difference value detector 107 compares image features of the mirrored moving image included in the combined images before and after the updating and calculates the difference value thereof (step S302). The difference value calculated here is a value quantifying the movements of the hands, eyes, mouth, etc. of the operator, which become candidates of the target in the mirrored moving image.
  • The difference value detector 107 sends the difference value of each target to the object controller 105.
  • The object controller 105 detects one target based on the difference value of each target sent from the difference value detector 107 (step S303). For example, the object controller 105 detects a target whose difference value reaches a maximum. In this example, suppose the operator's hand is detected as the target.
  • Upon detecting the target, the object controller 105 determines how the cursor image is displayed according to the target.
  • First, the object controller 105 determines whether the target in the combined image updated in step S301 is outside the cursor image or not (step S304). If the target is within the cursor image (step S304: N), the object controller 105 determines that the cursor image has caught the target (step S308).
  • If the target is outside the cursor image (step S304: Y), the object controller 105 determines that the cursor image has not caught the target and carries out processing for determining how the cursor image is displayed. That is, the object controller 105 generates a cursor image so that the eyes in the cursor image are oriented toward the target (step S305).
  • Furthermore, the object controller 105 determines the speed at which the cursor image moves toward the target according to the distance between the cursor image and the target (step S306). This speed is adjusted to increase as the distance increases, so that the farther the cursor image is from the target, the faster the cursor moves toward it.
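The distance-proportional chasing behavior of step S306 can be sketched as a simple per-frame update. This is an illustrative model; the proportionality constant `gain` and the function names are assumptions, not values from the patent:

```python
import math

def cursor_step(cursor, target, gain=0.2):
    """Advance the cursor one frame toward the target at a speed
    proportional to the remaining distance (faster when farther away)."""
    dx, dy = target[0] - cursor[0], target[1] - cursor[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return cursor  # already caught the target
    speed = gain * dist            # speed grows with distance
    step = min(speed, dist)        # never overshoot the target
    return (cursor[0] + dx / dist * step,
            cursor[1] + dy / dist * step)
```

Iterating this update each frame reproduces the described effect: large jumps while the cursor is far from the hand, slowing to a smooth pursuit as it closes in.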
  • The superimposing image generator 106 superimposes such a cursor image on the mirrored moving image of the next frame and thereby generates a combined image as shown in FIG. 13(a) (step S307). Then, the routine goes back to step S301 and performs the same operation for the combined image generated.
  • The routine carries out the operations of step S301 to S307 until the cursor image catches the target, that is, until it is determined in step S304 that the target is within the cursor image.
  • Such operations can provide an image as shown in FIG. 13(a) in which the eyes in the cursor image are oriented toward the target (hand) and the cursor image chases after the target.
  • Then, according to FIG. 12, when the cursor image catches the target, the difference value detector 107 stores the image of the target at that time as a template image (step S309). For example, the difference value detector 107 stores the section of the mirrored moving image that overlaps with the cursor image as the template image.
  • Then, the difference value detector 107 acquires the mirrored moving image of the next frame from the image inverter 102 (step S310). The difference value detector 107 searches for the position of an image that matches the stored template image from among the acquired mirrored moving images (step S311).
  • More specifically, the difference value detector 107 divides the acquired mirrored moving image into areas of the same size as the template image and searches for an image in the area most resembling the template image from among the images in the respective divided areas. Upon detecting the matched image as a result of the search, the difference value detector 107 reports the position of the detected image to the object controller 105.
  • The object controller 105 determines the position reported from the difference value detector 107 as the position of the cursor image for the next combined image (step S312).
  • The superimposing image generator 106 superimposes the cursor image at the position determined in step S312 by the object controller 105 on the same mirrored moving image as the mirrored moving image acquired in step S310 by the difference value detector 107 and thereby generates a combined image as shown in FIG. 13(b) (step S313). Then, the frame is updated and the display controller 108 displays the combined image generated on the display device 3 (step S314).
  • Repeating the above-described operations after the target is caught (step S309 to step S314) obtains an image in which the cursor image follows the target. That is, when the cursor image catches the target (hand) as shown in FIG. 13(b), the cursor image is displayed thereafter following the target wherever the target moves. Even when the operator extends the hand as shown in FIG. 13(b) to FIG. 13(c), the cursor image is displayed at the tip of the extended hand of the operator together with the movement of the hand recognized as the target.
  • Use of the cursor image allows the operator to know at a glance which part of his/her own body is functioning as the cursor when selecting a process from the menu image as shown in Embodiment 1, for example.
  • Furthermore, if, for example, the trace of the movement of the cursor image is set to be kept and displayed, it is possible to show the trace of the movement of the target on the display device 3. This makes it possible to show, for example, pictures and characters, etc. drawn in the space on the display device 3.
  • As is clear from the foregoing explanations, when the operator needs to enter or select data, the present invention allows the operator to do so easily using the mirrored moving image while watching the combined image displayed on the display device, and can thereby provide a user-friendly input interface that requires no special familiarization.
  • Various embodiments and changes may be made thereunto without departing from the broad spirit and scope of the invention. The above-described embodiments are intended to illustrate the present invention, not to limit its scope. The scope of the present invention is shown by the attached claims rather than by the embodiments. Modifications made within the meaning of equivalents of the claims and within the claims are to be regarded as being within the scope of the present invention.

Claims (10)

  1. An image processor, comprising:
    a memory operable to store in real time a series of images of a location captured by an imaging device over time, the captured series of images at least partially including a target that is subject to move within the captured series of images from one point in the time to another point in the time;
    a detector operable to detect the target and a quantitative value of a movement component thereof by detecting differences between features of the captured series of images at a first point in the time and at a second point in the time;
    an image generator operable to generate an object image representing a predetermined object so that the object image follows a movement of the detected target and includes an image representing a trace of the movement, the image generator being further operable to generate a combined image from the object image and an image from the captured series of images; and
    an output operable to output in real time a signal representing the combined image at the location captured in the series of images.
  2. The image processor according to claim 1, wherein the combined image includes a mirrored image of the target.
  3. The image processor according to claim 1, wherein the output is operable to output the signal representing the combined image to a predetermined display device.
  4. The image processor according to claim 1, wherein the target is a first target, the captured series of images at least partially includes a plurality of targets including the first target, each of the plurality of targets being subject to move within the captured series of images from one point in the time to another point in the time, and the detector is operable to detect the quantitative value of the movement component of each of the plurality of targets and to detect a particular one of the plurality of targets based on the detected quantitative values of the movement components of the plurality of targets such that the object image follows the movement of the particular target, and the image representing the trace of the movement represents the trace of the movement of the particular target.
  5. The image processor according to claim 1, wherein the object image is associated with predetermined processing and the image generator is further operable to perform the predetermined processing when the detected quantitative value of the movement component satisfies a predetermined condition.
  6. The image processor according to claim 1, wherein the detected quantitative value of the movement component includes a rate of movement of the target.
  7. The image processor according to claim 1, wherein the detected quantitative value of the movement component includes a cumulative amount of movement of the target.
  8. An image processing method, comprising:
    storing in real time a series of images of a location captured by an imaging device over time, the captured series of images at least partially including a target that is subject to move within the captured series of images from one point in the time to another point in the time;
    detecting the target and a quantitative value of a movement component thereof by detecting differences between features of the captured series of images at a first point in the time and at a second point in the time;
    generating an object image representing a predetermined object so that the object image follows a movement of the detected target and includes an image representing a trace of the movement;
    generating a combined image from the object image and an image from the captured series of images; and
    outputting in real time a signal representing the combined image at the location captured in the series of images.
  9. A computer-readable recording medium having instructions recorded thereon, the instructions being executable by a computer or image processing system to perform a method, the method comprising:
    storing in real time a series of images of a location captured by an imaging device over time, the captured series of images at least partially including a target that is subject to move within the captured series of images from one point in the time to another point in the time;
    detecting the target and a quantitative value of a movement component thereof by detecting differences between features of the captured series of images at a first point in the time and at a second point in the time;
    generating an object image representing a predetermined object so that the object image follows a movement of the detected target and includes an image representing a trace of the movement;
    generating a combined image from the object image and an image from the captured series of images; and
    outputting in real time a signal representing the combined image at the location captured in the series of images.
  10. A system operable to process an image, comprising:
    one or more semiconductor devices, the one or more semiconductor devices including:
    a memory operable to store in real time a series of images of a location captured by an imaging device over time, the captured series of images at least partially including a target that is subject to move within the captured series of images from one point in the time to another point in the time,
    a detector operable to detect the target and a quantitative value of a movement component thereof by detecting differences between features of the captured series of images at a first point in the time and at a second point in the time,
    an image generator operable to generate an object image representing a predetermined object so that the object image follows a movement of the detected target and includes an image representing a trace of the movement, the image generator being further operable to generate a combined image from the object image and an image from the captured series of images, and
    an output operable to output in real time a signal representing the combined image at the location captured in the series of images.
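For readers outside patent practice, the claimed pipeline amounts to: difference successive captured frames to locate the moving target, accumulate its motion as a trace (the "quantitative value of a movement component" of claims 6 and 7), and composite the trace back onto a captured frame. The following NumPy sketch is only an illustration under simplifying assumptions — the fixed difference threshold, the centroid-based detection, and all function names are invented here, not taken from the claimed implementation:

```python
import numpy as np

def detect_target(prev_frame, curr_frame, threshold=30):
    """Locate the moving target as the centroid of pixels whose
    intensity changed between two frames (simple frame differencing)."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None  # no movement detected between these two frames
    return (int(xs.mean()), int(ys.mean()))

def track_series(frames):
    """Walk a captured series of frames, collect the target's trace,
    and accumulate the total distance moved (cf. claims 6-7)."""
    trace, cumulative, prev_pos = [], 0.0, None
    for prev, curr in zip(frames, frames[1:]):
        pos = detect_target(prev, curr)
        if pos is None:
            continue
        if prev_pos is not None:
            cumulative += float(np.hypot(pos[0] - prev_pos[0],
                                         pos[1] - prev_pos[1]))
        trace.append(pos)
        prev_pos = pos
    return trace, cumulative

def combine(frame, trace, value=255):
    """Overlay the trace (the generated object image) on a copy of the
    latest captured frame to form the combined output image."""
    out = frame.copy()
    for x, y in trace:
        out[y, x] = value
    return out
```

In a real-time system these three steps would run per incoming frame; the sketch processes a stored series at once purely for clarity.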
US11522775 2000-10-06 2006-09-18 Image processor, image processing method, recording medium, computer program and semiconductor device Abandoned US20070013718A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
JP2000-307574 2000-10-06
JP2000307574 2000-10-06
JP2001-295098 2001-09-26
JP2001295098A JP3725460B2 (en) 2000-10-06 2001-09-26 Image processing apparatus, image processing method, recording medium, computer program, and semiconductor device
US09971962 US6771277B2 (en) 2000-10-06 2001-10-05 Image processor, image processing method, recording medium, computer program and semiconductor device
US10872917 US7176945B2 (en) 2000-10-06 2004-06-21 Image processor, image processing method, recording medium, computer program and semiconductor device
US11522775 US20070013718A1 (en) 2000-10-06 2006-09-18 Image processor, image processing method, recording medium, computer program and semiconductor device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11522775 US20070013718A1 (en) 2000-10-06 2006-09-18 Image processor, image processing method, recording medium, computer program and semiconductor device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10872917 Division US7176945B2 (en) 2000-10-06 2004-06-21 Image processor, image processing method, recording medium, computer program and semiconductor device

Publications (1)

Publication Number Publication Date
US20070013718A1 (en) 2007-01-18

Family

ID=26601678

Family Applications (3)

Application Number Title Priority Date Filing Date
US09971962 Active 2022-08-29 US6771277B2 (en) 2000-10-06 2001-10-05 Image processor, image processing method, recording medium, computer program and semiconductor device
US10872917 Active 2021-11-19 US7176945B2 (en) 2000-10-06 2004-06-21 Image processor, image processing method, recording medium, computer program and semiconductor device
US11522775 Abandoned US20070013718A1 (en) 2000-10-06 2006-09-18 Image processor, image processing method, recording medium, computer program and semiconductor device

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US09971962 Active 2022-08-29 US6771277B2 (en) 2000-10-06 2001-10-05 Image processor, image processing method, recording medium, computer program and semiconductor device
US10872917 Active 2021-11-19 US7176945B2 (en) 2000-10-06 2004-06-21 Image processor, image processing method, recording medium, computer program and semiconductor device

Country Status (6)

Country Link
US (3) US6771277B2 (en)
EP (1) EP1324269B1 (en)
JP (1) JP3725460B2 (en)
CN (1) CN1279761C (en)
CA (1) CA2392725A1 (en)
WO (1) WO2002031773A1 (en)

Cited By (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070109600A1 (en) * 2005-11-15 2007-05-17 Lexmark International, Inc. Printer optimization method and system
US20090060275A1 (en) * 2007-08-30 2009-03-05 Casio Computer Co., Ltd. Moving body image extraction apparatus and computer readable storage medium storing program
US20090096714A1 (en) * 2006-03-31 2009-04-16 Brother Kogyo Kabushiki Kaisha Image display device
US20100081507A1 (en) * 2008-10-01 2010-04-01 Microsoft Corporation Adaptation for Alternate Gaming Input Devices
US20100194872A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Body scan
US20100199221A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Navigation of a virtual plane using depth
US20100194741A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Depth map movement tracking via optical flow and velocity prediction
US20100231512A1 (en) * 2009-03-16 2010-09-16 Microsoft Corporation Adaptive cursor sizing
US20100238182A1 (en) * 2009-03-20 2010-09-23 Microsoft Corporation Chaining animations
US20100241998A1 (en) * 2009-03-20 2010-09-23 Microsoft Corporation Virtual object manipulation
US20100281437A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Managing virtual ports
US20100277489A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Determine intended motions
US20100278384A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Human body pose estimation
US20100277470A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Systems And Methods For Applying Model Tracking To Motion Capture
US20100281432A1 (en) * 2009-05-01 2010-11-04 Kevin Geisner Show body position
US20100278431A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Systems And Methods For Detecting A Tilt Angle From A Depth Image
US20100281436A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Binding users to a gesture based system and providing feedback to the users
US20100281438A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Altering a view perspective within a display environment
US20100295771A1 (en) * 2009-05-20 2010-11-25 Microsoft Corporation Control of display objects
US20100306715A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Gestures Beyond Skeletal
US20100302365A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Depth Image Noise Reduction
US20100303302A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Systems And Methods For Estimating An Occluded Body Part
US20100302247A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Target digitization, extraction, and tracking
US20100306685A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation User movement feedback via on-screen avatars
US20100303290A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Systems And Methods For Tracking A Model
US20100306261A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Localized Gesture Aggregation
US20100306712A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Gesture Coach
US20100302257A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Systems and Methods For Applying Animations or Motions to a Character
US20100302395A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Environment And/Or Target Segmentation
US20100306710A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Living cursor control mechanics
US20100302138A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Methods and systems for defining or modifying a visual representation
US20100303289A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Device for identifying and tracking multiple humans over time
US20100304813A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Protocol And Format For Communicating An Image From A Camera To A Computing Environment
US20100306716A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Extending standard gestures
US20100306713A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Gesture Tool
US20100311280A1 (en) * 2009-06-03 2010-12-09 Microsoft Corporation Dual-barrel, connector jack and plug assemblies
US20110007142A1 (en) * 2009-07-09 2011-01-13 Microsoft Corporation Visual representation expression based on player expression
US20110018901A1 (en) * 2009-07-27 2011-01-27 Disney Enterprises Inc. System and method for forming a composite image in a portable computing device having a dual screen display
US20110025689A1 (en) * 2009-07-29 2011-02-03 Microsoft Corporation Auto-Generating A Visual Representation
US20110055846A1 (en) * 2009-08-31 2011-03-03 Microsoft Corporation Techniques for using human gestures to control gesture unaware programs
US20110109617A1 (en) * 2009-11-12 2011-05-12 Microsoft Corporation Visualizing Depth
US20110221768A1 (en) * 2010-03-10 2011-09-15 Sony Corporation Image processing apparatus, image processing method, and program
WO2011115572A1 (en) * 2010-03-19 2011-09-22 Xyz Wave Pte Ltd An apparatus for enabling control of content on a display device using at least one gesture, consequent methods enabled by the apparatus and applications of the apparatus
US20120162476A1 (en) * 2010-12-28 2012-06-28 Casio Computer Co., Ltd. Image capturing apparatus, image capturing control method and storage medium for capturing a subject to be recorded with intended timing
US20130044131A1 (en) * 2011-08-15 2013-02-21 Moheb Milad Software controller for audio mixer equipment
US8509479B2 (en) 2009-05-29 2013-08-13 Microsoft Corporation Virtual object
US8620113B2 (en) 2011-04-25 2013-12-31 Microsoft Corporation Laser diode modes
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
US8638985B2 (en) 2009-05-01 2014-01-28 Microsoft Corporation Human body pose estimation
US8649554B2 (en) 2009-05-01 2014-02-11 Microsoft Corporation Method to control perspective for a camera-controlled computer
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction
US8942428B2 (en) 2009-05-01 2015-01-27 Microsoft Corporation Isolate extraneous motions
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US8959541B2 (en) 2012-05-04 2015-02-17 Microsoft Technology Licensing, Llc Determining a future portion of a currently presented media program
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US9159151B2 (en) 2009-07-13 2015-10-13 Microsoft Technology Licensing, Llc Bringing a visual representation to life via learned input from the user
US9400559B2 (en) 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
US9465980B2 (en) 2009-01-30 2016-10-11 Microsoft Technology Licensing, Llc Pose tracking pipeline
US9609236B2 (en) 2013-09-16 2017-03-28 Kyle L. Baltz Camera and image processing method
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking

Families Citing this family (101)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8956228B2 (en) * 1999-12-03 2015-02-17 Nike, Inc. Game pod
JP4615252B2 (en) * 2000-10-06 2011-01-19 Sony Computer Entertainment Inc. Image processing apparatus, image processing method, recording medium, computer program, and semiconductor device
WO2004007034A9 (en) * 2002-07-12 2004-04-29 Awaba Group Pty Ltd A dance training device
JP2006504504A (en) 2002-10-30 2006-02-09 Nike Inc Target to be used in an interactive activity device
US8206219B2 (en) 2002-10-30 2012-06-26 Nike, Inc. Interactive gaming apparel for interactive gaming
JP3854229B2 (en) * 2003-01-07 2006-12-06 Toshiba Corporation Image processing apparatus
DE20300882U1 (en) * 2003-01-21 2003-03-13 Fraunhofer Ges Forschung Apparatus for the interactive control of a cursor of a graphical user interface
GB2398691B (en) * 2003-02-21 2006-05-31 Sony Comp Entertainment Europe Control of data processing
GB2398690B (en) * 2003-02-21 2006-05-10 Sony Comp Entertainment Europe Control of data processing
EP1625716B2 (en) 2003-05-06 2014-04-09 Apple Inc. Method of modifying a message, store-and-forward network system and data messaging system
US7982751B2 (en) * 2003-07-11 2011-07-19 The University Of North Carolina Methods and systems for controlling a computer using a video image and for combining the video image with a computer desktop
US7495343B1 (en) * 2003-07-31 2009-02-24 Nvidia Corporation Pad over active circuit system and method with frame support structure
US7453158B2 (en) * 2003-07-31 2008-11-18 Nvidia Corporation Pad over active circuit system and method with meshed support structure
GB0321337D0 (en) * 2003-09-11 2003-10-15 Massone Mobile Advertising Sys Method and system for distributing advertisements
JP4824409B2 (en) * 2004-01-06 2011-11-30 Sony Computer Entertainment Inc. Information processing system, entertainment system, and input accepting method for an information processing system
JP3847753B2 2004-01-30 2006-11-22 Sony Computer Entertainment Inc. Image processing apparatus, image processing method, recording medium, computer program, and semiconductor device
GB2415639B (en) * 2004-06-29 2008-09-17 Sony Comp Entertainment Europe Control of data processing
JP4005061B2 2004-06-30 2007-11-07 Sony Computer Entertainment Inc. Information processing apparatus, program, and object control method in information processing apparatus
JP4005060B2 2004-06-30 2007-11-07 Sony Computer Entertainment Inc. Information processing system, program, and game character movement control method
US20060019746A1 (en) * 2004-07-22 2006-01-26 Atlantic City Coin & Slot Service Company, Inc Gaming device utilizing player image
JP4433948B2 2004-09-02 2010-03-17 Sega Corporation Background image acquisition program, video game device, background image acquisition method, and computer-readable recording medium recording the program
JP4419768B2 2004-09-21 2010-02-24 Victor Company of Japan, Ltd. Control device for electronic equipment
JP4717445B2 (en) * 2005-01-06 2011-07-06 Bandai Namco Games Inc. Image processing system, image processing apparatus, game device, program, information storage medium, and image processing method
CN100412908C 2005-03-07 2008-08-20 Tencent Technology (Shenzhen) Co., Ltd. Merged display method for multiple animation files
WO2006098255A1 (en) * 2005-03-15 2006-09-21 Shunsuke Nakamura Image display method and device thereof
JP4583981B2 (en) * 2005-03-16 2010-11-17 Ricoh Co., Ltd. Image processing apparatus
US7679689B2 (en) 2005-05-16 2010-03-16 Victor Company Of Japan, Limited Electronic appliance
CN100525467C 2005-06-14 2009-08-05 Beijing Zhongxing Microelectronics Co., Ltd. Mirror processing method for YUY2 image
JP4861699B2 (en) * 2005-06-29 2012-01-25 Konami Digital Entertainment Co., Ltd. Network game system, control method of network game system, game device, control method of game device, and program
JP2007072564A (en) * 2005-09-05 2007-03-22 Sony Computer Entertainment Inc Multimedia reproduction apparatus, menu operation reception method, and computer program
JP2007087100A (en) 2005-09-22 2007-04-05 Victor Co Of Japan Ltd Electronic device system
US7877387B2 (en) 2005-09-30 2011-01-25 Strands, Inc. Systems and methods for promotional media item selection and promotional program unit generation
US8549442B2 (en) * 2005-12-12 2013-10-01 Sony Computer Entertainment Inc. Voice and video control of interactive electronically simulated environment
JP4569555B2 (en) * 2005-12-14 2010-10-27 Victor Company of Japan, Ltd. Electronic device
JP4742976B2 (en) 2006-05-12 2011-08-10 Fuji Xerox Co., Ltd. Remote instruction system, remote instruction method, and program
GB2438449C (en) 2006-05-24 2018-05-30 Sony Computer Entertainment Europe Ltd Control of data processing
KR100801087B1 (en) 2006-07-05 2008-02-11 삼성전자주식회사 System and method for sensing moving body using structured light, mobile robot including the system
JP4707034B2 (en) 2006-07-07 2011-06-22 株式会社ソニー・コンピュータエンタテインメント Image processing method, an input interface device
JP4650381B2 (en) * 2006-09-08 2011-03-16 Victor Company of Japan, Ltd. Electronic device
US8144121B2 (en) * 2006-10-11 2012-03-27 Victor Company Of Japan, Limited Method and apparatus for controlling electronic appliance
JP2008146243A (en) * 2006-12-07 2008-06-26 Toshiba Corp Information processor, information processing method and program
JP4720738B2 (en) * 2006-12-20 2011-07-13 Victor Company of Japan, Ltd. Electronic device
GB0704837D0 (en) 2007-03-07 2007-04-18 Cvon Innovations Ltd System and method for ranking search results
KR20080088802A (en) * 2007-03-30 2008-10-06 삼성전자주식회사 Method for providing gui including pointer moving at a variable velocity and video apparatus thereof
GB2441399B (en) 2007-04-03 2009-02-18 Cvon Innovations Ltd Network invitation arrangement and method
KR101328950B1 (en) 2007-04-24 2013-11-13 엘지전자 주식회사 Image display method and image communication terminal capable of implementing the same
US8671000B2 (en) 2007-04-24 2014-03-11 Apple Inc. Method and arrangement for providing content to multimedia devices
US8726194B2 (en) * 2007-07-27 2014-05-13 Qualcomm Incorporated Item selection using enhanced control
JP5559691B2 (en) * 2007-09-24 2014-07-23 Qualcomm Incorporated Enhanced interface for voice and video communications
US20110199635A1 (en) * 2008-02-08 2011-08-18 I-Jong Lin Printing Method Using Multiple Processors
JP5116514B2 (en) * 2008-03-11 2013-01-09 キヤノン株式会社 Imaging apparatus and a display control method
US8073203B2 (en) * 2008-04-15 2011-12-06 Cyberlink Corp. Generating effects in a webcam application
JP2009265709A (en) * 2008-04-22 2009-11-12 Hitachi Ltd Input device
US8514251B2 (en) * 2008-06-23 2013-08-20 Qualcomm Incorporated Enhanced character input using recognized gestures
KR20100039017A (en) * 2008-10-07 2010-04-15 한국전자통신연구원 Remote control apparatus using menu markup language
US20100091085A1 (en) * 2008-10-15 2010-04-15 Sony Corporation And Sony Electronics Inc. Augmenting tv menu icon with images in front of tv
JP2010142592A (en) 2008-12-22 2010-07-01 Intelligent Systems Co Ltd Game program and game device
JP2010176510A (en) * 2009-01-30 2010-08-12 Sanyo Electric Co Ltd Information display device
US8732623B2 (en) * 2009-02-17 2014-05-20 Microsoft Corporation Web cam based user interaction
JP5635736B2 (en) 2009-02-19 2014-12-03 Sony Computer Entertainment Inc. Information processing apparatus and information processing method
JP4767331B2 (en) * 2009-03-10 2011-09-07 Sony Computer Entertainment Inc. Image processing apparatus, image processing method, recording medium, computer program, and semiconductor device
US8314832B2 (en) * 2009-04-01 2012-11-20 Microsoft Corporation Systems and methods for generating stereoscopic images
US8194101B1 (en) * 2009-04-01 2012-06-05 Microsoft Corporation Dynamic perspective video window
US20100295782 (en) 2009-05-21 2010-11-25 Yehuda Binder System and method for control based on face or hand gesture detection
US8112719B2 (en) * 2009-05-26 2012-02-07 Topseed Technology Corp. Method for controlling gesture-based remote control system
EP2256590A1 (en) * 2009-05-26 2010-12-01 Topspeed Technology Corp. Method for controlling gesture-based remote control system
JP2010277197A (en) * 2009-05-26 2010-12-09 Sony Corp Information processing device, information processing method, and program
US20110010497A1 (en) * 2009-07-09 2011-01-13 Sandisk Il Ltd. A storage device receiving commands and data regardless of a host
GB2471905B (en) 2009-07-17 2011-08-31 Sony Comp Entertainment Europe User interface and method of user interaction
JP2010003303A (en) * 2009-07-21 2010-01-07 Victor Co Of Japan Ltd Controller for electronic equipment
JP5343773B2 (en) * 2009-09-04 2013-11-13 Sony Corporation Information processing apparatus, display control method, and display control program
US9633476B1 (en) * 2009-10-29 2017-04-25 Intuit Inc. Method and apparatus for using augmented reality for business graphics
US9146669B2 (en) * 2009-12-29 2015-09-29 Bizmodeline Co., Ltd. Password processing method and apparatus
JP4794678B1 (en) * 2010-05-24 2011-10-19 Sony Computer Entertainment Inc. Image processing apparatus, image processing method, and video communication system
US9367847B2 (en) 2010-05-28 2016-06-14 Apple Inc. Presenting content packages based on audience retargeting
EP2400379A1 (en) * 2010-06-23 2011-12-28 MFA Informatik AG Graphical control of a computer by a user
EP2421251A1 (en) * 2010-08-17 2012-02-22 LG Electronics Display device and control method thereof
CN102645970B (en) * 2011-02-22 2015-10-28 Hongfujin Precision Industry (Shenzhen) Co., Ltd. Trigger motion vector control method and electronic device using the same
WO2012123033A1 (en) * 2011-03-17 2012-09-20 Ssi Schaefer Noell Gmbh Lager Und Systemtechnik Controlling and monitoring a storage and order-picking system by means of movement and speech
JP5585505B2 (en) * 2011-03-17 2014-09-10 Seiko Epson Corporation Image supply device, image display system, control method of image supply device, image display device, and program
US8928589B2 (en) * 2011-04-20 2015-01-06 Qualcomm Incorporated Virtual keyboards and methods of providing the same
US8873841B2 (en) * 2011-04-21 2014-10-28 Nokia Corporation Methods and apparatuses for facilitating gesture recognition
US9329673B2 (en) 2011-04-28 2016-05-03 Nec Solution Innovators, Ltd. Information processing device, information processing method, and recording medium
CN103562822A (en) 2011-04-28 2014-02-05 Nec软件系统科技有限公司 Information processing device, information processing method, and recording medium
US9727132B2 (en) * 2011-07-01 2017-08-08 Microsoft Technology Licensing, Llc Multi-visor: managing applications in augmented reality environments
JP2013080413A (en) * 2011-10-05 2013-05-02 Sony Corp Input apparatus and input recognition method
US9043766B2 (en) * 2011-12-16 2015-05-26 Facebook, Inc. Language translation using preprocessor macros
KR101410410B1 (en) * 2011-12-21 2014-06-27 주식회사 케이티 Bodily sensation type learning apparatus and method
JP5567606B2 (en) * 2012-01-31 2014-08-06 Toshiba Tec Corporation Information processing apparatus and program
GB2501925B (en) * 2012-05-11 2015-04-29 Sony Comp Entertainment Europe Method and system for augmented reality
CN104508599A (en) 2012-07-13 2015-04-08 株式会社果汁 Element selection device, element selection method, and program
JP5689103B2 (en) * 2012-11-07 2015-03-25 Nintendo Co., Ltd. Game program, game system, game device, and game control method
CN104871525A (en) * 2012-12-26 2015-08-26 索尼公司 Image processing device, and image processing method and program
JP6048189B2 (en) * 2013-02-08 2016-12-21 Ricoh Co., Ltd. Projection system, image generating program, information processing apparatus, and image generating method
US9873038B2 (en) 2013-06-14 2018-01-23 Intercontinental Great Brands Llc Interactive electronic games based on chewing motion
CN103428551A (en) * 2013-08-24 2013-12-04 渭南高新区金石为开咨询有限公司 Gesture remote control system
US9990034B2 (en) * 2013-11-15 2018-06-05 Lg Electronics Inc. Transparent display device and control method therefor
KR20150110032A (en) * 2014-03-24 2015-10-02 삼성전자주식회사 Electronic Apparatus and Method for Image Data Processing
JP5979450B2 (en) * 2014-07-28 2016-08-24 Class Meister Co., Ltd. Control program for game device
US9977565B2 (en) 2015-02-09 2018-05-22 Leapfrog Enterprises, Inc. Interactive educational system with light emitting controller
CN104680477A (en) * 2015-03-04 2015-06-03 江西科技学院 Image mirror algorithm

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5261041A (en) * 1990-12-28 1993-11-09 Apple Computer, Inc. Computer controlled animation system based on definitional animated objects and methods of manipulating same
US5594469A (en) * 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
US5617312A (en) * 1993-11-19 1997-04-01 Hitachi, Ltd. Computer system that enters control information by means of video camera
US5732227A (en) * 1994-07-05 1998-03-24 Hitachi, Ltd. Interactive information processing system responsive to user manipulation of physical objects and displayed images
US5936610A (en) * 1993-07-27 1999-08-10 Canon Kabushiki Kaisha Control device for image input apparatus
US6072494A (en) * 1997-10-15 2000-06-06 Electric Planet, Inc. Method and apparatus for real-time gesture recognition
US6088018A (en) * 1998-06-11 2000-07-11 Intel Corporation Method of using video reflection in providing input data to a computer system
US6160899A (en) * 1997-07-22 2000-12-12 Lg Electronics Inc. Method of application menu selection and activation using image cognition
US6346929B1 (en) * 1994-04-22 2002-02-12 Canon Kabushiki Kaisha Display apparatus which detects an observer body part motion in correspondence to a displayed element used to input operation instructions to start a process
US6392675B1 (en) * 1999-02-24 2002-05-21 International Business Machines Corporation Variable speed cursor movement
US6466197B1 (en) * 1998-06-27 2002-10-15 Samsung Electronics Co., Ltd. Method and apparatus for driving pointing device of computer system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01315884A (en) 1988-06-16 1989-12-20 Sony Corp Pattern tracking method
JPH06102993A (en) 1992-09-22 1994-04-15 Nippon Telegr & Teleph Corp <Ntt> Instruction input device
JPH06153017A (en) * 1992-11-02 1994-05-31 Sanyo Electric Co Ltd Remote controller for equipment
JP3766981B2 (en) * 1994-04-05 2006-04-19 Casio Computer Co., Ltd. Image control device and image control method
JPH09265538A (en) 1996-03-29 1997-10-07 Matsushita Electric Works Ltd Automatic tracking device
JP3209178B2 1998-03-30 2001-09-17 NEC Corporation Mouse and information processing apparatus using the mouse

Cited By (127)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070109600A1 (en) * 2005-11-15 2007-05-17 Lexmark International, Inc. Printer optimization method and system
US20090096714A1 (en) * 2006-03-31 2009-04-16 Brother Kogyo Kabushiki Kaisha Image display device
US20090060275A1 (en) * 2007-08-30 2009-03-05 Casio Computer Co., Ltd. Moving body image extraction apparatus and computer readable storage medium storing program
US8116521B2 (en) * 2007-08-30 2012-02-14 Casio Computer Co., Ltd. Moving body image extraction apparatus and computer readable storage medium storing program
US8133119B2 (en) 2008-10-01 2012-03-13 Microsoft Corporation Adaptation for alternate gaming input devices
US20100081507A1 (en) * 2008-10-01 2010-04-01 Microsoft Corporation Adaptation for Alternate Gaming Input Devices
US8866821B2 (en) 2009-01-30 2014-10-21 Microsoft Corporation Depth map movement tracking via optical flow and velocity prediction
US8467574B2 (en) 2009-01-30 2013-06-18 Microsoft Corporation Body scan
US8897493B2 (en) 2009-01-30 2014-11-25 Microsoft Corporation Body scan
US9465980B2 (en) 2009-01-30 2016-10-11 Microsoft Technology Licensing, Llc Pose tracking pipeline
US8294767B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Body scan
US9153035B2 (en) 2009-01-30 2015-10-06 Microsoft Technology Licensing, Llc Depth map movement tracking via optical flow and velocity prediction
US9607213B2 (en) 2009-01-30 2017-03-28 Microsoft Technology Licensing, Llc Body scan
US20100194741A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Depth map movement tracking via optical flow and velocity prediction
US20100199221A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Navigation of a virtual plane using depth
US20100194872A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Body scan
US9652030B2 (en) 2009-01-30 2017-05-16 Microsoft Technology Licensing, Llc Navigation of a virtual plane using a zone of restriction for canceling noise
US20110032336A1 (en) * 2009-01-30 2011-02-10 Microsoft Corporation Body scan
US9007417B2 (en) 2009-01-30 2015-04-14 Microsoft Technology Licensing, Llc Body scan
US8773355B2 (en) 2009-03-16 2014-07-08 Microsoft Corporation Adaptive cursor sizing
US20100231512A1 (en) * 2009-03-16 2010-09-16 Microsoft Corporation Adaptive cursor sizing
US9478057B2 (en) 2009-03-20 2016-10-25 Microsoft Technology Licensing, Llc Chaining animations
US9256282B2 (en) 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
US9824480B2 (en) 2009-03-20 2017-11-21 Microsoft Technology Licensing, Llc Chaining animations
US20100241998A1 (en) * 2009-03-20 2010-09-23 Microsoft Corporation Virtual object manipulation
US20100238182A1 (en) * 2009-03-20 2010-09-23 Microsoft Corporation Chaining animations
US8988437B2 (en) 2009-03-20 2015-03-24 Microsoft Technology Licensing, Llc Chaining animations
US8340432B2 (en) 2009-05-01 2012-12-25 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US9377857B2 (en) 2009-05-01 2016-06-28 Microsoft Technology Licensing, Llc Show body position
US9298263B2 (en) 2009-05-01 2016-03-29 Microsoft Technology Licensing, Llc Show body position
US8762894B2 (en) 2009-05-01 2014-06-24 Microsoft Corporation Managing virtual ports
US8649554B2 (en) 2009-05-01 2014-02-11 Microsoft Corporation Method to control perspective for a camera-controlled computer
US8942428B2 (en) 2009-05-01 2015-01-27 Microsoft Corporation Isolate extraneous motions
US8503766B2 (en) 2009-05-01 2013-08-06 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US20100278431A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Systems And Methods For Detecting A Tilt Angle From A Depth Image
US9262673B2 (en) 2009-05-01 2016-02-16 Microsoft Technology Licensing, Llc Human body pose estimation
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
US9015638B2 (en) 2009-05-01 2015-04-21 Microsoft Technology Licensing, Llc Binding users to a gesture based system and providing feedback to the users
US9191570B2 (en) 2009-05-01 2015-11-17 Microsoft Technology Licensing, Llc Systems and methods for detecting a tilt angle from a depth image
US20100281438A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Altering a view perspective within a display environment
US8451278B2 (en) 2009-05-01 2013-05-28 Microsoft Corporation Determine intended motions
US20100281436A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Binding users to a gesture based system and providing feedback to the users
US9498718B2 (en) 2009-05-01 2016-11-22 Microsoft Technology Licensing, Llc Altering a view perspective within a display environment
US9524024B2 (en) 2009-05-01 2016-12-20 Microsoft Technology Licensing, Llc Method to control perspective for a camera-controlled computer
US9519970B2 (en) 2009-05-01 2016-12-13 Microsoft Technology Licensing, Llc Systems and methods for detecting a tilt angle from a depth image
US20100281432A1 (en) * 2009-05-01 2010-11-04 Kevin Geisner Show body position
US9910509B2 (en) 2009-05-01 2018-03-06 Microsoft Technology Licensing, Llc Method to control perspective for a camera-controlled computer
US20100277470A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Systems And Methods For Applying Model Tracking To Motion Capture
US20100278384A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Human body pose estimation
US8181123B2 (en) 2009-05-01 2012-05-15 Microsoft Corporation Managing virtual port associations to users in a gesture-based computing environment
US9519828B2 (en) 2009-05-01 2016-12-13 Microsoft Technology Licensing, Llc Isolate extraneous motions
US8253746B2 (en) 2009-05-01 2012-08-28 Microsoft Corporation Determine intended motions
US8290249B2 (en) 2009-05-01 2012-10-16 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US20100281437A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Managing virtual ports
US8503720B2 (en) 2009-05-01 2013-08-06 Microsoft Corporation Human body pose estimation
US20100277489A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Determine intended motions
US8638985B2 (en) 2009-05-01 2014-01-28 Microsoft Corporation Human body pose estimation
US20100295771A1 (en) * 2009-05-20 2010-11-25 Microsoft Corporation Control of display objects
US20100302138A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Methods and systems for defining or modifying a visual representation
US8145594B2 (en) 2009-05-29 2012-03-27 Microsoft Corporation Localized gesture aggregation
US8418085B2 (en) 2009-05-29 2013-04-09 Microsoft Corporation Gesture coach
US9656162B2 (en) 2009-05-29 2017-05-23 Microsoft Technology Licensing, Llc Device for identifying and tracking multiple humans over time
US9861886B2 (en) 2009-05-29 2018-01-09 Microsoft Technology Licensing, Llc Systems and methods for applying animations or motions to a character
US20100306713A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Gesture Tool
US20100306716A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Extending standard gestures
US8509479B2 (en) 2009-05-29 2013-08-13 Microsoft Corporation Virtual object
US8542252B2 (en) * 2009-05-29 2013-09-24 Microsoft Corporation Target digitization, extraction, and tracking
US20100304813A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Protocol And Format For Communicating An Image From A Camera To A Computing Environment
US8625837B2 (en) 2009-05-29 2014-01-07 Microsoft Corporation Protocol and format for communicating an image from a camera to a computing environment
US9400559B2 (en) 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
US9182814B2 (en) 2009-05-29 2015-11-10 Microsoft Technology Licensing, Llc Systems and methods for estimating a non-visible or occluded body part
US20100303289A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Device for identifying and tracking multiple humans over time
US8660310B2 (en) 2009-05-29 2014-02-25 Microsoft Corporation Systems and methods for tracking a model
US8744121B2 (en) 2009-05-29 2014-06-03 Microsoft Corporation Device for identifying and tracking multiple humans over time
US8320619B2 (en) 2009-05-29 2012-11-27 Microsoft Corporation Systems and methods for tracking a model
US9383823B2 (en) 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US20100306710A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Living cursor control mechanics
US8803889B2 (en) 2009-05-29 2014-08-12 Microsoft Corporation Systems and methods for applying animations or motions to a character
US20100302395A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Environment And/Or Target Segmentation
US8856691B2 (en) 2009-05-29 2014-10-07 Microsoft Corporation Gesture tool
US20100302257A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Systems and Methods For Applying Animations or Motions to a Character
US20100306712A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Gesture Coach
US8896721B2 (en) 2009-05-29 2014-11-25 Microsoft Corporation Environment and/or target segmentation
US20100306261A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Localized Gesture Aggregation
US20100303290A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Systems And Methods For Tracking A Model
US20100306685A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation User movement feedback via on-screen avatars
US20100302247A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Target digitization, extraction, and tracking
US20100303302A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Systems And Methods For Estimating An Occluded Body Part
US20100302365A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Depth Image Noise Reduction
US20100306715A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Gestures Beyond Skeletal
US9215478B2 (en) 2009-05-29 2015-12-15 Microsoft Technology Licensing, Llc Protocol and format for communicating an image from a camera to a computing environment
US8379101B2 (en) 2009-05-29 2013-02-19 Microsoft Corporation Environment and/or target segmentation
US8351652B2 (en) 2009-05-29 2013-01-08 Microsoft Corporation Systems and methods for tracking a model
US8176442B2 (en) 2009-05-29 2012-05-08 Microsoft Corporation Living cursor control mechanics
US9943755B2 (en) 2009-05-29 2018-04-17 Microsoft Technology Licensing, Llc Device for identifying and tracking multiple humans over time
US20100311280A1 (en) * 2009-06-03 2010-12-09 Microsoft Corporation Dual-barrel, connector jack and plug assemblies
US7914344B2 (en) 2009-06-03 2011-03-29 Microsoft Corporation Dual-barrel, connector jack and plug assemblies
US20110007142A1 (en) * 2009-07-09 2011-01-13 Microsoft Corporation Visual representation expression based on player expression
US9519989B2 (en) 2009-07-09 2016-12-13 Microsoft Technology Licensing, Llc Visual representation expression based on player expression
US8390680B2 (en) 2009-07-09 2013-03-05 Microsoft Corporation Visual representation expression based on player expression
US9159151B2 (en) 2009-07-13 2015-10-13 Microsoft Technology Licensing, Llc Bringing a visual representation to life via learned input from the user
US8847984B2 (en) * 2009-07-27 2014-09-30 Disney Enterprises, Inc. System and method for forming a composite image in a portable computing device having a dual screen display
US20110018901A1 (en) * 2009-07-27 2011-01-27 Disney Enterprises Inc. System and method for forming a composite image in a portable computing device having a dual screen display
US20110025689A1 (en) * 2009-07-29 2011-02-03 Microsoft Corporation Auto-Generating A Visual Representation
US20110055846A1 (en) * 2009-08-31 2011-03-03 Microsoft Corporation Techniques for using human gestures to control gesture unaware programs
US9141193B2 (en) 2009-08-31 2015-09-22 Microsoft Technology Licensing, Llc Techniques for using human gestures to control gesture unaware programs
US20110109617A1 (en) * 2009-11-12 2011-05-12 Microsoft Corporation Visualizing Depth
US9075442B2 (en) * 2010-03-10 2015-07-07 Sony Corporation Image processing apparatus, method, and computer-readable storage medium calculating size and position of one of an entire person and a part of a person in an image
US9454837B2 (en) * 2010-03-10 2016-09-27 Sony Corporation Image processing apparatus, method, and computer-readable storage medium calculating size and position of one of an entire person and a part of a person in an image
US20110221768A1 (en) * 2010-03-10 2011-09-15 Sony Corporation Image processing apparatus, image processing method, and program
WO2011115572A1 (en) * 2010-03-19 2011-09-22 Xyz Wave Pte Ltd An apparatus for enabling control of content on a display device using at least one gesture, consequent methods enabled by the apparatus and applications of the apparatus
US9338362B2 (en) 2010-12-28 2016-05-10 Casio Computer Co., Ltd. Image capturing apparatus, image capturing control method and storage medium for capturing a subject to be recorded with intended timing
US9172878B2 (en) * 2010-12-28 2015-10-27 Casio Computer Co., Ltd. Image capturing apparatus, image capturing control method and storage medium for capturing a subject to be recorded with intended timing
US20120162476A1 (en) * 2010-12-28 2012-06-28 Casio Computer Co., Ltd. Image capturing apparatus, image capturing control method and storage medium for capturing a subject to be recorded with intended timing
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US8620113B2 (en) 2011-04-25 2013-12-31 Microsoft Corporation Laser diode modes
US9372544B2 (en) 2011-05-31 2016-06-21 Microsoft Technology Licensing, Llc Gesture recognition techniques
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
US20130044131A1 (en) * 2011-08-15 2013-02-21 Moheb Milad Software controller for audio mixer equipment
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
US9154837B2 (en) 2011-12-02 2015-10-06 Microsoft Technology Licensing, Llc User interface presenting an animated avatar performing a media reaction
US9628844B2 (en) 2011-12-09 2017-04-18 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction
US9788032B2 (en) 2012-05-04 2017-10-10 Microsoft Technology Licensing, Llc Determining a future portion of a currently presented media program
US8959541B2 (en) 2012-05-04 2015-02-17 Microsoft Technology Licensing, Llc Determining a future portion of a currently presented media program
US9609236B2 (en) 2013-09-16 2017-03-28 Kyle L. Baltz Camera and image processing method

Also Published As

Publication number Publication date Type
US7176945B2 (en) 2007-02-13 grant
CN1279761C (en) 2006-10-11 grant
EP1324269A1 (en) 2003-07-02 application
EP1324269A4 (en) 2006-02-08 application
US20020097247A1 (en) 2002-07-25 application
CN1393003A (en) 2003-01-22 application
JP2002196855A (en) 2002-07-12 application
CA2392725A1 (en) 2002-04-18 application
US20040233224A1 (en) 2004-11-25 application
JP3725460B2 (en) 2005-12-14 grant
US6771277B2 (en) 2004-08-03 grant
WO2002031773A1 (en) 2002-04-18 application
EP1324269B1 (en) 2017-01-25 grant

Similar Documents

Publication Publication Date Title
US5587723A (en) Display range control apparatus and external storage unit for use therewith
US8274535B2 (en) Video-based image control system
US7123263B2 (en) Automatic 3D modeling system and method
US7421093B2 (en) Multiple camera control system
US20070109296A1 (en) Virtual space rendering/display apparatus and virtual space rendering/display method
US6738066B1 (en) System, method and article of manufacture for detecting collisions between video images generated by a camera and an object depicted on a display
US20070002037A1 (en) Image presentation system, image presentation method, program for causing computer to execute the method, and storage medium storing the program
US6999084B2 (en) Method and apparatus for computer graphics animation utilizing element groups with associated motions
US20060202986A1 (en) Virtual clothing modeling apparatus and method
US20090128552A1 (en) Image processing apparatus for combining real object and virtual object and processing method therefor
US20110013038A1 (en) Apparatus and method for generating image including multiple people
US20070279485A1 (en) Image Processor, Image Processing Method, Recording Medium, Computer Program, And Semiconductor Device
US20090153569A1 (en) Method for tracking head motion for 3D facial model animation from video stream
US20050271361A1 (en) Image frame processing method and device for displaying moving images to a variety of displays
JP2002123842A (en) Device for generating stereoscopic image, and medium for recording information
US8081822B1 (en) System and method for sensing a feature of an object in an interactive video display
JP2002351603A (en) Portable information processor
US20110273474A1 (en) Image display apparatus and image display method
JP2009237680A (en) Program, information storage medium, and image generation system
JP2004193933A (en) Image enlargement display method, its apparatus, and medium program
US20140140579A1 (en) Image processing apparatus capable of generating object distance data, image processing method, and storage medium
US20040077393A1 (en) Apparatus and method for video based shooting game
JP2008015942A (en) User interface program, device and method, and information processing system
JP2005309746A (en) Method and program for tracking moving body, recording medium therefor, and moving body tracking device
US20030202686A1 (en) Method and apparatus for generating models of individuals

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY COMPUTER ENTERTAINMENT INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OHBA, AKIO;REEL/FRAME:018412/0925

Effective date: 20011203