US20040189871A1 - Method of generating moving picture information - Google Patents


Info

Publication number
US20040189871A1
Authority
US
United States
Prior art keywords
camera
moving picture
picture
information
information relating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/810,703
Inventor
Takahiro Kurosawa
Tomoaki Kawai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2003-097074 (patent JP4401672B2)
Priority to JP2003-125334 (patent JP4250449B2)
Priority to JP2003-134021 (application JP2004343175A)
Application filed by Canon Inc filed Critical Canon Inc
Assigned to CANON KABUSHIKI KAISHA reassignment CANON KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KAWAI, TOMOAKI, KUROSAWA, TAKAHIRO
Publication of US20040189871A1
Application status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/232Devices for controlling television cameras, e.g. remote control ; Control of cameras comprising an electronic image sensor
    • H04N5/23203Remote-control signaling for television cameras, cameras comprising an electronic image sensor or for parts thereof, e.g. between main body and another part of camera
    • H04N5/23206Transmission of camera control signals via a network, e.g. Internet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/232Devices for controlling television cameras, e.g. remote control ; Control of cameras comprising an electronic image sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/148Interfacing a video terminal to a particular transmission medium, e.g. ISDN

Abstract

A moving picture file is generated from moving picture data received from a camera. Control sequence information from the camera corresponding to the generated moving picture file, and information relating to an address of the camera, are incorporated into the moving picture file. The moving picture file is generated by dividing the received moving picture data, based on camera control information.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to a technique for using communications to distribute picture data, and more particularly to a technique for providing picture data and control information of a camera device or other such input device for a live picture. [0002]
  • 2. Related Background Art [0003]
  • (Live Picture Communications Systems) [0004]
  • There are distribution systems for distributing a captured live picture via the Internet, and providing instructions for settings, operations, etc. for a camera that serves as a picture capturing device. [0005]
  • In addition to providing picture distribution, these picture distribution systems and such can also provide camera controls such as pan, tilt, zoom, backlight corrections, etc. via a network. [0006]
  • Furthermore, the camera controls can also restrict areas to be captured. For example, a privileged user may be able to use all zoom functions provided to the camera, but a normal user may be limited to using only a portion of the zoom functions (e.g., the user may not be able to make complete use of the telephoto end). The same may apply to the pan function and the tilt function. [0007]
  • (Third Generation Mobile Telephone Technology) [0008]
  • Third Generation (3G) mobile telephone technology service is now being provided. This service uses electromagnetic waves more efficiently and offers wider-bandwidth communication than conventional mobile telephone service. [0009]
  • With Third Generation (3G) mobile telephone service, Internet access and other data communications are possible during telephone communication. [0010]
  • (MPEG-4 Codec) [0011]
  • Picture transmission and reception terminals have become more widespread. These range from personal information terminals (such as PDAs) and mobile telephone terminals which connect to mobile communication networks, to PCs which connect to the broadband Internet. Due to the spread of these devices, in 1999, MPEG-4 was established by ISO as a moving picture compression encoding format. This format has high compression encoding efficiency covering a wide range of bit rates from several tens of Kbps to several tens of Mbps, a strong resistance to transmission line errors in wireless and Internet-based communications, and other advantages. [0012]
  • (MPEG-4 Clip Techniques for Mobile Telephones) [0013]
  • Techniques have been offered for providing picture clips (picture files) to mobile telephone terminals. [0014]
  • In these services, the picture clip (picture file), which is encoded using an MPEG-4 codec or the like, is downloaded from a server using a data communication function built into the mobile telephone terminal. A decoder built into the same mobile telephone terminal is then used to display the picture on a screen of the terminal. [0015]
  • The data formats used for these picture clips conform to formats which have been widely adopted on the Internet, PCs, and the like. These include Microsoft's Advanced Streaming Format (ASF), the ISO-standard MP4 format (ISO/IEC 14496-1 Amd1 MPEG-4 Systems Version 2), and the like. [0016]
  • (Techniques for Associating Links and Commands with Picture Clips) [0017]
  • In formats such as Microsoft's Advanced Streaming Format (ASF) and Apple's QuickTime File Format, URL's and other hyperlink functions can be linked to the picture clip. [0018]
  • For example, the use of ASF makes it possible to define a “script command object”. Within this object, a list can be made with link information that is set to synchronize with the timeline to be followed when regenerating the ASF file. Furthermore, as the name “script command object” indicates, the use of ASF also makes it possible to author scripts and other command information, not just link information. [0019]
  • KDDI's ezmovie specifications also include a function for adding text telops (subtitles) with hyperlink functions to the picture clips. KDDI's Synchronous Telop Mark-up Language (STML) is used as a language for writing telops. This function enables the user to link audio communication, mail transmissions, home page links, etc. to the picture clip. [0020]
  • SUMMARY OF THE INVENTION
  • The present invention advances the techniques described above, and has an object to provide a picture distribution system capable of handling a new terminal represented by the mobile telephone terminal described above. [0021]
  • In order to achieve this object, one embodiment of the present invention adopts a method of generating moving picture information to distribute to a terminal device, the method including: [0022]
  • receiving moving picture data taken by a camera, and control sequence information of operations performed by the camera; [0023]
  • generating a moving picture file from the received moving picture data; [0024]
  • incorporating, into the moving picture file, the control sequence information of the camera corresponding to the generated moving picture file, and information relating to an address of the camera; and [0025]
  • distributing, to the terminal device, the moving picture file with the control sequence information of the camera and link information of the camera incorporated in the incorporating. [0026]
  • In another aspect, the present invention adopts a method of generating a moving picture file, the method including: [0027]
  • obtaining moving picture data taken by a camera, and information relating to the camera corresponding to moving picture data; and [0028]
  • generating a moving picture file by dividing the moving picture data, based on information relating to the camera. [0029]
  • Additional objects and characteristics of the present invention will become clear from the following specification and the drawings.[0030]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram showing typical usage of an embodiment of the present invention; [0031]
  • FIG. 2 is a diagram illustrating one example of a hardware construction for operating a picture conversion server; [0032]
  • FIG. 3 is a diagram illustrating one example of a hardware construction used for a camera server; [0033]
  • FIG. 4 is a diagram illustrating one example of a hardware construction for a mobile telephone terminal; [0034]
  • FIG. 5 is a diagram schematizing the operational processing programs of each of devices; [0035]
  • FIG. 6 is a flowchart showing operations of regenerating a picture on the mobile telephone terminal; [0036]
  • FIGS. 7A to 7C are diagrams showing one example of a display screen on a display device of the mobile telephone terminal; [0037]
  • FIG. 8 is a flowchart showing processing for operations of a camera; [0038]
  • FIG. 9 is a flowchart showing operations relating to controlling of a camera inside the camera server; [0039]
  • FIG. 10 is a flowchart showing operations relating to processing of the picture inside the camera server; [0040]
  • FIG. 11 is a diagram showing one example of assignment of operational keys on the mobile telephone terminal; [0041]
  • FIG. 12 is a diagram for explaining conversion of a picture data in the picture conversion server; [0042]
  • FIG. 13 is a flowchart showing processing for operations of the picture conversion server; [0043]
  • FIG. 14 is a diagram showing one example of control sequence data incorporated into a picture clip; [0044]
  • FIG. 15 is a diagram showing one example of a control history management table of a camera; and [0045]
  • FIG. 16 is a graph showing one example of a relationship between a time elapsed and an evaluation value of a priority level.[0046]
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Detailed explanations are given below regarding preferable embodiments of the present invention, with reference made to the attached drawings. [0047]
  • Embodiment 1
  • Embodiment 1 is explained with respect to an example in which a live picture is received from a camera server configured on a network, and this picture is converted for a mobile telephone terminal and then is transmitted. [0048]
  • Explanation is also given regarding a picture conversion server. This picture conversion server generates a picture clip which appropriately reflects camera control status information (i.e., pan, tilt, zoom, control rights, etc.) pertaining to the picture. More particularly, the picture conversion server of this embodiment is characterized in that it utilizes the camera control status information from the time when the picture clip is made to create the picture clip so that the current camera picture corresponding to the above-mentioned camera control status information can be viewed. Here, the “picture clip” refers to picture data accumulated during a given period of time. [0049]
  • The picture conversion server obtains the picture data from the camera server 101 in advance, and then generates the picture clip which can be regenerated on a standard picture viewer provided to the mobile telephone terminal. When this picture clip is generated, link information relating to the camera control information is incorporated into the picture clip. Based on the camera control information incorporated into the picture clip, the mobile telephone terminal user can control the camera corresponding to that picture clip to obtain real-time picture data. [0050]
  • FIG. 1 is a diagram showing typical usage of this embodiment. Camera servers 101, 102 have cameras and can perform live picture transmissions. The viewer 103 is a PC or the like, which is connected to a network. The camera servers 101, 102 and the viewer 103 are each connected to the network, and a request sent from the viewer 103 through the network 109 is sent to the camera servers 101, 102. When the request is received, the picture data is sent from the camera server to the viewer, and the camera picture can be displayed at the viewer 103. Furthermore, camera control commands can be sent from the viewer 103 to the camera servers 101, 102 to make the camera zoom, pan, tilt, etc. A relay server 104 is also arranged on the network 109 and relays communications between the viewer 103 and the camera servers 101, 102. [0051]
  • The picture conversion server 105 converts the picture data provided by the camera servers 101, 102 so as to be suitable for the mobile telephone terminal 107, and then transmits the picture data to the mobile telephone terminal. A gateway 106 mediates between the network 109 and a mobile line network 110. An application program makes the mobile telephone terminal 107 function as a viewer from which the camera can be controlled. The gateway 106 enables communications between the apparatus connected to the network of this embodiment and the mobile telephone terminal 107. Typically, the application program on the mobile telephone terminal 107 is installed when shipped from the factory. However, the program may also be downloaded right before it is executed (used), such as a Java(R) program. [0052]
  • The network in FIG. 1 may be an intranet which is operated within a company or organization, or may also be the Internet connecting the entire world. Furthermore, the relay server and the picture conversion server are typically configured on an internet exchange (IX) or data center (IDC: Internet Data Center). This design alleviates communication loads. [0053]
  • FIG. 2 illustrates an example of a hardware construction for operating the picture conversion server 105. The picture conversion server 105 includes a CPU 202 for providing integrated control of the entire server based on a given program, and a network I/F 201 for connecting to the network 109. Storage devices include: a RAM 203 serving as a main storage device; a secondary storage device 204 such as a flash memory and a hard disk device; and a floppy disk device 205 for loading a program from a medium. [0054]
  • An input device may also be provided to perform settings and the like, but this is not shown in the diagrams. More specifically, this could involve a display device, or a keyboard, mouse, or other controllers. [0055]
  • FIG. 3 illustrates one example of a hardware construction used for the camera server 101. In actuality, this construction is constituted of a camera 301 for capturing pictures, and a computer. More specifically, this construction involves: a storage device for storing a program and data; an image capture board 302 for taking in the picture data; a serial I/F 303 for sending commands to the camera 301; a network I/F 304 for connecting to the network; a CPU 305 for performing various types of processing by means of programs; and other components. The storage device is constituted of: a RAM 306 serving as a main storage device; a secondary storage device 307 such as a flash memory and a hard disk device; and a floppy disk device 308 for loading the program from a medium. Multiple cameras 301 may be provided, in which case the user can switch the picture output as desired. [0056]
  • The camera server 101 may also be a network camera which is built into a server, so that the camera 301 and the computer are integrated in a single unit. [0057]
  • FIG. 4 illustrates an example of a hardware construction for operating the mobile telephone terminal 107. As described above, an application program operates on the mobile telephone terminal 107 to make it function as the viewer. More specifically, the mobile telephone terminal 107 includes: a storage device 401 for storing a program and data; a wireless communication I/F 402 for connecting with the mobile line network 110; a CPU 403 for executing various processes by means of programs; and peripheral devices. Examples of the storage device include a RAM serving as a main storage device and a flash memory. Examples of the peripheral devices include: an input device such as a group of buttons 406 or a group of switches 407 configured on the mobile telephone terminal 107; a display controller 407; a display output device including a display device 408; and an audio input/output device 409 such as a microphone and speaker. [0058]
  • FIG. 5 is a diagram schematizing the operational processing programs of each of the devices. The camera server 101 includes two modules: a camera control server 501 for governing the camera controls, and a picture server 502 for governing the sending of the pictures. Similarly, an application program operating on the mobile telephone terminal 107 includes a camera control unit 503 for issuing camera control commands and responding to camera status notifications, and a picture display unit 504 for handling display of the picture clip. Furthermore, the picture conversion server 105 includes: a camera control unit 505 for interpreting a camera control command string (below, a “PTZ sequence”) and issuing the camera control command to the camera server 101; and modules (including a picture obtaining unit 506, a picture conversion unit 507, and a picture transmission unit 508) for obtaining the picture from the camera server 101, converting the picture for the mobile telephone terminal, and then editing this converted picture into a picture clip for the mobile line network 110. [0059]
  • FIG. 6 is a diagram showing the flow of operations to regenerate and display the picture clip on the mobile telephone terminal 107. [0060]
  • First, at step S601, an identifier of the picture conversion server 105 is obtained. This may be an identifier which the user inputted directly by operating the keys, or may be an identifier which is selected from identifiers included in a mail or a web page. Typically, this identifier is a URL for distinguishing the picture conversion server 105. [0061]
  • Then, at step S602, based on the obtained identifier of the picture conversion server 105, a connection is established via the gateway 106 to the picture transmission unit 508 of the picture conversion server 105. [0062]
  • The picture conversion server 105 retrieves (from within itself) the identifier of the camera server 101 which the picture conversion server 105 should connect to, and the camera identifier. Then, from the mobile telephone terminal 107, the picture conversion server 105 obtains the PTZ sequence for the camera controls (pan, tilt, and zoom) for controlling the camera server 101, and a user identifier and password or other user identification information which is used for controlling access. [0063]
  • The camera server identifier and the PTZ sequence may be a camera control command input which the user inputted into the mobile telephone terminal 107 with direct key operations, or may be selected from camera server identifiers and PTZ sequence information included in a mail or a web site. [0064]
  • Then, at step S603, the mobile telephone terminal 107 requests the picture clip from the picture conversion server 105. This request includes instructions for generating the requested picture clip and the like. Typically, the request is sent according to the HTTP protocol. [0065]
  • If information relating to the PTZ sequence has been obtained, the PTZ sequence information is then sent to the picture conversion server 105. The request and the transmission of the PTZ sequence information may be incorporated into the URL using the GET method of HTTP, or may be sent using the POST method of HTTP. Here, explanations are given for the case where the transmissions are performed using the POST method. The following is given as an example. (In actuality URL encoding is used; in order to facilitate explanations, there are portions where URL encoding is not used.) [0066]
  • POST /getvideoclip/ HTTP/1.1 [0067]
  • Host: 202.28.30.208:8080 [0068]
  • User-Agent: MozilePhone/2.0 C2101V(c100) [0069]
  • Pragma: no-cache [0070]
  • videoencodeparam=QCIF:fps15.0:bps64000:intraframe5:me8 [0071]
  • cameraservers=webview://vb101_4.x-zone.canon.co.jp:34560+34561+34562 [0072][0073]
  • PTZ=HZ15_30S340S440P-1020 [0074]
  • moviesizemax=240 kbytes [0075]
  • notifyto=mailto:riyousha3@mailserver.co.jp [0076]
  • userid=331245 [0077]
  • userpw=15215294 [0078]
  • The portion following “videoencodeparam=” designates parameter information to be used when the source picture, which the picture conversion server 105 received from the camera server 101, is encoded for the mobile telephone terminal. The portion following “cameraservers=” designates the camera server 101 for the picture conversion server 105 to connect to. The portion following “PTZ=” designates the information relating to the PTZ sequence with the commands for the camera controls which the picture conversion server 105 should execute for the camera server 101. The portion following “moviesizemax=” designates the maximum size of the data of the picture clip, as determined by the mobile telephone terminal 107 or by the mobile line network 110. The portion following “notifyto=” is a contact point where notification should be sent when the picture clip is generated. A mail address of the mobile telephone terminal 107 is typically indicated here. The portions following “userid=” and “userpw=” are the user identifier and password, respectively. The constitutive elements of the PTZ sequence have the following meanings. Below, n indicates numerical data (both positive and negative). [0079]
    Pn Pan designation (horizontal camera control)
    Tn Tilt designation (vertical camera control)
    Zn Zoom designation
    Bn Backlight correction,
    Turn backlight correction on/off
    H Home position designation
    Sn Preset position designation,
    Nth preset position
    Cn Designation to switch camera server connection,
    Nth camera server
    Kn Designation to switch cameras within same
    camera server,
    Nth camera
    _n Time elapse designation. 0.1 second units.
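For illustration only (this is not part of the described apparatus), the PTZ sequence grammar tabulated above can be tokenized mechanically: each element is one command letter (or the underscore for a time elapse) followed by an optional signed number. The function name and error handling below are assumptions.

```python
import re

# One command letter from the table above, followed by an optional
# signed integer ('H' takes no number, so the number part is optional).
_TOKEN = re.compile(r"([PTZBHSCK_])(-?\d+)?")

def parse_ptz_sequence(seq):
    """Split a PTZ sequence string into (command, value) pairs.

    For example, 'HZ15_30' yields [('H', None), ('Z', 15), ('_', 30)],
    where '_30' designates a time elapse of 3.0 seconds (0.1 s units).
    """
    commands = []
    pos = 0
    for m in _TOKEN.finditer(seq):
        if m.start() != pos:
            raise ValueError("unparsable PTZ sequence at %r" % seq[pos:])
        letter, number = m.group(1), m.group(2)
        commands.append((letter, int(number) if number is not None else None))
        pos = m.end()
    if pos != len(seq):
        raise ValueError("trailing characters in PTZ sequence: %r" % seq[pos:])
    return commands
```

A sequence such as `P-10T5Z2` then parses into the individual pan, tilt, and zoom designations that the picture conversion server would execute in order.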
  • Then, at step S604, the system waits for a response from the picture conversion server 105. Then, at step S605, the response received from the picture conversion server 105 is interpreted and displayed on the display device of the mobile telephone terminal 107. See FIG. 7A for an example. If the response from the picture conversion server 105 indicates that for some reason the picture cannot be generated immediately, then the processing advances to step S606. Conversely, if it indicates that the picture could be generated immediately, then the processing advances to step S607. [0080]
  • At step S606, the system waits for a mail notification from the picture conversion server 105. The mail notification may be sent by Simple Mail Transfer Protocol (SMTP), for example. It may also be sent by Short Message Service (SMS). When the mail notification is received, the content of the mail is displayed as the response from the picture conversion server 105, and then the processing advances to step S607. See FIG. 7B for an example. At step S607, one of the picture clips included in the response is selected, downloaded, and regenerated/displayed. See FIG. 7C for an example. [0081]
  • Here, the processing to regenerate/display the picture clip is executed after waiting for the downloading to finish. However, it is also possible to start the regeneration/display processing when sufficient picture data has been received to enable regeneration/display, without waiting for the downloading to finish. [0082]
  • If certain operations of the user operation buttons, which are provided to the mobile telephone terminal 107, are detected while the picture clip is being regenerated/displayed on the mobile telephone terminal 107, then the link information and the camera control information (PTZ sequence information) corresponding to the picture data (also called a “picture section” or “picture segment”) that is being displayed at the time when these operations are detected is extracted from the picture clip. [0083]
  • Then, the browser function provided to the mobile telephone terminal 107 is used to access the link designated in the link information. Typically, when the link information is accessed, the application for enabling the camera controls is booted. The camera is then controlled to move in the directions in which the regenerated picture clip was captured, and pictures are captured as the camera moves in these directions. More specifically, the camera is time-sequentially controlled to pan, tilt, and zoom in the stated order based on the PTZ sequence information so as to correspond to the regenerated segment of the designated picture clip. These actions may also be performed according to settings that are configured in the mobile telephone terminal. In this case, for example, the accessed link information may be attached to a mail which is then transmitted. [0084]
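The time-sequential replay described above (pan, tilt, and zoom issued in the stated order, with each `_n` element consuming n × 0.1 seconds) might be sketched as follows. The `send_command` callback is a placeholder for the actual command transmission to the camera control server, which is not detailed here.

```python
import time

def replay_ptz(commands, send_command, sleep=time.sleep):
    """Replay parsed (letter, value) PTZ commands in their stated order.

    '_n' entries pause for n * 0.1 seconds (the time-elapse designation);
    every other entry is handed to send_command, which is assumed to
    issue the corresponding camera control request.
    """
    for letter, value in commands:
        if letter == "_":
            sleep(value * 0.1)  # 0.1 second units
        else:
            send_command(letter, value)
```

Injecting the `sleep` function keeps the timing behavior testable without real delays.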
  • FIG. 8 is used to explain processing for the operations of the camera. First, at step S651, a connection is made to the picture server 502, according to the picture server 502 address and the connection port information inside the camera server 101 that was designated when the camera was activated. Here, an operations program (which can be achieved by loading a thread or a process) is loaded to perform the processing after the connection, and step S661 is repeated until the program finishes. At step S661, each time picture data arrives from the picture server 502, this picture data is received and displayed. [0085]
  • At step S652, a main program connects to the camera control server 501 based on the address and connection port information of the camera which was similarly designated at the time when the camera was activated. [0086]
  • The main program then receives operation requests from the user and executes a main loop of operations. First, at step S653, the operation performed by the user is detected. If the user's operation relates to the camera control, then at step S654 a command is issued to the camera control server 501. If the user's operation relates to the picture server 502, then at step S655 a command is issued to the picture server 502. [0087]
  • If the user's operation is to modify the picture display status (such as an operation to modify the display size or the like), then at step S656 the picture's display status is updated. If the user's operation is an operation to end the regeneration of the picture, then at step S657 each of the programs relating to the operation of the viewer is sequentially shut down. When the processing at steps S654-S657 ends, the procedure returns to step S653 and waits for an operational input from the user. [0088]
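The main loop of steps S653 to S657 amounts to a dispatch loop: detect the user's operation, route it to the camera control server, the picture server, or the display, and shut everything down on an end operation. A minimal sketch, with all interface names assumed rather than taken from the specification:

```python
def viewer_main_loop(get_operation, camera_ctrl, picture_srv, display):
    """Dispatch loop corresponding to steps S653-S657 (names assumed).

    get_operation blocks until the user does something and returns a
    (kind, payload) pair; the loop routes it to the appropriate module
    and exits when the user ends regeneration of the picture.
    """
    while True:
        kind, payload = get_operation()   # S653: detect the operation
        if kind == "camera":              # S654: camera control command
            camera_ctrl.send(payload)
        elif kind == "picture":           # S655: picture server command
            picture_srv.send(payload)
        elif kind == "display":           # S656: update display status
            display.update(payload)
        elif kind == "quit":              # S657: shut down the viewer
            camera_ctrl.close()
            picture_srv.close()
            break
```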
  • The application for performing the camera operations, which runs on the mobile telephone terminal 107, may be installed as software that is included when the mobile telephone terminal is shipped. However, it may also be software which is downloaded from a network and then installed, like a Java(R) program. [0089]
  • FIG. 8 does not show a case where an operation is performed to switch the connection from one camera server to another camera server. Nor does it show a case where multiple cameras are connected to the camera server. In such cases, it is also possible to perform operations to switch the camera that obtains the pictures. [0090]
  • FIG. 9 is a flowchart showing operations of the camera control server 501 inside the camera server 101. First, at step S701, when the system is booted, the camera control server 501 reads out information for configuring the operations of the camera control server 501 from a specific file (such as a registry or other database, depending on the OS). Operations are then started based on this information. Here, the port to receive the request arriving from the mobile telephone terminal 107 (which is the client) via the picture conversion server 105 is opened. Then, this port enters a state where it is ready to receive the request at step S702. [0091]
  • Upon reception of the request (e.g., a connection request or an operation command request), the procedure then leaves step S702. If the request is a connection request, then a judgment is made at step S703 as to whether or not to permit the connection. If the connection cannot be permitted, then an error code is returned to indicate that the connection was rejected, and then the procedure returns to step S702. If the connection is permitted, then at step S704 a thread is generated to perform reception processing (which serves as connection processing) to receive the commands from the mobile telephone terminal 107 or other client. After the client is registered, the processing then returns to step S702. The thread that was generated here is used at step S707 to receive the commands from the corresponding client. [0092]
  • When the command arrives, it is received and passed to the main program for performing the camera operations. The main program receives this at step S702. The procedure then advances to step S705 and performs camera control in accordance with the authority of the client connected to the thread from which the operation command was issued. The result of performing this camera control (such as a code indicating whether the control succeeded or failed) is transmitted to the thread corresponding to the client that issued the camera operations request. At step S708, the thread corresponding to this client sends the result back to the client. At step S706, the main program portion transmits the status change produced by the camera operation (i.e., the camera status information including, for example, a pan, tilt, or zoom value and whether or not a prohibited area was detected, etc.) to all the threads corresponding to the clients. [0093]
  • At Step S[0094] 709, each thread corresponding to the client notifies the client of the camera control status change. When the thread corresponding to the client receives, from the client, the command to end the connection, the main program is then notified of this command, and then at step S710 that thread ends.
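The accept-and-dispatch pattern of FIG. 9 — judge the connection (S703), register the client (S704), execute a command under the client's authority (S705), return the result to the requesting client (S708), and broadcast the status change to every client thread (S706/S709) — can be sketched in Python as follows. This is a minimal sketch with hypothetical names: the per-client reception threads and the camera hardware are reduced to per-client queues and a stub result.

```python
import queue
import threading

class CameraControlServer:
    """Minimal sketch of the FIG. 9 accept/dispatch loop (names are hypothetical)."""

    def __init__(self):
        self.clients = {}          # client id -> per-client queue (stands in for S704/S707 thread)
        self.lock = threading.Lock()

    def connect(self, client_id, permitted=True):
        # Step S703: judge whether or not to permit the connection.
        if not permitted:
            return "ERROR_CONNECTION_REJECTED"
        # Step S704: register the client; a real server would also spawn a
        # reception thread here that feeds commands to the main program.
        with self.lock:
            self.clients[client_id] = queue.Queue()
        return "OK"

    def handle_command(self, client_id, command):
        # Step S705: perform camera control according to the client's authority
        # (stubbed), then send the result back to the requester's thread (S708).
        result = f"done:{command}"
        self.clients[client_id].put(result)
        # Step S706: broadcast the resulting camera status change to every
        # client thread, which notifies its client at S709.
        status = {"pan": 10, "tilt": 0, "zoom": 2, "prohibited": False}
        with self.lock:
            for q in self.clients.values():
                q.put(("status", status))
        return result
```

Each queue plays the role of one per-client thread's mailbox: the requesting client receives its result first, and every connected client (including the requester) then receives the status broadcast.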
  • [0095] FIG. 10 is a flowchart showing operations of the picture control server 502 inside the camera server 101. First, at step S801, when the system is booted, the picture control server 502 reads out information for configuring the operations of the picture control server 502 from a specific file (such as a registry or other database, depending on the OS). Operations are then started based on this information. Here, a thread for obtaining, encoding, and accumulating a picture is generated (this thread is initially in a resting state), and the port for receiving requests from the mobile telephone terminal 107 (which is the client) is opened. Then, this port enters a state where it is ready to receive a request at step S802.
  • [0096] Upon reception of a request (e.g., a connection request or a command request), the procedure leaves step S802. If the request is a connection request, then a judgment is made at step S803 as to whether or not to permit the connection. If the connection cannot be permitted, then an error code is returned to indicate that the connection was rejected, and the procedure returns to step S802. If the connection is permitted, then at step S804 a session identifier is generated to identify the session for each client. A thread for performing reception processing of commands from the client is generated, and the client is registered in accordance with information such as the access rights of the client that issued the connection request. The processing then returns to step S802.
  • [0097] If the content of the request is a request for the live picture, and if the thread for obtaining and encoding the picture is in a resting state, then an instruction is given for the resting thread to start operating before the processing returns to step S802. At step S807, the thread that was generated receives commands from the corresponding client. When a command arrives, it is received and then passed to the main program for performing the picture processing.
  • [0098] At step S802 the main program receives this command, and then the procedure advances to step S805. Operations are then performed to modify the settings for obtaining, encoding, transmitting, and other handling of the picture. The result of these operations (e.g., a code indicating whether they succeeded or failed) is then sent to the thread corresponding to the client that issued the command request. At step S808, the thread corresponding to the client sends this result back to the client.
  • [0099] At step S806, according to the start instruction given at step S804 to the thread that performs the picture capturing and encoding, the main program portion uses the image capture board to obtain the picture data at pre-set time intervals. This picture data is then converted into compressed data. The compressed data is then transmitted to the threads corresponding to all the clients connected to the live picture.
  • [0100] At step S809, the thread corresponding to each client determines whether or not there is a subsequent picture frame transmission request. If there is a request, then the picture data is sent to the client. If a prohibited area has been detected, then pre-registered client information is referenced. If the client is not registered, then the area is treated as prohibited for that client, and a notification (i.e., a prohibited area detection notification) is sent to indicate that the compressed data will not be transmitted.
  • [0101] Then, if the thread corresponding to a client that is connected to the live picture receives the subsequent picture frame transmission request from the client, a picture frame transmission request flag is set. (The subsequent picture frame transmission request is generally sent from the client after the client has completed reception of the compressed picture data.)
  • [0102] If a command to terminate the connection is received from the client, then this command is sent to the main program, and that thread is terminated at step S810.
  • [0103] FIG. 11 is a diagram showing one example of the assignment of operational keys on the mobile telephone terminal 107 which are used to create the PTZ control command. In order to create the PTZ sequence, the keys of the mobile telephone terminal 107 are assigned, as shown in the diagram, functions to pan (i.e., move the camera horizontally), tilt (i.e., move the camera vertically), zoom (i.e., increase magnification), perform backlight correction, and the like. The commands that are created are then sent via the picture conversion server 105 to the camera control server 501.
  • [0104] FIG. 12 is a diagram which schematically shows the general flow of the picture data in the picture conversion server 105. The picture data sent from the camera servers 101 (Motion JPEG, QVGA size 320×240) is received through the communications stack of the picture conversion server 105 corresponding to the camera server, and this data is passed on to a JPEG decoder. The data is then transferred to an MPEG-4 encoder configured for mobile telephone terminals. After the data is turned into picture data for mobile telephone terminals (MPEG-4 simple profile, QCIF size 176×144, 64 Kbps), it is sent as the picture clip to the mobile telephone terminal 107 via the communications stack corresponding to the mobile line network.
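The data flow of FIG. 12 (Motion JPEG in, QCIF MPEG-4 out) can be sketched as a simple pipeline. The decoder, scaler, and encoder below are hypothetical stand-in stubs — a real picture conversion server would wrap actual codec libraries — so only the shape of the conversion chain is modeled here.

```python
# Hypothetical stand-ins for the JPEG decoder and MPEG-4 encoder of FIG. 12.
def jpeg_decode(jpeg_frame):
    # Stub: decoding a Motion JPEG frame yields a raw QVGA frame (320x240).
    return {"pixels": jpeg_frame["data"], "size": (320, 240)}

def scale_to_qcif(raw_frame):
    # Stub: rescale the raw frame to QCIF (176x144) for the mobile terminal.
    return {"pixels": raw_frame["pixels"], "size": (176, 144)}

def mpeg4_encode(raw_frame, bitrate_kbps=64):
    # Stub: encoding produces an MPEG-4 simple-profile frame at 64 Kbps.
    return {"codec": "mpeg4-sp", "size": raw_frame["size"], "kbps": bitrate_kbps}

def convert(jpeg_frame):
    """Camera-server Motion JPEG in, mobile-terminal MPEG-4 out (FIG. 12)."""
    return mpeg4_encode(scale_to_qcif(jpeg_decode(jpeg_frame)))
```

The point of the sketch is the ordering: decode, resize to the terminal's picture size, then re-encode for the mobile line network.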
  • [0105] FIG. 13 is a flowchart showing operations of the picture conversion server 105.
  • [0106] First, at step S1301, upon activation, the picture conversion server 105 reads out its operation setting information from a special file, and operations are started based on this information. Here, the communications port for receiving requests from the application program running on the mobile telephone terminal 107 (which is the client) is opened, and the picture conversion server 105 becomes ready to receive a request at step S1302.
  • [0107] When a request (i.e., an HTTP request message or the like) has been received, the procedure leaves step S1302. At step S1303, a determination is made as to whether or not to permit the connection. If the connection is not to be permitted, then an error code response is sent to indicate that the connection has been rejected, and the procedure returns to step S1302. If the connection is to be allowed, then at step S1304 a thread is generated for each client to perform the transfer of information with that client as the connection processing. The client is then registered, and the procedure returns to step S1302.
  • [0108] At step S1311, the generated thread corresponding to each client reads the request from its corresponding client and analyzes the content of the request. Typically, the request is transferred to the picture conversion server 105 as an HTTP request. Sometimes the POST method is used to perform the HTTP request, and sometimes the GET method is used.
  • [0109] At step S1312, the following information is extracted from the content of the request: the encoding parameter information (i.e., the picture conversion parameters); the information for connecting to the camera server 101 (i.e., the source picture information); the PTZ sequence information; the maximum size of the picture clip (i.e., the upper limit value of the picture clip); the notification destination information (i.e., the address to which the notification should be sent); and the user ID, password, and other user identification information. These are indicated respectively as the values given for “videoencodeparam=”, “cameraservers=”, “PTZ=”, “moviesizemax=”, “notifyto=”, “userid=”, and “userpw=”.
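Assuming the request fields arrive as an ordinary URL-encoded query string or POST body (the exact wire format is not fixed by the text), the step-S1312 extraction can be sketched with the standard library; the sample request values below are hypothetical.

```python
from urllib.parse import parse_qs

# The seven field names listed for step S1312 above.
REQUEST_KEYS = ("videoencodeparam", "cameraservers", "PTZ",
                "moviesizemax", "notifyto", "userid", "userpw")

def extract_request_info(query):
    """Pull the step-S1312 fields out of an HTTP GET query (or POST body)."""
    fields = parse_qs(query)
    # Each field appears at most once, so keep only the first value.
    return {k: fields[k][0] for k in REQUEST_KEYS if k in fields}

# Hypothetical request from the mobile telephone terminal's application program.
info = extract_request_info(
    "videoencodeparam=mpeg4&PTZ=P10T-5Z2&moviesizemax=300000"
    "&notifyto=user%40example.com&userid=alice&userpw=secret")
```

Fields absent from the request (here, “cameraservers=”) are simply omitted from the result, so the server can fall back on its configured defaults.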
  • [0110] The picture conversion parameters include the selection of the codec for performing the conversion, parameters for the codec, data formats for codec input and output, etc. The source picture information includes communication attribute information such as, for example, the network address and port number of the camera server 101 providing the live picture. Typically, the notification destination is the mail address that designates the mobile telephone terminal 107 which is currently connected.
  • [0111] At step S1313, information is sent as the HTTP response to the HTTP request. This information indicates that “The picture cannot be generated immediately. A communication will be sent by mail in a while.”
  • [0112] At step S1314, the picture obtaining unit 506 is initialized in accordance with the source picture information and the user identification information. More specifically, a connection is established with the camera server 101 providing the source picture, and obtaining of the source picture begins.
  • [0113] The procedure then advances to step S1315, where the picture conversion unit 507 is initialized in accordance with the picture conversion parameters. The picture conversion unit 507 includes the MPEG-4 encoder, etc. Then the procedure advances to step S1316 and the picture transmission unit 508 is initialized. At this time, instructions indicating the maximum size of the picture clip and the address to which the notification should be sent are given to the picture transmission unit 508.
  • [0114] At step S1317, mutual relationships are established to send the processing data from the picture obtaining unit 506 to the picture conversion unit 507, and from the picture conversion unit 507 to the picture transmission unit 508, respectively. Then, the camera control unit 505 transfers the PTZ sequence information to the camera control server 501, and camera control at the camera server 101 is executed.
  • [0115] At step S1318, post-processing is performed at the picture obtaining unit 506, the picture conversion unit 507, and the picture transmission unit 508. At step S1319, the thread for the client is ended.
  • [0116] Explanation is now given in sequence regarding the picture obtaining unit 506, the picture conversion unit 507, the picture transmission unit 508, and the camera control unit 505, which function inside the picture conversion server 105. First, the picture obtaining unit 506 performs processing to connect to the camera server 101 which provides the live picture. This processing is performed according to the source picture information and the user identification information received during initialization. The picture data obtained from the camera server 101 receives a time stamp indicating the time it was obtained, and is then transferred to the picture conversion unit 507. The camera server 101 according to this embodiment provides the picture data in the Motion JPEG format; therefore, the time stamp is applied to each individual unit of JPEG data.
  • [0117] In the case where the notification indicating detection of the prohibited area has been received from the camera server 101, the prohibited area detection notification is sent to the picture conversion unit 507 instead of the picture data. At the picture conversion unit 507, the MPEG-4 encoder is first configured with the codec parameters received during initialization, the codec input/output data formats, and the like.
  • [0118] The picture conversion unit 507 then arranges the source picture data received from the picture obtaining unit 506 so as to conform to the codec input data format and the picture size, and inputs it into the MPEG-4 encoder. The data resulting from this processing is transferred to the picture transmission unit 508. Here, the picture conversion unit 507 according to this embodiment pre-arranges the JPEG-format source picture data so as to conform to the QCIF size and YUV411 format, and then inputs this data into the MPEG-4 codec. The MPEG-4 data thus generated (i.e., I-frames and P-frames) is transferred to the picture transmission unit 508. At this time, the time stamp applied at the picture obtaining unit 506 is also inputted into the MPEG-4 codec together with the picture data.
  • [0119] In the case where the notification indicating detection of the prohibited area has been received from the picture obtaining unit 506, a composite screen, indicating that camera controls are limited in that area and the picture cannot be displayed, is inputted into the MPEG-4 codec instead of the source picture data.
  • [0120] Next, the picture transmission unit 508 first secures a memory area equivalent to the maximum size of the picture clip, which was received during initialization. The picture data that the picture conversion unit 507 configured for the mobile telephone terminal is then received and saved in the memory area that was secured.
  • [0121] The picture transmission unit 508 then determines where to divide the picture clip, for the segment running from the point where the picture display prohibition detection notification was received to the point where it was detected that the prohibited area had ended. The rate of usage of the memory area is also taken into consideration when determining where to divide the picture clip. When the point at which to divide the picture clip has been determined, header information conforming to the data format of the picture clip configured for the mobile telephone terminal is attached. The picture data is then saved from the memory area into a file, and the memory area is reused. Accordingly, the picture clip is divided and saved into a plurality of files.
  • [0122] Explanation is now given regarding a method of determining where to divide the picture data when generating the picture clip in areas where picture display is not prohibited.
  • [0123] When the picture transmission unit 508 generates the picture clip, periodic calculations are made to determine priority levels for places to divide the picture clip. These periodic calculations may be made each time picture data is received, say, at one-second intervals. The calculations may be performed according to conditions (0)-(5), for example. As shown in FIG. 14, evaluation values can be obtained to indicate priority levels corresponding to points in time.
  • [0124] (0) “pri” represents the evaluation value for the priority level assigned to the immediately preceding picture data from the picture conversion unit 507. The initial value of pri is 0.
  • [0125] (1) The value of pri is raised by 3 in either of the following cases: when a notification from the camera control unit 505 indicates that the camera control information for the picture data is a preset designation (including a home position designation), or when a notification is received giving the instruction to change the camera server.
  • [0126] (2) The value of pri is raised by 2 when a notification is received from the camera control unit 505 with instructions indicating the timing of changing cameras (within the same camera server).
  • [0127] (3) The value of pri is raised by 1 when a notification is received from the camera control unit 505 with instructions indicating that a camera pan/tilt/zoom (PTZ) operation is intended for the pictures that are currently being processed.
  • [0128] (4) The value of pri is multiplied by 0.9 if there is no notification from the camera control unit 505.
  • [0129] (5) A priority level of “3” is assigned whenever the calculations produce a result greater than “3”.
  • [0130] The evaluation values shown in FIG. 14 are examined over predetermined intervals (e.g., 15-second intervals), and the picture clip is divided at the point where the evaluation value is the greatest. Thus the picture clip is generated.
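Conditions (0)-(5) and the 15-second division rule can be sketched as follows. The event labels (`"preset"`, `"switch_camera"`, and so on) are hypothetical names for the notifications from the camera control unit 505; one entry in the event list stands for one one-second evaluation period.

```python
def update_priority(pri, event=None):
    """One periodic evaluation step, per conditions (0)-(5) above."""
    if event in ("preset", "home", "change_server"):  # condition (1): +3
        pri += 3
    elif event == "switch_camera":                    # condition (2): +2
        pri += 2
    elif event == "ptz":                              # condition (3): +1
        pri += 1
    else:                                             # condition (4): decay by 0.9
        pri *= 0.9
    return min(pri, 3)                                # condition (5): cap at 3

def split_points(events, window=15):
    """Divide the clip at the highest evaluation value in each window."""
    pri, values = 0.0, []
    for ev in events:                                 # one entry per second
        pri = update_priority(pri, ev)
        values.append(pri)
    cuts = []
    for start in range(0, len(values), window):
        chunk = values[start:start + window]
        cuts.append(start + chunk.index(max(chunk)))  # FIG. 14: pick the peak
    return cuts
```

Because the value decays by 0.9 per quiet second, a division point gravitates toward the most recent camera-control event within each 15-second window.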
  • [0131] The picture clip can also be divided according to another method. For example, information may be detected which indicates that the camera control information corresponding to the picture data is moving toward a preset position, or significant changes may be detected in the pan/tilt/zoom within a given duration of time (e.g., changes equal to 20% or more of the controllable range), and the picture clip can be divided at such points.
  • [0132] The picture transmission unit 508 receives the camera control status information obtained at the picture obtaining unit 506 and saves this information in chronological order. The camera control status information refers to information indicating when and how the camera was manipulated while capturing pictures. The camera control sequence (PTZ sequence) information corresponding to the camera control status information is generated from the camera control status information for a predetermined period of time. This PTZ sequence serves as a parameter of the information which is incorporated into the picture clip as the link information.
  • [0133] Typically, the PTZ sequence corresponding to the camera control status information is used as the parameter. Furthermore, in a case where the initial value of the PTZ sequence is designated, or when it matches a preset position, a home position, or the like, the absolute values of the pan, tilt, and zoom are incorporated into the picture clip as the link information.
  • [0134] As shown in FIG. 14, in this embodiment, duplicate camera control sequences are redundantly assigned to the link information for a given picture data section (picture segment).
  • [0135] The PTZ sequence may be constituted as the camera sequence information that is incorporated into the picture clip. This may include designations for multiple camera servers. For example, the following type of PTZ sequence is possible. (Here, the “Cn” included in the PTZ sequence indicates that the connection switches to the n-th camera server.)
  • cameraservers=(webview://cam1.univ.ac.jp:34560+34561+34562)(webview://cam2.univ.ac.jp:34560+34561+34562)(webview://cam3.univ.ac.jp:34560+34561+34562)
  • PTZ=C1HZ1530C2S340C3S440P−1020
  • [0136] Furthermore, it is also possible to weave a designation of the camera server into the PTZ sequence itself, for example, as shown below. (Here, the “C(x)” included in the PTZ sequence indicates that the connection switches to the camera server indicated by x.)
  • [0137] cameraservers= PTZ=C(webview://cam1.univ.ac.jp:34560+34561+34562)HZ1530C(webview://cam2.univ.ac.jp:34560+34561+34562)S340C(webview://cam3.univ.ac.jp:34560+34561+34562)S440P−1020
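Only the “Cn” switch token is defined above; the grammar of the remaining tokens (H, Z, S, P, …) is not spelled out here. Treating them as opaque, a sequence such as PTZ=C1HZ1530C2S340C3S440P-1020 can be split into per-camera-server sub-sequences as follows — a sketch, not the patent's actual interpreter.

```python
import re

def split_by_camera(ptz_sequence):
    """Split a PTZ sequence into (camera index, sub-sequence) pairs.

    Only the 'Cn' switch token is interpreted; the meaning of the
    remaining tokens is left opaque and passed through unchanged.
    """
    parts = re.split(r"C(\d+)", ptz_sequence)
    # re.split with a capture group yields e.g. ['', '1', 'HZ1530', '2', ...]
    pairs = []
    for i in range(1, len(parts) - 1, 2):
        pairs.append((int(parts[i]), parts[i + 1]))
    return pairs
```

The camera control unit could then dispatch each sub-sequence to the camera server its index designates in the “cameraservers=” list.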
  • [0138] Then, when a notification is received from the camera control unit 505 indicating that the PTZ sequence is finished, the remainder of the picture data being saved in the memory area is also saved as a file. After that, notification information to be sent to the mobile telephone terminal 107 is created. Embedded in this notification information is link information with links to the multiple picture clips that have been saved so far. The notification information is then sent to the notification destination address received during initialization. Accordingly, the mobile telephone terminal 107 that received the notification information can now send a download request for each of the picture clips.
  • [0139] The picture transmission unit 508 has an HTTP server function and handles HTTP picture clip download requests from the mobile telephone terminal 107.
  • [0140] Then, the camera control unit 505 interprets the PTZ sequence and creates the camera control commands to be sent to the camera server 101, and the camera control commands thus created are sent to the camera server 101 according to the timing indicated in the PTZ sequence. In this way, camera control at the camera server 101 is performed. Then, when interpretation of the PTZ sequence finishes, a notification is sent to the picture transmission unit 508 to indicate that the PTZ sequence is finished.
  • [0141] According to the construction described above, the user playing back the picture clip on the mobile telephone terminal 107 can use the camera control status information from the time when the picture clip was made, owing to the function of the picture conversion server.
  • [0142] According to the foregoing explanations, the live picture sent from the camera server arranged on the network can be converted into a picture clip for the mobile telephone terminal and provided to the user. In particular, the picture conversion server of this embodiment is characterized by generating a picture clip that reflects the camera control status information of the camera server.
  • [0143] In accordance with this embodiment, the picture conversion server 105 divides the picture clip. Alternatively, the picture data and the information relating to the evaluation value of the priority level (see FIG. 14) can be sent to the mobile telephone terminal 107, and the mobile telephone terminal 107 can generate the picture clip.
  • [0144] This embodiment is explained using an example in which the picture conversion server is arranged on the network independently from the gateway which connects the mobile line network and the network. However, the picture conversion server 105 can also be arranged as one portion of the gateway 106. A connection configuration is also conceivable in which the picture conversion server 105 and the gateway 106 are connected by a dedicated line, which could include a Virtual Private Network (VPN) or the like.
  • [0145] Furthermore, this embodiment is explained using an example in which the mobile telephone terminal 107 and the picture conversion server 105 communicate with each other using HTTP. However, they may also communicate using the Simple Mail Transfer Protocol (SMTP).
  • [0146] This embodiment was explained using an example in which the mobile telephone terminal from which the camera control is performed and the terminal which requests the picture clip are the same terminal. However, the user who issues the camera control command may be a different user. For example, a configuration may be used in which the picture clip is accessed from a terminal to which the picture clip has been forwarded. When this configuration is used, the application for controlling the camera becomes unnecessary.
  • [0147] This embodiment was explained using an example in which the link information containing the camera control sequence is generated at pre-set cycles and incorporated into the picture clip. However, the timing for generating the link information is not restricted to pre-set cycles. For example, the link information may be generated when changes in the camera control status information accumulate and reach a value which satisfies predetermined conditions. Alternatively, the link information may be generated when changes in the picture data itself (e.g., changes in the number of optically recognized objects) satisfy predetermined conditions.
  • [0148] Furthermore, the camera control sequence, which is generated at the picture transmission unit in this embodiment, is generated based on the camera status information obtained from the picture obtaining unit. However, it may also be generated by cutting out a partial sequence from the PTZ sequence that is interpreted by the camera control unit.
  • [0149] When this method is used, it becomes possible to obtain a camera control sequence closer to that intended by the user who requested the creation of the picture clip. However, when the sequence includes a camera control for a prohibited area, for example, a camera control sequence will be generated which is not synchronized with the picture data. Thus, this method has advantages and disadvantages.
  • [0150] The picture transmission unit of this embodiment was explained using an example in which the camera control sequence during the adjacent picture data section is assigned to the link information for a given picture data section within the picture clip. However, the length (or time) of the assigned camera control sequence does not have to depend upon the chronological length of the picture data section. Typically, it is a fixed length (time) which is pre-set in the picture conversion server. Moreover, the length (time) of the assigned camera control sequence may also depend upon the occurrence of various events.
  • [0151] This embodiment was explained using an example in which the picture obtained from the camera server is converted into a picture clip suitable for the mobile telephone terminal and then transmitted. There is no restriction as to the format of the picture clip. For example, when the picture transmission unit of this embodiment generates the picture clip, it can arrange the data in a data format conforming to Microsoft, Inc.'s ASF format and, within that format, designate the MPEG-4 codec, so that the data can be played back in Windows(R) MediaPlayer, which also supports the MPEG-4 codec. Similarly, by arranging the data to conform to Apple, Inc.'s QuickTime File Format, the data can also be used with the QuickTimePlayer.
  • [0152] It is also possible to perform the control of the camera by referencing a management table containing the control information history, which is managed inside the picture conversion server 105. When the control is performed in this way, the picture conversion unit in Embodiment 2 first saves the camera status information received from the camera server 101 in chronological sequence, and then, based on this information, generates the camera control sequence (PTZ sequence) corresponding to the camera control status information. In this processing, the generated PTZ sequence is not incorporated into the picture clip directly as the link information. Instead, the generated PTZ sequence is stored in the control history management table of the camera inside the picture conversion server, such as shown in FIG. 15, and reference information for the control history management table is incorporated into the picture clip as the link information. Identifiers (i−1, i, i+1, . . . ) are assigned to entries in the control history management table, and these identifiers serve as the reference information for the control history management table.
  • [0153] The assigned identifiers may be synthesized from a camera server identifier (typically an IP address) and an indication of the time at which the PTZ sequence was generated. Alternatively, the identifiers may be serial numbers assigned sequentially.
  • [0154] In the case where the mobile telephone terminal 107 requests that the camera control information be referenced, the identifier serves as a key to search inside the control history management table, and the camera control is then performed based on the PTZ sequence that is found.
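A minimal sketch of the control history management table of FIG. 15: entries are keyed by an identifier synthesized from the camera server's address and the generation time, as described above. The class and method names, and the `ip@time` identifier format, are hypothetical illustrations.

```python
class ControlHistoryTable:
    """Sketch of the FIG. 15 control history management table."""

    def __init__(self):
        self._entries = {}

    @staticmethod
    def make_identifier(server_ip, generated_at):
        # Synthesize the identifier from the camera-server address and the
        # time at which the PTZ sequence was generated (format is illustrative).
        return f"{server_ip}@{generated_at}"

    def store(self, server_ip, generated_at, ptz_sequence):
        ident = self.make_identifier(server_ip, generated_at)
        self._entries[ident] = ptz_sequence
        return ident   # incorporated into the picture clip as the link information

    def lookup(self, ident):
        # The identifier sent by the terminal is the search key.
        return self._entries.get(ident)
```

The picture clip carries only the identifier returned by `store`; when the terminal sends it back, `lookup` recovers the PTZ sequence on which the camera control is based.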
  • [0155] According to the construction described above, the user of the mobile telephone terminal 107 playing back the picture clip can use the camera control status information from the time when the picture clip was made. Furthermore, by referencing the control history management table, which is managed inside the picture conversion server, it also becomes possible to reference the camera control information from time frames adjacent to the time frame of the generated picture clip. By tracing the identifiers (the reference information for the control history management table) contained in the picture clip, the controls performed before and after the picture clip can also be obtained, like pulling up a string of potatoes, as it were.
  • [0156] A configuration is also possible in which the camera status information itself from the time frame in question is stored in the picture clip, and then, based on that camera status information, the camera control information is generated on the mobile telephone terminal 107 side and serves as the control request for the camera 301 of the camera server 101. This configuration alleviates the burden on the picture conversion server 105.
  • [0157] The objectives of the present invention can also be achieved when the picture conversion server 105 and the camera server 101 are constructed as a single, integrated device.

Claims (25)

What is claimed is:
1. A method of generating moving picture information to distribute to a terminal device, the method including:
receiving moving picture data taken by a camera, and control sequence information of operations performed by the camera;
generating a moving picture file from the received moving picture data;
incorporating, into the moving picture file, the control sequence information of the camera corresponding to the generated moving picture file, and information relating to an address of the camera; and
distributing, to the terminal device, the moving picture file with the control sequence information of the camera and link information of the camera incorporated in the incorporating.
2. The method according to claim 1, wherein the control sequence information from the camera includes at least one of a pan operation, a tilt operation, and a zoom operation performed by the camera.
3. The method according to claim 1, wherein in the generating, the moving picture file is generated by dividing the received moving picture data.
4. A moving picture information distribution apparatus, including:
a communications device for receiving moving picture data taken by a camera, and control sequence information of operations performed by the camera; and
a file generation device for generating a moving picture file from the received moving picture data by incorporating into the moving picture file the camera control sequence information corresponding to the generated moving picture file, and information relating to an address of the camera.
5. The apparatus according to claim 4, wherein the camera control sequence information includes at least one of a pan operation, a tilt operation, and a zoom operation performed by the camera.
6. The apparatus according to claim 4, wherein in the file generation device, the moving picture file is generated by dividing the received moving picture data.
7. The apparatus according to claim 4, wherein the camera is incorporated integrally into the distribution device.
8. A computer readable medium which stores a program for executing a distribution method comprising:
receiving moving picture data taken by a camera, and control sequence information of operations performed by the camera;
generating a moving picture file from the received moving picture data;
incorporating, into the moving picture file, the control sequence information of the camera corresponding to the generated moving picture file, and information relating to an address of the camera; and
distributing, to the terminal device, the moving picture file with the control sequence information of the camera and link information of the camera incorporated in the incorporating.
9. The medium according to claim 8, wherein the control sequence information from the camera includes at least one of a pan operation, a tilt operation, and a zoom operation performed by the camera.
10. The medium according to claim 8, wherein in the generating, the moving picture file is generated by dividing the received moving picture data.
11. A method of generating a moving picture file, the method including:
obtaining moving picture data taken by a camera, and information relating to the camera corresponding to the moving picture data; and
generating a moving picture file by dividing the moving picture data, based on the information relating to the camera.
12. The method according to claim 11, wherein the information relating to the camera is information relating to a range where the camera is prohibited from capturing pictures.
13. The method according to claim 11, wherein the information relating to the camera is information relating to switching of the camera.
14. The method according to claim 11, wherein the information relating to the camera is information relating to operations of the camera.
15. The method according to claim 14, wherein the operation information of the camera is one of information relating to a change amount per unit time and movement information indicating movement toward a pre-set position.
16. An apparatus for generating a moving picture file, comprising:
an obtaining device for obtaining moving picture data taken by a camera, and information relating to the camera corresponding to the moving picture data; and
a generating device for generating a moving picture file by dividing the moving picture data, based on the information relating to the camera.
17. The apparatus according to claim 16, wherein the information relating to the camera is information relating to a range where the camera is prohibited from capturing pictures.
18. The apparatus according to claim 16, wherein the information relating to the camera is information relating to switching of the camera.
19. The apparatus according to claim 16, wherein the information relating to the camera is information relating to operations of the camera.
20. The apparatus according to claim 19, wherein the operation information of the camera is one of information relating to a change amount per unit time and movement information indicating movement toward a pre-set position.
21. A computer readable medium which stores a program for executing a method of generating a moving picture file, the method comprising:
obtaining moving picture data taken by a camera, and information relating to the camera corresponding to the moving picture data; and
generating a moving picture file by dividing the moving picture data, based on the information relating to the camera.
22. The medium according to claim 21, wherein the information relating to the camera is information relating to a range where the camera is prohibited from capturing pictures.
23. The medium according to claim 21, wherein the information relating to the camera is information relating to switching of the camera.
24. The medium according to claim 21, wherein the information relating to the camera is information relating to operations of the camera.
25. The medium according to claim 24, wherein the operation information of the camera is one of information relating to a change amount per unit time and movement information indicating movement toward a pre-set position.
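The claimed method can be pictured concretely: frames from a camera are split into separate moving picture files at camera-related events (for example, switching between cameras), while each file carries the camera's address and the control sequence (pan, tilt, zoom) that applied during its span. The following Python sketch is purely illustrative and is not the patented implementation; all names (`Frame`, `CameraEvent`, `Segment`, `build_segments`) and the `camera://` address are hypothetical.

```python
# Illustrative sketch only: divide a stream of timestamped frames into
# moving picture "files" (segments) at chosen camera control events, and
# record in each segment the camera's address plus the control sequence
# that occurred during it. Names and address scheme are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Frame:
    t: float          # capture time in seconds
    data: bytes       # encoded picture data

@dataclass
class CameraEvent:
    t: float          # time the control operation was issued
    op: str           # "pan", "tilt", "zoom", or "switch"

@dataclass
class Segment:
    camera_address: str                                  # link back to the camera
    frames: List[Frame] = field(default_factory=list)
    control_sequence: List[CameraEvent] = field(default_factory=list)

def build_segments(frames, events, camera_address, split_ops=frozenset({"switch"})):
    """Start a new segment whenever an event in split_ops occurs;
    every event is recorded in the current segment's control sequence."""
    segments = [Segment(camera_address)]
    ev = sorted(events, key=lambda e: e.t)
    i = 0
    for frame in sorted(frames, key=lambda f: f.t):
        # apply all control events that happened up to this frame
        while i < len(ev) and ev[i].t <= frame.t:
            if ev[i].op in split_ops and segments[-1].frames:
                segments.append(Segment(camera_address))
            segments[-1].control_sequence.append(ev[i])
            i += 1
        segments[-1].frames.append(frame)
    return segments

frames = [Frame(t, b"") for t in (0.0, 1.0, 2.0, 3.0)]
events = [CameraEvent(0.5, "pan"), CameraEvent(1.5, "switch")]
segs = build_segments(frames, events, "camera://192.0.2.1")
```

Here the "switch" event at t=1.5 closes the first file and opens the second, so the four frames are divided two-and-two, matching claims 13 and 18 (division based on camera switching); choosing `split_ops={"pan", "tilt", "zoom"}` instead would divide on camera operations as in claims 14 and 19.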
US10/810,703 2003-03-31 2004-03-29 Method of generating moving picture information Abandoned US20040189871A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2003-097074 2003-03-31
JP2003097074A JP4401672B2 (en) 2003-03-31 2003-03-31 Information processing apparatus, information processing method, and program
JP2003125334A JP4250449B2 (en) 2003-04-30 2003-04-30 Video communication system, video communication apparatus, terminal, and camera control method for the terminal
JP2003-125334 2003-04-30
JP2003-134021 2003-05-13
JP2003134021A JP2004343175A (en) 2003-05-13 2003-05-13 Video relaying apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/329,432 US8692897B2 (en) 2003-03-31 2011-12-19 Method of generating moving picture information

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/329,432 Continuation US8692897B2 (en) 2003-03-31 2011-12-19 Method of generating moving picture information

Publications (1)

Publication Number Publication Date
US20040189871A1 true US20040189871A1 (en) 2004-09-30

Family

ID=32995628

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/810,703 Abandoned US20040189871A1 (en) 2003-03-31 2004-03-29 Method of generating moving picture information
US13/329,432 Active US8692897B2 (en) 2003-03-31 2011-12-19 Method of generating moving picture information

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/329,432 Active US8692897B2 (en) 2003-03-31 2011-12-19 Method of generating moving picture information

Country Status (2)

Country Link
US (2) US20040189871A1 (en)
CN (1) CN100493177C (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060109354A1 (en) * 2004-11-23 2006-05-25 Samsung Electronics Co.,Ltd. Mobile communication terminal for controlling a zoom function and a method thereof
US20090077622A1 (en) * 2005-03-16 2009-03-19 Marc Baum Security Network Integrated With Premise Security System
US20100277600A1 (en) * 2007-12-17 2010-11-04 Electronics And Telecommunications Research Institute System and method for image information processing
US9269398B2 (en) * 2004-07-12 2016-02-23 Koninklijke Philips N.V. Content with navigation support
US9287727B1 (en) 2013-03-15 2016-03-15 Icontrol Networks, Inc. Temporal voltage adaptive lithium battery charger
US9306809B2 (en) 2007-06-12 2016-04-05 Icontrol Networks, Inc. Security system with networked touchscreen
US9349276B2 (en) 2010-09-28 2016-05-24 Icontrol Networks, Inc. Automated reporting of account and sensor information
US9412248B1 (en) 2007-02-28 2016-08-09 Icontrol Networks, Inc. Security, monitoring and automation controller access and use of legacy security control panel information
US9426720B2 (en) 2009-04-30 2016-08-23 Icontrol Networks, Inc. Controller and interface for home security, monitoring and automation having customizable audio alerts for SMA events
US9450776B2 (en) 2005-03-16 2016-09-20 Icontrol Networks, Inc. Forming a security network including integrated security system components
US9510065B2 (en) 2007-04-23 2016-11-29 Icontrol Networks, Inc. Method and system for automatically providing alternate network access for telecommunications
US9531593B2 (en) 2007-06-12 2016-12-27 Icontrol Networks, Inc. Takeover processes in security network integrated with premise security system
US9609003B1 (en) 2007-06-12 2017-03-28 Icontrol Networks, Inc. Generating risk profile using data of home monitoring and security system
US9621408B2 (en) 2006-06-12 2017-04-11 Icontrol Networks, Inc. Gateway registry methods and systems
US9628440B2 (en) 2008-11-12 2017-04-18 Icontrol Networks, Inc. Takeover processes in security network integrated with premise security system
US9729342B2 (en) 2010-12-20 2017-08-08 Icontrol Networks, Inc. Defining and implementing sensor triggered response rules
US9867143B1 (en) 2013-03-15 2018-01-09 Icontrol Networks, Inc. Adaptive Power Modulation
US9928975B1 (en) 2013-03-14 2018-03-27 Icontrol Networks, Inc. Three-way switch
US10051078B2 (en) 2007-06-12 2018-08-14 Icontrol Networks, Inc. WiFi-to-serial encapsulation in systems
US10062273B2 (en) 2010-09-28 2018-08-28 Icontrol Networks, Inc. Integrated security system with parallel processing architecture
US10062245B2 (en) 2005-03-16 2018-08-28 Icontrol Networks, Inc. Cross-client sensor user interface in an integrated security network
US10078958B2 (en) 2010-12-17 2018-09-18 Icontrol Networks, Inc. Method and system for logging security event data
US10079839B1 (en) 2007-06-12 2018-09-18 Icontrol Networks, Inc. Activation of gateway device
US10091014B2 (en) 2005-03-16 2018-10-02 Icontrol Networks, Inc. Integrated security network with security alarm signaling system
US10127801B2 (en) 2005-03-16 2018-11-13 Icontrol Networks, Inc. Integrated security system with parallel processing architecture
US10142392B2 (en) 2007-01-24 2018-11-27 Icontrol Networks, Inc. Methods and systems for improved system performance
US10156959B2 (en) 2005-03-16 2018-12-18 Icontrol Networks, Inc. Cross-client sensor user interface in an integrated security network
US10156831B2 (en) 2004-03-16 2018-12-18 Icontrol Networks, Inc. Automation system with mobile interface
US10200504B2 (en) 2007-06-12 2019-02-05 Icontrol Networks, Inc. Communication protocols over internet protocol (IP) networks
US10237237B2 (en) 2007-06-12 2019-03-19 Icontrol Networks, Inc. Communication protocols in integrated systems
US10313303B2 (en) 2007-06-12 2019-06-04 Icontrol Networks, Inc. Forming a security network including integrated security system components and network devices
US10339791B2 (en) 2007-06-12 2019-07-02 Icontrol Networks, Inc. Security network integrated with premise security system
US10348575B2 (en) 2013-06-27 2019-07-09 Icontrol Networks, Inc. Control system user interface
US10365810B2 (en) 2015-05-05 2019-07-30 Icontrol Networks, Inc. Control system user interface

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108495052A (en) 2013-10-11 2018-09-04 奥林巴斯株式会社 Imaging device, imaging system, and imaging method

Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5581362A (en) * 1993-08-04 1996-12-03 Matsushita Electric Industrial Co., Ltd. Video camera system which multiplexes internal, external, and sensing parameters onto the video signal in order to identify individual segments
JPH10136247A (en) * 1996-10-25 1998-05-22 Canon Inc Camera control system, camera server, control method therefor and storage medium
US5986695A (en) * 1996-07-27 1999-11-16 Samsung Electronics Co., Ltd. Recording method and apparatus for conserving space on recording medium of security system
US5996023A (en) * 1996-10-31 1999-11-30 Sensormatic Electronics Corporation Efficient pre-alarm buffer management in intelligent video information management system
US6002995A (en) * 1995-12-19 1999-12-14 Canon Kabushiki Kaisha Apparatus and method for displaying control information of cameras connected to a network
US6208379B1 (en) * 1996-02-20 2001-03-27 Canon Kabushiki Kaisha Camera display control and monitoring system
US20010010549A1 (en) * 1997-01-27 2001-08-02 Fuji Photo Film Co., Ltd. Camera which records positional data of GPS unit
US6414716B1 (en) * 1996-11-29 2002-07-02 Canon Kabushiki Kaisha Method and apparatus for controlling an imaging apparatus, imaging operation control system, and storage medium storing a program implementing such a method
US20020146238A1 (en) * 2001-04-10 2002-10-10 Takayuki Sugahara Video signal recording method, video signal reproduction method, video signal recording apparatus, video signal reproducing apparatus, and video signal recording medium
US6469737B1 (en) * 1996-07-23 2002-10-22 Canon Kabushiki Kaisha Image-sensing server and control method and storage medium therefor
US20020154070A1 (en) * 2001-03-13 2002-10-24 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and control program
US20020175995A1 (en) * 2001-05-26 2002-11-28 Marc Sleeckx Video surveillance system
US20030078038A1 (en) * 2001-09-28 2003-04-24 Takahiro Kurosawa Communication apparatus and control method therefor, information apparatus and control method therefor, communication system, and control programs
US20030093810A1 (en) * 2001-10-30 2003-05-15 Koji Taniguchi Video data transmitting/receiving method and video monitor system
US20030214582A1 (en) * 2002-05-08 2003-11-20 Kazunori Takahashi Video delivery apparatus and video information delivery system
US6698021B1 (en) * 1999-10-12 2004-02-24 Vigilos, Inc. System and method for remote control of surveillance devices
US6738572B2 (en) * 2001-02-03 2004-05-18 Hewlett-Packard Development Company, L.P. Function disabling system for a camera used in a restricted area
US6741790B1 (en) * 1997-05-29 2004-05-25 Red Hen Systems, Inc. GPS video mapping system
US20040109071A1 (en) * 2002-12-05 2004-06-10 Minolta Co., Ltd. Image capturing apparatus
US20040125877A1 (en) * 2000-07-17 2004-07-01 Shin-Fu Chang Method and system for indexing and content-based adaptive streaming of digital video content
US6807361B1 (en) * 2000-07-18 2004-10-19 Fuji Xerox Co., Ltd. Interactive custom video creation system
US6999613B2 (en) * 2001-12-28 2006-02-14 Koninklijke Philips Electronics N.V. Video monitoring and surveillance systems capable of handling asynchronously multiplexed video
US7046273B2 (en) * 2001-07-02 2006-05-16 Fuji Photo Film Co., Ltd System and method for collecting image information

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3536348B2 (en) 1993-08-04 2004-06-07 松下電器産業株式会社 Video camera system
AU7921094A (en) 1993-10-20 1995-05-08 Videoconferencing Systems, Inc. Adaptive videoconferencing system
JP3247307B2 (en) 1996-11-01 2002-01-15 松下電器産業株式会社 Information providing system
JPH1185654A (en) * 1997-09-12 1999-03-30 Matsushita Electric Ind Co Ltd Virtual www server device and camera controllable www server device
JP2000041206A (en) 1998-07-22 2000-02-08 Sony Corp Camera system and image edit method
CN1278980A (en) 1998-09-18 2001-01-03 三菱电机株式会社 Pick-up camera control system
JP2001218194A (en) 1999-11-15 2001-08-10 Canon Inc Control method for image pickup unit and image distributing system, controller for image pickup unit, system and device for distributing image and device and method for distributing data
JP4214346B2 (en) 2000-07-31 2009-01-28 富士フイルム株式会社 Communication equipment, servers, service providing apparatus, the service system and service method and service menu providing method and service menu providing system
JP2002077879A (en) 2000-08-30 2002-03-15 Toshiba Corp Image-monitoring method and system
JP2002084445A (en) 2000-09-08 2002-03-22 Canon Inc Method for controlling camera through the use of portable phone
US7373391B2 (en) * 2000-10-24 2008-05-13 Seiko Epson Corporation System and method for digital content distribution
JP4664527B2 (en) 2001-05-24 2011-04-06 株式会社日立国際電気 Video distribution system, video delivery method
JP2002354328A (en) * 2001-05-30 2002-12-06 Minolta Co Ltd Image-photographing device, operation device and image- photographing system thereof
JP2003037836A (en) 2001-07-24 2003-02-07 Ntt Docomo Inc Picture distributing method, system thereof computer- readable recording medium and computer program
JP4276393B2 (en) 2001-07-30 2009-06-10 日本放送協会 Program production support device and program production support program

Patent Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5581362A (en) * 1993-08-04 1996-12-03 Matsushita Electric Industrial Co., Ltd. Video camera system which multiplexes internal, external, and sensing parameters onto the video signal in order to identify individual segments
US6002995A (en) * 1995-12-19 1999-12-14 Canon Kabushiki Kaisha Apparatus and method for displaying control information of cameras connected to a network
US6208379B1 (en) * 1996-02-20 2001-03-27 Canon Kabushiki Kaisha Camera display control and monitoring system
US6469737B1 (en) * 1996-07-23 2002-10-22 Canon Kabushiki Kaisha Image-sensing server and control method and storage medium therefor
US5986695A (en) * 1996-07-27 1999-11-16 Samsung Electronics Co., Ltd. Recording method and apparatus for conserving space on recording medium of security system
JPH10136247A (en) * 1996-10-25 1998-05-22 Canon Inc Camera control system, camera server, control method therefor and storage medium
US5996023A (en) * 1996-10-31 1999-11-30 Sensormatic Electronics Corporation Efficient pre-alarm buffer management in intelligent video information management system
US6414716B1 (en) * 1996-11-29 2002-07-02 Canon Kabushiki Kaisha Method and apparatus for controlling an imaging apparatus, imaging operation control system, and storage medium storing a program implementing such a method
US20010010549A1 (en) * 1997-01-27 2001-08-02 Fuji Photo Film Co., Ltd. Camera which records positional data of GPS unit
US6741790B1 (en) * 1997-05-29 2004-05-25 Red Hen Systems, Inc. GPS video mapping system
US6698021B1 (en) * 1999-10-12 2004-02-24 Vigilos, Inc. System and method for remote control of surveillance devices
US20040125877A1 (en) * 2000-07-17 2004-07-01 Shin-Fu Chang Method and system for indexing and content-based adaptive streaming of digital video content
US6807361B1 (en) * 2000-07-18 2004-10-19 Fuji Xerox Co., Ltd. Interactive custom video creation system
US6738572B2 (en) * 2001-02-03 2004-05-18 Hewlett-Packard Development Company, L.P. Function disabling system for a camera used in a restricted area
US20020154070A1 (en) * 2001-03-13 2002-10-24 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and control program
US20020146238A1 (en) * 2001-04-10 2002-10-10 Takayuki Sugahara Video signal recording method, video signal reproduction method, video signal recording apparatus, video signal reproducing apparatus, and video signal recording medium
US20020175995A1 (en) * 2001-05-26 2002-11-28 Marc Sleeckx Video surveillance system
US7046273B2 (en) * 2001-07-02 2006-05-16 Fuji Photo Film Co., Ltd System and method for collecting image information
US20030078038A1 (en) * 2001-09-28 2003-04-24 Takahiro Kurosawa Communication apparatus and control method therefor, information apparatus and control method therefor, communication system, and control programs
US20030093810A1 (en) * 2001-10-30 2003-05-15 Koji Taniguchi Video data transmitting/receiving method and video monitor system
US6999613B2 (en) * 2001-12-28 2006-02-14 Koninklijke Philips Electronics N.V. Video monitoring and surveillance systems capable of handling asynchronously multiplexed video
US20030214582A1 (en) * 2002-05-08 2003-11-20 Kazunori Takahashi Video delivery apparatus and video information delivery system
US20040109071A1 (en) * 2002-12-05 2004-06-10 Minolta Co., Ltd. Image capturing apparatus

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10156831B2 (en) 2004-03-16 2018-12-18 Icontrol Networks, Inc. Automation system with mobile interface
US10142166B2 (en) 2004-03-16 2018-11-27 Icontrol Networks, Inc. Takeover of security network
US9269398B2 (en) * 2004-07-12 2016-02-23 Koninklijke Philips N.V. Content with navigation support
US20060109354A1 (en) * 2004-11-23 2006-05-25 Samsung Electronics Co.,Ltd. Mobile communication terminal for controlling a zoom function and a method thereof
US9450776B2 (en) 2005-03-16 2016-09-20 Icontrol Networks, Inc. Forming a security network including integrated security system components
US8478844B2 (en) * 2005-03-16 2013-07-02 Icontrol Networks, Inc. Forming a security network including integrated security system components and network devices
US8473619B2 (en) * 2005-03-16 2013-06-25 Icontrol Networks, Inc. Security network integrated with premise security system
US10127801B2 (en) 2005-03-16 2018-11-13 Icontrol Networks, Inc. Integrated security system with parallel processing architecture
US10062245B2 (en) 2005-03-16 2018-08-28 Icontrol Networks, Inc. Cross-client sensor user interface in an integrated security network
US20090077624A1 (en) * 2005-03-16 2009-03-19 Marc Baum Forming A Security Network Including Integrated Security System Components and Network Devices
US10156959B2 (en) 2005-03-16 2018-12-18 Icontrol Networks, Inc. Cross-client sensor user interface in an integrated security network
US20090077622A1 (en) * 2005-03-16 2009-03-19 Marc Baum Security Network Integrated With Premise Security System
US10091014B2 (en) 2005-03-16 2018-10-02 Icontrol Networks, Inc. Integrated security network with security alarm signaling system
US9621408B2 (en) 2006-06-12 2017-04-11 Icontrol Networks, Inc. Gateway registry methods and systems
US10225314B2 (en) 2007-01-24 2019-03-05 Icontrol Networks, Inc. Methods and systems for improved system performance
US10142392B2 (en) 2007-01-24 2018-11-27 Icontrol Networks, Inc. Methods and systems for improved system performance
US9412248B1 (en) 2007-02-28 2016-08-09 Icontrol Networks, Inc. Security, monitoring and automation controller access and use of legacy security control panel information
US9510065B2 (en) 2007-04-23 2016-11-29 Icontrol Networks, Inc. Method and system for automatically providing alternate network access for telecommunications
US10140840B2 (en) 2007-04-23 2018-11-27 Icontrol Networks, Inc. Method and system for providing alternate network access
US9609003B1 (en) 2007-06-12 2017-03-28 Icontrol Networks, Inc. Generating risk profile using data of home monitoring and security system
US10200504B2 (en) 2007-06-12 2019-02-05 Icontrol Networks, Inc. Communication protocols over internet protocol (IP) networks
US9531593B2 (en) 2007-06-12 2016-12-27 Icontrol Networks, Inc. Takeover processes in security network integrated with premise security system
US10051078B2 (en) 2007-06-12 2018-08-14 Icontrol Networks, Inc. WiFi-to-serial encapsulation in systems
US10313303B2 (en) 2007-06-12 2019-06-04 Icontrol Networks, Inc. Forming a security network including integrated security system components and network devices
US10339791B2 (en) 2007-06-12 2019-07-02 Icontrol Networks, Inc. Security network integrated with premise security system
US10142394B2 (en) 2007-06-12 2018-11-27 Icontrol Networks, Inc. Generating risk profile using data of home monitoring and security system
US10079839B1 (en) 2007-06-12 2018-09-18 Icontrol Networks, Inc. Activation of gateway device
US9306809B2 (en) 2007-06-12 2016-04-05 Icontrol Networks, Inc. Security system with networked touchscreen
US10237237B2 (en) 2007-06-12 2019-03-19 Icontrol Networks, Inc. Communication protocols in integrated systems
US8495690B2 (en) * 2007-12-17 2013-07-23 Electronics And Telecommunications Research Institute System and method for image information processing using unique IDs
US20100277600A1 (en) * 2007-12-17 2010-11-04 Electronics And Telecommunications Research Institute System and method for image information processing
US9628440B2 (en) 2008-11-12 2017-04-18 Icontrol Networks, Inc. Takeover processes in security network integrated with premise security system
US10332363B2 (en) 2009-04-30 2019-06-25 Icontrol Networks, Inc. Controller and interface for home security, monitoring and automation having customizable audio alerts for SMA events
US10275999B2 (en) 2009-04-30 2019-04-30 Icontrol Networks, Inc. Server-based notification of alarm event subsequent to communication failure with armed security system
US10237806B2 (en) 2009-04-30 2019-03-19 Icontrol Networks, Inc. Activation of a home automation controller
US9426720B2 (en) 2009-04-30 2016-08-23 Icontrol Networks, Inc. Controller and interface for home security, monitoring and automation having customizable audio alerts for SMA events
US10062273B2 (en) 2010-09-28 2018-08-28 Icontrol Networks, Inc. Integrated security system with parallel processing architecture
US9349276B2 (en) 2010-09-28 2016-05-24 Icontrol Networks, Inc. Automated reporting of account and sensor information
US10223903B2 (en) 2010-09-28 2019-03-05 Icontrol Networks, Inc. Integrated security system with parallel processing architecture
US10127802B2 (en) 2010-09-28 2018-11-13 Icontrol Networks, Inc. Integrated security system with parallel processing architecture
US10078958B2 (en) 2010-12-17 2018-09-18 Icontrol Networks, Inc. Method and system for logging security event data
US9729342B2 (en) 2010-12-20 2017-08-08 Icontrol Networks, Inc. Defining and implementing sensor triggered response rules
US9928975B1 (en) 2013-03-14 2018-03-27 Icontrol Networks, Inc. Three-way switch
US9867143B1 (en) 2013-03-15 2018-01-09 Icontrol Networks, Inc. Adaptive Power Modulation
US9287727B1 (en) 2013-03-15 2016-03-15 Icontrol Networks, Inc. Temporal voltage adaptive lithium battery charger
US10117191B2 (en) 2013-03-15 2018-10-30 Icontrol Networks, Inc. Adaptive power modulation
US10348575B2 (en) 2013-06-27 2019-07-09 Icontrol Networks, Inc. Control system user interface
US10365810B2 (en) 2015-05-05 2019-07-30 Icontrol Networks, Inc. Control system user interface

Also Published As

Publication number Publication date
CN1578453A (en) 2005-02-09
CN100493177C (en) 2009-05-27
US8692897B2 (en) 2014-04-08
US20120086820A1 (en) 2012-04-12

Similar Documents

Publication Publication Date Title
US7523481B2 (en) Integrated internet camera
US7423670B2 (en) Remote control of image pickup apparatus
JP4286267B2 (en) Integrated Delivery Architecture multipoint video conferencing and interactive broadcasting system
DE69906711T2 (en) Retrieval of images from a portable digital camera via the Internet
US20180316969A1 (en) Methods and apparatus for selecting digital access technology for programming and data delivery
US20080126943A1 (en) System and method for recording a presentation for on-demand viewing over a computer network
US8744523B2 (en) Method and system for interactive home monitoring
AU2003212488B2 (en) Media transmission system and method
US20070078768A1 (en) System and a method for capture and dissemination of digital media across a computer network
US6728753B1 (en) Presentation broadcasting
DE60030659T2 (en) Method and apparatus for remote recording of audiovisual signals
US5943321A (en) Circuit set-up and caching for multimedia multipoint servers
EP1480178A2 (en) Security system
KR101009185B1 (en) Parental monitoring of digital content
CN104685895B (en) Receiving apparatus, receiving method, transmission apparatus, transmission method, and
US7587454B2 (en) Video streaming parameter optimization and QoS
JP4349365B2 (en) Method of transmitting control information, the relay server, and the control device
CN1278557C (en) Information delivery system, method, information processing apparatus, and method
US20060120385A1 (en) Method and system for creating and managing multiple subscribers of a content delivery network
US20050076058A1 (en) Interface for media publishing
US20040117427A1 (en) System and method for distributing streaming media
US7171485B2 (en) Broadband network system configured to transport audio or video at the transport layer, and associated method
US20110307548A1 (en) Data distribution
US20100169410A1 (en) Method and Apparatus for Distributing Multimedia to Remote Clients
WO2002046901A1 (en) System and method for implementing open-protocol remote device control

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUROSAWA, TAKAHIRO;KAWAI, TOMOAKI;REEL/FRAME:015158/0037

Effective date: 20040323

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION