US5808670A - Method and system for camera control with monitoring area view - Google Patents

Info

Publication number
US5808670A
US5808670A
Authority
US
Grant status
Grant
Patent type
Prior art keywords
camera
monitoring
angle
unit
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US08524277
Inventor
Masahiko Oyashiki
Ryosuke Nishiguchi
Hidenori Kawamura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Network and System Integration Corp
Original Assignee
NEC Network and System Integration Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Grant date

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed circuit television systems, i.e. systems in which the signal is not broadcast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/232Devices for controlling television cameras, e.g. remote control; Control of cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles

Abstract

A method of and a system for camera control are shown, in which a monitoring area view 2 is displayed on a monitor screen 1a. An installation position A of a camera unit 5 is input as origin A on the monitoring area view 2 with a point input unit 4. Further, a point B indicative of the home position direction of the camera unit 5 is input from the point input unit 4. A line connecting the pick-up position A of the camera unit and the home position direction B thereof is set to be a virtual line C (0°), and a monitoring point X is input on the monitoring area view with the point input unit 4 to obtain a monitoring point angle θa which is defined by the designated monitoring point X, the origin A and the virtual line C. The camera unit 5 is caused to undergo revolution by the monitoring point angle θa.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

This invention relates to a system of camera control, in which picture data obtained from a television camera is displayed on the monitor screen of a CRT or the like. More particularly, the invention concerns a method and a system for camera control with monitoring area view, for monitoring facilities involving danger, such as buildings, offices, convenience stores, large scale shops, banks and other financial organizations, power plants and so forth, and also factories and like production facilities, or cameras used for television conferences, weather forecast, viewing and so forth.

2. Related Art

Heretofore, in this type of television camera control system, the operator gives control signals to a camera support by operating control buttons or the like provided on a control unit while watching the camera picture displayed on a separate display such as a CRT, thus changing the sight field direction of the camera or, in the case of a camera having a zooming function, enlarging or contracting the camera picture size.

In the prior art control system noted above, however, since the camera picture display and the controller are disposed independently, the operator has to shift his or her eyesight alternately between the two units. The operation is thus rather cumbersome, requiring considerable time to bring an intended foreground subject into view and causing great fatigue of the operator.

Further, the button operation requires a certain degree of skill, and an erroneous operation may fail to change the sight field direction of the camera correctly toward the intended subject, or may cause zooming in the opposite direction.

Further, where a plurality of cameras are used, it is necessary to provide a corresponding number of controllers or to provide a switching unit for switching the cameras, thus increasing the size of the overall system, the installation space thereof and the complexity of the operation. It is therefore difficult to greatly increase the number of cameras that are used together.

SUMMARY OF THE INVENTION

The invention has been made in view of the above circumstances, and its object is to provide a method of and a system for camera control, in which a monitoring area view of a camera unit is displayed in the vicinity of a camera picture displayed on the screen of a CRT or the like, and the camera sight field is moved to a given position by designating the position on the monitoring area view with an external input, thus allowing simple operation, requiring less eyesight movement and eliminating operation errors.

Another object of the invention is to provide a method of and a system for camera control with monitoring area view, which allow connection of a number of cameras by using a small size and easily operable system.

To attain the above objects of the invention, there is provided a method of camera control with monitoring area view comprising the steps of:

displaying a monitoring area view 2 on a monitor screen 1a;

inputting a point on the monitoring area view 2 with a point input unit 4, the point being the installation position A of the camera unit 5 as an origin A;

inputting a point B indicative of the home position direction of the camera unit 5 with a point input unit 4;

setting a line defined by the pick-up position A of the camera unit and the home position direction B thereof to be a virtual line C (0°);

inputting the monitoring point X on the monitoring area view 2 with the point input unit 4;

obtaining a monitoring point angle θa defined by the designated monitoring point X, the origin A and the virtual line C; and

causing revolution of the camera unit 5 by the monitoring point angle θa;

establishing a monitoring prohibition area to be excluded from the image picked up by the camera unit; and

limiting the monitoring point angle θa to exclude the monitoring prohibition area from the camera image.

The method of camera control further comprises the steps of:

reading out present camera picture angle data θr from the camera unit 5;

obtaining an angle θd by adding one half the camera picture angle θr to the monitoring point angle θa; and

sending out the data of the monitoring point angle θa to the camera unit when the angle θd is below the angle θc of the boundary with the monitoring prohibition area, while sending out an angle obtained by subtracting one half the camera picture angle θr from the boundary angle θc, as corrected monitoring point angle θa', to the camera unit when the angle θd is above the angle θc of the boundary with the monitoring prohibition area 2c.
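The boundary correction described in these steps can be sketched as a small clamp (the function name and numeric types are assumptions for illustration; the patent gives the steps, not code):

```python
def corrected_revolution_angle(theta_a, theta_r, theta_c):
    """Return the angle actually sent to the camera unit.

    theta_a : desired monitoring point angle (degrees)
    theta_r : present camera picture (view) angle (degrees)
    theta_c : angle of the boundary with the monitoring prohibition area

    If the picture edge theta_d = theta_a + theta_r/2 would cross the
    boundary, the angle is pulled back so the edge just touches it,
    yielding the corrected monitoring point angle theta_a'.
    """
    theta_d = theta_a + theta_r / 2.0
    if theta_d <= theta_c:
        return theta_a                  # boundary not reached: send theta_a
    return theta_c - theta_r / 2.0      # corrected angle theta_a'
```

For example, with a 20° picture angle and the boundary at 50°, a designated angle of 45° would place the picture edge at 55°, so the camera is instead revolved to 40°.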

The method of camera control with monitoring area view further comprises the steps of:

designating a next monitoring point of the camera unit;

obtaining a relative angle θe by adding the monitoring point angle θa to a next monitoring point angle θb; or

obtaining a relative angle θe by subtracting the monitoring point angle θa from a next monitoring point angle θb; and

sending the relative angle θe out to the camera unit.
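With signed angles measured from the virtual line C, the two branches above (adding or subtracting θa) reduce to a single signed difference; a minimal sketch under that assumption, with assumed names:

```python
def relative_angle(theta_a, theta_b):
    """Relative revolution theta_e from the present direction theta_a
    to the next monitoring point angle theta_b, both measured as signed
    angles from the virtual line C (0 degrees).  The sign of the result
    gives the direction of revolution."""
    return theta_b - theta_a
```

When θa and θb lie on opposite sides of the virtual line their magnitudes effectively add, which the signed subtraction covers automatically.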

According to the invention, there is further provided a system for camera control with monitoring area view comprising a camera unit 5 capable of being revolved vertically and horizontally according to a predetermined control signal and of converting a video input signal into a picture signal, a monitor 1 having a monitor screen 1a for displaying the picture signal from the camera unit 5, a point input unit 4 for designating a desired point on the monitor screen 1a, and a controller 3 connected to the monitor 1, the camera unit 5 and the point input unit 4, wherein the controller 3 includes:

drawing display means for displaying a monitoring area view 2 on the monitor screen 1a;

input means for obtaining, from the point input unit 4, a pick-up position as origin A of the camera unit 5, a point B indicative of a home position direction of the camera unit 5 and a monitoring point X on the monitoring area view 2;

monitoring point angle calculating means for setting a line defined by the origin A and the home position direction B to be a virtual line (0°) and obtaining a monitoring point angle θa which is defined as an angle between a line passing through the designated monitoring point and the origin, and the virtual line (0°).

Further, in the system for camera control the controller 3 includes:

picture angle data input means for reading out present camera picture angle data θr from the camera unit 5;

angle calculating means for obtaining an angle θd by adding one half the camera picture angle data θr obtained from the picture angle data input means to the monitoring point angle θa;

comparing means for comparing the angle θd obtained by the angle calculating means and the angle θc of the boundary with the monitoring prohibition area;

corrected monitoring point angle calculating means for obtaining a corrected monitoring point angle θa' by subtracting one half the camera picture angle θr from the boundary angle θc when the angle θd is above the boundary angle θc; and

revolution angle data generating means for sending out the monitoring point angle θa and corrected monitoring point angle θa' as revolution angle data DA to the camera unit 5.

An area capable of being monitored by the camera unit 5 is displayed as a drawing of that place on the monitor screen 1a of the monitor 1. This display is the monitoring area view 2. A pick-up position of the camera unit 5 is designated as origin A on the monitoring area view 2 with the point input unit 4. Likewise, a point indicative of the sight field direction, i.e., home position direction, of the camera unit 5 is designated. Further, absolute position data input from the camera unit 5 at this time constitutes absolute position data of the camera home position.

The point input unit 4 is for designating a given point on the monitor screen 1a, and it can readily be constituted by well-known means such as a mouse, a touch panel or a unit having a like function.

A portion to which it is desired to direct the sight line of the camera unit, i.e., a place desired to be monitored, is designated by point designation on the monitoring area view 2. Since the camera unit 5 is not moved, it is possible to determine the angle θa from the coordinates A (XO, YO) and B (XH, YH) of the origin and camera home position and the coordinates X (Xx, Yx) of the monitoring point, and the angle of movement can be known from this angle θa.
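The geometry above can be sketched in Python (hypothetical helper, not from the patent): θa is the signed angle at origin A between the ray toward home position B, which defines the virtual line C (0°), and the ray toward monitoring point X.

```python
import math

def monitoring_point_angle(a, b, x):
    """Angle theta_a (degrees) at origin ``a`` between the home-position
    ray a->b (the virtual line C, defined as 0 degrees) and the ray a->x
    toward the monitoring point.  Positive is counter-clockwise."""
    home = math.atan2(b[1] - a[1], b[0] - a[0])
    target = math.atan2(x[1] - a[1], x[0] - a[0])
    theta = math.degrees(target - home)
    # Normalize into (-180, 180] so the camera takes the shorter revolution.
    return (theta + 180.0) % 360.0 - 180.0

# Camera at (0, 0), home position straight along +X, point at 45 degrees.
theta_a = monitoring_point_angle((0.0, 0.0), (10.0, 0.0), (5.0, 5.0))
```

The `atan2` form is quadrant-aware, so the same expression works wherever X lies relative to the virtual line.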

It is possible to set a monitoring prohibition area 2c in the monitoring area view display area 2a of the monitoring area view 2. The monitoring prohibition area 2c is for designating a portion which is not desired to be included in the picture from the camera unit 5, for the sake of privacy protection or the like. When a monitoring point is designated in the monitoring prohibition area 2c, the operation of obtaining the angle of movement is not executed.

When a monitoring point is designated in the vicinity of the boundary with the monitoring prohibition area 2c, the picture angle θr of the camera unit 5 is obtained. When the sum of the angle of the monitoring point X and one half the picture angle θr falls within the monitoring prohibition area 2c, the monitoring prohibition area 2c is partly included in the camera picture.

Thus, it is possible to exclude the monitoring prohibition area 2c from the camera picture by subtracting the angle of the overlap portion between the monitoring prohibition area 2c and the sight field of the camera unit 5 from the revolution angle, as revolution angle correction data DE.

BRIEF DESCRIPTION OF THE DRAWINGS

The above and other objects, features and advantages of the invention will be more apparent from the detailed description of the preferred embodiment thereof when the same is read with reference to the accompanying drawings, in which:

FIG. 1 is a monitor screen and a monitor area view in an embodiment of the invention;

FIG. 2 is a view showing the structure of a system for carrying out the invention;

FIG. 3 is a view showing an example of the position relation of the pick-up position of camera (origin) to home position, monitoring point and monitoring prohibition area;

FIG. 4 is a view showing a state in which a monitoring prohibition area is designated in a monitoring area;

FIG. 5 is a view showing a state in which a monitoring point is designated near the boundary line of the monitoring prohibition area;

FIG. 6 is a view showing a picture obtained in the state shown in FIG. 5;

FIG. 7 is a view showing a state of zooming brought about by reducing the picture angle from the FIG. 4 state with the monitoring point designated near the boundary line of the monitoring prohibition area;

FIG. 8 is a view showing a picture obtained in the FIG. 7 state;

FIG. 9 is a flow chart illustrating one embodiment of the invention;

FIG. 10 is a flow chart illustrating a different embodiment of the invention; and

FIG. 11 is a view showing an example of an overlay circuit.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Embodiments of the invention will now be described. FIG. 1 is a view showing a monitor screen in an embodiment of the invention, and FIG. 2 is a view showing the structure of a system for carrying out the invention.

Referring to the Figures, reference numeral 5 designates a camera unit for converting a video input into a picture signal. The camera part of the unit is like a usual television camera. The camera unit 5 includes a camera and a camera support integral therewith. The camera support includes a pulse motor or a servo motor, an encoder and a control circuit for controlling the motor and encoder, and it can be rotated by a prescribed extent in correspondence to prescribed pulses or a digital quantity. The sight field direction of the camera, which can be set in the vertical and horizontal directions, can be varied freely with rotation of the pulse motor or servo motor according to a control signal from a controller 3.

The controller 3 may be a personal computer or a like computer system having a data processing function. The controller 3 and the camera unit 5 are connected to each other via an overlay circuit and a switching circuit (these circuits not being shown). The overlay circuit overlays the NTSC signal supplied as the picture signal from the camera unit 5 on the monitor screen 1a, and it has a structure as shown in FIG. 11, for instance.

Individual video input signals are displayed in a contracted and combined state on a screen through an NTSC signal converter unit 110, a color signal memory unit 120, a digital-to-analog converter 130 and an overlay circuit 131. The NTSC signal converter unit 110 has an NTSC/RGB converter 111a for converting a video input into R, G and B signals.

The R, G and B signals thus obtained are converted in an analog-to-digital converter 112a into digital signals and contracted in a contraction controller 113a by a predetermined contraction factor before being stored in R, G and B signal memories 120r, 120g and 120b of the color signal memory unit 120. Further, a synchronizing signal separator 114a separates the synchronizing signal from the video input signal, the synchronizing signal thus separated being input to a control signal generator 113a to synchronize various processes. The display position, etc. of the picture are controlled from the controller 3, which is a personal computer, through a personal computer interface controller 140.
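The contraction step performed by the circuit can be approximated in software by subsampling a frame by an integer factor (a rough sketch only; the patent describes dedicated hardware, and the function name is an assumption):

```python
def contract(frame, factor):
    """Shrink a frame, given as a list of rows of pixel values, by
    keeping every ``factor``-th pixel in each direction -- a crude
    software stand-in for the contraction controller."""
    return [row[::factor] for row in frame[::factor]]
```

The contracted frames can then be written into a shared screen buffer at controller-chosen positions, which is the combining role the overlay circuit plays in hardware.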

The switching circuit connects the controller 3 and a plurality of camera units 5. It is thus possible to control a plurality of camera units 5 with a single controller through switching of control signals, such as picture angle data and sight field direction data.

Reference numeral 4 designates a point input unit. This unit may be of any kind so long as it permits designation of a given point on the monitor screen; for instance, a mouse, a touch panel, etc. may be used. Reference numeral 6 designates communicating means for connecting the controller 3 to a different controller 3'. For example, the communicating means may be a modem line or a digital communication network. With a different controller 3' and monitor 1' connected via the communicating means 6, operation like that of the controller 3 and monitor 1 can be performed even at a remote place, which is effective for television conferences, events, sight-seeing guides, etc.

The monitor 1 has a monitor screen 1a. A monitoring area view 2 is displayed on the monitor screen 1a. Using, for instance, graphic software based on a computer program, a drawing of the camera installation site or a like representation of the area capable of being visually monitored by the camera unit is produced in advance and displayed. Labeled A is the pick-up position of the camera unit 5, i.e., the origin as the center of the camera unit 5, in the monitoring area view 2. This position is displayed by the symbol as shown. Labeled B is the home position indicative of the direction of the camera, i.e., the sight field direction of the camera unit 5. Labeled C is a virtual line defined by the pick-up position of the camera unit 5, i.e., the origin A, and the home position B. Labeled D is the present camera position in such a case as when the next monitoring point is designated after movement of the camera unit 5 from the origin A along the sight line. Labeled X is a monitoring point designating the camera direction.

Now, the operation according to the invention will be described with reference to flow charts. FIGS. 9 and 10 are flow charts illustrating an embodiment of the invention.

As described above, it is possible to display pictures from a plurality of cameras on the monitor screen 1a. It is also possible to display a plurality of monitoring area views in correspondence to such displayed pictures. That is, a camera picture and a corresponding monitoring area view 2 can be displayed on each monitor screen 1a, and each camera can be controlled on its monitor screen 1a. For brevity of description, a case will now be considered in which a specific camera unit 5 is connected in one-to-one correspondence to the controller 3 and monitor 1.

The monitoring area view of the pertinent camera unit 5 is displayed on the monitor screen 1a of the monitor 1 (step S1).

An area shown by maximum and minimum values of X and Y coordinates in the monitoring area view is set to be an effective monitoring point designation area, a monitoring area view display area 2a (step S2).

Point A (XO, YO), indicative of the pick-up position of the actually installed camera unit 5, is designated in the monitoring area view display area 2a on the monitor screen 1a. The controller 3 stores this point and displays its symbol as camera origin A (XO, YO) (step S3).

Point B (XH, YH), indicative of the camera unit home position, i.e., the sight field direction of the actual camera unit 5, is designated as camera home position B (XH, YH) (step S4).

A line connecting the camera origin A (XO, YO) and the camera home position B (XH, YH) is set as virtual line C, which is expressed as 0°. An arrow indicative of the sight field direction, i.e., a camera direction symbol, is displayed on the basis of the virtual line C (step S5).

A read signal is sent out to the camera unit 5, and angle data, i.e., actual home position data, is read out and stored as absolute value home position data θy. As the home position data θy, the count of a servo motor encoder or pulse motor controller pulse counter of the camera unit 5 is output, that is, data about the number of pulses (i.e., degrees) of movement from the camera unit home position to the mechanical origin (step S6).
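Since θy is reported as a pulse count, converting it to degrees only needs the motor's pulses-per-revolution figure. A sketch, where the resolution constant is an assumed value for illustration, not a figure from the patent:

```python
PULSES_PER_REV = 3600  # assumed motor resolution: 0.1 degree per pulse

def pulses_to_degrees(count):
    """Convert an encoder/pulse-counter reading from the camera unit
    into degrees of revolution from the mechanical origin."""
    return count * 360.0 / PULSES_PER_REV
```

With this convention, a stored home-position count of 900 pulses would correspond to a 90° offset from the mechanical origin.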

Using the point input unit 4, the position to which it is desired to move the camera unit sight line, i.e., monitoring point X (XX, YX), is designated in the monitoring area view display area 2a (step S7).

The controller 3 checks whether the coordinates (XX, YX) of the monitoring point X (XX, YX) are in a monitoring prohibition area. If it is determined that the coordinates are in the monitoring prohibition area, the subsequent operation is stopped, and an alarm is generated or attention is otherwise invoked. Alternatively, the revolution angle is corrected as will be described later (step S8).
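The step S8 check can be sketched as follows, assuming the prohibition area 2c is stored as an axis-aligned rectangle (the patent does not fix a representation, so the rectangle form and names are assumptions):

```python
def in_prohibition_area(point, area):
    """True if ``point`` (x, y) lies inside the rectangular monitoring
    prohibition area given as (x_min, y_min, x_max, y_max)."""
    x, y = point
    x_min, y_min, x_max, y_max = area
    return x_min <= x <= x_max and y_min <= y <= y_max
```

For a non-rectangular area the same role would be filled by any point-in-polygon test over the area's vertices.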

If it is not determined that the coordinates are in the monitoring prohibition area, a monitoring point angle θa, which is defined by the line passing through the camera origin A (XO, YO) and the monitoring point X (XX, YX) and the virtual line C passing through the camera origin A (XO, YO), is calculated and set as the angle of revolution of the camera unit 5, i.e., the relative angle (step S9).

Since the monitoring point angle θa is the relative angle between the camera unit home position (0°) and the monitoring point X (XX, YX), it is sent out directly, or alternatively an absolute value angle data control signal is obtained from the absolute value home position data θy obtained in the step S6 and sent out to the camera unit 5 (step S10).

In the above way, the revolution data indicative of the position designated on the monitor screen 1a is sent out to the camera unit 5. As a result, the pulse motor or servo motor in the camera unit 5 is moved by the designated extent in the designated direction, so that the sight line of the camera unit is directed to the designated direction.

Now, an operation in the case where the vicinity of the monitoring prohibition area is designated as the monitoring point will be described with reference to the drawings. FIG. 10 is a flow chart illustrating a different embodiment of the invention. FIG. 3 is a view showing the position relation of the origin A of the camera unit 5, the home position direction B, the virtual line C, the monitoring point X and the monitoring prohibition area 2c to one another.

Here, a case as shown in FIG. 4 is considered, in which a monitoring prohibition area 2c is designated to exclude a building 2d in the monitoring area view display area 2a from the picture.

As shown in FIGS. 4 and 5, if a monitoring point X is designated in the vicinity of the boundary of the monitoring prohibition area 2c, causing revolution of the camera unit 5 by directly calculating the angle data control signal as noted above results in the building 2d being contained in the picture, because the sight field 2e is a range expressed by the picture angle θr. The picture at this time is shown in FIG. 6. Even when the picture angle θr is reduced with the monitoring point X designated as shown in FIG. 7, the building 2d is still included in the sight field of the camera. In this case, the picture is as shown in FIG. 8.

It will thus be seen that when the monitoring point X is designated in the vicinity of the monitoring prohibition area 2c, it is necessary to exclude the monitoring prohibition area 2c from the camera sight field 2e by some means.

Here, the same routine as that in FIG. 9 is executed up to the step S8 in the same Figure. Then, revolution angle data indicative of the monitoring point angle, i.e., the relative angle θa, is obtained again (step S11).

Then, the present picture angle data is read out by sending out a read signal to the camera unit 5 (step S12). From this picture angle data the picture angle θr is obtained (step S13).

One half the picture angle θr is added to the monitoring point angle, i.e., the relative angle θa noted above, thus obtaining the angle θd (step S14).

Then, the angle of the boundary with the monitoring prohibition area 2c and the angle θd obtained in the step S14 are compared (step S15).

If the angle θd is equal to or greater than the boundary angle θc, the difference between the boundary angle θc and one half the picture angle θr is sent out as the revolution angle data, i.e., corrected revolution angle θe, to the camera unit 5 (steps S16, S17).

Otherwise, the revolution angle data, i.e., the relative angle data θa, is sent out to the camera unit 5 (steps S16, S18).

In the above way, even when a monitoring point is set in the vicinity of the boundary with the monitoring prohibition area 2c, it is possible to exclude the monitoring prohibition area 2c from the sight field 2e of the camera unit 5, i.e., from the camera picture.

Further, in the case where a new monitoring point X is designated after movement of the camera unit sight line to the direction of the previously designated monitoring point, a check is made as to whether the present camera position D is in the positive or negative area with respect to the virtual line C through the home position B of the camera, and the relative angle is obtained as the corrected revolution angle θe by adding the angle θb, defined by the present camera position and the virtual line, to the monitoring point angle θa or subtracting it therefrom.
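The positive/negative-side check against the virtual line C can be done with a 2-D cross product, a standard geometric technique (the function name and sign convention here are assumptions):

```python
def side_of_virtual_line(a, b, p):
    """Side of point ``p`` relative to the virtual line through origin
    ``a`` and home position ``b``: +1 on the counter-clockwise side,
    -1 on the clockwise side, 0 exactly on the line."""
    cross = (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    return (cross > 0) - (cross < 0)
```

The sign returned decides whether the angle θb is added to or subtracted from θa when forming the corrected revolution angle.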

Further, in cases where there are a plurality of monitoring prohibition areas 2c, or where a new monitoring point X is designated beyond a monitoring prohibition area 2c, a check is made as to whether the angle θa is in the upper or lower half of the usual area or the monitoring prohibition area 2c.

As has been described in the foregoing, according to the invention, direction control of the camera unit is possible by designating the monitoring position on a drawing of the camera unit installation site on the monitor screen, thus simplifying the operation, requiring less eyesight movement of the operator, causing less fatigue and eliminating operation errors. Further, it is made possible to connect a large number of cameras with a small-size and easily operable system.

Further, when a monitoring point is designated in the monitoring prohibition area or the vicinity thereof, the camera unit undergoes either no revolution or revolution by such an angle that no monitoring prohibition area is in the camera picture. Thus, reliable privacy protection can be ensured. Further, checking whether a picture can be monitored saves time in operating the camera unit, thus improving operability.

Claims (4)

What is claimed is:
1. A method of camera control with monitoring area view comprising the steps of:
displaying a monitoring area view, in which an installation position of a camera unit, a home position of the camera unit, a pick-up subject, and a sight field direction of the camera unit are displayed, on a portion of a monitor screen for displaying an image picked up by the camera unit;
inputting a point on the monitoring area view with a point input unit, the point being the installation position of the camera unit as an origin;
inputting a home point indicative of a home position direction of the camera unit with the point input unit;
setting a line defined by the installation position of the camera unit and the home position direction thereof to be a virtual line (0°);
inputting a monitoring point on the monitoring area view with the point input unit;
obtaining a monitoring point angle θa, which is defined as an angle between a line passing through a designated monitoring point and the origin, and the virtual line (0°);
causing revolution of the camera unit by an amount equal to the monitoring point angle θa;
establishing a monitoring prohibition area to be excluded from the image picked up by the camera unit;
limiting the monitoring point angle θa to exclude the monitoring prohibition area from the camera image;
reading out present camera picture angle data θr from the camera unit;
obtaining an angle θd by adding one half the camera picture angle θr to the monitoring point angle θa; and
sending out the data of the monitoring point angle θa to the camera unit when the angle θd is below an angle θc of a boundary with the monitoring prohibition area while sending out an angle obtained as a result of subtraction of the one half the camera picture angle θr from the boundary angle θc as corrected monitoring point angle θa' to the camera unit when the angle θd is above the angle θc of the boundary with the monitoring prohibition area.
2. The method of camera control with monitoring area view according to claim 1, further comprising the steps of:
designating a next monitoring point of the camera unit;
obtaining a relative angle θe by adding the monitoring point angle θa to a next monitoring point angle θb; and
sending the relative angle θe out to the camera unit.
3. The method of camera control with monitoring area view according to claim 1, further comprising the steps of:
designating a next monitoring point of the camera unit;
obtaining a relative angle θe by subtracting the monitoring point angle θa from a next monitoring point angle θb; and
sending the relative angle θe out to the camera unit.
4. A system for camera control with monitoring area view comprising:
a camera unit capable of being revolved vertically and horizontally according to a predetermined control signal, and of converting a video input signal into a picture signal;
a monitor having a monitor screen for displaying the picture signal from the camera unit;
a point input unit for designating a desired point on the monitor screen; and
a controller connected to the camera unit, the monitor, and the point input unit, wherein the controller includes:
drawing display means for displaying a monitoring area view on the monitor screen;
input means for obtaining, from the point input unit, an installation position of the camera unit as origin thereof, a home point indicative of a home position of the camera unit, and a monitoring point on the monitoring area view;
monitoring point angle calculating means for setting a line defined by the origin and the home position direction to be a virtual line (0°), and for obtaining a monitoring point angle θa, which is defined as an angle between a line passing through the designated monitoring point and the origin, and the virtual line (0°);
picture angle data input means for reading out present camera picture angle data θr from the camera unit;
angle calculating means for obtaining an angle θd by adding one half the camera picture angle data θr obtained from the picture angle data input means to the monitoring point angle θa;
comparing means for comparing the angle θd obtained by the angle calculating means and an angle θc of a boundary with a monitoring prohibition area;
corrected monitoring point angle calculating means for obtaining a corrected monitoring point angle θa' by subtracting one half the camera picture angle θr from the boundary angle θc when the angle θd is above the boundary angle θc; and
revolution angle data generating means for sending out the monitoring point angle θa and the corrected monitoring point angle θa' as revolution angle data DA to the camera unit.
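The boundary-clamping logic of claim 4 can be sketched as follows. This is a hedged reading of the claim, not the patented controller: the function name is invented, angles are assumed to be in degrees measured from the home-position line, and only a single prohibition boundary on the increasing-angle side is modeled:

```python
def revolution_angle(theta_a, theta_r, theta_c):
    """Revolution angle data DA per claim 4.

    theta_a -- monitoring point angle (deg from home-position line)
    theta_r -- present camera picture (view) angle read from the camera
    theta_c -- angle of the boundary with the monitoring prohibition area
    """
    # Angle of the picture edge nearest the prohibition boundary.
    theta_d = theta_a + theta_r / 2.0
    if theta_d > theta_c:
        # The picture would overlap the prohibition area, so pull the
        # camera back until the picture edge sits on the boundary:
        # corrected monitoring point angle theta_a'.
        return theta_c - theta_r / 2.0
    return theta_a
```

For instance, with a 30° picture angle and a boundary at 90°, a requested monitoring point at 80° would place the picture edge at 95°, so the controller would instead send 75°, putting the edge exactly on the boundary.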
US08524277 1995-02-17 1995-09-06 Method and system for camera control with monitoring area view Expired - Fee Related US5808670A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP5322795A JPH08223561A (en) 1995-02-17 1995-02-17 Method and device for controlling camera by the use of monitor area diagram
JP7-053227 1995-02-17

Publications (1)

Publication Number Publication Date
US5808670A true US5808670A (en) 1998-09-15

Family

ID=12936946

Family Applications (1)

Application Number Title Priority Date Filing Date
US08524277 Expired - Fee Related US5808670A (en) 1995-02-17 1995-09-06 Method and system for camera control with monitoring area view

Country Status (2)

Country Link
US (1) US5808670A (en)
JP (1) JPH08223561A (en)

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6055014A (en) * 1996-06-28 2000-04-25 Sony Corporation Control apparatus and control method
EP1081955A2 (en) * 1999-08-31 2001-03-07 Matsushita Electric Industrial Co., Ltd. Monitor camera system and method of displaying picture from monitor camera thereof
US6208379B1 (en) * 1996-02-20 2001-03-27 Canon Kabushiki Kaisha Camera display control and monitoring system
US6337709B1 (en) 1995-02-13 2002-01-08 Hitachi, Ltd. Image display device
US20020186299A1 (en) * 2001-06-08 2002-12-12 Honeywell Inc. Machine safety system with mutual exclusion zone
US6578067B1 (en) * 1997-09-30 2003-06-10 Canon Kabushiki Kaisha Apparatus, system, method and memory medium for data processing
US20030137589A1 (en) * 2002-01-22 2003-07-24 Kazunori Miyata Video camera system
US20030153351A1 (en) * 2002-01-24 2003-08-14 K2Web Usa, Llc System for use in a monitoring and management system based on the internet
US20030227556A1 (en) * 2002-05-15 2003-12-11 Michael Doyle Method and system for generating detail-in-context video presentations using a graphical user interface
US6685366B1 (en) * 1997-09-05 2004-02-03 Robert Bosch Gmbh Camera positioning system with optimized field of view
US6727954B1 (en) * 1998-08-12 2004-04-27 Minolta Co., Ltd. Electronic camera and image processing system
US20040125217A1 (en) * 2002-12-31 2004-07-01 Jesson Joseph E. Sensing cargo using an imaging device
US20040179121A1 (en) * 2003-03-12 2004-09-16 Silverstein D. Amnon System and method for displaying captured images according to imaging device position
US20040223191A1 (en) * 1995-02-24 2004-11-11 Makoto Murata Image input system
US20050068437A1 (en) * 2003-09-29 2005-03-31 Sony Corporation Image pickup device
US20050199782A1 (en) * 2004-03-12 2005-09-15 Calver Andrew J. Cargo sensing system
US20050225638A1 (en) * 1997-01-28 2005-10-13 Canon Kabushiki Kaisha Apparatus and method for controlling a camera based on a displayed image
US6977678B1 (en) * 1999-08-31 2005-12-20 Matsushita Electric Industrial Co., Ltd. Monitor camera system and method of controlling monitor camera thereof
US6985178B1 (en) * 1998-09-30 2006-01-10 Canon Kabushiki Kaisha Camera control system, image pick-up server, client, control method and storage medium therefor
US7136513B2 (en) 2001-11-08 2006-11-14 Pelco Security identification system
US20060256201A1 (en) * 2005-05-10 2006-11-16 Ge Security, Inc. Methods and systems for controlling camera movement
US20070133844A1 (en) * 2001-11-08 2007-06-14 Waehner Glenn C Security identification system
US20080055422A1 (en) * 2006-09-05 2008-03-06 Canon Kabushiki Kaisha Shooting system, image sensing apparatus, monitoring apparatus, control method therefor, program, and storage medium
US20080138783A1 (en) * 2006-12-06 2008-06-12 Microsoft Corporation Memory training via visual journal
US20080158356A1 (en) * 2006-12-28 2008-07-03 Canon Kabushiki Kaisha Monitoring apparatus and control method thereof
US20080183049A1 (en) * 2007-01-31 2008-07-31 Microsoft Corporation Remote management of captured image sequence
US20080259162A1 (en) * 2005-07-29 2008-10-23 Matsushita Electric Industrial Co., Ltd. Imaging Region Adjustment Device
US20090128632A1 (en) * 2007-11-19 2009-05-21 Hitachi, Ltd. Camera and image processor
US7580036B2 (en) 2005-04-13 2009-08-25 Catherine Montagnese Detail-in-context terrain displacement algorithm with optimizations
US7667699B2 (en) 2002-02-05 2010-02-23 Robert Komar Fast rendering of pyramid lens distorted raster images
US7714859B2 (en) 2004-09-03 2010-05-11 Shoemaker Garth B D Occlusion reduction and magnification for multidimensional data presentations
US7737976B2 (en) 2001-11-07 2010-06-15 Maria Lantin Method and system for displaying stereoscopic detail-in-context presentations
US7761713B2 (en) 2002-11-15 2010-07-20 Baar David J P Method and system for controlling access in detail-in-context presentations
US7773101B2 (en) 2004-04-14 2010-08-10 Shoemaker Garth B D Fisheye lens graphical user interfaces
US7966570B2 (en) 2001-05-03 2011-06-21 Noregin Assets N.V., L.L.C. Graphical user interface for detail-in-context presentations
US7978210B2 (en) 2002-07-16 2011-07-12 Noregin Assets N.V., L.L.C. Detail-in-context lenses for digital image cropping and measurement
US7983473B2 (en) 2006-04-11 2011-07-19 Noregin Assets, N.V., L.L.C. Transparency adjustment of a presentation
US7995078B2 (en) 2004-09-29 2011-08-09 Noregin Assets, N.V., L.L.C. Compound lenses for multi-source data presentation
US8031206B2 (en) 2005-10-12 2011-10-04 Noregin Assets N.V., L.L.C. Method and system for generating pyramid fisheye lens detail-in-context presentations
US8106927B2 (en) 2004-05-28 2012-01-31 Noregin Assets N.V., L.L.C. Graphical user interfaces and occlusion prevention for fisheye lenses with line segment foci
US8120624B2 (en) 2002-07-16 2012-02-21 Noregin Assets N.V. L.L.C. Detail-in-context lenses for digital image cropping, measurement and online maps
US8139089B2 (en) 2003-11-17 2012-03-20 Noregin Assets, N.V., L.L.C. Navigating digital images using detail-in-context lenses
US8225225B2 (en) 2002-07-17 2012-07-17 Noregin Assets, N.V., L.L.C. Graphical user interface having an attached toolbar for drag and drop editing in detail-in-context lens presentations
USRE43742E1 (en) 2000-12-19 2012-10-16 Noregin Assets N.V., L.L.C. Method and system for enhanced detail-in-context viewing
US8311915B2 (en) 2002-09-30 2012-11-13 Noregin Assets, N.V., LLC Detail-in-context lenses for interacting with objects in digital image presentations
US8416266B2 (en) 2001-05-03 2013-04-09 Noregin Assets N.V., L.L.C. Interacting with detail-in-context presentations
US20150092064A1 (en) * 2013-09-29 2015-04-02 Carlo Antonio Sechi Recording Device Positioner Based on Relative Head Rotation
US9026938B2 (en) 2007-07-26 2015-05-05 Noregin Assets N.V., L.L.C. Dynamic detail-in-context user interface for application access and content access on electronic displays
US9317945B2 (en) 2004-06-23 2016-04-19 Callahan Cellular L.L.C. Detail-in-context lenses for navigation
US9323413B2 (en) 2001-06-12 2016-04-26 Callahan Cellular L.L.C. Graphical user interface with zoom for detail-in-context presentations
EP1883050A3 (en) * 2006-07-26 2016-11-30 Delphi Technologies, Inc. Vision-based method of determining cargo status by boundary detection
US9760235B2 (en) 2001-06-12 2017-09-12 Callahan Cellular L.L.C. Lens-defined adjustment of displays

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02266237A (en) * 1989-04-07 1990-10-31 Yamato Scale Co Ltd Combination scale
US4992866A (en) * 1989-06-29 1991-02-12 Morgan Jack B Camera selection and positioning system and method
US5111288A (en) * 1988-03-02 1992-05-05 Diamond Electronics, Inc. Surveillance camera system
US5359363A (en) * 1991-05-13 1994-10-25 Telerobotics International, Inc. Omniview motionless camera surveillance system
US5430511A (en) * 1993-12-21 1995-07-04 Sensormatic Electronics Corporation Controller for a surveillance assembly
US5517236A (en) * 1994-06-22 1996-05-14 Philips Electronics North America Corporation Video surveillance system
US5523783A (en) * 1992-10-19 1996-06-04 Fuji Photo Optical Co., Ltd. Pan head control system for TV camera
US5528289A (en) * 1993-10-20 1996-06-18 Videoconferencing Systems, Inc. Method for automatically adjusting a videoconferencing system camera to center an object

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2992617B2 (en) * 1990-07-06 1999-12-20 日本電信電話株式会社 Method of determining the photographing position of a remote control camera, and camera remote control system
JP2661657B2 (en) * 1992-12-01 1997-10-08 松下情報システム株式会社 Image synthesis device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5111288A (en) * 1988-03-02 1992-05-05 Diamond Electronics, Inc. Surveillance camera system
JPH02266237A (en) * 1989-04-07 1990-10-31 Yamato Scale Co Ltd Combination scale
US4992866A (en) * 1989-06-29 1991-02-12 Morgan Jack B Camera selection and positioning system and method
US5359363A (en) * 1991-05-13 1994-10-25 Telerobotics International, Inc. Omniview motionless camera surveillance system
US5523783A (en) * 1992-10-19 1996-06-04 Fuji Photo Optical Co., Ltd. Pan head control system for TV camera
US5528289A (en) * 1993-10-20 1996-06-18 Videoconferencing Systems, Inc. Method for automatically adjusting a videoconferencing system camera to center an object
US5430511A (en) * 1993-12-21 1995-07-04 Sensormatic Electronics Corporation Controller for a surveillance assembly
US5517236A (en) * 1994-06-22 1996-05-14 Philips Electronics North America Corporation Video surveillance system

Cited By (85)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6337709B1 (en) 1995-02-13 2002-01-08 Hitachi, Ltd. Image display device
US7583414B2 (en) 1995-02-24 2009-09-01 Canon Kabushiki Kaisha Image input system
US20070097460A1 (en) * 1995-02-24 2007-05-03 Tomoaki Kawai Image input system
US7321453B2 (en) * 1995-02-24 2008-01-22 Canon Kabushiki Kaisha Image input system
US20040223191A1 (en) * 1995-02-24 2004-11-11 Makoto Murata Image input system
US6208379B1 (en) * 1996-02-20 2001-03-27 Canon Kabushiki Kaisha Camera display control and monitoring system
US6055014A (en) * 1996-06-28 2000-04-25 Sony Corporation Control apparatus and control method
US7532238B2 (en) 1997-01-28 2009-05-12 Canon Kabushiki Kaisha Apparatus and method for controlling a camera based on a displayed image
US7061525B1 (en) * 1997-01-28 2006-06-13 Canon Kabushiki Kaisha Apparatus and method for controlling a camera based on a displayed image
US20050225638A1 (en) * 1997-01-28 2005-10-13 Canon Kabushiki Kaisha Apparatus and method for controlling a camera based on a displayed image
US6685366B1 (en) * 1997-09-05 2004-02-03 Robert Bosch Gmbh Camera positioning system with optimized field of view
US6578067B1 (en) * 1997-09-30 2003-06-10 Canon Kabushiki Kaisha Apparatus, system, method and memory medium for data processing
US6727954B1 (en) * 1998-08-12 2004-04-27 Minolta Co., Ltd. Electronic camera and image processing system
US6985178B1 (en) * 1998-09-30 2006-01-10 Canon Kabushiki Kaisha Camera control system, image pick-up server, client, control method and storage medium therefor
US6744461B1 (en) * 1999-08-31 2004-06-01 Matsushita Electric Industrial Co., Ltd. Monitor camera system and method of displaying picture from monitor camera thereof
EP1081955A3 (en) * 1999-08-31 2001-10-04 Matsushita Electric Industrial Co., Ltd. Monitor camera system and method of displaying picture from monitor camera thereof
EP1081955A2 (en) * 1999-08-31 2001-03-07 Matsushita Electric Industrial Co., Ltd. Monitor camera system and method of displaying picture from monitor camera thereof
US6977678B1 (en) * 1999-08-31 2005-12-20 Matsushita Electric Industrial Co., Ltd. Monitor camera system and method of controlling monitor camera thereof
USRE43742E1 (en) 2000-12-19 2012-10-16 Noregin Assets N.V., L.L.C. Method and system for enhanced detail-in-context viewing
US7966570B2 (en) 2001-05-03 2011-06-21 Noregin Assets N.V., L.L.C. Graphical user interface for detail-in-context presentations
US8416266B2 (en) 2001-05-03 2013-04-09 Noregin Assets N.V., L.L.C. Interacting with detail-in-context presentations
US7768549B2 (en) * 2001-06-08 2010-08-03 Honeywell International Inc. Machine safety system with mutual exclusion zone
US20020186299A1 (en) * 2001-06-08 2002-12-12 Honeywell Inc. Machine safety system with mutual exclusion zone
US9323413B2 (en) 2001-06-12 2016-04-26 Callahan Cellular L.L.C. Graphical user interface with zoom for detail-in-context presentations
US9760235B2 (en) 2001-06-12 2017-09-12 Callahan Cellular L.L.C. Lens-defined adjustment of displays
US8947428B2 (en) 2001-11-07 2015-02-03 Noregin Assets N.V., L.L.C. Method and system for displaying stereoscopic detail-in-context presentations
US7737976B2 (en) 2001-11-07 2010-06-15 Maria Lantin Method and system for displaying stereoscopic detail-in-context presentations
US8400450B2 (en) 2001-11-07 2013-03-19 Noregin Assets, N.V., L.L.C. Method and system for displaying stereoscopic detail-in-context presentations
US7136513B2 (en) 2001-11-08 2006-11-14 Pelco Security identification system
US20070133844A1 (en) * 2001-11-08 2007-06-14 Waehner Glenn C Security identification system
US7305108B2 (en) 2001-11-08 2007-12-04 Pelco Security identification system
US20030137589A1 (en) * 2002-01-22 2003-07-24 Kazunori Miyata Video camera system
US7893959B2 (en) * 2002-01-22 2011-02-22 Sanyo Electric Co., Ltd. Video display system for correcting the position and size of mask images
US20030153351A1 (en) * 2002-01-24 2003-08-14 K2Web Usa, Llc System for use in a monitoring and management system based on the internet
US7667699B2 (en) 2002-02-05 2010-02-23 Robert Komar Fast rendering of pyramid lens distorted raster images
US7411610B2 (en) 2002-05-15 2008-08-12 Idelix Software Inc. Method and system for generating detail-in-context video presentations using a graphical user interface
US20030227556A1 (en) * 2002-05-15 2003-12-11 Michael Doyle Method and system for generating detail-in-context video presentations using a graphical user interface
US8120624B2 (en) 2002-07-16 2012-02-21 Noregin Assets N.V. L.L.C. Detail-in-context lenses for digital image cropping, measurement and online maps
US9804728B2 (en) 2002-07-16 2017-10-31 Callahan Cellular L.L.C. Detail-in-context lenses for digital image cropping, measurement and online maps
US7978210B2 (en) 2002-07-16 2011-07-12 Noregin Assets N.V., L.L.C. Detail-in-context lenses for digital image cropping and measurement
US8225225B2 (en) 2002-07-17 2012-07-17 Noregin Assets, N.V., L.L.C. Graphical user interface having an attached toolbar for drag and drop editing in detail-in-context lens presentations
US9400586B2 (en) 2002-07-17 2016-07-26 Callahan Cellular L.L.C. Graphical user interface having an attached toolbar for drag and drop editing in detail-in-context lens presentations
US8311915B2 (en) 2002-09-30 2012-11-13 Noregin Assets, N.V., LLC Detail-in-context lenses for interacting with objects in digital image presentations
US8577762B2 (en) 2002-09-30 2013-11-05 Noregin Assets N.V., L.L.C. Detail-in-context lenses for interacting with objects in digital image presentations
US7761713B2 (en) 2002-11-15 2010-07-20 Baar David J P Method and system for controlling access in detail-in-context presentations
US7746379B2 (en) 2002-12-31 2010-06-29 Asset Intelligence, Llc Sensing cargo using an imaging device
US20040125217A1 (en) * 2002-12-31 2004-07-01 Jesson Joseph E. Sensing cargo using an imaging device
US20040179121A1 (en) * 2003-03-12 2004-09-16 Silverstein D. Amnon System and method for displaying captured images according to imaging device position
US7423667B2 (en) * 2003-09-29 2008-09-09 Sony Corporation Image pickup device with image masking
US20050068437A1 (en) * 2003-09-29 2005-03-31 Sony Corporation Image pickup device
US8139089B2 (en) 2003-11-17 2012-03-20 Noregin Assets, N.V., L.L.C. Navigating digital images using detail-in-context lenses
US9129367B2 (en) 2003-11-17 2015-09-08 Noregin Assets N.V., L.L.C. Navigating digital images using detail-in-context lenses
US20050199782A1 (en) * 2004-03-12 2005-09-15 Calver Andrew J. Cargo sensing system
US7421112B2 (en) * 2004-03-12 2008-09-02 General Electric Company Cargo sensing system
US7773101B2 (en) 2004-04-14 2010-08-10 Shoemaker Garth B D Fisheye lens graphical user interfaces
US8711183B2 (en) 2004-05-28 2014-04-29 Noregin Assets N.V., L.L.C. Graphical user interfaces and occlusion prevention for fisheye lenses with line segment foci
US8106927B2 (en) 2004-05-28 2012-01-31 Noregin Assets N.V., L.L.C. Graphical user interfaces and occlusion prevention for fisheye lenses with line segment foci
US8350872B2 (en) 2004-05-28 2013-01-08 Noregin Assets N.V., L.L.C. Graphical user interfaces and occlusion prevention for fisheye lenses with line segment foci
US9317945B2 (en) 2004-06-23 2016-04-19 Callahan Cellular L.L.C. Detail-in-context lenses for navigation
US7714859B2 (en) 2004-09-03 2010-05-11 Shoemaker Garth B D Occlusion reduction and magnification for multidimensional data presentations
US9299186B2 (en) 2004-09-03 2016-03-29 Callahan Cellular L.L.C. Occlusion reduction and magnification for multidimensional data presentations
US8907948B2 (en) 2004-09-03 2014-12-09 Noregin Assets N.V., L.L.C. Occlusion reduction and magnification for multidimensional data presentations
US7995078B2 (en) 2004-09-29 2011-08-09 Noregin Assets, N.V., L.L.C. Compound lenses for multi-source data presentation
USRE44348E1 (en) 2005-04-13 2013-07-09 Noregin Assets N.V., L.L.C. Detail-in-context terrain displacement algorithm with optimizations
US7580036B2 (en) 2005-04-13 2009-08-25 Catherine Montagnese Detail-in-context terrain displacement algorithm with optimizations
US20060256201A1 (en) * 2005-05-10 2006-11-16 Ge Security, Inc. Methods and systems for controlling camera movement
US8154599B2 (en) * 2005-07-29 2012-04-10 Panasonic Corporation Imaging region adjustment device
US20080259162A1 (en) * 2005-07-29 2008-10-23 Matsushita Electric Industrial Co., Ltd. Imaging Region Adjustment Device
US8031206B2 (en) 2005-10-12 2011-10-04 Noregin Assets N.V., L.L.C. Method and system for generating pyramid fisheye lens detail-in-context presentations
US8687017B2 (en) 2005-10-12 2014-04-01 Noregin Assets N.V., L.L.C. Method and system for generating pyramid fisheye lens detail-in-context presentations
US8478026B2 (en) 2006-04-11 2013-07-02 Noregin Assets N.V., L.L.C. Method and system for transparency adjustment and occlusion resolution for urban landscape visualization
US8675955B2 (en) 2006-04-11 2014-03-18 Noregin Assets N.V., L.L.C. Method and system for transparency adjustment and occlusion resolution for urban landscape visualization
US8194972B2 (en) 2006-04-11 2012-06-05 Noregin Assets, N.V., L.L.C. Method and system for transparency adjustment and occlusion resolution for urban landscape visualization
US7983473B2 (en) 2006-04-11 2011-07-19 Noregin Assets, N.V., L.L.C. Transparency adjustment of a presentation
EP1883050A3 (en) * 2006-07-26 2016-11-30 Delphi Technologies, Inc. Vision-based method of determining cargo status by boundary detection
US7907180B2 (en) * 2006-09-05 2011-03-15 Canon Kabushiki Kaisha Shooting system, access control apparatus, monitoring apparatus, control method, and storage medium for processing an image shot by an image sensing apparatus to restrict display
US20080055422A1 (en) * 2006-09-05 2008-03-06 Canon Kabushiki Kaisha Shooting system, image sensing apparatus, monitoring apparatus, control method therefor, program, and storage medium
US20080138783A1 (en) * 2006-12-06 2008-06-12 Microsoft Corporation Memory training via visual journal
US8287281B2 (en) * 2006-12-06 2012-10-16 Microsoft Corporation Memory training via visual journal
US8648909B2 (en) * 2006-12-28 2014-02-11 Canon Kabushiki Kaisha Camera monitoring apparatus and registration method thereof
US20080158356A1 (en) * 2006-12-28 2008-07-03 Canon Kabushiki Kaisha Monitoring apparatus and control method thereof
US20080183049A1 (en) * 2007-01-31 2008-07-31 Microsoft Corporation Remote management of captured image sequence
US9026938B2 (en) 2007-07-26 2015-05-05 Noregin Assets N.V., L.L.C. Dynamic detail-in-context user interface for application access and content access on electronic displays
US20090128632A1 (en) * 2007-11-19 2009-05-21 Hitachi, Ltd. Camera and image processor
US20150092064A1 (en) * 2013-09-29 2015-04-02 Carlo Antonio Sechi Recording Device Positioner Based on Relative Head Rotation

Also Published As

Publication number Publication date Type
JPH08223561A (en) 1996-08-30 application

Similar Documents

Publication Publication Date Title
US5475447A (en) Apparatus and method for adjusting video display
US5691765A (en) Image forming and processing device and method for use with no moving parts camera
US6859907B1 (en) Large data set storage and display for electronic spreadsheets applied to machine vision
US6563574B2 (en) Surveying apparatus
US7034866B1 (en) Combined display-camera for an image processing system
US3959582A (en) Solid state electronically rotatable raster scan for television cameras
Peri et al. Generation of perspective and panoramic video from omnidirectional video
US4802022A (en) Cable TV system for guest facilities
US6892360B1 (en) Focus traversal mechanism for graphical user interface widgets
US20050168705A1 (en) Projection system
US20050036036A1 (en) Camera control apparatus and method
US6501515B1 (en) Remote control system
US6292713B1 (en) Robotic telepresence system
US20010019355A1 (en) Controller for photographing apparatus and photographing system
US6549215B2 (en) System and method for displaying images using anamorphic video
US5877801A (en) System for omnidirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
US4834473A (en) Holographic operator display for control systems
US6795113B1 (en) Method and apparatus for the interactive display of any portion of a spherical image
US20030085992A1 (en) Method and apparatus for providing immersive surveillance
US20020057279A1 (en) System and method for displaying images using foveal video
US8427538B2 (en) Multiple view and multiple object processing in wide-angle video camera
US6445411B1 (en) Camera control system having anti-blur facility
US6509926B1 (en) Surveillance apparatus for camera surveillance system
US5990934A (en) Method and system for panoramic viewing
US4992866A (en) Camera selection and positioning system and method

Legal Events

Date Code Title Description
AS Assignment

Owner name: NEC SYSTEM INTEGRATION & CONSTRUCTION, LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OYASHIKI, MASAHIKO;NISHIGUCHI, RYOSUKE;KAWAMURA, HIDENORI;REEL/FRAME:007705/0631

Effective date: 19950831

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
FP Expired due to failure to pay maintenance fee

Effective date: 20100915