WO2006118366A1 - Camera control method using gui - Google Patents

Camera control method using gui

Info

Publication number
WO2006118366A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
pointing device
area
movement
screen
Prior art date
Application number
PCT/KR2005/002792
Other languages
French (fr)
Inventor
Dong-Young Jung
Original Assignee
Namsun Gtl Co., Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Namsun Gtl Co., Ltd filed Critical Namsun Gtl Co., Ltd
Publication of WO2006118366A1 publication Critical patent/WO2006118366A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N 7/185 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Abstract

Provided is a graphic user interface (GUI)-based camera control method, and more particularly, a GUI-based camera control method, which shifts a screen in real time according to the movement of a pointing device, sets a desired area using the pointing device in order to shift the screen to or zoom in/out of the desired area, sequentially preset-stores the set area, and immediately changes the screen to an area preset-stored before or after a currently viewed area by manipulating a specific button of the pointing device. Therefore, according to a first mode of the present invention, it is possible to control a camera at a high speed using only the movement of a pointing device, thereby shifting a screen to a desired area. Also, according to a second mode of the present invention, it is possible to control a camera to quickly shift a screen to a desired area or zoom in/out of the desired area, by setting an area of a target object on a GUI screen. Also, according to a third mode of the present invention, by sequentially preset-storing set areas, it is possible to control a camera to easily replay previous pictures sequentially by manipulation of a button of a pointing device.

Description

CAMERA CONTROL METHOD USING GUI
TECHNICAL FIELD
The present invention relates to a camera control method using a graphic user interface (GUI), and more particularly, to a GUI-based camera control method which shifts a screen in real time according to the movement of a pointing device, sets a desired area using the pointing device in order to shift the screen to or zoom in/out of the desired area, sequentially preset-stores the set areas, and immediately changes the screen to an area preset-stored before or after the currently viewed area by manipulating a specific button of the pointing device.
BACKGROUND ART
Control is essential for panning, tilting, and zooming in/out of cameras used in digital video recorders (DVRs), closed-circuit televisions (CCTVs), etc. Panning means moving a camera horizontally, tilting means moving a camera vertically, and zooming in/out means enlarging or reducing a specific area.
Camera control is performed, by hardware and/or software, by controlling a pan drive motor and a tilt drive motor, which shift the camera up, down, left, or right, and a lens drive motor, which moves a lens to adjust the focal distance.
Such camera control technology is gradually being developed in terms of software control, taking over from conventional hardware control. Specifically, a GUI-based camera control technology has recently been developed which converts a picture obtained from a camera into a digital signal, displays it on a monitor screen, and controls the movement of the camera using a pointing device, such as a mouse, on that screen. According to the GUI-based camera control technology, it is possible to conveniently and easily control a camera using a variety of software manipulation items provided on a computer screen as necessary, without requiring the complicated structure, large volume, and separate hardware of existing hardware control. However, in an existing GUI-based camera control technology, in order to shift a screen or zoom in/out on the screen, an operator should input commands for selecting a target location (a target point) on the screen, selecting a different location to which the target location will be moved, and shifting the target location to the different location, which causes inconvenience and requires a lot of time. That is, when the camera focus shifting and zoom in/out functions are performed on, for example, a target positioned at the upper-left part of a screen, in order to view the target according to the operator's intention, the screen should be shifted using a left-side adjustment button (or a left-right adjustment joystick) and an up-side adjustment button (or an up-down adjustment joystick), or using diagonal movement; the target should then be enlarged using the zoom function of the lens (the so-called "NEAR" or "FAR" function); and the focus of the camera should then be readjusted.
However, the conventional GUI-based control technology requires a lot of time to control the product, and it is difficult to perform quick processing when a target object is moving. That is, when a target object is not fixed but moving, an operator cannot keep up with the moving speed of the target object.
As such, since the conventional GUI-based control technology requires complicated processing and much processing time, it is difficult to respond quickly when an emergency situation requiring quick shifting or zooming in/out occurs, or when an accident or event is being monitored.
DETAILED DESCRIPTION OF THE INVENTION
TECHNICAL PROBLEM
According to the present invention, it is possible to quickly shift a screen to a target object or target image and set an area of the target object or image, thereby achieving quick screen shifting (by quick operation of the up-down-left-right motors and quick zooming in/out and focusing). Also, according to the present invention, it is possible to perform quick processing in correspondence with the moving speed of a moving target using the movement of a pointing device, and to sequentially replay the pictures of previously set areas.
Accordingly, the present invention provides a graphic user interface (GUI)-based camera control method which is capable of controlling a camera at high speed, using only the movement of a pointing device, to shift a screen to a desired area. The present invention also provides a GUI-based camera control method which is capable of achieving quick zoom in/out by setting an area of a target object on a GUI screen. The present invention also provides a GUI-based camera control method which is capable of sequentially replaying previous pictures by using button manipulation of a pointing device and by sequentially preset-storing set areas.
TECHNICAL SOLUTION
According to an aspect of the present invention, there is provided a method for controlling a graphic user interface (GUI) of a camera, including: displaying a video signal received from the camera and forming a GUI screen; sensing a movement of a location of a cursor of a pointing device on the GUI screen and calculating a location value of the cursor; and calculating a movement control value of the camera according to the location value, thereby controlling a movement of the camera in real time according to a movement of the pointing device.
The camera GUI control method further includes: after the pointing device starts area setting on the GUI screen, calculating a location value of the corresponding area immediately when the area setting is complete; and controlling the camera to shift a screen to the corresponding area or to zoom in on the corresponding area.
The camera GUI control method further includes: storing the location value of the set area; if a predetermined button of the pointing device is manipulated, sensing the manipulation of the predetermined button, calculating a rotation displacement value according to the sensed result, and requesting a set area corresponding to the rotation displacement value; and calculating a control value of the camera on the basis of the requested set area, thereby controlling the camera.
ADVANTAGEOUS EFFECTS
According to the present invention, it is possible to quickly shift a screen to check a target by easily setting an area of the target when the target is detected, and to control the screen in correspondence with the moving speed of a target by resetting an area of the target and manipulating a button in the moving direction of the target when the target is moving. By utilizing the speed-variable function that is an advantage of a wheel mouse, that is, the function of the wheel which causes slow movement of the screen when the view is centered on the moving target and fast movement of the screen when the view is far from the moving target, constant speed values and variable speed values can be obtained, so that speed values for the up, down, left, and right adjustments are obtained. This can also be applied to a moving target when the zoom function operates. Also, since the areas of previous pictures can be sequentially replayed, even after area setting has been executed many times, by an auto preset function based on area setting, separate resetting of the areas of the previous pictures is not required and previous operations can be checked again, which allows for quick management.
DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a system according to an embodiment of the present invention; FIG. 2 is a flowchart illustrating a graphic user interface (GUI)-based camera control method according to an embodiment of the present invention;
FIGS. 3 and 4 are views for explaining the GUI-based camera control method according to an embodiment of the present invention; and
FIG. 5 is a flowchart illustrating a process related to the GUI-based camera control method illustrated in FIG. 2 according to an embodiment of the present invention.
BEST MODE
Hereinafter, embodiments of a graphic user interface (GUI)-based camera control method according to the present invention will be described in detail with reference to the appended drawings.
FIG. 1 is a block diagram of a system for implementing a GUI-based camera control method according to an embodiment of the present invention.
Referring to FIG. 1, a video signal output from a camera 10 is output on a display screen of a graphic user interface (GUI) 12 of a control system. Generally, the camera 10 and the GUI are located far away from each other.
If an operator selects a desired control command or performs a control operation using the GUI 12 and a mouse 14, a control signal generated according to the control command is transferred to a control unit 16. Here, the mouse 14 includes a center scroll wheel as well as left and right buttons; however, the present invention is not limited to this.
The control unit 16 functionally includes a camera movement processor 18, a camera zoom processor 20, and a preset processor 22. These processors 18, 20, and 22 are functional processors and are not physically separate components. Also, the control unit 16 includes an area storage unit 24. The area storage unit 24 can be implemented as a memory device. The area storage unit 24 is generally linked with the preset processor 22; however, the area storage unit 24 can also be linked with the camera movement processor 18 and the camera zoom processor 20. A detailed description related to this will be given later.
A camera manipulation signal output from the control unit 16 is transferred to a camera drive unit 26 to control the operation of the camera 10. Here, the operation of the camera is panning, tilting, zooming in/out, etc., as described above.
Here, a camera in which the up, down, left, and right (panning and tilting) drive motor(s) and a zoom in/out drive motor are integrated into one unit is preferably used for quick camera driving; however, the present invention is not limited to this.
In the system constructed as described above, the GUI-based camera control method according to the present invention is performed as follows.
FIG. 2 is a flowchart illustrating a GUI-based camera control method performed by the camera movement processor 18 and the camera zoom processor 20, according to an embodiment of the present invention, and FIG. 5 is a flowchart illustrating the operation of the preset processor 22, related to the GUI-based camera control method of FIG. 2, according to an embodiment of the present invention.
Referring to FIG. 2, a video signal received from a camera is displayed on a screen, so that a GUI screen is formed (operation 102). The cursor location of a mouse will be displayed on the GUI screen.
If an operator does not press a specific button of the mouse on the GUI screen (operation 104), the following process (a first mode of the present invention) is performed.
That is, a location value of the mouse cursor is calculated in real time using only the movement of the mouse cursor (operation 106), and a movement control value of the camera is calculated according to the location value, so that the movement of the camera is controlled in real time according to the movement of the mouse (operation 108). However, the present invention is not limited to this operation. In the current embodiment, the screen is shifted according to the movement of the mouse by only moving the mouse, without manipulating its buttons. However, it is also possible for the above operation to be performed while the operator presses a specific button (for example, the left button) of the mouse. Such a modification can be easily executed by those skilled in the art. A screen illustrated in FIG. 3 can be selected through the operation described above. That is, by moving the mouse right, as illustrated in (a) of FIG. 3, the right portion of the displayed screen is moved to the center of the screen, as illustrated in (b) of FIG. 3. As illustrated in FIG. 3, since the mouse can be easily moved in all directions, including diagonal directions, it is possible to move the camera to a desired location more conveniently and quickly than by conventional methods.
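As a rough illustration of this first mode only, the movement control value could be derived from the cursor's offset from the screen center; the following Object Pascal sketch uses assumed names (CursorToPanTilt, PanGain, TiltGain) and sign conventions that are not taken from the patent.

procedure CursorToPanTilt(X, Y, ScreenW, ScreenH: Integer;
  out PanSpeed, TiltSpeed: Double);
const
  PanGain = 1.0;    // assumed scale factor mapping the normalized offset to pan motor speed
  TiltGain = 1.0;   // assumed scale factor for the tilt motor
begin
  // offset of the cursor from the screen center, normalized to the range -1..+1
  PanSpeed  := PanGain  * (X - ScreenW div 2) / (ScreenW div 2);   // positive = pan right
  TiltSpeed := TiltGain * (Y - ScreenH div 2) / (ScreenH div 2);   // positive = tilt down (screen Y grows downward)
end;

Under these assumptions, a cursor far from the center would drive the pan and tilt motors faster, and a cursor near the center would drive them slowly or not at all.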
Also, in the current embodiment, particularly when a wheel mouse is used, by utilizing the speed-variable function that is an advantage of a wheel mouse, that is, the function of the wheel which causes slow movement of the screen when the view is centered on the moving target and fast movement of the screen when the view is far from the moving target, constant speed values and variable speed values can be obtained, so that speed values for the up, down, left, and right adjustments are obtained. This can also be applied to a moving target when the zoom function operates.
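One plausible reading of this speed-variable behavior, offered only as an assumption, is a linear ramp in which the shift speed grows with the distance between the view center and the target; the function name and parameters below are hypothetical.

function DistanceToSpeed(DistancePixels, MaxDistancePixels: Integer;
  MaxSpeed: Double): Double;
begin
  // slow movement near the target (small distance), fast movement far from it
  if MaxDistancePixels <= 0 then
    Result := 0
  else
    Result := MaxSpeed * DistancePixels / MaxDistancePixels;
end;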
Meanwhile, if the operator presses a specific button of the mouse on the GUI screen (operation 110), the following process (a second mode of the present invention) is performed. For example, if the operator presses a specific button (for example, the left button) of the mouse, a command indicating "start area setting" is transferred to the control unit, allowing the operator to set a quadrangular area at a desired location on the GUI screen. In order to set the quadrangular area, for example, the operator presses a specific button of the mouse at a start point, extends a quadrangular window to a desired size while continuously pressing the button, and then releases the button, thereby completing the area setting (operation 112). The control unit receives the start point at which the area setting is started and the end point at which the area setting is complete, calculates the location values of the two points, and generates a camera manipulation signal for shifting the screen to the corresponding area or enlarging or reducing (zooming in/out) the corresponding area (operation 114). Accordingly, the camera can immediately display a screen corresponding to the area as soon as the operator completes area setting on the GUI screen (operation 116). Here, the button manipulation for achieving the second mode of the present invention can be implemented in various manners.
FIG. 4 illustrates screens on which the process described above is performed. If an area 40 to be zoomed in on is set as illustrated in (a) of FIG. 4, the zoom-in function of the camera operates immediately when the setting of the area 40 is complete, so that the area 40 is displayed over the entire screen, as illustrated in (b) of FIG. 4.
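A minimal sketch of what operation 114 might compute from the two selection points is shown below; the routine name, its parameters, and the choice of the smaller screen-to-selection ratio as the zoom-in factor are assumptions for illustration, not the patent's implementation (Min comes from the standard Math unit, TPoint from the Types unit).

procedure AreaToCameraCommand(const pFirst, pLast: TPoint;
  ScreenW, ScreenH: Integer; out CenterX, CenterY: Integer; out ZoomFactor: Double);
var
  SelW, SelH: Integer;
begin
  // aim the camera at the center of the selected quadrangle
  CenterX := (pFirst.X + pLast.X) div 2;
  CenterY := (pFirst.Y + pLast.Y) div 2;
  SelW := Abs(pLast.X - pFirst.X);
  SelH := Abs(pLast.Y - pFirst.Y);
  if (SelW > 0) and (SelH > 0) then
    // the smaller ratio keeps the whole selection visible after zooming in
    ZoomFactor := Min(ScreenW / SelW, ScreenH / SelH)
  else
    ZoomFactor := 1.0;   // degenerate selection: no zoom
end;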
Meanwhile, if the area setting is complete (operation 112) and the related location values are calculated (operation 114), information regarding the area and the location values is stored in the area storage unit 24 (operation 115). Operation 115 can be performed whenever area setting is performed. Accordingly, a preset function (a third mode of the present invention) is implemented, which allows an operator to sequentially search for the pictures corresponding to previously set areas by manipulating a specific button (for example, the scroll wheel) of a wheel mouse. In the current embodiment, the preset function is performed by rotating the center button (that is, the scroll wheel) of the mouse forward or backward. For example, if the operator pushes and rotates the scroll wheel forward, the pictures of the previously stored areas are sequentially reconstructed. Meanwhile, if the operator draws back and rotates the scroll wheel backward, the pictures of the areas stored after the currently viewed area are sequentially reconstructed as screens. Details, such as the button manipulation methods for using the preset function, the number of preset-stored areas, etc., are well known to those skilled in the art.
FIG. 5 is a flowchart illustrating a method of performing the preset function. If an operator rotates a specific button (for example, the scroll wheel) of the mouse while set areas are stored in the area storage unit 24 (see FIG. 2), the control unit senses the rotation of the scroll wheel (operation 118), calculates rotation displacement values according to the rotation (operation 119), and requests a set area corresponding to the rotation displacement values from the area storage unit 24 (operation 120). Then, the control unit calculates control values of the camera on the basis of the area information received from the area storage unit 24 (operation 122), thereby controlling the camera (operation 124).
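As an illustration of how operations 118 through 124 could step through the stored areas, the sketch below converts the wheel delta into an index displacement; the function name, the clamping behavior, and the mapping of one wheel notch (a delta of 120, the standard WHEEL_DELTA) to one preset step are assumptions, not the patent's algorithm.

function StepPreset(CurrentIndex, PresetCount, WheelDelta: Integer): Integer;
begin
  if PresetCount <= 0 then
  begin
    Result := -1;   // nothing has been preset-stored yet
    Exit;
  end;
  // forward rotation (positive delta) moves to areas stored before the current picture
  Result := CurrentIndex - (WheelDelta div 120);
  if Result < 0 then
    Result := 0;
  if Result > PresetCount - 1 then
    Result := PresetCount - 1;
  // the caller would then fetch the stored area at the returned index from the
  // area storage unit and recompute the camera control values (operations 120 to 124)
end;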
In the current embodiment, a wheel mouse is used as the pointing device and a specific button of the wheel mouse is used for area setting; however, the use of the pointing device can vary. As long as a function of shifting a screen in real time according to the movement of a pointing device and a function of zooming in/out of an arbitrary area while setting the corresponding area, which are the basic concepts of the present invention, are implemented, the use of the buttons (left, right, and center buttons) of the pointing device can be arbitrarily designed by those skilled in the art. For example, if a mouse is used as the pointing device, the camera can be controlled so that the screen is shifted, centering on the point at which the mouse cursor is positioned, according to the movement of the mouse when no button of the mouse is pressed, and so that a quadrangular area is set while the left button of the mouse is pressed (or continuously pressed) and the screen corresponding to the area is zoomed in on when the area setting is complete. At this time, the right button of the mouse can be used to perform a variety of additional functions. The technical concept of the present invention can be implemented in various manners by combining the uses of the left, right, and center buttons of the mouse and the pressing, release, and continuous pressing of the respective buttons. The preset function can also be implemented using the left or right button instead of the center button. Accordingly, the flowcharts illustrated in FIGS. 2 and 5 are detailed embodiments only for explaining the technical concept of the present invention.
An algorithm related to the GUI-based camera control method according to the present invention is as follows. The following two equations are proposed as basic equations used in the present invention.
K = 2(F + L)    (1)
where K denotes the size of the quadrangular area, F denotes the X-axis coordinate of a first point, and L denotes the X-axis coordinate of a second point.
H = (F.X + L.X)/2, V = (F.Y + L.Y)/2    (2)
where H denotes the coordinate of the horizontal center point of the quadrangular area, F.X denotes the X-axis coordinate of the first point, L.X denotes the X-axis coordinate of the second point, V denotes the coordinate of the vertical center point of the quadrangular area, F.Y denotes the Y-axis coordinate of the first point, and L.Y denotes the Y-axis coordinate of the second point.
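For example, if the first point of the selection is (100, 80) and the second point is (300, 240), equation (2) gives H = (100 + 300)/2 = 200 and V = (80 + 240)/2 = 160, which is the center of the selected quadrangle to which the camera shot is moved.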
A window message generated by a mouse action (for example, pressing the left button of the mouse) is detected. If the left button of the mouse is pressed, a "WM_LBUTTONDOWN" window message event is generated, so that information regarding the mouse location on the monitor screen can be obtained. If the left button of the mouse is released, a "WM_LBUTTONUP" window message event is generated, so that information regarding the current location of the mouse on the monitor screen can be obtained.
The X and Y coordinates of the mouse can be obtained from the following window messages: WM_LBUTTONDOWN and WM_LBUTTONUP. The "WM_LBUTTONDOWN" and "WM_LBUTTONUP" window messages use the same parameters.
fwKeys = wParam;         // value indicating the current status of the mouse buttons (distinct from the keyboard)
xPos = LOWORD(lParam);   // current X coordinate of the mouse
yPos = HIWORD(lParam);   // current Y coordinate of the mouse
The xPos and yPos coordinate values are calculated based on the origin of the corresponding work area.
Status values of fwKeys:
MK_CONTROL : the "Ctrl" key is pressed
MK_LBUTTON : the left button of the mouse is pressed
MK_MBUTTON : the center button of the mouse is pressed
MK_RBUTTON : the right button of the mouse is pressed
MK_SHIFT : the "Shift" key is pressed
The following functions are used to detect and process the window messages when the left button of the computer mouse is pressed or released. A function used when the left button of the mouse is pressed is as follows.
procedure TfrmZoomInOut.FormMouseDown(Sender: TObject; Button: TMouseButton;
  Shift: TShiftState; X, Y: Integer);
begin
  pFirst := Point(X, Y);   // remember the first corner of the selection quadrangle
end;

A function used when the left button of the mouse is released is as follows.
procedure TfrmZoomInOut.FormMouseUp(Sender: TObject; Button: TMouseButton;
  Shift: TShiftState; X, Y: Integer);
begin
  pLast := Point(X, Y);    // remember the opposite corner of the selection quadrangle
  // move the camera shot to the center of the quadrangle
  // zoom in
  // preset: store the area
end;
A function used when the mouse moves while the left button of the mouse is pressed is as follows.
procedure TfrmZoomInOut.FormMouseMove(Sender: TObject; Shift: TShiftState;
  X, Y: Integer);
begin
  // draw the selection quadrangle following the current mouse position
  pLast := Point(X, Y);
  Canvas.Pen.Mode := pmNotXor;
  Canvas.Polyline([pFirst, Point(pLast.X, pFirst.Y), pLast, Point(pFirst.X, pLast.Y), pFirst]);
end;
When the mouse moves while the left button of the computer mouse is pressed, a "WM_MOUSEMOVE" message is received and a quadrangular selection area is displayed in correspondence with the distance moved from the first point. A movement of the center button (the wheel) of the mouse can be calculated using the following window message.
WM_MOUSEWHEEL
fwKeys = LOWORD(wParam);           // value indicating the current status of the mouse buttons (distinct from the keyboard)
zDelta = (short) HIWORD(wParam);   // variation (delta) in wheel movement
xPos = (short) LOWORD(lParam);     // current X coordinate of the mouse
yPos = (short) HIWORD(lParam);     // current Y coordinate of the mouse
Status values of fwKeys:
MK_CONTROL : the "Ctrl" key is pressed
MK_LBUTTON : the left button of the mouse is pressed
MK_MBUTTON : the center button of the mouse is pressed
MK_RBUTTON : the right button of the mouse is pressed
MK_SHIFT : the "Shift" key is pressed
A function used to sense and process a window message when the center button (the wheel) of the computer mouse moves forward or backward is as follows.
procedure TfrmZoomInOut.FormMouseWheel(Sender: TObject; Shift: TShiftState;
  WheelDelta: Integer; MousePos: TPoint; var Handled: Boolean);
begin
  if WheelDelta >= 0 then   // the wheel is rotated forward (upward)
  begin
    // move to the "preset" area stored before the current picture
  end
  else
  begin
    // move to the "preset" area stored after the current picture
  end;
  Handled := True;
end;
When the center button (the wheel) of the computer mouse moves, a "WM_MOUSEWHEEL" message is received and processed. By checking the variation in wheel movement when an operator rotates the mouse wheel forward or backward, it is possible to change the screen to a "preset" picture stored before or after the current picture in response to the movement of the wheel.

Claims

CLAIMS
1. A method for controlling a graphic user interface (GUI) of a camera, comprising: displaying a video signal received from the camera and forming a GUI screen; sensing a movement of a location of a cursor of a pointing device on the GUI screen and calculating a location value of the cursor; and calculating a movement control value of the camera according to the location value, thereby controlling a movement of the camera in real time according to a movement of the pointing device.
2. The method of claim 1, further comprising: after the pointing device starts area setting on the GUI screen, calculating a location value of the corresponding area immediately when the area setting is complete; and controlling the camera to shift a screen to the corresponding area or to zoom in on the corresponding area.
3. The method of claim 2, further comprising: storing the location value of the set area; if a predetermined button of the pointing device is manipulated, sensing the manipulation of the predetermined button, calculating a rotation displacement value according to the sensed result, and requesting a set area corresponding to the rotation displacement value; and calculating a control value of the camera on the basis of the requested set area, thereby controlling the camera.
4.
The method of claim 3, wherein the pointing device is a computer mouse and the predetermined button is a scroll wheel of the computer mouse.
5.
The method of claim 1, wherein, when the movement of the camera is controlled in real time according to the movement of the pointing device, no manipulation button of the pointing device operates.
6.
The method of claim 1, wherein the controlling of the movement of the camera in real time according to the movement of the pointing device is performed after a manipulation button of the pointing device operates.
7.
The method of any one of claims 2 through 6, wherein the area setting is performed while the manipulation button of the pointing device is pressed.
8.
The method of any one of claims 2 through 6, wherein the area setting is performed after the manipulation button of the pointing device is pressed once.
PCT/KR2005/002792 2005-05-03 2005-08-24 Camera control method using gui WO2006118366A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020050037181A KR20060114950A (en) 2005-05-03 2005-05-03 Camera control method using gui
KR10-2005-0037181 2005-05-03

Publications (1)

Publication Number Publication Date
WO2006118366A1 true WO2006118366A1 (en) 2006-11-09

Family

ID=37308139

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2005/002792 WO2006118366A1 (en) 2005-05-03 2005-08-24 Camera control method using gui

Country Status (2)

Country Link
KR (1) KR20060114950A (en)
WO (1) WO2006118366A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8723956B2 (en) 2004-08-30 2014-05-13 Trace Optic Technologies Pty Ltd Method and apparatus of camera control

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111901613B (en) * 2020-08-10 2021-05-11 成都中科大旗软件股份有限公司 Scenic spot live broadcast system based on 5G

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10322581A (en) * 1997-03-19 1998-12-04 Sony Corp Image pickup device controller and image pickup system
KR20010025583A (en) * 2001-01-09 2001-04-06 정귀수 A computer-based remote surveillance CCTV system, a computer video matrix switcher and a control program adapted to the CCTV system
WO2002013513A1 (en) * 2000-08-03 2002-02-14 Koninklijke Philips Electronics N.V. Method and apparatus for external calibration of a camera via a graphical user interface
KR20050000276A (en) * 2003-06-24 2005-01-03 주식회사 성진씨앤씨 Virtual joystick system for controlling the operation of a security camera and controlling method thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10322581A (en) * 1997-03-19 1998-12-04 Sony Corp Image pickup device controller and image pickup system
WO2002013513A1 (en) * 2000-08-03 2002-02-14 Koninklijke Philips Electronics N.V. Method and apparatus for external calibration of a camera via a graphical user interface
KR20010025583A (en) * 2001-01-09 2001-04-06 정귀수 A computer-based remote surveillance CCTV system, a computer video matrix switcher and a control program adapted to the CCTV system
KR20050000276A (en) * 2003-06-24 2005-01-03 주식회사 성진씨앤씨 Virtual joystick system for controlling the operation of a security camera and controlling method thereof

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8723956B2 (en) 2004-08-30 2014-05-13 Trace Optic Technologies Pty Ltd Method and apparatus of camera control

Also Published As

Publication number Publication date
KR20060114950A (en) 2006-11-08

Similar Documents

Publication Publication Date Title
CN111066315B (en) Apparatus, method and readable medium configured to process and display image data
US9952754B2 (en) Information processing device, information processing method, and program
US5396287A (en) TV camera work control apparatus using tripod head
US9454230B2 (en) Imaging apparatus for taking image in response to screen pressing operation, imaging method, and program
US7619677B2 (en) Electronic camera adjusting size of image to fit display area
JP3399891B2 (en) On-screen menu selection method and apparatus
KR101299613B1 (en) Control device, control method, camera system, and recording medium
KR100940971B1 (en) Providing area zoom functionality for a camera
JP5591006B2 (en) Control device for automatic tracking camera system and automatic tracking camera system having the same
JP2000307928A (en) Camera controller
EP1496702A2 (en) Virtual joystick system for controlling the operation of security cameras and controlling method thereof
JP2004521551A (en) Remote camera control device
JP2012227737A (en) Automatic tracking device for camera device and automatic tracking camera system having the same
WO2006118366A1 (en) Camera control method using gui
JP3497184B2 (en) TV camera operation control device using pan head
JP2011239104A (en) Camera device, display magnifying method and program
JP5907184B2 (en) Information processing apparatus, information processing method, and program
JP2002157079A (en) Method of discriminating intention
KR0183827B1 (en) Supervisory camera remote control apparatus using a pen mouse and method thereof
KR100758985B1 (en) Image display apparatus for operating user interface and method thereof
KR100815234B1 (en) GUI apparatus and method for camera control device
JP2001183383A (en) Imaging apparatus and method for calculating velocity of object to be imaged
KR101445607B1 (en) device of processing digital image using a accelerator sensor and image replaying method using the same
JPH08163411A (en) Camera system
JP6135789B2 (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 05781194

Country of ref document: EP

Kind code of ref document: A1