WO2015001635A1 - Remote monitoring system and monitoring method (遠隔モニタリングシステムおよびモニタリング方法) - Google Patents
- Publication number
- WO2015001635A1 (PCT/JP2013/068284, JP2013068284W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- video
- monitoring
- image
- real
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/156—Mixing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Definitions
- the present invention relates to a remote monitoring system and a monitoring method.
- remote monitoring systems are used for observing the situation at different locations, regardless of distance.
- in a remote monitoring system, observation can be performed efficiently by controlling a number of surveillance cameras via a network and selectively displaying the transmitted images to the user.
- the surveillance camera can control the pan angle, tilt angle, and zoom magnification, and the user can obtain a desired image.
- a method in which the user directly operates the pan angle, tilt angle, and zoom magnification while looking at the image of the monitoring camera is often used. However, because network delay causes the video to lag, it has been difficult to point the surveillance camera at the point to be observed.
- the above remote monitoring system also has the problem that it is difficult to know where the user is looking. This becomes pronounced as the number of surveillance cameras increases and the user can control the pan angle, tilt angle, and zoom magnification of each surveillance camera.
- as a control method of a surveillance camera for remote monitoring, Patent Literature 1 describes a method using a sensing device such as a sensor. This method uses a three-dimensional virtual space: a three-dimensional virtual space model is prepared in advance, and the position of the sensing device, the position and line of sight of the monitoring camera, and the angle-of-view area are displayed in the three-dimensional virtual space in real time, indicating where the currently imaged area is located.
- although the technique described in Patent Document 1 can indicate where the currently imaged region is located using a three-dimensional virtual space, the surveillance camera itself cannot be interactively controlled.
- An object of the present invention is to facilitate selection of a surveillance camera that can capture an image to be viewed by using a CG image.
- One embodiment of the present invention has a display unit on which a CG image generated from a three-dimensional CG model is displayed.
- an input unit that receives a user input for the CG video is provided.
- a three-dimensional CG image generation unit is provided that displays, on the display unit, the CG video after being moved based on the input.
- an optimal camera calculation unit is provided that identifies a monitoring camera that can capture a real video similar to the CG video after being moved.
- a control unit configured to control the identified surveillance camera is provided. Further, the display unit displays a real video from the controlled surveillance camera.
- an input step for receiving a user input for the CG video is provided.
- a movement step of moving the CG video based on the input is included.
- a control step of specifying the monitoring camera that can capture a real image similar to the CG image after being moved and controlling the specified monitoring camera is included.
- a diagram showing an example of an image displayed in the remote monitoring system according to Embodiment 1 of the present invention.
- summary of the optimal camera calculation process is shown.
- the remote monitoring system displays a real image captured from a monitoring camera, and superimposes and displays a CG image expressing the same image as the displayed real image on the real image. Then, the input of the user to the CG video is received, the CG video is moved, a monitoring camera that can capture an image similar to the moved CG video is specified, and the video from the specified monitoring camera is displayed.
- the surveillance camera can be interactively controlled so as to capture an image to be viewed.
- FIG. 1 is a diagram showing an outline of a configuration example of a remote monitoring system according to Embodiment 1 of the present invention.
- a remote monitoring system includes a remote monitoring system control unit 100, a plurality of monitoring cameras 201 to 204 connected to the remote monitoring system control unit 100, a control unit 205 that controls these monitoring cameras 201 to 204 (for example, a mechanism that controls the direction of the monitoring cameras 201 to 204), an input unit 301, and a display unit 401.
- the remote monitoring system control unit 100 is implemented as a computer with predetermined hardware and software.
- the remote monitoring system control unit 100 includes a processor, a memory, and the like, and executes a program on the memory by the processor.
- the remote monitoring system control unit 100 includes a three-dimensional CG image generation unit 101, an optimum camera calculation unit 102, a camera control unit 103, an image input unit 104, a virtual camera interactive setting unit 105, a target CG model storage unit 106, and a monitoring camera installation data storage unit 107.
- the target CG model storage unit 106 stores a three-dimensional CG model that is a three-dimensional virtual space expressing the real world.
- the 3D CG image generation unit 101 acquires a 3D CG model from the target CG model storage unit 106. Further, the three-dimensional CG image generation unit 101 receives virtual camera parameters (described later, FIG. 2), which are parameters of a virtual camera that captures the target three-dimensional CG model, from the virtual camera interactive setting unit 105. Then, the 3D CG image generation unit 101 generates a CG video based on the acquired 3D CG model and the received virtual camera parameters, and displays the generated CG video on the display unit 401.
- the input received from the input unit 301 is interpreted by the virtual camera interactive setting unit 105, and the virtual camera parameters are changed by the virtual camera interactive setting unit 105. Then, the virtual camera interactive setting unit 105 transmits the changed virtual camera parameters to the three-dimensional CG image generation unit 101. Thereafter, the three-dimensional CG image generation unit 101 generates a CG video based on the acquired three-dimensional CG model and the received changed virtual camera parameters, and displays the generated CG video on the display unit 401. Accordingly, based on the input received by the input unit 301, the CG video is interactively moved from the CG video displayed based on the 3D CG model and the pre-change virtual camera parameters to the CG video displayed based on the 3D CG model and the changed virtual camera parameters.
- the input unit 301 corresponds to an input device such as a joystick, a keyboard, or a mouse.
- various known methods can be used for setting the virtual camera parameters described above.
- the virtual camera parameters may be controlled interactively by changing them using an arc ball or track ball that virtually rotates the object, and moving the CG video based on the changed virtual camera parameters.
- the position and direction of the virtual camera may be changed using a walk-through or fly-through.
- the position of the virtual camera, the position of a point of interest described later, the up vector of the virtual camera, and the like may be interactively changed.
- the surveillance cameras 201 to 204 capture a real image that is an image of the real world, and transmit the captured real image to the image input unit 104 in real time.
- the optimum camera calculation unit 102 uses the virtual camera parameters and the monitoring camera installation data (described later, FIG. 3) to identify the monitoring cameras 201 to 204 capable of capturing a real video similar to the moved CG video captured from the virtual camera. Thereafter, the optimal camera calculation unit 102 calculates, as the optimal camera parameters, the optimal pan angle, tilt angle, and zoom magnification with which the specified monitoring cameras 201 to 204 can capture a real video similar to the CG video captured from the virtual camera.
- the camera control unit 103 sets the pan angle, the tilt angle, and the zoom magnification using the optimal camera parameters in the control unit 205 of the monitoring cameras 201 to 204.
- the control unit 205 controls the monitoring cameras 201 to 204 based on the set value. Thereafter, the real video captured by the monitoring cameras 201 to 204 is displayed on the display unit 401 via the image input unit 104.
- the three-dimensional CG model stored in the target CG model storage unit 106 is a three-dimensional CG model that represents the real world to be monitored or observed, and includes information on the positions and directions of the monitoring cameras 201 to 204.
- the 3D CG model can also be created interactively using 3D CG software.
- a three-dimensional CG model may also be generated from CAD information (for example, a DXF file).
- the three-dimensional CG model may be measured three-dimensionally using a laser range finder or the like. Further, the three-dimensional CG model may be reconstructed from a plurality of photographs.
- FIG. 2 is a diagram illustrating a configuration example of virtual camera parameters in the remote monitoring system according to Embodiment 1 of the present invention.
- the virtual camera parameters have data items such as [virtual camera position], [virtual camera direction], and [virtual camera angle of view].
- [Virtual camera position] indicates coordinates for specifying the position of the virtual camera, and is composed of an x-coordinate, a y-coordinate, and a z-coordinate.
- [Virtual camera direction] indicates the direction of the virtual camera when the pan angle and tilt angle, which are rotational components about each axis, are 0, and consists of the tilt angle θx, the pan angle θy, and the roll angle θz.
- [Virtual camera angle of view] indicates the angle of view θa of the camera.
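As an illustration, the virtual camera parameters of FIG. 2 can be held in a small data structure. This is a sketch, not from the patent; the field names below are assumptions chosen to mirror the document's [virtual camera position], [virtual camera direction], and [virtual camera angle of view] items.

```python
from dataclasses import dataclass

@dataclass
class VirtualCameraParams:
    # [Virtual camera position]: x, y, z coordinates
    position: tuple
    # [Virtual camera direction]: tilt θx, pan θy, roll θz (degrees)
    direction: tuple
    # [Virtual camera angle of view]: θa (degrees)
    fov_deg: float

# Example: a camera 1.5 units up, 3 units back, facing straight ahead.
params = VirtualCameraParams(position=(0.0, 1.5, -3.0),
                             direction=(0.0, 0.0, 0.0),
                             fov_deg=60.0)
```

The virtual camera interactive setting unit 105 would mutate such a record in response to user input, and the 3D CG image generation unit 101 would render from it.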
- FIG. 3 is a diagram illustrating a configuration example of monitoring camera installation data stored in the monitoring camera installation data storage unit 107 in the remote monitoring system according to Embodiment 1 of the present invention.
- the surveillance camera installation data stored in the surveillance camera installation data storage unit 107 has data items such as [surveillance camera ID], [surveillance camera position], [surveillance camera installation direction], [pan angle, tilt angle, zoom magnification], and [camera rated value].
- [Monitoring camera position] indicates coordinates for specifying the positions of the monitoring cameras 201 to 204, and is composed of an x-coordinate, a y-coordinate, and a z-coordinate.
- [Monitoring camera installation direction] indicates the direction of the monitoring cameras 201 to 204 when the pan angle and tilt angle, which are rotational components about each axis, are 0, and consists of the tilt angle θx, the pan angle θy, and the roll angle θz.
- [Pan angle θp] indicates the pan angle of the surveillance camera.
- [Tilt angle θt] indicates the tilt angle of the surveillance camera.
- [Camera rated value] indicates the range of the zoom magnification, the angle of view of the camera when the zoom magnification is 1, and the controllable ranges of the tilt angle θt and the pan angle θp.
- the actual direction of the monitoring camera is the direction obtained by combining [pan angle θp] and [tilt angle θt] with the above [monitoring camera installation direction].
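The combination of installation direction with the controlled pan and tilt can be sketched as composed rotations. This is an illustrative assumption, not the patent's formula: it assumes pan rotates about the y-axis, tilt about the x-axis, and a +z forward vector, with the installation rotation applied before the controlled pan/tilt.

```python
import math

def rot_y(deg):  # pan: rotation about the y-axis
    a = math.radians(deg)
    return [[math.cos(a), 0.0, math.sin(a)],
            [0.0, 1.0, 0.0],
            [-math.sin(a), 0.0, math.cos(a)]]

def rot_x(deg):  # tilt: rotation about the x-axis
    a = math.radians(deg)
    return [[1.0, 0.0, 0.0],
            [0.0, math.cos(a), -math.sin(a)],
            [0.0, math.sin(a), math.cos(a)]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def camera_forward(install_tilt, install_pan, theta_t, theta_p):
    """Actual viewing direction: [monitoring camera installation direction]
    composed with the controlled [tilt angle θt] and [pan angle θp]."""
    m = matmul(matmul(rot_y(install_pan), rot_x(install_tilt)),
               matmul(rot_y(theta_p), rot_x(theta_t)))
    return apply(m, [0.0, 0.0, 1.0])
```

For example, with a zero installation direction, a controlled pan of 90 degrees turns the +z forward vector toward +x.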
- FIG. 4 is a diagram showing an overview of overall processing in the remote monitoring system according to Embodiment 1 of the present invention.
- a real image captured by the monitoring cameras 201 to 204 is displayed on the display unit 401 as shown in FIG.
- the 3D CG image generation unit 101 generates a CG video based on the 3D CG model and virtual camera parameters equivalent to the monitoring camera installation data of the monitoring cameras 201 to 204, and the generated CG video is superimposed on the real video as shown in FIG.
- the CG image shown in FIG. 5B is represented by a wire frame.
- the CG video may be expressed in a three-dimensional plane.
- the CG video and the real video may be displayed in separate windows using a multi-window instead of displaying the CG video superimposed on the real video.
- only the CG video may be displayed on the display unit 401. In this case, the CG video is hidden after being moved, and then the real video captured by the controlled monitoring cameras 201 to 204 is displayed on the display unit 401.
- the input unit 301 receives an input for the CG video displayed on the display unit 401.
- the virtual camera interactive setting unit 105 changes the virtual camera parameters based on the input received in S403. Thereafter, the virtual camera interactive setting unit 105 transmits the changed virtual camera parameter to the three-dimensional CG image generation unit 101.
- the three-dimensional CG image generation unit 101 generates a CG video based on the virtual camera parameters changed in S404 and the three-dimensional CG model, and displays it on the display unit 401 as shown in FIG.
- the CG image moved from the state of (b) is superimposed and displayed on the real image as shown in FIG.
- when the pan angle θy of the virtual camera is changed, the virtual camera is panned to the left.
- accordingly, the CG video moves to the right, the direction opposite to the panning of the virtual camera.
- an optimal camera specifying process (described later, FIG. 6) is performed to identify the monitoring cameras 201 to 204 capable of capturing a real video similar to the moved CG video captured from the virtual camera, and the optimum camera parameters for capturing a real video similar to the CG video captured from the virtual camera are calculated.
- in step S407, the camera control unit 103 sets the optimal camera parameters in the control unit 205 of the monitoring cameras 201 to 204 specified by the optimal camera calculation unit 102.
- the control unit 205 controls the pan angle θp, the tilt angle θt, and the zoom magnification fz of the monitoring cameras 201 to 204 based on the set optimal camera parameters.
- the monitoring cameras 201 to 204 controlled by the control unit 205 capture a real image.
- the display unit 401 displays the real video captured by the controlled monitoring cameras 201 to 204, and the moved CG video is displayed superimposed on the corresponding real video.
- the three-dimensional CG image generation unit 101 ends the process of displaying the CG video, and the superimposed display of the CG video on the real video is removed as shown in FIG.
- the surveillance camera parameters of the surveillance cameras 201 to 204 after being controlled may be applied as the virtual camera parameters of the CG video after being moved based on the user input. More specifically, when the input received from the user ends (for example, when a mouse drag ends) and the process of moving the CG video is complete, the virtual camera interactive setting unit 105 applies the optimal camera parameters as the virtual camera parameters. Thereafter, the 3D CG image generation unit 101 generates a CG video based on the new virtual camera parameters and the 3D CG model, and displays the generated CG video on the display unit 401. In this case, the virtual camera parameters may be gradually corrected toward the optimum camera parameters by an animation function, and a CG video may be displayed each time the virtual camera parameters are corrected.
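The gradual correction by the animation function mentioned above can be sketched as simple linear interpolation. This is an illustrative assumption (the patent does not specify the interpolation); a CG video would be regenerated at each intermediate step.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b at fraction t in [0, 1]."""
    return a + (b - a) * t

def animate_params(current, optimal, steps):
    """Yield intermediate parameter tuples (e.g. pan, tilt, zoom) moving
    from the current virtual camera parameters toward the optimal camera
    parameters over the given number of steps."""
    for i in range(1, steps + 1):
        t = i / steps
        yield tuple(lerp(c, o, t) for c, o in zip(current, optimal))

# Example: pan 0→30°, tilt 0→-10°, zoom 1→2 over three frames.
frames = list(animate_params((0.0, 0.0, 1.0), (30.0, -10.0, 2.0), steps=3))
# the final frame equals the optimal camera parameters
```

Each yielded tuple would be applied as the virtual camera parameters before rendering the next CG frame, so the displayed view glides to the camera's achievable pose instead of jumping.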
- FIG. 6 shows an outline of optimal camera calculation processing in the remote monitoring system according to Embodiment 1 of the present invention.
- in the optimal camera calculation process, the position of the point that the user wants to confirm (hereinafter, the attention point) is set in S601, the monitoring cameras 201 to 204 are specified in S602 to S605, and the optimum camera parameters are calculated in S606.
- an attention point is set in the CG image after being moved.
- a method for setting the attention point 700 will be described with reference to FIGS. 7 and 8.
- FIG. 7 is a diagram illustrating a method for the user to set a point of interest in the remote monitoring system according to Embodiment 1 of the present invention.
- the center point of the CG image after being moved is set as the attention point 700.
- since the CG video is a two-dimensional image, the x-coordinate and y-coordinate can be acquired, but the coordinate in the depth direction (z-coordinate) is missing. Therefore, in the example of FIG. 7, the depth coordinate of the outermost surface of an object displayed in the CG video (such as a desk or a chair) is used as the depth coordinate of the attention point 700.
- by using the three-dimensional CG model used to display the CG video, the coordinate in the depth direction of the center of the CG video can be obtained. At this time, if no object exists at the center of the CG video, an error may be displayed and an input setting the attention point 700 may be received from the user.
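Recovering the center depth from the rendered 3D CG model can be sketched with a depth buffer. This is a hypothetical illustration: the `depth_buffer` structure and the infinity sentinel for "no object" are assumptions, not details from the patent.

```python
def attention_point_depth(depth_buffer, width, height):
    """Return the depth (z) at the center pixel of the CG video.

    depth_buffer: rows of per-pixel depth values rendered from the 3D CG
    model; float('inf') marks pixels where no object was drawn (assumed
    sentinel). Raises ValueError when no object exists at the center, in
    which case the user would be asked to set the attention point instead.
    """
    z = depth_buffer[height // 2][width // 2]
    if z == float("inf"):
        raise ValueError("no object at center; request attention point from user")
    return z

# 4x4 buffer with a single object 3.5 units deep at the center pixel.
buf = [[float("inf")] * 4 for _ in range(4)]
buf[2][2] = 3.5
```

With this sketch, `attention_point_depth(buf, 4, 4)` supplies the missing z-coordinate of the attention point 700, while an all-empty buffer triggers the error path described above.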
- FIG. 8 is a diagram for explaining another method for setting the attention point 700 in the remote monitoring system according to the first embodiment of the present invention.
- FIG. 8A shows an example of a CG video
- FIG. 8B shows an example of an overhead CG video.
- the display unit 401 displays the CG video and the overhead CG video in separate windows using a multi-window.
- the user can input to set the attention point 700 and the virtual camera position 800 via the input unit 104 while confirming the overhead view CG video.
- when the input unit 104 receives an input from the user setting the virtual camera position 800 or the attention point 700, the CG video is moved in real time according to the set contents.
- the optimum camera calculation unit 102 acquires [monitoring camera position] from the monitoring camera installation data storage unit 107.
- in step S603, the optimal camera calculation unit 102 acquires the virtual camera parameter [virtual camera position] from the virtual camera interactive setting unit 105.
- an attention point 700 is set for the target three-dimensional CG model 901.
- in step S604, the optimal camera calculation unit 102 determines, for each of the monitoring cameras 201 to 204, the straight lines 902 to 905 connecting the [monitoring camera position] and the attention point 700, and the straight line 906 connecting the [virtual camera position] and the attention point 700, and calculates the angles θc1 to θc4 that these lines form for each of the monitoring cameras 201 to 204.
- alternatively, the angles θc1 to θc4 may be calculated based on the direction of each monitoring camera when it faces as close as possible to the attention point 700 and the virtual camera parameter [virtual camera direction].
- when the attention point 700 cannot be captured, the optimum camera calculation unit 102 determines that there is no monitoring camera 201 to 204 capable of capturing the attention point 700, and an error may then be displayed on the display unit 401. In this case, whether a monitoring camera 201 to 204 can capture the attention point 700 is determined by the optimum camera calculation unit 102 checking whether the attention point 700 is included in the range that each of the monitoring cameras 201 to 204 can capture; if it is not included in any, it is determined that no monitoring camera 201 to 204 can capture it.
- the optimum camera calculation unit 102 specifies the monitoring camera 201 to 204 with the smallest of the angles θc1 to θc4 calculated in S604 as the monitoring camera that can capture a real video similar to the CG video.
- the optimal camera calculation unit 102 may specify the monitoring cameras 201 to 204 based on the similarity between the CG image captured by the virtual camera and the real video captured by the monitoring cameras 201 to 204.
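The minimum-angle selection of S604 to S605 can be sketched as follows. This is an illustrative implementation under the assumption that the angle θc is measured at the attention point between the line to each monitoring camera and the line to the virtual camera; the camera IDs and coordinates are example values, not from the patent.

```python
import math

def angle_at_poi(poi, cam_pos, virt_pos):
    """Angle (radians) formed at the attention point between the line to a
    monitoring camera position and the line to the virtual camera position."""
    u = [c - p for c, p in zip(cam_pos, poi)]
    v = [c - p for c, p in zip(virt_pos, poi)]
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(a * a for a in v))
    # clamp for floating-point safety before acos
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def best_camera(poi, virt_pos, cameras):
    """cameras: mapping camera ID -> (x, y, z). Returns the ID whose line of
    sight to the attention point forms the smallest angle θc with the
    virtual camera's line of sight."""
    return min(cameras, key=lambda cid: angle_at_poi(poi, cameras[cid], virt_pos))

# Example: camera 201 lies on the virtual camera's line of sight (θc = 0),
# camera 202 views the point from the side (θc = 90°).
cams = {201: (0.0, 0.0, -10.0), 202: (10.0, 0.0, 0.0)}
chosen = best_camera((0.0, 0.0, 0.0), (0.0, 0.0, -5.0), cams)  # → 201
```

Selecting by smallest θc picks the monitoring camera whose viewpoint of the attention point most resembles the virtual camera's, which is the behavior S605 describes.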
- the optimal camera calculation unit 102 calculates the optimal camera parameters. More specifically, the optimum camera calculation unit 102 calculates, as optimal camera parameters, the pan angle θp and tilt angle θt at which the identified monitoring cameras 201 to 204 face the direction of the attention point 700. Further, the zoom magnification fz at which the size of the three-dimensional CG model 901 in the vicinity of the attention point 700, when the camera faces the attention point 700 at the calculated pan angle θp and tilt angle θt, equals its size in the virtual camera is calculated as an optimal camera parameter.
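The S606 calculation can be sketched in two parts: aiming (pan θp, tilt θt toward the attention point) and size matching (zoom fz). This is a hedged illustration, assuming a y-up, +z-forward convention and a pinhole small-angle approximation for apparent size; the patent does not give these formulas.

```python
import math

def pan_tilt_to_point(cam_pos, poi):
    """Pan θp and tilt θt (degrees) pointing a camera at cam_pos toward the
    attention point poi, assuming pan about y (0° = +z) and y-up tilt."""
    dx, dy, dz = (p - c for p, c in zip(poi, cam_pos))
    pan = math.degrees(math.atan2(dx, dz))
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return pan, tilt

def zoom_to_match(virt_dist, cam_dist, virt_fov_deg, cam_fov_deg_at_1x):
    """Zoom magnification fz making an object near the attention point appear
    the same size as in the virtual camera (apparent size taken as inversely
    proportional to distance times tan of half the angle of view)."""
    virt_scale = 1.0 / (virt_dist * math.tan(math.radians(virt_fov_deg) / 2))
    cam_scale_1x = 1.0 / (cam_dist * math.tan(math.radians(cam_fov_deg_at_1x) / 2))
    return virt_scale / cam_scale_1x
```

For instance, a monitoring camera at the same distance from the attention point as the virtual camera, with the same rated angle of view, needs fz = 1; a camera twice as far away needs roughly fz = 2 to match apparent size.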
- S402 corresponds to the CG video display step, S403 and S601 to the input step, S405 to the movement step, S407 and S602 to S606 to the control step, and S407 to the real video display step.
- the monitoring cameras 201 to 204 that can capture a real video similar to the moved CG video can be specified, making it easy to select the monitoring cameras 201 to 204 that can capture the image the user wants to view.
- furthermore, since the monitoring cameras 201 to 204 that can capture a real video similar to the CG video are specified from among the plurality of monitoring cameras 201 to 204, the monitoring cameras 201 to 204 that can capture the attention point 700 can be easily selected.
- in addition, a real video that matches the generated CG video is actually displayed, and the CG video can be matched with the real video even when the monitoring cameras 201 to 204 cannot capture exactly the pose of the virtual camera.
- the display unit 401 displays the CG video superimposed on the real video, so that the user can move the CG video while grasping the correspondence relationship with the real video.
- the second embodiment differs from the first embodiment in that the monitoring camera 201 to 204 that captures the real video is specified by receiving, from the user, an input selecting one of the plurality of monitoring cameras 201 to 204.
- Embodiment 2 of the present invention will be described with reference to FIG.
- FIG. 10 shows an outline of the optimum camera parameter calculation process when there is one virtual camera in the remote monitoring system according to the second embodiment of the present invention.
- in step S1001, the input unit 301 receives an input of a monitoring camera ID that identifies one of the monitoring cameras 201 to 204.
- the optimum camera calculation unit 102 identifies the monitoring cameras 201 to 204 that capture a real image based on the monitoring camera ID received in S1001.
- the 3D CG image generation unit 101 generates a CG video based on the 3D CG model and virtual camera parameters equivalent to the monitoring camera installation data of the monitoring cameras 201 to 204, and the generated CG video is displayed superimposed on the real video.
- the input unit 301 receives an input for the CG video displayed on the display unit 401.
- the virtual camera interactive setting unit 105 changes the [virtual camera direction] of the virtual camera parameter based on the input received in S403.
- [virtual camera position] is fixed at the same position as the [monitoring camera position] of the monitoring cameras 201 to 204 specified in S1002.
- the virtual camera interactive setting unit 105 transmits the changed virtual camera parameter to the three-dimensional CG image generation unit 101.
- the three-dimensional CG image generation unit 101 generates a CG video based on the virtual camera parameters changed in S1006 and the three-dimensional CG model, and displays the moved CG video superimposed on the real video on the display unit 401.
- the optimal camera calculation unit 102 determines the virtual camera parameters changed in S1006 (in particular, [virtual camera direction]) as the optimal camera parameters for capturing a real video similar to the CG video captured from the virtual camera. Then, the optimal camera calculation unit 102 transmits the optimal camera parameters to the camera control unit 103.
- the control unit 205 controls the monitoring cameras 201 to 204 based on the set value.
- the monitoring cameras 201 to 204 controlled by the control unit 205 capture real video, and the display unit 401 displays the real video captured by the controlled monitoring cameras 201 to 204 with the moved CG video superimposed on it.
- a user interface in which the virtual camera position 800 is not fixed even when the input unit 301 receives the monitoring camera ID of one of the monitoring cameras 201 to 204 as described above is also conceivable.
- in that case, an input setting the attention point 700 may be received, and the optimum camera parameters of the specified monitoring cameras 201 to 204 may be set using the set attention point 700.
- S1001 corresponds to the designation step, and S1008 and S1009 correspond to the control step.
- by controlling the designated monitoring cameras 201 to 204 to capture a real video similar to the CG video, in addition to the effects of Embodiment 1, the display of the real video can be made to follow the CG video without designating the attention point 700.
Abstract
Description
Embodiment 1 of the present invention will be described with reference to FIGS. 1 to 9.
101 Three-dimensional CG image generation unit
102 Optimal camera calculation unit
103 Camera control unit
104 Image input unit
105 Virtual camera interactive setting unit
106 Target CG model storage unit
107 Monitoring camera installation data storage unit
201, 202, 203, 204 Monitoring camera
205 Control unit
301 Input unit
401 Display unit
700 Attention point
800 Virtual camera position
901 Three-dimensional CG model
902, 903, 904, 905, 906 Straight line
Claims (10)
- A remote monitoring system comprising:
a display unit on which a CG video generated from a three-dimensional CG model is displayed;
an input unit that receives a user input for the CG video;
a three-dimensional CG image generation unit that displays, on the display unit, the CG video after being moved based on the input;
an optimal camera calculation unit that specifies a monitoring camera capable of capturing a real video similar to the CG video after being moved; and
a control unit that controls the specified monitoring camera,
wherein a real video captured by the controlled monitoring camera is displayed on the display unit. - The remote monitoring system according to claim 1, wherein
the input unit receives an input that sets an attention point, and
the optimal camera calculation unit specifies, using the position of the virtual camera and the set attention point, the monitoring camera capable of capturing the real video similar to the CG video from among a plurality of the monitoring cameras. - The remote monitoring system according to claim 1, wherein
the input unit receives an input that designates one of the monitoring cameras, and
the control unit causes the real video similar to the CG video to be captured by controlling the designated monitoring camera. - The remote monitoring system according to claim 1, wherein
the CG video after being moved based on the user input is adjusted using monitoring camera parameters capturable by the monitoring camera. - The remote monitoring system according to claim 1, wherein
the CG video is displayed superimposed on the real video on the display unit. - A monitoring method comprising:
a CG video display step of displaying a CG video generated from a three-dimensional CG model;
an input step of receiving a user input for the CG video;
a movement step of moving the CG video based on the input received from the user;
a control step of specifying the monitoring camera capable of capturing a real video similar to the CG video after being moved, and controlling the specified monitoring camera; and
a real video display step of displaying the real video captured by the controlled monitoring camera. - The monitoring method according to claim 6, wherein
the input step receives an input that sets an attention point, and
the control step specifies, using the position of the virtual camera and information on the attention point, the monitoring camera capable of capturing the real video similar to the CG video from among a plurality of the monitoring cameras, and controls the specified monitoring camera. - The monitoring method according to claim 6,
further comprising a designation step of designating one of the monitoring cameras,
wherein the control step captures the real video similar to the CG video of the virtual camera by controlling the designated monitoring camera. - The monitoring method according to claim 6, wherein
the movement step adjusts the CG video after being moved, using camera parameters capturable by the monitoring camera, after the user input is completed. - The monitoring method according to claim 6, wherein
the CG video display step displays the CG video superimposed on the real video,
the monitoring method.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015524966A JP6214653B2 (ja) | 2013-07-03 | 2013-07-03 | Remote monitoring system and monitoring method |
US14/901,757 US9967544B2 (en) | 2013-07-03 | 2013-07-03 | Remote monitoring system and monitoring method |
PCT/JP2013/068284 WO2015001635A1 (ja) | 2013-07-03 | 2013-07-03 | Remote monitoring system and monitoring method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2013/068284 WO2015001635A1 (ja) | 2013-07-03 | 2013-07-03 | Remote monitoring system and monitoring method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015001635A1 true WO2015001635A1 (ja) | 2015-01-08 |
Family
ID=52143255
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/068284 WO2015001635A1 (ja) | 2013-07-03 | 2013-07-03 | 遠隔モニタリングシステムおよびモニタリング方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9967544B2 (ja) |
JP (1) | JP6214653B2 (ja) |
WO (1) | WO2015001635A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10600245B1 (en) | 2014-05-28 | 2020-03-24 | Lucasfilm Entertainment Company Ltd. | Navigating a virtual environment of a media content item |
US10657729B2 (en) * | 2018-10-18 | 2020-05-19 | Trimble Inc. | Virtual video projection system to synch animation sequences |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1042272A (ja) * | 1996-07-24 | 1998-02-13 | Nippon Telegr & Teleph Corp <Ntt> | Remote monitoring device |
JP2001249633A (ja) * | 2000-03-03 | 2001-09-14 | Hideki Araki | Product advertising method and system using a three-dimensional CG model |
JP2002269593A (ja) * | 2001-03-13 | 2002-09-20 | Canon Inc | Image processing apparatus and method, and storage medium |
JP2010200167A (ja) * | 2009-02-26 | 2010-09-09 | Toshiba Corp | Monitoring system and monitoring method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8471910B2 (en) * | 2005-08-11 | 2013-06-25 | Sightlogix, Inc. | Methods and apparatus for providing fault tolerance in a surveillance system |
JP4960659B2 (ja) * | 2006-06-20 | 2012-06-27 | クボテック株式会社 | Video camera shooting control device and video camera shooting control method using a three-dimensional virtual space |
JP2008259154A (ja) | 2007-04-06 | 2008-10-23 | Kubo Tex Corp | Method for real-time state grasping and control of a sensing device using a three-dimensional virtual space |
JP5727207B2 (ja) * | 2010-12-10 | 2015-06-03 | セコム株式会社 | Image monitoring device |
JP5714960B2 (ja) * | 2011-03-31 | 2015-05-07 | セコム株式会社 | Monitoring range detection device |
2013
- 2013-07-03 JP JP2015524966A patent/JP6214653B2/ja not_active Expired - Fee Related
- 2013-07-03 WO PCT/JP2013/068284 patent/WO2015001635A1/ja active Application Filing
- 2013-07-03 US US14/901,757 patent/US9967544B2/en not_active Expired - Fee Related
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021015440A (ja) * | 2019-07-11 | 2021-02-12 | キヤノン株式会社 | Information processing device, setting method, and program |
JP7455524B2 (ja) | 2019-07-11 | 2024-03-26 | キヤノン株式会社 | Information processing device, setting method, and program |
Also Published As
Publication number | Publication date |
---|---|
JP6214653B2 (ja) | 2017-10-18 |
JPWO2015001635A1 (ja) | 2017-02-23 |
US20160205379A1 (en) | 2016-07-14 |
US9967544B2 (en) | 2018-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210142582A1 (en) | Display of an occluded object in a hybrid-reality system | |
WO2017114508A1 (zh) | 三维监控系统中基于三维重构的交互式标定方法和装置 | |
JP6062039B2 (ja) | 画像処理システムおよび画像処理用プログラム | |
US20140368621A1 (en) | Image processing apparatus, image processing method, and computer program product | |
JP6310149B2 (ja) | 画像生成装置、画像生成システム及び画像生成方法 | |
KR20150067197A (ko) | 비디오 시각 변경 방법 및 장치 | |
EP3486749B1 (en) | Provision of virtual reality content | |
TWI821220B (zh) | 影像擷取之設備及方法 | |
JP2008005450A (ja) | 3次元仮想空間を利用したビデオカメラのリアルタイム状態把握、制御の方法 | |
JP6939801B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP5727207B2 (ja) | 画像監視装置 | |
JP6214653B2 (ja) | 遠隔モニタリングシステムおよびモニタリング方法 | |
KR20110088995A (ko) | 3차원 모델 안에서 감시 카메라 영상을 시각화하기 위한 방법 및 시스템, 및 기록 매체 | |
CN113677412B (zh) | 信息处理装置、信息处理方法和程序 | |
KR101572800B1 (ko) | 3차원 공간 모델에 대한 카메라 제어 시스템 | |
KR101710860B1 (ko) | 영상정보를 기반으로 공간상의 위치정보를 생성하는 방법 및 그 장치 | |
JP2021197572A (ja) | カメラ制御装置及びプログラム | |
JP5960472B2 (ja) | 画像監視装置 | |
JP6358998B2 (ja) | 警備シミュレーション装置 | |
JP2022012900A (ja) | 情報処理装置、表示方法、及び、プログラム | |
Zhao et al. | Active visual mapping system for digital operation environment of bridge crane | |
JP2005252831A (ja) | 設備監視支援装置 | |
KR102644608B1 (ko) | 디지털 트윈 기반의 카메라 위치 초기화 방법 | |
US12095964B2 (en) | Information processing apparatus, information processing method, and storage medium | |
WO2012091537A1 (en) | System and method for navigation and visualization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13888605 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015524966 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14901757 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13888605 Country of ref document: EP Kind code of ref document: A1 |