WO2013111479A1 - Monitoring system (Système de surveillance) - Google Patents

Monitoring system

Info

Publication number
WO2013111479A1
WO2013111479A1 (application PCT/JP2012/082904)
Authority
WO
WIPO (PCT)
Prior art keywords
monitoring
information
image
vehicle
terminal device
Prior art date
Application number
PCT/JP2012/082904
Other languages
English (en)
Japanese (ja)
Inventor
照久 高野
真史 安原
秋彦 香西
Original Assignee
日産自動車株式会社 (Nissan Motor Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日産自動車株式会社 (Nissan Motor Co., Ltd.)
Publication of WO2013111479A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639 Details of the system layout
    • G08B13/19647 Systems specially adapted for intrusion detection in or around a vehicle
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • H04N7/185 Closed-circuit television [CCTV] systems for receiving images from a single remote source from a mobile camera, e.g. for remote control

Definitions

  • The present invention relates to a monitoring system.
  • This application claims priority based on Japanese Patent Application No. 2012-011447, filed on Jan. 23, 2012.
  • The contents of that application are incorporated into the present application by reference and made a part of this description.
  • A security device is known that detects the occurrence of abnormalities by installing multiple security camera devices in shopping streets, at store entrances, at home entrances, and along other streets, and monitoring the surrounding images captured by those security camera devices (Patent Document 1).
  • An object of the present invention is to provide a monitoring system capable of continuously monitoring a predetermined point even when a camera mounted on a moving body is used.
  • The present invention achieves the above object by selecting the mobile monitoring terminal device of a moving body that approaches or leaves a monitoring area set for monitoring a predetermined point, and outputting a transmission command that causes the selected monitoring terminal device to transmit monitoring information including image information.
  • According to the present invention, since an image transmission command is output so that monitoring information including image information is transmitted from the monitoring terminal device of a moving body that approaches or leaves the monitoring area, a predetermined monitoring point can be monitored continuously even though the monitoring terminal devices are mounted on moving bodies. As a result, the central monitoring device can monitor a fixed point using monitoring terminal devices mounted on moving bodies that move at random.
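The selection-and-command step described above can be sketched as follows. This is an illustrative reading only; the radius-based area test, the planar distance, and all names (`select_terminals`, `AREA_RADIUS_M`) are assumptions, not details from the publication.

```python
import math

AREA_RADIUS_M = 200.0  # assumed radius of the monitoring area around the point

def distance_m(p, q):
    # Planar distance; a deployed system would use geodesic distance.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def select_terminals(point, vehicles):
    """Return the ids of vehicles whose terminals should receive an
    image transmission command: those inside the monitoring area, or
    approaching it. `vehicles` maps id -> (previous_pos, current_pos)."""
    selected = []
    for vid, (prev_pos, cur_pos) in vehicles.items():
        d_prev = distance_m(prev_pos, point)
        d_cur = distance_m(cur_pos, point)
        if d_cur <= AREA_RADIUS_M or d_cur < d_prev:
            selected.append(vid)
    return selected
```

A vehicle that is leaving the area is still selected while it remains inside the radius, which is one way to read the "approaches or leaves" wording.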
  • FIG. 1 is a schematic diagram showing a monitoring system according to an embodiment of the present invention; FIG. 2 is a block diagram showing the monitoring system of FIG. 1; FIG. 3 is a perspective view showing the arrangement of the in-vehicle cameras.
  • The monitoring system is embodied as a monitoring system 1 with which authorities such as a police station and a fire station centrally monitor the security of a city. That is, the position information of each of a plurality of moving bodies, the image information of their surroundings, and time information are acquired at predetermined timings, the acquired information is transmitted via wireless communication, the position information is displayed on map information, and the image information and time information are displayed on a display as necessary. To this end, as shown in FIG. 1, the monitoring system 1 of this example includes monitoring terminal devices 10 that acquire monitoring information such as position information and image information, and a central monitoring device 20 that acquires and processes the monitoring information via a telecommunications network 30.
  • FIG. 2 is a block diagram showing a specific configuration of the monitoring terminal device 10 and the central monitoring device 20.
  • the monitoring system of this embodiment continuously acquires monitoring information regarding a predetermined monitoring point.
  • The monitoring terminal device 10 is a terminal device mounted on each of a plurality of moving bodies V, and has: a position detection function that detects the position information of its moving body V; an image generation function that captures the surroundings of the moving body with cameras and generates image information; a time detection function; an information acquisition control function that acquires position information, image information, and time information at predetermined timings; a monitoring information generation function that generates monitoring information including the position information and/or the image information; a communication function that outputs the position information, image information, and time information to the central monitoring device 20 and acquires commands from the central monitoring device 20; and a function for reporting the occurrence of an abnormality.
  • the monitoring terminal device 10 can exchange information with the vehicle controller 17 that centrally controls the vehicle speed sensor 18, the navigation device 19, and other vehicle-mounted electronic devices.
  • the monitoring terminal device 10 can transmit the vehicle speed information acquired via the vehicle speed sensor 18 and the vehicle controller 17 to the central monitoring device 20 as a part of the monitoring information.
  • the time information is mainly information used for post-event analysis, and may be omitted.
  • The moving body V on which the monitoring terminal device 10 is mounted is not particularly limited as long as it travels in the target monitoring area, and includes moving bodies such as passenger cars, motorcycles, industrial vehicles, and trams. The moving bodies include the taxi V1, the private passenger car V2, and the emergency passenger car V3; among these, a taxi or route bus V1 that travels randomly and constantly in a predetermined area is particularly preferable.
  • FIG. 1 illustrates a taxi V1, a private passenger car V2, and an emergency passenger car V3 such as a police car, a fire engine, or an ambulance; these are collectively referred to as the moving body V or the passenger car V.
  • Each moving body V includes a plurality of in-vehicle cameras 11a to 11e (hereinafter collectively referred to as cameras 11), an image processing device 12, a communication device 13, an in-vehicle control device 14, a position detection device 15, and a notification button 16.
  • the camera 11 is composed of a CCD camera or the like, images the surroundings of the moving object V, and outputs the image pickup signal to the image processing device 12.
  • the image processing device 12 reads an imaging signal from the camera 11 and performs image processing on the image information. Details of this image processing will be described later.
  • the position detection device 15 is composed of a GPS device and its correction device, etc., detects the current position of the moving object V, and outputs it to the in-vehicle control device 14.
  • The notification button 16 is a manual input button installed in the passenger compartment, with which a driver or passenger inputs information reporting an abnormality upon finding an incident (an incident related to security, such as an accident, fire, or crime). This information can include the position information of the moving body V that reported the abnormality.
  • The in-vehicle control device 14 includes a CPU, a ROM, and a RAM. When the notification button 16 is pressed, it controls the image processing device 12, the communication device 13, and the position detection device 15, and outputs the image information generated by the image processing device 12, the position information of the moving body V detected by the position detection device 15, and the time information from the clock built into the CPU to the central monitoring device 20 via the communication device 13 and the telecommunications network 30.
  • In addition, when a command requesting information, such as an image transmission command, is received from the central monitoring device 20 via the telecommunications network 30 and the communication device 13, the in-vehicle control device 14 controls the image processing device 12, the communication device 13, and the position detection device 15, and outputs monitoring information including the image information generated by the image processing device 12, the position information of the moving body V detected by the position detection device 15, and the time information from the clock built into the CPU to the central monitoring device 20 via the communication device 13 and the telecommunications network 30.
  • the in-vehicle control device 14 can store monitoring information including image information, position information, time information, and the like for at least a predetermined time.
  • The communication device 13 is communication means capable of wireless communication, and exchanges information with the communication device 23 of the central monitoring device 20 via the telecommunications network 30. When the telecommunications network 30 is a commercial telephone network, mobile phone communication devices can be widely used as the communication devices 13 and 23; when the telecommunications network 30 is a dedicated network for the monitoring system 1 of this example, dedicated communication devices 13 and 23 can be used. A wireless LAN, WiFi (registered trademark), WiMAX (registered trademark), Bluetooth (registered trademark), a dedicated wireless line, or the like can also be used.
  • The central monitoring device 20 has: an information acquisition function that acquires the position information and image information output from the monitoring terminal devices 10 described above; a storage function that stores the acquired monitoring information in the database 26, at least temporarily, in association with the position information; and a display control function that displays map information from a map database, superimposes the received position information on the map information, and displays the received image information on the display 24.
  • The central monitoring device 20 also has a command output function: it refers to the database 26, selects the monitoring terminal device 10 of a passenger car V that approaches and/or leaves the monitoring area set around the selected monitoring point, and outputs to the selected monitoring terminal device 10 an image transmission command instructing it to transmit monitoring information that includes at least image information and, as necessary, position information and time information.
  • When selecting a passenger car V that approaches and/or leaves the monitoring area, the selection function of the present embodiment hops from one target passenger car V to the next as the passenger cars V move. Specifically, when the previously selected passenger car V has passed through the monitoring area, the central monitoring device 20 selects the monitoring terminal device 10 of another passenger car V that is approaching the monitoring area.
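The hopping behaviour described in this bullet can be sketched as a small selection routine; the pass-through test and every name here (`hop_target`, the `radius` parameter) are illustrative assumptions, not the patent's implementation.

```python
import math

def _dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def hop_target(point, current_id, vehicles, radius=200.0):
    """Keep the currently selected vehicle until it has passed through
    the monitoring area; then hop to the nearest approaching vehicle.
    `vehicles` maps id -> (previous_pos, current_pos)."""
    prev_pos, cur_pos = vehicles[current_id]
    passed = (_dist(cur_pos, point) > radius
              and _dist(cur_pos, point) >= _dist(prev_pos, point))
    if not passed:
        return current_id
    # Candidate terminals: other vehicles getting closer to the point.
    candidates = [vid for vid, (p, c) in vehicles.items()
                  if vid != current_id and _dist(c, point) < _dist(p, point)]
    if not candidates:
        return None  # nothing to hop to right now
    return min(candidates, key=lambda vid: _dist(vehicles[vid][1], point))
```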
  • The central control device 21 includes a CPU, a ROM, and a RAM; it controls the image processing device 22, the communication device 23, and the display 24, receives the position information, image information, and time information transmitted from the monitoring terminal devices 10, performs image processing as necessary, and displays the result on the display 24.
  • The image processing device 22 has a map database, displays map information from the map database on the display 24, and superimposes the position information detected by the position detection devices 15 of the monitoring terminal devices 10 on the map information. It also performs image processing for displaying, on the display 24, the image information captured by the in-vehicle cameras 11 of the monitoring terminal devices 10 and processed by the image processing devices 12.
  • The display 24 can be composed of, for example, one liquid crystal display device of a size capable of displaying two window screens on one screen, or two liquid crystal display devices each displaying one of two window screens. One window screen displays the position information of each moving body V superimposed on the map information (see FIG. 1), and the other window screen displays the image information captured by the in-vehicle cameras 11.
  • The input device 25 is constituted by a keyboard or a mouse, and is used to input an information acquisition command to be output to a desired moving body V, or to input various commands for processing the information displayed on the display 24.
  • the monitoring point serving as the reference for the monitoring area can be input by the monitor via the input device 25.
  • The monitor can specify a monitoring point by clicking (selecting and inputting) the icon of a point superimposed on the map information, and can set a monitoring area based on this monitoring point. The monitor can also set a monitoring area surrounded by a plurality of selected points by continuously selecting and inputting points superimposed on the map information, starting from an arbitrary monitoring point.
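A monitoring area set by clicking a sequence of points is naturally a polygon, and deciding whether a vehicle's reported position falls inside it can use a standard ray-casting test. The function below is a generic sketch, not code from the publication.

```python
def point_in_polygon(pt, poly):
    """Ray-casting point-in-polygon test. `poly` is the list of map
    points selected by the monitor, in click order; `pt` is a vehicle
    position. Returns True when `pt` lies inside the area."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]  # edge from vertex i to the next one
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal through pt
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```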
  • The communication device 23 is communication means capable of wireless communication, and exchanges information with the communication devices 13 of the monitoring terminal devices 10 via the telecommunications network 30. As described above, when the telecommunications network 30 is a commercial telephone network, mobile phone communication devices can be widely used as the communication devices 13 and 23; when the telecommunications network 30 is a dedicated network for the monitoring system 1 of this example, dedicated communication devices 13 and 23 can be used.
  • The cameras 11a to 11e are configured using an image sensor such as a CCD; the four in-vehicle cameras 11a to 11d are installed at different positions outside the passenger car V and shoot in four directions around the vehicle. The cameras 11 of this embodiment have a zoom-up function for enlarging a subject, and can arbitrarily change the focal length or the imaging magnification according to a control command.
  • The in-vehicle camera 11a, installed at a predetermined position at the front of the passenger car V such as the front grille, images objects and the road surface present in the area SP1 in front of the passenger car V and in the space ahead of it (front view).
  • The in-vehicle camera 11b, installed at a predetermined position on the left side of the passenger car V such as the left side mirror, images objects and the road surface present in the area SP2 on the left side of the passenger car V and in its surrounding space (left side view).
  • The in-vehicle camera 11c, installed at a predetermined position at the rear of the passenger car V such as the rear finisher or roof spoiler, images objects and the road surface present in the area SP3 behind the passenger car V and in the space behind it (rear view).
  • The in-vehicle camera 11d, installed at a predetermined position on the right side of the passenger car V such as the right side mirror, images objects and the road surface present in the area SP4 on the right side of the passenger car V and in its surrounding space (right side view).
  • The one remaining in-vehicle camera 11e is installed, for example, on the ceiling of the passenger compartment and images the area SP5 in the passenger compartment, as shown in FIG. 3; it is used for crime prevention or crime reporting.
  • FIG. 4 is a view of the arrangement of the in-vehicle cameras 11a to 11e as viewed from above the passenger car V.
  • The in-vehicle camera 11a that images the area SP1, the in-vehicle camera 11b that images the area SP2, the in-vehicle camera 11c that images the area SP3, and the in-vehicle camera 11d that images the area SP4 are installed along the outer periphery VE of the body, in the counterclockwise or clockwise direction.
  • Following the counterclockwise direction, the in-vehicle camera 11b is installed on the left side of the in-vehicle camera 11a, the in-vehicle camera 11c on the left side of the in-vehicle camera 11b, the in-vehicle camera 11d on the left side of the in-vehicle camera 11c, and the in-vehicle camera 11a on the left side of the in-vehicle camera 11d. Conversely, following the clockwise direction, the in-vehicle camera 11d is installed on the right side of the in-vehicle camera 11a, the in-vehicle camera 11c on the right side of the in-vehicle camera 11d, the in-vehicle camera 11b on the right side of the in-vehicle camera 11c, and the in-vehicle camera 11a on the right side of the in-vehicle camera 11b.
  • FIG. 5A shows an example of an image GSP1 in which the front in-vehicle camera 11a images the area SP1; FIG. 5B shows an example of an image GSP2 in which the left-side in-vehicle camera 11b images the area SP2; FIG. 5C shows an example of an image GSP3 in which the rear in-vehicle camera 11c images the area SP3; FIG. 5D shows an example of an image GSP4 in which the right-side in-vehicle camera 11d images the area SP4; and FIG. 5E shows an example of an image in which the indoor in-vehicle camera 11e images the area SP5.
  • The size of each image is 480 pixels vertically × 640 pixels horizontally.
  • the image size is not particularly limited, and may be any size as long as a general terminal device can reproduce a moving image.
  • the number and position of the in-vehicle camera 11 can be appropriately determined according to the size, shape, detection area setting method, etc. of the passenger car V.
  • the plurality of in-vehicle cameras 11 described above are assigned identifiers corresponding to the respective arrangements, and the in-vehicle control device 14 can identify each of the in-vehicle cameras 11 based on each identifier.
  • the vehicle-mounted control apparatus 14 can transmit an imaging command and other commands to a specific vehicle-mounted camera 11 by attaching an identifier to the command signal.
  • The in-vehicle control device 14 controls the image processing device 12 to acquire the imaging signals captured by the in-vehicle cameras 11, and the image processing device 12 processes the imaging signals from the in-vehicle cameras 11 and converts them into the image information shown in FIGS. 5A to 5E. The in-vehicle control device 14 then generates a monitoring image based on the four pieces of image information shown in FIGS. 5A to 5D (image generation function), associates with the monitoring image the mapping information for projecting the monitoring image onto the side surface of a columnar projection model (mapping information addition function), and outputs the result to the central monitoring device 20.
  • the image generation function and the mapping information addition function will be described in detail.
  • The process of generating a monitoring image based on the four pieces of image information obtained by imaging the periphery of the passenger car V, and of associating the mapping information with that monitoring image, can be executed by the monitoring terminal device 10, as in this example, or by the central monitoring device 20. In the latter case, the four pieces of image information obtained by imaging the periphery of the passenger car V are transmitted as they are from the monitoring terminal device 10 to the central monitoring device 20, and the image processing device 22 and central control device 21 of the central monitoring device 20 generate the monitoring image, associate the mapping information, and perform the projection conversion.
  • The in-vehicle control device 14 of the monitoring terminal device 10 of the present embodiment controls the image processing device 12 to acquire the imaging signals of the in-vehicle cameras 11a to 11e, and generates one monitoring image in which the image information of the in-vehicle cameras 11a to 11d, installed clockwise or counterclockwise along the outer periphery of the body of the passenger car V, is arranged in the order in which these in-vehicle cameras 11a to 11d are installed.
  • In the present embodiment, the four in-vehicle cameras 11a to 11d are installed in the order 11a, 11b, 11c, 11d in the counterclockwise direction along the outer periphery VE of the body of the passenger car V. Therefore, the in-vehicle control device 14 connects the four images captured by the in-vehicle cameras 11a to 11d horizontally, in accordance with the installation order of the in-vehicle cameras (11a → 11b → 11c → 11d), to generate a single monitoring image. In the monitoring image of the present embodiment, each image is arranged so that the ground contact surface (road surface) of the passenger car V is at the bottom, and the images are connected to each other at their sides in the height direction (vertical direction) relative to the road surface.
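The horizontal joining step can be sketched with plain nested lists standing in for frames (each frame a list of pixel rows of equal height); the function name and data layout are assumptions for illustration.

```python
def make_monitoring_image(front, left, rear, right):
    """Concatenate the four camera frames side by side in installation
    order (11a -> 11b -> 11c -> 11d), road surface at the bottom edge.
    Each frame is a list of rows; all frames must share one height."""
    frames = [front, left, rear, right]
    height = len(front)
    assert all(len(f) == height for f in frames), "frame heights differ"
    # Row r of the monitoring image is row r of every frame, joined.
    return [sum((f[r] for f in frames), []) for r in range(height)]
```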
  • FIG. 6 is a diagram illustrating an example of the monitoring image K.
  • In the monitoring image K of the present embodiment, a captured image GSP1 in which the front in-vehicle camera 11a images the area SP1, a captured image GSP2 in which the left-side in-vehicle camera 11b images the area SP2, a captured image GSP3 in which the rear in-vehicle camera 11c images the area SP3, and a captured image GSP4 in which the right-side in-vehicle camera 11d images the area SP4 are arranged in this order in the horizontal direction, along the direction P from the left side to the right side of the drawing, as one series of images. Since the monitoring image K generated in this way is displayed with the image corresponding to the road surface (the vehicle contact surface) facing down and read in order from the left end to the right, the monitor can view it on the display 24 in a manner similar to looking around the periphery of the passenger car V counterclockwise.
  • When one monitoring image K is generated, four images captured at substantially the same time by the in-vehicle cameras 11a to 11d are used. Since the information contained in the monitoring image K is thereby synchronized, the situation around the vehicle at a given timing can be expressed accurately.
  • The monitoring images K, each generated from captured images having substantially the same imaging timing, may also be stored over time to generate a moving-image monitoring image K containing a plurality of monitoring images K per predetermined unit time. By generating the moving-image monitoring image K from images with the same imaging timing, changes in the situation around the vehicle can be represented accurately.
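Building each monitoring image K only from frames captured at substantially the same time can be sketched as timestamp grouping; the tolerance value and all names here are illustrative assumptions.

```python
def group_frames(frames, tol=0.05):
    """Group frames whose timestamps agree to within `tol` seconds of
    the first frame of the group. `frames` is a list of
    (timestamp_s, camera_id, image) tuples sorted by timestamp; each
    returned group can feed one monitoring image K."""
    groups = []
    for t, cam, img in frames:
        if groups and t - groups[-1][0][0] <= tol:
            groups[-1].append((t, cam, img))
        else:
            groups.append([(t, cam, img)])
    return groups
```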
  • A conventional central monitoring device has the disadvantage that it cannot watch images (moving images) in a plurality of directions simultaneously and cannot monitor the entire vehicle periphery on a single screen. The in-vehicle control device 14 of this embodiment therefore generates a monitoring image K in which the four images are arranged in one horizontal row, so that the entire periphery of the vehicle can be monitored on one screen.
  • The monitoring terminal device 10 of the present embodiment compresses the image data so that the number of pixels of the monitoring image K is substantially the same as the number of pixels of one image from the in-vehicle cameras 11a to 11d. Since the size of each image shown in FIGS. 5A to 5D is 480 × 640 pixels, compression processing is performed so that the size of the monitoring image K becomes 1280 × 240 pixels, as shown in FIG. 6. As a result, a general terminal device can process and reproduce the monitoring image K.
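The figures imply each 480 × 640 frame is reduced to 240 × 320 before joining, so that four frames side by side give the 1280 × 240 strip with the same total pixel count as one source frame. Below is a nearest-neighbour decimation sketch; the actual device may use a different resampling method.

```python
def downsample(frame, fx=2, fy=2):
    """Keep every fy-th row and every fx-th column of `frame` (a list
    of pixel rows), e.g. 480x640 -> 240x320, so that four such frames
    joined horizontally form a 1280x240 monitoring image."""
    return [row[::fx] for row in frame[::fy]]
```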
  • the in-vehicle control device 14 of the present embodiment can also attach a line figure indicating the boundary of each arranged image to the monitoring image K.
  • Specifically, the in-vehicle control device 14 can attach to the monitoring image K rectangular partition images Bb, Bc, Bd, Ba, and Ba′ between the arranged images, as line figures indicating the boundaries between them.
  • the partition image functions as a frame of each captured image.
  • Since image distortion is large near the boundaries of the captured images, arranging a partition image at each boundary makes it possible to hide the strongly distorted regions, or to suggest that the distortion there is large.
  • The in-vehicle control device 14 of this embodiment can also generate the monitoring image K after correcting the distortion of the captured images. Image distortion is likely to occur, and tends to be large, near the boundaries of the captured images; it is therefore desirable to correct the distortion of each captured image in advance, using a predefined image conversion algorithm and correction amount.
  • The in-vehicle control device 14 can also read from the ROM the same projection model information as the projection model used by the central monitoring device 20 to project the monitoring image K, project the captured images onto the projection plane of that projection model, and correct in advance the distortion generated on the projection plane. The image conversion algorithm and the correction amount can be defined appropriately according to the characteristics of the in-vehicle cameras 11 and the shape of the projection model. By correcting in advance the distortion that arises when the monitoring image K is projected onto the projection plane of the projection model, a monitoring image K with little distortion and good visibility can be provided; correcting the distortion in advance also reduces the positional deviation between the images arranged side by side.
  • the mapping information addition function will be described.
  • The in-vehicle control device 14 executes a process of associating with the monitoring image K the mapping information for projecting the generated monitoring image K onto the projection plane set on the side surface of a columnar projection model M whose bottom surface is the ground contact surface of the passenger car V. The mapping information is information that allows the central monitoring device 20, which receives the monitoring image K, to easily recognize the projection reference position.
  • FIG. 8 is a diagram showing an example of the projection model M of the present embodiment
  • FIG. 9 is a schematic sectional view taken along the xy plane of the projection model M shown in FIG.
  • The projection model M of this embodiment is a regular octagonal prism having a regular octagonal bottom surface and a height along the vertical direction (the z-axis direction in the figure). The shape of the projection model M is not particularly limited as long as it is a columnar body having side surfaces adjacent to each other along the boundary of the bottom surface; a cylinder, a prism such as a triangular, quadrangular, or hexagonal prism, or an antiprism having a polygonal bottom surface and triangular side surfaces can also be used.
  • the bottom surface of the projection model M of this embodiment is parallel to the ground contact surface of the passenger car V.
  • Projection surfaces Sa, Sb, Sc, and Sd (hereinafter collectively referred to as the projection surface S), onto which the image around the passenger car V in contact with the bottom surface of the projection model M is projected, are set on the inner side surfaces of the projection model M. The projection surface S can also be constituted by portions spanning adjacent surfaces: a part of the projection surface Sa and a part of the projection surface Sb, a part of Sb and a part of Sc, a part of Sc and a part of Sd, and a part of Sd and a part of Sa.
  • The monitoring image K is projected onto the projection surface S as an image of the surroundings of the passenger car V viewed from viewpoints R (R1 to R8; hereinafter collectively referred to as the viewpoint R) located above the projection model M and surrounding the passenger car V.
  • the in-vehicle control device 14 associates the reference coordinates of the captured image arranged at the right end or the left end with the monitoring image K as mapping information.
  • Specifically, as mapping information (reference coordinates) indicating the start position and the end position of the monitoring image K when it is projected onto the projection model M, the in-vehicle control device 14 attaches to the monitoring image K the coordinates A(x, y) of the upper-left vertex of the captured image GSP1 arranged at the left end and the coordinates B(x, y) of the upper-right vertex of the captured image arranged at the right end.
  • The reference coordinates indicating the start position or the end position are not particularly limited, and may instead be, for example, the lower-left vertex of the captured image arranged at the left end of the monitoring image K or the lower-right vertex of the captured image arranged at its right end.
  • the mapping information may be attached to each pixel of the image data of the monitoring image K, or may be managed as a file different from the monitoring image K.
  • information indicating the start position or the end position of the monitoring image K, that is, the reference coordinates used as the reference in the projection processing, is associated with the monitoring image K as mapping information. The central monitoring device 20 that receives the monitoring image K can therefore easily recognize the reference position for the projection processing, and can sequentially and easily project the monitoring image K, in which the captured images are arranged in the installation order of the in-vehicle cameras 11a to 11d, onto the projection surface S on the side of the projection model M. That is, as shown in FIG. 9, the captured image GSP1 in front of the vehicle can be projected onto the projection surface Sa positioned in the imaging direction of the in-vehicle camera 11a, the captured image GSP2 on the right side of the vehicle onto the projection surface Sb positioned in the imaging direction of the in-vehicle camera 11b, the captured image GSP3 behind the vehicle onto the projection surface Sc positioned in the imaging direction of the in-vehicle camera 11c, and the captured image GSP4 on the left side of the vehicle onto the projection surface Sd positioned in the imaging direction of the in-vehicle camera 11d.
  • the monitoring image K projected onto the projection model M can thus show the surroundings as if one were looking around from the passenger car V. That is, since the monitoring image K, which contains the four images arranged in a horizontal line according to the installation order of the in-vehicle cameras 11a to 11d, is projected onto the side surfaces of the columnar projection model M, which are likewise arranged in the horizontal direction, the image around the passenger car V can be reproduced on the projection surface S of the columnar projection model M while maintaining its positional relationships.
  • the in-vehicle control device 14 of the present embodiment may store the correspondence between each coordinate value of the monitoring image K and the coordinate values of each projection surface S of the projection model M as mapping information and attach it to the monitoring image K; alternatively, this correspondence may be stored in the central monitoring device 20 in advance.
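The mapping just described, in which a combined monitoring image K with reference coordinates for its start end is divided among the surfaces Sa to Sd, can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the function name, the equal-width split into four strips, and the pixel-based coordinates are all assumptions made for the example.

```python
# Illustrative sketch: assign horizontal pixel ranges of the combined
# monitoring image K to the four projection surfaces, offsetting by the
# reference x coordinate so the start end of K lines up with the start
# end defined on the projection model M. Names are assumptions.

def split_monitoring_image(width, reference_x, order=("Sa", "Sb", "Sc", "Sd")):
    """Return {surface: (start_x, end_x)} for each projection surface,
    given the pixel width of K and the x coordinate of its start end."""
    strip = width // len(order)
    ranges = {}
    for i, surface in enumerate(order):
        # Every strip is shifted by the reference coordinate attached to K
        # as mapping information; the modulo handles wraparound.
        start = (reference_x + i * strip) % width
        ranges[surface] = (start, (start + strip) % width)
    return ranges
```

For a 1920-pixel-wide image with the reference at x = 0, this assigns 480-pixel strips to Sa through Sd in installation order.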
  • the positions of the viewpoint R and the projection plane S shown in FIGS. 8 and 9 are examples, and can be arbitrarily set.
  • the viewpoint R can be changed by the operation of the operator.
  • the relationship between the viewpoint R and the projection position of the monitoring image K is defined in advance; when the position of the viewpoint R is changed, a predetermined coordinate transformation is performed so that the monitoring image K can be projected onto the projection surface S (Sa to Sd) as viewed from the newly set viewpoint R. A known method can be used for this viewpoint conversion processing.
  • the in-vehicle control device 14 generates the monitoring image K based on image information captured at a predetermined timing, associates the mapping information (reference coordinates) and the information on the line figures indicating the boundaries (partition images) with it, and stores it over time according to the imaging timing.
  • the in-vehicle control device 14 may store the monitoring image K as a single moving-image file containing a plurality of monitoring images K per predetermined unit time, or may store the monitoring image K in a format that can be transferred and reproduced by streaming.
  • the communication device 23 of the central monitoring device 20 receives the monitoring image K transmitted from the monitoring terminal device 10 together with the mapping information associated with it. As described above, in this monitoring image K the images captured by the four in-vehicle cameras 11 installed at different positions on the body of the passenger car V are arranged according to the installation order of the in-vehicle cameras 11a to 11d (clockwise or counterclockwise along the outer periphery of the body of the passenger car V), and the monitoring image K is associated with mapping information for projecting it onto the projection surface S of the octagonal-prism projection model M.
  • the communication device 23 transmits the acquired monitoring image K and mapping information to the image processing device 22.
  • the image processing device 22 reads the projection model M stored in advance and, based on the mapping information, generates a display image by projecting the monitoring image K onto the projection surfaces Sa to Sd set on the side surfaces of the octagonal-prism projection model M whose bottom surface is the ground contact surface of the passenger car V shown in FIGS. 8 and 9. Specifically, each pixel of the received monitoring image K is projected onto the corresponding pixel of the projection surfaces Sa to Sd according to the mapping information. When projecting the monitoring image K onto the projection model M, the image processing device 22 recognizes the start point of the monitoring image K (its right end or left end) based on the reference coordinates received together with it, and performs the projection processing so that this start point coincides with the start point defined in advance on the projection model M (the right end or left end of the projection surface S). Furthermore, when projecting the monitoring image K onto the projection model M, the image processing device 22 arranges line figures (partition images) indicating the boundaries of the individual images on the projection model M.
  • the partition image can be attached to the projection model M in advance, or can be attached to the monitoring image K after the projection processing.
  • the display 24 displays the monitoring image K projected on the projection plane S of the projection model M.
  • FIG. 10 shows an example of a display image of the monitoring image K.
  • using the input device 25, such as a mouse or keyboard, or the display 24 serving as a touch-panel input device 25, the supervisor can freely set and change the viewpoint. Since the correspondence between the viewpoint position and the projection surface S is defined in advance in the image processing device 22 or the display 24 described above, the monitoring image K corresponding to the changed viewpoint can be displayed on the display 24 based on this correspondence.
  • FIG. 11 is a flowchart showing the operation on the monitoring terminal device 10 side
  • FIGS. 12A and 12B are flowcharts showing the operation on the central monitoring device 20 side
  • FIG. 13 is a diagram showing an example of database information.
  • As shown in FIG. 11, the monitoring terminal device 10 acquires surrounding video and interior video from the in-vehicle cameras 11 at predetermined time intervals (one routine of FIG. 11), and the image processing device 12 converts the video information into image information (step ST1). The current position information of the passenger car V on which the monitoring terminal device 10 is mounted is also detected by the position detection device 15 equipped with GPS (step ST2).
  • in step ST3, it is determined whether the report button 16 for reporting an abnormality has been pressed. If the report button 16 has been pressed, the process proceeds to step ST4, where the image information acquired in step ST1 and the position information acquired in step ST2 are associated with the time information of the CPU, and these are transmitted as monitoring information, together with abnormality information indicating that an abnormality has occurred, to the central monitoring device 20 via the communication device 13 and the telecommunications network 30.
  • the occurrence of a security-related abnormality such as an accident or crime is thereby automatically transmitted to the central monitoring device 20 together with the position information of the passenger car V and the image information around the passenger car V, further strengthening monitoring in the city.
  • the image information and the position information are acquired in the first steps ST1 and ST2, but the image information and the position information may be acquired at a timing between steps ST3 and ST4.
  • step ST3 if the report button 16 has not been pressed, the process proceeds to step ST5 to communicate with the central monitoring device 20 and obtain a control command.
  • in step ST6, the monitoring terminal device 10 determines whether an image transmission command has been acquired from the central monitoring device 20. If so, the process proceeds to step ST7, where monitoring information including the image information, position information, and time information is transmitted to the central monitoring device 20; if the image transmission command includes a storage command, the image information, position information, and time information are also stored.
  • even if no image transmission command has been acquired from the central monitoring device 20 in step ST6, if the passenger car V is present in a predefined priority monitoring area (step ST8), the process proceeds to step ST10 and monitoring information including image information is transmitted. On the other hand, if no image transmission command has been acquired and the vehicle is not in a priority monitoring area, the process proceeds to step ST9, and monitoring information not including image information, that is, time information and position information, is transmitted to the central monitoring device 20.
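The branching of steps ST3 through ST10 above can be sketched as a single decision function. This is a minimal illustration with assumed function and argument names, not the device's actual implementation, which is defined by the flowchart of FIG. 11.

```python
# Minimal sketch of the monitoring terminal's per-routine decision
# (steps ST3-ST10): what to include in the transmitted monitoring
# information. All names are illustrative assumptions.

def decide_transmission(report_pressed, image_command_received,
                        in_priority_area):
    """Return which kind of monitoring information to send this routine."""
    if report_pressed:                 # ST3 -> ST4: report button pressed
        return "abnormality + image + position + time"
    if image_command_received:         # ST6 -> ST7: command from center
        return "image + position + time"
    if in_priority_area:               # ST8 -> ST10: priority monitoring area
        return "image + position + time"
    return "position + time"           # ST9: routine report without images
```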
  • FIG. 13 is a diagram illustrating an example of information stored in the database 26.
  • monitoring information including the image information, position information, and time information acquired from the passenger car V (monitoring terminal device 10) is stored in association with the position information; thus, if position information is designated, the corresponding series of monitoring information can be called up.
  • the monitoring information can include a mobile body ID (monitoring terminal device ID) for specifying the monitoring terminal device 10.
  • the mobile object ID may be the address of the communication device 13 of the monitoring terminal device 10.
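The position-keyed storage just described can be sketched as follows. This is an illustrative sketch of the idea only; the class and field names are assumptions, not the schema of the database 26.

```python
# Sketch: database 26 associates monitoring information with position
# information, so designating a position calls up the stored series.
# Record fields (terminal ID, time, ...) are illustrative assumptions.
from collections import defaultdict

class MonitoringDatabase:
    def __init__(self):
        self._by_position = defaultdict(list)

    def store(self, position, record):
        # record: dict with image reference, time information, and the
        # mobile body ID (monitoring terminal device ID).
        self._by_position[position].append(record)

    def recall(self, position):
        """Return the series of monitoring information stored for the
        designated position information."""
        return list(self._by_position[position])
```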
  • in step ST12, based on the position information acquired in step ST11, the passenger car V is displayed on the map information of the map database shown on the display 24, as in the upper left of FIG. 1. Since the position information of the passenger car V is acquired and transmitted at a predetermined timing in each routine of FIG. 11, the supervisor can grasp the current position of the passenger car V in a timely manner.
  • in step ST13, it is determined whether abnormality information from the monitoring terminal device 10 of a passenger car V, that is, a notification that a security-related abnormality such as an accident or crime has occurred, has been received.
  • this abnormality information is output when an occupant of the passenger car V presses the report button 16 of the monitoring terminal device 10.
  • if there is abnormality information, the passenger car V that output it is identified in step ST14, image information and time information are received from that vehicle's monitoring terminal device 10, and the image information is displayed on the display 24. Further, as shown in the upper left of FIG. 1, the passenger car is highlighted on the map information, for example by changing its color, so that it can be distinguished from other passenger cars. The position where the abnormality occurred can thus be visually recognized on the map information, and the content of the abnormality can be grasped on the display 24.
  • the processing from step ST13 to step ST20 is an example in which abnormality information is reported and the location of the passenger car V that reported it is selected as the monitoring point; however, even when no abnormality information is reported and the supervisor designates an arbitrary place to be monitored, the processing from step ST13 to step ST20 can be executed in the same manner. In that case, the place designated by the supervisor becomes the monitoring point.
  • in step ST15, the central monitoring device 20 selects the monitoring point to be watched, focusing on the position of the passenger car V that output the abnormality information. The supervisor can also set the monitoring point arbitrarily.
  • the central monitoring device 20 selects another vehicle present in the monitoring area defined within a predetermined distance of the monitoring point, that is, the monitoring terminal device 10 of that vehicle.
  • the monitoring area may be a circular area at a fixed distance from the monitoring point, a belt-like area extending a predetermined distance along the road upstream or downstream of the monitoring point, or, when right or left turns at an intersection or the like are considered, a fan-shaped area with a predetermined radius and a predetermined central angle.
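The three area shapes named above admit simple membership tests, sketched below in planar coordinates. The geometry helpers and parameter names are assumptions for illustration; the patent does not prescribe a coordinate system.

```python
# Illustrative membership tests for the circular, belt-like, and
# fan-shaped monitoring areas. Simple planar (x, y) coordinates assumed.
import math

def in_circle(p, center, radius):
    return math.dist(p, center) <= radius

def in_belt(p, center, road_dir, length, half_width):
    # Belt of the given length along the unit road direction through
    # the monitoring point, with the given half-width across the road.
    dx, dy = p[0] - center[0], p[1] - center[1]
    along = dx * road_dir[0] + dy * road_dir[1]         # distance along the road
    across = abs(-dx * road_dir[1] + dy * road_dir[0])  # distance off the road axis
    return abs(along) <= length and across <= half_width

def in_fan(p, apex, bearing, radius, half_angle):
    # Sector with a central bearing (radians) and half central angle.
    if math.dist(p, apex) > radius:
        return False
    angle = math.atan2(p[1] - apex[1], p[0] - apex[0])
    diff = (angle - bearing + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= half_angle
```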
  • the central monitoring device 20 preferably sends, to a passenger car V that has moved away from the monitoring point, an image transmission command specifying the imaging time of the desired image information.
  • the central monitoring device 20 of the present embodiment transmits an image transmission command to a vehicle passing through the monitoring area in the future in order to continue monitoring the status of the monitoring point.
  • vehicles that pass through the monitoring area can be defined appropriately to include vehicles approaching the monitoring area, that is, vehicles that have entered a larger area containing the monitoring area; vehicles currently passing through the monitoring area or having just passed through it; and vehicles moving away from the monitoring area, that is, vehicles exiting a larger area containing the monitoring area.
  • the method for selecting the passenger car existing in the monitoring area is not particularly limited.
  • the central monitoring device 20 selects the passenger cars V whose traveling direction is toward the monitoring point and whose (Y - X) / V is less than a predetermined value. In the same step, the central monitoring device 20 transmits an image transmission command to the selected monitoring terminal devices 10.
  • the image transmission command can include information specifying the imaging direction.
  • the central monitoring device 20 calculates the imaging direction based on the positional relationship between the monitoring point and the monitoring area.
  • the imaging direction may be expressed as an azimuth, or, if the positions of the in-vehicle cameras 11 are known, as identification information of the in-vehicle camera 11. The monitoring terminal device 10 can thereby also determine, from the monitoring point transmitted by the central monitoring device 20 and its current position, the timing at which the host vehicle enters the monitoring area, and automatically transmit monitoring information including image information to the central monitoring device 20 at that timing.
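An azimuth toward the monitoring point, and the choice of which of the four cameras faces it, can be sketched as follows. This is an assumption-laden illustration: the 90-degree camera spacing, the counterclockwise angle convention with 0 = east, and all names are made up for the example and are not specified by the patent.

```python
# Sketch: express the imaging direction as an azimuth from the vehicle
# to the monitoring point, then identify which in-vehicle camera
# (11a front / 11b right / 11c rear / 11d left) faces that azimuth,
# assuming cameras at 90-degree offsets from the vehicle heading.
import math

def azimuth_deg(vehicle_pos, monitoring_point):
    """Bearing from the vehicle to the monitoring point in degrees,
    0 = east, counterclockwise positive."""
    dx = monitoring_point[0] - vehicle_pos[0]
    dy = monitoring_point[1] - vehicle_pos[1]
    return math.degrees(math.atan2(dy, dx)) % 360

def select_camera(azimuth, heading_deg):
    """Pick the camera whose facing is nearest the azimuth, given the
    vehicle heading in the same angle convention."""
    relative = (azimuth - heading_deg) % 360
    cameras = ["11a (front)", "11d (left)", "11c (rear)", "11b (right)"]
    return cameras[int(((relative + 45) % 360) // 90)]
```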
  • in step ST17, the position information of the passenger car V that output the abnormality information is transmitted to an emergency vehicle such as a police car, ambulance, or fire engine. Image information may be attached to this transmission to convey the content of the abnormality. The emergency vehicle can thus be dispatched before a report from the scene is received, enabling a rapid response to accidents and crimes.
  • in step ST18, all position information, image information, and time information received from the monitoring terminal devices 10 are recorded on a recording medium. This record is used to resolve accidents or crimes after the fact. If there is no abnormality information in step ST13, the process proceeds to step ST21 without performing steps ST14 to ST18.
  • in step ST19, it is determined whether the centralized monitoring of the monitoring point has been released. If it has, the processing from step ST21 onward is performed. If it has not, monitoring of the monitoring point continues: when the passenger car V selected in the previous step ST16 passes out of the monitoring area, the process returns to step ST16 and a new monitoring passenger car V is selected.
  • the method of selecting the passenger car V to which monitoring is transitioned (hopped) is not particularly limited. For example, the road on which the current position of the passenger car that sent the abnormality report, or another monitoring point (Y), lies is first identified, and the passenger cars V traveling on that road are extracted with reference to the database 26. The position (X), moving speed (V), and traveling direction of each extracted passenger car V are then specified; the moving speed and traveling direction may be obtained from the temporal change of the position information, or from a moving speed included in the monitoring information. The central monitoring device 20 then selects, as the next target of an image transmission command, the passenger car V whose traveling direction is toward the monitoring point and whose (Y - X) / V is smallest. Note that if (Y - X) / V is too small, the vehicle will pass the monitoring point immediately, so a lower limit may be set.
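The hopping criterion above can be sketched as a selection over candidate vehicles on the road. This is a simplified illustration under stated assumptions: positions are one-dimensional coordinates along the road, and the function and field names are invented for the example.

```python
# Sketch of hop selection: among vehicles heading toward the monitoring
# point Y, pick the one with the smallest estimated arrival time
# (Y - X) / V that still exceeds a lower limit, so it will not pass
# the point immediately. 1-D positions along the road are assumed.

def select_next_vehicle(vehicles, monitoring_point_y, lower_limit=5.0):
    """vehicles: list of (vehicle_id, position_x, speed_v, heading_sign),
    where heading_sign is +1 when the vehicle moves toward increasing
    position. Returns the id of the next image-transmission-command
    target, or None if no candidate qualifies."""
    best_id, best_eta = None, float("inf")
    for vid, x, v, heading in vehicles:
        if v <= 0:
            continue
        # Only vehicles actually moving toward the monitoring point qualify.
        if heading * (monitoring_point_y - x) <= 0:
            continue
        eta = abs((monitoring_point_y - x) / v)
        if lower_limit <= eta < best_eta:
            best_id, best_eta = vid, eta
    return best_id
```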
  • in this way, the monitoring passenger car V that can image the monitoring point is changed (hopped) sequentially, so that a predetermined monitoring point can be imaged continuously even though the passenger cars V on which the cameras 11 are mounted are moving.
  • in step ST21, it is determined whether an image information transmission command has been received from an emergency vehicle such as a police car, ambulance, or fire engine. If so, the process proceeds to step ST22, where it is determined whether a passenger car V exists in the area specified by the command; if one exists, the process proceeds to step ST23, where an image information transmission command is output to the passenger cars V in that area. The image information from those passenger cars V can then be acquired in step ST11 of FIG. 12A.
  • if no image information transmission command is input in step ST21, the process proceeds to step ST24 without performing steps ST22 and ST23.
  • in step ST24, it is determined whether a passenger car V exists in the vicinity of a suspicious location, such as a preset crime-prone area; if so, the process proceeds to step ST25 and an image information transmission command is output to that passenger car V. Suspicious locations are, for example, streets and back alleys with poor security. Monitoring of such streets can thus be strengthened, and crime prevention can be expected. If no passenger car V exists in the region near the suspicious location, the process proceeds to step ST26 without performing the process of step ST25.
  • in step ST26, it is determined whether a passenger car V exists in the vicinity of a priority monitoring position from which a priority monitoring object, whose details should be watched, can be imaged. If so, the process proceeds to step ST27, which outputs to that passenger car V a priority monitoring command requesting transmission of image information in which the priority monitoring object is enlarged. This makes it possible to monitor the priority monitoring object in detail and to effectively detect suspicious objects that could cause an incident or accident at the designated priority monitoring object, so prevention of crime can be expected. If there is no passenger car V in the vicinity of the priority monitoring position, the process proceeds to step ST28 without performing step ST27.
  • in step ST28, based on the position information received from each passenger car V, it is determined whether there is a route within a predetermined area requiring monitoring (not limited to the suspicious locations and priority monitoring areas) on which no passenger car V has traveled within a predetermined time; if such a route exists, whether a passenger car V is traveling on it is monitored. If a passenger car V has most recently traveled on the route, the process proceeds to step ST29 and an image information transmission command is output to that passenger car V. Image information can thus be acquired automatically for routes outside the suspicious locations and priority monitoring areas where passenger-car traffic is light. If no route satisfies the condition of step ST28, the process returns to step ST11 of FIG. 12A without performing step ST29.
  • as described above, the monitoring system 1 of this example selects the monitoring terminal device 10 of a moving body approaching and/or moving away from the monitoring area defined with respect to the selected monitoring point, and outputs to the selected monitoring terminal device 10 an image transmission command requesting monitoring information including at least image information. A predetermined monitoring point can therefore be watched continuously using the cameras 11 mounted on randomly moving passenger cars V instead of cameras fixed at predetermined places. As a result, the central monitoring device 20 can monitor a fixed point using the monitoring terminal devices 10 mounted on randomly moving passenger cars V.
  • further, the monitoring terminal device 10 of another passenger car approaching the monitoring area is selected, so the monitoring passenger car V that can image the monitoring point is transitioned and selected continuously (hopped); a predetermined monitoring point can therefore be imaged continuously even as the passenger cars V carrying the cameras 11 move.
  • since the central monitoring device 20 outputs an image transmission command including information designating the imaging direction, images of the monitoring point can be reliably acquired by the cameras 11 of the passenger cars V in the monitoring area. In addition, since only the necessary image information is transmitted, the amount of transmitted data can be reduced.
  • since the central monitoring device 20 outputs an image transmission command including information specifying the imaging time, even when a monitoring point is selected some time after an incident occurred, the image information from before and after the incident can be collected retrospectively by issuing an image transmission command specifying imaging times around the time of the incident.
  • since the central monitoring device 20 selects the position of the monitoring terminal device 10 that output the abnormality information as the monitoring point and narrows down the monitoring terminal devices 10 accordingly, the location where the abnormality occurred can be monitored continuously.
  • the monitoring method of this example has the same operation and effect as the monitoring system including the monitoring terminal device 10 and the central monitoring device 20.
  • although the position information of the passenger car V and the image information from the in-vehicle cameras 11a to 11e are acquired in the above example, image information from the fixed cameras 11f installed in the city, shown in FIG. 1, may also be acquired.
  • as the passenger car V that acquires the position information and image information, it is desirable to use a taxi V1 or a bus that travels around a predetermined area as shown in FIG. 1, but a private passenger car V2 or an emergency passenger car V3 may also be used.
  • the in-vehicle camera 11e that images the vehicle interior may be omitted.
  • the number of on-vehicle cameras 11a to 11d may be reduced from four to three or fewer, particularly in environments where image information can be acquired from many passenger cars V, such as monitoring areas with heavy traffic.
  • the central control device 21 corresponds to selection means, and the input device 25 corresponds to information acquisition means, abnormality information reception means, and command output means according to the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Alarm Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention relates to a monitoring system (1) provided with a monitoring terminal device (10) and a central monitoring device (20) capable of communicating via a telecommunications network (30). The central monitoring device (20) acquires monitoring information including at least position information transmitted from the monitoring terminal device (10), selects the monitoring terminal device (10) of a mobile body that approaches or moves away from a prescribed monitoring area referenced to a monitoring point selected from the position information included in the acquired monitoring information, and outputs to the selected monitoring terminal device (10) an image transmission command for transmitting monitoring information including image information.
PCT/JP2012/082904 2012-01-23 2012-12-19 Système de surveillance WO2013111479A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012011447 2012-01-23
JP2012-011447 2012-01-23

Publications (1)

Publication Number Publication Date
WO2013111479A1 true WO2013111479A1 (fr) 2013-08-01

Family

ID=48873218

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/082904 WO2013111479A1 (fr) 2012-01-23 2012-12-19 Système de surveillance

Country Status (1)

Country Link
WO (1) WO2013111479A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104767981A (zh) * 2015-05-05 2015-07-08 国家电网公司 配网无线音视频抢修指挥系统
JP2020187432A (ja) * 2019-05-10 2020-11-19 トヨタ自動車株式会社 情報処理装置及び情報処理プログラム
JP2022158104A (ja) * 2021-04-01 2022-10-17 トヨタ自動車株式会社 監視装置、監視方法及び監視システム

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006350520A (ja) * 2005-06-14 2006-12-28 Auto Network Gijutsu Kenkyusho:Kk 周辺情報収集システム
JP2008217218A (ja) * 2007-03-01 2008-09-18 Denso Corp 事故情報取得システム
JP2009169540A (ja) * 2008-01-11 2009-07-30 Toyota Infotechnology Center Co Ltd 監視システム及びセキュリティ管理システム
JP2010072845A (ja) * 2008-09-17 2010-04-02 Nec Personal Products Co Ltd ドライブレコーダシステム、ドライブレコーダ及び情報処理装置

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104767981A (zh) * 2015-05-05 2015-07-08 国家电网公司 配网无线音视频抢修指挥系统
JP2020187432A (ja) * 2019-05-10 2020-11-19 トヨタ自動車株式会社 情報処理装置及び情報処理プログラム
JP7298285B2 (ja) 2019-05-10 2023-06-27 トヨタ自動車株式会社 情報処理装置及び情報処理プログラム
JP2022158104A (ja) * 2021-04-01 2022-10-17 トヨタ自動車株式会社 監視装置、監視方法及び監視システム
JP7380633B2 (ja) 2021-04-01 2023-11-15 トヨタ自動車株式会社 監視装置、監視方法及び監視システム
US11971265B2 (en) 2021-04-01 2024-04-30 Toyota Jidosha Kabushiki Kaisha Monitoring device, monitoring method, and monitoring system

Similar Documents

Publication Publication Date Title
JP5786963B2 (ja) 監視システム
JP6451840B2 (ja) 情報提示システム
EP3543979B1 (fr) Surveillance autonome mobile
KR101397453B1 (ko) 영상 감시 시스템 및 그 방법
JP5890294B2 (ja) 映像処理システム
US9736369B2 (en) Virtual video patrol system and components therefor
JP4643860B2 (ja) 車両用視覚支援装置及び支援方法
JP5811190B2 (ja) 監視システム
KR101287190B1 (ko) 영상감시장치의 촬영 위치 자동 추적 방법
WO2013111494A1 (fr) Système de surveillance
WO2012137367A1 (fr) Système d'accumulation d'images
JP6260174B2 (ja) 監視画像提示システム
WO2013111479A1 (fr) Système de surveillance
KR101780929B1 (ko) 움직이는 물체를 추적하는 영상감시 시스템
WO2013111491A1 (fr) Système de surveillance
KR20190050113A (ko) 이동 물체 자동 추적 영상 감시 시스템
JP5790788B2 (ja) 監視システム
WO2013125301A1 (fr) Système de surveillance
WO2013111492A1 (fr) Système de surveillance
WO2013161345A1 (fr) Système de surveillance et procédé de surveillance
WO2013111493A1 (fr) Système de surveillance
KR20180099098A (ko) 촬영 대상의 움직임 추적 기능을 가진 융합형 영상 감시 시스템
JP5796638B2 (ja) 監視システム
CN103139539A (zh) 一种全景监视系统
KR101069766B1 (ko) 방범용 폐쇄회로 텔레비전과 연동하는 주정차 단속 시스템 및 이를 이용한 주정차 단속 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12866532

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12866532

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP