WO2013111492A1 - Monitoring system - Google Patents

Monitoring system

Info

Publication number
WO2013111492A1
WO2013111492A1 (PCT/JP2012/083465)
Authority
WO
WIPO (PCT)
Prior art keywords
information
monitoring
image
terminal device
image information
Prior art date
Application number
PCT/JP2012/083465
Other languages
English (en)
Japanese (ja)
Inventor
真史 安原
秋彦 香西
照久 高野
Original Assignee
Nissan Motor Co., Ltd. (日産自動車株式会社)
Priority date
Filing date
Publication date
Application filed by Nissan Motor Co., Ltd.
Publication of WO2013111492A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a monitoring system.
  • Conventionally, a security device is known that detects the occurrence of abnormalities by installing multiple security camera devices in shopping streets, at store entrances, at home entrances, and on other streets, and monitoring the surrounding images captured by these security cameras (Patent Document 1).
  • An object of the present invention is to provide a monitoring system capable of transmitting monitoring information in real time.
  • To this end, the present invention attaches a camera to each of a plurality of moving objects, acquires the position information of each moving object and the image information captured by its camera at a predetermined timing, and, when position information and image information are transmitted from a specific moving object, achieves the above object by suppressing transmissions from the surrounding moving objects.
  • According to the present invention, image information from cameras mounted on a plurality of mobile bodies that travel at random, together with the position information of those mobile bodies, is acquired, and transmission from surrounding mobile bodies is suppressed when this information is transmitted. The amount of communication data in the communication area is thereby optimized, and monitoring information can be transmitted in real time.
  • FIG. 1 is a schematic diagram showing a monitoring system according to an embodiment of the present invention. FIG. 2 is a block diagram showing the monitoring system of FIG. 1. FIG. 3 is a flowchart showing the main control content on the monitoring terminal device side of the monitoring system of FIG. 1. FIG. 4 is a flowchart showing the main control content on the central monitoring device side of the monitoring system of FIG. 1. FIG. 5 is a perspective view showing the imaging range of the cameras in the monitoring system of FIG. 1. FIG. 6 is a top view showing an example of the arrangement of those cameras.
  • In this example, the monitoring system is embodied as a monitoring system 1 with which authorities such as a police station and a fire station centrally monitor the security of a city. That is, the position information of each of a plurality of moving objects, the image information around the moving objects, and the time information are acquired at a predetermined timing and transmitted via wireless communication; the position information is displayed on map information and, as necessary, the image information and the time information are shown on a display. To this end, the monitoring system 1 of this example includes monitoring terminal devices 10 that acquire monitoring information such as position information and image information, and a central monitoring device 20 that inputs and processes the monitoring information via a telecommunications network 30, as shown in FIG. 1.
  • FIG. 2 is a block diagram illustrating a specific configuration of the monitoring terminal device 10 and the central monitoring device 20.
  • When an incident or accident occurs, the monitoring system 1 of this example temporarily suppresses the transmission of image information from the moving bodies around the moving body that discovered the incident. The line communication capacity needed to transmit the discoverer's image information at high speed or with high quality can thereby be secured, so the image information from the scene can be monitored in real time.
  • The monitoring terminal device 10 is a terminal device mounted on each of a plurality of moving bodies V. It has a position detection function that detects the position information of the moving body V, an image generation function that captures images of the periphery of the moving body to generate image information, a time detection function, an information acquisition control function that acquires the position information, image information, and time information at a predetermined timing, a communication function that outputs the position information, image information, and time information to the central monitoring device 20 and receives commands from the central monitoring device 20, and a function for reporting the occurrence of an abnormality.
  • To this end, a plurality of in-vehicle cameras 11a to 11e, an image processing device 12, a communication device 13, an in-vehicle control device 14, a position detection device 15, and a notification button 16 are provided.
  • The time information is mainly used for post-event analysis and may be omitted.
  • However, a mobile body whose transmission has been temporarily suppressed temporarily stores the image information acquired during that period and transmits it as necessary once the suppression is released. Since this stored information may be used for post-event analysis, acquiring time information is useful in this sense as well.
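This store-and-forward behaviour can be illustrated with a short Python sketch (the class and its API are hypothetical, not taken from the patent):

```python
from collections import deque

class SuppressionBuffer:
    """Hypothetical store-and-forward buffer for a monitoring terminal
    whose image transmission has been temporarily suppressed."""

    def __init__(self, maxlen=1000):
        self._frames = deque(maxlen=maxlen)  # oldest frames drop out when full
        self.suppressed = False

    def submit(self, timestamp, position, image):
        """Return the frame for immediate transmission, or buffer it and
        return None while suppression is active."""
        if self.suppressed:
            self._frames.append((timestamp, position, image))
            return None
        return (timestamp, position, image)

    def release(self):
        """Lift the suppression and return every buffered frame so it can
        be transmitted for post-event analysis."""
        self.suppressed = False
        stored = list(self._frames)
        self._frames.clear()
        return stored
```

The time information kept with each buffered frame is what makes the later post-event analysis possible.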
  • The mobile body V on which the monitoring terminal device 10 is mounted is not particularly limited as long as it travels in the target monitoring area, and includes mobile bodies such as automobiles, motorcycles, industrial vehicles, and trams. Private vehicles V2 and emergency vehicles V3 are included, and a taxi or route bus V1 that travels randomly and constantly within a predetermined area is particularly preferable.
  • FIG. 1 illustrates a taxi V1, a private vehicle V2, and an emergency vehicle V3 such as a police car, fire engine, or ambulance; these are collectively referred to as the moving body V or the passenger vehicle V.
  • Each moving body V includes a plurality of in-vehicle cameras 11a to 11e (hereinafter collectively referred to as cameras 11), an image processing device 12, a communication device 13, an in-vehicle control device 14, a position detection device 15, and a notification button 16.
  • The camera 11 is composed of a CCD camera or the like, images the surroundings of the moving body V, and outputs the imaging signal to the image processing device 12.
  • The image processing device 12 reads the imaging signals from the cameras 11 and performs image processing to obtain image information. Details of this image processing will be described later.
  • The position detection device 15 is composed of a GPS device and its correction device or the like, detects the current position of the moving body V, and outputs it to the in-vehicle control device 14.
  • The notification button 16 is an input button installed in the passenger compartment; it is a manual button pressed when a driver or passenger discovers an incident (an incident related to security, such as an accident, fire, or crime).
  • The in-vehicle control device 14 includes a CPU, a ROM, and a RAM. When the notification button 16 is pressed, it controls the image processing device 12, the communication device 13, and the position detection device 15, and outputs the image information generated by the image processing device 12, the position information of the moving body V detected by the position detection device 15, and the time information from the clock built into the CPU to the central monitoring device 20 via the communication device 13 and the telecommunications network 30. It likewise receives information acquisition commands from the central monitoring device 20 via the telecommunications network 30 and the communication device 13, controls the image processing device 12, the communication device 13, and the position detection device 15 accordingly, and outputs the generated image information, position information, and time information to the central monitoring device 20 through the communication device 13 and the telecommunications network 30. Details of these controls will also be described later.
  • The communication device 13 is a communication means capable of wireless communication and exchanges information with the communication device 23 of the central monitoring device 20 via the telecommunications network 30.
  • When the telecommunications network 30 is a commercial telephone network, mobile phone communication devices can be used widely; when the telecommunications network 30 is a telecommunications network dedicated to the monitoring system 1 of this example, dedicated communication devices 13 and 23 can be used. A wireless LAN, WiFi (registered trademark), WiMAX (registered trademark), Bluetooth (registered trademark), a dedicated wireless line, or the like can also be used.
  • The central monitoring device 20 has an information input function for inputting the position information and image information output from the monitoring terminal devices 10, and a display function that displays map information from a map database and shows the received position information on the map information.
  • The central control device 21 includes a CPU, a ROM, and a RAM; it controls the image processing device 22, the communication device 23, and the display 24, receives the position information, image information, and time information transmitted from the monitoring terminal devices 10, and displays them on the display 24 after performing image processing as necessary.
  • The image processing device 22 has a map database, displays map information from the map database on the display 24, and shows the position information detected by the position detection device 15 of the monitoring terminal device 10 on the map information. It also performs image processing for displaying on the display 24 the image information captured by the in-vehicle cameras 11 of the monitoring terminal device 10 and processed by the image processing device 12.
  • The display 24 can be composed of, for example, a liquid crystal display device of a size capable of displaying two window screens on one screen, or of two liquid crystal display devices that each display one of the two window screens. One window screen displays the position information of each moving body V superimposed on the map information (see FIG. 1), and the other displays image information such as the images captured by the in-vehicle cameras 11.
  • The input device 25 is composed of a keyboard or a mouse and is used to output an information acquisition command to a desired moving body V or to input processing commands for the various information displayed on the display 24.
  • The communication device 23 is a communication means capable of wireless communication and exchanges information with the communication device 13 of the monitoring terminal device 10 via the telecommunications network 30.
  • The central monitoring device 20 of this example has a function of temporarily suppressing the image information transmitted from the monitoring terminal devices 10.
  • Since image information has a larger data volume than position information, a sufficient amount of line communication capacity must be secured to transmit it at high speed or with high quality. Transmission of image information from the mobile bodies surrounding the mobile body V that discovered the incident is therefore temporarily suppressed, securing the line communication capacity for transmitting the image information from the discovering mobile body V at high speed or with high quality.
  • FIG. 13 is a diagram illustrating an example of suppressing transmission of image information.
  • In FIG. 13, Va is a mobile body (hereinafter also referred to as the reporting vehicle) that detected an incident or accident and whose notification button 16 was pressed; Ta is the communication area of the communication base station to which the monitoring terminal device 10 of the mobile body Va belongs; Tb is the communication area of another communication base station, to which the monitoring terminal device 10 of the mobile body Va does not belong; and Vb to Ve indicate other mobile bodies.
  • For the mobile body Vb existing within a radius Ra (for example, within 50 m) of the reporting vehicle Va, the central monitoring device 20 of this example outputs a command to transmit both position information and image information, since effective image information may be obtained in addition to the image information from the reporting vehicle Va. For mobile bodies farther away, although the possibility of obtaining useful images cannot be said to be zero, priority is given to securing line communication capacity over urgency, and a command to transmit only position information, that is, a command to suppress transmission of image information, is output. In this case, the image information is temporarily stored in the RAM or other memory of the monitoring terminal device 10 of the mobile body Vc and transmitted later as necessary.
  • Further, the central monitoring device 20 of this example suppresses or cancels the suppression of image transmission depending on the relationship with the communication area Ta of the base station to which the monitoring terminal device 10 of the reporting vehicle Va belongs. That is, a command to suppress transmission of image information is output to the mobile body Vc, which exists beyond the radius Ra but within the radius Rb of the reporting vehicle Va and belongs to the communication area Ta of the same base station; for the mobile body Vd, which does not belong to the communication area Ta but to the communication area Tb of another base station, there is no need to secure line capacity, so the suppression of image transmission is cancelled. Likewise, transmission of image information is not suppressed for the mobile body Vf, which exists farther than the radius Rb from the reporting vehicle Va and belongs to the communication area Tb of another base station, whereas transmission of image information is suppressed for the mobile body Ve, which also exists farther than the radius Rb but belongs to the communication area Ta of the same base station.
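The suppression policy above reduces to a simple rule: a surrounding vehicle sends only position information when it is beyond the radius Ra and shares the reporting vehicle's base-station area, and otherwise sends both. A hedged Python sketch (the function name, API, and coordinate handling are illustrative assumptions, not from the patent):

```python
import math

RA = 50.0  # example inner radius in metres, per the "within 50 m" example above

def decide_transmission(reporter_pos, vehicle_pos, same_cell):
    """Decide what a surrounding vehicle transmits after a report.

    reporter_pos / vehicle_pos: (x, y) positions in metres.
    same_cell: True if the vehicle belongs to the reporting vehicle's
    base-station communication area Ta.
    """
    distance = math.dist(reporter_pos, vehicle_pos)
    if distance <= RA:
        return "position+image"  # nearby vehicle: its images may also be useful
    if same_cell:
        return "position"        # same cell: suppress images to free the line
    return "position+image"      # different cell (Tb): no contention, no suppression
```

Under this rule, Vb (close by) and Vd and Vf (other cell) keep sending images, while Vc and Ve (same cell, beyond Ra) send position only, matching the cases of FIG. 13.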
  • Next, the mounting positions and imaging ranges of the in-vehicle cameras 11a to 11e will be described.
  • Here, a passenger car V will be described as an example of the moving body V.
  • The cameras 11a to 11e are configured using an imaging element such as a CCD; the four in-vehicle cameras 11a to 11d are installed at different positions on the exterior of the passenger car V and capture images in four directions around the vehicle.
  • The in-vehicle camera 11a, installed at a predetermined position at the front of the passenger car V such as the front grille, images objects and the road surface existing in the area SP1 in front of the passenger car V and in the space ahead of it (front view). The in-vehicle camera 11b, installed at a predetermined position on the left side of the passenger car V such as the left side mirror, images objects and the road surface existing in the area SP2 on the left side of the passenger car V and in the surrounding space (left side view). The in-vehicle camera 11c, installed at a predetermined position at the rear of the passenger car V such as the rear finisher or roof spoiler, images objects and the road surface existing in the area SP3 behind the passenger car V and in the space behind it (rear view). The in-vehicle camera 11d, installed at a predetermined position on the right side of the passenger car V such as the right side mirror, images objects and the road surface existing in the area SP4 on the right side of the passenger car V and in the surrounding space (right side view).
  • One in-vehicle camera 11e is installed, for example, on the ceiling of the passenger compartment and images the indoor area SP5, as shown in FIG. 5. It is used for crime prevention or crime reporting.
  • FIG. 6 is a view of the arrangement of the in-vehicle cameras 11a to 11e as viewed from above the passenger car V.
  • As shown in FIG. 6, the in-vehicle camera 11a that images the area SP1, the in-vehicle camera 11b that images the area SP2, the in-vehicle camera 11c that images the area SP3, and the in-vehicle camera 11d that images the area SP4 are installed along the outer periphery VE of the body, following the counterclockwise or clockwise direction. That is, following the counterclockwise direction, the in-vehicle camera 11b is installed to the left of the in-vehicle camera 11a, the in-vehicle camera 11c to the left of the in-vehicle camera 11b, the in-vehicle camera 11d to the left of the in-vehicle camera 11c, and the in-vehicle camera 11a to the left of the in-vehicle camera 11d. Conversely, following the clockwise direction, the in-vehicle camera 11d is installed to the right of the in-vehicle camera 11a, the in-vehicle camera 11c to the right of the in-vehicle camera 11d, the in-vehicle camera 11b to the right of the in-vehicle camera 11c, and the in-vehicle camera 11a to the right of the in-vehicle camera 11b.
  • FIG. 7A shows an example of the image GSP1 in which the front in-vehicle camera 11a images the area SP1, FIG. 7B shows an example of the image GSP2 in which the left-side in-vehicle camera 11b images the area SP2, FIG. 7C shows an example of the image GSP3 in which the rear in-vehicle camera 11c images the area SP3, FIG. 7D shows an example of the image GSP4 in which the right-side in-vehicle camera 11d images the area SP4, and FIG. 7E shows an example of the image in which the indoor in-vehicle camera 11e images the area SP5. The size of each image is 480 pixels vertically × 640 pixels horizontally.
  • The image size is not particularly limited, and may be any size at which a general terminal device can reproduce a moving image.
  • The number and positions of the in-vehicle cameras 11 can be determined appropriately according to the size, shape, and detection-area setting method of the passenger car V.
  • Each of the plurality of in-vehicle cameras 11 described above is assigned an identifier corresponding to its arrangement, and the in-vehicle control device 14 can identify each in-vehicle camera 11 by its identifier.
  • The in-vehicle control device 14 can also send an imaging command or any other command to a specific in-vehicle camera 11 by attaching its identifier to the command signal.
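As a small illustration of this identifier-based addressing, a command signal can simply carry the target camera's identifier (the mapping and field names below are hypothetical):

```python
# Hypothetical identifier table: one identifier per camera arrangement.
CAMERA_IDS = {"front": "11a", "left": "11b", "rear": "11c",
              "right": "11d", "cabin": "11e"}

def make_command(camera, command):
    """Build a command signal addressed to one specific in-vehicle camera
    by attaching its identifier, as described above."""
    return {"camera_id": CAMERA_IDS[camera], "command": command}
```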
  • The in-vehicle control device 14 controls the image processing device 12 to acquire the image signals captured by the in-vehicle cameras 11, and the image processing device 12 processes the imaging signal from each in-vehicle camera 11 and converts it into the image information of FIGS. 7A to 7E.
  • The in-vehicle control device 14 then generates monitoring image information based on the four pieces of image information shown in FIGS. 7A to 7D (monitoring image generation function), associates with the monitoring image information the mapping information for projecting it onto the projection plane set on the side surface of a columnar projection model (mapping information addition function), and outputs the result to the central monitoring device 20.
  • The monitoring image generation function and the mapping information addition function will now be described in detail.
  • The processing of generating the monitoring image information based on the four pieces of image information obtained by imaging the periphery of the passenger car V, and of associating the mapping information with it, may be executed by the monitoring terminal device 10 as in this example, or by the central monitoring device 20. In the latter case, the four pieces of image information obtained by imaging the periphery of the passenger car V are transmitted as they are from the monitoring terminal device 10 to the central monitoring device 20, and the image processing device 22 and the central control device 21 of the central monitoring device 20 generate the monitoring image information, associate the mapping information, and perform the projection conversion.
  • The in-vehicle control device 14 of the monitoring terminal device 10 of the present embodiment controls the image processing device 12 to acquire the imaging signals of the in-vehicle cameras 11a to 11e, and generates one monitoring image in which the image information of the in-vehicle cameras 11a to 11d, installed clockwise or counterclockwise along the outer periphery of the body of the passenger car V, is arranged in the installation order of these cameras.
  • As described above, the four in-vehicle cameras 11a to 11d are installed in the order 11a, 11b, 11c, 11d counterclockwise along the outer periphery VE of the body of the passenger car V. The in-vehicle control device 14 therefore connects the four images captured by these cameras horizontally, following their installation order (in-vehicle cameras 11a → 11b → 11c → 11d), to generate a single monitoring image. In the monitoring image of the present embodiment, each image is arranged with the ground contact surface (road surface) of the passenger car V at its lower side, and the images are connected to one another at their sides in the height (vertical) direction relative to the road surface.
  • FIG. 8 is a diagram illustrating an example of the monitoring image K.
  • As shown in FIG. 8, the monitoring image K of the present embodiment arranges, in order along the direction P from left to right in the drawing, the captured image GSP1 in which the front in-vehicle camera 11a images the area SP1, the captured image GSP2 in which the left-side in-vehicle camera 11b images the area SP2, the captured image GSP3 in which the rear in-vehicle camera 11c images the area SP3, and the captured image GSP4 in which the right-side in-vehicle camera 11d images the area SP4; these four images are arranged horizontally as one continuous image.
  • When the monitoring image K generated in this way is viewed in order from the left end toward the right with the road-surface (ground contact) side of each image facing down, the monitor can view it on the display 24 in a manner similar to looking around the passenger car V counterclockwise.
  • When one monitoring image K is generated, four images acquired at substantially the same imaging timing by the in-vehicle cameras 11a to 11d are used. Since the information contained in the monitoring image K is thereby synchronized, the situation around the vehicle at a given timing can be accurately represented.
  • The monitoring images K, each generated from captured images with substantially the same imaging timing, may also be stored over time to generate a moving-image monitoring image K containing a plurality of monitoring images K per predetermined unit time. By generating the moving-image monitoring image K from images with matching imaging timings, changes in the situation around the vehicle can be accurately represented.
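Selecting four frames with substantially the same imaging timing might be sketched as follows (a simplified illustration; the tolerance value and data layout are assumptions):

```python
def pick_synchronized(frames_by_camera, tolerance=0.05):
    """From per-camera lists of (timestamp, image) pairs, pick one frame per
    camera so that all picked timestamps lie within `tolerance` seconds of a
    common reference time; return None if the streams cannot be aligned."""
    # Reference time: the latest first-frame timestamp among the cameras.
    ref = max(frames[0][0] for frames in frames_by_camera.values())
    picked = {}
    for cam, frames in frames_by_camera.items():
        t, img = min(frames, key=lambda f: abs(f[0] - ref))  # nearest frame
        if abs(t - ref) > tolerance:
            return None
        picked[cam] = img
    return picked
```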
  • Incidentally, a conventional central monitoring device cannot simultaneously watch images (moving images) in a plurality of directions and cannot monitor the entire vehicle periphery on a single screen. In contrast, the in-vehicle control device 14 of this embodiment generates a single monitoring image K in which the images of the entire vehicle periphery are arranged side by side, so the whole periphery can be monitored on one screen.
  • Furthermore, the monitoring terminal device 10 of the present embodiment compresses the image data when generating the monitoring image K so that the number of pixels of the monitoring image K is substantially the same as the number of pixels of one image of the in-vehicle cameras 11a to 11d.
  • Since the size of each image shown in FIGS. 7A to 7D is 480 × 640 pixels, compression processing is performed so that the size of the monitoring image K becomes 1280 × 240 pixels, as shown in FIG. 8. The monitoring image K (1280 × 240 = 307,200 pixels) thus carries the same number of pixels as a single captured image (640 × 480 = 307,200 pixels), so image processing and image reproduction can be performed by a general terminal device.
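The pixel arithmetic above can be reproduced with a naive 2× subsampling standing in for the unspecified compression: each 480 × 640 image becomes 240 × 320, and the four halves concatenated horizontally give 240 × 1280 (a NumPy sketch; the actual device presumably uses a proper compression method):

```python
import numpy as np

def build_monitoring_image(front, left, rear, right):
    """Concatenate four 480x640 camera images, in the installation order of
    the in-vehicle cameras 11a-11d, into one 240x1280 monitoring image.
    Compression is emulated by taking every other row and column."""
    def half(img):
        return img[::2, ::2]  # 2x subsampling in both directions
    return np.hstack([half(front), half(left), half(rear), half(right)])
```

The result has 240 × 1280 = 307,200 pixels, the same count as one 480 × 640 input, consistent with the sizes given in the text.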
  • The in-vehicle control device 14 of the present embodiment can also attach to the monitoring image K a line figure indicating the boundary of each arranged image.
  • Specifically, as shown in FIG. 8, the in-vehicle control device 14 can attach to the monitoring image K rectangular partition images Bb, Bc, Bd, Ba, and Ba′ between the arranged images as line figures indicating their boundaries.
  • The partition images function as frames for the respective captured images.
  • Since image distortion is large near the boundary of each captured image, arranging the partition images at the boundaries of the captured images makes it possible to hide the regions with large distortion, or to signal that the distortion there is large.
  • The in-vehicle control device 14 of this embodiment can also generate the monitoring image K after correcting the distortion of the captured images. Since the distortion of the captured images tends to be large, it is desirable to correct the image distortion in advance using a predefined image conversion algorithm and correction amount.
  • For example, the in-vehicle control device 14 may read from the ROM information on the same projection model as the one used by the central monitoring device 20 to project the monitoring image K, project the captured images onto the projection plane of that model, and correct in advance the distortion that occurs on the projection plane.
  • The image conversion algorithm and the correction amount can be defined appropriately according to the characteristics of the in-vehicle cameras 11 and the shape of the projection model. By correcting in advance the distortion that arises when the monitoring image K is projected onto the projection plane of the projection model, a monitoring image K with good visibility and little distortion can be provided. Correcting the distortion in advance also reduces the positional deviation between the images arranged side by side.
  • Next, the mapping information addition function will be described.
  • The in-vehicle control device 14 of the present embodiment executes a process of associating with the monitoring image K the mapping information for projecting the generated monitoring image K onto the projection plane set on the side surface of the projection model M of a columnar body whose bottom surface is the ground contact surface of the passenger car V.
  • The mapping information is information that allows the central monitoring device 20 receiving the monitoring image K to easily recognize the projection reference position.
  • FIG. 10 is a diagram illustrating an example of the projection model M of the present embodiment
  • FIG. 11 is a schematic cross-sectional view along the xy plane of the projection model M illustrated in FIG.
  • The projection model M of this embodiment is a regular octagonal prism having a regular octagonal bottom surface and its height along the vertical direction (the z-axis direction in the figure).
  • The shape of the projection model M is not particularly limited as long as it is a columnar body having side surfaces adjacent to one another along the boundary of the bottom surface; a cylinder, a prism such as a triangular, quadrangular, or hexagonal column, or an antiprism having a polygonal bottom surface and triangular side surfaces can also be used.
  • The bottom surface of the projection model M of this embodiment is parallel to the ground contact surface of the passenger car V.
  • Projection surfaces Sa, Sb, Sc, and Sd (hereinafter collectively referred to as the projection surface S), onto which an image of the surroundings of the passenger car V in contact with the bottom surface of the projection model M is projected, are set on the inner side surfaces of the projection model M. The projection surface S can also include a surface composed of part of the projection surface Sa and part of the projection surface Sb, a surface composed of part of Sb and part of Sc, a surface composed of part of Sc and part of Sd, and a surface composed of part of Sd and part of Sa.
  • The monitoring image K is projected onto the projection surface S as an image of the surroundings of the passenger car V viewed from the viewpoints R (R1 to R8; hereinafter collectively referred to as the viewpoint R) above the projection model M surrounding the passenger car V.
  • The in-vehicle control device 14 of the present embodiment associates with the monitoring image K, as mapping information, the reference coordinates of the captured images arranged at the right end and the left end. Specifically, as mapping information (reference coordinates) indicating the start position or end position of the monitoring image K when projected onto the projection model M, the coordinates A(x, y) of the upper-left vertex of the captured image GSP1 arranged at the right end and the coordinates B(x, y) of the upper-right vertex of the captured image GSP2 arranged at the left end are attached to the monitoring image K.
  • The reference coordinates of the captured images indicating the start position or end position are not particularly limited; the lower-left vertex of the captured image arranged at the left end of the monitoring image K or the lower-right vertex of the captured image arranged at the right end may be used instead.
  • The mapping information may be attached to each pixel of the image data of the monitoring image K, or may be managed as a file separate from the monitoring image K.
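Managed as a separate file, the mapping information could be as small as a pair of reference coordinates (a sketch; which vertices serve as A and B is a design choice, and the field names are invented):

```python
def make_mapping_info(width, height):
    """Build sidecar mapping information for a width x height monitoring
    image: the reference coordinates marking where projection onto the
    projection model starts and ends (top corners used here)."""
    return {
        "size": (width, height),  # image extent the coordinates refer to
        "A": (width - 1, 0),      # reference vertex on the right-end image
        "B": (0, 0),              # reference vertex on the left-end image
    }
```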
  • information indicating the start position or the end position of the monitoring image K, that is, the reference coordinates used as a reference in the projection processing, is associated with the monitoring image K as mapping information. Since the central monitoring apparatus 20 that receives the monitoring image K can thereby easily recognize the reference position for the projection processing, the monitoring images K arranged in the installation order of the in-vehicle cameras 11a to 11d can be projected sequentially and easily onto the projection surfaces S on the side surfaces of the projection model M. That is, as shown in FIG. 11, the captured image GSP1 of the front of the vehicle is projected onto the projection surface Sa positioned in the imaging direction of the in-vehicle camera 11a, the captured image GSP2 of the right side of the vehicle is projected onto the projection surface Sb positioned in the imaging direction of the in-vehicle camera 11b, the captured image GSP3 of the rear of the vehicle is projected onto the projection surface Sc positioned in the imaging direction of the in-vehicle camera 11c, and the captured image GSP4 of the left side of the vehicle is projected onto the projection surface Sd positioned in the imaging direction of the in-vehicle camera 11d.
  • the monitoring image K projected onto the projection model M can thus show an image that looks as if one were looking around the passenger car V. That is, since the monitoring image K, in which four images are arranged in a horizontal line according to the installation order of the in-vehicle cameras 11a to 11d, is projected onto the side surfaces likewise arranged in the horizontal direction on the columnar projection model M, the image around the passenger car V can be reproduced on the projection surface S of the projection model M while maintaining its positional relationship.
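As an illustrative aside (not part of the patent disclosure), the arrangement described above, in which four captured images concatenated horizontally in camera installation order are assigned to the projection surfaces Sa to Sd, can be sketched as follows; the image widths, data, and function name are hypothetical:

```python
# Illustrative sketch only: split a horizontally concatenated monitoring image K
# into the four captured images (in camera installation order 11a..11d) and
# assign each slice to its projection surface Sa..Sd.
# The widths and the start coordinate are hypothetical assumptions.

def project_monitoring_image(monitoring_image, widths, start_x=0):
    """monitoring_image: list of pixel rows; widths: horizontal width of each
    of the four captured images; start_x: reference coordinate marking the
    start end of the monitoring image K."""
    faces = {}
    x = start_x
    for face, width in zip(["Sa", "Sb", "Sc", "Sd"], widths):
        # Take the horizontal slice belonging to this camera's image.
        faces[face] = [row[x:x + width] for row in monitoring_image]
        x += width
    return faces

# A 2-row strip whose four slices are 3, 2, 3 and 2 pixels wide.
strip = [list(range(10)), list(range(10, 20))]
faces = project_monitoring_image(strip, widths=[3, 2, 3, 2])
```

Under these assumptions, `faces["Sa"]` holds the slice projected in the imaging direction of camera 11a, and so on around the vehicle.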
  • the in-vehicle control device 14 of the present embodiment stores the correspondence relationship between each coordinate value of the monitoring image K and the coordinate value of each projection plane S of the projection model M as mapping information and attaches it to the monitoring image K; alternatively, this correspondence relationship may be stored in the central monitoring device 20 in advance.
  • the positions of the viewpoint R and the projection plane S shown in FIGS. 10 and 11 are examples, and can be arbitrarily set.
  • the viewpoint R can be changed by the operation of the operator.
  • the relationship between the viewpoint R and the projection position of the monitoring image K is defined in advance, and when the position of the viewpoint R is changed, a predetermined coordinate transformation is performed so that the monitoring image K as viewed from the newly set viewpoint R can be projected onto the projection surfaces S (Sa to Sd). A known method can be used for this viewpoint conversion processing.
  • the in-vehicle control device 14 generates the monitoring image K based on the image information captured at a predetermined timing, associates the monitoring image K with the mapping information, the reference coordinates, and the line figure (partition image) information indicating the boundaries, and stores them over time according to the imaging timing.
  • the in-vehicle control device 14 may store the monitoring image K as a single moving image file containing a plurality of monitoring images K per predetermined unit time, or may store it in a form that can be transferred and reproduced by a streaming method.
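A minimal sketch of this time-ordered storage, in which monitoring images K are kept together with their mapping information and timestamps, might look like the following; the class and field names are assumptions, not taken from the patent:

```python
# Illustrative sketch: store monitoring images K over time, each associated
# with its mapping information (reference coordinates) and partition-image
# (line figure) information, as described in the text.
from dataclasses import dataclass, field
import time

@dataclass
class MonitoringFrame:
    image: bytes             # encoded monitoring image K
    reference_coords: tuple  # mapping information: start/end reference coordinates
    partition_lines: list    # line figure (partition image) information
    timestamp: float = field(default_factory=time.time)  # imaging timing

class FrameStore:
    """Accumulates frames so they can later be packaged as a moving-image
    file per unit time or handed to a streaming transfer (both mentioned
    as storage options in the text)."""
    def __init__(self):
        self.frames = []

    def add(self, frame):
        self.frames.append(frame)

    def window(self, t0, t1):
        # Frames captured in the half-open interval [t0, t1).
        return [f for f in self.frames if t0 <= f.timestamp < t1]
```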
  • the communication device 23 of the central monitoring device 20 receives the monitoring image K transmitted from the monitoring terminal device 10 together with the mapping information associated with it; the image photographed with the indoor in-vehicle camera 11e is received separately.
  • in this monitoring image K, the images of the four in-vehicle cameras 11 installed at different positions on the body of the passenger car V are arranged according to the installation order of the in-vehicle cameras 11a to 11d (clockwise or counterclockwise order along the outer periphery of the body of the passenger car V), and mapping information for projecting the monitoring image K onto the projection plane S of the octagonal prism projection model M is associated with it.
  • the communication device 23 sends the acquired monitoring image K and mapping information to the image processing device 22.
  • the image processing device 22 reads the projection model M stored in advance and, based on the mapping information, generates a display image by projecting the monitoring image K onto the projection planes Sa to Sd set on the side surfaces of the octagonal prism projection model M whose bottom surface is the ground contact surface of the passenger car V shown in FIGS. 10 and 11. Specifically, each pixel of the received monitoring image K is projected onto the corresponding pixel of the projection surfaces Sa to Sd according to the mapping information. Further, when projecting the monitoring image K onto the projection model M, the image processing device 22 recognizes the start point of the monitoring image K (its right end or left end) based on the reference coordinates received together with the monitoring image K, and performs the projection processing so that this start point coincides with the start point (the right end or the left end of the projection surface S) defined in advance on the projection model M. The image processing device 22 also arranges a line figure (partition image) indicating the boundary of each image on the projection model M.
  • the partition image can be attached to the projection model M in advance, or can be attached to the monitoring image K after the projection processing.
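The start-point alignment described above can be illustrated with a small sketch (hypothetical, not the patent's actual pixel mapping): a received pixel row of the monitoring image K is rotated so that the pixel at the received reference coordinate comes first, coinciding with the start point defined in advance on the projection model M.

```python
# Illustrative sketch: rotate a pixel row of the received monitoring image K so
# that the reference coordinate (start end) comes first, matching the start
# point defined in advance on the projection model M.
def align_to_model_start(strip_row, reference_x):
    return strip_row[reference_x:] + strip_row[:reference_x]
```

For example, `align_to_model_start([1, 2, 3, 4, 5], 2)` returns `[3, 4, 5, 1, 2]`, so the pixel at the reference coordinate lands on the model's predefined start point.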
  • the display 24 displays the monitoring image K projected on the projection plane S of the projection model M.
  • FIG. 12 shows an example of a display image of the monitoring image K.
  • the viewpoint can be freely set and changed by the supervisor through the input device 25, such as a mouse or keyboard, or through the display 24 serving as a touch-panel type input device 25. Since the correspondence relationship between the viewpoint position and the projection plane S is defined in advance in the image processing device 22 or the display 24 described above, the monitoring image K corresponding to the changed viewpoint can be displayed on the display 24 based on this correspondence relationship.
  • FIG. 3 is a flowchart showing the operation on the monitoring terminal device 10 side
  • FIG. 4 is a flowchart showing the operation on the central monitoring device 20 side.
  • in the monitoring terminal device 10, the surrounding video and the indoor video are acquired from the in-vehicle cameras 11 at predetermined time intervals (one routine shown in FIG. 3) and converted into image information by the image processing device 12 (step ST1). Further, the current position information of the passenger car V on which the monitoring terminal device 10 is mounted is detected by the position detection device 15 (step ST2).
  • in step ST3, it is determined whether or not the report button 16 has been pressed. If it has been pressed, the process proceeds to step ST4, where the image information acquired in step ST1, the position information acquired in step ST2, and the time information of the CPU are associated with one another and transmitted to the central monitoring device 20 via the communication device 13 and the telecommunication network 30, together with abnormality information indicating that an abnormality has occurred. As a result, the occurrence of an abnormality related to security, such as an accident or crime, is automatically transmitted to the central monitoring device 20 together with the position information of the passenger car V and the image information around it, thereby further strengthening monitoring in the city.
  • in this example, the image information and the position information are acquired in the first steps ST1 and ST2, but they may instead be acquired at a timing between steps ST3 and ST4.
  • if the report button 16 is not pressed, the process proceeds to step ST5, where it is determined whether an image transmission command has been input from the central monitoring device 20. If there is an image transmission command from the central monitoring device 20 in step ST6, the process proceeds to step ST7, where the image information acquired in step ST1, the position information acquired in step ST2, and the time information of the CPU are associated with one another and transmitted to the central monitoring device 20 via the communication device 13 and the telecommunication network 30. Thereby, even if no passenger of the passenger car V presses the notification button 16, the required image information can be transmitted appropriately when requested by the supervisor operating the central monitoring device 20. Here too, the image information and the position information may be acquired at a timing between steps ST3 and ST4 instead of in the first steps ST1 and ST2.
  • in step ST6, if there is no image transmission command from the central monitoring device 20, the process proceeds to step ST8, and it is determined whether there is an image transmission suppression command from the central monitoring device 20.
  • the image transmission suppression command is a command issued to secure communication line capacity when the notification button 16 of another passenger car V in the vicinity has been pressed and abnormality information is being transmitted to the central monitoring device 20. If there is an image transmission suppression command from the central monitoring device 20, the process proceeds to step ST10, where the image information acquired in step ST1, the position information acquired in step ST2, and the time information of the CPU are associated with one another and temporarily stored in the RAM of the monitoring terminal device 10.
  • in step ST8, when there is no image transmission suppression command from the central monitoring device 20, the process proceeds to step ST9, and only the position information acquired in step ST2 is transmitted to the central monitoring device 20 via the communication device 13 and the telecommunication network 30.
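The per-routine decision logic of FIG. 3 (steps ST3 to ST10) can be summarized in a rough sketch; the function and field names are illustrative assumptions, not from the patent:

```python
# Illustrative sketch of one routine of the monitoring terminal device 10
# (FIG. 3): the acquired image/position/time record is sent, stored, or
# reduced to position-only transmission depending on the three conditions.
def run_terminal_routine(state, report_pressed, tx_command, suppress_command):
    """state: dict with 'image', 'position', 'time' already acquired in
    steps ST1/ST2. Returns (action, payload)."""
    record = (state["image"], state["position"], state["time"])
    if report_pressed:                 # ST3 -> ST4: notify with abnormality info
        return ("send_abnormality", record)
    if tx_command:                     # ST5/ST6 -> ST7: image transmission command
        return ("send_image", record)
    if suppress_command:               # ST8 -> ST10: store locally in RAM
        return ("store_locally", record)
    return ("send_position_only", (state["position"],))  # ST9
```

Note the precedence implied by the flowchart: a pressed report button takes priority over any command from the central monitoring device 20.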
  • since the central monitoring device 20 displays the current position of each passenger car V on the map information (step ST12 in FIG. 4), the supervisor can grasp those positions in a timely manner.
  • as shown in FIG. 4, the central monitoring device 20 acquires position information and abnormality information from all the passenger cars equipped with the monitoring terminal device 10 (step ST11). If the communication load is not high, the image information may also be acquired at this timing.
  • in step ST12, the passenger car V is displayed, based on the position information acquired in step ST11, on the map information of the map database shown on the display 24, as in the upper left of FIG. 1. Since the position information of the passenger car V is acquired and transmitted at a predetermined timing in each routine of FIG. 3, the supervisor can grasp the current position of the passenger car V in a timely manner.
  • in step ST13, it is determined whether or not abnormality information from the monitoring terminal device 10 of a passenger car V, that is, a notification that an abnormality relating to security such as an accident or a crime has occurred, has been received.
  • this abnormality information is output when a passenger of the passenger car V presses the notification button 16 of the monitoring terminal device 10. If there is abnormality information, the passenger car V from which the abnormality information was output is identified in step ST14, image information and time information are received from the monitoring terminal device 10 of that passenger car, and the image information is displayed on the display 24. Further, as shown in the upper left of FIG. 1, the passenger car displayed on the map information is highlighted, for example by changing its color, so that it can be distinguished from other passenger cars. Thereby, the position where the abnormality has occurred can be recognized visually on the map information, and the abnormality content can be grasped on the display 24.
  • image information can also be acquired from a passenger car Vb traveling in the vicinity of the passenger car Va that output the abnormality information. Therefore, the abnormality content can be grasped in detail from a plurality of pieces of image information in addition to the image information from the passenger car Va.
  • when abnormality information is received, it may be transmitted to emergency vehicles such as police cars, ambulances, and fire engines, with the image information attached in order to convey the abnormal content. As a result, an emergency vehicle can be dispatched before a report from the site arrives, enabling a quick response to an accident or crime.
  • in step ST16, a command for transmitting only position information, that is, a command for suppressing the transmission of image information, is output to the surrounding moving bodies Vc. Although the image information from these passenger cars Vc cannot be said to be effective in addition to the image information from the reporting vehicle Va, this does not mean that the possibility of obtaining effective information is zero. The presence or absence of the image transmission suppression command output here is determined in step ST8 of FIG. 3 described above. If the transmission of image information is suppressed in step ST16, the image information is temporarily stored in the RAM or the like of the monitoring terminal device 10 of the moving body Vc, as described above for step ST10 of FIG. 3, and is transmitted later if necessary.
  • the central monitoring device 20 of this example suppresses or releases the transmission of image information depending on the relationship with the communication area Ta of the base station to which the monitoring terminal device 10 of the reporting vehicle Va belongs. That is, as shown in FIG. 13, an image transmission suppression command is output to a moving body Vc that exists within the radius Rb (but beyond the radius Ra) from the reporting vehicle Va and belongs to the communication area Ta of the same base station, whereas the suppression is released for a moving body Vd that belongs to the communication area Tb of another base station, since the line capacity of that base station does not need to be secured. Similarly, transmission of image information is not suppressed for a moving body Vf that exists farther than the radius Rb from the reporting vehicle Va and belongs to the communication area Tb of another base station, but it is suppressed for a moving body Ve that exists farther than the radius Rb and belongs to the communication area Ta of the same base station.
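The suppression rule of FIG. 13 described above can be expressed compactly as a sketch; the function names, the 'transmit' case for the vicinity, and the parameterization are illustrative assumptions:

```python
# Illustrative sketch of the FIG. 13 decision: whether a moving body's image
# transmission is requested, suppressed, or left unsuppressed, based on its
# distance from the reporting vehicle Va and its base-station communication area.
import math

def image_tx_decision(distance, same_base_station, ra, rb):
    """Sketch of the FIG. 13 decision for one moving body.

    distance: distance from the reporting vehicle Va.
    same_base_station: True if the body belongs to the same communication
    area Ta as Va; ra, rb: the radii Ra and Rb from the text. Note that rb
    does not change the outcome here: per the text, suppression applies both
    within and beyond Rb when the body shares Va's communication area."""
    if distance <= ra:
        return "transmit"   # vicinity (e.g. Vb): image transmission command
    if same_base_station:
        return "suppress"   # Vc (ra < d <= rb) and Ve (d > rb) in area Ta
    return "release"        # Vd / Vf in another base station's area Tb

def euclid(p, q):
    # Planar distance between two positions, usable as the distance argument.
    return math.hypot(p[0] - q[0], p[1] - q[1])
```

The design point this captures is that the decisive factor is the shared base station (line capacity), not distance alone.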
  • in step ST17, all the position information, image information, and time information received from the monitoring terminal devices 10 are recorded on a recording medium. This record is used for the post-hoc resolution of an accident or crime. If there is no abnormality information in step ST13, the process returns to step ST11 without performing the processes of steps ST14 to ST17.
  • as described above, the monitoring system of the present embodiment has the following effects. (1) Since the monitoring system 1 of this example attaches in-vehicle cameras 11 to a plurality of passenger cars V, images the surroundings with these cameras, and detects the position information of each passenger car V with the position detection device 15, a wide range can be monitored with a small number of cameras compared with fixed cameras. Further, since the plurality of passenger cars V travel at random, blind spots can be reduced with a small number of in-vehicle cameras compared with fixed cameras. In addition, since the cameras are mounted on the passenger cars V, they are less likely than fixed cameras to be vandalized in order to evade monitoring. Moreover, since a wide range can be monitored, the patrol work of the supervisor can be reduced.
  • since the monitoring system 1 of this example acquires time information in addition to position information and image information, the position information and image information can be arranged along the time axis when performing a post-hoc analysis of an accident or crime, which can contribute to the resolution of the case.
  • since the monitoring system 1 of the present example always acquires and transmits position information but acquires and transmits image information only in an abnormal situation or when requested by the central monitoring device 20, the communication capacity can be minimized and a decrease in communication speed can be suppressed. Further, the information recording capacity can be minimized, resulting in an inexpensive and compact system.
  • since the notification button 16 is provided in the monitoring terminal device 10 of this example, when a passenger of the passenger car V finds an abnormality, the central monitoring device 20 can be notified immediately together with the position information and the image information. As a result, the contents of the abnormality can be grasped more accurately, quickly, and easily than through an explanation by telephone or the like, which can contribute to the initial investigation of the incident.
  • since the monitoring terminal device 10 of this example transmits the image information together with the position information when it receives an image transmission command from the central monitoring device 20, the supervisor can confirm the video of a desired position at the place where the central monitoring device 20 is installed.
  • since the central monitoring device 20 of this example displays the position information received from the monitoring terminal devices on the map information on the display 24, the arrangement of the passenger cars V from which information can be collected can be grasped. As a result, the distribution of the areas from which image information can be acquired can be understood, which can contribute to the supervisor's monitoring plan. Moreover, since the image information received from the monitoring terminal device 10 is displayed on the display 24 as necessary, the supervisor can confirm the video of a desired position at the place where the central monitoring device 20 is installed without going to the site.
  • when the central monitoring device 20 of this example receives abnormality information from the monitoring terminal device 10, it highlights the reporting passenger car V on the display 24 and displays the image information received from that passenger car V on the display 24, so the position and the video can be confirmed immediately, and a prompt response to the incident can be expected.
  • when the central monitoring device 20 of this example receives abnormality information from the monitoring terminal device 10, it transmits an image transmission command to a passenger car Vb traveling in the vicinity of the passenger car Va that transmitted the abnormality information, as shown in FIG. 13. Therefore, image information can be acquired immediately not only from one passenger car V but also from a plurality of passenger cars V, and the contents of the abnormality can be grasped easily.
  • when the central monitoring device 20 of this example receives abnormality information from the monitoring terminal device 10, it transmits an image transmission suppression command to the passenger cars Vc in the vicinity, as shown in FIG. 13, so the transmission speed of the image information from the reporting vehicle Va is increased, or a higher-quality image can be transmitted from the reporting vehicle Va instead. Further, in a passenger car V in which the transmission of image information is suppressed, the image information is stored in the RAM of the monitoring terminal device 10, so that it can be used for post-hoc analysis. In addition, since the image transmission suppression is released for passenger cars Vd belonging to a different communication base station regardless of their distance from the reporting vehicle Va, image information can still be aggregated in the central monitoring device 20 even when multiple accidents occur at the same time.
  • when the central monitoring device 20 of this example receives abnormality information from the monitoring terminal device 10, it transmits the abnormality information to emergency vehicles such as police cars, ambulances, and fire engines, so that the incident can be handled quickly. Further, by transmitting the position information and the image information to the emergency vehicle together with the abnormality information, the abnormality content can be grasped quickly and accurately on the emergency vehicle side.
  • in the embodiment described above, the position information of the passenger car V and the image information from the in-vehicle cameras 11a to 11e are acquired, but image information from a fixed camera 11f installed in the city, as shown in FIG. 1, may also be obtained.
  • as the passenger car V that acquires position information and image information, it is desirable to use vehicles such as the taxi V1 and buses. Further, the number of in-vehicle cameras may be three or fewer instead of the four cameras 11a to 11d, particularly in an environment where image information can be acquired from many passenger cars V, such as a monitoring area with heavy traffic.
  • the passenger car V corresponds to the moving body according to the present invention
  • the position detection device 15 corresponds to the position detection means according to the present invention
  • the in-vehicle camera 11 and the image processing device 12 correspond to the image generation means according to the present invention
  • the on-vehicle control device 14 corresponds to the information acquisition control means according to the present invention
  • the CPU of the on-vehicle control device 14 corresponds to the time detection means according to the present invention
  • the notification button 16 corresponds to the command input means according to the present invention
  • the communication device 13 corresponds to the command receiving means and the information output means according to the present invention
  • the communication device 23 corresponds to the information input means, the abnormality information receiving means, and the command output means according to the present invention
  • the display 24 corresponds to the first display control means and the second display control means according to the present invention

Abstract

The invention relates to a monitoring system (1) comprising a monitoring terminal device (10) that acquires monitoring information, and a central monitoring device (20) into which the monitoring information is input via a telecommunication network (30). The monitoring terminal device comprises: position detection means (15) for detecting position information on a plurality of moving bodies (V); image generation means (11, 12) mounted on each of said moving bodies to generate image information by imaging the surroundings of the corresponding moving body; and abnormality information output means (16) for outputting abnormality information when an abnormal state is found. The central monitoring device comprises: information input means (23) into which the position information and the image information output from the monitoring terminal device are input; and command output means (21) for outputting, to the monitoring terminal device of a moving body present in a first predetermined region whose origin is the position of a moving body that has output the abnormality information, a command that suppresses the transmission of image information.
PCT/JP2012/083465 2012-01-26 2012-12-25 Système de surveillance WO2013111492A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-014617 2012-01-26
JP2012014617 2012-01-26

Publications (1)

Publication Number Publication Date
WO2013111492A1 true WO2013111492A1 (fr) 2013-08-01

Family

ID=48873231

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/083465 WO2013111492A1 (fr) 2012-01-26 2012-12-25 Système de surveillance

Country Status (1)

Country Link
WO (1) WO2013111492A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007076404A (ja) * 2005-09-12 2007-03-29 Sony Ericsson Mobilecommunications Japan Inc 車両用データ取得システムおよび車両用データ取得装置
JP2007328477A (ja) * 2006-06-07 2007-12-20 Hitachi Ltd 通信システム、通信端末および情報処理装置
JP2009205368A (ja) * 2008-02-27 2009-09-10 Denso Corp 事故通報システム、及び、車載装置

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015179333A (ja) * 2014-03-18 2015-10-08 株式会社日本総合研究所 自動運転交通システムを利用した地域見守システム及び地域見守方法
EP3599134A1 (fr) * 2018-07-25 2020-01-29 Denso Ten Limited Dispositif et procédé de signalisation d'accidents
JP2020017077A (ja) * 2018-07-25 2020-01-30 株式会社デンソーテン 事故通報装置及び事故通報方法
US20200031299A1 (en) * 2018-07-25 2020-01-30 Denso Ten Limited Accident report device and accident report method
US10713829B2 (en) 2018-07-25 2020-07-14 Denso Ten Limited Accident report device and accident report method
JP7168367B2 (ja) 2018-07-25 2022-11-09 株式会社デンソーテン 事故通報装置
CN114868167A (zh) * 2019-12-13 2022-08-05 大和通信株式会社 安全系统和监控方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12866441

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12866441

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP