JP2016092614A - On-vehicle camera system

Info

Publication number
JP2016092614A
Authority
JP
Japan
Prior art keywords
vehicle
data
shooting
camera
unit
Legal status
Pending
Application number
JP2014225391A
Other languages
Japanese (ja)
Inventor
Tsutomu Adachi (勉 足立)
Takeshi Kawanishi (毅 川西)
Shigeru Hayashi (林 茂)
Daisuke Mori (大介 毛利)
Takemasa Yokoi (丈誠 横井)
Kenji Takenaka (謙史 竹中)
Takeyoshi Kondo (健純 近藤)
Hiroshi Maekawa (博司 前川)
Original Assignee
ADC Technology Inc. (エイディシーテクノロジー株式会社)
Application filed by ADC Technology Inc. (エイディシーテクノロジー株式会社)
Priority to JP2014225391A
Publication of JP2016092614A
Application status: Pending

Abstract

PROBLEM TO BE SOLVED: To make it possible to provide high value-added services based on images captured by an on-vehicle camera.

SOLUTION: An on-vehicle camera system comprises at least one camera configured to capture images outside or inside a vehicle and to output the data obtained by the imaging. The data captured by the camera is used for predetermined support control that supports the driver's operation of the vehicle. When a specific imaging condition is established (for example, detection of an unauthorized operation, or detection of a person or animal) during an imaging suspension period in which the support control is not performed and the camera is stopped, the camera is activated and its imaging data is stored in a storage unit.

SELECTED DRAWING: Figure 6

Description

  The present invention relates to an in-vehicle camera system that is mounted on a vehicle and configured to take an image with an in-vehicle camera and execute processing based on the image.

  Various technologies have been proposed and put into practical use in which a camera is mounted on a vehicle to photograph the surroundings of the vehicle, and various vehicle controls are performed based on the photographing results. Patent Document 1 (JP 2011-210087 A) describes a technique for generating an alarm when an in-vehicle camera captures images of the vehicle's surroundings and an obstacle around the vehicle is detected. Patent Document 2 (JP 2012-208701 A) describes a technique for capturing the vehicle's travel path with an in-vehicle camera and outputting a warning when the vehicle is about to stray into the oncoming lane, so that a collision with an oncoming vehicle can be avoided.

Patent Document 1: JP 2011-210087 A
Patent Document 2: JP 2012-208701 A

As described in Patent Documents 1 and 2, mounting a camera on a vehicle makes it possible to support the driver's operation of the vehicle and thereby improve safety.
On the other hand, as long as the in-vehicle camera is operable (that is, as long as the power for operating it is not interrupted), it can be operated as necessary to obtain various kinds of image information. Using the images of the in-vehicle camera only for the driving support described above therefore leaves much room for improvement from the viewpoint of making effective use of the camera.

  The present invention has been made in view of the above problems, and an object thereof is to provide a service with high added value based on a photographed image of a vehicle-mounted camera.

  An in-vehicle camera system according to one aspect of the present invention, made to solve the above problems, is mounted on a vehicle and includes at least one photographing unit, a support control unit, and a specific photographing control unit. The photographing unit is configured to be able to capture images inside or outside the vehicle and output the captured data. The support control unit performs predetermined support control for supporting the driver's driving operation of the vehicle based on the photographing data of the photographing unit. The specific photographing control unit activates the photographing unit when a specific photographing condition is satisfied during a photographing suspension period, in which the support control unit is not performing support control and the photographing unit is stopped, and stores the resulting photographing data in a storage unit.

  According to the in-vehicle camera system having the above configuration, the photographing unit used for support control is activated when a specific photographing condition is satisfied during the photographing suspension period, and photographing data is acquired. By appropriately setting the specific photographing conditions, the photographing unit can be operated effectively to acquire photographing data, which makes it possible to provide high value-added services based on that data.

  The specific photographing conditions are conditions under which the state inside the vehicle, outside the vehicle, or of the vehicle itself makes it possible to provide a value-added service by photographing the outside or inside of the vehicle. The period in which support control is not performed is a period in which photographing by the photographing unit is, in principle, unnecessary; nevertheless, when a specific photographing condition is satisfied, photographing is performed. The photographing data can thereby be put to effective use for purposes other than support control, and services with high added value can be provided.
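  To make the claimed control flow concrete, here is a minimal Python sketch of the specific photographing control. Every class and method name (SpecificShootingController, camera.start(), and so on) is hypothetical, since the patent specifies behavior rather than an implementation.

    # Illustrative only: gates a photographing unit on specific shooting
    # conditions during the shooting suspension period.
    class SpecificShootingController:
        def __init__(self, camera, storage, conditions):
            self.camera = camera          # photographing unit (e.g., one of cameras 2-8)
            self.storage = storage        # storage unit (e.g., memory 32)
            self.conditions = conditions  # predicates, e.g. "unauthorized operation detected"

        def poll(self, support_control_active):
            if support_control_active:
                # Outside the suspension period the support control unit
                # is already operating the camera; nothing to do here.
                return
            if any(condition() for condition in self.conditions):
                # A specific shooting condition holds: activate the camera
                # and store what it captures.
                self.camera.start()
                self.storage.save(self.camera.capture())
            else:
                self.camera.stop()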

  The in-vehicle camera system may include a person approach detection unit configured to be able to detect that a person outside the vehicle is approaching it. In this case, at least the detection by the person approach detection unit that a person is approaching the vehicle may be set as a specific photographing condition.

  That is, the photographing unit used for support control is activated when a person approaches the vehicle even during the photographing suspension period, and photographing data is acquired. A person approaching the vehicle can thus be photographed, improving the crime prevention performance of the vehicle while suppressing any increase in cost.

  The in-vehicle camera system may further include a communication unit and a transmission control unit. The communication unit is provided to perform data communication with an information processing device outside the vehicle. The transmission control unit has at least one of a first transmission function, for transmitting the photographing data stored in the storage unit by the specific photographing control unit to the information processing device via the communication unit, and a second transmission function, for activating the photographing unit when a shooting instruction is received from the information processing device and transmitting the captured data to the information processing device via the communication unit.

  With at least one of the first and second transmission functions, the information processing device outside the vehicle can acquire image data captured at the vehicle. The surroundings of the vehicle can therefore be monitored by remote operation, further improving the vehicle's crime prevention performance.
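  A sketch of the two transmission functions might look as follows; again the names are invented, and either function alone satisfies the "at least one" wording above.

    class TransmissionController:
        def __init__(self, communication_unit, storage, camera):
            self.comm = communication_unit  # link to the external information processing device
            self.storage = storage
            self.camera = camera

        def push_stored_data(self):
            # First transmission function: send shooting data already stored
            # by the specific photographing control to the external device.
            for clip in self.storage.pending_clips():
                self.comm.send(clip)

        def on_shooting_instruction(self):
            # Second transmission function: on an instruction from the external
            # device, activate the photographing unit and transmit its data.
            self.camera.start()
            self.comm.send(self.camera.capture())
            self.camera.stop()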

FIG. 1 is an explanatory diagram showing a schematic configuration of the vehicle of the embodiment.
FIG. 2 is a perspective view showing a schematic configuration inside the glove box.
FIG. 3 is a block diagram showing the electrical configuration of the vehicle of the embodiment.
FIG. 4 is a flowchart of the operation mode setting process.
FIG. 5 is a flowchart of the driving support process.
FIG. 6 is a flowchart of the monitoring control process.
FIG. 7 is a flowchart of the fraud monitoring process.
FIG. 8 is an explanatory diagram showing the database of shooting data.
FIG. 9 is a flowchart of the specific monitoring process.
FIG. 10 is a flowchart of the regular monitoring process.
FIG. 11 is an explanatory diagram showing the photographed vehicle database.
FIG. 12 is a flowchart of the monitoring result notification process.
FIG. 13 is a flowchart of the glove box monitoring process.
FIG. 14 is a flowchart of the data amount adjustment process.
FIG. 15 is a flowchart of the emergency recording process.
FIG. 16 is a flowchart of the driver determination process.
FIG. 17 is a flowchart of the shooting data acquisition process.
FIG. 18 is a flowchart of the server transmission process.
FIG. 19 is an explanatory diagram showing a schematic configuration of the front camera of the second embodiment.
FIG. 20 is an explanatory diagram for explaining the operation of the front camera of the second embodiment.
FIG. 21 is an explanatory diagram showing a schematic configuration of the center camera of the second embodiment.
FIG. 22 is a flowchart of the camera mutual monitoring process.

Embodiments to which the present invention is applied will be described below with reference to the drawings.
[First Embodiment]
(1) Configuration of Vehicle 1
FIG. 1A shows a side view of the vehicle 1 of the present embodiment, and FIG. 1B shows a top view of the vehicle 1. Note that FIGS. 1A and 1B are simplified, mainly to show clearly how the various cameras and sensors are arranged in the vehicle 1.

  As shown in FIGS. 1A and 1B, the vehicle 1 has, as cameras for photographing the inside and outside of the vehicle 1, at least a front camera 2, a rear camera 3, a left side camera 4, a right side camera 5, an indoor camera 6, a vehicle bottom camera 7, and a glove box (hereinafter abbreviated as "G box") camera 8.

  Each of the cameras 2 to 8 is capable of shooting color images and moving images. Each of the cameras 2 to 8 may be a monocular camera, or may be a stereo camera that has a plurality of lenses and can also acquire information in the depth direction.

  The front camera 2 is provided so as to face forward on the front end side of the ceiling in the passenger compartment. The front camera 2 can photograph the front of the vehicle 1 in a wide range. The rear camera 3 is provided so as to face rearward on the rear end side of the ceiling in the passenger compartment. The rear camera 3 can shoot the rear of the vehicle 1 in a wide range.

  The left side camera 4 is provided on the left side surface of the vehicle 1 so as to face the left side. The left side camera 4 can shoot the left side of the vehicle 1 in a wide range. The right side camera 5 is provided on the right side surface of the vehicle 1 so as to face the right side. The right side camera 5 can shoot the right side of the vehicle 1 in a wide range.

  The indoor camera 6 is provided so as to face rearward (toward the vehicle interior) on the front end side of the ceiling in the vehicle interior. With the indoor camera 6, it is possible to take an image of the whole area of the vehicle interior, centered on the driver's upper body. The vehicle bottom camera 7 is provided at the bottom of the vehicle 1 and can photograph the entire bottom region of the vehicle 1.

  As shown in FIG. 2, the G box camera 8 is provided on the inner wall of the G box 41 in front of the passenger seat in the passenger compartment. In a state where the lid 42 of the G box 41 is closed, the G box camera 8 is not visible to the passenger in the vehicle interior. When the lid 42 is opened, the G box camera 8 in the G box 41 appears.

  As will be described later, the G box camera 8 does not operate while the lid 42 is closed, and operates when the lid 42 is open. The range that can be photographed by the G box camera 8 is an internal space 43 of the G box 41 and a part of the surrounding area of the G box 41. For this reason, the G box camera 8 can shoot an object stored in the G box 41 and a state in which the stored object is put in and out of the G box 41. When the occupant sitting in the driver's seat or the passenger seat opens the lid 42 of the G box 41 and puts in / out the stored items, the occupant's face is also included in the photographing range.

  As shown in FIGS. 1A and 1B, the vehicle 1 is also provided with a front infrared sensor 11, a rear infrared sensor 12, a left side infrared sensor 13, and a right side infrared sensor 14. The infrared sensors 11 to 14 are provided mainly to detect that a person or an animal is approaching the vehicle 1 from outside. Since each of the infrared sensors 11 to 14 has a wide detection range, they can detect the approach of a person or an animal from any direction relative to the vehicle 1.

  Specifically, the front infrared sensor 11 is provided at the front end of the vehicle 1. The front infrared sensor 11 can detect a person or animal approaching the vehicle 1 from the front side of the vehicle 1. The rear infrared sensor 12 is provided at the rear end of the vehicle 1. The rear infrared sensor 12 can detect a person or an animal approaching the vehicle 1 from the rear side of the vehicle 1. The left side infrared sensor 13 is provided on the left side surface of the vehicle 1. The left side infrared sensor 13 can detect a person or an animal approaching the vehicle 1 from the left side of the vehicle 1. The right side infrared sensor 14 is provided on the right side surface of the vehicle 1. The right side infrared sensor 14 can detect a person or an animal approaching the vehicle 1 from the right side of the vehicle 1.

As shown in FIGS. 1A and 1B, the vehicle 1 is also provided with a solar radiation sensor 21, a handle (steering wheel) pressure sensor 22, and an impact sensor 23.
The solar radiation sensor 21 is installed at the lower part of the front window 10 at the front of the vehicle interior. The solar radiation sensor 21 can detect the amount of solar radiation on the vehicle 1 and, consequently, the brightness around the vehicle 1.

  The handle pressure sensor 22 is provided at a plurality of locations (two locations in the present embodiment) on the handle 20 operated by the driver for steering. The pressure sensor 22 can detect whether or not the driver is grasping the handle 20 and, when the driver is grasping it, with how much force (grip force).

  The impact sensor 23 can detect the impact when an impact is applied to the vehicle 1 from the outside. In the present embodiment, for simplicity of explanation, an example in which one impact sensor 23 is provided is shown. However, a plurality of impact sensors 23 may be provided to improve impact detection accuracy.

In addition, as shown in FIGS. 1A and 1B, the vehicle 1 includes a headlight 56, a blinker light 57, an alarm generation unit 58, and an ultrasonic generation unit 59.
The headlight 56 is a well-known lamp for irradiating light toward the front of the vehicle 1. The blinker lights 57 are well-known lamps, provided at the four corners of the vehicle, for notifying those outside the vehicle 1 that the traveling direction is changing to the right or to the left.

  The alarm generation unit 58 is provided to generate an alarm sound around the vehicle 1. The vehicle 1 of the present embodiment is configured to operate the alarm generation unit 58 when, in a specific state (for example, in the monitoring mode described later), there is a possibility of harm to the vehicle 1.

  Two ultrasonic generation units 59 are provided in the present embodiment. The ultrasonic generation units 59 generate ultrasonic waves in a specific frequency region around the vehicle 1. That frequency region is a predetermined range including frequencies that have been experimentally shown to be disliked by birds and beasts (for example, dogs, cats, crows, and pigeons). The vehicle 1 of the present embodiment is configured so that, when it detects in a specific state that a bird or animal is approaching, the ultrasonic generation unit 59 is operated to deter the bird or animal from approaching the vehicle 1.

(2) Electrical configuration of vehicle 1
The electrical configuration of the vehicle 1 will be specifically described with reference to FIG. 3. As shown in FIG. 3, the vehicle 1 includes a control device 30. The control device 30 mainly has a driving support function that supports the driving operation by the driver while the vehicle 1 is traveling, and a monitoring function that monitors the inside and outside of the vehicle 1 when the vehicle 1 is in a specific state.

  That is, the control device 30 realizes at least one of the driving support function and the monitoring function based on the photographing results of at least one of the cameras 2 to 8 described above. Although the driving support function and the monitoring function could be executed in parallel, in the present embodiment they are described as being executed selectively. Specifically, the control device 30 of the present embodiment has, as operation modes, a driving support mode that realizes the driving support function, a monitoring mode that realizes the monitoring function, and a normal mode in which neither function is operated. The operation mode is set to one of these three modes according to the state of the vehicle.

The control device 30 includes a control unit 31, a memory 32, and an RTC (real-time clock) 33.
Specifically, the memory 32 includes ROM, RAM, and various other storage media (e.g., EEPROM, flash memory, and a hard disk drive). The control unit 31 realizes the various functions, including the driving support function and monitoring function described above, by running the various programs stored in the memory 32. The control unit 31 includes at least a CPU. The RTC 33 is a known timekeeping device that outputs information indicating the current date and time.

  Each of the cameras 2 to 8 shown in FIGS. 1A and 1B is connected to the control device 30. The control unit 31 of the control device 30 individually controls the operations of the cameras 2 to 8, acquires shooting results (shooting data) from the cameras 2 to 8, and stores them in the memory 32.

  When the operation mode is the driving support mode, the control unit 31 operates the cameras 2 to 7 other than the G box camera 8 (hereinafter collectively referred to as the "camera group for shooting inside and outside the vehicle") and stores their shooting data in the memory 32. In the driving support mode, the shooting data of this camera group is stored repeatedly at regular intervals.

  The control unit 31 operates the G box camera 8 while the G box lid 42 is opened, and stores shooting data of the G box camera 8 in the memory 32. Shooting data of the G box camera 8 is stored as moving image data. However, still image shooting data may be acquired and stored at a predetermined timing (for example, every second) while the G box lid 42 is opened.

  The control unit 31 can recognize various situations inside and outside the vehicle based on each shooting data of the camera group for shooting inside and outside the vehicle. For example, the presence / absence of an occupant in the passenger compartment or the occupant's face can be recognized from the shooting data of the indoor camera 6. Further, from the image data of the front camera 2, it is possible to recognize objects in front of the vehicle 1 (including people and animals), lane markings, pedestrian crossings, and the like.

  The infrared sensors 11 to 14 shown in FIGS. 1A and 1B are also connected to the control device 30. When the operation mode is the monitoring mode, the control unit 31 individually controls the infrared sensors 11 to 14 and detects, based on their detection signals, whether or not a person or an animal is approaching the vehicle 1.

  The solar radiation sensor 21, the handle pressure sensor 22, and the impact sensor 23 shown in FIGS. 1A and 1B are also connected to the control device 30. The control unit 31 can determine the brightness around the vehicle based on the detection signal from the solar radiation sensor 21 and judge whether it is nighttime or a similarly dark situation (hereinafter simply "nighttime"). In the present embodiment, each of the cameras 2 to 8 is equipped with an infrared LED so that the state around the vehicle can be photographed satisfactorily even at night. When the control unit 31 determines from the solar radiation sensor 21 that it is nighttime around the vehicle, it turns on the infrared LEDs of the cameras 2 to 8. As a result, even at night, the cameras 2 to 8 can appropriately acquire shooting data inside and outside the vehicle.
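  The night-time LED control described above amounts to a simple threshold check; the sketch below assumes a hypothetical irradiance() reading and threshold value, neither of which is specified in the patent.

    def update_ir_leds(solar_sensor, cameras, night_threshold):
        # Turn each camera's infrared LED on when the solar radiation
        # sensor 21 reports night-like darkness, and off otherwise.
        is_night = solar_sensor.irradiance() < night_threshold
        for camera in cameras:
            camera.set_ir_led(is_night)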

  Further, the control unit 31 can detect, based on the detection signal from the handle pressure sensor 22, whether or not the driver is holding the handle 20 and, when the driver is holding it, the grip force. The control unit 31 can also detect, based on the detection signal from the impact sensor 23, that an impact has been applied to the vehicle 1.

  Further, a direction sensor 24 and a brake sensor 25 are connected to the control device 30. The control unit 31 can detect the direction in which the front of the vehicle 1 faces based on the detection signal from the direction sensor 24, and can detect the amount by which the driver depresses the brake pedal based on the detection signal from the brake sensor 25.

  In addition, as shown in FIG. 3, the vehicle 1 includes a GPS communication unit 61, a road-to-vehicle communication unit 62, a first wireless communication unit 63, and a second wireless communication unit 64 as components connected to the control device 30.

  The GPS communication unit 61 receives radio waves from a plurality of GPS (Global Positioning System) satellites, and outputs information (GPS information) included in these received radio waves to the control device 30. The control unit 31 can acquire the current position information of the vehicle 1 based on the information received by the GPS communication unit 61. The current position information that can be acquired by the control unit 31 includes at least the latitude and longitude of the position where the vehicle 1 exists.

  The first wireless communication unit 63 is a wireless communication module for performing data communication with the relay station 72 provided on the ground by a first wireless communication method. The relay station 72 is connected to the Internet 70, so the control unit 31 can perform data communication via the first wireless communication unit 63 with various communication devices connected to the Internet 70, including a server 73. In the present embodiment, the first wireless communication method is LTE, a well-known mobile phone communication standard.

  As will be described later, image data captured by the cameras 2 to 8 of the vehicle 1 is uploaded to the server 73. Shooting data is appropriately uploaded to the server 73 not only from the vehicle 1 but also from other vehicles. The shooting data uploaded from each vehicle is stored as a shooting data database (see FIG. 8 described later).

  The second wireless communication unit 64 is a wireless communication module for performing direct data communication, by a second wireless communication method, with a vehicle 76 other than the host vehicle and with communication terminals (for example, a mobile terminal 77 such as a smartphone). The other vehicle 76 and the mobile terminal 77 also have the function of performing data communication by the first wireless communication method, so the control unit 31 can also perform data communication with them by the first wireless communication method (that is, via the first wireless communication unit 63), through the relay station 71 and the Internet 70.

  The road-to-vehicle communication unit 62 is a wireless communication module for realizing data communication between a road communication device (not shown) provided on the road and the control unit 31. The road communication device may be connected to the Internet in the same way as the relay station 71; in that case, the control unit 31 can perform data communication with the server 73 via the road-to-vehicle communication unit 62.

  Further, the headlight 56, the blinker lights 57, the alarm generation unit 58, and the ultrasonic generation unit 59 shown in FIGS. 1A and 1B are connected to the control device 30, and the operations of all of them are controlled by the control unit 31.

  As shown in FIG. 3, the vehicle 1 is also provided with an operation unit 46 and a display unit 47 as components connected to the control device 30. The operation unit 46 is an input interface for accepting various input operations on the vehicle 1 by an occupant of the vehicle 1, including the driver. The display unit 47 is an output interface for visually providing various information to the occupants of the vehicle 1, including the driver.

  In this embodiment, the operation unit 46 includes at least a power switch 46a, a driving support switch 46b, and a monitoring switch 46c. In FIG. 3, “switch” is abbreviated as “SW”.

  The power switch 46a is a switch for turning the power of the vehicle 1 on and off; a known ignition switch is one kind of power switch 46a. When the power of the vehicle 1 is turned on by the power switch 46a, the various devices in the vehicle 1 that operate on the power source (for example, the on-vehicle battery) become operable. When the power of the vehicle 1 is turned off by the power switch 46a, those devices stop operating, with at least the control device 30 excepted. That is, in the present embodiment, even if the power switch 46a is turned off, at least the control device 30 continues to operate on supplied power.

  In the present embodiment, as will be described later, even when the power switch 46a is off, if the operation mode of the vehicle 1 is set to the monitoring mode, power is also supplied to the predetermined devices other than the control device 30 that are to be operated in the monitoring mode, so that they can operate. These predetermined devices are the devices necessary for executing the monitoring control process of FIG. 6 described later (that is, for realizing the monitoring function), and include at least the cameras 2 to 8, the infrared sensors 11 to 14, the impact sensor 23, the vehicle speed sensor 54, the headlight 56, the blinker lights 57, the alarm generation unit 58, the ultrasonic generation unit 59, and the communication units 61 to 64.

  The driving support switch 46b is a switch for causing the driver to select whether or not to operate the driving support function. The driving support process of FIG. 5 described later for realizing the driving support function is executed when the driving support switch 46b is turned on.

  The monitoring switch 46c is a switch for causing the driver to select whether or not to activate the monitoring function. The monitoring control process of FIG. 6 described later for realizing the monitoring function is executed when the monitoring switch 46c is turned on.

  Supply of operation power to the cameras 2 to 8 is controlled by the control device 30. That is, the control device 30 can individually control the operation power supply to the cameras 2 to 8. In other words, the control device 30 can individually control the operation and non-operation (stop) of the cameras 2 to 8. When the operation mode of the vehicle 1 is the normal mode, the control device 30 stops all the cameras 2 to 8. However, the G box camera 8 is operated while the lid 42 of the G box 41 is opened to perform photographing.

  More specifically, among the cameras 2 to 8, the camera group for shooting inside and outside the vehicle (all but the G box camera 8) basically operates, with power supplied, during the driving support mode, and is stopped in the other operation modes. However, in the monitoring mode, when a state arises in which shooting by this camera group should be executed (for example, during the processing of S132, S134, and S136 in FIG. 6), the camera group is activated and shooting is performed.

  As shown in FIG. 3, the vehicle 1 includes a travel drive control unit 51, a brake control unit 52, a steering control unit 53, and a vehicle speed sensor 54 as components connected to the control device 30.

  The travel drive control unit 51 controls an engine and a transmission (not shown) based on various information such as a depression amount of an accelerator pedal (not shown), an operation position of a shift lever (not shown), a vehicle speed, and an engine speed. Thus, the traveling of the vehicle 1 is controlled. The brake control unit 52 controls a brake device (not shown) based on the depression amount of a brake pedal (not shown). The steering control unit 53 has a so-called electric power steering function, and assists the operation of the handle 20 by a driver with a motor.

(3) Description of Various Processes Performed by Control Unit 31
Next, various processes performed by the control unit 31 of the control device 30 will be described with reference to the drawings.

(3-1) Operation Mode Setting Process
First, the operation mode setting process will be described with reference to FIG. 4. The operation mode setting process shown in FIG. 4 is a process for setting the operation mode of the control device 30 to any one of the normal mode, the driving support mode, and the monitoring mode.

  The control unit 31 repeatedly executes the operation mode setting process shown in FIG. 4 at a predetermined cycle. When the operation mode setting process shown in FIG. 4 is started, the control unit 31 determines whether or not the power switch 46a of the vehicle 1 is turned off in S10. If the power switch 46a is turned off (S10: YES), in S20, it is determined whether or not the monitoring switch 46c is turned on. When the monitoring switch 46c is turned off (S20: NO), the operation mode setting process is terminated. When the monitoring switch 46c is turned on (S20: YES), the operation mode is set to the monitoring mode in S30.

  If the power switch 46a is on (S10: NO), it is determined in S40 whether there is an occupant in the vehicle 1 or not. Whether or not there is an occupant in the vehicle can be determined, for example, by operating the indoor camera 6 and based on the captured data. If there is no passenger in the vehicle (S40: NO), it is determined in S50 whether the monitoring switch 46c is turned on. When the monitoring switch 46c is turned on (S50: YES), the process proceeds to S30, and the operation mode is set to the monitoring mode. If the monitoring switch 46c is turned off (S50: NO), the operation mode is set to the normal mode in S60.

  If there is an occupant in the vehicle in S40 (S40: YES), it is determined in S70 whether or not the driving support switch 46b is turned on. When the driving support switch 46b is turned on (S70: YES), the operation mode is set to the driving support mode in S80. When the driving support switch 46b is turned off (S70: NO), the process proceeds to S60, and the operation mode is set to the normal mode.
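  The branching of S10 to S80 can be captured in a few lines. The following Python sketch mirrors FIG. 4 as described above (returning None where the flowchart simply ends with the mode unchanged); all identifiers are invented for illustration.

    from enum import Enum, auto

    class Mode(Enum):
        NORMAL = auto()
        DRIVING_SUPPORT = auto()
        MONITORING = auto()

    def set_operation_mode(power_on, monitoring_sw_on, support_sw_on, occupant_present):
        if not power_on:                   # S10: power switch off
            if monitoring_sw_on:           # S20
                return Mode.MONITORING     # S30
            return None                    # process ends, mode unchanged
        if not occupant_present:           # S40: no occupant in the vehicle
            if monitoring_sw_on:           # S50
                return Mode.MONITORING     # S30
            return Mode.NORMAL             # S60
        if support_sw_on:                  # S70
            return Mode.DRIVING_SUPPORT    # S80
        return Mode.NORMAL                 # S60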

  For example, the monitoring switch 46c may be omitted, and the operation mode may be set to the monitoring mode when the power switch 46a is turned off. Further, for example, the driving support switch 46b may be omitted, and the operation mode may be set to the driving support mode when the power switch 46a is turned on and there are passengers in the vehicle.

  In other words, the setting method shown in FIG. 4 is merely an example; which cases result in the normal mode, the driving support mode, or the monitoring mode can be decided as appropriate.

(3-2) Driving Support Process
Next, the driving support process executed by the control unit 31 when the operation mode is set to the driving support mode will be described with reference to FIG. 5.

  When the operation mode is set to the driving support mode, the control unit 31 executes the driving support process of FIG. 5. When the driving support process is started, the control unit 31 activates the camera group for photographing inside and outside the vehicle (that is, the cameras 2 to 7 other than the G box camera 8) and starts photographing in S111. In S112, the shooting data of each activated camera is acquired. In S113, the driving support function described above is realized by executing driving support control based on the acquired image data.

  The driving support control in S113 includes, for example, collision suppression control based on the photographing data of the front camera 2 and the rear camera 3. In the collision suppression control, the vehicle 1 is automatically steered or braked/stopped so that it does not collide with an obstacle ahead of it while it is moving forward, or with an obstacle behind it while it is moving backward. The driving support control also includes, for example, lane change support control based on the shooting data of the side cameras 4 and 5. The lane change support control notifies the driver when there is another vehicle to the side of the vehicle 1 while the driver is changing to the adjacent lane during travel.

  There are various other types of driving support control. The more types of driving support control are executed, and the more advanced they are, the more the driver's driving workload is reduced. Depending on the type and content of the driving support control that can be executed, it is even possible to drive the vehicle 1 to its destination completely automatically, without requiring any driving operation by the driver. However, since the driving support control itself is not directly related to the present invention, a more detailed description of it is omitted.

  The acquisition of shooting data in S112 and the driving support control in S113 based on that data are executed repeatedly while the operation mode is set to the driving support mode. When the operation mode is switched from the driving support mode to another mode, the control unit 31 ends the driving support process of FIG. 5 and stops the cameras activated in S111.
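  Seen as code, the loop of S111 to S113 is straightforward. This sketch uses a hypothetical camera and controller API and is not the patent's implementation.

    def driving_support_process(cameras, support_controller, mode):
        for camera in cameras:            # S111: activate cameras 2-7, start shooting
            camera.start()
        try:
            while mode.is_driving_support():
                frames = {camera.id: camera.capture() for camera in cameras}  # S112
                support_controller.run(frames)                                # S113
        finally:
            for camera in cameras:        # stop the cameras on leaving the mode
                camera.stop()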

  As described above, when the driving support switch 46b is on, the operation mode is set to the driving support mode while the power switch 46a of the vehicle 1 is on and the vehicle 1 is traveling (including when it is stopped but able to travel). During the driving support mode, the driving support process of FIG. 5 is executed, whereby various driving support controls for supporting the driver's operation of the vehicle 1 are carried out.

(3-3) Monitoring Control Process
While the operation mode is set to the monitoring mode, on the other hand, the control unit 31 executes the monitoring control process illustrated in FIG. 6. In short, the monitoring control process of FIG. 6 gives the vehicle 1 a crime prevention function and thereby suppresses various crimes against the vehicle 1.

  The control unit 31 repeatedly executes the monitoring control process of FIG. 6 at a predetermined control period while the operation mode is set to the monitoring mode. When starting the monitoring control process of FIG. 6, the control unit 31 determines in S131 whether or not an unauthorized operation on the vehicle 1 is detected. Examples of unauthorized operations to be detected include unauthorized unlocking, in which a door is opened without the regular key; unauthorized movement, in which the vehicle 1 is moved without the regular key; and acts of external harm to the vehicle 1 (for example, glass breakage or objects striking the vehicle). Unauthorized unlocking can be determined using a detection signal from a door sensor (not shown). Unauthorized movement can be determined using the detection signal from the vehicle speed sensor 54. The presence or absence of a harmful act can be determined using, for example, the detection signal from the impact sensor 23.

  If an unauthorized operation on the vehicle 1 is detected (S131: YES), an unauthorized monitoring process is executed in S132. The fraud monitoring process of S132 is a process for protecting the vehicle 1 from unauthorized operation by photographing the inside and outside of the vehicle 1 with each of the cameras 2 to 7 and generating an alarm or the like. Details of the fraud monitoring process will be described later with reference to FIG. When an unauthorized operation on the vehicle 1 is not detected (S131: NO), it is determined whether or not a person or an animal is detected around the vehicle 1 in S133.

  The presence or absence of a person or an animal can be determined based on the detection signals from each of the four infrared sensors 11 to 14. When a person or animal is detected (S133: YES), the specific monitoring process is executed in S134.

  The specific monitoring process of S134 photographs the inside and outside of the vehicle 1 with the cameras 2 to 7 and, when a suspicious person may be approaching the vehicle 1 or an animal is approaching it, works to deter that approach. Details of the specific monitoring process will be described later with reference to FIG. 9. If no person or animal has been detected (S133: NO), it is determined in S135 whether or not the regular shooting timing has arrived.

  In the present embodiment, even if no unauthorized operation is detected and no approach of a person or animal is detected, the inside and outside of the vehicle 1 are photographed by the camera group for shooting inside and outside the vehicle at regular intervals (at every regular shooting timing of a predetermined cycle). The shooting data is retained, and whether or not an abnormality has occurred is monitored based on it.

  If the regular shooting timing has arrived (S135: YES), the regular monitoring process is executed in S136. The regular monitoring process of S136 photographs the inside and outside of the vehicle 1 with the cameras 2 to 7 and, when a specific event occurs in or around the vehicle 1, takes an appropriate response according to that event. Details of the regular monitoring process will be described later with reference to FIG. 10. If the regular shooting timing has not arrived (S135: NO), the monitoring control process ends.
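  Putting the three branches together, one pass of the monitoring control process of FIG. 6 reduces to the dispatch below; 'vehicle' is a hypothetical facade over the sensors and sub-processes just described.

    def monitoring_control_process(vehicle):
        # Executed repeatedly at the predetermined control period.
        if vehicle.unauthorized_operation_detected():    # S131
            vehicle.run_fraud_monitoring()               # S132 (FIG. 7)
        elif vehicle.person_or_animal_detected():        # S133
            vehicle.run_specific_monitoring()            # S134 (FIG. 9)
        elif vehicle.regular_shooting_timing_reached():  # S135
            vehicle.run_regular_monitoring()             # S136 (FIG. 10)
        # otherwise the process simply ends until the next period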

(3-3-1) Fraud Monitoring Process
The fraud monitoring process of S132 will be described with reference to FIG. 7. When the process proceeds to S132 in the monitoring control process of FIG. 6, the control unit 31 executes the fraud monitoring process of FIG. 7. On entering the fraud monitoring process, in S151 the control unit 31 activates the camera group for photographing inside and outside the vehicle. In S152, recording of the shooting data (here, a moving image) of each activated camera is started. "Recording" here means temporarily storing the data, for example in RAM, among the plural types of storage media constituting the memory 32.

  In S153, the registered address is notified by e-mail that an unauthorized operation has been performed on the vehicle 1. The destination address can be registered in the memory 32 in advance. In S153, an e-mail notifying of the unauthorized operation is transmitted to the registered address via the first wireless communication unit 63 or the second wireless communication unit 64. The recipient of the mail (for example, the owner of the vehicle 1) can thereby learn that an unauthorized operation has been performed on the vehicle 1.

  In S154, an unauthorized operation flag is set. In S155, the alarm generation unit 58 is operated to generate an alarm sound outside the vehicle 1. Further, in S156, the four blinker lights 57 are activated (for example, blinking). The processes in S155 and S156 are processes for threatening a person who performs an unauthorized operation and stopping the unauthorized operation.

  In S157, it is determined whether or not the unauthorized operation is released. If the unauthorized operation has not been released (still continued) (S157: NO), the process returns to S155, and the processes of S155 to S156 are continued. When the unauthorized operation is released (S157: YES), the operation of the alarm generation unit 58 is stopped in S158, and the operation of the blinker light is stopped in S159.

  In S160, the control unit waits for a certain period of time while continuing to shoot with the cameras 2 to 7 constituting the camera group for shooting inside and outside the vehicle. In S161, the operation of the cameras 2 to 7 is stopped, and the recording of their shooting data ends. In S162, the recorded shooting data is stored in the memory 32 as unauthorized-operation moving image data, specifically in a non-volatile storage medium such as a hard disk drive or flash memory among the plural types of storage media constituting the memory 32.

Note that when recording of shooting data is started in S152, the shooting data may be stored in a non-volatile storage medium from the beginning.
In S163, the unauthorized-operation moving image data stored in the non-volatile storage medium in S162 is uploaded to the server 73, together with its attribute information. The attribute information includes at least the shooting location (longitude/latitude information), shooting direction, shooting date/time, data type (information indicating whether it is a moving image or a still image), file name, operation mode of the vehicle 1 at the time of shooting, and shooting factor. When unauthorized-operation moving image data is uploaded by the fraud monitoring process of FIG. 7, the shooting factor included in the attribute information is "illegal operation".

  Shooting data is uploaded to the server 73 not only in the fraud monitoring process of FIG. 7 but also in various other processes described later. In those cases too, attribute information is added to the shooting data before uploading.
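  The attribute information enumerated above maps naturally onto a small record. The keys and helper objects in this sketch are assumptions, not the patent's format.

    def build_attribute_info(gps, direction_sensor, rtc, mode, factor, file_name, is_movie):
        latitude, longitude = gps.current_position()
        return {
            "location": {"latitude": latitude, "longitude": longitude},  # shooting location
            "direction": direction_sensor.heading(),                     # shooting direction
            "datetime": rtc.now().isoformat(),                           # shooting date/time
            "data_type": "movie" if is_movie else "still",               # moving or still image
            "file_name": file_name,
            "operation_mode": mode.name,      # operation mode of vehicle 1 at shooting
            "factor": factor,                 # e.g. "illegal operation", "normal"
        }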

  When receiving image data from various vehicles including the vehicle 1, the server 73 builds a database of the image data based on the attribute information. Specifically, the server 73 holds a database of shooting data as illustrated in FIG. 8; that is, the server 73 collects image data shot by various vehicles including the vehicle 1. In the vehicle 1 as well, each piece of shooting data shot by the vehicle 1 may be stored in the memory 32 as a database like that of FIG. 8.

  The shooting data collected in the server 73 in this way can, as described later, be selected and downloaded as appropriate at each vehicle or at a communication terminal or the like, and the shot images can be viewed.

(3-3-2) Specific Monitoring Process
Next, the specific monitoring process of S134 in FIG. 6 will be described with reference to FIG. 9. When the process proceeds to S134 in the monitoring control process of FIG. 6, the control unit 31 executes the specific monitoring process of FIG. 9. On entering the specific monitoring process, in S181 the control unit 31 activates the camera group for photographing inside and outside the vehicle. In S182, recording of the shooting data (here, a moving image) of each activated camera is started.

  In S183, it is determined whether the detection target detected by the infrared sensor is a human or an animal. If the detection target is a person, the process proceeds to S184. If the detection target is an animal, the process proceeds to S194.

  In S184, to which the process proceeds when the detection target is a person, it is determined whether an unauthorized operation on the vehicle 1 is detected. This determination is the same as S131 in FIG. 6. If an unauthorized operation is detected (S184: YES), the fraud monitoring process (detailed in FIG. 7) is executed in S200. If no unauthorized operation is detected (S184: NO), it is determined in S185 whether or not the detected person may be a suspicious person.

  Whether or not a person may be a suspicious person can be determined by various methods. For example, the person may be judged suspicious when the behavior of a person captured by any of the cameras 2 to 7 is analyzed and shows a specific pattern.

  If there is no possibility that the detected person is a suspicious person (S185: NO), the process proceeds to S189. If there is such a possibility (S185: YES), a mail notification is made in S186. This mail notification is the same as S153 in FIG. 7. However, the mail notification is executed only once (the first time only) between the time it is determined in S185 that there may be a suspicious person and the time the negative determination (that is, that no person is detected) is made in S190.

  In S187, a suspicious person flag is set. In S188, the lamp is operated. Specifically, at least one of the headlight 56 and the blinker light 57 is turned on (or blinked) to threaten the suspicious person.

  In S189, it is determined whether a person is continuously detected by the infrared sensor. If human detection by the infrared sensor is continued (S189: YES), the process returns to S185. When a person is no longer detected by the infrared sensor (S189: NO), the lamp operated in S188 is stopped in S190.

Note that the processing of S186 to S188 may be performed whenever a person is detected, without performing the determination of S185 (the determination of the possibility of a suspicious person).
In S191, the operation of the cameras 2 to 7 is stopped, and the recording of their shooting data ends. In S192, the recorded shooting data is stored in the memory 32 as human/animal-approach moving image data, as in S162 of FIG. 7. Note that when recording of shooting data is started in S182, the shooting data may be stored in a non-volatile storage medium from the beginning.

  In S193, the human/animal-approach moving image data stored in the non-volatile storage medium in S192 is uploaded to the server 73, with its attribute information added as in S163 of FIG. 7. When human/animal-approach moving image data is uploaded by the specific monitoring process of FIG. 9, the shooting factor included in the attribute information is "proximity of human/animal".

  In S194, to which the process proceeds when the detection target is an animal, the ultrasonic generation unit 59 is operated to generate ultrasonic waves around the vehicle 1. In S195, the lamps are operated in the same manner as in S188. The processing of S194 and S195 serves to drive an animal approaching the vehicle 1 away from it.

  In S196, it is determined whether or not an animal is still detected by the infrared sensor. While the detection continues (S196: YES), the determination of S196 is repeated. When no animal is detected any longer (S196: NO), the lamps operated in S195 are stopped in S197, and in S198 the generation of ultrasonic waves from the ultrasonic generation unit 59 operated in S194 is stopped. In S199, the animal proximity flag is set, and the process proceeds to S191 and the subsequent steps.
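  Condensed into code, the person/animal branches of FIG. 9 look roughly like this. The facade methods are invented, and the while loops stand in for the flowchart's repeated determinations.

    def specific_monitoring_process(vehicle):
        vehicle.cameras.start()                            # S181
        vehicle.cameras.start_recording()                  # S182
        if vehicle.infrared_target_is_person():            # S183
            if vehicle.unauthorized_operation_detected():  # S184
                vehicle.run_fraud_monitoring()             # S200 (FIG. 7), which does
                return                                     # its own storing and upload
            while vehicle.infrared_person_detected():      # S189
                if vehicle.person_may_be_suspicious():     # S185
                    vehicle.mail_notify_once()             # S186 (first time only)
                    vehicle.set_flag("suspicious_person")  # S187
                    vehicle.lamps_on()                     # S188
            vehicle.lamps_off()                            # S190
        else:                                              # S183: target is an animal
            vehicle.ultrasonic_on()                        # S194
            vehicle.lamps_on()                             # S195
            while vehicle.infrared_animal_detected():      # S196: repeat until gone
                pass
            vehicle.lamps_off()                            # S197
            vehicle.ultrasonic_off()                       # S198
            vehicle.set_flag("animal_proximity")           # S199
        vehicle.cameras.stop()                             # S191
        clip = vehicle.cameras.stop_recording()
        vehicle.store(clip)                                # S192
        vehicle.upload(clip, factor="proximity of human/animal")  # S193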

(3-3-3) Regular Monitoring Process
Next, the regular monitoring process of S136 in FIG. 6 will be described with reference to FIG. 10. When the process proceeds to S136 in the monitoring control process of FIG. 6, the control unit 31 executes the regular monitoring process of FIG. 10. On entering the regular monitoring process, in S211 the control unit 31 activates the camera group for photographing inside and outside the vehicle. In S212, shooting data (here, still images) is acquired from each activated camera; the number of images to acquire from each camera can be decided as appropriate. For example, one image may be acquired, or a plurality of images may be shot at a predetermined shooting interval.

  In S213, it is determined whether or not another vehicle has been recognized from the shooting data acquired in S212 (that is, whether or not there is shooting data showing the other vehicle). When the other vehicle is recognized (S213: YES), the photographed vehicle (that is, the other vehicle in the photographed data) is collated with the photographed vehicle database stored in the memory 32 in S214.

  The photographed vehicle database is a database for determining whether or not a photographed other vehicle is a regular vehicle. For example, the owner of the vehicle 1 can register in advance, in the photographed vehicle database, regular vehicles (in other words, vehicles that need not be regarded as suspicious), such as vehicles parked in the neighborhood of the home, carriers' vehicles, and motorcycles that deliver mail or newspapers.

  A specific example of the photographed vehicle database is shown in FIG. 11. As shown in FIG. 11, the vehicle type, color, license plate number, latest shooting date and time, and the like are registered for each vehicle in the database. The owner of the vehicle 1 may register regular vehicles in the database, or the system may be configured so that the control unit 31 performs a predetermined learning process and registers a vehicle as a regular vehicle when it comes to regard it as such.

  In S214, the photographed vehicle database illustrated in FIG. 11 is collated with the other vehicle recognized from the shooting data acquired in S212; that is, it is determined whether a vehicle corresponding to the recognized other vehicle is registered in the photographed vehicle database.
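  A row of the photographed vehicle database of FIG. 11 and the collation of S214 could be sketched as follows; the field names are assumed from the figure's description.

    from dataclasses import dataclass

    @dataclass
    class RegisteredVehicle:
        vehicle_type: str   # e.g. "sedan"
        color: str
        plate_number: str
        last_seen: str      # latest shooting date and time

    def collate(recognized, database):
        # S214: the recognized other vehicle is checked against every
        # registered regular vehicle; the result feeds the S215 decision.
        return any(
            entry.vehicle_type == recognized.vehicle_type
            and entry.color == recognized.color
            and entry.plate_number == recognized.plate_number
            for entry in database
        )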

  In S215, it is determined whether or not the collation result of S214 is OK, that is, whether or not the corresponding vehicle is registered. If the collation result is OK (the corresponding vehicle is registered) (S215: YES), the process proceeds to S229. In S229, the shooting data of the cameras 2 to 7 is stored in the memory 32 as regular shooting data, as in S162 of FIG. 7. Note that the shooting data may be stored in a non-volatile storage medium at the time it is acquired in S212.

  In S230, the regular shooting data stored in the non-volatile storage medium in S229 is uploaded to the server 73, with its attribute information added as in S163 of FIG. 7. When regular shooting data is uploaded in S230, the shooting factor included in the attribute information is "normal". After the upload in S230, the cameras 2 to 7 activated in S211 are stopped in S221.

  If the collation result is not OK in S215 (the corresponding vehicle is not registered) (S215: NO), then in S216 the shooting data showing the other vehicle is stored in the memory 32 as non-regular vehicle shooting data, in the same way as in S229. In S217, the non-regular vehicle shooting data stored in the non-volatile storage medium in S216 is uploaded to the server 73, with its attribute information added as in S230. When non-regular vehicle shooting data is uploaded in S217, the shooting factor included in the attribute information is "non-regular vehicle". After the upload in S217, a non-regular vehicle flag is set in S218.

  In S219, the shooting data other than the non-regular vehicle shooting data is stored in the memory 32 as regular shooting data, in the same manner as in S229. In S220, the regular shooting data stored in the non-volatile storage medium in S219 is uploaded to the server 73, with its attribute information added as in S230; the shooting factor in this case is "normal". After the upload in S220, the cameras 2 to 7 activated in S211 are stopped in S221.

  If no other vehicle is recognized in S213 (S213: NO), it is determined in S222 whether or not an abnormal state is recognized in the shooting data. The abnormal state here includes at least a state in which animal feces are attached to the vehicle 1 (hereinafter the "feces attached state") and a state in which the amount of water at the stop position of the vehicle 1 has increased to the point where there is a risk of inundation damage to the vehicle 1 (hereinafter the "water increase state").

  The feces attached state can be determined by analyzing the shooting data and recognizing an object corresponding to feces. The water increase state can be determined mainly from the shooting data of the vehicle bottom camera 7. For example, an image of the wheel 66 of the vehicle 1 taken by the vehicle bottom camera 7 is compared with an image (reference image) of the wheel 66 taken by the same camera when the road was dry, and the water level can be estimated from the difference. When the estimated water level exceeds a predetermined threshold, the water increase state can be determined.
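  As a sketch, the water-level check compares how much of the wheel 66 remains visible against the dry-road reference; visible_wheel_height() is a hypothetical image-processing helper that the patent does not specify.

    def visible_wheel_height(image) -> float:
        # Hypothetical helper: estimated height (e.g., in pixels) of the
        # visible portion of wheel 66 in a vehicle bottom camera 7 image.
        raise NotImplementedError

    def water_increase_state(current_image, reference_image, threshold_px: float) -> bool:
        # The deeper the water, the less of the wheel is visible compared
        # with the dry-road reference image.
        submerged_px = visible_wheel_height(reference_image) - visible_wheel_height(current_image)
        return submerged_px > threshold_px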

  If no abnormal state is recognized in S222 (S222: NO), the process proceeds to S229. If an abnormal state is recognized (S222: YES), the shooting data in which the abnormal state was recognized is stored in the memory 32 in S223 as abnormal-state shooting data, in the same manner as in S229. In S224, the abnormal-state shooting data stored in the non-volatile storage medium in S223 is uploaded to the server 73, with its attribute information added as in S230. When abnormal-state shooting data is uploaded in S224, the shooting factor included in the attribute information is "occurrence of abnormality".

  After the upload in S224, an abnormality flag is set in S225. In S226, a mail notification that an abnormality has occurred is made, in the same manner as S153 in FIG. 7.

  In S227, the shooting data other than the abnormal-time shooting data is stored in the memory 32 as regular shooting data, in the same manner as in S229. In S228, the regular shooting data stored in the nonvolatile storage medium in S227 is uploaded to the server 73, in the same manner as in S230. After the upload in S228, the cameras 2 to 7 activated in S211 are stopped in S221.

(3-4) Monitoring Result Notification Process Next, the monitoring result notification process will be described with reference to FIG. 12. The monitoring result notification process of FIG. 12 executes, when the power switch 46a of the vehicle 1 is turned on, the process corresponding to whichever of the above flags is set. The control unit 31 executes the monitoring result notification process of FIG. 12 once each time the power switch 46a is turned on after activation.

  When the monitoring result notification process of FIG. 12 is started by the power switch 46a of the vehicle 1 being turned on, the control unit 31 determines in S251 whether any flag is set. The flags that may be set here include at least the unauthorized operation flag, the suspicious person flag, the animal proximity flag, the non-regular vehicle flag, and the abnormality flag.

  If no flag is set (S251: NO), a normal notification process is executed in S253. The normal notification process notifies the driver of the vehicle 1 that no abnormality or other event occurred during parking; for example, a message to that effect may be displayed on the display unit 47. After the normal notification process in S253, all flags are cleared in S254, and the monitoring result notification process ends.

  If any flag is set in S251 (S251: YES), a notification process corresponding to the set flag is executed in S252. This notification process notifies the driver of the vehicle 1 that an event corresponding to the set flag has occurred; for example, a message indicating that event (for instance, that a suspicious person approached while parking, when the suspicious person flag is set) may be displayed on the display unit 47. After the notification process in S252, all flags are cleared in S254, and the monitoring result notification process ends.
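  The flag-to-notification branch of S251 to S254 can be pictured as a simple lookup followed by clearing the flags. The flag names and message texts below are illustrative assumptions; the patent specifies only that a message corresponding to the set flag is displayed on the display unit 47.

```python
FLAG_MESSAGES = {
    "unauthorized_operation": "An unauthorized operation was detected while parking.",
    "suspicious_person": "A suspicious person approached while parking.",
    "animal_proximity": "An animal approached while parking.",
    "non_regular_vehicle": "An unregistered vehicle was photographed while parking.",
    "abnormality": "An abnormal state occurred while parking.",
}

def notify_monitoring_result(flags: set) -> list:
    """Return the messages to show on the display unit, then clear all flags."""
    if not flags:  # S253: normal notification
        messages = ["No abnormality occurred during parking."]
    else:          # S252: one message per set flag
        messages = [FLAG_MESSAGES[f] for f in sorted(flags) if f in FLAG_MESSAGES]
    flags.clear()  # S254: all flags are cleared
    return messages
```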

(3-5) Glove Box Monitoring Process Next, the glove box monitoring process will be described with reference to FIG. 13. The glove box monitoring process of FIG. 13 photographs and stores at least the entire internal area of the G box 41 while the lid 42 of the G box 41 is open. By executing the glove box monitoring process, the security of the G box 41 is enhanced, and the G box 41 can be used, for example, as a storage for valuables.

  After activation, the control unit 31 repeatedly executes the glove box monitoring process of FIG. 13 at a predetermined control cycle. When the glove box monitoring process of FIG. 13 is started, the control unit 31 determines in S271 whether the lid 42 of the G box 41 is open. If the lid 42 is closed (S271: NO), the glove box monitoring process ends.

  If the lid 42 is open (S271: YES), the G box camera 8 is activated in S272. In S273, recording of the shooting data (here, a moving image) of the activated G box camera 8 is started. In S274, as in S271, it is determined whether the lid 42 of the G box 41 is open. While the lid 42 remains open (S274: YES), the determination process of S274 is repeated.

  When the lid 42 is closed (S274: NO), the operation of the G box camera 8 is stopped and recording of its shooting data ends in S275. In S276, the recorded shooting data of the G box camera 8 is stored in the memory 32 as G box moving image data, in the same manner as in S162 of FIG. 7. Note that when recording is started in S273, the shooting data may be stored in a nonvolatile storage medium from the beginning.

  In S277, the G box moving image data stored in the nonvolatile storage medium in S276 is uploaded to the server 73. At that time, as in S163 of FIG. 7, the attribute information of the G box moving image data is added to the upload. When the G box moving image data is uploaded by the glove box monitoring process of FIG. 13, the shooting factor included in the attribute information is "G box open".
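  The S271 to S277 flow amounts to a small record-while-open loop. The sketch below uses hypothetical stand-ins (lid_is_open, camera, store, upload) for the lid sensor of the G box 41, the G box camera 8, the memory 32, and the server 73; it is an illustration of the flow, not the specified implementation.

```python
import time

def monitor_glove_box(lid_is_open, camera, store, upload, poll_interval=0.1):
    if not lid_is_open():                        # S271
        return
    camera.start()                               # S272
    camera.start_recording()                     # S273
    while lid_is_open():                         # S274
        time.sleep(poll_interval)
    video = camera.stop_recording()              # S275
    store(video)                                 # S276: G box moving image data
    upload(video, shooting_factor="G box open")  # S277
```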

(3-6) Data Amount Adjustment Process Next, the data amount adjustment process will be described with reference to FIG. 14. The data amount adjustment process of FIG. 14 suppresses the growth of the storage capacity consumed by the various types of shooting data stored in the memory 32. The control unit 31 repeatedly executes the data amount adjustment process of FIG. 14 at a predetermined control cycle after activation.

  When the data amount adjustment process of FIG. 14 is started, the control unit 31 acquires, for each piece of shooting data stored in the memory 32, the shooting date and time and the number of executions n of the data amount reduction process in S291.

  The data amount reduction process reduces the data amount of the shooting data with a predetermined algorithm. For a moving image, for example, an algorithm that thins out the frames constituting the moving image, thereby reducing the resulting frame rate, is conceivable, as sketched below. For a still image, an algorithm that reduces the data size by shrinking the image with a known image processing method is conceivable. Of course, these are merely examples, and other algorithms may be used as long as the data amount can be reduced while the contents of the shooting data remain recognizable.
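  For the moving-image case, frame thinning can be as simple as keeping every k-th frame, which divides the effective frame rate by k. A minimal sketch, assuming the clip is held as a list of frames:

```python
def thin_frames(frames: list, keep_every: int = 2) -> list:
    """Keep one frame out of every `keep_every`, reducing the frame rate."""
    return frames[::keep_every]

# Example: a 30 fps clip thinned with keep_every=2 plays back as an effective 15 fps.
```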

  The number of executions n of the data amount reduction process indicates how many times the process has been applied to the same piece of shooting data. Starting from the shooting date and time, the data amount reduction process is repeated for each piece of shooting data every time a certain number of days elapses. For example, the data amount may be reduced by 10% at a time, so that each time the certain number of days elapses, the data amount at that point is reduced by a further 10%. However, if a further reduction would bring the data amount below the minimum amount at which the contents of the shooting data can still be grasped, no further data amount reduction is performed.

  In S292, it is determined whether there is shooting data for which the data amount reduction process should be executed. For example, if there is shooting data whose execution count n is still 0 even though the certain number of days has elapsed since the shooting date and time, it is determined that the data amount reduction process should be executed on that shooting data. Likewise, if n is 2 even though three times the certain number of days has elapsed since the shooting date and time, it is determined that the process should be executed. Conversely, if five times the certain number of days has elapsed since the shooting date and time and n is 5, the data amount reduction process has already been executed five times (the number corresponding to the elapsed days), so it is determined that no further reduction is necessary.

  If there is no shooting data to be subjected to the data amount reduction process (S292: NO), the data amount adjustment process of FIG. 14 ends. If there is such shooting data (S292: YES), the data amount reduction process is executed in S293 for each piece of shooting data determined to need it, reducing its data amount. In S294, the execution count n is incremented by 1 for each piece of shooting data for which the process was executed, and the data amount adjustment process ends.

  In this way, by reducing the data amount in steps as days elapse from the shooting date, the share of the storage area of the memory 32 occupied by shooting data is kept down while the information indicated by the shooting data is retained.
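  The scheduling rule of S291 to S294 can be summarized as: a reduction pass is due whenever the execution count n lags behind the number of elapsed fixed-day periods since the shooting date. The sketch below assumes a 7-day period and the 10% step mentioned above; both the period length and the minimum size are illustrative assumptions.

```python
from datetime import datetime

FIXED_DAYS = 7        # assumed length of the "certain number of days"
REDUCTION_STEP = 0.9  # reduce the current data amount by 10% each time

def reductions_due(shot_at: datetime, n: int, now: datetime) -> int:
    """How many reduction passes the data should have had but has not (S292)."""
    elapsed_periods = (now - shot_at).days // FIXED_DAYS
    return max(0, elapsed_periods - n)

def apply_reductions(size: int, n_due: int, min_size: int) -> int:
    """Apply the due reductions (S293), stopping at the minimum data amount."""
    for _ in range(n_due):
        new_size = int(size * REDUCTION_STEP)
        if new_size < min_size:  # contents would no longer be recognizable
            break
        size = new_size
    return size
```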

(3-7) Emergency Recording Process Next, the emergency recording process will be described with reference to FIG. 15. The emergency recording process of FIG. 15 is executed when the operation mode is set to a mode other than the monitoring mode (in this embodiment, the driving support mode and the normal mode). The control unit 31 executes the emergency recording process of FIG. 15 at a predetermined control cycle after activation.

  When the emergency recording process of FIG. 15 is started, the control unit 31 determines in S311 whether an emergency event has occurred. In this embodiment, the emergency events include at least the driver suddenly and strongly gripping the handle 20, sudden braking, and an impact of a predetermined level or more being applied to the vehicle 1.

  That the driver has suddenly and strongly gripped the handle 20 can be determined from the rate of change of the gripping force based on the detection signal from the handle pressure sensor 22. That sudden braking has been applied can be determined based on the detection signal from the brake sensor 25. That an impact of a predetermined level or more has been applied to the vehicle 1 can be determined based on the detection signal from the impact sensor 23. These emergency events are events that may place the vehicle 1 in a dangerous state; in other words, any event that may place the vehicle 1 in a dangerous state can be set as an emergency event as appropriate.

  If no emergency event has occurred in S311 (S311: NO), the emergency recording process ends. If an emergency event has occurred (S311: YES), the camera group for shooting inside and outside the vehicle is activated in S312, and recording of the shooting data (here, moving images) of the activated cameras 2 to 7 is started.

  In S313, it is determined whether the emergency event is ongoing. While the emergency event continues, the determination process of S313 is repeated. When the emergency event ends (S313: NO), shooting by the cameras 2 to 7 constituting the camera group for shooting inside and outside the vehicle is continued for a certain time in S314. In S315, the operation of the cameras 2 to 7 is stopped, and recording of their shooting data ends. In S316, each piece of recorded shooting data is stored in the memory 32 as emergency moving image data.

  The emergency moving image data may also be uploaded to the server 73. In this case, the shooting factor included in the attribute information at the time of upload is "emergency event occurrence" (see FIG. 8).
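  The S311 to S316 flow differs from the other recording loops in that shooting continues for a fixed time after the event ends. A sketch under the same kind of hypothetical interfaces as before; the post-event duration is an illustrative assumption:

```python
import time

POST_EVENT_SECONDS = 10.0  # assumed length of the "certain time" in S314

def emergency_recording(event_active, cameras, store, poll_interval=0.1):
    if not event_active():            # S311: no emergency event
        return
    cameras.start_recording()         # S312
    while event_active():             # S313: repeat while the event lasts
        time.sleep(poll_interval)
    time.sleep(POST_EVENT_SECONDS)    # S314: keep shooting a little longer
    clips = cameras.stop_recording()  # S315
    store(clips)                      # S316: emergency moving image data
```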

(3-8) Driver Determination Process Next, the driver determination process will be described with reference to FIG. 16. The driver determination process of FIG. 16 is executed when the operation mode is set to a mode other than the monitoring mode (in this embodiment, the driving support mode and the normal mode). The control unit 31 executes the driver determination process of FIG. 16 at a predetermined control cycle after activation.

  When the driver determination process of FIG. 16 is started, the control unit 31 starts photographing the driver's face with the indoor camera 6 in S331. In S332, it is determined whether the photographed driver's face has already been collated (that is, whether the processing of S333 to S334 has been executed). If it has already been collated (S332: YES), the driver determination process ends. If it has not yet been collated (S332: NO), the photographed driver's face is collated with the driver database in S333.

  The driver database is used to determine whether the photographed driver is a regular driver. For example, the owner of the vehicle 1 can register in advance, as regular drivers, face images of the persons permitted to drive the vehicle 1, such as the owner's own face or the faces of family members. The registration may be done, for example, by the owner photographing a face and registering the shooting data, or the control unit 31 may perform a predetermined learning process and register the face of a driver it comes to regard as a regular driver.
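  The collation of S333 can be pictured as comparing the photographed face against each registered face. The patent does not specify a matching algorithm; the sketch below assumes faces are reduced to feature vectors (embeddings) and matched by Euclidean distance, with an illustrative threshold.

```python
import numpy as np

def is_regular_driver(face_embedding: np.ndarray,
                      driver_database: list,
                      threshold: float = 0.6) -> bool:
    """True when the photographed face matches any registered regular driver."""
    for registered in driver_database:
        if np.linalg.norm(face_embedding - registered) < threshold:
            return True  # collation OK (S334: YES)
    return False         # collation NG (S334: NO)
```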

  In S334, it is determined whether or not the collation result in S333 is OK, that is, whether or not the photographed driver's face is registered in the driver database. If the collation result is OK (S334: YES), the driver determination process is terminated.

  If the collation result is not OK in S334 (S334: NO), the shooting data of the photographed driver is stored in the memory 32 as non-regular driver shooting data in S335. In S336, a non-regular driver flag is set. In S337, a mail notification is sent to report that a non-regular driver has boarded the vehicle 1. This mail notification is performed in the same manner as in S153.

  Note that the non-regular driver shooting data stored in S335 may be uploaded to the server 73. In this case, the shooting factor included in the attribute information at the time of upload is "driver verification NG" (see FIG. 8).

(3-9) Shooting Data Acquisition Process Next, the shooting data acquisition process will be described with reference to FIG. 17. The shooting data acquisition process of FIG. 17 can be executed by the control unit 31 of the vehicle 1 and, by installing software for the shooting data acquisition process, can also be executed on various communication terminals capable of data communication with the server 73, such as a portable terminal or a personal computer.

  The shooting data acquisition process of FIG. 17 selectively downloads from the server 73, for browsing, shooting data that has been uploaded from various vehicles and stored in a database in the server 73. That is, the various communication terminals on which the software for the shooting data acquisition process of FIG. 17 is installed, as well as the control device 30 of the vehicle 1, can download the shooting data stored in the server 73.

  Here, as an example, the contents of the process of FIG. 17 will be described on the assumption that the software for the shooting data acquisition process is installed on the portable terminal 77, which can perform data communication with the server 73 via the Internet 70.

  When the portable terminal 77 (specifically, its internal CPU) starts the shooting data acquisition process of FIG. 17 in response to the user's activation operation, it is determined in S351 whether a real-time image has been designated by the user. In the shooting data acquisition process of the present embodiment, not only can shooting data already stored in the server 73 (hereinafter also referred to as "shot data") be acquired; by designating a vehicle or a location, the data currently being shot by the designated vehicle, or by a vehicle present at the designated location (hereinafter also referred to as "real-time data"), can also be acquired in real time.

  When the user wants to acquire real-time data, a vehicle or a location must be designated. For example, to acquire in real time the shooting data shot by the cameras of the vehicle 1, the vehicle 1 must be designated to the server 73. Various vehicle designation methods are conceivable. For example, if an address for data communication is set for each vehicle, a specific vehicle can be designated to the server 73 by specifying that address. Alternatively, if the license plate information of each vehicle is registered in the server 73 in association with the vehicle's data communication address, specifying the license plate information allows the server 73 to perform data communication with the corresponding vehicle.

  When the user wants to acquire in real time not the shot image of a specific vehicle but a shot image of a specific location, the location must be designated. Various designation methods are conceivable: for example, inputting an address, or displaying a map on the portable terminal 77 and designating a specific place in the displayed map (for example, by a tap operation) so that information on the designated place is transmitted to the server 73.

  On the other hand, when the user wants to download specific shot data from among the shot data already uploaded to and stored in the server 73, a vehicle designation or a scene designation must be made to the server 73. The vehicle designation specifies to the server 73 which vehicle's shot data is to be downloaded. The scene designation specifies to the server 73 when and where the shot data was taken. If a scene is designated, the shot data shot at the designated date, time, and place can be downloaded from among the large amount of shot data stored in the server 73.
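  The two request commands and their designations can be pictured as small records. The field names below are illustrative assumptions; the patent describes what information is carried (a vehicle, a location, or a scene), not a concrete message format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RealTimeDataRequest:                 # transmitted in S352
    vehicle_address: Optional[str] = None  # vehicle designation (data communication address)
    location: Optional[str] = None         # location designation (address or map point)

@dataclass
class ShotDataRequest:                     # transmitted in S353
    vehicle_address: Optional[str] = None  # which vehicle's uploads to list
    scene_datetime: Optional[str] = None   # scene designation: when it was shot
    scene_place: Optional[str] = None      # scene designation: where it was shot
```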

  If a real-time image is designated by the user in S351 (S351: YES), a real-time data request command is transmitted to the server 73 in S352. At that time, if the user has designated a vehicle, vehicle designation information indicating the designated vehicle is added to the transmission; if the user has designated a location, location designation information indicating the designated location is added.

  If shot data is designated by the user in S351 (S351: NO), a shot data request command is transmitted to the server 73 in S353. At that time, if the user has designated a vehicle, vehicle designation information indicating the designated vehicle is added to the transmission; if the user has designated a scene, scene designation information indicating the designated scene (date, time, and place) is added.

  In S354, the data list transmitted from the server 73 in response to the request command transmitted in S352 or S353 is received and displayed. For example, when a real-time data request command with a location designation has been transmitted in S352, detailed position and traveling direction information of the vehicles present at the designated location is received as the data list. When a real-time data request command with a vehicle designation has been transmitted in S352, detailed position and traveling direction information of the designated vehicle is received as the data list.

  Similarly, when a shot data request command with a vehicle designation has been transmitted in S353, information on the shooting data shot by the designated vehicle and uploaded to the server 73 is received as the data list. When a shot data request command with a scene designation has been transmitted in S353, information on the shooting data shot in the designated scene (date, time, and place) and uploaded to the server 73 is received as the data list.

  In S355, the user's selection operation on the data list received from the server 73 and displayed is accepted, and it is determined whether the selection operation is complete. The determination process of S355 is repeated until the selection operation is complete. When the selection operation is complete (S355: YES), selection information indicating the selected data is transmitted to the server 73 in S356. For example, when a real-time data request command with a location designation was transmitted in S352 and the detailed position and traveling direction information of the vehicles present at the designated location was received from the server 73 as the data list, the user needs to select a specific vehicle from the data list; in this case, information indicating the vehicle selected by the user is transmitted as the selection information.

  When a real-time data request command with a vehicle designation was transmitted in S352 and the detailed position and traveling direction information of the designated vehicle was received from the server 73 as the data list, only the data list of the designated vehicle is displayed. In this case, the user needs to confirm the current position and traveling direction of the designated vehicle from the data list and, if there is no problem (that is, if the shooting data of that vehicle is desired), perform a selection confirmation operation to that effect; information indicating that the selection has been confirmed is then transmitted as the selection information. Depending on the current position and traveling direction of the designated vehicle, the user may not want its shooting data after all; in that case, a selection cancellation operation indicating that the shooting data of the designated vehicle is unnecessary is performed, and information indicating that the selection has been cancelled is transmitted as the selection information.

  When a shot data request command with a vehicle designation was transmitted in S353 and information on the shot data uploaded from the designated vehicle was received from the server 73 as the data list, the user needs to select specific shooting data from the list; in this case, information indicating the shooting data selected by the user is transmitted as the selection information.

  After the selection information is transmitted in S356, the shooting data corresponding to the selection information is received from the server 73 in S357. In S358, a predetermined shooting data display process for displaying the received shooting data is executed, allowing the user to view the received shooting data.

(4) Description of Server Transmission Process Next, the server transmission process executed in the server 73 will be described with reference to FIG. 18. When the server transmission process of FIG. 18 is executed in the server 73, vehicles and various communication terminals can download shooting data from the server 73 via the shooting data acquisition process of FIG. 17.

  That is, the server transmission process of FIG. 18 transmits the requested shooting data to the request source in response to a shooting data request from a vehicle or a communication terminal. The server 73 repeatedly executes the server transmission process of FIG. 18 at a predetermined cycle.

  When the server 73 starts the server transmission process of FIG. 18, it determines in S371 whether it has received a request command from a vehicle or a communication terminal (one on which the shooting data acquisition process of FIG. 17 is executed). Specifically, it is determined whether either a real-time data request command (transmitted in S352 of FIG. 17) or a shot data request command (transmitted in S353 of FIG. 17) has been received. If no request command has been received (S371: NO), the server transmission process ends.

  If a request command has been received (S371: YES), the request target is determined in S372; specifically, it is determined whether real-time data or shot data is requested. If a real-time data request command was received, it is determined that real-time data is requested, and the process proceeds to S373. If a shot data request command was received, it is determined that shot data is requested, and the process proceeds to S378.

  In S373, a vehicle that can provide the designated image (real-time data) is searched for using road-to-vehicle communication. If a vehicle has been designated, the designated vehicle is searched for; specifically, information on its current position and traveling direction is acquired by data communication with the designated vehicle via the network. If a location has been designated, vehicles present at the designated location are searched for. The vehicle search may also be performed using data communication other than road-to-vehicle communication.

  In S374, for each vehicle found in S373, its position and traveling direction information is transmitted as a data list to the request source of the shooting data (that is, the sender of the request command). In S375, it is determined whether selection information from the request source (transmitted in S356 of FIG. 17) has been received; the determination process of S375 is repeated until it is. When the selection information is received (S375: YES), the shooting data currently being shot by the specific vehicle indicated by the selection information (that is, the selected vehicle) is acquired from that vehicle via the network in S376. If no shooting is currently being performed in the specific vehicle, shooting is temporarily instructed and the shooting data is transmitted. In S377, the shooting data acquired from the specific vehicle in S376 is transmitted to the request source.

  If it is determined in S372 that shot data is requested by the request source, the data list of the images designated by the request source is transmitted to the request source in S378, with reference to the database of FIG. 8. In S379, it is determined whether selection information has been received from the request source; the determination process of S379 is repeated until it is. When the selection information is received (S379: YES), the shooting data indicated by the selection information is acquired via the network in S380 and transmitted to the request source.
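  The branch taken in S371 and S372 is a plain dispatch on the request target. The sketch below represents a request command as a dictionary with a type field and leaves the two handlers (S373 to S377 and S378 to S380) as hypothetical callables.

```python
def handle_request(command, serve_real_time, serve_shot_data):
    """Dispatch a received request command on its target (S371-S372)."""
    if command is None:                       # S371: no request command received
        return
    if command.get("type") == "real_time":    # real-time data requested (from S352)
        serve_real_time(command)              # S373-S377
    elif command.get("type") == "shot_data":  # shot data requested (from S353)
        serve_shot_data(command)              # S378-S380
```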

(5) Effects of First Embodiment According to the vehicle 1 of the present embodiment described above, the cameras 2 to 7 used to realize the driving support function in the driving support mode are also used to realize the monitoring function in the monitoring mode. That is, the shooting data of the cameras 2 to 7 is used for different purposes. Therefore, a service with high added value can be provided based on the shooting data of each camera.

  In the present embodiment, a plurality of timings for operating the cameras in the monitoring mode are set. Mainly, the cameras are activated and shooting data is recorded when an unauthorized operation on the vehicle 1 is detected (see S131 in FIG. 6) and when the approach of a person or animal to the vehicle 1 is detected by an infrared sensor (see S133 in FIG. 6).

  When the approach of a person is detected, in addition to storing the shooting data, it is determined whether the person may be a suspicious person, and if so, intimidation is performed. Furthermore, when a direct unauthorized operation on the vehicle 1 is performed, in addition to storing the shooting data, a mail notification is sent, an alarm is generated, the winker lights are blinked, and so on. For example, when someone touches the vehicle 1, the impact of the touch is detected, the cameras are activated, and images inside and outside the vehicle at that time are taken. The cameras operate and shoot not only when the vehicle is touched but also whenever an impact is applied to it. Therefore, a high level of crime prevention is obtained.

  In addition, even if no approach of a person or animal is detected and no unauthorized operation is performed, the cameras are periodically operated to monitor the surroundings for abnormalities and store shooting data. Therefore, an even higher level of crime prevention is obtained.

  More specifically, in the present embodiment, when the water volume around the vehicle 1 has increased, this can be detected based on the shooting data of the vehicle bottom camera 7. Therefore, when the possibility of inundation of the vehicle 1 arises, it can be recognized quickly. Moreover, when an animal approaches, the animal can be driven away by generating an ultrasonic wave that animals dislike or threatened by operating the lamps. The damage to the vehicle 1 by animals can thereby be reduced.

  Nevertheless, an animal may still approach the vehicle 1 and drop feces near or directly on the vehicle 1. In the present embodiment, when feces are detected from the shooting data of the cameras, a mail notification is sent. Therefore, an appropriate measure against the dropped feces can be taken quickly.

  The cameras also shoot when an emergency event occurs during the driving support mode or the normal mode, such as when the driver suddenly and strongly grips the handle 20, brakes suddenly, or a large impact is applied to the vehicle 1. Therefore, if an accident occurs, the shooting data can be used effectively when searching for witnesses or investigating the cause of the accident.

  In addition, the server 73 receives uploads of shooting data not only from the vehicle 1 but also from many other vehicles, and stores them as a database (see FIG. 8). By making a vehicle designation or scene designation to the server 73 from the outside, the desired shooting data can be acquired and browsed from among the accumulated shooting data. For example, to view shooting data shot by a specific vehicle, that vehicle can be designated and the shooting data shot by it acquired. Likewise, to view shooting data shot in a certain scene (date, time, and place), that scene can be designated; if shooting data shot in that scene exists, it can be acquired.

  Furthermore, not only the shooting data stored in the server 73 but also the image currently being shot by a specific vehicle or at a specific place can be acquired in real time via the server 73. The user can therefore obtain shot images of a desired scene, or from a desired vehicle, from many angles.

  In the present embodiment, each of the cameras 2 to 8 corresponds to an example of the photographing unit of the present invention. The control device 30 corresponds to an example of the support control unit, the specific photographing control unit, and the transmission control unit of the present invention. Each of the infrared sensors 11 to 14 corresponds to an example of the human approach detection unit of the present invention. The driving support process of FIG. 5 corresponds to an example of the support control of the present invention. The memory 32 corresponds to an example of the storage unit of the present invention. Each condition affirmed in the determination processes of S131, S133, and S135 in FIG. 6 (that is, that an unauthorized operation has been detected, that a person or animal has been detected, and that the periodic shooting timing has arrived) corresponds to an example of the specific photographing conditions of the present invention. Likewise, the condition affirmed in S271 of FIG. 13 (that is, that the lid 42 of the G box 41 is open) and the condition affirmed in S311 of FIG. 15 (that is, that an emergency event has occurred) each correspond to an example of the specific photographing conditions of the present invention.

[Second Embodiment]
The vehicle of the present embodiment differs from the vehicle 1 of the first embodiment mainly in that the front camera 2 and the rear camera 3 are configured to be movable from the vehicle interior to an exposed position on the roof. It also differs in that a camera that is normally housed under the ceiling and exposed on the roof only in the monitoring mode is separately provided, and in that the cameras exposed on the roof have a mutual monitoring function. The rest is basically the same as the vehicle 1 of the first embodiment. Therefore, the description below focuses on the configurations (mainly the three just mentioned) that differ from the vehicle 1 of the first embodiment.

  As shown in FIG. 19, in the present embodiment, a front camera driving unit 81 for rotating the front camera 2 and moving it in the vertical direction is provided at the front of the ceiling in the vehicle interior. More specifically, the front camera driving unit 81 includes a vertical direction driving unit 81a and a horizontal rotation driving unit 81b.

  The vertical direction driving unit 81a can move the arm 83 supporting the front camera 2 in the vertical direction, as shown by the arrow α in FIG. 20A, and can rotate the arm 83, as shown by the arrow β in FIG. 20A. The horizontal rotation driving unit 81b can rotate the front camera 2 in a horizontal plane, as shown by the arrow γ in FIG. 20A. FIG. 20A is a side view of the front camera 2 seen from the left side of the vehicle, as in FIG. 19, and FIG. 20B is a front view of the front camera 2 seen from the front of the vehicle. A closing and fixing plate 93 is provided between the arm 83 and the front camera 2.

  On the other hand, a substantially rectangular hole (roof hole) is formed in the roof 91 of the vehicle 1 above the front camera 2. When the operation mode is a mode other than the monitoring mode, the roof hole is closed by the front roof closing plate 92. When the operation mode is the monitoring mode, the front roof closing plate 92 is slid rearward by the front roof opening/closing unit 82, opening the roof hole.

  More specifically, when the operation mode is a mode other than the monitoring mode, the front camera 2 is fixed hanging from the arm 83 in the vehicle interior, as shown in FIG. 19A. In this state, the front camera 2 can photograph the area ahead of the vehicle. The shooting direction of the front camera 2 may also be made freely changeable within a horizontal plane by the horizontal rotation driving unit 81b.

  On the other hand, when the operation mode is the monitoring mode, the front roof closing plate 92 slides to open the roof hole, and the arm 83 is moved downward by the vertical direction driving unit 81a. After the arm 83 has moved to or near its lowest vertical position, it is further rotated 180 degrees by the vertical direction driving unit 81a, resulting in the state shown in FIG. 19B.

  Thereafter, the arm 83 is moved upward by the vertical direction driving unit 81a, so that the front camera 2 protrudes outward (upward) from the roof 91, as shown in FIG. 19C, and the roof hole is closed by the closing and fixing plate 93. In other words, when the operation mode is the monitoring mode, the front camera 2 pops out above the roof 91 and can photograph the full 360 degrees around the vehicle from above the roof 91.

  The operations of the front camera driving unit 81 and the front roof opening/closing unit 82 are controlled by the control device 30. The control device 30 controls the front camera driving unit 81 and the front roof opening/closing unit 82 according to the operation mode, thereby setting the front camera 2 to either the state of FIG. 19A or that of FIG. 19C.

Although not illustrated, the rear camera 3 is likewise moved to either the vehicle interior or the roof depending on the operation mode, in the same manner as the front camera 2 shown in FIG. 19.
Further, in the present embodiment, as shown in FIG. 21, a central camera 121 is mounted at a substantially central portion of the roof 91 (a substantially intermediate position between the front camera 2 and the rear camera 3). The central camera 121 can be moved in the vertical direction by the central camera driving unit 122.

  On the other hand, a substantially circular hole (roof central hole) is formed in the roof 91 of the vehicle 1 above the central camera 121. When the operation mode is a mode other than the monitoring mode, the roof central hole is closed by the central roof closing plate 112. When the operation mode becomes the monitoring mode, the central roof closing plate 112 is slid rearward by the central roof opening/closing unit 111, opening the roof central hole.

  More specifically, when the operation mode is a mode other than the monitoring mode, the central camera 121 is housed in the vehicle interior, as shown in FIG. 21A, and performs no shooting operation. When the operation mode is the monitoring mode, the central roof closing plate 112 slides to open the roof central hole, and the central camera 121 is moved upward by the central camera driving unit 122 so that it protrudes outward (upward) from the roof 91 and the roof central hole is closed, as shown in FIG. 21B. That is, the central camera 121 pops out above the roof 91 only when the operation mode is the monitoring mode, and can then photograph the full 360 degrees around the vehicle, as well as the area above it, from above the roof 91.

  The operations of the central camera driving unit 122 and the central roof opening/closing unit 111 are controlled by the control device 30. The control device 30 controls the central camera driving unit 122 and the central roof opening/closing unit 111 according to the operation mode, thereby setting the central camera 121 to either the state of FIG. 21A or that of FIG. 21B.

  In the monitoring mode, the three cameras protruding above the roof 91, that is, the front camera 2, the rear camera 3, and the central camera 121, monitor each other. Specifically, the control unit 31 of the control device 30 executes the camera mutual monitoring process shown in FIG. 22 in the monitoring mode.

  The control unit 31 repeatedly executes the camera mutual monitoring process of FIG. 22 at a predetermined control cycle while the operation mode is set to the monitoring mode. When the camera mutual monitoring process of FIG. 22 is started, the control unit 31 determines in S411 whether there is a camera (abnormal camera) among the three cameras 2, 3, and 121 from which shooting data cannot be obtained normally. This determination can be made by acquiring shooting data from each of the cameras 2, 3, and 121 and analyzing it.

  If there is no abnormal camera (S411: NO), the camera mutual monitoring process ends. If there is an abnormal camera (S411: YES), a camera abnormality flag is set in S412. In S413, of the two cameras other than the abnormal camera, a camera capable of photographing the abnormal camera is operated to take images including the abnormal camera for at least a predetermined time. In the present embodiment, both the front camera 2 and the rear camera 3 can rotate 360 degrees in the horizontal plane, and the central camera 121 can shoot the full 360 degrees without rotating. Therefore, even if any one of these three cameras becomes an abnormal camera, the other two cameras can each photograph it.

  In S414, the shooting data obtained in S413 is stored in the memory 32 as mutual monitoring image data. In S415, the mutual monitoring image data is uploaded to the server 73.
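  The mutual monitoring loop of S411 to S415 can be sketched as follows; the camera objects, the validity check, and the aim-and-record call are hypothetical interfaces standing in for the analysis and shooting steps described above.

```python
def camera_mutual_monitoring(cameras, store, upload, duration_s=10.0):
    """If any camera stops delivering usable data, have the others photograph it."""
    abnormal = [c for c in cameras if not c.data_is_normal()]    # S411
    if not abnormal:
        return False
    for bad in abnormal:                                         # S412: caller sets the flag
        for other in cameras:
            if other is not bad and other.can_see(bad):
                footage = other.record_towards(bad, duration_s)  # S413
                store(footage)                                   # S414: mutual monitoring image data
                upload(footage)                                  # S415
    return True
```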

  According to the second embodiment described above, the plurality of cameras monitor each other, and when an abnormality occurs in any one camera, another camera shoots in the direction of the abnormal camera. Therefore, for example, when a malicious person covers or destroys the lens of one camera, that person can be photographed by another camera.

[Other Embodiments]
Embodiments of the present invention have been described above; however, the present invention is not limited to these embodiments and can take various forms.

  (1) The number and mounting positions of the cameras mounted on the vehicle 1 are not limited to those illustrated above. Some or all of the cameras may be movable, and the crime prevention effect may be further enhanced by letting the individual cameras shoot in as many directions as possible.

  (2) The data amount adjustment process shown in FIG. 14 may also be executed in the server 73, which suppresses the storage capacity consumed by shooting data in the server 73. Note that the specific content of the data amount adjustment process shown in FIG. 14 is merely an example; as long as the data amount can be reduced continuously or stepwise according to the number of days elapsed since the shooting date, various specific reduction methods are conceivable.

  (3) By photographing other vehicles around the host vehicle while traveling, the system can also be applied to an SNS (Social Networking Service). Specifically, when another vehicle is recognized from the shooting data, information on that vehicle is stored. If, for example, there is a frequently encountered vehicle that can be identified, a message can be transmitted to it; for instance, when a vehicle with a certain license plate number is always passed during the commute, a message may be transmitted by designating that number.

  (4) Although an infrared sensor is used in the above embodiments as the sensor for detecting the approach of a person or animal to the vehicle, this is merely an example. For example, another sensor such as a temperature sensor or an ultrasonic sensor may be used, or a person or animal may be detected from the shooting data by operating a camera.

  (5) The functions of one component in the above embodiments may be distributed among a plurality of components, and the functions of a plurality of components may be integrated into one component. At least a part of the configuration of the above embodiments may be replaced with a known configuration having the same function, and a part of the configuration may be omitted. At least a part of the configuration of one embodiment may be added to, or substituted for, the configuration of another embodiment. All aspects included in the technical idea specified by the wording of the claims are embodiments of the present invention.

  DESCRIPTION OF SYMBOLS 1... Vehicle, 2... Front camera, 3... Rear camera, 4... Left side camera, 5... Right side camera, 6... Indoor camera, 7... Vehicle bottom camera, 8... Glove box camera, 10... Front window, 11... Front infrared sensor, 12... Rear infrared sensor, 13... Left side infrared sensor, 14... Right side infrared sensor, 20... Handle, 21... Solar radiation sensor, 22... Handle pressure sensor, 23... Impact sensor, 24... Direction sensor, 25... Brake sensor, 30... Control device, 31... Control unit, 32... Memory, 33... RTC, 41... G box, 42... Lid, 43... Internal space, 46... Operation unit, 46a... Power switch, 46b... Driving support switch, 46c... Monitoring switch, 47... Display unit, 51... Travel drive control unit, 52... Brake control unit, 53... Steering control unit, 54... Vehicle speed sensor, 56... Headlight, 57... Winker light, 58... Alarm generator, 59... Ultrasonic generator, 61... GPS communication unit, 62... Road-to-vehicle communication unit, 63... First wireless communication unit, 64... Second wireless communication unit, 66... Wheel, 70... Internet, 71, 72... Relay stations, 73... Server, 76... Other vehicle, 77... Portable terminal, 81... Front camera driving unit, 81a... Vertical direction driving unit, 81b... Horizontal rotation driving unit, 82... Front roof opening/closing unit, 83... Arm, 91... Roof, 92... Front roof closing plate, 93... Closing and fixing plate, 111... Central roof opening/closing unit, 112... Central roof closing plate, 121... Central camera, 122... Central camera driving unit.

Claims (3)

  1. An in-vehicle camera system mounted on a vehicle, comprising:
    at least one photographing unit configured to photograph an image inside or outside the vehicle and to output the resulting shooting data;
    a support control unit configured to perform, based on the shooting data of the photographing unit, predetermined support control for supporting the driver's driving operation of the vehicle; and
    a specific photographing control unit configured to, when a specific photographing condition is satisfied during a photographing suspension period in which the support control by the support control unit is not being performed and the operation of the photographing unit is stopped, activate the photographing unit and store the shooting data of the photographing unit in a storage unit.
  2. The in-vehicle camera system according to claim 1, further comprising:
    a human approach detection unit configured to detect, outside the vehicle, that a person is approaching the vehicle,
    wherein at least the detection by the human approach detection unit that a person is approaching the vehicle is set as the specific photographing condition.
  3. The in-vehicle camera system according to claim 1 or 2, further comprising:
    a communication unit for performing data communication with an information processing device outside the vehicle; and
    a transmission control unit having at least one of a first transmission function of transmitting the shooting data stored in the storage unit by the specific photographing control unit to the information processing device via the communication unit, and a second transmission function of, when a shooting instruction is received from the information processing device, activating the photographing unit and transmitting its shooting data to the information processing device via the communication unit.
JP2014225391A 2014-11-05 2014-11-05 On-vehicle camera system Pending JP2016092614A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014225391A JP2016092614A (en) 2014-11-05 2014-11-05 On-vehicle camera system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2014225391A JP2016092614A (en) 2014-11-05 2014-11-05 On-vehicle camera system

Publications (1)

Publication Number Publication Date
JP2016092614A true JP2016092614A (en) 2016-05-23

Family

ID=56019890

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014225391A Pending JP2016092614A (en) 2014-11-05 2014-11-05 On-vehicle camera system

Country Status (1)

Country Link
JP (1) JP2016092614A (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07318644A (en) * 1994-05-27 1995-12-08 Fuji Heavy Ind Ltd Obstacle detector
JP2003304530A (en) * 2002-04-12 2003-10-24 Matsushita Electric Ind Co Ltd Monitoring apparatus
JP2005289265A (en) * 2004-04-02 2005-10-20 Hitachi Ltd Burglary monitoring device for vehicle
JP2006273122A (en) * 2005-03-29 2006-10-12 Aisin Seiki Co Ltd Parking brake assistance device
JP2007282406A (en) * 2006-04-07 2007-10-25 Tama Tlo Kk Braking force control system of vehicle
JP2008087651A (en) * 2006-10-03 2008-04-17 Calsonic Kansei Corp In-vehicle camera system
JP2009169869A (en) * 2008-01-18 2009-07-30 Fujitsu Ten Ltd Vehicle information recording system
JP2012121384A (en) * 2010-12-06 2012-06-28 Fujitsu Ten Ltd Image display system
JP2013191969A (en) * 2012-03-13 2013-09-26 Fujitsu Ten Ltd Image processor, image display system, display device, image processing method and program
JP2014120095A (en) * 2012-12-19 2014-06-30 Denso Corp Vehicle monitoring system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019087770A1 (en) * 2017-10-31 2019-05-09 パナソニックIpマネジメント株式会社 Pressure-sensing device and vehicle


Legal Events

Date Code Title Description
20170921 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
20180611 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
20180724 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
20180827 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
20190129 A02 Decision of refusal (JAPANESE INTERMEDIATE CODE: A02)