WO2019111375A1 - Monitoring device, monitoring system, monitoring method, and monitoring program - Google Patents


Info

Publication number
WO2019111375A1
WO2019111375A1 (PCT/JP2017/043994)
Authority
WO
WIPO (PCT)
Prior art keywords
monitoring
image data
monitoring target
image
notification
Prior art date
Application number
PCT/JP2017/043994
Other languages
French (fr)
Japanese (ja)
Inventor
圭史朗 金森
Original Assignee
圭史朗 金森
Priority date
Filing date
Publication date
Application filed by 圭史朗 金森 filed Critical 圭史朗 金森
Priority to PCT/JP2017/043994 priority Critical patent/WO2019111375A1/en
Publication of WO2019111375A1 publication Critical patent/WO2019111375A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00: Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/30: Detection related to theft or to other events relevant to anti-theft systems
    • B60R25/31: Detection related to theft or to other events relevant to anti-theft systems of human presence inside or outside the vehicle
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems

Definitions

  • the present invention relates to a monitoring device, a monitoring system, a monitoring method, and a monitoring program.
  • Conventionally, a technology has been proposed in which a camera is mounted on a car to monitor the outside (for example, Patent Document 1).
  • The above-described technology is effective when the driver is in the car. However, while the car is parked and the driver is away from it, it cannot prevent the car from being harmed.
  • The present invention is an attempt to solve such a problem, and aims to provide a monitoring device, a monitoring system, a monitoring method, and a monitoring program that can issue a notification according to the situation outside the vehicle when the driver is away from the vehicle.
  • A first invention is a monitoring device for monitoring the outside of a car, comprising: image data acquisition means for acquiring image data of an object; object recognition means for recognizing a monitoring target among the objects included in the image data; and notification means for notifying the outside in a predetermined notification mode when the object recognition means recognizes the monitoring target.
  • With this configuration, the monitoring device can acquire image data by the image data acquisition means, recognize the monitoring target among the objects included in the image data by the object recognition means, and notify the outside by the notification means. Thus, when the driver is away from the vehicle, a notification can be issued in accordance with the situation outside the vehicle.
  • A second invention is the monitoring device according to the first invention, wherein the object recognition means refers to data generated by deep learning.
  • A third invention is the monitoring device according to the first or second invention, further comprising distance estimation means for estimating the distance between the monitoring target and the vehicle, wherein the notification condition is that the monitoring target is located within a predetermined distance from the vehicle.
  • A fourth invention is the monitoring device according to the third invention, further comprising notification mode determination means for determining a notification mode for notifying the outside based on the distance between the monitoring target and the vehicle.
  • A fifth invention is the monitoring device according to the third invention, further comprising approach determination means for determining whether the monitoring target is approaching the vehicle, wherein the notification condition is that the monitoring target is approaching the vehicle.
  • A sixth invention is the monitoring device according to the third invention, further comprising speed change recognition means for recognizing a change in the speed at which the monitoring target approaches the vehicle, wherein the notification condition is that the speed has increased.
  • A seventh invention is the monitoring device according to the first or second invention, further comprising tracking means for tracking the vehicle based on position information indicating the position of the vehicle, which is provided when the notification means notifies the outside in a predetermined notification mode that the notification condition is satisfied.
  • An eighth invention is the monitoring device according to the first or second invention, further comprising ambient light calculation means for calculating the average luminance or average brightness of the image data, wherein the predetermined color defining the monitoring target is specified, for each average luminance or average brightness, by a hue in a predetermined range, a saturation in a predetermined range, and a luminance or brightness in a predetermined range.
  • A ninth invention is the monitoring device according to the eighth invention, further comprising pixel number calculation means for calculating the number of pixels in which the predetermined color continues, wherein the object recognition means recognizes that an object included in the image data is a monitoring target when the number of pixels is equal to or more than a predetermined number of pixels defined for each distance.
  • A tenth invention is a monitoring system for monitoring the outside of a car, comprising: image data acquisition means for acquiring image data of an object photographed by photographing means; object recognition means for recognizing a monitoring target among the objects included in the image data; and notification means for notifying the outside when the object recognition means recognizes the monitoring target.
  • An eleventh invention is a monitoring method in which a monitoring device for monitoring the outside of a vehicle performs: an image data acquisition step of acquiring image data of an object outside the vehicle; an object recognition step of recognizing a monitoring target among the objects included in the image data; and a notification step of notifying the outside when the monitoring target is recognized in the object recognition step.
  • A twelfth invention is a monitoring program that causes a control device managing a monitoring device for monitoring the outside of a car to function as: image data acquisition means for acquiring image data of an object outside the car; object recognition means for recognizing, among the objects included in the image data, a monitoring target having a predetermined color; and notification means for transmitting an image of the object recognized as the monitoring target to a pre-registered address when the object recognition means recognizes the monitoring target.
  • A thirteenth invention is the monitoring program according to the twelfth invention, wherein the control device further functions as ambient light calculation means for calculating the average luminance or average brightness of the image data, and the predetermined color is specified, for each average luminance or average brightness, by a hue in a predetermined range, a saturation in a predetermined range, and a luminance or brightness in a predetermined range.
  • A fourteenth invention is the monitoring program according to the thirteenth invention, wherein the control device further functions as pixel number calculation means for calculating the number of pixels in which the predetermined color continues, and an object included in the image data is recognized as a monitoring target when the number of pixels is equal to or more than a predetermined number of pixels defined for each distance.
  • A fifteenth invention is the monitoring program according to any one of the twelfth to fourteenth inventions, wherein the control device further functions as approach determination means for determining whether the monitoring target is approaching the vehicle, and the notification means transmits an image of the object recognized as the monitoring target to the pre-registered address on condition that the monitoring target is approaching the vehicle.
  • A sixteenth invention is the monitoring program according to the fifteenth invention, further including measuring the time during which the monitoring target does not depart from the vehicle after approaching it, wherein the notification means transmits an image of the object recognized as the monitoring target to the pre-registered address on condition that the state in which the monitoring target does not depart from the vehicle has continued for a predetermined time after the monitoring target approached the vehicle.
  • A seventeenth invention is the monitoring program in which the image data acquisition means is configured to acquire a 360-degree image in the horizontal direction, and the control device further functions as: a second image acquisition unit that acquires an image different from the image acquired by the image data acquisition means when the monitoring target is included in the acquired image; and a second notification unit that transmits the image acquired by the second image acquisition unit to a pre-registered address.
  • FIG. 1 is a schematic view of a first embodiment of the present invention. The remaining figures are: a schematic diagram showing the functional blocks of the imaging device; a schematic diagram showing the functional blocks of the monitoring device; a conceptual diagram for explaining the feature data for image recognition; diagrams for explaining the notification conditions; a conceptual diagram showing the notification mode data; a diagram for explaining the tracking program; and a flowchart showing the operation.
  • the portable terminal 100 acquires (receives) image data of an image captured by the imaging device 10 from the imaging device 10 when the driver 350 leaves the vehicle 300.
  • The image captured by the imaging device 10 may be a still image or a moving image, but in the present embodiment the portable terminal 100 receives still images from the imaging device 10. Since a still image has a small data size, the communication speed can be increased. Note that, depending on the communication environment, the portable terminal 100 may instead receive moving images from the imaging device 10, unlike the present embodiment.
  • When the monitoring target is included in the image data received from the imaging device 10, the portable terminal 100 recognizes the monitoring target by image recognition.
  • the monitoring target is, for example, a person in a predetermined clothes.
  • Humans 360 and 362 correspond to the monitoring targets, and wear red jackets 360a and 362a, respectively.
  • The portable terminal 100 recognizes the monitoring target among the objects included in the image data and, when the notification condition is satisfied, notifies the driver 350 by sound or vibration that the notification condition is satisfied.
  • the portable terminal 100 is configured to request the imaging device 10 for position information indicating the position of the imaging device 10 when the predetermined condition is satisfied.
  • the position of the photographing device 10 is also the position of the car 300. Thus, when the car 300 is moved by a person other than the driver 350, the portable terminal 100 can track the car 300 by receiving the position information from the imaging device 10.
  • the imaging device 10 includes a central processing unit (CPU) 50, a storage unit 52, a wireless communication unit 54, a GPS (global positioning system) unit 56, an image processing unit 60, and a power supply unit 70.
  • The imaging device 10 can operate its camera (not shown), which is its core component, by means of the image processing unit 60 to acquire images of the outside.
  • the imaging device 10 can transmit image data to the portable terminal 100 by the wireless communication unit 54.
  • the imaging device 10 can measure the position of the imaging device 10 itself by the GPS unit 56.
  • the GPS unit 56 basically receives radio waves from four or more GPS satellites (navigation satellites) to measure the position of the imaging device 10.
  • the power supply unit 70 is connected to the power supply of the automobile 300 and receives power supply from the automobile 300.
  • The storage unit 52 stores an at-stop shooting program, an image data transmission program, and a position information transmission program, in addition to a general program as a drive recorder.
  • the CPU 50 and the at-stop imaging program are an example of the at-stop imaging means.
  • the CPU 50 and the image data transmission program are an example of an image transmission unit.
  • the CPU 50 and the position information transmission program are an example of position information transmission means.
  • The photographing device 10 photographs the outside while the car 300 is stopped, according to the at-stop shooting program. The photographing device 10 continuously photographs the outside while the car 300 is traveling, and external shooting is continued even while the car 300 is stopped unless the driver 350, who is the user, performs a shooting interruption operation, for example, operating the camera switch of the photographing device 10 to turn the camera off.
  • the imaging device 10 is configured to continuously transmit image data of an image captured while the vehicle is stopped to the portable terminal 100 by wire or wireless according to an image data transmission program.
  • the photographing device 10 is configured to continuously transmit the position of the photographing device 10 to the portable terminal 100 when the predetermined condition is satisfied by the position information transmission program.
  • the predetermined condition is that a transmission request for position information has been received from the portable terminal 100.
  • the portable terminal 100 includes a CPU 150, a storage unit 152, a wireless communication unit 154, a GPS unit 156, and a power supply unit 170.
  • the portable terminal 100 can receive image data from the imaging device 10 by the wireless communication unit 154. Further, it is possible to transmit a transmission request for position information to the imaging device 10 and receive the position information.
  • the portable terminal 100 can receive a positioning radio wave from a navigation satellite such as a GPS satellite by the GPS unit 156 and measure the position of the portable terminal 100 itself.
  • the power supply unit 170 is a chargeable rechargeable battery, and supplies power to each unit of the portable terminal 100.
  • The storage unit 152 stores a general program as a mobile phone, an image data acquisition program, an object recognition program, a distance estimation program, an approach determination program, a speed change recognition program, a notification condition determination program, a notification mode determination program, a notification program, and a tracking program.
  • the CPU 150 and the image data acquisition program are an example of an image data acquisition unit.
  • the CPU 150 and the object recognition program are examples of object recognition means.
  • the CPU 150 and the distance estimation program are examples of distance estimation means.
  • the CPU 150 and the approach determination program are an example of the approach determination means.
  • the CPU 150 and the speed change recognition program are an example of the speed change recognition means.
  • the CPU 150 and the notification condition determination program are an example of notification condition determination means.
  • the CPU 150 and the notification mode determination program are an example of the notification mode determination means.
  • the CPU 150 and the notification program are an example of notification means.
  • the CPU 150 and the tracking program are an example of the tracking means.
  • the portable terminal 100 acquires image data from the imaging device 10 by wireless communication according to an image data acquisition program.
  • the portable terminal 100 recognizes an object included in the image data by the object recognition program, and recognizes a monitoring target.
  • the monitoring target is a man wearing a red jacket.
  • the portable terminal 100 refers to the feature data generated by deep learning by the object recognition program.
  • Deep learning is machine learning using a multi-layered neural network, and image recognition is one of its principal fields of application.
  • The portable terminal 100 identifies the features of an object included in the image data acquired from the imaging device 10 based on a large number of image recognition feature data (hereinafter also referred to as "feature information") for each category, and can thereby recognize the category of the object. This recognition of categories is called general object recognition.
  • the categories are, for example, “human”, “dog”, “cat” and the like.
  • Each category is subdivided: the "human" category has a subcategory 1 such as "male" or "female", and the subcategory 1 "male" has a further subdivision such as "jacket".
  • a large number of feature data are stored in the storage unit 152 for each of the categories described above.
  • the portable terminal 100 can refer to feature information of “human” acquired based on a large number of “human” image examples.
  • the feature information may be stored in an external storage device, and the portable terminal 100 may access and refer to the external storage device.
  • The degree of correlation is determined by comparison with the features of each category acquired by deep learning; the higher the degree of correlation, the more likely it is that an object in the acquired image belongs to a particular category.
  • The possibility of belonging to a specific category is referred to as the "category common probability". When the degree of correlation indicates its maximum value, the category common probability is defined as 100%. The portable terminal 100 determines that the object in the acquired image belongs to the specific category when the category common probability is at or above a predetermined reference value, for example, 95%.
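The threshold decision described above can be sketched as follows; this is a minimal illustration, and the names (`belongs_to_category`, `REFERENCE_VALUE`) are assumptions, not identifiers from the patent.

```python
# Minimal sketch of the category decision: the category common probability is
# 100% at the maximum degree of correlation, and an object is assigned to the
# category when that probability meets a reference value (95% in the example).
REFERENCE_VALUE = 0.95  # 95%

def belongs_to_category(correlation: float, max_correlation: float) -> bool:
    """Scale the degree of correlation to a category common probability
    (1.0 == 100% at the maximum) and compare it with the reference value."""
    probability = correlation / max_correlation
    return probability >= REFERENCE_VALUE
```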
  • By the general object recognition described above, the portable terminal 100 refers to the feature data and recognizes that the object in the image is a "human" and "male", is wearing a "jacket", and that the jacket is "red"; the object in the image is thereby recognized as a monitoring target.
  • the portable terminal 100 estimates the distance between the monitoring target and the vehicle 300 by the distance estimation program.
  • the number of pixels of the imaging device 10 is, for example, 3 million.
  • The number of pixels of the imaging device 10 and that of the portable terminal 100 do not necessarily have to be the same. In the present embodiment, however, the total number of pixels of the portable terminal 100 is assumed to be the same as that of the imaging device 10.
  • The monitoring target occupies a number of pixels in the image data according to its size; the larger the number of pixels it occupies, the shorter its distance to the car 300.
  • As basic reference values, the number of pixels in the height direction occupied by a rod-like member having a height of 1 m (meter) when it is 500 m, 200 m, 100 m, 50 m, and 10 m away is stored.
  • For the category "human", a category reference value indicating that the average height is 170 cm (centimeters) is stored.
  • The portable terminal 100 is configured to estimate the distance between the monitoring target and the vehicle 300 based on the number of pixels occupied by the actual monitoring target, with reference to the basic reference values and the category reference value. For example, in the case of a "male" "human", the average height is 170 cm, so the number of pixels in the height direction at a given distance is calculated as 1.7 times the basic reference value A, which is based on a height of 1 m (A × 1.7). In the present embodiment, since the estimated distance does not need to be accurate and a simple configuration is desirable, the distance is estimated from the number of pixels in the image.
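The table-based distance estimation can be sketched as below. The reference pixel counts here are illustrative placeholders; the patent states only that basic reference values exist for 500 m, 200 m, 100 m, 50 m, and 10 m, and that the "human" category scales them by 1.7.

```python
# Hypothetical sketch of pixel-count distance estimation. The pixel counts in
# BASIC_REFERENCE are made-up placeholders, not values from the patent.
BASIC_REFERENCE = {500: 4, 200: 10, 100: 20, 50: 40, 10: 200}  # distance (m) -> pixels for a 1 m rod
HUMAN_HEIGHT_FACTOR = 1.7  # category reference: average human height of 170 cm

def estimate_distance(observed_pixels: int) -> int:
    """Return the stored distance whose expected pixel count (A x 1.7)
    is closest to the pixel count occupied by the monitoring target."""
    return min(
        BASIC_REFERENCE,
        key=lambda d: abs(BASIC_REFERENCE[d] * HUMAN_HEIGHT_FACTOR - observed_pixels),
    )
```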
  • The portable terminal 100 determines whether the monitoring target is approaching the car 300 according to the approach determination program. For example, as shown in FIG. 6, when the number of pixels occupied by the monitoring target at time t2, at which time Δt has elapsed from time t1, is larger than the number of pixels occupied at time t1, the portable terminal 100 determines that the monitoring target is approaching the car 300. In order to simplify the process, only the number of pixels in the height direction is used.
  • The portable terminal 100 recognizes a change in the speed at which the monitoring target approaches the vehicle 300 by the speed change recognition program. For example, if the increase ΔB1 in the number of pixels occupied by the monitoring target between time t1 and time t2, at which time Δt has elapsed from time t1, is smaller than the increase ΔB2 in the subsequent time zone between time t2 and time t3, at which a further Δt has elapsed (ΔB1 < ΔB2), it is determined that the monitoring target is accelerating.
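The approach and acceleration determinations above reduce to comparing height-direction pixel counts sampled every Δt; a minimal sketch, with function names that are assumptions:

```python
# Sketch of the approach and acceleration checks using only the number of
# pixels in the height direction occupied by the monitoring target.
def is_approaching(pixels_t1: int, pixels_t2: int) -> bool:
    """More pixels at t2 than at t1 means the target has moved closer."""
    return pixels_t2 > pixels_t1

def is_accelerating(pixels_t1: int, pixels_t2: int, pixels_t3: int) -> bool:
    """dB1 < dB2 across two successive intervals means the approach sped up."""
    delta_b1 = pixels_t2 - pixels_t1
    delta_b2 = pixels_t3 - pixels_t2
    return delta_b1 < delta_b2
```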
  • The portable terminal 100 determines, by the notification condition determination program, whether to notify the driver 350, who is the user of the portable terminal 100, that the notification condition is satisfied. The notification condition is, for example, that one monitoring target is recognized in the image data as shown in FIG. 5(a), that two monitoring targets are recognized in the image data as shown in FIG. 5(b), that the monitoring target approaches the vehicle 300 as shown in FIGS. 6(a) and 6(b), or that the monitoring target approaches the vehicle 300 at high speed.
  • the notification conditions can be selected and set in advance by the driver 350 who is the user of the portable terminal 100.
  • the portable terminal 100 determines a notification mode for notifying the outside (driver 350) based on the distance between the monitoring target and the vehicle 300 by the notification mode determination program. As illustrated in FIG. 7, the portable terminal 100 refers to the notification mode data, and determines the notification mode according to the distance between the monitoring target and the vehicle 300.
  • the notification mode data is stored in the storage unit 152.
  • The notification mode A for a distance of 200 m to 500 m is specified as, for example, generating a relatively small sound (or low-frequency vibration or low-pitched sound) from the speaker, and the notification mode B for a distance of 100 m to 200 m is specified as generating a louder sound (or higher-frequency vibration or higher-pitched sound) than notification mode A.
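The notification mode data of FIG. 7 can be sketched as a simple distance-band lookup. Only modes A and B are described in the text; the mode for distances under 100 m is an added assumption.

```python
# Sketch of the notification mode lookup. Mode "C" (under 100 m) is an
# assumption for completeness; the text defines only modes A and B.
def notification_mode(distance_m: float) -> str:
    if 200 <= distance_m <= 500:
        return "A"  # relatively small sound / low-frequency vibration
    if 100 <= distance_m < 200:
        return "B"  # louder sound / higher-frequency vibration than mode A
    if distance_m < 100:
        return "C"  # assumed strongest notification (not stated in the text)
    return "none"  # beyond 500 m: no notification
```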
  • the portable terminal 100 notifies the outside (driver 350) of the notification mode determined by the notification mode determination program by the notification program.
  • the tracking program acquires position information from the imaging device 10 and tracks the position of the car 300. That is, when the portable terminal 100 notifies the outside of establishment of the notification condition, the portable terminal 100 transmits a transmission request of position information to the imaging device 10 and continuously receives the position information from the imaging device 10.
  • the mobile terminal 100 may perform tracking after notification to the outside, instead of always performing tracking, and may be performed when a predetermined condition is satisfied.
  • the predetermined condition is, for example, that the driver 350 has input an instruction to start tracking to the portable terminal 100.
  • When receiving the position information from the photographing device 10, the portable terminal 100 displays the position of the photographing device 10 (the position of the car 300) on the display screen 100a, as shown in FIG. 8. In the example of FIG. 8, the display screen 100a shows that the imaging device 10 has moved from its first position P1 along the route R1 to the current position P2. That is, since the portable terminal 100 can track the position of the imaging device 10 (the position of the car 300) when the notification condition is satisfied, it can track the car 300 even if it is stolen.
  • the photographing device 10 acquires an image by photographing the outside of the automobile 300 (step ST1), and when transmitting the image data (step ST2), the portable terminal 100 receives the image data (step ST3).
  • The portable terminal 100 performs image recognition (step ST4), determines whether the notification condition is satisfied (step ST5), determines the notification mode when the notification condition is satisfied (step ST6), and issues the notification (step ST7). Subsequently, it determines whether a predetermined condition (tracking condition) is satisfied (step ST8); when the tracking condition is satisfied, it requests position information from the imaging device 10 (step ST9), receives the position information (step ST10), and tracks the imaging device 10 (car 300) (step ST11).
  • the imaging device 10 acquires an image, and the portable terminal 100 receives image data from the imaging device 10 and executes processing after image recognition.
  • Processing other than the notification to the driver 350, or a part of it, may be performed by the imaging device 10. In that case, the imaging device 10 transmits information indicating that the notification condition is satisfied to the portable terminal 100, and the portable terminal 100 notifies the outside; the photographing device 10 and the portable terminal 100 thus integrally constitute a monitoring device or a monitoring system.
  • the imaging device 10 is not limited to the drive recorder, and may be a mobile phone or a smartphone as long as the imaging device 10 can capture an image.
  • the imaging device 10A (see FIG. 1) is an example of a monitoring device.
  • the imaging device 10A is, for example, a mobile phone (including a smartphone).
  • both the photographing device 10A and the mobile terminal 100 are mobile phones.
  • the imaging device 10A is configured to transmit an image to the portable terminal 100 possessed by the driver 350 when the predetermined condition is satisfied.
  • The storage unit 52 (see FIG. 2) of the imaging device 10A stores an image data acquisition program, an ambient light calculation program, an object recognition program, a distance estimation program, an approach determination program, a clocking program, and a notification program.
  • the CPU 50 and the image data acquisition program are an example of an image data acquisition unit.
  • the CPU 50 and the ambient light calculation program are examples of ambient light calculation means.
  • the CPU 50 and the object recognition program are examples of object recognition means.
  • the CPU 50 and the distance estimation program are examples of distance estimation means.
  • the CPU 50 and the approach determination program are an example of the approach determination means.
  • the CPU 50 and the clocking program are an example of clocking means.
  • the object recognition program includes a pixel number calculation program, and the CPU 50 and the object recognition program are also an example of the pixel number calculation means.
  • the imaging device 10A captures an image of the outside of the automobile 300 by the image data acquisition program, and acquires image data.
  • the imaging device 10A calculates the average luminance of the entire acquired image data by the ambient light calculation program.
  • the average luminance is the average luminance of all pixels of the image data acquired by the imaging device 10A. For example, if it is daytime on a clear day, the average brightness is high, and if it is night, the average brightness is low.
  • the imaging device 10A recognizes the monitoring target according to the color and the number of pixels by the object recognition program.
  • the monitoring target is a red predetermined object having a predetermined feature.
  • the imaging device 10A refers to the average luminance classified HLS data shown in FIG. 10 by the object recognition program.
  • the color of the monitoring target in the acquired image data differs depending on the brightness of the atmosphere (environment) at the time of shooting the image.
  • the imaging device 10A recognizes the brightness of the atmosphere (environment) as the above-described average luminance.
  • The color of the monitoring target is stored, for each average luminance, as a hue (H) in a predetermined range, a luminance (L) in a predetermined range, and a saturation (S) in a predetermined range.
  • the imaging device 10A stores the color of the monitoring target for each average luminance.
  • For a given average luminance, for example, the hue H of the monitoring target is in the range from a1 to a2, the luminance L in the range from b1 to b2, and the saturation S in the range from c1 to c2.
  • Average brightness may be used instead of average luminance, and brightness may be used instead of luminance (L) for the color of the monitoring target.
  • The photographing device 10A refers to the hue H, luminance L, and saturation S corresponding to the average luminance of the acquired image data, compares them with the hue, luminance, and saturation of the objects in the acquired image data, and determines whether any object in the image data satisfies the ranges of hue H, luminance L, and saturation S of the monitoring target, thereby determining whether a possible monitoring target is present.
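The average-luminance-classified HLS check can be sketched as follows; the table values are placeholders, since the patent gives only the symbolic ranges a1 to a2, b1 to b2, and c1 to c2.

```python
# Sketch of the HLS range check per average-luminance class. All numeric
# range values are illustrative placeholders, not values from the patent.
HLS_TABLE = {  # average-luminance class -> (low, high) per channel
    "day":   {"H": (350, 10), "L": (40, 70), "S": (60, 100)},
    "night": {"H": (350, 10), "L": (10, 40), "S": (30, 100)},
}

def in_range(value: float, low: float, high: float) -> bool:
    if low > high:  # hue wraps around 360 degrees (red straddles 0)
        return value >= low or value <= high
    return low <= value <= high

def may_be_target(luminance_class: str, h: float, l: float, s: float) -> bool:
    """True when (h, l, s) falls inside all three stored ranges."""
    r = HLS_TABLE[luminance_class]
    return (in_range(h, *r["H"]) and in_range(l, *r["L"])
            and in_range(s, *r["S"]))
```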
  • the imaging device 10A is further configured to recognize an object that satisfies the predetermined color as the monitoring target only when the number of pixels of the object is equal to or greater than a predetermined number of pixels defined for each distance. Thereby, false recognition of the monitoring target can be avoided with high accuracy.
  • the estimation of the distance is performed by a distance estimation program and an approach determination program. For example, when the distance between the monitoring target and the imaging device 10A is large, the number of pixels in the image data to be monitored is small, and when the distance is small, the number of pixels is large.
  • the photographing apparatus 10A determines, according to the approach determination program, whether the monitoring target is approaching the car 300. Specifically, when the number of pixels occupied by the monitoring target at time t2, after time Δt has elapsed from time t1, is larger than the number of pixels occupied by the monitoring target at time t1, the photographing apparatus 10A judges that the monitoring target is approaching the car 300.
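The pixel-count rules above can be sketched as follows (the distance bands and minimum pixel counts are illustrative assumptions, and the pixel counting itself is assumed to be done elsewhere):

```python
# Hypothetical sketch of the pixel-count conditions described above.

MIN_PIXELS_BY_DISTANCE = {"far": 250, "middle": 1000, "near": 4000}  # illustrative

def passes_pixel_threshold(pixel_count, distance_band):
    """Recognize the object as the monitoring target only if it occupies at
    least the minimum number of pixels defined for the estimated distance."""
    return pixel_count >= MIN_PIXELS_BY_DISTANCE[distance_band]

def is_approaching(pixels_at_t1, pixels_at_t2):
    """Judge the target as approaching the car when it occupies more pixels
    at time t2 = t1 + Δt than it did at time t1."""
    return pixels_at_t2 > pixels_at_t1
```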
  • the photographing apparatus 10A measures a time during which the monitoring target does not depart from the car 300 after the monitoring target approaches the car 300 according to a clocking program.
  • the predetermined time is, for example, 30 seconds.
  • the imaging apparatus 10A transmits, under a predetermined condition and by the notification program, an image of the monitoring target to an address registered in advance (hereinafter also referred to as the "registered address").
  • the address registered in advance is the e-mail address of the mobile terminal 100.
  • the predetermined condition is, for example, that the imaging device 10A recognizes the monitoring target, that the monitoring target approaches the car 300, or that the monitoring target, having approached the car 300, does not leave it for a predetermined time.
  • the imaging device 10A transmits the image to be monitored to the address registered in advance a plurality of times.
  • the image recognition in the present embodiment is performed based on the color of the object and the number of pixels occupied by that color. Unlike the present embodiment, however, the shape of the object may also be taken into consideration, as in the first embodiment. As a result, the monitoring target can be determined with higher accuracy. Conversely, the image recognition may be simplified by considering only the color of the object.
  • the imaging device 10A acquires an image (step ST101), performs image recognition (step ST102), and determines whether it is a monitoring target (step ST103).
  • in step ST103, the average luminance of the entire image is calculated, and the hue, luminance, and saturation ranges of the monitoring target at that average luminance are referenced and compared with the hue, luminance, and saturation of each object in the image, to determine whether an object may be the monitoring target.
  • if the object is determined to be the monitoring target, the imaging device 10A transmits the image to the registered address (the e-mail address of the portable terminal 100) (step ST104). Subsequently, when it is determined that the monitoring target is approaching the car 300 (step ST105), the photographing apparatus 10A transmits an image to the registered address (step ST106). Furthermore, when it is determined that the state in which the monitoring target is approaching the car 300 has continued for a predetermined time (step ST107), the photographing apparatus 10A transmits an image to the registered address (step ST108). The imaging device 10A determines whether an end condition is satisfied, for example, whether a processing-end signal has been received from the portable terminal 100 (step ST109), and repeats the above process when the end condition is not satisfied.
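The loop of steps ST101–ST109 can be sketched as follows (the camera and notifier objects and their method names are hypothetical stand-ins for the programs described in this embodiment, not an implementation from the patent):

```python
# Hypothetical sketch of the monitoring loop of steps ST101–ST109.

def monitoring_loop(camera, notifier, dwell_limit_s=30):
    """Send one notification on recognition, another on approach, and a
    third when the approaching state persists for dwell_limit_s seconds."""
    while True:
        image = camera.acquire_image()                          # ST101
        target = camera.recognize(image)                        # ST102
        if target is not None:                                  # ST103
            notifier.send_to_registered_address(image)          # ST104
            if camera.target_is_approaching(target):            # ST105
                notifier.send_to_registered_address(image)      # ST106
                if camera.dwell_time_s(target) >= dwell_limit_s:    # ST107
                    notifier.send_to_registered_address(image)      # ST108
        if notifier.end_signal_received():                      # ST109
            return
```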
  • the imaging device 10A may transmit an image to the portable terminal 100 via an external server instead of transmitting the image directly to the portable terminal 100.
  • the imaging device 10A may be configured to transmit all acquired images to an external server, and the server may perform processing after image recognition.
  • in another configuration, the predetermined-range hue (H), luminance (L), and saturation (S) of the monitoring target at a reference luminance may be stored.
  • the predetermined-range hue (H), luminance (L), and saturation (S) of the monitoring target at the actual average luminance may then be calculated from the stored values. For example, if the reference luminance is 50 and the actual average luminance is 60, the hue (H), luminance (L), and saturation (S) ranges of the monitoring target are shifted toward higher values in a predetermined proportion.
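A sketch of this reference-luminance variant, assuming for illustration a simple linear scaling of each stored bound by the ratio of actual to reference luminance (the stored ranges and the scaling rule are assumptions, since the patent only says "in a predetermined proportion"):

```python
# Hypothetical sketch: ranges are stored only at a reference luminance and
# shifted in proportion to the actual average luminance.

REFERENCE_LUMINANCE = 50
REFERENCE_RANGES = {"H": (0, 15), "L": (30, 80), "S": (50, 100)}  # illustrative

def shifted_ranges(actual_avg_luminance):
    """Scale each stored bound by actual/reference luminance, so a brighter
    scene shifts all ranges toward higher values."""
    factor = actual_avg_luminance / REFERENCE_LUMINANCE
    return {k: (lo * factor, hi * factor)
            for k, (lo, hi) in REFERENCE_RANGES.items()}
```

For a reference luminance of 50 and an actual average luminance of 60, every bound is multiplied by 1.2, matching the example in the text.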
  • the imaging device 10B of FIG. 12 is an example of a monitoring device.
  • the photographing device 10B is a mobile phone.
  • the 360-degree camera 12, capable of acquiring an image over a 360-degree angle in the horizontal direction, acquires an external image while the car 300 is stopped, and the imaging device 10B acquires that image by wire or wirelessly.
  • the photographing device 10B is configured to transmit an image to an address (e-mail address of the portable terminal 100) registered in advance via the server 400.
  • the image capturing device 10B also captures an image with its own camera and transmits that image as well.
  • the storage unit 52 (see FIG. 2) of the imaging device 10B stores an image data acquisition program, an ambient light calculation program, an object recognition program, a distance estimation program, an approach determination program, a clocking program, a notification program, a second image data acquisition program, and a second notification program.
  • the CPU 50 and the image data acquisition program are an example of an image data acquisition unit.
  • the CPU 50 and the ambient light calculation program are examples of ambient light calculation means.
  • the CPU 50 and the object recognition program are examples of object recognition means.
  • the CPU 50 and the distance estimation program are examples of distance estimation means.
  • the CPU 50 and the approach determination program are an example of the approach determination means.
  • the CPU 50 and the clocking program are an example of clocking means.
  • the CPU 50 and the notification program are an example of notification means.
  • the CPU 50 and the second image data acquisition program are an example of a second image acquisition unit.
  • the CPU 50 and the second notification program are an example of a second notification unit.
  • the imaging device 10B acquires, by the image data acquisition program, the image captured by the 360-degree camera 12.
  • the image is acquired from the 360-degree camera 12 by wire or wirelessly.
  • for a wireless connection, for example, Bluetooth or Wi-Fi is used.
  • the second image data acquisition program determines whether the monitoring target is positioned in a direction in which the camera of the imaging device 10B can shoot, and, if so, causes that camera to acquire an image of the monitoring target.
  • the imaging device 10B is a mobile phone, and its camera can capture an image only in a specific direction (the direction in which shooting is possible).
  • the 360-degree camera 12 is, for example, an imaging device in which two lenses capable of imaging an angle of 180 degrees are arranged in mutually opposite directions.
  • the photographing apparatus 10B determines which lens of the 360-degree camera 12 captured the image, and further determines, based on the position of the monitoring target in the image data, whether the monitoring target is positioned in the direction in which shooting is possible.
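A sketch of this direction check (the lens naming, the assumption that the phone's camera covers the half-space of one particular lens, and the in-frame margin are all illustrative assumptions, not details from the patent):

```python
# Hypothetical sketch: decide whether the monitoring target, found in the
# image of one lens of the two-lens 360-degree camera, lies in the direction
# the phone's own camera can shoot.

PHONE_CAMERA_LENS = "front"  # assumption: phone faces the "front" lens's half

def target_in_shootable_direction(lens, target_x, image_width):
    """lens: which 180-degree lens captured the target; target_x: the
    target's horizontal pixel position in that lens's image."""
    if lens != PHONE_CAMERA_LENS:
        return False
    margin = 0.1 * image_width  # require the target to be well inside the frame
    return margin <= target_x <= image_width - margin
```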
  • the imaging device 10B transmits, to the server 400, the image of the monitoring target that it has captured, according to the second notification program. The server 400 then transmits the image to the address of the portable terminal 100.
  • the user 350 of the portable terminal 100 can determine the situation of the monitoring target outside the automobile 300 by referring to both the image acquired by the 360-degree camera 12 and the image acquired by the imaging device 10B.
  • the imaging device 10B transmits an image to the registered address (step ST208).
  • the imaging device 10B determines whether an end condition is satisfied, for example, whether a processing-end signal has been received from the portable terminal 100 via the server 400 (step ST209), and repeats the above process when the end condition is not satisfied.
  • the imaging device 10B determines whether the imaging device 10B itself is positioned in a direction in which the monitoring target can be photographed (step ST210), and if it is determined that photographing is possible, photographs the monitoring target and transmits the image of the monitoring target to the registered address (step ST211).
  • the server 400A of FIG. 12 is an example of a monitoring device. That is, the imaging device 10C (see FIG. 12) transmits the image data acquired from the 360 degree camera 12 to the server 400A, and the server 400A performs processing after image recognition.
  • the imaging device 10C is a mobile phone.
  • the storage unit 454 of the server 400A shown in FIG. 13 stores, in addition to general programs for a server, an image data acquisition program, an ambient light calculation program, an object recognition program, a distance estimation program, an approach determination program, a clocking program, a notification program, a second image data acquisition program, and a second notification program.
  • the CPU 452 and the image data acquisition program are an example of an image data acquisition unit.
  • the CPU 452 and the ambient light calculation program are an example of an ambient light calculation unit.
  • the CPU 452 and the object recognition program are examples of object recognition means.
  • the CPU 452 and the distance estimation program are examples of distance estimation means.
  • the CPU 452 and the approach determination program are an example of the approach determination means.
  • the CPU 452 and the clocking program are an example of clocking means.
  • the CPU 452 and the notification program are an example of notification means.
  • the CPU 452 and the second image data acquisition program are an example of a second image acquisition unit.
  • the server 400A acquires, from the imaging device 10C, the image captured by the 360-degree camera 12 (step ST301), performs image recognition (step ST302), and determines whether the object is the monitoring target (step ST303). If the object is the monitoring target, the server 400A transmits the image to the registered address (the e-mail address of the portable terminal 100) (step ST304). Subsequently, when the server 400A determines that the monitoring target is approaching the automobile 300 (step ST305), the server 400A transmits an image to the registered address (step ST306).
  • when it is determined that the state in which the monitoring target is approaching the automobile 300 has continued for a predetermined time (step ST307), the server 400A transmits an image to the registered address (step ST308).
  • the server 400A determines, for example, whether a termination condition is satisfied, such as receiving a processing termination signal from the portable terminal 100 (step ST309), and repeats the above processing if the termination condition is not satisfied.
  • the system can also be configured such that each program, such as the ambient light calculation program or the image recognition program, is stored in an external server, and the portable terminal 100, the photographing apparatus 10A, and so on use those programs via the cloud (the Internet).
  • 10 Imaging device; 12 360-degree camera; 100 Mobile terminal; 300 Automobile; 350 Driver; 352 Woman; 360, 362 Monitoring target; 400, 400A Server

Abstract

[Problem] To provide a monitoring device, monitoring method, monitoring system, and monitoring program capable of issuing a notification conforming to the situation outside an automobile when a driver is in a location removed from the automobile. [Solution] Provided is a monitoring device 10 for monitoring an environment outside an automobile, said device comprising: an image data acquisition means for acquiring image data relating to objects; an object recognition means for recognizing a subject to be monitored from among the objects included in the image data; and a notification means for issuing a notification to the outside if the object recognition means has recognized the subject to be monitored.

Description

Monitoring device, monitoring system, monitoring method, and monitoring program

The present invention relates to a monitoring device, a monitoring system, a monitoring method, and a monitoring program.

Conventionally, a technology has been proposed in which a camera is mounted on an automobile to monitor the outside (for example, Patent Document 1).

Patent Document 1: Japanese Patent No. 4161225

The above-described technology is effective while the driver is in the car. However, it cannot prevent the car from being harmed while the car is parked and the driver is away from it.

The present invention attempts to solve this problem, and its object is to provide a monitoring device, a monitoring system, a monitoring method, and a monitoring program capable of issuing a notification according to the situation outside the car when the driver is away from the car.
A first aspect of the invention is a monitoring device for monitoring the outside of a car, comprising: image data acquisition means for acquiring image data of objects; object recognition means for recognizing a monitoring target among the objects included in the image data; and notification means for notifying the outside in a predetermined notification mode when the object recognition means recognizes the monitoring target.

According to the configuration of the first aspect, the monitoring device acquires image data by the image data acquisition means, recognizes the monitoring target among the objects included in the image data by the object recognition means, and notifies the outside by the notification means. Thus, when the driver is away from the car, a notification according to the situation outside the car can be issued.

A second aspect of the invention is an unmanned monitoring device in the configuration of the first aspect, wherein the object recognition means refers to data generated by deep learning.

A third aspect of the invention is the monitoring device according to the first or second aspect, further comprising distance estimation means for estimating the distance between the monitoring target and the car, wherein the notification condition is that the monitoring target is located within a predetermined distance of the car.

A fourth aspect of the invention is the monitoring device according to the third aspect, further comprising notification mode determination means for determining the notification mode for notifying the outside based on the distance between the monitoring target and the car.

A fifth aspect of the invention is the monitoring device according to the third aspect, further comprising approach determination means for determining whether the monitoring target is approaching the car, wherein the notification condition is that the monitoring target is approaching the car.

A sixth aspect of the invention is the monitoring device according to the third aspect, further comprising speed change recognition means for recognizing a change in the speed at which the monitoring target approaches the car, wherein the notification condition is that the speed has increased.

A seventh aspect of the invention is the monitoring device according to the first or second aspect, further comprising tracking means for acquiring position information indicating the position of the car and tracking the car when the notification means has notified the outside, in a predetermined notification mode, that the notification condition is satisfied.

An eighth aspect of the invention is the monitoring device according to the first or second aspect, further comprising ambient light calculation means for calculating the average luminance or average lightness of the image data, wherein the predetermined color defining the monitoring target is defined, for each average luminance or average lightness, by a predetermined range of hue, a predetermined range of saturation, and a predetermined range of luminance or lightness.

A ninth aspect of the invention is the monitoring device according to the eighth aspect, further comprising pixel count calculation means for calculating the number of contiguous pixels of the predetermined color, wherein the object recognition means is configured to recognize the object included in the image data as the monitoring target when the number of pixels is equal to or greater than a predetermined number of pixels defined for each distance.
A tenth aspect of the invention is a monitoring system for monitoring the outside of a car, comprising: image data acquisition means for acquiring image data of objects photographed by photographing means; object recognition means for recognizing a monitoring target among the objects included in the image data; and notification means for notifying the outside when the object recognition means recognizes the monitoring target.

An eleventh aspect of the invention is a monitoring method in which a monitoring device for monitoring the outside of a car performs: an image data acquisition step of acquiring image data of objects outside the car; an object recognition step of recognizing a monitoring target among the objects included in the image data; and a notification step of notifying the outside when the monitoring target is recognized in the object recognition step.

A twelfth aspect of the invention is a monitoring program for causing a control device that manages a monitoring device for monitoring the outside of a car to function as: image data acquisition means for acquiring image data of objects outside the car; object recognition means for recognizing a monitoring target having a predetermined color among the objects included in the image data; and notification means for transmitting an image of the object recognized as the monitoring target to a pre-registered address when the object recognition means recognizes the monitoring target.

A thirteenth aspect of the invention is the monitoring program according to the twelfth aspect, further causing the control device to function as ambient light calculation means for calculating the average luminance or average lightness of the image data, wherein the predetermined color is defined, for each average luminance or average lightness, by a predetermined range of hue, a predetermined range of saturation, and a predetermined range of luminance or lightness.

A fourteenth aspect of the invention is the monitoring program according to the thirteenth aspect, further causing the control device to function as pixel count calculation means for calculating the number of contiguous pixels of the predetermined color, wherein the object recognition means is configured to recognize the object included in the image data as the monitoring target when the number of pixels is equal to or greater than a predetermined number of pixels defined for each distance.

A fifteenth aspect of the invention is the monitoring program according to any one of the twelfth to fourteenth aspects, further causing the control device to function as approach determination means for determining whether the monitoring target is approaching the car, wherein the notification means is configured to transmit the image of the object recognized as the monitoring target to the pre-registered address on condition that the monitoring target is approaching the car.

A sixteenth aspect of the invention is the monitoring program according to the fifteenth aspect, further causing the control device to function as clocking means for measuring the time during which the monitoring target does not leave the car after approaching it, wherein the notification means is configured to transmit the image of the object recognized as the monitoring target to the pre-registered address on condition that, after the monitoring target approaches the car, the state in which the monitoring target does not leave the car has continued for a predetermined time.

A seventeenth aspect of the invention is the monitoring program according to the twelfth aspect, wherein the image data acquisition means is configured to acquire a 360-degree image in the horizontal direction, and the program further causes the control device to function as: second image acquisition means for acquiring, when the monitoring target is included in the image acquired by the image data acquisition means, an image of the monitoring target different from the image acquired by the image data acquisition means; and second notification means for transmitting the image acquired by the second image acquisition means to a pre-registered address.

As described above, according to the present invention, when the driver is away from the car, a notification according to the situation outside the car can be issued.
FIG. 1 is a schematic view of the first embodiment of the present invention.
FIG. 2 is a schematic diagram showing the functional blocks of the photographing device.
FIG. 3 is a schematic diagram showing the functional blocks of the monitoring device.
FIG. 4 is a conceptual diagram for explaining feature data for image recognition.
FIG. 5 is a diagram for explaining notification conditions.
FIG. 6 is a diagram for explaining notification conditions.
FIG. 7 is a conceptual diagram showing notification mode data.
FIG. 8 is a diagram for explaining the tracking program.
FIG. 9 is a flowchart showing the operation of the monitoring device and other components.
FIG. 10 is a diagram for explaining the per-average-lightness HLS data in the second embodiment of the present invention.
FIG. 11 is a flowchart showing the operation of the monitoring device and other components in the second embodiment.
FIG. 12 is a diagram showing an outline of the third embodiment.
FIG. 13 is a schematic diagram showing the functional blocks of the server.
FIG. 14 is a flowchart showing the operation of the monitoring device and other components.
FIG. 15 is a flowchart showing the operation of the monitoring device and other components in the fourth embodiment.
Embodiments of the present invention will be described with reference to the drawings. Description of configurations that can be appropriately implemented by those skilled in the art is omitted, and only the basic configuration of the present invention is described.
<First Embodiment>
As shown in FIG. 1, the portable terminal 100 is stored in a coat pocket of the driver 350, who has left the automobile 300. The portable terminal 100 is, for example, a mobile phone (including a smartphone). The portable terminal 100 is an example of a monitoring device. The driver 350 has parked the automobile 300 in a parking space on the shoulder of the road R1 and is about to go out for a meal with the woman 352. A photographing device 10 is disposed near the upper inner side of the windshield of the automobile 300. The camera (not shown) constituting the photographing device 10 is an example of photographing means. The photographing device 10 is, for example, a drive recorder.
The portable terminal 100 acquires (receives), from the photographing device 10, image data of images captured by the photographing device 10 while the driver 350 is away from the automobile 300. The image captured by the photographing device 10 may be a still image or a moving image; in this embodiment, the portable terminal 100 receives still images from the photographing device 10. Since a still image has a small data size, the communication speed can be increased. Depending on the communication environment, and unlike this embodiment, the portable terminal 100 may instead receive moving images from the photographing device 10.
When the image data received from the photographing device 10 contains the monitoring target, the portable terminal 100 recognizes the monitoring target by image recognition. The monitoring target is, for example, a person in predetermined clothing. In this embodiment, the persons 360 and 362 correspond to monitoring targets. The persons 360 and 362 wear red jackets 360a and 362a, respectively. The portable terminal 100 recognizes the monitoring target among the objects included in the image data and, when a notification condition is further satisfied, notifies the driver 350 by sound or vibration that the notification condition has been satisfied. Furthermore, when a predetermined condition is satisfied, the portable terminal 100 requests the photographing device 10 for position information indicating the position of the photographing device 10. The position of the photographing device 10 is also the position of the automobile 300. Thus, if the automobile 300 is moved by someone other than the driver 350, the portable terminal 100 can track the automobile 300 by receiving the position information from the photographing device 10.
As shown in FIG. 2, the photographing device 10 includes a CPU (Central Processing Unit) 50, a storage unit 52, a wireless communication unit 54, a GPS (Global Positioning System) unit 56, an image processing unit 60, and a power supply unit 70.
The photographing device 10 can acquire external images by operating, through the image processing unit 60, the camera (not shown) that is the central component of the photographing device 10.
The photographing device 10 can transmit image data to the portable terminal 100 through the wireless communication unit 54.
The photographing device 10 can measure its own position with the GPS unit 56. The GPS unit 56 basically receives radio waves from four or more GPS satellites (navigation satellites) to measure the position of the photographing device 10.
The power supply unit 70 is connected to the power supply of the automobile 300 and receives electric power from the automobile 300.
The storage unit 52 stores, in addition to general programs for a drive recorder, a stop-time photographing program, an image data transmission program, and a position information transmission program. The CPU 50 and the stop-time photographing program are an example of stop-time photographing means. The CPU 50 and the image data transmission program are an example of image transmission means. The CPU 50 and the position information transmission program are an example of position information transmission means.
 Using the at-stop imaging program, the imaging device 10 photographs the outside while the automobile 300 is stopped. The imaging device 10 continuously photographs the outside while the automobile 300 is traveling, and even when the automobile 300 stops, it continues photographing unless the driver 350, as the user, interrupts the operation, for example by switching off the camera of the imaging device 10.
 Using the image data transmission program, the imaging device 10 continuously transmits the image data captured while the vehicle is stopped to the portable terminal 100, by wire or wirelessly.
 Using the position information transmission program, the imaging device 10 continuously transmits its position to the portable terminal 100 when a predetermined condition is satisfied. The predetermined condition is that a transmission request for position information has been received from the portable terminal 100.
 As shown in FIG. 3, the portable terminal 100 includes a CPU 150, a storage unit 152, a wireless communication unit 154, a GPS unit 156, and a power supply unit 170.
 The portable terminal 100 can receive image data from the imaging device 10 through the wireless communication unit 154. It can also transmit a position information request to the imaging device 10 and receive the position information.
 With the GPS unit 156, the portable terminal 100 can receive positioning radio waves from navigation satellites such as GPS satellites and measure its own position.
 The power supply unit 170 is a rechargeable battery and supplies power to each unit of the portable terminal 100.
 In addition to general mobile phone programs, the storage unit 152 stores an image data acquisition program, an object recognition program, a distance estimation program, an approach determination program, a speed change recognition program, a notification condition determination program, a notification mode determination program, a notification program, and a tracking program. The CPU 150 and the image data acquisition program are an example of image data acquisition means. The CPU 150 and the object recognition program are an example of object recognition means. The CPU 150 and the distance estimation program are an example of distance estimation means. The CPU 150 and the approach determination program are an example of approach determination means. The CPU 150 and the speed change recognition program are an example of speed change recognition means. The CPU 150 and the notification condition determination program are an example of notification condition determination means. The CPU 150 and the notification mode determination program are an example of notification mode determination means. The CPU 150 and the notification program are an example of notification means. The CPU 150 and the tracking program are an example of tracking means.
 Using the image data acquisition program, the portable terminal 100 acquires image data from the imaging device 10 by wireless communication.
 Using the object recognition program, the portable terminal 100 recognizes objects contained in the image data and identifies the monitoring target. In the present embodiment, the monitoring target is assumed to be a man wearing a red jacket.
 Through the object recognition program, the portable terminal 100 refers to feature data generated by deep learning. Here, deep learning is machine learning using multi-layered neural networks, and image recognition is one of its principal fields of application.
 Based on a large amount of image recognition feature data for each category (hereinafter also called "feature information"), the portable terminal 100 can identify the features of an object contained in the image data acquired from the imaging device 10 and recognize the object's category. This recognition of categories is called generic object recognition. As shown in the conceptual diagram of the image recognition feature data in FIG. 4, the categories are, for example, "human", "dog", "cat", and so on. Each category is subdivided: the "human" category contains subcategories 1 such as "man" and "woman"; subcategory 1 "man" contains subcategories 2 such as "jacket" and "pants"; and subcategory 2 "jacket" contains subcategories 3 such as "red" and "blue". The storage unit 152 stores a large amount of feature data for each of these categories. The portable terminal 100 can, for example, refer to "human" feature information obtained from a large number of example images of humans. Alternatively, unlike the present embodiment, the feature information may be stored in an external storage device that the portable terminal 100 accesses for reference.
 In generic object recognition, many features, such as contours and the directions of individual components, are extracted from the image data and compared with the features of each category acquired by deep learning to determine the degree of correlation. The higher the degree of correlation, the more likely the object in the acquired image belongs to a particular category. For example, when the degree of correlation is 0, the probability of belonging to a specific category (hereinafter called the "category common probability") is defined as 0, and when the degree of correlation is at its maximum, the category common probability is defined as 100%. The portable terminal 100 determines that the object in the acquired image belongs to a specific category when the category common probability is at or above a predetermined reference value, for example 95%.
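 The decision described above can be sketched as follows; the linear mapping from correlation to category common probability and the 95% reference value are taken from the description, while the function names are hypothetical:

```python
def category_common_probability(correlation, max_correlation):
    """Map a correlation score linearly to a category common probability (0-100%)."""
    return 100.0 * correlation / max_correlation

def belongs_to_category(correlation, max_correlation, threshold=95.0):
    """Accept the category when the probability reaches the reference value (e.g. 95%)."""
    return category_common_probability(correlation, max_correlation) >= threshold
```

 For instance, a correlation of 0.96 against a maximum of 1.0 yields a 96% probability and is accepted, while 0.90 yields 90% and is rejected.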
 When the acquired image contains a man wearing a red jacket, the portable terminal 100 refers to the feature data through the generic object recognition described above, recognizes that the object is a "human", is a "man", is wearing a "jacket", and that the jacket is "red", and thereby recognizes the object in the image as the monitoring target.
 Using the distance estimation program, the portable terminal 100 estimates the distance between the monitoring target and the automobile 300. The pixel count of the imaging device 10 is, for example, 3 million pixels. The pixel counts of the imaging device 10 and the portable terminal 100 need not be identical, but in the present embodiment the pixel count of the portable terminal 100 is assumed to be the same as that of the imaging device 10.
 As shown in FIG. 5(a), the monitoring target occupies a number of pixels in the image data corresponding to its apparent size. The more pixels the monitoring target occupies, the shorter the distance to the automobile 300. The portable terminal 100 stores basic reference values such as the number of pixels in the height direction occupied by a rod-shaped member 1 m (meter) tall when it is 500 m away, 200 m away, 100 m away, 50 m away, and 10 m away. It also stores a category reference value for each category, for example that a "man" ("human") has an average height of 170 cm (centimeters). Referring to the basic reference values and the category reference value, the portable terminal 100 estimates the distance between the monitoring target and the automobile 300 from the number of pixels the monitoring target actually occupies. For example, since a "man" ("human") is taken to have an average height of 170 cm, the number of pixels in the height direction at a given distance is calculated as 1.7 times the basic reference value A for a height of 1 m (A × 1.7).
 In the present embodiment, the estimated distance need not be precise and a simple configuration is preferable, so the distance is estimated from the number of pixels in the image.
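 A minimal sketch of this lookup follows. The mechanism (scale the 1 m basic reference values by the category's average height and match the observed pixel count) is from the description above; the numeric reference values themselves are illustrative assumptions, since the patent does not give them:

```python
# Basic reference values: height-direction pixel count of a 1 m rod at known
# distances. The pixel numbers here are assumed for illustration.
BASIC_REFERENCE = [(10, 200), (50, 40), (100, 20), (200, 10), (500, 4)]  # (meters, pixels)

# Category reference values: average real-world height per category.
CATEGORY_HEIGHT_M = {("human", "man"): 1.70}  # 170 cm average height

def estimate_distance(observed_pixels, category=("human", "man")):
    """Estimate the target-to-vehicle distance from the observed height-direction
    pixel count: scale each 1 m basic value by the category height (A * 1.7 for a
    170 cm man) and pick the reference distance whose expected count is closest."""
    scale = CATEGORY_HEIGHT_M[category]
    return min(BASIC_REFERENCE,
               key=lambda ref: abs(ref[1] * scale - observed_pixels))[0]
```

 A nearest-match table keeps the configuration simple, in line with the statement that the estimate need not be precise; interpolation between brackets would be a straightforward refinement.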
 Using the approach determination program, the portable terminal 100 determines whether the monitoring target is approaching the automobile 300. For example, as shown in FIG. 6, when the monitoring target occupies more pixels at time t2, after a time Δt has elapsed from time t1 (see FIG. 6(b)), than at time t1 (see FIG. 6(a)), the portable terminal 100 determines that the monitoring target is approaching the automobile 300. To keep the processing simple, only the number of pixels in the height direction is used.
 Using the speed change recognition program, the portable terminal 100 recognizes changes in the speed at which the monitoring target approaches the automobile 300. For example, when the increase δB1 in the number of pixels occupied by the monitoring target between time t1 and time t2, reached after a time Δt, is smaller than the increase δB2 in the following interval between time t2 and time t3, reached after a further time Δt (δB1 < δB2), the portable terminal 100 determines that the monitoring target is speeding up.
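 The two pixel-count comparisons above can be sketched directly (function names are hypothetical; the δB1 < δB2 test is from the description):

```python
def is_approaching(pixels_t1, pixels_t2):
    """Approach test: the target's height-direction pixel count grows
    between two samples taken a time delta-t apart."""
    return pixels_t2 > pixels_t1

def is_speeding_up(pixels_t1, pixels_t2, pixels_t3):
    """Speed-change test: the pixel-count increase in the second interval
    exceeds that in the first interval (delta_b1 < delta_b2)."""
    delta_b1 = pixels_t2 - pixels_t1
    delta_b2 = pixels_t3 - pixels_t2
    return delta_b1 < delta_b2
```

 Using pixel-count growth as a proxy for speed avoids any explicit distance computation, matching the simple-configuration goal stated earlier.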
 Using the notification condition determination program, the portable terminal 100 determines whether to notify the driver 350, the user of the portable terminal 100, that a notification condition has been satisfied. The notification conditions are, for example: one monitoring target is recognized in the image data, as shown in FIG. 5(a); two monitoring targets are recognized in the image data, as shown in FIG. 5(b); the monitoring target is approaching the automobile 300, as shown in FIGS. 6(a) and 6(b); or the monitoring target is approaching the automobile 300 at increasing speed. The driver 350, as the user of the portable terminal 100, can select and set the notification conditions in advance.
 Using the notification mode determination program, the portable terminal 100 determines the mode of notification to the outside (the driver 350) based on the distance between the monitoring target and the automobile 300. As shown in FIG. 7, the portable terminal 100 refers to notification mode data, stored in the storage unit 152, and determines the notification mode according to that distance. Notification mode A, for distances of 200 m to 500 m, is, for example, a relatively quiet sound (or a low-frequency vibration or low-pitched sound) emitted from the speaker; notification mode B, for distances of 100 m to 200 m, is a louder sound (or a higher-frequency vibration or higher-pitched sound) than mode A. The modes are defined so that the shorter the distance between the automobile 300 and the monitoring target, the louder the sound (or the higher the vibration or sound frequency) generated.
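 The notification mode data of FIG. 7 amounts to a bracket lookup, sketched below. Only modes A and B are named in the description; mode C is an assumed bracket for the closest range:

```python
# Distance brackets (meters) mapped to notification modes, after FIG. 7.
NOTIFICATION_MODES = [
    (200, 500, "A"),  # relatively quiet sound / low-frequency vibration
    (100, 200, "B"),  # louder / higher-frequency than mode A
    (0,   100, "C"),  # loudest (assumed bracket, not named in the description)
]

def notification_mode(distance_m):
    """Return the notification mode for the estimated target distance,
    or None when the target is outside the monitored range."""
    for lo, hi, mode in NOTIFICATION_MODES:
        if lo <= distance_m < hi:
            return mode
    return None
```

 Keeping the brackets in data, as the patent does with the stored notification mode data, lets the driver-configurable settings change the mapping without changing the program.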
 Using the notification program, the portable terminal 100 notifies the outside (the driver 350) in the notification mode determined by the notification mode determination program.
 After notifying the outside (the driver 350) that a notification condition has been satisfied, the portable terminal 100 uses the tracking program to acquire position information from the imaging device 10 and track the position of the automobile 300. That is, upon notifying the outside that the notification condition has been satisfied, the portable terminal 100 transmits a position information request to the imaging device 10 and continuously receives position information from it. Note that the portable terminal 100 need not always start tracking after the external notification; it may do so only when a predetermined condition is satisfied, for example when the driver 350 inputs a tracking start instruction to the portable terminal 100. Upon receiving position information from the imaging device 10, the portable terminal 100 displays the position of the imaging device 10 (the position of the automobile 300) on the display screen 100a, as shown in FIG. 8. In the example of FIG. 8, the display screen 100a shows that the imaging device 10 has moved from its initial position P1 along route R1 to the current position P2. In other words, once the notification condition is satisfied, the portable terminal 100 can track the position of the imaging device 10 (the position of the automobile 300), so the automobile 300 can be tracked even if it is stolen.
 The operation of the portable terminal 100 and the imaging device 10 is described below with the flowchart of FIG. 9. The imaging device 10 acquires an image by photographing the outside of the automobile 300 (step ST1) and transmits the image data (step ST2), which the portable terminal 100 receives (step ST3). The portable terminal 100 performs image recognition (step ST4) and determines whether a notification condition is satisfied (step ST5); if so, it determines the notification mode (step ST6) and issues the notification (step ST7). It then determines whether the predetermined condition (tracking condition) is satisfied (step ST8); if so, it requests position information from the imaging device 10 (step ST9), receives the position information (step ST10), and tracks the imaging device 10 (the automobile 300) (step ST11).
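 The terminal-side portion of this flow (steps ST3 to ST11) can be sketched with stand-in helpers; every class and function name here is hypothetical, introduced only to show the control flow:

```python
class StubCamera:
    """Minimal stand-in for the imaging device 10 side of the exchange."""
    def receive_image(self):    return "image-frame"     # ST3: image data arrives
    def request_position(self): pass                     # ST9: position request
    def receive_position(self): return (35.68, 139.76)   # ST10: position reply

def terminal_pass(camera, notify, track, condition_met, tracking_wanted):
    """One pass of the portable terminal's flow: recognize, notify, then track."""
    image = camera.receive_image()              # ST3
    if condition_met(image):                    # ST4-ST5: recognition + condition
        notify("notification condition met")    # ST6-ST7: sound or vibration
        if tracking_wanted():                   # ST8: e.g. driver's instruction
            camera.request_position()           # ST9
            track(camera.receive_position())    # ST10-ST11: display on map
```

 Passing the condition checks in as callables mirrors the fact that the driver 350 can pre-select which notification and tracking conditions apply.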
 In the embodiment described above, the imaging device 10 acquires the images, and the portable terminal 100 receives the image data from the imaging device 10 and performs the processing from image recognition onward; however, the processing other than the notification to the driver 350 may instead be performed by the imaging device 10. In that case, the imaging device 10 transmits information indicating that a notification condition has been satisfied to the portable terminal 100, the portable terminal 100 notifies the outside, and the imaging device 10 and the portable terminal 100 together constitute a monitoring device or a monitoring system. Alternatively, only part of the processing other than the notification to the driver 350 may be performed by the imaging device 10; in this case too, the imaging device 10 and the portable terminal 100 together constitute a monitoring device or a monitoring system. The imaging device 10 is not limited to a drive recorder and may be any device capable of photographing, such as a mobile phone or a smartphone.
<Second Embodiment>
 The second embodiment is described below, focusing on the differences from the first embodiment. In the second embodiment, the imaging device 10A (see FIG. 1) is an example of the monitoring device. The imaging device 10A is, for example, a mobile phone (including a smartphone); in this embodiment, both the imaging device 10A and the portable terminal 100 are mobile phones. When a predetermined condition is satisfied, the imaging device 10A transmits an image to the portable terminal 100 carried by the driver 350.
 In addition to general mobile phone programs, the storage unit 52 (see FIG. 2) of the imaging device 10A stores an image data acquisition program, an ambient light calculation program, an object recognition program, a distance estimation program, an approach determination program, a timekeeping program, and a notification program. The CPU 50 and the image data acquisition program are an example of image data acquisition means. The CPU 50 and the ambient light calculation program are an example of ambient light calculation means. The CPU 50 and the object recognition program are an example of object recognition means. The CPU 50 and the distance estimation program are an example of distance estimation means. The CPU 50 and the approach determination program are an example of approach determination means. The CPU 50 and the timekeeping program are an example of timekeeping means. The object recognition program includes a pixel count calculation program, and the CPU 50 and the object recognition program are also an example of pixel count calculation means.
 Using the image data acquisition program, the imaging device 10A photographs the outside of the automobile 300 and acquires image data.
 Using the ambient light calculation program, the imaging device 10A calculates the average luminance of the entire acquired image data, that is, the average luminance over all pixels of the image data acquired by the imaging device 10A. For example, the average luminance is high in the daytime on a clear day and low at night.
 Using the object recognition program, the imaging device 10A recognizes the monitoring target by color and pixel count. The monitoring target is assumed to be a predetermined red object having predetermined features.
 The color recognition performed by the imaging device 10A is described below. Through the object recognition program, the imaging device 10A refers to the per-average-luminance HLS data shown in FIG. 10.
 The color of the monitoring target in the acquired image data varies with the brightness of the surroundings (environment) at the time the image is captured. The imaging device 10A treats this environmental brightness as the average luminance described above. The storage unit 52 of the imaging device 10A stores the color of the monitoring target, for each average luminance, as a hue (H) range, a luminance (L) range, and a saturation (S) range. As shown in FIG. 10, the imaging device 10A stores the monitoring target's color for each average luminance. For example, when the average luminance x1 is between e1 and e2, the hue H of the monitoring target ranges from a1 to a2, the luminance L from b1 to b2, and the saturation S from c1 to c2. Note that, unlike the present embodiment, average lightness may be used instead of average luminance, and lightness may be used instead of luminance (L) for the monitoring target's color.
 The imaging device 10A refers to the hue H, luminance L, and saturation S ranges corresponding to the average luminance of the acquired image data and compares them with the hue, luminance, and saturation of objects in the image. By determining whether the image data contains an object whose values fall within the monitoring target's H, L, and S ranges, it judges whether that object could be the monitoring target.
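 This per-average-luminance range check can be sketched as follows. The table structure mirrors FIG. 10, but the numeric bounds are illustrative assumptions, since the patent names them only symbolically (average luminance e1 to e2 maps to H in [a1, a2], L in [b1, b2], S in [c1, c2]):

```python
# Per-average-luminance HLS ranges of the monitoring target (after FIG. 10).
# All numeric bounds below are assumed for illustration.
HLS_TABLE = [
    # (lum_lo, lum_hi, (h_lo, h_hi), (l_lo, l_hi), (s_lo, s_hi))
    (0,   85,  (350, 360), (10, 40), (40, 100)),   # dark scene (e.g. night)
    (85,  170, (350, 360), (30, 60), (50, 100)),   # intermediate brightness
    (170, 256, (350, 360), (40, 80), (60, 100)),   # bright scene (daytime)
]

def could_be_target(avg_luminance, h, l, s):
    """Check whether an object's HLS values fall inside the monitoring
    target's ranges for the scene's average luminance."""
    for lum_lo, lum_hi, (h_lo, h_hi), (l_lo, l_hi), (s_lo, s_hi) in HLS_TABLE:
        if lum_lo <= avg_luminance < lum_hi:
            return h_lo <= h <= h_hi and l_lo <= l <= l_hi and s_lo <= s <= s_hi
    return False
```

 Selecting the range row by scene brightness first, then testing H, L, and S, matches the two-step judgment the description gives.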
 The imaging device 10A further recognizes an object satisfying the predetermined color as the monitoring target only when the object's pixel count is at or above a predetermined pixel count defined for each distance. This allows false recognition of the monitoring target to be avoided with good accuracy. Distance is estimated by the distance estimation program and the approach determination program. For example, when the distance between the monitoring target and the imaging device 10A is large, the monitoring target occupies few pixels in the image data; when the distance is small, it occupies many.
 Using the approach determination program, the imaging device 10A determines whether the monitoring target is approaching the automobile 300. Specifically, when the monitoring target occupies more pixels at time t2, after a time Δt has elapsed from time t1, than at time t1, the imaging device 10A determines that the monitoring target is approaching the automobile 300.
 Using the timekeeping program, the imaging device 10A measures the time during which, after approaching the automobile 300, the monitoring target remains without moving away from it. That time is, for example, 30 seconds (s).
 Using the notification program, the imaging device 10A transmits an image of the monitoring target, under predetermined conditions, to a pre-registered address (hereinafter also called the "registered address"). The pre-registered address is the e-mail address of the portable terminal 100. The predetermined conditions are, for example, that the imaging device 10A has recognized the monitoring target, that the monitoring target has approached the automobile 300, or that the monitoring target has remained near the automobile 300 without moving away for the predetermined time. In this way, the imaging device 10A transmits images of the monitoring target to the pre-registered address on multiple occasions.
 The image recognition in this embodiment is performed based on the color of the object and the number of pixels that color occupies; however, unlike this embodiment, the shape of the object may also be taken into account, as in the first embodiment. This allows the monitoring target to be determined with even greater accuracy. Conversely, image recognition may be simplified by using the color of the object alone.
 The operation of the imaging device 10A is described below with the flowchart of FIG. 11. The imaging device 10A acquires an image (step ST101), performs image recognition (step ST102), and determines whether the object is the monitoring target (step ST103). In step ST103, it calculates the average luminance of the entire image, refers to the hue, luminance, and saturation ranges of the monitoring target for that average luminance, and compares them with the hue, luminance, and saturation of objects in the image to determine whether an object could be the monitoring target. If the number of pixels occupied by such a candidate object is at or above the predetermined number, the object is judged to be the monitoring target, and the imaging device 10A transmits the image to the registered address (the e-mail address of the portable terminal 100) (step ST104). Next, when the imaging device 10A determines that the monitoring target is approaching the automobile 300 (step ST105), it transmits the image to the registered address (step ST106). Further, when it determines that the monitoring target has remained close to the automobile 300 for the predetermined time (step ST107), it transmits the image to the registered address (step ST108).
 The imaging device 10A then determines whether a termination condition is satisfied, for example receipt of a processing end signal from the portable terminal 100 (step ST109), and repeats the above processing if the termination condition is not satisfied.
 Note that, unlike this embodiment, the imaging device 10A may transmit images to the portable terminal 100 via an external server rather than directly. The imaging device 10A may also be configured to transmit all acquired images to an external server, with the server performing the processing from image recognition onward. Also, unlike this embodiment, the hue (H), luminance (L), and saturation (S) ranges of the monitoring target at a reference luminance may be stored, and the ranges for the actual average luminance may be calculated from the difference between the actual average luminance and the reference luminance. For example, if the reference luminance is 50 and the actual average luminance is 60, the hue (H), luminance (L), and saturation (S) ranges of the monitoring target are shifted toward higher values by a predetermined proportion.
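 The final variation, deriving the current scene's ranges from a single stored reference, can be sketched as follows; the shift-in-proportion mechanism is from the description, while the 0.2 factor, the stored range values, and the function name are illustrative assumptions:

```python
REFERENCE_LUMINANCE = 50
# Target HLS ranges stored once, at the reference luminance (assumed values).
REFERENCE_RANGES = {"H": (350, 360), "L": (30, 60), "S": (50, 100)}

def ranges_for_luminance(avg_luminance, factor=0.2):
    """Shift the stored reference ranges in proportion to the difference
    between the actual average luminance and the reference luminance."""
    shift = factor * (avg_luminance - REFERENCE_LUMINANCE)
    return {k: (lo + shift, hi + shift) for k, (lo, hi) in REFERENCE_RANGES.items()}
```

 Storing one reference row instead of a full per-luminance table trades table size for the assumption that all three components shift linearly with scene brightness.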
<Third Embodiment>
 The third embodiment will be described focusing on its differences from the second embodiment. In the third embodiment, the imaging device 10B of FIG. 12 is an example of the monitoring device. The imaging device 10B is a mobile phone. In the third embodiment, a 360-degree camera 12 capable of acquiring images over a 360-degree horizontal angle acquires images of the exterior while the automobile 300 is stopped, and the imaging device 10B receives those images wirelessly or by wire. The imaging device 10B then transmits the images, via the server 400, to a pre-registered address (the e-mail address of the portable terminal 100). Furthermore, if the monitoring target is positioned in a direction in which the imaging device 10B itself can capture images, the imaging device 10B also captures an image itself and transmits it to the pre-registered address via the server 400.
 As illustrated in FIG. 13, the server 400 includes a CPU 452, a storage unit 454, a wireless communication unit 456, and a power supply unit 470. When the server 400 receives an image from the imaging device 10B, it transmits the image to the pre-registered address (the address of the portable terminal 100).
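The server 400's relay behavior is simple enough to model directly. The class and callback names below are hypothetical, and the delivery mechanism (e.g. e-mail) is injected as a callable, since the specification describes only the forwarding behavior, not the transport.

```python
class RelayServer:
    """Minimal model of the server 400: any image received from the imaging
    device is forwarded to the pre-registered address."""

    def __init__(self, registered_address, send_func):
        # send_func(address, image) performs the actual delivery (e.g. e-mail);
        # injecting it keeps the forwarding logic independent of the transport.
        self.registered_address = registered_address
        self.send_func = send_func

    def on_image_received(self, image_bytes):
        # Forward every received image unchanged to the registered address.
        self.send_func(self.registered_address, image_bytes)
```

With a real e-mail backend, `send_func` would wrap an SMTP client; for a test, a list-appending lambda suffices.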
 In addition to the general programs of a mobile phone, the storage unit 52 (see FIG. 2) of the imaging device 10B stores an image data acquisition program, an ambient light calculation program, an object recognition program, a distance estimation program, an approach determination program, a timing program, a notification program, a second image data acquisition program, and a second notification program. The CPU 50 together with the image data acquisition program is an example of the image data acquisition means. The CPU 50 together with the ambient light calculation program is an example of the ambient light calculation means. The CPU 50 together with the object recognition program is an example of the object recognition means. The CPU 50 together with the distance estimation program is an example of the distance estimation means. The CPU 50 together with the approach determination program is an example of the approach determination means. The CPU 50 together with the timing program is an example of the timing means. The CPU 50 together with the notification program is an example of the notification means. The CPU 50 together with the second image data acquisition program is an example of the second image acquisition means. The CPU 50 together with the second notification program is an example of the second notification means.
 The imaging device 10B uses the image data acquisition program to acquire the images captured by the 360-degree camera 12. Images are acquired from the 360-degree camera 12 by wire or wirelessly; in the wireless case, for example, Bluetooth or Wi-Fi is used.
 When the imaging device 10B recognizes a monitoring target in an image received from the 360-degree camera 12, the second image data acquisition program determines whether the monitoring target is positioned in a direction in which the camera of the imaging device 10B can capture images (the capturable direction), and if so, captures an image of the monitoring target with that camera. The imaging device 10B is a mobile phone, and its camera can capture images only in a specific direction (the capturable direction). The 360-degree camera 12 is, for example, an imaging device in which two lenses, each covering a 180-degree angle, are arranged facing in opposite directions. The imaging device 10B determines which lens of the 360-degree camera 12 captured the image and, based on the position of the monitoring target within the image data, determines whether the monitoring target lies in the capturable direction.
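A sketch of that direction test follows. It assumes the phone camera's optical axis is known as a bearing (in degrees) relative to the 360-degree camera, and it uses a hypothetical field of view; neither parameter is fixed by the specification, which states only that the lens identity and the target's position in the image are used.

```python
def target_bearing(lens_index: int, x: float, image_width: float) -> float:
    """Bearing of the target in degrees [0, 360), derived from which 180-degree
    lens captured it (0 = front lens, 1 = rear lens) and its horizontal pixel x."""
    within_lens = (x / image_width) * 180.0   # position across the lens's 180-degree span
    return (lens_index * 180.0 + within_lens) % 360.0

def in_capturable_direction(bearing: float, camera_axis: float, fov: float = 70.0) -> bool:
    """True if the bearing falls within the phone camera's assumed field of view."""
    diff = abs((bearing - camera_axis + 180.0) % 360.0 - 180.0)  # smallest angular difference
    return diff <= fov / 2.0
```

For example, a target at the center of the front lens's image lies at bearing 90 degrees under this convention; whether it is capturable then depends only on the phone's mounting orientation.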
 Using the second notification program, the imaging device 10B transmits the image of the monitoring target that it captured to the server 400, and the server 400 transmits that image to the address of the portable terminal 100. The user 350 of the portable terminal 100 can refer to both the image acquired by the 360-degree camera 12 and the image captured by the imaging device 10B to determine whether a monitoring target is present outside the automobile 300.
 An operation example of the imaging device 10B is described below with reference to the flowchart of FIG. 14. The imaging device 10B acquires an image from the 360-degree camera 12 (step ST201), performs image recognition (step ST202), and determines whether an object in the image is a monitoring target (step ST203). If the object is a monitoring target, the imaging device 10B transmits the image to the registered address (the e-mail address of the portable terminal 100) via the server 400 (step ST204). Subsequently, when the imaging device 10B determines that the monitoring target is approaching the automobile 300 (step ST205), it transmits an image to the registered address (step ST206). Further, when the imaging device 10B determines that the monitoring target has remained close to the automobile 300 for a predetermined time (step ST207), it transmits an image to the registered address (step ST208). The imaging device 10B then determines whether an end condition is satisfied, for example by receiving a processing-end signal from the portable terminal 100 via the server 400 (step ST209), and repeats the above processing if the end condition is not satisfied.
 When the imaging device 10B determines in step ST203, ST205, or ST207 that the respective condition is satisfied, it also determines whether the monitoring target is positioned in a direction in which the imaging device 10B itself can capture images (step ST210); if so, it captures the monitoring target and transmits the captured image to the pre-registered address (step ST211).
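The control flow of FIG. 14 can be sketched as a single loop. All arguments are injected callables standing in for the components described above (360-degree camera, recognizer, notifier, and so on), and the dwell-time limit is an assumed placeholder for the "predetermined time" of step ST207.

```python
def monitoring_loop(get_image, recognize, is_approaching, approach_elapsed,
                    notify, capture_second_image, end_condition,
                    dwell_time_limit=60.0):
    """Sketch of steps ST201-ST211: notify on detection, on approach, on
    prolonged approach, and additionally send a second image when possible."""
    while not end_condition():                            # ST209
        image = get_image()                               # ST201
        target = recognize(image)                         # ST202 / ST203
        if target is None:
            continue
        notify(image)                                     # ST204
        if is_approaching(target):                        # ST205
            notify(image)                                 # ST206
            if approach_elapsed(target) >= dwell_time_limit:  # ST207
                notify(image)                             # ST208
        second = capture_second_image(target)             # ST210
        if second is not None:
            notify(second)                                # ST211
```

Driving the loop with stub callables shows the escalation: a recognized, approaching, dwelling target produces three notifications plus the second image.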
<Fourth Embodiment>
 The fourth embodiment will be described focusing on its differences from the third embodiment. In the fourth embodiment, the server 400A of FIG. 12 is an example of the monitoring device. That is, the imaging device 10C (see FIG. 12) transmits the image data acquired from the 360-degree camera 12 to the server 400A, and the server 400A performs the processing from image recognition onward. The imaging device 10C is a mobile phone.
 In addition to the general programs of a server, the storage unit 454 of the server 400A shown in FIG. 13 stores an image data acquisition program, an ambient light calculation program, an object recognition program, a distance estimation program, an approach determination program, a timing program, a notification program, a second image data acquisition program, and a second notification program. The CPU 452 together with the image data acquisition program is an example of the image data acquisition means. The CPU 452 together with the ambient light calculation program is an example of the ambient light calculation means. The CPU 452 together with the object recognition program is an example of the object recognition means. The CPU 452 together with the distance estimation program is an example of the distance estimation means. The CPU 452 together with the approach determination program is an example of the approach determination means. The CPU 452 together with the timing program is an example of the timing means. The CPU 452 together with the notification program is an example of the notification means. The CPU 452 together with the second image data acquisition program is an example of the second image acquisition means. The CPU 452 together with the second notification program is an example of the second notification means.
 An operation example of the server 400A is described below with reference to the flowchart of FIG. 15. The server 400A acquires, from the imaging device 10C, an image captured by the 360-degree camera 12 (step ST301), performs image recognition (step ST302), and determines whether an object in the image is a monitoring target (step ST303). If the object is a monitoring target, the server 400A transmits the image to the registered address (the e-mail address of the portable terminal 100) (step ST304). Subsequently, when the server 400A determines that the monitoring target is approaching the automobile 300 (step ST305), it transmits an image to the registered address (step ST306). Further, when the server 400A determines that the monitoring target has remained close to the automobile 300 for a predetermined time (step ST307), it transmits an image to the registered address (step ST308). The server 400A then determines whether an end condition is satisfied, for example by receiving a processing-end signal from the portable terminal 100 (step ST309), and repeats the above processing if the end condition is not satisfied.
 The monitoring device, monitoring system, monitoring method, and monitoring program of the present invention are not limited to the above embodiments, and various modifications can be made without departing from the gist of the present invention. For example, the individual programs, such as the ambient light calculation program and the image recognition program, may be stored on an external server, and the portable terminal 100, the imaging device 10A, and the like may use those programs via the cloud (the Internet).
10, 10A, 10B, 10C Imaging device
12 360-degree camera
100 Portable terminal
300 Automobile
352 Driver
360, 362 Monitoring target
400, 400A Server

Claims (17)

  1.  A monitoring device for monitoring the outside of an automobile, comprising:
     image data acquisition means for acquiring image data of an object;
     object recognition means for recognizing a monitoring target among the objects included in the image data; and
     notification means for notifying the outside in a predetermined notification mode when the object recognition means recognizes the monitoring target.
  2.  The monitoring device according to claim 1, wherein the object recognition means refers to data generated by deep learning.
  3.  The monitoring device according to claim 1 or 2, further comprising distance estimation means for estimating the distance between the monitoring target and the automobile,
     wherein the notification condition is that the monitoring target is located within a predetermined distance of the automobile.
  4.  The monitoring device according to claim 3, further comprising notification mode determination means for determining, based on the distance between the monitoring target and the automobile, the notification mode used to notify the outside.
  5.  The monitoring device according to claim 3, further comprising approach determination means for determining whether the monitoring target is approaching the automobile,
     wherein the notification condition is that the monitoring target is approaching the automobile.
  6.  The monitoring device according to claim 3, further comprising speed change recognition means for recognizing a change in the speed at which the monitoring target approaches the automobile,
     wherein the notification condition is that the speed has increased.
  7.  The monitoring device according to claim 1 or 2, further comprising tracking means for acquiring position information indicating the position of the automobile and tracking the automobile when the notification means has notified the outside, in the predetermined notification mode, that the notification condition is satisfied.
  8.  The monitoring device according to claim 1 or 2, further comprising ambient light calculation means for calculating the average luminance or average lightness of the image data,
     wherein the predetermined color defining the monitoring target is defined, for each average luminance or average lightness, by a predetermined range of hue, a predetermined range of saturation, and a predetermined range of luminance or lightness.
  9.  The monitoring device according to claim 8, further comprising pixel count calculation means for calculating the number of consecutive pixels of the predetermined color,
     wherein the object recognition means is configured to recognize the object included in the image data as a monitoring target when the number of pixels is equal to or greater than a predetermined number of pixels defined for each distance.
  10.  A monitoring system for monitoring the outside of an automobile, comprising:
     image data acquisition means for acquiring image data of an object photographed by photographing means;
     object recognition means for recognizing a monitoring target among the objects included in the image data; and
     notification means for notifying the outside when the object recognition means recognizes the monitoring target.
  11.  A monitoring method in which a monitoring device for monitoring the outside of an automobile performs:
     an image data acquisition step of acquiring image data of an object outside the automobile;
     an object recognition step of recognizing a monitoring target among the objects included in the image data; and
     a notification step of notifying the outside when the monitoring target is recognized in the object recognition step.
  12.  A monitoring program for causing a control device that manages a monitoring device for monitoring the outside of an automobile to function as:
     image data acquisition means for acquiring image data of an object outside the automobile;
     object recognition means for recognizing, among the objects included in the image data, a monitoring target having a predetermined color; and
     notification means for transmitting, when the object recognition means recognizes the monitoring target, an image of the object recognized as the monitoring target to a pre-registered address.
  13.  The monitoring program according to claim 12, further causing the control device to function as ambient light calculation means for calculating the average luminance or average lightness of the image data,
     wherein the predetermined color is defined, for each average luminance or average lightness, by a predetermined range of hue, a predetermined range of saturation, and a predetermined range of luminance or lightness.
  14.  The monitoring program according to claim 13, further causing the control device to function as pixel count calculation means for calculating the number of consecutive pixels of the predetermined color,
     wherein the object recognition means is configured to recognize the object included in the image data as a monitoring target when the number of pixels is equal to or greater than a predetermined number of pixels defined for each distance.
  15.  The monitoring program according to any one of claims 12 to 14, further causing the control device to function as approach determination means for determining whether the monitoring target is approaching the automobile,
     wherein the notification means is configured to transmit the image of the object recognized as the monitoring target to the pre-registered address on condition that the monitoring target is approaching the automobile.
  16.  The monitoring program according to claim 15, further causing the control device to function as timing means for measuring, after the monitoring target approaches the automobile, the time during which the monitoring target does not move away from the automobile,
     wherein the notification means is configured to transmit the image of the object recognized as the monitoring target to the pre-registered address on condition that the state in which the monitoring target does not move away from the automobile has continued for a predetermined time after the monitoring target approached the automobile.
  17.  The monitoring program according to claim 12, wherein the image data acquisition means is configured to acquire a 360-degree image in the horizontal direction, the monitoring program further causing the control device to function as:
     second image acquisition means for acquiring, when the monitoring target is included in an image acquired by the image data acquisition means, an image of the monitoring target different from the image acquired by the image data acquisition means; and
     second notification means for transmitting the image acquired by the second image acquisition means to a pre-registered address.

PCT/JP2017/043994 2017-12-07 2017-12-07 Monitoring device, monitoring system, monitoring method, and monitoring program WO2019111375A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/043994 WO2019111375A1 (en) 2017-12-07 2017-12-07 Monitoring device, monitoring system, monitoring method, and monitoring program

Publications (1)

Publication Number Publication Date
WO2019111375A1 true WO2019111375A1 (en) 2019-06-13

Family

ID=66751385


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01153903A (en) * 1987-12-11 1989-06-16 Toshiba Corp Monitoring device
JP2003054370A (en) * 2001-08-08 2003-02-26 System Advance:Kk Method and system for vehicular anti-theft
JP2005250778A (en) * 2004-03-03 2005-09-15 Seiko Epson Corp Vertical direction decision of image
JP2006259828A (en) * 2005-03-15 2006-09-28 Omron Corp Monitoring system, device and method, recording medium and program
JP2007199840A (en) * 2006-01-24 2007-08-09 Denso Corp Theft prevention system for vehicle, theft prevention device for vehicle, theft prevention program for vehicle, and management system
JP2012109733A (en) * 2010-11-16 2012-06-07 Sumitomo Electric Ind Ltd Monitoring system and monitoring apparatus
JP2013171476A (en) * 2012-02-22 2013-09-02 Nec Corp Portable back camera system for face recognition crime prevention and crime prevention determination method used for the same
JP2017154530A (en) * 2016-02-29 2017-09-07 株式会社オートネットワーク技術研究所 On-vehicle machine and vehicle security system



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17933981; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17933981; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)