JP2013114536A - Safety support apparatus and safety support method - Google Patents

Safety support apparatus and safety support method

Info

Publication number: JP2013114536A
Application number: JP2011261494A
Authority: JP (Japan)
Prior art keywords: vehicle, subject, safety support, information, pedestrian
Legal status: Withdrawn (current)
Other languages: Japanese (ja)
Other versions: JP2013114536A5 (en)
Inventor: Toshiki Saito (敏樹 齋藤)
Original assignee: Seiko Epson Corp (セイコーエプソン株式会社)
Filing date: 2011-11-30
Priority date: 2011-11-30
Publication date: 2013-06-10
Priority to: JP2011261494A
Publications: JP2013114536A, JP2013114536A5

Abstract

PROBLEM TO BE SOLVED: To enable a target person to obtain the information necessary for ensuring safety. SOLUTION: A safety support apparatus includes an imaging device that captures an image of a predetermined range from a vehicle, a subject detection device that detects a target person from the captured image, and a projection device that projects information about the vehicle at a position a predetermined distance from the detected target person.

Description

  The present invention relates to a safety support device and a safety support method.

  When driving a vehicle, the driver predicts the area the vehicle will travel through based on visual observation and experience. When that prediction is wrong, however, contact between the vehicle and a pedestrian may occur. Apparatuses for preventing such accidents have therefore been considered.

  Patent Document 1 discloses mounting a human detection unit on a vehicle and irradiating the area around a pedestrian with visible light when the pedestrian is detected.

JP 2006-252264 A

  However, with the method of Patent Document 1, the pedestrian cannot know the information necessary for ensuring safety, such as the direction from which the vehicle is approaching. It is therefore desirable that a target person such as a pedestrian be able to obtain the information necessary for ensuring safety.

  The present invention has been made in view of such circumstances, and an object of the present invention is to enable a target person to obtain information necessary for ensuring safety.

The main invention for achieving the above object is a safety support apparatus comprising:
an imaging device that captures an image of a predetermined range from a vehicle;
a subject detection device that detects a subject from the captured image; and
a projection device that projects information about the vehicle at a position a predetermined distance from the detected subject.

  Other features of the present invention will become apparent from the description of the present specification and the accompanying drawings.

FIG. 1 is a block diagram of the safety support apparatus 1 in the present embodiment. FIG. 2 is an explanatory diagram of the mounting positions of the laser projector. FIG. 3 is a flowchart of the safety support method in the present embodiment. FIG. 4 is a diagram showing an example of the projection pattern. FIG. 5 is an explanatory diagram of projection of the projection pattern in the present embodiment.

At least the following matters will become clear from the description of the present specification and the accompanying drawings. That is, a safety support apparatus comprising:
an imaging device that captures an image of a predetermined range from a vehicle;
a subject detection device that detects a subject from the captured image; and
a projection device that projects information about the vehicle at a position a predetermined distance from the detected subject.
In this way, information about the vehicle can be projected at a position a predetermined distance from (that is, near) a target person such as a pedestrian, so the target person can obtain the information necessary for ensuring safety by looking at the information about the vehicle projected nearby.

In this safety support apparatus, it is preferable that the information is at least one of a traveling direction of the vehicle, a distance between the vehicle and the subject, and a speed of the vehicle.
In this way, the information necessary for ensuring safety is projected, so the subject can know it.

The projection apparatus preferably includes a laser projector.
Because a laser projector projects without passing through a lens, it is always in focus; even when mounted on the front grille of a vehicle, it can show the projection pattern sharply on any area of the road surface.

The imaging device preferably includes an infrared camera.
In this way, an image including the subject can be captured even in the dark at night.

In addition, it is desirable that the color of the projected information vary depending on the possibility of contact between the vehicle and the subject.
In this way, the subject can easily recognize how likely contact is.

In addition, it is preferable that the detection device further detect the traveling direction of the subject and that the information be projected ahead of the subject in that traveling direction.
In this way, a subject who is walking forward can easily notice the projection pattern.

The detection device preferably includes a discriminator that uses a gradient direction histogram as a feature amount.
In this way, the target person can be detected based on the luminance information of the captured image.

In addition, at least the following matters will become clear from the description of the present specification and the accompanying drawings. That is, a safety support method comprising:
capturing an image of a predetermined range from a vehicle;
detecting a target person from the captured image; and
projecting information about the vehicle at a position a predetermined distance from the detected target person.
In this way, information about the vehicle can be projected at a position a predetermined distance from (that is, near) a target person such as a pedestrian, so the target person can obtain the information necessary for ensuring safety by looking at the information about the vehicle projected nearby.

=== Embodiment ===
FIG. 1 is a block diagram of a safety support apparatus 1 in the present embodiment. In the figure, a laser projector 10, a subject detection device 20, and an infrared camera 30 are shown.

  The laser projector 10 (projection device) is provided separately from the headlights and projects a projection pattern P, described later, at the feet of the detected pedestrian M. The laser projector 10 is used here because it projects without passing through a lens and is therefore in focus at any position; the projection pattern P can thus be projected without blurring wherever it is projected.

  For example, when projecting from the front grille of the vehicle 5, the projection distance differs between positions close to the front grille and positions far from it, so with a lens some positions may be in focus and others not. Since the laser projector 10 does not use a lens, however, it is in focus at any position and can project the projection pattern P without blurring.

  Note that the projection device is not limited to the laser projector 10; other devices may be used as long as they can project the projection pattern P.

  The subject detection device 20 detects whether or not the pedestrian M is included in an image captured by the infrared camera 30 described later. It also estimates the direction of the detected pedestrian M and the distance from the vehicle 5. Various methods exist for detecting the pedestrian M; here, detection is performed with a discriminator that uses Histograms of Oriented Gradients (HOG) features, as described later.

  The infrared camera 30 captures mid-infrared and far-infrared wavelengths and transmits a digital video signal to the subject detection device 20. Here, mid-infrared light has a wavelength of 2.5 μm to 4 μm, and far-infrared light has a wavelength of 4 μm to 1000 μm. In the present embodiment, body temperature is detected using wavelengths of 8 to 14 μm, and an image corresponding to the detected temperature is output.
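As an illustration of this last step, the sketch below shows one way a per-pixel temperature frame in the 8 to 14 μm band might be normalized into an 8-bit image for the later detection stage; the temperature range and the function name are assumptions for illustration, not values taken from this document.

```python
import numpy as np

def thermal_to_gray(temps_c, t_min=20.0, t_max=40.0):
    """Map a per-pixel temperature frame (degrees C) to an 8-bit image in
    which warm objects such as body heat appear bright. The 20-40 degree
    range is an illustrative choice, not taken from the patent."""
    scaled = np.clip((np.asarray(temps_c, dtype=float) - t_min) / (t_max - t_min), 0.0, 1.0)
    return (scaled * 255).astype(np.uint8)
```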

  FIG. 2 is an explanatory diagram of the attachment positions of the laser projector 10 and the infrared camera 30. FIG. 2 shows a state in which the laser projector 10 and the infrared camera 30 are attached to the front grille of the vehicle 5. With this arrangement, the infrared camera 30 images a subject such as a pedestrian M ahead of the vehicle, and the laser projector 10 projects a projection pattern P, described later, at the feet of that pedestrian. The attachment positions of the laser projector 10 and the infrared camera 30 are not limited to this; they may instead be attached to the rear of the vehicle.

  FIG. 3 is a flowchart of the safety support method in the present embodiment. Hereinafter, the safety support method in the present embodiment will be described with reference to the flowchart.

  First, it is determined whether or not the vehicle 5 is traveling (S102). Whether or not the vehicle is traveling can be determined by acquiring the speed information of the vehicle 5. If the vehicle is traveling, the process proceeds to step S104; if it is not, the process is terminated.

  In step S104, imaging around the vehicle 5 (a predetermined range from the vehicle 5) is performed by the infrared camera 30 (S104). Then, based on the captured image, it is detected whether or not there is a pedestrian M (subject) around the vehicle 5 (S106).
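A minimal skeleton of this per-frame procedure is sketched below in Python. The objects `vehicle`, `camera` and `projector` and the helper callables are hypothetical stand-ins not named in the patent; the detection, pattern-generation and aiming helpers correspond to steps S106 to S110, which are detailed, with their own sketches, in the paragraphs that follow.

```python
import time

def safety_support_loop(vehicle, camera, projector, detect, make_pattern, aim):
    """Skeleton of the flowchart of FIG. 3, with hypothetical interfaces for
    the speed source, the infrared camera 30 and the laser projector 10."""
    while vehicle.speed_kmh() > 0:                 # S102: run only while the vehicle is traveling
        frame = camera.capture()                   # S104: image a predetermined range from the vehicle
        hit = detect(frame)                        # S106: look for a pedestrian (subject) in the frame
        if hit is not None:
            pattern = make_pattern(hit)            # S108: generate the projection pattern (FIG. 4)
            projector.project(pattern, aim(hit))   # S110: project near the detected pedestrian
        time.sleep(0.05)                           # wait for the next frame
```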

  Various existing methods can be applied to detect the pedestrian M. Here, as an example, a detection method using Histograms of Oriented Gradients (HOG) features is used.

  Detection of the pedestrian M is divided into a learning process, performed beforehand, and a detection process in which detection is actually carried out. Step S106 uses the detection process; the learning process is performed in advance.

  First, the outline of the learning process will be described. In the learning process, the luminance gradient is calculated for each of a large number of images (infrared images) that include the pedestrian M and a large number of images (infrared images) that do not. Next, a gradient direction histogram of luminance is created from the gradient intensity and gradient direction, and the histogram is normalized. Note that the gradient direction histogram may treat a plurality of pixels (for example, 5 × 5 pixels) as one cell.
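A minimal NumPy sketch of this per-cell gradient direction histogram is shown below; the 5 × 5 cell matches the example above, while the bin count, the normalization choice and the function name are illustrative assumptions.

```python
import numpy as np

def cell_orientation_histograms(gray, cell=5, bins=9):
    """Per-cell histogram of gradient directions, weighted by gradient
    intensity and then L2-normalized, computed from a 2-D luminance
    (infrared) image."""
    gy, gx = np.gradient(gray.astype(np.float64))
    magnitude = np.hypot(gx, gy)                        # gradient intensity
    direction = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned gradient direction

    rows, cols = gray.shape[0] // cell, gray.shape[1] // cell
    hist = np.zeros((rows, cols, bins))
    bin_width = 180.0 / bins
    for i in range(rows):
        for j in range(cols):
            m = magnitude[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            d = direction[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            idx = np.minimum((d // bin_width).astype(int), bins - 1)
            for b in range(bins):
                hist[i, j, b] = m[idx == b].sum()
    # normalize each cell histogram so overall contrast does not dominate
    hist /= np.linalg.norm(hist, axis=2, keepdims=True) + 1e-6
    return hist.ravel()
```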

  AdaBoost is then applied using such a gradient direction histogram as the feature amount. Through learning, features are automatically selected by AdaBoost's weak classifiers, and finally a discriminator that can distinguish the pedestrian M from everything else is generated as a weighted majority vote of a number of weak classifiers.

  It is desirable to prepare a discriminator for each direction the pedestrian M may be facing (for example, one for a pedestrian walking to the right, one for walking to the left, one for walking toward the vehicle, and one for walking away from the vehicle). This is because, by applying each of these discriminators, not only can the pedestrian M be detected, but the traveling direction of the detected pedestrian M can be detected as well.
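The training described above can be sketched as follows, assuming scikit-learn's AdaBoostClassifier (whose default weak learner is a depth-1 decision stump combined by weighted voting) and the feature function sketched earlier; the data layout, the direction labels and the function name are hypothetical, and all training images are assumed to be cropped and resized to one fixed detection-window size.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

DIRECTIONS = ("right", "left", "toward_vehicle", "away_from_vehicle")

def train_direction_classifiers(samples):
    """`samples` maps each walking direction to a pair
    (images_with_pedestrian, images_without_pedestrian). One boosted
    discriminator is trained per direction, as the text suggests."""
    classifiers = {}
    for direction in DIRECTIONS:
        positives, negatives = samples[direction]
        images = list(positives) + list(negatives)
        X = np.array([cell_orientation_histograms(img) for img in images])
        y = np.array([1] * len(positives) + [0] * len(negatives))
        clf = AdaBoostClassifier(n_estimators=200)   # weighted vote of 200 weak classifiers
        clf.fit(X, y)
        classifiers[direction] = clf
    return classifiers
```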

  These discriminators are set in the subject detection device 20, which then detects whether or not the pedestrian M is included in the image sent from the infrared camera 30.

  Next, the detection process (S106) will be described. In the detection process, the pedestrian M is detected from the image captured by the infrared camera 30 using the discriminators generated in the learning process. Specifically, the detection window is moved from the upper left of the captured image in a raster scan, repeated at a plurality of scales, and it is checked whether the pedestrian M is included in each detection window.

  When the pedestrian M is detected, the position of the detection window and the scale are stored. The type of discriminator used for the detection (the one for walking to the right, walking to the left, walking toward the vehicle 5, or walking away from the vehicle 5) is also stored.
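A minimal sketch of this multi-scale raster scan is given below, reusing the feature and classifier sketches above; the window size, scales, stride and the fields of the returned record are illustrative assumptions.

```python
import cv2

WINDOW_H, WINDOW_W = 128, 64      # detection window size in pixels (an assumption)
SCALES = (1.0, 1.5, 2.0, 3.0)     # frame shrink factors; a larger factor finds nearer pedestrians
STRIDE = 8                        # raster-scan step in pixels

def detect_pedestrian(frame, classifiers):
    """Raster-scan the infrared frame with the detection window at several
    scales and return the strongest hit: window position (in original-frame
    pixels), scale, and the direction whose discriminator fired."""
    best = None
    for scale in SCALES:
        resized = cv2.resize(frame, None, fx=1.0 / scale, fy=1.0 / scale)
        h, w = resized.shape[:2]
        for y in range(0, h - WINDOW_H + 1, STRIDE):
            for x in range(0, w - WINDOW_W + 1, STRIDE):
                patch = resized[y:y + WINDOW_H, x:x + WINDOW_W]
                feat = cell_orientation_histograms(patch).reshape(1, -1)
                for direction, clf in classifiers.items():
                    score = float(clf.decision_function(feat)[0])
                    if score > 0 and (best is None or score > best["score"]):
                        best = {"x": x * scale, "y": y * scale, "scale": scale,
                                "direction": direction, "score": score}
    return best   # None if no pedestrian was detected
```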

  Further, the subject detection device 20 determines the direction of the pedestrian M as seen from the vehicle 5 from the position of the detection window, and the distance of the pedestrian M from the vehicle 5 from the scale (a large scale means the pedestrian is close to the vehicle 5; a small scale means far away). The traveling direction of the pedestrian M is obtained from the type of discriminator used for the detection. These pieces of information are used for generating and projecting the projection pattern P.
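One possible way of turning the stored window position and scale into a direction and an approximate distance is sketched below; the camera field of view, the reference distance and the window width are illustrative calibration constants, not values given in the patent.

```python
WINDOW_W = 64               # must match the detection window width used above
H_FOV_DEG = 40.0            # assumed horizontal field of view of the infrared camera
REFERENCE_DISTANCE_M = 20.0 # assumed range at which a pedestrian fills an unscaled window

def locate_pedestrian(best, frame_width):
    """Bearing (0 = straight ahead, positive = to the right) from the window
    position, and an approximate range from the scale: a large scale means
    the pedestrian is near, a small scale means far, as described above."""
    center_x = best["x"] + 0.5 * WINDOW_W * best["scale"]       # window centre, original pixels
    bearing_deg = (center_x / frame_width - 0.5) * H_FOV_DEG
    distance_m = REFERENCE_DISTANCE_M / best["scale"]
    return bearing_deg, distance_m, best["direction"]
```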

If the pedestrian M is detected, a pattern to be projected at the feet of the pedestrian M is generated (S108).
FIG. 4 shows an example of the projection pattern P. In the figure, a direction arrow and a distance are shown inside a circular pattern. For example, the circular pattern can be yellow, which is easily noticed even at night, and the arrow can be red. The arrow may indicate the direction in which the projecting vehicle is heading, and the distance display can show the distance between the projecting vehicle 5 and the pedestrian M.

  As described above, the direction in which the vehicle is heading can be obtained from the positional relationship between the detected pedestrian M and the vehicle, and the distance between the projecting vehicle 5 and the detected pedestrian M can be obtained from the size of the detection window when the pedestrian M was detected.

  The projection pattern P is not limited to the above. For example, the speed of the projecting vehicle can be included in the projected pattern. The shape of the projection pattern is not limited to a circle and may be a rectangle. The color of the projection pattern P is not limited to yellow; the color can be varied according to the degree of danger.
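A minimal OpenCV sketch of rasterizing such a pattern is shown below; the risk thresholds, the intermediate orange colour, the image size and the function names are illustrative assumptions (the text itself fixes only yellow for the circle and red for the arrow as one example).

```python
import numpy as np
import cv2

def risk_color(distance_m):
    """Pick a BGR colour by how likely contact is; the thresholds are illustrative."""
    if distance_m < 5.0:
        return (0, 0, 255)        # red: contact likely
    if distance_m < 15.0:
        return (0, 165, 255)      # orange: caution
    return (0, 255, 255)          # yellow: default, easy to notice at night

def render_projection_pattern(bearing_deg, distance_m, size=400):
    """Draw the pattern of FIG. 4: a circle, an arrow showing the direction
    from which the vehicle approaches, and the distance to the vehicle."""
    img = np.zeros((size, size, 3), np.uint8)
    c = size // 2
    cv2.circle(img, (c, c), c - 10, risk_color(distance_m), thickness=12)
    # arrow tail placed on the side the vehicle is coming from, pointing inward
    rad = np.deg2rad(bearing_deg)
    tail = (int(c + (c - 40) * np.sin(rad)), int(c - (c - 40) * np.cos(rad)))
    cv2.arrowedLine(img, tail, (c, c), (0, 0, 255), thickness=10, tipLength=0.3)
    cv2.putText(img, f"{distance_m:.0f} m", (c - 70, size - 40),
                cv2.FONT_HERSHEY_SIMPLEX, 2.0, risk_color(distance_m), thickness=4)
    return img
```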

Next, the generated projection pattern is projected in the vicinity of the detected pedestrian M (at a position a predetermined distance from the pedestrian M) (S110).
FIG. 5 is an explanatory diagram of projection of the projection pattern in the present embodiment. In the figure, the traveling vehicle 5, the pedestrian M, and the projection pattern P are shown. In this way, the projection pattern P is projected from the laser projector 10 of the vehicle 5 onto the area at the feet of the pedestrian M. By looking at the projected pattern P, the pedestrian M can know the traveling direction of the approaching vehicle 5 and the distance to it, and based on this information can consider safety measures to avoid contact with the vehicle.

  In FIG. 5, the projection pattern P is projected on the line segment connecting the pedestrian M and the vehicle 5, but the projection position is not limited to this. For example, the pattern may be projected ahead of the detected pedestrian M in the traveling direction. The projection is also not limited to the road surface and may be made onto a wall surface.
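The two placement options just described can be sketched as follows, in a vehicle-centred coordinate frame measured in metres; the offset value and the names are assumptions.

```python
import numpy as np

OFFSET_M = 1.0   # the "predetermined distance" from the pedestrian (an assumed value)

def projection_point(pedestrian_xy, vehicle_xy=(0.0, 0.0), heading_xy=None):
    """Where to aim the laser projector. By default the point lies on the
    segment between the pedestrian and the vehicle, as in FIG. 5; if the
    pedestrian's traveling direction is known, aim ahead of it instead."""
    p = np.asarray(pedestrian_xy, dtype=float)
    if heading_xy is not None:
        d = np.asarray(heading_xy, dtype=float)        # ahead of the walking direction
    else:
        d = np.asarray(vehicle_xy, dtype=float) - p    # toward the vehicle, near the feet
    d /= np.linalg.norm(d) + 1e-9
    return p + OFFSET_M * d
```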

  Further, here the target person is the pedestrian M and the projection pattern is projected in the vicinity of the pedestrian M; however, the subject detection device 20 may instead detect other vehicles, and the projection pattern P may be projected in the vicinity of those other vehicles. In this way, the driver of another vehicle can be notified of the traveling direction of the approaching vehicle 5 and the distance to it.

  The above-described embodiment is intended to facilitate understanding of the present invention and is not intended to limit it. The present invention can be modified and improved without departing from its gist, and it goes without saying that the present invention includes equivalents thereof.

1 Safety support device
5 Vehicle
10 Laser projector (projection device)
20 Subject detection device
30 Infrared camera (imaging device)

Claims (8)

  1. A safety support device comprising:
    an imaging device that captures an image of a predetermined range from a vehicle;
    a subject detection device that detects a subject from the captured image; and
    a projection device that projects information about the vehicle at a position a predetermined distance from the detected subject.
  2.   The safety support device according to claim 1, wherein the information is at least one of a traveling direction of the vehicle, a distance between the vehicle and the subject, and a speed of the vehicle.
  3.   The safety support apparatus according to claim 1, wherein the projection apparatus includes a laser projector.
  4.   The safety support apparatus according to claim 1, wherein the imaging apparatus includes an infrared camera.
  5.   The safety support device according to claim 1, wherein a color of the information to be projected is changed according to a possibility of contact between the vehicle and the subject.
  6.   The safety support device according to claim 1, wherein the subject detection device further detects a traveling direction of the subject, and the information is projected ahead of the subject in the traveling direction.
  7.   The safety support device according to claim 1, wherein the subject detection device includes a discriminator having a gradient direction histogram as a feature amount.
  8. A safety support method comprising:
    capturing an image of a predetermined range from a vehicle;
    detecting a target person from the captured image; and
    projecting information about the vehicle at a position a predetermined distance from the detected target person.

Priority Applications (1)

Application Number  Priority Date  Filing Date  Title
JP2011261494A  2011-11-30  2011-11-30  Safety support apparatus and safety support method

Publications (2)

Publication Number  Publication Date
JP2013114536A  2013-06-10
JP2013114536A5  2015-01-15

Family

ID=48710013

Family Applications (1)

Application Number  Title  Priority Date  Filing Date
JP2011261494A (Withdrawn)  Safety support apparatus and safety support method  2011-11-30  2011-11-30

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001116564A (en) * 1999-10-15 2001-04-27 Toshiba Corp Mobile communication system
JP2004331021A (en) * 2003-05-12 2004-11-25 Nissan Motor Co Ltd Night obstacle informing device at night
JP2005161977A (en) * 2003-12-02 2005-06-23 Honda Motor Co Ltd Vehicular travel supporting device
JP2008143505A (en) * 2006-11-16 2008-06-26 Denso Corp Headlight control device
JP2010191793A (en) * 2009-02-19 2010-09-02 Denso It Laboratory Inc Alarm display and alarm display method
JP2010282388A (en) * 2009-06-04 2010-12-16 Mazda Motor Corp Vehicular pedestrian detection device
JP2011210238A (en) * 2010-03-10 2011-10-20 Dainippon Printing Co Ltd Advertisement effect measuring device and computer program
JP2012221162A (en) * 2011-04-07 2012-11-12 Toyota Central R&D Labs Inc Object detection device and program

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JPN6016005247: Daisuke Takeuchi, "Security and Safety (Image Sensing Technology)", Panasonic Technical Journal, Vol. 54, No. 4, 2009-01-15, pp. 33-35, Panasonic Corporation *
JPN6016005249: Hirokatsu Kataoka, "Pedestrian Detection Using Symmetry and Luminance Intensity Information with CoHOG", Proceedings of SSII2010, the 16th Symposium on Sensing via Image Information [CD-ROM], 2010-06-09, IS4-01-1 to IS4-01-6, 画像センシング技術研究会 *

Legal Events

Date  Code  Title
2014-11-21  A621  Written request for application examination
2014-11-21  A521  Written amendment
2015-01-07  RD04  Notification of resignation of power of attorney
2015-07-23  A977  Report on retrieval
2015-07-28  A131  Notification of reasons for refusal
2015-09-09  A521  Written amendment
2016-02-16  A131  Notification of reasons for refusal
2016-04-07  A761  Written withdrawal of application