WO2017043404A1 - Video recording apparatus - Google Patents

Video recording apparatus (映像記録装置)

Info

Publication number
WO2017043404A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving object
recording
data
unit
vehicle
Prior art date
Application number
PCT/JP2016/075612
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
悠太 平野
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー
Priority to CN201680052080.2A (published as CN107950020B)
Publication of WO2017043404A1

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor
    • H04N5/915 Television signal processing therefor for field- or frame-skip recording or reproducing

Definitions

  • the present invention relates to a video recording apparatus that is mounted on a vehicle and records a video taken around the vehicle.
  • Patent Document 1 also describes that when an obstacle is detected via a sonar, only a camera for photographing the direction in which the obstacle is detected may be operated.
  • In the technique of Patent Document 1, however, shooting in the detected direction is merely started; the recording mode used to record the data corresponding to the captured video remains the normal mode.
  • Here, the recording mode refers to the manner in which data corresponding to the video is recorded on a recording medium. Consequently, even when an obstacle, that is, a moving object, is detected in a specific direction, no processing is performed to control the recording mode and optimize the amount of recording medium used when the data is recorded.
  • An object of the present invention is therefore to provide a video recording apparatus that can suitably keep the movement of a moving object around the vehicle as a record by appropriately adjusting the usage amount of the recording medium.
  • the video recording apparatus includes an imaging unit, a recording medium, a moving object detection unit, a direction determination unit, and a recording control unit.
  • The imaging unit shoots in a plurality of directions around the vehicle.
  • On the recording medium, data corresponding to the video shot by the imaging unit in each direction is recorded in time series.
  • the moving object detection unit detects a moving object that moves relative to the vehicle.
  • The direction determination unit determines which of the plurality of directions the moving object detected by the moving object detection unit is near.
  • The recording control unit controls the recording mode in which the data is recorded on the recording medium so that, for the data related to the direction determined by the direction determination unit to be near the moving object, the amount of recording medium used is larger than when the moving object detection unit detects no moving object.
  • In this way, when a moving object is detected, the recording mode is changed so that the amount of recording medium used for recording the data in that direction becomes larger than when no moving object is detected. The movement of the moving object around the vehicle can therefore be kept as a satisfactory record.
  • FIG. 1 is a block diagram of a video recording apparatus according to a first embodiment of the present invention. FIG. 2 conceptually shows the sonar installation positions and detection ranges of the video recording apparatus of FIG. 1. FIG. 3 conceptually shows the camera installation positions and imaging ranges.
  • a video recording apparatus 1 shown in FIG. 1 includes a periphery monitoring unit 2 and a video recording unit 3.
  • Six sonars 20FC, 20FR, 20FL, 20BC, 20BR, and 20BL (hereinafter also referred to simply as the sonars 20 when there is no need to distinguish them) are connected to the periphery monitoring unit 2.
  • The sonar 20FC is provided at the center of the front end surface of the vehicle C.
  • a sonar 20FR is provided at the front end portion of the right side surface of the vehicle C.
  • a sonar 20FL is provided at the front end portion of the left side surface of the vehicle C.
  • a sonar 20BC is provided at the center of the rear end surface of the vehicle C.
  • a sonar 20BR is provided at the rear end of the right side surface of the vehicle C.
  • a sonar 20BL is provided at the rear end of the left side surface of the vehicle C.
  • FIG. 2 conceptually shows the detection range by each sonar 20 by three arcs having a common center and angle and different radii.
  • the detection range of the sonar 20FC covers the vehicle width range on the front side of the vehicle C.
  • the detection range of the sonar 20FR covers the vehicle front side portion on the right side of the vehicle C.
  • the detection range of the sonar 20FL covers the vehicle front side portion on the left side of the vehicle C.
  • the detection range of the sonar 20BC covers the vehicle width range on the rear side of the vehicle C.
  • the detection range of the sonar 20BR covers a vehicle rear side portion on the right side of the vehicle C.
  • the detection range of the sonar 20BL covers the vehicle rear side portion on the left side of the vehicle C.
  • The periphery monitoring unit 2 is configured as a microcomputer including a CPU (central processing unit), a ROM (read-only memory), a RAM (random access memory), an input/output interface, and the like (not shown).
  • Based on the signals from the sonars 20, the periphery monitoring unit 2 detects a moving object that moves relative to the vehicle C and determines which direction around the vehicle C the detected moving object is near. These detection results and determination results are output to the video recording unit 3 as electrical signals.
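  • As an illustration of the direction determination just described, the following is a minimal sketch, assuming a simple lookup from each sonar to the nearest camera direction; the mapping table, function name, and data types are illustrative assumptions and are not taken from the patent.

```python
from typing import Dict, Optional

# Assumed mapping from each sonar to the camera direction whose shooting range
# covers that sonar's detection range (the side sonars are mapped to the side
# cameras here; this assignment is an assumption for illustration only).
SONAR_TO_DIRECTION = {
    "20FC": "front",  # center of the front end surface
    "20FR": "right",  # front end of the right side surface
    "20FL": "left",   # front end of the left side surface
    "20BC": "rear",   # center of the rear end surface
    "20BR": "right",  # rear end of the right side surface
    "20BL": "left",   # rear end of the left side surface
}

def determine_direction(detections: Dict[str, bool]) -> Optional[str]:
    """Return the direction near a detected moving object, or None if no sonar detects one."""
    for sonar_id, detected in detections.items():
        if detected:
            return SONAR_TO_DIRECTION[sonar_id]
    return None

# Example: only the front-right sonar 20FR reports a moving object.
assert determine_direction({"20FC": False, "20FR": True, "20FL": False,
                            "20BC": False, "20BR": False, "20BL": False}) == "right"
```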
  • the video recording unit 3 is connected with four cameras 30F, 30R, 30L, and 30B (hereinafter simply referred to as the camera 30 when it is not necessary to distinguish them).
  • The four cameras 30 are provided at the front, rear, left, and right of the vehicle C, as shown in FIG. 3. More specifically, the camera 30F has the front of the periphery of the vehicle C as its shooting range, the camera 30R the right side, the camera 30L the left side, and the camera 30B the rear of the periphery of the vehicle C.
  • the camera 30F may be provided on the rear surface of the rearview mirror, or may be provided at other vehicle front positions.
  • the cameras 30R and 30L may be provided under the left and right side mirrors and under the auxiliary winkers of the side doors, or may be provided at other vehicle side positions.
  • the camera 30B may be provided inside the rear window, or may be provided at another vehicle rear position.
  • the video recording unit 3 includes input buffers 31F, 31R, 31L, and 31B, a recording video generation unit 33, a frame buffer 34, and a recording medium 35.
  • the recording medium 35 is configured by a known recording medium such as a flash memory, a hard disk, a RAM, and a video RAM.
  • The recorded video generation unit 33 is configured as a microcomputer including a CPU, a ROM, a RAM, and an input/output interface (not shown). It records data corresponding to the video shot by the cameras 30 in the frame buffer 34, and the data recorded in the frame buffer 34 is in turn recorded on the recording medium 35. More specifically, the recorded video generation unit 33 converts the data sent from each of the cameras 30F, 30R, 30L, and 30B into digital data conforming to the format of the frames F1, F2, and so on.
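  • A minimal sketch of the data path just described (camera data converted into fixed-format frames, held in a frame buffer, then written to the recording medium) is shown below; the class, method, and buffer-size choices are assumptions made for illustration, not details given in the patent.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Frame:
    index: int      # frame number: 1 for F1, 2 for F2, ...
    payload: bytes  # digital data conforming to the fixed frame format

class RecordedVideoGenerator:
    """Illustrative stand-in for the recorded video generation unit 33."""

    def __init__(self, frame_buffer_size: int = 8):
        self.frame_buffer = deque(maxlen=frame_buffer_size)  # stands in for frame buffer 34
        self.recording_medium = []                           # stands in for recording medium 35
        self._next_index = 1

    def push_camera_data(self, raw: bytes) -> None:
        # Convert the camera data into a frame of the fixed format and buffer it.
        self.frame_buffer.append(Frame(self._next_index, raw))
        self._next_index += 1

    def flush(self) -> None:
        # Record the buffered frames on the recording medium in time series.
        while self.frame_buffer:
            self.recording_medium.append(self.frame_buffer.popleft())

gen = RecordedVideoGenerator()
gen.push_camera_data(b"front-camera-bytes")
gen.flush()
assert gen.recording_medium[0].index == 1
```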
  • the recorded video generation unit 33 receives the electrical signal (that is, the detection result and the determination result) indicating the presence and direction of the moving object described above from the periphery monitoring unit 2.
  • In step S1 ('S' hereinafter denotes a step), the recorded video generation unit 33 receives the detection result indicating the presence or absence of a moving object and, when a moving object has been detected, the direction determined to be near the moving object (that is, the direction in which the moving object is detected).
  • In step S2, it is determined whether or not the received detection result indicates that a moving object has been detected.
  • If no moving object has been detected (that is, if the determination result in step S2 is NO), the flow proceeds to step S3, in which the video in each direction is recorded with the frames F1, F2, and so on allocated equally, and the flow of the process then returns to the above-described step S1.
  • In step S3, as shown in FIG. 5, the video shot in the front, rear, left, and right directions through the cameras 30F, 30B, 30L, and 30R is recorded in a round-robin format in which the frames F1, F2, and so on are cyclically assigned in a predetermined order.
  • In the first frame F1, the shooting data in front of the vehicle C shot by the camera 30F is recorded.
  • Hereinafter, this data is also referred to as the front camera shooting data.
  • In the next frame F2, the shooting data of the right side of the vehicle C shot by the camera 30R is recorded.
  • Hereinafter, this data is also referred to as the right camera shooting data.
  • In the next frame F3, the shooting data of the left side of the vehicle C shot by the camera 30L is recorded.
  • Hereinafter, this data is also referred to as the left camera shooting data.
  • In the next frame F4, the shooting data behind the vehicle C shot by the camera 30B is recorded.
  • Hereinafter, this data is also referred to as the rear camera shooting data.
  • From the frame F5 onward, the respective data are recorded cyclically in the same order: the front camera shooting data, the right camera shooting data, and so on.
  • An arrow T shown in FIG. 5 conceptually represents the time axis; the meaning of the arrow T is the same in the subsequent figures.
  • If the detection result received in step S1 indicates that a moving object has been detected (that is, if the determination result in step S2 is YES), the flow of the process proceeds to step S4.
  • In step S4, the video in the direction in which the moving object is detected is recorded using all the frames F1, F2, and so on, and the flow of the process then returns to the above-described step S1. For example, when a moving object is detected in the forward direction, the front camera shooting data is recorded in all the frames F1, F2, and so on, as illustrated in FIG. 6.
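  • The flow of steps S1 to S4 can be sketched as follows, assuming the frame allocation order of FIG. 5; the function and variable names are illustrative assumptions, and the embodiment is not limited to this form.

```python
CAMERA_ORDER = ["front", "right", "left", "rear"]  # order used in step S3 (FIG. 5)

def record_frames(frame_count, detected_direction):
    """Return, for frames F1..Fn, which camera's shooting data is recorded.

    detected_direction is None when no moving object is detected (NO in S2),
    or one of CAMERA_ORDER when a moving object is detected (YES in S2).
    """
    schedule = []
    for i in range(frame_count):
        if detected_direction is None:
            # Step S3: round robin, frames assigned cyclically front, right, left, rear, ...
            schedule.append(CAMERA_ORDER[i % len(CAMERA_ORDER)])
        else:
            # Step S4: every frame records the direction in which the moving object was detected
            schedule.append(detected_direction)
    return schedule

# No moving object: F1..F8 -> front, right, left, rear, front, right, left, rear (FIG. 5)
assert record_frames(8, None) == ["front", "right", "left", "rear"] * 2
# Moving object detected in front: every frame records the front camera data (FIG. 6)
assert record_frames(4, "front") == ["front"] * 4
```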
  • In the video recording apparatus 1 according to the first embodiment, when no moving object is detected (that is, when the determination result in step S2 is NO), the shooting data in each of the front, rear, left, and right directions is recorded by a round-robin method that cyclically assigns the frames F1, F2, and so on (step S3). For this reason, even when no moving object is detected, the video shot in each direction can be recorded at high resolution.
  • In addition, because the frames F1, F2, and so on are allocated evenly to the shooting data in each of the front, rear, left, and right directions when no moving object is detected (step S3), the video shot in each direction can be recorded evenly.
  • the cameras 30 are individually provided in the front, rear, left, and right directions, and shooting is performed in the corresponding directions. For this reason, it is not necessary to change the direction of the camera 30 with respect to the vehicle C, and the configuration of the apparatus can be simplified.
  • In the first embodiment, the cameras 30F, 30R, 30L, and 30B correspond to the imaging unit, the sonars 20FC, 20FR, 20FL, 20BC, 20BR, and 20BL correspond to the moving object detection unit, the periphery monitoring unit 2 corresponds to the direction determination unit, and the recorded video generation unit 33 corresponds to the recording control unit.
  • In step S3 of the second embodiment, as shown in FIG. 7, each of the frames F1, F2, and so on is divided equally into four areas, and the front camera shooting data, the right camera shooting data, the left camera shooting data, and the rear camera shooting data are recorded in the respective areas by a screen division method.
  • In step S4, the entire area of each frame F1, F2, and so on is used to record the shooting data in the direction in which the moving object is detected.
  • That is, the recording mode is the same as in step S4 of the first embodiment shown in FIG. 6.
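  • A minimal sketch of the screen division recording mode of the second embodiment follows; it only returns which camera's data occupies which part of a frame, and the quadrant names are assumptions made for illustration.

```python
def frame_layout(detected_direction=None):
    """Return a mapping from frame area to camera direction for one frame.

    With no moving object (step S3), the frame is divided equally into four
    areas (FIG. 7); when a moving object is detected (step S4), its direction
    occupies the entire frame, as in the first embodiment.
    """
    if detected_direction is None:
        return {
            "top_left": "front",
            "top_right": "right",
            "bottom_left": "left",
            "bottom_right": "rear",
        }
    return {"full_frame": detected_direction}

assert frame_layout() == {"top_left": "front", "top_right": "right",
                          "bottom_left": "left", "bottom_right": "rear"}
assert frame_layout("rear") == {"full_frame": "rear"}
```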
  • In the video recording apparatus 1 according to the second embodiment, when no moving object is detected (that is, when the determination result in step S2 is NO), each frame F1, F2, and so on is recorded by a screen division method in which the shooting data in each of the front, rear, left, and right directions is assigned to one of the divided areas. For this reason, the video shot in each direction is recorded in every frame.
  • Also in the second embodiment, the cameras 30 are individually provided for the front, rear, left, and right directions and shoot in the corresponding directions. Compared with a configuration in which the direction of a camera 30 is changed with respect to the vehicle C, overlooking a decisive moment can therefore be suppressed more reliably.
  • The number of cameras may be two in total (one each at the front and rear, or one each at the left and right), or five or more cameras (for example, six, matching the positions of the sonars 20) may be provided.
  • The number of sonars 20 can likewise be changed in various ways, as can the arrangement of the cameras 30 and the sonars 20.
  • In the above embodiments, when a moving object is detected, the shooting data in the direction in which the moving object is detected is recorded using all the frames F1, F2, and so on. Instead, for example, when a moving object is detected in front, the recording form in which the frames F1, F2, and so on are used in the order front, right, left, rear, and so on may be changed so that the frequency of recording the shooting data in the direction in which the moving object is detected is simply increased.
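  • One way to realize such a frequency-weighted round robin is sketched below; the concrete interleaving (the detected direction in every other frame) is only an illustrative assumption, since the text requires only that the detected direction be recorded more often than the others.

```python
from itertools import cycle

DIRECTIONS = ["front", "right", "left", "rear"]

def weighted_round_robin(detected, frame_count=8):
    """Assign the detected direction to every other frame and cycle the
    remaining directions through the frames in between (assumed weighting)."""
    others = cycle(d for d in DIRECTIONS if d != detected)
    return [detected if i % 2 == 0 else next(others) for i in range(frame_count)]

# Moving object detected in front:
# F1..F8 -> front, right, front, left, front, rear, front, right
assert weighted_round_robin("front") == ["front", "right", "front", "left",
                                         "front", "rear", "front", "right"]
```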
  • Likewise, in the above embodiments, when a moving object is detected (that is, when the determination result in step S2 is YES), the shooting data in the direction in which the moving object is detected is recorded using the entire area of each frame F1, F2, and so on. Instead, for example, when a moving object is detected in front, the area in which the front camera shooting data is recorded may be set to 70% of the entire frame F and the area for the shooting data in each of the other directions to 10% of the entire frame F; that is, it is sufficient to enlarge the area in which the shooting data in the direction in which the moving object is detected is recorded.
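  • The enlarged-area variation can be expressed as an area-ratio table per frame; the sketch below follows the 70% / 10% example in the text, and the helper function itself is an illustrative assumption.

```python
DIRECTIONS = ["front", "right", "left", "rear"]

def area_ratios(detected=None, enlarged_share=0.70):
    """Return the fraction of the frame area assigned to each direction.

    With no moving object the frame is divided equally (25% each, FIG. 7);
    when a moving object is detected, its direction gets the enlarged share
    (70% in the text's example) and the other directions split the remainder."""
    if detected is None:
        return {d: 1.0 / len(DIRECTIONS) for d in DIRECTIONS}
    rest = (1.0 - enlarged_share) / (len(DIRECTIONS) - 1)
    return {d: (enlarged_share if d == detected else rest) for d in DIRECTIONS}

assert area_ratios() == {"front": 0.25, "right": 0.25, "left": 0.25, "rear": 0.25}
assert area_ratios("front")["front"] == 0.70
```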
  • In the first embodiment, the predetermined order front, right, left, rear, and so on was set.
  • However, the predetermined order may be any preset order and can be changed as appropriate.
  • For example, an order that increases the frequency of recording the shooting data in a specific direction (front in this example), such as front, right, front, rear, front, left, front, right, and so on, may be set even when no moving object is detected.
  • In that case, when a moving object is detected in front, instead of using all the frames F1, F2, and so on for recording the shooting data related to the forward direction (that is, front, front, front, front, and so on), it is conceivable to change the recording form so that the frequency with which the frames F1, F2, and so on are used for recording the shooting data related to the forward direction is further increased, for example front, front, right, front, front, rear, front, front, left, and so on.
  • Also, when an order that increases the frequency of recording the shooting data related to a specific direction is set while no moving object is detected and a moving object is then detected in a direction other than that specific direction, an order may be set in which the frequency of recording the shooting data related to that direction is increased to the same extent as for the specific direction.
  • In the second embodiment, the predetermined ratio at which the frames F1, F2, and so on are divided is 25% : 25% : 25% : 25%.
  • However, the predetermined ratio may be any preset ratio and can be changed as appropriate. For example, a recording mode that increases the ratio of the area in which the shooting data in a specific direction (front in this example) is recorded, such as front : right : left : rear = 40% : 20% : 20% : 20%, may be set even when no moving object is detected.
  • Also, when a recording mode that increases the ratio of the area in which the shooting data related to a specific direction (front in the above example) is recorded is set while no moving object is detected and a moving object is then detected in a direction other than that specific direction, a recording mode may be set in which the ratio of the area in which the shooting data related to that direction is recorded is increased to the same extent as for the specific direction.
  • the round robin method as in the first embodiment and the screen division method as in the second embodiment may be used in combination.
  • For example, the frames F1, F2, and so on may be used in a round-robin manner in the order front, right-and-left, rear, and so on, and for the right-and-left turn the frame F may be divided into two so that the right-side shooting data and the left-side shooting data are recorded in the single frame F.
  • In the above embodiments, control is performed so as to increase the usage amount of the recording medium 35 (that is, the number of frames or the size of the divided area) used for recording the shooting data from one of the cameras 30. Instead, for example, when the shooting range of one camera 30 can be further subdivided, the recording mode may be changed so that the usage amount of the recording medium 35 used for recording the shooting data related to a specific part of the shooting range of a specific camera 30 is increased. In this case, for example, when a pedestrian is detected near the vehicle C, the shooting data related to the part of the shooting range that includes the pedestrian can be recorded at a high frame rate and with a high resolution.
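  • As an illustration of this subdivided-shooting-range variation, the sketch below switches to a cropped region recorded at a higher frame rate and resolution when a pedestrian is detected; the region format, frame rates, and scale factors are purely illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class RecordingSettings:
    region: Optional[Tuple[int, int, int, int]]  # (x, y, width, height) crop, or None for the full image
    frame_rate_hz: int
    scale: float  # 1.0 = full camera resolution

def settings_for(pedestrian_box=None):
    """Normally record the full image at a base rate; when a pedestrian is
    detected near the vehicle, record only the part of the shooting range
    containing the pedestrian, at a higher frame rate and full resolution.
    The concrete numbers (10 Hz / 30 Hz, half scale) are assumptions."""
    if pedestrian_box is None:
        return RecordingSettings(region=None, frame_rate_hz=10, scale=0.5)
    return RecordingSettings(region=pedestrian_box, frame_rate_hz=30, scale=1.0)

# Example: a pedestrian detected in the right half of the front camera image.
high_detail = settings_for(pedestrian_box=(640, 0, 640, 720))
assert high_detail.frame_rate_hz > settings_for().frame_rate_hz
```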
  • In the above embodiments, the sonars 20 are used as the moving object detection unit.
  • However, the moving object detection unit may be a proximity sensor other than sonar, and the cameras 30 may also be made to operate as such a sensor.
  • In the above embodiments, the direction in which each camera 30 shoots is fixed with respect to the vehicle C. Instead, for example, two cameras 30 may be provided at the right front end portion and the left rear end portion of the vehicle C and rotated so that the former camera 30 shoots the front and right sides of the vehicle C and the latter camera 30 shoots the rear and left sides of the vehicle C.
  • an omnidirectional camera or the like can also be used as the photographing unit, and in that case, the entire circumference of the vehicle C may be photographed by one omnidirectional camera. Even in such a case, the same effect as in each of the above-described embodiments can be obtained by controlling the usage amount of the recording medium 35 as in each of the above-described embodiments.
  • In the above embodiments, recording on the recording medium 35 is performed in units of the frames F1, F2, and so on. However, any recording medium may be used as long as data corresponding to the captured video is recorded on it in time series, and various types of recording media, including media that do not use frames, can be used.
  • In the above embodiments, the imaging unit, the recording medium, the moving object detection unit, the direction determination unit, and the recording control unit are all mounted on the vehicle C.
  • However, the data recorded in the frame buffer 34 may instead be transmitted via a communication line such as the Internet to a computer provided outside the vehicle C and recorded there.
  • In that case, the memory of that computer corresponds to the recording medium.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Time Recorders, Drive Recorders, Access Control (AREA)
PCT/JP2016/075612 2015-09-10 2016-09-01 映像記録装置 WO2017043404A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201680052080.2A CN107950020B (zh) 2015-09-10 2016-09-01 影像记录装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-178364 2015-09-10
JP2015178364A JP6485296B2 (ja) 2015-09-10 2015-09-10 映像記録装置

Publications (1)

Publication Number Publication Date
WO2017043404A1 true WO2017043404A1 (ja) 2017-03-16

Family

ID=58239600

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/075612 WO2017043404A1 (ja) 2015-09-10 2016-09-01 映像記録装置

Country Status (3)

Country Link
JP (1) JP6485296B2 (zh)
CN (1) CN107950020B (zh)
WO (1) WO2017043404A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019040364A (ja) * 2017-08-24 2019-03-14 株式会社デンソーテン 車両データ記録装置及び車両データ記録システム

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6683235B1 (ja) * 2018-11-14 2020-04-15 株式会社Jvcケンウッド 車両用記録制御装置、車両用記録装置、車両用記録制御方法およびプログラム
JP7435082B2 (ja) * 2020-03-16 2024-02-21 株式会社Jvcケンウッド ドライブレコーダ、データ記録方法およびプログラム

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007288586A (ja) * 2006-04-18 2007-11-01 Matsushita Electric Ind Co Ltd 車両周囲状況確認装置
JP2009230343A (ja) * 2008-03-21 2009-10-08 Nec Personal Products Co Ltd ドライブレコーダー装置
JP2012019450A (ja) * 2010-07-09 2012-01-26 Yazaki Corp ドライブレコーダおよびドライブレコーダの映像記録方法
JP2013098919A (ja) * 2011-11-04 2013-05-20 Honda Motor Co Ltd 車両周辺監視装置
JP2013211623A (ja) * 2012-03-30 2013-10-10 Panasonic Corp ドライブレコーダ
JP2014236492A (ja) * 2013-06-05 2014-12-15 株式会社デンソー 車両用監視システム
JP2015088794A (ja) * 2013-10-28 2015-05-07 株式会社デンソー 車両周辺映像記録システム、ソナー制御装置

Also Published As

Publication number Publication date
JP6485296B2 (ja) 2019-03-20
CN107950020A (zh) 2018-04-20
JP2017055290A (ja) 2017-03-16
CN107950020B (zh) 2020-12-22

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16844265

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16844265

Country of ref document: EP

Kind code of ref document: A1